We have a pending PR to block users from creating Hive serde tables when
using InMemoryCatalog. See: https://github.com/apache/spark/pull/16587 I
believe it answers your question.

BTW, we can still create regular data source tables and insert data into
them. The major difference is whether the metadata is persistently stored
or not.
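
For example, here is a minimal sketch of what still works with the
in-memory catalog (the table name t2 and the parquet format are just
illustrative):

    // With --conf spark.sql.catalogImplementation=in-memory
    sql("create table t2 (id int, name string) using parquet") // data source table: OK
    sql("insert into t2 values (1, 'name1')")                  // OK, data is written
    sql("select * from t2").show()                             // returns the inserted row

The table metadata lives only in memory and is lost when the session
ends, but reads and writes work within the session.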

Thanks,

Xiao Li

2017-01-22 11:14 GMT-08:00 Reynold Xin <r...@databricks.com>:

> I think this is something we are going to change to completely decouple
> the Hive support and catalog.
>
>
> On Sun, Jan 22, 2017 at 4:51 AM Shuai Lin <linshuai2...@gmail.com> wrote:
>
>> Hi all,
>>
>> Currently when the in-memory catalog is used, e.g. through `--conf
>> spark.sql.catalogImplementation=in-memory`, we can create a persistent
>> table, but inserting into this table would fail with error message "Hive
>> support is required to insert into the following tables..".
>>
>>     sql("create table t1 (id int, name string, dept string)") // OK
>>     sql("insert into t1 values (1, 'name1', 'dept1')")  // ERROR
>>
>>
>> This doesn't make sense to me, because the table would always be empty
>> if we can't insert into it, and thus would be of no use. But I wonder if
>> there are other good reasons for the current logic. If not, I would propose
>> raising an error when creating the table in the first place.
>>
>> Thanks!
>>
>> Regards,
>> Shuai Lin (@lins05)
>>
