Well, I have seen this type of error before.
I tend to create the table in Hive first and alter it in Spark if needed.
This is Spark 3.1.1 with Hive (version 3.1.1):
0: jdbc:hive2://rhes75:10099/default> create table my_table2 (col1 int, col2 int);
0: jdbc:hive2://rhes75:10099/default> describe my_table2;
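For the "alter it in Spark" step, something along these lines works from the spark-sql shell against the same metastore (a minimal sketch; the added column col3 is just for illustration):

spark-sql> ALTER TABLE my_table2 ADD COLUMNS (col3 int);
spark-sql> DESCRIBE my_table2;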
Hi there.
I also posted this problem on the Spark list. I am not sure whether this is a
Spark or a Hive metastore problem, or whether there is some metastore tuning
configuration that could serve as a workaround.
Spark can't see Hive schema updates partly because it stores its own copy of
the schema as table properties in the Hive metastore, so an ALTER TABLE done
from Hive does not update that copy.
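You can see that copy from beeline: for a table Spark has written, the schema JSON shows up under keys like spark.sql.sources.schema.part.0 (the exact property names vary by Spark version, so treat this as a sketch):

0: jdbc:hive2://rhes75:10099/default> show tblproperties my_table2;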
1. FROM