Hi @Wenchen Fan
Thanks for your response. I believe we have not had enough time to
"DISCUSS" this matter.
Currently, in order to make Spark take advantage of Hive, I create a soft
link in $SPARK_HOME/conf. FYI, my Spark version is 3.4.0 and Hive is 3.1.1:

    /opt/spark/conf/hive-site.xml ->
    /data6/
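For what it's worth, once hive-site.xml is visible on Spark's classpath this
way, a session built with Hive support picks up the existing metastore. A
minimal sketch to verify it (the app name is just a placeholder):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("hive-metastore-check")   // placeholder name
      .enableHiveSupport()               // reads hive-site.xml from $SPARK_HOME/conf
      .getOrCreate()

    // Databases registered in the Hive metastore should show up here
    spark.sql("SHOW DATABASES").show()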
@Mich Talebzadeh, thanks for sharing your concern!
Note: creating Spark native data source tables is usually Hive compatible
as well, unless we use features that Hive does not support (TIMESTAMP NTZ,
ANSI INTERVAL, etc.). I think it's a better default to create Spark native
tables in this case, instead of Hive SerDe tables.