I think I found the issue: Hive metastore 2.3.6 doesn't have the necessary
support. After upgrading to Hive 3.1.2, I was able to run the SELECT query.
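In case it's useful to anyone else: instead of upgrading the metastore service itself, Spark can also be pointed at a newer metastore client through its Hive interop settings. A minimal sketch (the `maven` jar source and the exact version are assumptions, check what your environment supports):

```shell
# Tell Spark to talk to a Hive 3.1.2 metastore, downloading the
# matching Hive client jars from Maven at startup (assumed setup).
spark-shell \
  --conf "spark.sql.hive.metastore.version=3.1.2" \
  --conf "spark.sql.hive.metastore.jars=maven"
```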
On Sun, 20 Dec 2020 at 12:00, Jay wrote:
> Thanks Matt.
Thanks Matt.
I have set the two configs in my Spark config as below:

val spark = SparkSession.builder()
  .appName("QuickstartSQL")
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .getOrCreate()
Hi Jay,
Some things to check:
Do you have the following set in your Spark SQL config:
"spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension"
"spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
Is the JAR for the package delta-core_2.12:0.7.0 available on bo
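One way to make sure both the package and the two configs are in place is to pass them on the command line when launching the shell. A sketch, assuming spark-shell is on your PATH and the node has network access to Maven Central:

```shell
# Pull delta-core 0.7.0 and set the two Delta SQL configs at launch time
spark-shell \
  --packages io.delta:delta-core_2.12:0.7.0 \
  --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
  --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
```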
Hi All -
I currently have a Spark 3.0.1 cluster set up with Delta version 0.7.0, which
is connected to an external Hive metastore.
I run the below set of commands:

val tableName = "tblname_2"
spark.sql(s"CREATE TABLE $tableName (col1 INTEGER) USING delta OPTIONS (path = 'GCS_PATH')")
20/12/19 17:30: