Hello all,
I am trying to understand how the Spark SQL integration with Hive works.
Whenever I build Spark with the -Phive -Phive-thriftserver options, I see that
it is packaged with the hive-2.3.7* jars and the spark-hive* jars. Yet the
documentation claims that Spark can talk to different versions of Hive.
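For reference, the Hive-enabled build I am describing is produced with something like the following (profiles as documented in Spark's "Building Spark" guide; exact flags may differ in your setup):

```shell
# Build Spark with Hive support and the Thrift JDBC/ODBC server.
# -DskipTests speeds up the packaging step.
./build/mvn -Phive -Phive-thriftserver -DskipTests clean package
```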
> On Thu, 22 Oct 2020 at 16:36, Ravi Shankar wrote:
Thanks! I have a very similar setup. I have built Spark with -Phive, which
includes the hive-2.3.7 jars, the spark-hive* jars, and some hadoop-common* jars.
At runtime, I set SPARK_DIST_CLASSPATH=$(hadoop classpath)
and set spark.sql.hive.metastore.version and spark.sql.hive.metastore.jars
to $HIVE_HOME/l
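For what it's worth, a minimal sketch of the runtime wiring I mean (the metastore version and the /path/to/hive/lib location are placeholders, not my actual layout):

```shell
# Expose the external Hadoop jars to Spark at runtime.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)

# Point Spark SQL at the external Hive metastore client jars instead of
# the built-in hive-2.3.7 ones; version shown is only an example.
spark-submit \
  --conf spark.sql.hive.metastore.version=1.2.1 \
  --conf spark.sql.hive.metastore.jars=/path/to/hive/lib/* \
  ...
```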