You can look into the spark.driver.userClassPathFirst flag.
spark.driver.userClassPathFirst (default: false): (Experimental) Whether to give
user-added jars precedence over Spark's own jars when loading classes in
the driver. This feature can be used to mitigate conflicts between
Spark's dependencies and user dependencies.
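For example, the flag can be passed at submit time with --conf; the application jar, class name, and Guava version below are placeholders, not taken from the thread:

```shell
# Sketch: give user-supplied jars precedence on the driver classpath.
# com.example.MyApp, my-app.jar, and guava-15.0.jar are placeholder names.
spark-submit \
  --class com.example.MyApp \
  --conf spark.driver.userClassPathFirst=true \
  --jars guava-15.0.jar \
  my-app.jar
```

The same property can also be set persistently in conf/spark-defaults.conf instead of on the command line.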
Thanks, that did seem to make a difference. I am a bit wary of this
approach, since Spark itself has a different Guava dependency, but the
error does go away this way.
On Wed, Jun 24, 2015 at 10:04 AM, Akhil Das
wrote:
> Can you try to add those jars in the SPARK_CLASSPATH and give it a try?
>
> Thanks
Can you try to add those jars in the SPARK_CLASSPATH and give it a try?
Thanks
Best Regards
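The suggestion above can be sketched as follows; the jar paths are placeholders for the actual CDH 4.6 Hive client jars, not paths given in the thread:

```shell
# Sketch: prepend the Hive/CDH client jars to Spark's classpath
# before launching. The paths below are placeholders.
export SPARK_CLASSPATH="/opt/cdh4/hive-metastore.jar:/opt/cdh4/hive-exec.jar:$SPARK_CLASSPATH"
./bin/spark-shell
```

Note that SPARK_CLASSPATH was deprecated in favor of --driver-class-path (the spark.driver.extraClassPath property) in Spark 1.x, so it typically prints a deprecation warning but still works in this era of Spark.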
On Wed, Jun 24, 2015 at 12:07 AM, Yana Kadiyska
wrote:
> Hi folks, I have been using Spark against an external Metastore service
> which runs Hive with Cdh 4.6
>
> In Spark 1.2, I was able to successfully connect by building with the
> following:
Hi folks, I have been using Spark against an external metastore service
which runs Hive with CDH 4.6.
In Spark 1.2, I was able to successfully connect by building with the
following:
./make-distribution.sh --tgz -Dhadoop.version=2.0.0-mr1-cdh4.2.0
-Phive-thriftserver -Phive-0.12.0
I see that in S