On Thu, Dec 3, 2015 at 10:32 AM, Mich Talebzadeh <m...@peridale.co.uk> wrote:
> hduser@rhes564::/usr/lib/spark/logs> hive --version
> SLF4J: Found binding in
> [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

As I suggested before, you have Spark's assembly in the Hive classpath. That's not the way to configure hive-on-spark; if the documentation you're following tells you to do that, it's wrong.

(And sorry Ted, but please ignore Ted's suggestion. Hive-on-Spark should work fine with Spark 1.3 if it's configured correctly. You really don't want to be overriding Hive classes with the ones shipped in the Spark assembly, regardless of the version of Spark being used.)

--
Marcelo
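[For reference, a minimal sketch of the usual hive-on-spark setup discussed here: Spark is installed separately, its assembly jar is kept off Hive's classpath, and Hive is pointed at Spark through its own configuration properties. The property names below are the standard Hive on Spark settings; the yarn-client master and the memory value are illustrative placeholders, not values from this thread.]

    -- Run inside the Hive CLI or beeline, or put the equivalent
    -- <property> entries in hive-site.xml.
    -- Switch Hive's execution engine from MapReduce to Spark:
    set hive.execution.engine=spark;
    -- Tell Hive which Spark master to submit jobs to (illustrative):
    set spark.master=yarn-client;
    -- Per-executor memory for the Spark jobs Hive launches (illustrative):
    set spark.executor.memory=2g;

[With this approach Hive launches Spark itself, so nothing from the Spark assembly needs to be on Hive's own classpath, which is the point Marcelo is making about the SLF4J binding above.]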