Hi, Xuefu
You are right.
Maybe I should launch spark-submit via HS2 or the Hive CLI?
Thanks a lot,
Stana
2016-03-22 1:16 GMT+08:00 Xuefu Zhang :
> Stana,
>
> I'm not sure I fully understand the problem. spark-submit is launched on
> the same host as your application, which should be able to acc
Does anyone have suggestions on how to set the hive-exec-2.0.0.jar path
property in the application?
Something like
'hiveConf.set("hive.remote.driver.jar","hdfs://storm0:9000/tmp/spark-assembly-1.4.1-hadoop2.6.0.jar")'.
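In full, a minimal sketch of what I mean; "hive.remote.driver.jar" is only the property name I guessed above (it may not be a documented Hive setting), and the HDFS path to hive-exec-2.0.0.jar is likewise only an example:

import org.apache.hadoop.hive.conf.HiveConf;

public class JarPathConfigSketch {
  public static void main(String[] args) {
    HiveConf hiveConf = new HiveConf();
    // Hypothetical property name taken from the question above; the HDFS
    // location of hive-exec-2.0.0.jar is also only an example.
    hiveConf.set("hive.remote.driver.jar",
        "hdfs://storm0:9000/tmp/hive-exec-2.0.0.jar");
    System.out.println(hiveConf.get("hive.remote.driver.jar"));
  }
}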
2016-03-11 10:53 GMT+08:00 Stana :
> Thanks for the reply.
>
> I have set the spark.home property in my application.
Thanks for the reply.
I have set the spark.home property in my application; otherwise the
application threw a 'SPARK_HOME not found' exception.
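Roughly like this (a minimal sketch; the local Spark path is only an example):

import org.apache.hadoop.hive.conf.HiveConf;

public class SparkHomeSketch {
  public static void main(String[] args) {
    HiveConf hiveConf = new HiveConf();
    // Point spark.home at an unpacked Spark 1.4.1 distribution so the Spark
    // client can find bin/spark-submit; the path is only an example.
    hiveConf.set("spark.home", "/opt/spark-1.4.1-bin-hadoop2.6");
    System.out.println(hiveConf.get("spark.home"));
  }
}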
I found the following in the Hive source code, in SparkClientImpl.java:
private Thread startDriver(final RpcServer rpcServer, final String clientId,
    final String secret) throws IOException {
I am trying out Hive on Spark with Hive 2.0.0 and Spark 1.4.1, executing
org.apache.hadoop.hive.ql.Driver from a Java application.
My setup is as follows:
1. Building the Spark 1.4.1 assembly jar without Hive.
2. Uploading the Spark assembly jar to the Hadoop cluster.
3. Executing the Java application (a rough sketch is below).
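A stripped-down sketch of that application, assuming the Hive 2.0.0 client libraries are on the classpath; the master URL, spark.home path, and query are placeholders:

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.Driver;
import org.apache.hadoop.hive.ql.processors.CommandProcessorResponse;
import org.apache.hadoop.hive.ql.session.SessionState;

public class HiveOnSparkSketch {
  public static void main(String[] args) throws Exception {
    HiveConf conf = new HiveConf();
    // Switch the execution engine from MapReduce to Spark.
    conf.set("hive.execution.engine", "spark");
    // Placeholder Spark settings: master URL and an unpacked Spark 1.4.1 distribution.
    conf.set("spark.master", "yarn-cluster");
    conf.set("spark.home", "/opt/spark-1.4.1-bin-hadoop2.6");

    // Start a Hive session and run a query through the Driver.
    SessionState.start(conf);
    Driver driver = new Driver(conf);
    CommandProcessorResponse resp = driver.run("SELECT count(*) FROM some_table");
    if (resp.getResponseCode() != 0) {
      System.err.println("Query failed: " + resp.getErrorMessage());
    }
    driver.close();
  }
}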