This one is CDH-specific and is already answered in the forums, so I'd
go there instead.
Ex:
http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/Spark-sql-and-Hive-tables/td-p/22051
On Mon, Mar 9, 2015 at 12:33 PM, sachin Singh wrote:
> Hi,
> I am using CDH5.3.1
> I am getting the below error; even the Spark context is not getting created.
> I have copied hive-site.xml to the Spark conf folder: "cp
> /etc/hive/conf/hive-site.xml /usr/lib/spark/conf"
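
For what it's worth, once hive-site.xml is in /usr/lib/spark/conf (i.e. on the driver's classpath), the usual way to reach Hive tables in that Spark version is through a HiveContext built on top of the SparkContext. A rough sketch, assuming Spark 1.2 as shipped with CDH 5.3; the object name and app name below are placeholders, not taken from the original job:

// Rough sketch, assuming Spark 1.2 (CDH 5.3) and hive-site.xml on the classpath.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveTablesSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("hive-tables-sketch")
    val sc = new SparkContext(conf)  // this is the step that fails for the reporter

    // HiveContext reads hive-site.xml from the classpath (e.g. /usr/lib/spark/conf)
    // and connects to the Hive metastore configured there.
    val hive = new HiveContext(sc)
    hive.sql("SHOW TABLES").collect().foreach(println)

    sc.stop()
  }
}

A HiveContext rather than a plain SQLContext is used because only HiveContext consults the metastore settings from hive-site.xml.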
Hi,
I am using CDH5.3.1
I am getting the below error; even the Spark context is not getting created.
I am submitting my job with the following command:
spark-submit --jars
./analiticlibs/utils-common-1.0.0.jar,./analiticlibs/mysql-connector-java-5.1.17.jar,./analiticlibs/log4j-1.2.17.jar,./analiti