").count()
It works fine now.
Thanks,
Kevin
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-can-not-use-SchemaRDD-from-Hive-tp10841p10847.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
> …k we can manipulate/share the SchemaRDD across SQLContext instances.
>
> -----Original Message-----
> From: Kevin Jung [mailto:itsjb.j...@samsung.com ]
> Sent: Tuesday, July 29, 2014 1:47 PM
> To: u...@spark.incubator.apache.org
> Subject: SparkSQL can not use SchemaRDD from Hive
Sent: Tuesday, July 29, 2014 1:47 PM
To: u...@spark.incubator.apache.org
Subject: SparkSQL can not use SchemaRDD from Hive
Hi
I got an error message while using Hive and SparkSQL.
This is the code snippet I used
(in spark-shell, 1.0.0):
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import s
The error was "Table Not Found: sample10".
I don't know why this happens. Does SparkSQL conflict with Hive?
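For what it's worth, the usual cause of this error in Spark 1.0 is that tables registered against one context are not visible from another: a plain SQLContext cannot see Hive metastore tables, so Hive data has to be queried through a HiveContext. A minimal sketch (assuming `sc` is the spark-shell SparkContext and `sample10` is an existing Hive table; the cached table name is just for illustration):

```scala
// Use HiveContext (not SQLContext) for Hive tables in Spark 1.0.
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)

// hql() runs HiveQL against the Hive metastore.
val rdd = hiveContext.hql("SELECT * FROM sample10")

// registerAsTable binds the SchemaRDD to *this* context only;
// a separate SQLContext would still report "Table Not Found".
rdd.registerAsTable("sample10_local")
hiveContext.hql("SELECT count(*) FROM sample10_local").collect()
```

This sketch requires a Spark 1.0 deployment with Hive support, so it is not runnable standalone.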
Thanks,
Kevin
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-can-not-use-SchemaRDD-from-Hive-tp10841.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.