Re: SparkSQL can not use SchemaRDD from Hive

2014-07-28 Thread Kevin Jung
Thanks for your fast replies. I was wrong about HiveContext.

    val hive = new org.apache.spark.sql.hive.HiveContext(sc)
    var sample = hive.hql("select * from sample10")
    var countHive = sample.count()
    hive.registerRDDAsTable(sample, "temp")
    hive.sql("select * from temp").count()

It works fine. T
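For readers following the thread, the working pattern above can be sketched more fully. This is a sketch against the Spark 1.0-era API discussed here (`hql` and `registerRDDAsTable` were deprecated in later releases); it assumes a running Spark installation with Hive support and a Hive table named `sample10` as in the thread, so it is illustrative rather than directly runnable here.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Sketch: one SparkContext, one HiveContext. Every SchemaRDD below is
// created by, and therefore bound to, this single `hive` context.
val sc   = new SparkContext(new SparkConf().setAppName("hive-example").setMaster("local"))
val hive = new HiveContext(sc)

// Query an existing Hive table; the result is a SchemaRDD.
val sample    = hive.hql("SELECT * FROM sample10")
val countHive = sample.count()

// Register the SchemaRDD as a temporary table in the SAME context...
hive.registerRDDAsTable(sample, "temp")

// ...so later queries through that context can resolve it.
val countTemp = hive.sql("SELECT * FROM temp").count()
```

The key design point, echoed in the replies below, is that everything flows through the one `hive` context rather than mixing it with a separate `SQLContext`.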

Re: SparkSQL can not use SchemaRDD from Hive

2014-07-28 Thread Zongheng Yang
As Hao already mentioned, using 'hive' (the HiveContext) throughout would work. On Monday, July 28, 2014, Cheng, Hao wrote: > In your code snippet, "sample" is actually a SchemaRDD, and SchemaRDD > actually binds a certain SQLContext in runtime, I don't think we can > manipulate/share the Schema

RE: SparkSQL can not use SchemaRDD from Hive

2014-07-28 Thread Cheng, Hao
In your code snippet, "sample" is actually a SchemaRDD, and a SchemaRDD binds to a specific SQLContext at runtime; I don't think we can manipulate/share a SchemaRDD across SQLContext instances.

-Original Message-
From: Kevin Jung [mailto:itsjb.j...@samsung.com]
Sent: Tuesday, July
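Cheng Hao's point about the runtime binding can be illustrated with a sketch (Spark 1.0-era API; `sample10` is the table name from the thread, and an existing `SparkContext` `sc` is assumed). A SchemaRDD created through one context cannot simply be registered and queried through another, because its logical plan resolves against the catalog of the context that created it.

```scala
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext

// Two separate contexts over the same SparkContext.
val hive = new HiveContext(sc)
val sql  = new SQLContext(sc)

// This SchemaRDD is produced by, and bound to, the HiveContext.
val sample = hive.hql("SELECT * FROM sample10")

// Registering it in a DIFFERENT context does not transfer that binding;
// queries through `sql` against this table will not resolve as expected.
sql.registerRDDAsTable(sample, "temp")
// sql.sql("SELECT * FROM temp").count()   // problematic: plan bound to `hive`

// The fix from this thread: stay within a single context throughout.
hive.registerRDDAsTable(sample, "temp2")
hive.sql("SELECT * FROM temp2").count()    // works
```

This is why both replies above recommend using the `hive` context for every step, from the initial `hql` query to the temporary-table registration and follow-up queries.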