Thanks for your fast replies.
I was wrong about HiveContext.
val hive = new org.apache.spark.sql.hive.HiveContext(sc)  // HiveContext built on the existing SparkContext
val sample = hive.hql("select * from sample10")           // SchemaRDD produced by this HiveContext
val countHive = sample.count()
hive.registerRDDAsTable(sample, "temp")                   // register it back into the same context
hive.sql("select * from temp").count()                    // and query it through that same context
It works fine.
As Hao already mentioned, using 'hive' (the HiveContext) throughout would
work.
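
For contrast, here is a minimal sketch of the cross-context pattern Hao describes
below. The context names and the reuse of sample10 are my assumptions, not code
from this thread; the point is that the SchemaRDD stays bound to the context that
produced it, so handing it to a second SQLContext is unlikely to work:

// Hypothetical sketch (Spark 1.0-era API), not the original code from the thread
val hive = new org.apache.spark.sql.hive.HiveContext(sc)
val plainSql = new org.apache.spark.sql.SQLContext(sc)

val sample = hive.hql("select * from sample10")  // SchemaRDD bound to 'hive'
plainSql.registerRDDAsTable(sample, "temp")      // registered in a different context's catalog
plainSql.sql("select * from temp").count()       // likely fails or misbehaves: the plan still belongs to 'hive'

Keeping everything on the single HiveContext, as in the snippet above, avoids this.
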
On Monday, July 28, 2014, Cheng, Hao wrote:
> In your code snippet, "sample" is actually a SchemaRDD, and a SchemaRDD
> binds to a certain SQLContext at runtime; I don't think we can
> manipulate/share the SchemaRDD across SQLContext instances.
-----Original Message-----
From: Kevin Jung [mailto:itsjb.j...@samsung.com]
Sent: Tuesday, July