Hi,
I guess the toDF() API in Spark 1.3 requires building from source
code?
Patcharee
On 03. mars 2015 13:42, Cheng, Hao wrote:
Using the SchemaRDD / DataFrame API via HiveContext
Assume you're using the latest code, something probably like:
val hc = new HiveContext(sc)
import hc.implicits._
Will this recognize the Hive partitions as well? For example, can it
insert into a specific partition of a Hive table?
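For reference, a minimal sketch of inserting into a specific Hive partition through HiveContext SQL; the table name (logs), partition column (dt), and partition value here are hypothetical, and this assumes the partitioned table already exists:

```scala
import org.apache.spark.sql.hive.HiveContext

val hc = new HiveContext(sc)
import hc.implicits._

// Register the existing RDD (converted to a DataFrame) as a temp table,
// then target the partition explicitly in the INSERT statement.
existedRdd.toDF().registerTempTable("mydata")
hc.sql("INSERT INTO TABLE logs PARTITION (dt = '2015-03-03') SELECT * FROM mydata")
```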
On Tue, Mar 3, 2015 at 11:42 PM, Cheng, Hao wrote:
> Using the SchemaRDD / DataFrame API via HiveContext
>
> Assume you're using the latest code, something probably like:
>
> val hc = new HiveContext(sc)
Using the SchemaRDD / DataFrame API via HiveContext
Assume you're using the latest code, something probably like:
val hc = new HiveContext(sc)
import hc.implicits._
existedRdd.toDF().insertInto("hivetable")
or
existedRdd.toDF().registerTempTable("mydata")
hc.sql("insert into hivetable as select
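Putting the two approaches above together, a minimal sketch against the Spark 1.3 APIs; it assumes "hivetable" already exists in the metastore, that existedRdd is an RDD of case-class rows (the Record schema below is hypothetical), and it uses the HiveQL form INSERT INTO TABLE ... SELECT rather than the truncated "as select" above:

```scala
import org.apache.spark.sql.hive.HiveContext

// Hypothetical row type; toDF() infers the schema from the case class.
case class Record(key: Int, value: String)

val hc = new HiveContext(sc)
import hc.implicits._

// 1) Insert directly through the DataFrame API:
existedRdd.toDF().insertInto("hivetable")

// 2) Or register a temp table and insert via SQL:
existedRdd.toDF().registerTempTable("mydata")
hc.sql("INSERT INTO TABLE hivetable SELECT * FROM mydata")
```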