Hi,

This may not be feasible in Spark Streaming.

I am trying to create a HiveContext inside a Spark Streaming application, alongside the
StreamingContext.

// Create a local StreamingContext with two working threads and a batch
// interval of 2 seconds.

     val sparkConf = new SparkConf().
             setAppName(sparkAppName).
             set("spark.driver.allowMultipleContexts", "true").
             set("spark.hadoop.validateOutputSpecs", "false")
.....

Now I try to create a SparkContext and a HiveContext:

val sc = new SparkContext(sparkConf)
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

This is accepted, but it creates two Spark jobs, and the application then
basically goes into a waiting state.
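
For reference, the variant I would have expected to avoid the second job is to
build the StreamingContext from the existing SparkContext rather than from the
conf, so that only one SparkContext exists. A minimal sketch against the Spark
1.x APIs (sparkAppName, the local two-thread master and the 2-second batch
interval are as above):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.sql.hive.HiveContext

val sparkConf = new SparkConf().
        setAppName(sparkAppName).
        setMaster("local[2]").
        set("spark.hadoop.validateOutputSpecs", "false")

val sc = new SparkContext(sparkConf)
// Streaming context built from the existing SparkContext,
// with the 2-second batch interval mentioned above.
val ssc = new StreamingContext(sc, Seconds(2))
// HiveContext wraps the same SparkContext rather than starting a second one,
// so spark.driver.allowMultipleContexts should not be needed here.
val hiveContext = new HiveContext(sc)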

Any ideas on how one can create a HiveContext within Spark Streaming?
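
One pattern I am considering is the lazily instantiated singleton shown in the
Spark Streaming programming guide (shown there for SQLContext; adapting it to
HiveContext, and the HiveContextSingleton name, are my own):

import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

// Lazily instantiated singleton, so every batch reuses one HiveContext
// instead of creating a new one per micro-batch.
object HiveContextSingleton {
  @transient private var instance: HiveContext = _
  def getInstance(sparkContext: SparkContext): HiveContext = synchronized {
    if (instance == null) {
      instance = new HiveContext(sparkContext)
    }
    instance
  }
}

// Used inside the streaming job, along the lines of:
// dstream.foreachRDD { rdd =>
//   val hiveContext = HiveContextSingleton.getInstance(rdd.sparkContext)
//   // ... Hive/SQL work on rdd here
// }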

Thanks

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.
