I am not sure that can be done. Receivers are designed to run only on the executors/workers, whereas a SQLContext (for using Spark SQL) can only be defined on the driver.
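The usual way around this is to keep the receiver "dumb" and do the SQL on the driver side instead. A minimal sketch (assuming Spark 1.2-era APIs; `MyReceiver` and the `Event` schema are hypothetical placeholders, not anything from your code):

```scala
// Sketch only: keep Spark SQL out of the receiver/actor. The receiver just
// pushes raw records into the stream; SQL runs on the driver via foreachRDD.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.sql.SQLContext

case class Event(id: Long, value: String)

object StreamingSqlSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("streaming-sql-sketch")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // SQLContext is created once, on the driver.
    val sqlContext = new SQLContext(ssc.sparkContext)
    import sqlContext.createSchemaRDD // Spark 1.2: implicit RDD -> SchemaRDD

    // Hypothetical custom receiver producing Event records.
    val events = ssc.receiverStream(new MyReceiver())

    events.foreachRDD { rdd =>
      // foreachRDD's function is invoked on the driver, so the
      // SQLContext is safe to use here.
      rdd.registerTempTable("events")
      sqlContext.sql("SELECT id, COUNT(*) FROM events GROUP BY id").collect()
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

So rather than passing the SparkContext into the actor, the actor never touches it; all SQL stays in the driver program.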
On Mon, Dec 29, 2014 at 6:45 PM, sranga <sra...@gmail.com> wrote:
> Hi
>
> Could Spark-SQL be used from within a custom actor that acts as a receiver
> for a streaming application? If yes, what is the recommended way of passing
> the SparkContext to the actor?
> Thanks for your help.
>
> - Ranga
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-HiveContext-within-Custom-Actor-tp20892.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.