So there is no way to share a context currently.
1. You can try the job server by Ooyala, but I haven't used it and frankly
nobody has shared feedback on it.
2. If you can load that RDD into Shark, then you get a SQL interface on that
RDD plus columnar storage.
3. You can try a crude method of starting a Spark shell and passing commands
to it after receiving them through an HTML interface etc., but you'll have
to do the hard work of managing concurrency yourself. A rough sketch of this
approach is below.
I was wondering about the use case: are you looking to pass a Spark closure
over the RDD and transform it each time, or looking to avoid caching the RDD
again and again?





Mayur Rustagi
Ph: +919632149971
http://www.sigmoidanalytics.com
https://twitter.com/mayur_rustagi



On Tue, Feb 25, 2014 at 10:08 AM, abhinav chowdary <
abhinav.chowd...@gmail.com> wrote:

> Sorry for not being clear earlier
>
> how do you want to pass the operations to the spark context?
> This is partly what I am looking for: how to access the active Spark
> context and possible ways to pass operations to it.
>
> Thanks
>
>
>
> On Tue, Feb 25, 2014 at 10:02 AM, Mayur Rustagi 
> <mayur.rust...@gmail.com> wrote:
>
>> how do you want to pass the operations to the spark context?
>>
>>
>> Mayur Rustagi
>> Ph: +919632149971
>> http://www.sigmoidanalytics.com
>> https://twitter.com/mayur_rustagi
>>
>>
>>
>> On Tue, Feb 25, 2014 at 9:59 AM, abhinav chowdary <
>> abhinav.chowd...@gmail.com> wrote:
>>
>>> Hi,
>>>        I am looking for ways to share the SparkContext, meaning I need
>>> to be able to perform multiple operations on the same Spark context.
>>>
>>> Below is the code of a simple app I am testing.
>>>
>>>  def main(args: Array[String]) {
>>>     println("Welcome to example application!")
>>>
>>>     val sc = new SparkContext("spark://10.128.228.142:7077", "Simple App")
>>>
>>>     println("Spark context created!")
>>>
>>>     println("Creating RDD!")
>>>
>>> Now once this context is created, I want to access it to submit
>>> multiple jobs/operations.
>>>
>>> Any help is much appreciated
>>>
>>> Thanks
>>>
>>>
>>>
>>>
>>
>
>
> --
> Warm Regards
> Abhinav Chowdary
>
