Thank you, Mayur.

I will try the Ooyala job server to begin with. Is there a way to load an RDD
created via SparkContext into Shark? The only reason I ask is that my RDD is
being created from Cassandra (not Hadoop; we are trying to get Shark to work
with Cassandra as well, and are having trouble with it when running in
distributed mode).
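
In case it is useful context, here is roughly how the RDD is built from
Cassandra, via the Cassandra Hadoop input format (a minimal sketch along the
lines of the Spark CassandraTest example; the host, keyspace, and column
family names are placeholders):

  import java.nio.ByteBuffer
  import java.util.SortedMap

  import org.apache.cassandra.db.IColumn
  import org.apache.cassandra.hadoop.{ColumnFamilyInputFormat, ConfigHelper}
  import org.apache.cassandra.thrift.{SlicePredicate, SliceRange}
  import org.apache.hadoop.mapreduce.Job
  import org.apache.spark.SparkContext

  val sc = new SparkContext("spark://10.128.228.142:7077", "Cassandra RDD")

  val job = new Job()
  ConfigHelper.setInputInitialAddress(job.getConfiguration, "cassandra-host")
  ConfigHelper.setInputRpcPort(job.getConfiguration, "9160")
  ConfigHelper.setInputColumnFamily(job.getConfiguration, "myKeyspace", "myColumnFamily")
  ConfigHelper.setInputPartitioner(job.getConfiguration, "Murmur3Partitioner")

  // Ask for all columns of each row.
  val predicate = new SlicePredicate()
  val range = new SliceRange()
  range.setStart(Array.empty[Byte])
  range.setFinish(Array.empty[Byte])
  predicate.setSlice_range(range)
  ConfigHelper.setInputSlicePredicate(job.getConfiguration, predicate)

  // Each element is (row key, sorted map of column name -> column).
  val casRdd = sc.newAPIHadoopRDD(
    job.getConfiguration,
    classOf[ColumnFamilyInputFormat],
    classOf[ByteBuffer],
    classOf[SortedMap[ByteBuffer, IColumn]])

  println(casRdd.count())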


On Tue, Feb 25, 2014 at 10:30 AM, Mayur Rustagi <mayur.rust...@gmail.com> wrote:

> The fair scheduler merely reorders tasks. I think he is looking to run
> multiple pieces of code on a single context, on demand from customers... if
> the code & order are decided, then the fair scheduler will ensure that all
> tasks get equal cluster time :)
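>
> To make that concrete: a minimal sketch of per-request pools, assuming the
> standard fair-scheduler properties (the pool and app names here are
> hypothetical):
>
>   // Enable fair scheduling before the context is created.
>   System.setProperty("spark.scheduler.mode", "FAIR")
>   val sc = new SparkContext("spark://master:7077", "Shared App")
>
>   // Jobs submitted from this thread run in the "customerA" pool.
>   sc.setLocalProperty("spark.scheduler.pool", "customerA")
>   sc.parallelize(1 to 1000).count()
>
>   // Reset to the default pool for later jobs.
>   sc.setLocalProperty("spark.scheduler.pool", null)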
>
>
>
> Mayur Rustagi
> Ph: +919632149971
> http://www.sigmoidanalytics.com
> https://twitter.com/mayur_rustagi
>
>
>
> On Tue, Feb 25, 2014 at 10:24 AM, Ognen Duzlevski <
> og...@nengoiksvelzud.com> wrote:
>
>>  Doesn't the fair scheduler solve this?
>> Ognen
>>
>>
>> On 2/25/14, 12:08 PM, abhinav chowdary wrote:
>>
>> Sorry for not being clear earlier.
>>
>> > how do you want to pass the operations to the spark context?
>>
>> This is partly what I am looking for: how to access the active Spark
>> context, and possible ways to pass operations to it.
>>
>>  Thanks
>>
>>
>>
>> On Tue, Feb 25, 2014 at 10:02 AM, Mayur Rustagi <mayur.rust...@gmail.com> wrote:
>>
>>> how do you want to pass the operations to the spark context?
>>>
>>>
>>> Mayur Rustagi
>>> Ph: +919632149971
>>> http://www.sigmoidanalytics.com
>>> https://twitter.com/mayur_rustagi
>>>
>>>
>>>
>>> On Tue, Feb 25, 2014 at 9:59 AM, abhinav chowdary <
>>> abhinav.chowd...@gmail.com> wrote:
>>>
>>>> Hi,
>>>>        I am looking for ways to share the SparkContext, meaning I need
>>>> to be able to perform multiple operations on the same Spark context.
>>>>
>>>>  Below is the code of a simple app I am testing:
>>>>
>>>>   import org.apache.spark.SparkContext
>>>>
>>>>   object SimpleApp {
>>>>     def main(args: Array[String]) {
>>>>       println("Welcome to example application!")
>>>>
>>>>       val sc = new SparkContext("spark://10.128.228.142:7077", "Simple App")
>>>>
>>>>       println("Spark context created!")
>>>>       println("Creating RDD!")
>>>>     }
>>>>   }
>>>>
>>>>  Now once this context is created, I want to access it to submit
>>>> multiple jobs/operations, along the lines of the sketch below.
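>>>>
>>>>  For example (a trivial sketch with made-up data), multiple independent
>>>> jobs can run on the same sc in sequence:
>>>>
>>>>   val words = sc.parallelize(Seq("spark", "shark", "cassandra"))
>>>>   println(words.count())                            // first job
>>>>   println(words.filter(_.startsWith("s")).count())  // second job, same sc
>>>>
>>>>  The question is how to reach this same, already-created sc from later
>>>> requests.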
>>>>
>>>>  Any help is much appreciated
>>>>
>>>>  Thanks
>>>>
>>>>
>>>>
>>>>
>>>
>>
>>
>>  --
>> Warm Regards
>> Abhinav Chowdary
>>
>>
>>
>


-- 
Warm Regards
Abhinav Chowdary
