Re: Sharing SparkContext

2014-03-10 Thread abhinav chowdary
HDFS 1.0.4, but we primarily use Cassandra + Spark (Calliope); I tested it with both. > Are you using it with HDFS? What version of Hadoop? 1.0.4? Ognen On 3/10/14, 8:49 PM, abhinav chowdary wrote: > for anyone who is interested in the job server from Ooyala: we started using it recently and

Re: Sharing SparkContext

2014-03-10 Thread Ognen Duzlevski
Are you using it with HDFS? What version of Hadoop? 1.0.4? Ognen On 3/10/14, 8:49 PM, abhinav chowdary wrote: > for anyone who is interested in the job server from Ooyala: we started using it recently and it's been working great so far. On Feb 25, 2014 9:23 PM, "Ognen Duzlevski"

Re: Sharing SparkContext

2014-03-10 Thread abhinav chowdary
0.8.1. We used branch 0.8 and pulled the request into our local repo. I remember we had to deal with a few issues, but once we were through, it was working great. On Mar 10, 2014 6:51 PM, "Mayur Rustagi" wrote: > Which version of Spark are you using? > > > Mayur Rustagi > Ph: +1 (760) 203 3257 > http://

Re: Sharing SparkContext

2014-03-10 Thread Mayur Rustagi
Which version of Spark are you using? Mayur Rustagi Ph: +1 (760) 203 3257 http://www.sigmoidanalytics.com @mayur_rustagi On Mon, Mar 10, 2014 at 6:49 PM, abhinav chowdary <abhinav.chowd...@gmail.com> wrote: > for anyone who is interested to know about jo

Re: Sharing SparkContext

2014-03-10 Thread abhinav chowdary
for anyone who is interested in the job server from Ooyala: we started using it recently and it's been working great so far. On Feb 25, 2014 9:23 PM, "Ognen Duzlevski" wrote: > In that case, I must have misunderstood the following (from > http://spark.incubator.apache.org/docs/0.8.1/job-sch

Re: Sharing SparkContext

2014-02-25 Thread Ognen Duzlevski
In that case, I must have misunderstood the following (from http://spark.incubator.apache.org/docs/0.8.1/job-scheduling.html). Apologies. Ognen "Inside a given Spark application (SparkContext instance), multiple parallel jobs can run simultaneously if they were submitted from separate threads
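
The doc passage quoted above is the crux of the thread: a single SparkContext instance can serve several jobs at once if they are submitted from separate threads. A minimal sketch of that pattern, assuming the 0.8-era Scala API discussed in the thread (the master string, app name, and job bodies below are illustrative, not from the original message):

    import org.apache.spark.SparkContext

    object ThreadedJobs {
      def main(args: Array[String]) {
        // One shared context for the whole application (assumed local master).
        val sc = new SparkContext("local[4]", "SharedContextDemo")
        val data = sc.parallelize(1 to 1000000).cache()

        // Each thread submits an independent job against the same context;
        // per the quoted docs, Spark can run these jobs concurrently.
        val threads = (1 to 3).map { i =>
          new Thread(new Runnable {
            def run() {
              val count = data.filter(_ % i == 0).count()
              println("Job " + i + " counted " + count)
            }
          })
        }
        threads.foreach(_.start())
        threads.foreach(_.join())
        sc.stop()
      }
    }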

Re: Sharing SparkContext

2014-02-25 Thread Ognen Duzlevski
On 2/25/14, 12:24 PM, Mayur Rustagi wrote: > So there is no way to share a context currently. 1. You can try the jobserver by Ooyala, but I haven't used it & frankly nobody has shared feedback on it. One of the major show-stoppers for me is that when compiled with Hadoop 2.2.0, the Ooyala standalone serve

Re: Sharing SparkContext

2014-02-25 Thread abhinav chowdary
Thank you, Mayur. I will try the Ooyala job server to begin with. Is there a way to load an RDD created via SparkContext into Shark? The only reason I ask is that my RDD is being created from Cassandra (not Hadoop; we are trying to get Shark to work with Cassandra as well, having troubles with it when running in dist

Re: Sharing SparkContext

2014-02-25 Thread Mayur Rustagi
The fair scheduler merely reorders tasks. I think he is looking to run multiple pieces of code on a single context, on demand from customers. If the code & order are decided, then the fair scheduler will ensure that all tasks get equal cluster time :) Mayur Rustagi Ph: +919632149971 h
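
Enabling the fair scheduler being discussed here took two steps per the 0.8.x job-scheduling docs: set spark.scheduler.mode to FAIR before creating the context, and optionally pin each caller's work to a named pool via a thread-local property. A sketch under those assumptions (the pool name and master string are illustrative):

    import org.apache.spark.SparkContext

    object FairSchedulingDemo {
      def main(args: Array[String]) {
        // Switch from the default FIFO scheduler to FAIR; in 0.8.x this
        // was a system property read when the context is created.
        System.setProperty("spark.scheduler.mode", "FAIR")
        val sc = new SparkContext("local[4]", "FairSchedulingDemo")

        // Jobs submitted from this thread land in the named pool, so
        // different callers' work can share cluster time fairly.
        sc.setLocalProperty("spark.scheduler.pool", "customerA")
        val total = sc.parallelize(1 to 1000).reduce(_ + _)
        println("total = " + total)
        sc.stop()
      }
    }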

Re: Sharing SparkContext

2014-02-25 Thread Ognen Duzlevski
Doesn't the fair scheduler solve this? Ognen On 2/25/14, 12:08 PM, abhinav chowdary wrote: > Sorry for not being clear earlier. "how do you want to pass the operations to the spark context?" This is partly what I am looking for: how to access the active spark context and possible ways to pass opera

Re: Sharing SparkContext

2014-02-25 Thread Mayur Rustagi
So there is no way to share a context currently. 1. You can try the jobserver by Ooyala, but I haven't used it & frankly nobody has shared feedback on it. 2. If you can load that RDD into Shark, then you get a SQL interface on that RDD + columnar storage. 3. You can try a crude method of starting a spark shell

Re: Sharing SparkContext

2014-02-25 Thread abhinav chowdary
Sorry for not being clear earlier. "how do you want to pass the operations to the spark context?" This is partly what I am looking for: how to access the active spark context and possible ways to pass operations. Thanks. On Tue, Feb 25, 2014 at 10:02 AM, Mayur Rustagi wrote: > how do you want to p

Re: Sharing SparkContext

2014-02-25 Thread Mayur Rustagi
how do you want to pass the operations to the spark context? Mayur Rustagi Ph: +919632149971 http://www.sigmoidanalytics.com https://twitter.com/mayur_rustagi On Tue, Feb 25, 2014 at 9:59 AM, abhinav chowdary <abhinav.chowd...@gmail.com> wrote: > Hi, >

Sharing SparkContext

2014-02-25 Thread abhinav chowdary
Hi, I am looking for ways to share the SparkContext, meaning I need to be able to perform multiple operations on the same Spark context. Below is the code of a simple app I am testing: def main(args: Array[String]) { println("Welcome to example application!") val sc = new SparkContext
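
The message is cut off by the archive right at the SparkContext constructor. A minimal completion of the app, assuming the 0.8-era (master, appName) constructor with illustrative arguments (nothing past "new SparkContext" is from the original message):

    import org.apache.spark.SparkContext

    object ExampleApp {
      def main(args: Array[String]) {
        println("Welcome to example application!")
        // Assumed arguments; the original snippet ends at the constructor.
        val sc = new SparkContext("local", "ExampleApp")

        // The goal stated in the thread: keep this one context alive and
        // run multiple operations against it, rather than building a new
        // context for every request.
        val rdd = sc.parallelize(Seq(1, 2, 3, 4))
        println("sum = " + rdd.reduce(_ + _))
        println("count = " + rdd.count())
        sc.stop()
      }
    }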