Re: Submitting Spark application through code

2014-11-26 Thread sivarani
I am trying to submit a Spark Streaming program. When I submit a batch process it works, but when I do the same with Spark Streaming it throws. Anyone please help:

    14/11/26 17:42:25 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:50016
    14/11/26 17:42:25 INFO server.Server: jetty-8.1…

Re: Submitting Spark application through code

2014-11-05 Thread sivarani
Thanks boss, it's working :)

Re: Submitting Spark application through code

2014-11-02 Thread Marius Soutier
Just a wild guess, but I had to exclude "javax.servlet:servlet-api" from my Hadoop dependencies to run a SparkContext. In your build.sbt:

    "org.apache.hadoop" % "hadoop-common" % "..." exclude("javax.servlet", "servlet-api"),
    "org.apache.hadoop" % "hadoop-hdfs" % "..." exclude("javax.servlet", "servlet-api")

Re: Submitting Spark application through code

2014-10-30 Thread Sonal Goyal
What do your worker logs say?

Best Regards,
Sonal
Nube Technologies

On Fri, Oct 31, 2014 at 11:44 AM, sivarani wrote:
> I tried running it but it didn't work
>
> public static final SparkConf batchConf = new SparkConf();
> String mast…

Re: Submitting Spark application through code

2014-10-30 Thread sivarani
I tried running it but it didn't work:

    public static final SparkConf batchConf = new SparkConf();
    String master = "spark://sivarani:7077";
    String spark_home = "/home/sivarani/spark-1.0.2-bin-hadoop2/";
    String jar = "/home/sivarani/build/Test.jar";
    public static final JavaSparkContext batchSparkContext = n…
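For reference, a minimal sketch of how the cut-off initializer above presumably continues, assuming the Spark 1.x JavaSparkContext(master, appName, sparkHome, jars) constructor; the class name and app name here are placeholders I introduced, not from the original post:

    import org.apache.spark.api.java.JavaSparkContext;

    public class BatchSubmitter {
        static final String master = "spark://sivarani:7077";
        static final String sparkHome = "/home/sivarani/spark-1.0.2-bin-hadoop2/";
        static final String jar = "/home/sivarani/build/Test.jar";

        // Hypothetical completion of the truncated line: the jar array tells
        // Spark to ship Test.jar to the workers when the context starts.
        public static final JavaSparkContext batchSparkContext =
            new JavaSparkContext(master, "BatchApp", sparkHome, new String[] { jar });
    }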

Re: Submitting Spark application through code

2014-10-28 Thread Akhil Das
And the Scala way of doing it would be:

    val sc = new SparkContext(conf)
    sc.addJar("/full/path/to/my/application/jar/myapp.jar")

On Wed, Oct 29, 2014 at 1:44 AM, Shailesh Birari wrote:
> Yes, this is doable.
> I am submitting the Spark job using
> JavaSparkContext spark = new JavaSparkCo…

Re: Submitting Spark application through code

2014-10-28 Thread Shailesh Birari
Yes, this is doable. I am submitting the Spark job using:

    JavaSparkContext spark = new JavaSparkContext(sparkMaster, "app name",
        System.getenv("SPARK_HOME"), new String[] {"application JAR"});

To run this, first you have to create the application jar, and in the above API specify…
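An alternative, shown here as a hedged sketch, is to carry the same settings on a SparkConf (setMaster, setSparkHome, and setJars are standard Spark 1.x setters); the master URL and jar path are placeholders:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ConfSubmit {
        public static void main(String[] args) {
            // Same effect as the constructor above, but configured via SparkConf.
            SparkConf conf = new SparkConf()
                .setMaster("spark://master-host:7077")               // placeholder
                .setAppName("app name")
                .setSparkHome(System.getenv("SPARK_HOME"))
                .setJars(new String[] {"/path/to/application.jar"}); // placeholder

            JavaSparkContext spark = new JavaSparkContext(conf);
        }
    }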

Re: Submitting Spark application through code

2014-10-28 Thread Matt Narrell
Can this be done? Can I just spin up a SparkContext programmatically, point it at my yarn-cluster, and have it work like spark-submit? Doesn't (at least) the application JAR need to be distributed to the workers via HDFS or the like for the jobs to run?

mn

> On Oct 28, 2014, at 2:29 AM, Akhi…

Re: Submitting Spark application through code

2014-10-28 Thread sivarani
Hi, I know we can create a streaming context with new JavaStreamingContext(master, appName, batchDuration, sparkHome, jarFile), but to run the application we will have to use spark-home/spark-submit --class NetworkCount. I want to skip submitting manually; I want to invoke this Spark app when a conditio…
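A minimal sketch of that idea, assuming the Spark 1.x constructor named above; the master URL, paths, and batch interval are taken from elsewhere in this thread, and the trigger condition is left as a comment since the message cuts off:

    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class NetworkCount {
        public static void main(String[] args) {
            // Create the streaming context directly instead of going through
            // spark-submit; the jar is shipped to workers via the constructor.
            JavaStreamingContext jssc = new JavaStreamingContext(
                "spark://sivarani:7077",                   // master
                "NetworkCount",                            // app name
                new Duration(5000),                        // 5-second batches
                "/home/sivarani/spark-1.0.2-bin-hadoop2/", // sparkHome
                "/home/sivarani/build/Test.jar");          // application jar

            // ... define input streams and output operations here,
            // and gate jssc.start() on whatever condition should trigger it ...

            jssc.start();            // begin processing
            jssc.awaitTermination(); // block until stopped
        }
    }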

Re: Submitting Spark application through code

2014-10-28 Thread Akhil Das
How about directly running it?

    val ssc = new StreamingContext("local[2]", "Network WordCount", Seconds(5),
      "/home/akhld/mobi/localclusterxx/spark-1")
    val lines = ssc.socketTextStream("localhost", 12345)
    val words = lines.flatMap(_.split(" "))
    val wordCounts = words.map(x => (x…
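Since the snippet above cuts off mid-expression, here is a hedged Java rendering of the standard network word count it appears to be quoting; the host, port, and batch interval follow the snippet, while everything after the map is the canonical example rather than the original message:

    import java.util.Arrays;
    import scala.Tuple2;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaPairDStream;
    import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class NetworkWordCount {
        public static void main(String[] args) throws Exception {
            // Local two-thread master with 5-second batches, mirroring the
            // Scala snippet (sparkHome omitted since local mode needs none).
            JavaStreamingContext ssc = new JavaStreamingContext(
                "local[2]", "Network WordCount", new Duration(5000));

            JavaReceiverInputDStream<String> lines =
                ssc.socketTextStream("localhost", 12345);
            JavaDStream<String> words =
                lines.flatMap(x -> Arrays.asList(x.split(" ")));

            // Canonical continuation of the truncated map: pair each word
            // with 1 and sum the counts per word within each batch.
            JavaPairDStream<String, Integer> wordCounts = words
                .mapToPair(w -> new Tuple2<>(w, 1))
                .reduceByKey((a, b) -> a + b);

            wordCounts.print();
            ssc.start();
            ssc.awaitTermination();
        }
    }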