Launching multiple spark jobs within a main spark job.

2016-12-20 Thread Naveen
Hi Team, Is it okay to spawn multiple Spark jobs within a main Spark job? My main Spark job's driver, which was launched on the YARN cluster, will do some preprocessing and, based on it, needs to launch multiple Spark jobs on the YARN cluster. Not sure if this is the right pattern. Please share your thoughts. S

Re: Launching multiple spark jobs within a main spark job.

2016-12-21 Thread Naveen
> Anyway, if you run a Spark application you would have multiple jobs, which makes sense, so it is not a problem. Thanks David. From: Naveen [mailto:hadoopst...@gmail.com] Sent: Wednesday, December 21, 2016 9:18 AM To: dev@spark.apache.o
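A tiny sketch of the point quoted above, that a single Spark application naturally runs multiple jobs (one per action); the app name and data here are made up for illustration:

import org.apache.spark.{SparkConf, SparkContext}

object MultiJobApp {
  def main(args: Array[String]): Unit = {
    // One application, one SparkContext; each action below triggers its own job.
    val sc = new SparkContext(new SparkConf().setAppName("multi-job-app").setMaster("local[*]"))
    val rdd = sc.parallelize(1 to 1000)
    val total = rdd.sum()                       // action #1 -> job 1
    val evens = rdd.filter(_ % 2 == 0).count()  // action #2 -> job 2
    println(s"sum=$total, evens=$evens")
    sc.stop()
  }
}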

Re: Launching multiple spark jobs within a main spark job.

2016-12-21 Thread Naveen
launching the jobs? You can use SparkLauncher in a normal app and just listen for state transitions. On Wed, 21 Dec 2016, 11:44 Naveen, wrote: Hi Team, Thanks for your responses. Let me give more details in a picture of how I am trying
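A minimal sketch of the SparkLauncher approach suggested here: launch a child Spark application from a plain JVM program and listen for state transitions. The jar path, main class, and YARN settings are placeholders, not values from this thread:

import java.util.concurrent.CountDownLatch
import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

object ChildJobLauncher {
  def main(args: Array[String]): Unit = {
    val done = new CountDownLatch(1)
    val handle = new SparkLauncher()
      .setAppResource("/path/to/child-job.jar")   // placeholder
      .setMainClass("com.example.ChildJob")       // placeholder
      .setMaster("yarn")
      .setDeployMode("cluster")
      .startApplication(new SparkAppHandle.Listener {
        override def stateChanged(h: SparkAppHandle): Unit = {
          println(s"Child job state: ${h.getState}")
          if (h.getState.isFinal) done.countDown()
        }
        override def infoChanged(h: SparkAppHandle): Unit = ()
      })
    done.await()  // block until the child application reaches a final state
    println(s"Child app id: ${handle.getAppId}, final state: ${handle.getState}")
  }
}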

Re: Launching multiple spark jobs within a main spark job.

2016-12-21 Thread Naveen
whether these spawned SparkContexts will get different nodes / executors from the resource manager? On Wed, Dec 21, 2016 at 6:43 PM, Naveen wrote: Hi Sebastian, Yes, for fetching the details from Hive and HBase, I would want to use Spark's HiveContext etc. However, based on your point,
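A rough sketch of the Hive lookup mentioned here, using the Spark 1.x HiveContext API that was current at the time of the thread (requires the spark-hive module); the table name is a placeholder:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("hive-lookup"))
val hiveContext = new HiveContext(sc)

// Placeholder query; returns a DataFrame backed by the Hive metastore.
val lookup = hiveContext.sql("SELECT key, value FROM some_db.lookup_table")
lookup.show()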

Re: Launching multiple spark jobs within a main spark job.

2016-12-24 Thread Naveen
Thanks Liang, Vadim, and everyone for your inputs!! With this clarity, I've tried client mode for both the main and sub Spark jobs. Every main Spark job and its corresponding threaded Spark jobs are coming up on the YARN applications list, and the jobs are getting executed properly. I now need to test
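A sketch of the threaded pattern described here: the main driver launches several child Spark applications concurrently and waits for all of them to finish. The jar path and class names are placeholders, not the code from the thread:

import org.apache.spark.launcher.SparkLauncher
import scala.concurrent.{Await, Future, blocking}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration

// Placeholder main classes for the sub-jobs.
val childClasses = Seq("com.example.JobA", "com.example.JobB", "com.example.JobC")

val launches = childClasses.map { cls =>
  Future {
    val proc = new SparkLauncher()
      .setAppResource("/path/to/child-jobs.jar")  // placeholder
      .setMainClass(cls)
      .setMaster("yarn")
      .setDeployMode("client")   // the poster reports testing client mode
      .launch()                  // spawns a spark-submit child process
    blocking { proc.waitFor() }  // exit code of that child application
  }
}

val exitCodes = Await.result(Future.sequence(launches), Duration.Inf)
println(s"Child job exit codes: $exitCodes")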

Re: Integrating ML/DL frameworks with Spark

2018-05-08 Thread Naveen Swamy
with data parallelism -- how can we leverage Spark's map-reduce model to fit distributed training? The model of execution here is more iterative in nature. Please let me know. Thanks, Naveen On Tue, May 8, 2018 at 8:53 AM, Shivaram Venkataraman < shiva...@eecs.berkeley.edu> wrote:
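Not from the thread, but a rough sketch of the map-reduce style of data parallelism being contrasted with iterative DL training: one synchronous step that computes a partial gradient per partition and sums it with treeAggregate. The per-example gradient, model size, and learning rate are stand-ins:

import org.apache.spark.rdd.RDD

def trainOneStep(data: RDD[Array[Double]], weights: Array[Double], lr: Double): Array[Double] = {
  val n = weights.length
  val bcWeights = data.sparkContext.broadcast(weights)

  // Each task folds its partition into a partial gradient; treeAggregate sums the partials.
  val (gradSum, count) = data.treeAggregate((Array.fill(n)(0.0), 0L))(
    seqOp = { case ((grad, c), point) =>
      // Stand-in per-example gradient: difference between the example and the weights.
      var i = 0
      while (i < n) { grad(i) += point(i) - bcWeights.value(i); i += 1 }
      (grad, c + 1)
    },
    combOp = { case ((g1, c1), (g2, c2)) =>
      var i = 0
      while (i < n) { g1(i) += g2(i); i += 1 }
      (g1, c1 + c2)
    }
  )

  // One driver-side update per pass over the data: the synchronous, batch-oriented
  // execution model being contrasted with iterative DL training loops.
  weights.indices.map(i => weights(i) + lr * gradSum(i) / count).toArray
}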

Starting with Spark

2014-12-24 Thread Naveen Madhire
Hi All, I am starting to use Spark. I am having trouble getting the latest code from git. I am using IntelliJ as suggested in the link below, https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-StarterTasks The link below isn't working either, http://sp

Re: Starting with Spark

2014-12-25 Thread Naveen Madhire
The correct docs link is: https://spark.apache.org/docs/1.2.0/building-spark.html Where did you get that bad link from? Nick On Thu Dec 25 2014 at 12:00:53 AM Naveen Madhire wrote: Hi All, I am starting to use Spark.

Spark Error - Failed to locate the winutils binary in the hadoop binary path

2014-12-28 Thread Naveen Madhire
Thanks for help -Naveen

Re: Spark Error - Failed to locate the winutils binary in the hadoop binary path

2014-12-28 Thread Naveen Madhire
-windows-7 Now it is working fine. Thanks all. On Sun, Dec 28, 2014 at 6:10 PM, Naveen Madhire wrote: Hi All, I am getting the below error while running a simple Spark application from Eclipse. I am using Eclipse, Maven, Java. I've Spark running lo
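For reference, the workaround commonly pointed to for this error is to make hadoop.home.dir (or HADOOP_HOME) point at a folder whose bin directory contains winutils.exe before the SparkContext is created. A minimal sketch, with a placeholder path:

import org.apache.spark.{SparkConf, SparkContext}

object WinutilsWorkaround {
  def main(args: Array[String]): Unit = {
    // Placeholder path; bin\winutils.exe must exist under it.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")
    val sc = new SparkContext(new SparkConf().setAppName("winutils-check").setMaster("local[*]"))
    println(sc.parallelize(1 to 10).sum())
    sc.stop()
  }
}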

Spark 1.2.0 build error

2014-12-28 Thread Naveen Madhire
2.0 Thanks Naveen

Re: Spark 1.2.0 build error

2014-12-29 Thread Naveen Madhire
the test failure. This would have been logged earlier. You would need to say how you ran the tests too. The tests for 1.2.0 pass for me on several common permutations. On Dec 29, 2014 3:22 AM, "Naveen Madhire" wrote: Hi, I am following the below li

Sample Spark Program Error

2014-12-30 Thread Naveen Madhire
Hi All, I am trying to run a sample Spark program using Scala and SBT. Below is the program: def main(args: Array[String]) { val logFile = "E:/ApacheSpark/usb/usb/spark/bin/README.md" // Should be some file on your system val sc = new SparkContext("local", "Simple App", "E:/ApacheSpark/
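The snippet is cut off; a complete, self-contained version of this kind of simple app (following the standard quick-start pattern, reusing the local README path from the message) might look like:

import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val logFile = "E:/ApacheSpark/usb/usb/spark/bin/README.md"  // should be some file on your system
    val conf = new SparkConf().setAppName("Simple App").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(_.contains("a")).count()
    val numBs = logData.filter(_.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")

    sc.stop()
  }
}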