> View this message in context: http://apache-spark-
> developers-list.1001551.n3.nabble.com/Launching-multiple-
> spark-jobs-within-a-main-spark-job-tp20311p20327.html
> Sent from the Apache Spark Developers List mailing list archive at
> Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>
>>>> val sparkJob = new SparkLauncher(...).launch(); sparkJob.waitFor()
>>>> }
>>>>
>>>> Similarly, future2 to futureN.
>>>>
>>>> future1.onComplete{...}
>>>> }
>>>>
>>>> }// end of mainsparkjob
>>>>
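A minimal, self-contained sketch of the launch-and-wait pattern described above, written in Java since SparkLauncher itself is a Java API. Here `runChildJob` is a hypothetical stand-in for the real launcher call, so the sketch runs without a Spark installation:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

public class LaunchSketch {
    // Stand-in for launching one child application. With spark-launcher the
    // body would be roughly:
    //   Process p = new SparkLauncher().setAppResource("child.jar").launch();
    //   return p.waitFor();
    // It is simulated here so the sketch needs no Spark installation.
    static int runChildJob(String name) {
        return 0; // pretend the child application exited cleanly
    }

    public static void main(String[] args) {
        // One CompletableFuture per child job; they run concurrently on the
        // common pool, mirroring the future1..futureN in the thread above.
        List<CompletableFuture<Integer>> futures =
                List.of("job1", "job2", "job3").stream()
                    .map(n -> CompletableFuture.supplyAsync(() -> runChildJob(n)))
                    .collect(Collectors.toList());

        // Block until every child finishes, then inspect the exit codes.
        boolean allOk = futures.stream()
                              .map(CompletableFuture::join)
                              .allMatch(code -> code == 0);
        System.out.println(allOk ? "all children succeeded" : "a child failed");
    }
}
```

Note that the driver process doing this orchestration must stay alive until every child finishes, which is why each future joins on the child's exit code.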
> On Wed, Dec 21, 2016 at 3:13 PM, David Hodeffi <
> david.hode...@niceactimize.com> wrote:
>
> I am not aware of any problem with that.
>
> Anyway, if you run a Spark application you would have multiple jobs,
> which suggests that it is not a problem.
>
>
>
> Thanks David.
>
>
>
> *From:* Naveen [mailto:hadoop...]
> *To:* dev@spark.apache.org; user@spark.apache.org
> *Subject:* Launching multiple spark jobs within a main spark job.
>
> Hi Team,
>
> Is it OK to spawn multiple Spark jobs within a main Spark job? My main
> Spark job's driver, which was launched on a YARN cluster, will do some
> preprocessing and, based on that, needs to launch multiple Spark jobs on
> the YARN cluster. I am not sure if this is the right pattern.
>
> Please share your thoughts.