Cc: Igor Berman; user@spark.apache.org
Subject: Re: spark-submit issue
Can you not use the spark-jobserver instead? Just submit your job to the job
server, which already has the SparkContext initialized in it; I think that would
make it much easier.
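For what it's worth, submitting to a running job server is just a couple of HTTP
calls. A rough, untested sketch in Java, assuming the default port 8090 and the
/jars and /jobs endpoints shown in the spark-jobserver README (the jar path, app
name and job class below are only placeholders):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Scanner;

public class JobServerSubmit {
    public static void main(String[] args) throws Exception {
        // 1) upload the job jar once: POST /jars/<appName>
        post("http://localhost:8090/jars/myapp",
             Files.readAllBytes(Paths.get("target/my-spark-job.jar")));
        // 2) run a job on the server's long-lived SparkContext: POST /jobs?appName=...&classPath=...
        post("http://localhost:8090/jobs?appName=myapp&classPath=com.example.MyJob",
             "input.path = /data/sample.txt".getBytes(StandardCharsets.UTF_8));
    }

    // small helper: POST a body and print the job server's JSON response
    private static void post(String url, byte[] body) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }
        try (Scanner in = new Scanner(conn.getInputStream(), "UTF-8")) {
            while (in.hasNextLine()) {
                System.out.println(in.nextLine());
            }
        }
    }
}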
Thanks
Best Regards
On Mon, Aug 31, 2015 at 2:16 PM, Pranay Tonpay wrote:
thx
pranay
From: Igor Berman
Sent: Monday, August 31, 2015 1:39 PM
To: Pranay Tonpay
Cc: user@spark.apache.org
Subject: Re: spark-submit issue
1. think once again if you want to call spark-submit in such a way... I'm not sure
why you do it, but please consider just opening a SparkContext in your program instead...
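Concretely, opening the context directly would look something like this (a minimal
sketch using Spark's Java API; the app name, master URL and input path are just
placeholders):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class InProcessDriver {
    public static void main(String[] args) {
        // build the SparkContext in this JVM instead of exec'ing spark-submit
        SparkConf conf = new SparkConf()
                .setAppName("in-process-example")
                .setMaster("local[*]");   // or your cluster's master URL
        JavaSparkContext sc = new JavaSparkContext(conf);
        try {
            long lines = sc.textFile("/tmp/input.txt").count();
            System.out.println("line count: " + lines);
        } finally {
            sc.stop();   // always close the context when the pipeline is done
        }
    }
}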
... and just tailed "2" (stderr) and the process immediately exits.
From: Igor Berman
Sent: Monday, August 31, 2015 12:41 PM
To: Pranay Tonpay
Cc: user
Subject: Re: spark-submit issue
it might be that you need to drain stdout/stderr of the subprocess... otherwise the subprocess can deadlock:
http://stackoverflow.com/questions/3054531/correct-usage-of-processb
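In the calling program that would be roughly the following (a sketch, assuming the
child is started with ProcessBuilder; /temp is the wrapper script from this thread):

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class SubmitAndDrain {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("/temp");   // the script that wraps spark-submit
        pb.redirectErrorStream(true);                      // merge stderr into stdout so one reader suffices
        Process p = pb.start();

        // drain the output; if nothing reads it, the pipe buffer fills up,
        // the child blocks on a write, and the launcher looks hung
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        }
        int exit = p.waitFor();   // returns once the subprocess really finishes
        System.out.println("spark-submit exited with " + exit);
    }
}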
Cc: user@spark.apache.org
Subject: Re: spark-submit issue
You can also add a System.exit(0) after the sc.stop.
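i.e. the tail of the driver's main would be something like this (a sketch; the forced
exit just guards against stray non-daemon threads keeping the JVM alive after the
job finishes):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class DriverWithExplicitExit {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(
                new SparkConf().setAppName("exit-example").setMaster("local[*]"));
        try {
            sc.parallelize(java.util.Arrays.asList(1, 2, 3)).count();   // the real pipeline goes here
        } finally {
            sc.stop();      // stop the context first
        }
        System.exit(0);     // then force the JVM (and hence spark-submit) to terminate
    }
}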
On 30 Aug 2015 23:55, "Pranay Tonpay" <pranay.ton...@impetus.co.in> wrote:
yes, the context is being closed at the end.
From: Akhil Das
Sent: Sunday, August 30, 2015 9:03 AM
To: Pranay Tonpay
Cc: user@spark.apache.org
Subject: Re: spark-submit issue
Did you try putting a sc.stop at the end of your pipeline?
Thanks
Best Regards
On Thu, Aug 27, 2015 at 6:41 PM, pranay wrote:
I have a java program that does this - (using Spark 1...)
... can't detect the execution end...
If I run the /temp file independently, things work fine... only when I trigger the
/temp script inside Runtime.exec does this issue occur... Any comments?
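Presumably the launch in question is something of this shape (only a guess at the
setup; the /temp script contents and paths below are placeholders):

public class LaunchViaScript {
    public static void main(String[] args) throws Exception {
        // /temp is assumed to be a shell script wrapping a spark-submit call, e.g.:
        //   #!/bin/sh
        //   spark-submit --class com.example.MyJob --master yarn-client /path/to/job.jar
        Process p = Runtime.getRuntime().exec("/temp");

        // without draining stdout/stderr here, waitFor() can block indefinitely
        // once the pipe buffers fill, matching the "can't detect the execution end" symptom
        int exit = p.waitFor();
        System.out.println("exit code: " + exit);
    }
}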
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/spark-submit-issue-tp24474.htm