Some additional information: the main process shares jar files with the Spark
job's driver and executors via their classpaths. It couldn't be a read/write
lock on those files, could it?

--- Original Message ---
From: "Ted Yu" <yuzhih...@gmail.com>
Date: 2015-10-30 11:46:21
To: <yuhang.c...@foxmail.com>
Cc: "jey" <j...@cs.berkeley.edu>; "user" <user@spark.apache.org>
Subject: Re: SparkLauncher is blocked until main process is killed.


Not much clue from the snippet on screen.

Is it possible to pastebin the whole jstack output?


On Thu, Oct 29, 2015 at 7:58 PM, <yuhang.c...@foxmail.com> wrote:


Here's a part of the jstack output.

The release is 1.5.1.

--- Original Message ---
From: "Ted Yu" <yuzhih...@gmail.com>
Date: 2015-10-30 10:11:34
To: "jey" <j...@cs.berkeley.edu>
Cc: <yuhang.c...@foxmail.com>; "user" <user@spark.apache.org>
Subject: Re: SparkLauncher is blocked until main process is killed.


Which Spark release are you using?

Please note the typo in the email subject (corrected as of this reply).

On Thu, Oct 29, 2015 at 7:00 PM, Jey Kottalam <j...@cs.berkeley.edu> wrote:
Could you please provide the jstack output? That would help the devs identify 
the blocking operation more easily.


On Thu, Oct 29, 2015 at 6:54 PM, <yuhang.c...@foxmail.com> wrote:
I tried to use SparkLauncher (org.apache.spark.launcher.SparkLauncher) to
submit a Spark Streaming job; however, in my test, the SparkSubmit process got
stuck in the "addJar" procedure. Only when the main process (the caller of
SparkLauncher) was killed did the submit procedure continue to run. I ran
jstack on the process, and it appears Jetty was blocking it; I'm pretty sure
there were no port conflicts.


The environment is RHEL (Red Hat Enterprise Linux) 6u3 x64, and Spark runs in
standalone mode.
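
For reference, here is a minimal sketch of the launch pattern described above,
assuming the Spark 1.5.x SparkLauncher API. The Spark home, jar path, main
class, and master URL are placeholders, not my real values. The
stream-draining threads are an assumption on my part: a launched child process
can block once its stdout/stderr pipe buffers fill if the parent never reads
them, which could look like a hang; this is a guess, not a confirmed
diagnosis.

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;

import org.apache.spark.launcher.SparkLauncher;

public class LaunchSketch {

    // Drain one of the child's output streams on a daemon thread. If the
    // parent never reads these pipes, the child may block on a write once
    // the OS pipe buffer fills up (assumed cause, not confirmed).
    private static void drain(final InputStream in, final String tag) {
        Thread t = new Thread(new Runnable() {
            @Override
            public void run() {
                BufferedReader reader = new BufferedReader(new InputStreamReader(in));
                try {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        System.out.println(tag + line);
                    }
                } catch (Exception e) {
                    // Child process exited; nothing left to read.
                }
            }
        });
        t.setDaemon(true);
        t.start();
    }

    public static void main(String[] args) throws Exception {
        // Placeholder paths, class name, and master URL.
        Process spark = new SparkLauncher()
                .setSparkHome("/opt/spark")
                .setAppResource("/opt/jobs/streaming-job.jar")
                .setMainClass("com.example.StreamingJob")
                .setMaster("spark://master:7077") // standalone mode
                .launch();

        drain(spark.getInputStream(), "[stdout] ");
        drain(spark.getErrorStream(), "[stderr] ");

        int exitCode = spark.waitFor();
        System.out.println("SparkSubmit exited with code " + exitCode);
    }
}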


Did this happen to any of you?
