I need to run Spark jobs as a service in my project, so it has a 
"ServiceManager" component that uses 
SparkLauncher (org.apache.spark.launcher.SparkLauncher) to submit Spark jobs.


First, I wrote a demo that put only the SparkLauncher code in main and ran it 
with java -jar; it worked fine.
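For reference, the demo is essentially a sketch like the following (the jar path, main class, and master URL here are hypothetical placeholders, not the real project values):

```java
import org.apache.spark.launcher.SparkLauncher;

public class LauncherDemo {
    public static void main(String[] args) throws Exception {
        // All paths and names below are placeholders.
        Process spark = new SparkLauncher()
            .setAppResource("/path/to/my-spark-job.jar")  // hypothetical jar
            .setMainClass("com.example.MySparkJob")       // hypothetical class
            .setMaster("yarn")
            .launch();
        int exitCode = spark.waitFor();
        System.out.println("SparkSubmit exited with " + exitCode);
    }
}
```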


Then I started my ServiceManager and ran the demo alongside it; it was still 
OK, which suggests there are no port or file conflicts between the 
ServiceManager and the Spark job.


After that, I copied the demo code into my ServiceManager as a method, then 
started the ServiceManager and called that method. This time, however, the 
SparkSubmit process got stuck in or after the "addJars" step.


And here is the strangest part: as soon as I killed the ServiceManager 
process, the SparkSubmit process immediately resumed. It seems that when 
SparkLauncher is invoked from my ServiceManager, the child process gets blocked.
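One common cause of exactly this symptom (a guess, not confirmed from your description): launch() returns a plain java.lang.Process, and if the parent never reads the child's stdout/stderr, the OS pipe buffer eventually fills and SparkSubmit blocks on its next write. Killing the parent breaks the pipes, which is why the child suddenly continues. A JDK-only sketch of the draining pattern, where the echo command is a stand-in for the SparkSubmit child:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class DrainDemo {
    // Drain a process stream on a background thread so the child
    // never blocks on a full pipe buffer.
    static Thread drain(InputStream in, StringBuilder sink) {
        Thread t = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    synchronized (sink) { sink.append(line).append('\n'); }
                }
            } catch (IOException ignored) { }
        });
        t.setDaemon(true);
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for SparkLauncher.launch(): any child writing to stdout.
        Process p = new ProcessBuilder("echo", "hello from child").start();
        StringBuilder out = new StringBuilder();
        Thread t = drain(p.getInputStream(), out);
        drain(p.getErrorStream(), new StringBuilder());
        int code = p.waitFor();
        t.join();
        System.out.println("exit=" + code + " output=" + out.toString().trim());
    }
}
```

If this is the cause, draining both getInputStream() and getErrorStream() of the Process returned by launch() should unblock it. Depending on your Spark version, SparkLauncher also offers redirect methods (e.g. redirectError / redirectOutput) and startApplication(), which avoid managing the raw Process yourself.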


This has me confused. Do you have any idea why this happens?


Thanks a lot
