Curious. I guess the first question is whether we've got some sort of Listener/UI error so that the UI is not accurately reflecting the Executor's actual state, or whether the "LOADING" Executor really is spending a considerable length of time in this "I'm in the process of being created, but not yet doing anything useful" state.
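For what it's worth, here is a minimal sketch of the lifecycle being described in the quoted thread below. This is purely illustrative Scala, not the actual Spark source; the Loading/Running/Exited names and the transition helper are invented for the example. The point it illustrates is that an executor is announced in LOADING and should move to RUNNING almost immediately once it has registered, so lingering in LOADING is suspicious.

object ExecutorLifecycleSketch {

  sealed trait State
  case object Loading extends State // process launched, not yet registered with the driver
  case object Running extends State // fully registered, executing tasks
  case object Exited  extends State // finished, killed, or failed

  // LOADING is expected to be brief: the only normal forward move is to RUNNING,
  // with EXITED covering executors that die before or after registering.
  def transition(current: State, next: State): Either[String, State] =
    (current, next) match {
      case (Loading, Running) => Right(Running)
      case (Loading, Exited)  => Right(Exited) // e.g. failed before it could register
      case (Running, Exited)  => Right(Exited)
      case _                  => Left(s"illegal transition: $current -> $next")
    }

  def main(args: Array[String]): Unit = {
    println(transition(Loading, Running)) // Right(Running)
    println(transition(Running, Loading)) // Left(illegal transition: Running -> Loading)
  }
}

In that (sketched) model, an executor stuck in LOADING either never completed registration or the UI never observed the LOADING-to-RUNNING event, which is exactly the two possibilities above.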
If you can figure out a little more of what is going on or how to reproduce this state, please do file a JIRA.

On Mon, Feb 2, 2015 at 8:28 AM, Ami Khandeshi <ami.khande...@gmail.com> wrote:

> Yes
>
> On Monday, February 2, 2015, Mark Hamstra <m...@clearstorydata.com> wrote:
>
>> LOADING is just the state in which new Executors are created, before they
>> have everything they need and are fully registered; they then transition to
>> state RUNNING and begin doing actual work:
>>
>> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala#L351
>> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/client/AppClient.scala#L133
>>
>> LOADING should be a fairly brief, transitory state. Are you seeing
>> Executors remaining in LOADING for a significant length of time?
>>
>> On Mon, Feb 2, 2015 at 7:56 AM, akhandeshi <ami.khande...@gmail.com> wrote:
>>
>>> I am not sure what the LOADING status means, as opposed to RUNNING. In the
>>> application UI, I see:
>>>
>>> Executor Summary
>>>
>>> ExecutorID  Worker                                                         Cores  Memory (MB)  State    Logs
>>> 1           worker-20150202144112-hadoop-w-1.c.fi-mdd-poc.internal-38874   16     83971        LOADING  stdout stderr
>>> 0           worker-20150202144112-hadoop-w-2.c.fi-mdd-poc.internal-58685   16     83971        RUNNING  stdout stderr
>>>
>>> Looking at the worker hadoop-w-2, I see the executor's status there is
>>> "LOADING". Why the different statuses, and what do they mean?
>>>
>>> Please see below for details:
>>>
>>> ID: worker-20150202144112-hadoop-w-2.c.fi-mdd-poc.internal-58685
>>> Master URL: spark://hadoop-m:7077
>>> Cores: 16 (16 Used)
>>> Memory: 82.0 GB (82.0 GB Used)
>>>
>>> Running Executors (1)
>>>
>>> ExecutorID  Cores  State    Memory   Job Details                           Logs
>>> 0           16     LOADING  82.0 GB  ID: app-20150202152154-0001          stdout stderr
>>>                                      Name: Simple File Merge Application
>>>                                      User: hadoop
>>>
>>> Thank you,
>>>
>>> Ami