Sadly, no.

The only evidence I have is the master's log, which shows that the driver
launch was requested:

15/12/09 18:25:06 INFO Master: Driver submitted org.apache.spark.deploy.worker.DriverWrapper
15/12/09 18:25:06 INFO Master: Launching driver driver-20151209182506-0164 on worker worker-20151209181534-172.31.31.159-7077
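
If it happens again I'll check whether the DriverWrapper process actually
exists on the worker and, if so, grab a stack dump with something like this
(the pid being whatever jps reports for the DriverWrapper, assuming the
process shows up at all):

    jps -l | grep org.apache.spark.deploy.worker.DriverWrapper
    jstack <pid> > driver-jstack.txt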

2015-12-09 14:19 GMT-06:00 Ted Yu <yuzhih...@gmail.com>:

> When this happened, did you have a chance to take a jstack of the stuck
> driver process?
>
> Thanks
>
> On Wed, Dec 9, 2015 at 11:38 AM, andresb...@gmail.com <
> andresb...@gmail.com> wrote:
>
>> Forgot to mention that it doesn't happen every time; it's pretty random
>> so far. We've had complete days when it behaves just fine and others when
>> it gets crazy. We're using Spark 1.5.2.
>>
>> 2015-12-09 13:33 GMT-06:00 andresb...@gmail.com <andresb...@gmail.com>:
>>
>>> Hi everyone,
>>>
>>> We've been hitting an issue with Spark lately where multiple drivers are
>>> assigned to the same worker but resources are never assigned to them, so
>>> they get "stuck" forever.
>>>
>>> If I log in to the worker machine I see that the driver processes aren't
>>> actually running, and the worker's logs don't show any error or anything
>>> related to the driver. The master UI does show the drivers as submitted and
>>> in RUNNING state.
>>>
>>>
>>> Not sure where else to look for clues, any ideas?
>>>
>>> --
>>> Andrés Blanco Morales
>>>
>>
>>
>>
>> --
>> Andrés Blanco Morales
>>
>
>


-- 
Andrés Blanco Morales
