Hi,
I am running experiments in which not all Map tasks should run to completion.
My application fails/kills the running Map tasks when a condition becomes
true, and the Reduce tasks should continue processing without waiting for
output from the failed/killed tasks.
I was able to fa…
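For what it's worth, one way to get that behaviour with stock APIs is to fail the attempt from inside the mapper and tell the job to tolerate failed maps, so the reducers finish with whatever the surviving maps produced. A minimal sketch, assuming the new (org.apache.hadoop.mapreduce) API; the class name and the condition are placeholders, not from the original setup:

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Hypothetical mapper that aborts its own attempt when a condition holds.
    public class ConditionalMapper
        extends Mapper<LongWritable, Text, Text, LongWritable> {
      @Override
      protected void map(LongWritable key, Text value, Context context)
          throws IOException, InterruptedException {
        if (shouldAbort(value)) {  // stand-in for the experiment's condition
          throw new IOException("Condition met; failing this map attempt on purpose");
        }
        context.write(value, new LongWritable(1));
      }

      private boolean shouldAbort(Text value) {
        return false;  // placeholder; replace with the real trigger
      }
    }

On the driver side, setting mapreduce.map.maxattempts to 1 stops the framework from retrying the deliberately failed attempts, and mapreduce.map.failures.maxpercent (say, 50) lets the job, and therefore its reducers, succeed despite them. Attempts can also be killed from outside with mapred job -kill-task (or -fail-task) on the attempt id.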
> …ing of map/reduce tasks on YARN? Can someone point me to the
> Hadoop 2.x source where the data block location is used to calculate
> node/container/task assignment (if that's still happening)?
>
> -bc
Hi Brad,
Were you able to find an answer to your question?
Sultan
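For reference, as far as I can tell the locality-aware assignment in 2.x lives in the MapReduce ApplicationMaster: org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator (in hadoop-mapreduce-client-app) turns the hosts from InputSplit.getLocations() into locality-tagged container requests to the ResourceManager. A rough sketch of what such a request looks like through the public AMRMClient API (illustrative hostnames; this is not the AM's actual code):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.yarn.api.records.Priority;
    import org.apache.hadoop.yarn.api.records.Resource;
    import org.apache.hadoop.yarn.client.api.AMRMClient;
    import org.apache.hadoop.yarn.client.api.AMRMClient.ContainerRequest;

    // Sketch: how an ApplicationMaster expresses data locality to the RM.
    // In a real MR job the nodes/racks come from the input split's block locations.
    public class LocalityRequestSketch {
      public static void main(String[] args) throws Exception {
        AMRMClient<ContainerRequest> amClient = AMRMClient.createAMRMClient();
        amClient.init(new Configuration());
        amClient.start();
        // (a real AM registers with the RM via registerApplicationMaster first)

        Resource capability = Resource.newInstance(1024, 1); // 1 GB, 1 vcore
        Priority priority = Priority.newInstance(20); // RMContainerAllocator uses 20 for ordinary maps

        String[] nodes = {"datanode1.example.com"};  // hosts holding the split's blocks
        String[] racks = {"/default-rack"};

        // The scheduler tries node-local, then rack-local, then off-switch.
        amClient.addContainerRequest(
            new ContainerRequest(capability, nodes, racks, priority));

        // ... allocate() heartbeats would carry the request to the RM ...
        amClient.stop();
      }
    }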
Hi,
New tasks in Hadoop always have higher priority than speculative tasks.
Can anyone tell me how and where I can change this priority?
Regards,
Sultan
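As far as I can tell that ordering is not a configuration knob: it falls out of how the MR ApplicationMaster hands containers to attempts (RMContainerAllocator again, plus the Speculator that decides when speculative attempts exist at all), so changing it likely means patching or replacing those classes. What is tunable without patching is whether speculation runs and which speculator implementation is used; a small sketch with the stock 2.x keys:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.MRJobConfig;

    // Sketch: the speculation-related knobs that are plain configuration.
    public class SpeculationConfigSketch {
      public static void main(String[] args) {
        Configuration conf = new Configuration();

        // Turn speculative attempts on or off per phase.
        conf.setBoolean(MRJobConfig.MAP_SPECULATIVE, true);     // mapreduce.map.speculative
        conf.setBoolean(MRJobConfig.REDUCE_SPECULATIVE, true);  // mapreduce.reduce.speculative

        // The speculator is pluggable; a custom implementation could change
        // when speculative attempts get issued.
        conf.set(MRJobConfig.MR_AM_JOB_SPECULATOR,  // yarn.app.mapreduce.am.job.speculator.class
            "org.apache.hadoop.mapreduce.v2.app.speculate.DefaultSpeculator");
      }
    }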