My Hadoop version is 2.2.0, and my Spark version is 1.2.0.
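
For reference, a rough sketch of how YARN node labels could steer a new job away from busy nodes. This assumes a later Hadoop release (node labels from YARN-796/YARN-2492 shipped in Hadoop 2.6.0, so this would not work on 2.2.0) and a Spark release with node-label support (the `nodeLabelExpression` properties appeared well after 1.2.0); the label name and hostname below are hypothetical:

```shell
# Define a cluster node label and attach it to the nodes that are free
# for new work (hostname is a placeholder):
yarn rmadmin -addToClusterNodeLabels spark_batch
yarn rmadmin -replaceLabelsOnNode "node1.example.com=spark_batch"

# A Spark release with node-label support could then restrict the AM and
# executors to labeled nodes when submitting a new job:
spark-submit \
  --conf spark.yarn.am.nodeLabelExpression=spark_batch \
  --conf spark.yarn.executor.nodeLabelExpression=spark_batch \
  ...
```

This only relocates where containers are scheduled; it does not free local space on the nodes running the streaming job.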

2015-03-14 17:22 GMT+08:00 Ted Yu <yuzhih...@gmail.com>:

> Which release of hadoop are you using ?
>
> Can you utilize node labels feature ?
> See YARN-2492 and YARN-796
>
> Cheers
>
> On Sat, Mar 14, 2015 at 1:49 AM, James <alcaid1...@gmail.com> wrote:
>
>> Hello,
>>
>> I have a cluster running Spark on YARN. Some of its nodes are currently
>> running a Spark Streaming program, so they do not have enough local space
>> left to support other applications. Is it possible to use a blacklist to
>> avoid these nodes when running a new Spark program?
>>
>> Alcaid
>>
>
>
