Which release of Hadoop are you using? Can you use the node labels feature? See YARN-2492 and YARN-796.
Cheers

On Sat, Mar 14, 2015 at 1:49 AM, James <alcaid1...@gmail.com> wrote:
> Hello,
>
> I have a cluster running Spark on YARN. Currently some of its nodes are
> running a Spark Streaming program, so their local disk space is not enough
> to support other applications. I wonder whether it is possible to use a
> blacklist to avoid these nodes when running a new Spark program?
>
> Alcaid
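To sketch how node labels could address this (assuming a Hadoop release with YARN-2492 merged and node labels enabled in the ResourceManager; the label name and hostnames below are placeholders, not from the original thread):

```shell
# Define a cluster-level node label (name "streaming" is an example).
yarn rmadmin -addToClusterNodeLabels streaming

# Tag the nodes occupied by the long-running streaming job.
yarn rmadmin -replaceLabelsOnNode "host1=streaming host2=streaming"
```

Queues without access to the "streaming" label will then only schedule containers on unlabeled nodes, so submitting the new Spark job to such a queue (e.g. `spark-submit --queue default ...`) keeps it off the labeled nodes. Note that later Spark releases also expose per-application label expressions (`spark.yarn.am.nodeLabelExpression` / `spark.yarn.executor.nodeLabelExpression`), but check your Spark version's docs for availability.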