Hi,

I may indeed have mistakenly been mixing different aspects. Thanks for
the answer! Does this answer my initial question, though? I'm still
unsure whether the sentence:

> The standalone cluster mode currently only supports a simple FIFO scheduler 
> across applications.

is correct or not? :(
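
For the record, the way I now read the two levels, they would be
configured independently. A sketch of what I mean (untested against a
standalone cluster; the allocation file path is just a placeholder):

```properties
# spark-defaults.conf (sketch) -- the two scheduling levels are separate.

# Within a single application: how jobs submitted from concurrent
# threads share that application's executors. FAIR here is a property
# of the app, regardless of cluster manager.
spark.scheduler.mode             FAIR
# Optional pool definitions for the fair scheduler (placeholder path):
# spark.scheduler.allocation.file  /path/to/fairscheduler.xml

# Across applications on a standalone cluster: scheduling is FIFO and
# not configurable, but capping an app's cores lets later apps in the
# queue get resources instead of one app grabbing the whole cluster.
spark.cores.max                  4
```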

Pozdrawiam,
Jacek

--
Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski


On Fri, Oct 2, 2015 at 8:20 PM, Marcelo Vanzin <van...@cloudera.com> wrote:
> You're mixing app scheduling in the cluster manager (your [1] link)
> with job scheduling within an app (your [2] link). They're independent
> things.
>
> On Fri, Oct 2, 2015 at 2:22 PM, Jacek Laskowski <ja...@japila.pl> wrote:
>> Hi,
>>
>> The docs in Resource Scheduling [1] says:
>>
>>> The standalone cluster mode currently only supports a simple FIFO scheduler 
>>> across applications.
>>
>> However, there's `spark.scheduler.mode`, which can be set to one of
>> the `FAIR`, `FIFO`, or `NONE` values.
>>
>> Is FAIR available for Spark Standalone cluster mode? Is there a page
>> where it's described in more detail? I can't seem to find much about
>> FAIR and Standalone in Job Scheduling [2].
>>
>> [1] 
>> http://people.apache.org/~pwendell/spark-nightly/spark-master-docs/latest/spark-standalone.html
>> [2] 
>> http://people.apache.org/~pwendell/spark-nightly/spark-master-docs/latest/job-scheduling.html
>>
>> Pozdrawiam,
>> Jacek
>>
>> --
>> Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
>> Follow me at https://twitter.com/jaceklaskowski
>> Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>
>
>
> --
> Marcelo

