Hi,

There is a bit of an inconsistency in the documentation. Which statement is correct?
`http://spark.apache.org/docs/latest/spark-standalone.html` says "The standalone cluster mode currently only supports a simple FIFO scheduler across applications," while `http://spark.apache.org/docs/latest/job-scheduling.html` says "Starting in Spark 0.8, it is also possible to configure fair sharing between jobs."
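If I am reading the second page correctly, the fair sharing it describes is enabled per application through the `spark.scheduler.mode` property, which then applies to jobs submitted within that one SparkContext. A minimal sketch of what I have in mind is below; the app name, master URL, and job bodies are just placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: enable the FAIR scheduler *within* a single application, as
// described on the job-scheduling page. Names and master URL are placeholders.
object FairSchedulingSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("fair-scheduling-sketch")
      .setMaster("local[4]")
      .set("spark.scheduler.mode", "FAIR") // default within an app is FIFO

    val sc = new SparkContext(conf)

    // Fair scheduling only comes into play when several jobs are submitted
    // concurrently, e.g. from separate threads sharing one SparkContext.
    val jobA = new Thread(new Runnable {
      def run(): Unit = println(sc.parallelize(1 to 1000000).map(_ * 2).count())
    })
    val jobB = new Thread(new Runnable {
      def run(): Unit = println(sc.parallelize(1 to 1000000).filter(_ % 3 == 0).count())
    })
    jobA.start(); jobB.start()
    jobA.join(); jobB.join()

    sc.stop()
  }
}
```

That would be separate from how the standalone master schedules *applications*, which is what the first page seems to be talking about, but I would appreciate confirmation.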
Thanks,
Praveen