I want to use the rack locality feature of Apache Spark in my application.

Is YARN the only resource manager that supports it as of now?

After going through the source code, I found that the default implementation of
the getRackForHost() method in TaskSchedulerImpl returns None, which (I suppose)
is what standalone mode would use.

On the other hand, it is overridden in YarnScheduler.scala to fetch the rack
information by invoking Hadoop's RackResolver API, which would be used when
it's run on YARN.
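To make the split concrete, here is a simplified sketch of that pattern (these are stand-in classes, not Spark's actual TaskSchedulerImpl/YarnScheduler, and the static topology map stands in for the real RackResolver lookup):

```scala
// Sketch: the base scheduler knows no rack topology, and a
// YARN-style subclass overrides the lookup.
class TaskSchedulerLike {
  // Default behavior (what standalone mode would see): no rack info.
  def getRackForHost(host: String): Option[String] = None
}

class YarnSchedulerLike extends TaskSchedulerLike {
  // On YARN this would delegate to Hadoop's RackResolver; a fixed
  // map keeps the sketch self-contained.
  private val topology = Map("host1" -> "/rack1", "host2" -> "/rack2")
  override def getRackForHost(host: String): Option[String] =
    topology.get(host)
}
```

So unless the scheduler backend overrides getRackForHost(), every host resolves to no rack and rack-level locality preferences never take effect.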



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Rack-locality-tp22483.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
