Hi Team,


Do we have job ACLs for Spark similar to Hadoop job ACLs, i.e. a way to
restrict who can submit jobs to the Spark Master service?

In our Hadoop cluster we enabled job ACLs by using job queues, restricting
the default queue, and running the Fair Scheduler to manage workloads on the
platform (roughly along the lines of the sketch below).
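For context, the Hadoop side looks roughly like this. It is only a minimal
sketch of the job-level ACL properties we rely on; the queue name and user
lists are placeholders, and the per-queue submit restrictions themselves live
in the Fair Scheduler allocation file rather than in this snippet:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.mapreduce.Job

    // Sketch of our job ACL settings; queue name and user lists
    // ("analytics", "alice,bob") are made up for illustration.
    val conf = new Configuration()
    conf.setBoolean("mapreduce.cluster.acls.enabled", true)  // enforce job ACL checks
    conf.set("mapreduce.job.queuename", "analytics")         // submit to a restricted queue
    conf.set("mapreduce.job.acl-view-job", "alice,bob")      // who may view this job
    conf.set("mapreduce.job.acl-modify-job", "alice,bob")    // who may kill/modify this job

    val job = Job.getInstance(conf, "acl-example")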


Do we have similar functionality in Spark? I have seen some references to
fair scheduler pools (snippet below) but could not find much about job queues.
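What I have found so far looks roughly like the sketch below (the allocation
file path and the "production" pool name are just placeholders). As far as I
can tell this only controls scheduling between jobs inside one application,
not who is allowed to submit applications to the master:

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch of the fair scheduler pool setup I have seen referenced;
    // file path and pool name are assumptions.
    val conf = new SparkConf()
      .setAppName("fair-pools-sketch")
      .set("spark.scheduler.mode", "FAIR")
      .set("spark.scheduler.allocation.file", "/path/to/fairscheduler.xml")

    val sc = new SparkContext(conf)

    // Jobs submitted from this thread go to the "production" pool; this
    // governs scheduling within the application, not submission rights.
    sc.setLocalProperty("spark.scheduler.pool", "production")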


Please advise. :)


--Manoj
