I'm not sure how I should do that.

The documentation says "A job submitter can specify access control
lists for viewing or modifying a job via the configuration properties
mapreduce.job.acl-view-job and mapreduce.job.acl-modify-job
respectively. By default, nobody is given access in these properties."

My understanding is that, by default, no other user should be able to
modify a job unless explicitly authorized. Is that not the case? Should
I set these two properties before running the job?
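
If they do need to be set per job, I assume I could do it from the Hive
session before submitting the query, something like this (the user and
group names, and the table, are just placeholders):

set mapreduce.job.acl-modify-job=murat hadoopadmins;
set mapreduce.job.acl-view-job=murat hadoopadmins;
-- then run the query as usual, so the ACLs end up in its job configuration
select count(*) from some_table;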

Thanks.


On 30 July 2013 19:25, Vinod Kumar Vavilapalli <vino...@apache.org> wrote:
>
> You need to set up Job ACLs. See
> http://hadoop.apache.org/docs/stable/mapred_tutorial.html#Job+Authorization.
>
> It is a per-job configuration; you can also provide defaults. If the job
> owner wishes to give others access, he/she can do so.
>
> Thanks,
> +Vinod Kumar Vavilapalli
> Hortonworks Inc.
> http://hortonworks.com/
>
> On Jul 30, 2013, at 11:21 AM, Murat Odabasi wrote:
>
> Hi there,
>
> I am trying to introduce some sort of security to prevent different
> people using the cluster from interfering with each other's jobs.
>
> Following the instructions at
> http://hadoop.apache.org/docs/stable/cluster_setup.html and
> https://www.inkling.com/read/hadoop-definitive-guide-tom-white-3rd/chapter-9/security
> , this is what I put in my mapred-site.xml:
>
> <property>
>  <name>mapred.task.tracker.task-controller</name>
>  <value>org.apache.hadoop.mapred.LinuxTaskController</value>
> </property>
>
> <property>
>  <name>mapred.acls.enabled</name>
>  <value>true</value>
> </property>
>
> I can see the configuration parameters in the job configuration when I
> run a Hive query, but the users are still able to kill each other's
> jobs.
>
> Any ideas about what I may be missing?
> Any alternative approaches I can adopt?
>
> Thanks.
>
>
