I solved this by using the property hadoop.http.authentication.type to
specify a custom Java handler class that contains the authentication
logic. This class only has to implement the interface
org.apache.hadoop.security.authentication.server.AuthenticationHandler.
See:

https://hadoop.apache.org/docs/r2.7.3/hadoop-project-dist/hadoop-common/HttpAuthentication.html
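
For illustration, here is a minimal sketch of such a handler. The
package, class name, the "X-Custom-Auth-Token" header and the "token"
property are hypothetical placeholders; only the AuthenticationHandler
and AuthenticationToken types come from the hadoop-auth module:

package com.example.auth; // hypothetical package

import java.io.IOException;
import java.util.Properties;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.hadoop.security.authentication.client.AuthenticationException;
import org.apache.hadoop.security.authentication.server.AuthenticationHandler;
import org.apache.hadoop.security.authentication.server.AuthenticationToken;

public class HeaderTokenAuthenticationHandler implements AuthenticationHandler {

  public static final String TYPE = "header-token"; // arbitrary scheme name

  private String expectedToken;

  @Override
  public String getType() {
    return TYPE;
  }

  @Override
  public void init(Properties config) throws ServletException {
    // The filter passes the hadoop.http.authentication.* properties here
    // with the prefix stripped; "token" is a hypothetical property name.
    expectedToken = config.getProperty("token", "");
  }

  @Override
  public void destroy() {
    // nothing to clean up
  }

  @Override
  public boolean managementOperation(AuthenticationToken token,
      HttpServletRequest request, HttpServletResponse response)
      throws IOException, AuthenticationException {
    // No delegation-token style management operations; just authenticate.
    return true;
  }

  @Override
  public AuthenticationToken authenticate(HttpServletRequest request,
      HttpServletResponse response)
      throws IOException, AuthenticationException {
    String provided = request.getHeader("X-Custom-Auth-Token"); // hypothetical
    if (provided != null && provided.equals(expectedToken)) {
      // arguments are: user name, principal, auth type
      return new AuthenticationToken("web-user", "web-user", TYPE);
    }
    // Returning null tells the filter the request was not authenticated;
    // we reject it here with a 401.
    response.sendError(HttpServletResponse.SC_UNAUTHORIZED,
        "Authentication required");
    return null;
  }
}

Then point hadoop.http.authentication.type at the handler's fully
qualified class name (the linked docs list the supported values as
simple | kerberos | #AUTHENTICATION_HANDLER_CLASSNAME#), e.g. in
core-site.xml:

hadoop.http.authentication.type = com.example.auth.HeaderTokenAuthenticationHandler

and make sure the jar containing the class is on the daemons' classpath.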

Regards


2018-01-10 19:38 GMT-05:00 Jeff Zhang <zjf...@gmail.com>:

>
> It seems to be by design in yarn mode. Have you ever made it work in
> spark-shell?
>
>
> Jhon Anderson Cardenas Diaz <jhonderson2...@gmail.com> wrote on Wed,
> Jan 10, 2018 at 9:17 PM:
>
>> *Environment*:
>> AWS EMR, YARN cluster.
>>
>> *Description*:
>>
>> I am trying to use a Java filter to protect access to the Spark UI by
>> using the property spark.ui.filters; the problem is that when Spark is
>> running in yarn mode, that property is always overridden with the
>> filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter:
>>
>> *spark.ui.filters:
>> org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter*
>>
>> And these properties are automatically added:
>>
>>
>> *spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS:
>> ip-x-x-x-226.eu-west-1.compute.internal*
>>
>> *spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES:
>> http://ip-x-x-x-226.eu-west-1.compute.internal:20888/proxy/application_xxxxxxxxxxxxx_xxxx*
>>
>> Any suggestion on how to add a Java security filter so it does not
>> get overridden, or maybe how to configure the security from the Hadoop
>> side?
>>
>> Thanks.
>>
>
