Thank you for the answer, it doesn't seem to work either (I haven't logged
into the machine as the spark user, but ran kinit inside the spark-env script),
and I also tried it inside the job.
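For reference, a non-interactive kinit (keytab-based, so it works from spark-env.sh without a password prompt) would look something like this; the keytab path and principal below are placeholders, not values from this thread:

```shell
# Obtain a Kerberos TGT from a keytab (no password prompt needed),
# suitable for sourcing from spark-env.sh on each worker.
# Path and principal are placeholders -- substitute your own.
kinit -kt /etc/security/keytabs/spark.keytab spark/worker-node@EXAMPLE.COM

# Verify the ticket was actually obtained; should list a valid TGT.
klist
```

Note that a ticket obtained this way still expires at the KDC's ticket lifetime, so it has to be renewed or re-obtained for long-running jobs.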
I've noticed that when I run pyspark the Kerberos token is used for
something, but this same behavior is not presented w
On 15 Jun 2015, at 15:43, Borja Garrido Bear <kazebo...@gmail.com> wrote:
I tried running the job in a standalone cluster and I'm getting this:
java.io.IOException: Failed on local exception: java.io.IOException:
org.apache.hadoop.security.AccessControlException: Client cannot
authenticate via:[TOKEN, KERBEROS]; Host Details : local host is:
"worker-node/0.0.0.0"; desti
That's Spark on YARN with Kerberos.
In Spark 1.3 you can submit work to a Kerberized Hadoop cluster; once the
tokens you passed up with your app submission expire (~72 hours) your job can't
access HDFS any more.
That's been addressed in Spark 1.4, where you can now specify a kerberos keytab
for t
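The Spark 1.4 submit-time keytab support mentioned above can be sketched roughly as follows; the principal, keytab path, class, and jar names are placeholders for illustration:

```shell
# Spark 1.4+: pass a principal and keytab at submission time so the
# application can re-obtain delegation tokens itself, instead of the
# tokens shipped at submit expiring after ~72 hours.
# All names and paths below are placeholders.
spark-submit \
  --master yarn-cluster \
  --principal user@EXAMPLE.COM \
  --keytab /etc/security/keytabs/user.keytab \
  --class org.example.MyApp \
  my-app.jar
```

The equivalent configuration properties are spark.yarn.principal and spark.yarn.keytab, if setting them in a config file is preferred over command-line flags.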
This might help
http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.2.4/Apache_Spark_Quickstart_v224/content/ch_installing-kerb-spark-quickstart.html
Thanks
Best Regards
On Wed, Jun 10, 2015 at 6:49 PM, kazeborja wrote:
> Hello all.
>
> I've been reading some old mails and noticed that the use o