On 19 Oct 2016, at 00:18, Michael Segel <msegel_had...@hotmail.com> wrote:
(Sorry, sent reply via wrong account.)
Steve,
Kinda hijacking the thread, but I promise it's still on topic to the OP's issue.. ;-)
Usually you will end up having a local Kerberos setup per cluster.
So your machine accounts (hive, yarn, hbase, etc …) are going to be local to
the cluster.
So you
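To make the "local Kerberos per cluster" point concrete, here is a minimal sketch of what a direct cross-realm trust looks like in krb5.conf. All realm and host names below (CLUSTER-A.EXAMPLE.COM, CLUSTER-B.EXAMPLE.COM, kdc-b.example.com) are hypothetical placeholders, not taken from the thread:

```
# Hypothetical realms; substitute your own.
[realms]
    CLUSTER-B.EXAMPLE.COM = {
        kdc          = kdc-b.example.com
        admin_server = kdc-b.example.com
    }

[domain_realm]
    .cluster-b.example.com = CLUSTER-B.EXAMPLE.COM

# "." means a direct trust path: clients in CLUSTER-A can obtain
# tickets for services in CLUSTER-B without an intermediate realm.
[capaths]
    CLUSTER-A.EXAMPLE.COM = {
        CLUSTER-B.EXAMPLE.COM = .
    }
```

For the trust to work, a shared krbtgt principal (krbtgt/CLUSTER-B.EXAMPLE.COM@CLUSTER-A.EXAMPLE.COM) must also exist in both KDCs with the same key; the config alone is not enough.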
On 17 Oct 2016, at 22:11, Michael Segel <michael_se...@hotmail.com> wrote:
@Steve you are going to have to explain what you mean by ‘turn Kerberos on’.
Taken one way… it could mean making cluster B secure and running Kerberos and
then you’d have to create some sort of trust between B an
On 13 Oct 2016, at 10:50, dbolshak <bolshakov.de...@gmail.com> wrote:
Hello community,
We have a challenge and no idea how to solve it.
The problem,
Say we have the following environment:
1. `cluster A`, the cluster does not use Kerberos and we use it as a source
of data, important thin
I think security has nothing to do with which API you use, Spark SQL or the
RDD API.
I'm assuming you're running on a YARN cluster (that is the only cluster
manager that currently supports Kerberos).
Firstly you need to get a Kerberos TGT in your local spark-submit process;
after being authenticated by Kerberos, S
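As a sketch of that first step on a YARN cluster with Spark 1.6 (the --principal/--keytab flags and the spark.yarn.access.namenodes property exist in that release; the principal, keytab path, class name, and namenode URIs below are hypothetical):

```shell
# Obtain a TGT in the local spark-submit process first
# (hypothetical principal and keytab).
kinit -kt /etc/security/keytabs/user.keytab user@CLUSTER-B.EXAMPLE.COM

# --principal/--keytab let YARN renew credentials for long jobs;
# spark.yarn.access.namenodes asks Spark to fetch HDFS delegation
# tokens for every secure namenode the job will read or write.
spark-submit \
  --master yarn --deploy-mode cluster \
  --class com.example.Job \
  --principal user@CLUSTER-B.EXAMPLE.COM \
  --keytab /etc/security/keytabs/user.keytab \
  --conf spark.yarn.access.namenodes=hdfs://nn-b.example.com:8020,hdfs://nn-c.example.com:8020 \
  my-job.jar
```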
The problem happens when writing (reading works fine):
rdd.saveAsNewAPIHadoopFile
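For context, a minimal Scala sketch of that write path; the namenode host, output path, and key/value types below are assumptions for illustration, not from the thread. Passing a fully-qualified hdfs:// URI is what directs the write at the other cluster:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.{NullWritable, Text}
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat
import org.apache.spark.{SparkConf, SparkContext}

object CrossClusterWrite {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("cross-cluster-write"))

    // saveAsNewAPIHadoopFile is defined on pair RDDs, so map to (K, V).
    val rdd = sc.parallelize(Seq("a", "b", "c"))
      .map(line => (NullWritable.get(), new Text(line)))

    // A per-job copy of the Hadoop conf; the fully-qualified URI
    // (nn-b.example.com is hypothetical) targets the remote cluster.
    val conf = new Configuration(sc.hadoopConfiguration)
    rdd.saveAsNewAPIHadoopFile(
      "hdfs://nn-b.example.com:8020/user/denis/out",
      classOf[NullWritable],
      classOf[Text],
      classOf[TextOutputFormat[NullWritable, Text]],
      conf)

    sc.stop()
  }
}
```

On a secure target cluster this call is exactly where missing delegation tokens surface, which matches reads working while writes fail.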
We use just RDD and HDFS, no other things.
Spark 1.6.1 version.
`Cluster A` - CDH 5.7.1
`Cluster B` - vanilla hadoop 2.6.5
`Cluster C` - CDH 5.8.0
Best regards,
Denis
On 13 October 2016 at 13:06, ayan guha wrote:
And a little more details on Spark version, hadoop version and distribution
would also help...
On Thu, Oct 13, 2016 at 9:05 PM, ayan guha wrote:
> I think one point you need to mention is your target - HDFS, Hive or Hbase
> (or something else) and which end points are used.
>
> On Thu, Oct 13, 2
I think one point you need to mention is your target - HDFS, Hive or Hbase
(or something else) and which end points are used.
On Thu, Oct 13, 2016 at 8:50 PM, dbolshak wrote:
> Hello community,
>
> We have a challenge and no idea how to solve it.
>
> The problem,
>
> Say we have the following env