It is in the yarn module:
"org.apache.spark.deploy.yarn.security.ServiceCredentialProvider".
2018-03-23 15:10 GMT+08:00 Jorge Machado :
Hi Jerry,
Where do you see that class in Spark? I only found
HadoopDelegationTokenManager, and I don't see any way to add my provider
into it.
private def getDelegationTokenProviders: Map[String, HadoopDelegationTokenProvider] = {
  val providers = List(new HadoopFSDelegationTokenProvider(fileSystems),
    new HiveDelegationTokenProvider,
    new HBaseDelegationTokenProvider)
  // ... filters by spark.security.credentials.<service>.enabled and builds the map
}
I think you can build your own Accumulo credential provider, similar to
HadoopDelegationTokenProvider, outside of Spark. Spark already provides an
interface, "ServiceCredentialProvider", for users to plug in a customized
credential provider.
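A minimal sketch of such a plug-in, assuming Spark 2.x on YARN; the class name
and the Accumulo-specific token fetching are placeholders:

  package com.example.security

  import org.apache.hadoop.conf.Configuration
  import org.apache.hadoop.security.{Credentials, UserGroupInformation}
  import org.apache.spark.SparkConf
  import org.apache.spark.deploy.yarn.security.ServiceCredentialProvider

  class AccumuloServiceCredentialProvider extends ServiceCredentialProvider {

    // Matched against spark.security.credentials.<serviceName>.enabled
    override def serviceName: String = "accumulo"

    // Only ask for tokens when Kerberos security is on
    override def credentialsRequired(hadoopConf: Configuration): Boolean =
      UserGroupInformation.isSecurityEnabled

    override def obtainCredentials(
        hadoopConf: Configuration,
        sparkConf: SparkConf,
        creds: Credentials): Option[Long] = {
      // Fetch an Accumulo delegation token here and add it to `creds`;
      // the actual Accumulo client calls are omitted in this sketch.
      // Return the token's expected renewal time in ms, or None.
      None
    }
  }

Spark then picks the class up via ServiceLoader from the META-INF/services
registration mentioned earlier, and with serviceName = "accumulo" it can be
toggled through spark.security.credentials.accumulo.enabled.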
Thanks
Jerry
2018-03-23 14:29 GMT+08:00 Jorge Machado :
Hi Guys,
I'm in the middle of writing a Spark DataSource connector for Apache Spark to
connect to Accumulo tablets. Because we have Kerberos, it gets a little
tricky, since Spark only handles the delegation tokens for HBase, Hive, and
HDFS. Would a PR for an implementation of HadoopDelegationTokenProvider be
welcome?
Hello Madhvi,
Some work has been done by @pomadchin using the spark-notebook; maybe you
should come on https://gitter.im/andypetrella/spark-notebook and poke him?
There are some discoveries he made that might be helpful to know.
You can also poke @lossyrob from Azavea; he did that for GeoTrellis.
You can simply use a custom InputFormat (AccumuloInputFormat) with the
Hadoop RDDs (sc.newAPIHadoopFile etc.) for that; all you need to do is
pass the job confs. Here's a pretty clean discussion:
http://stackoverflow.com/questions/29244530/how-do-i-create-a-spark-rdd-from-accumulo-1-6-in-spark-noteb
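A minimal sketch of that approach, assuming the Accumulo 1.6 client on the
classpath and password (non-Kerberos) authentication; the instance, ZooKeeper
hosts, user, and table names are placeholders:

  import org.apache.accumulo.core.client.ClientConfiguration
  import org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat
  import org.apache.accumulo.core.client.security.tokens.PasswordToken
  import org.apache.accumulo.core.data.{Key, Value}
  import org.apache.accumulo.core.security.Authorizations
  import org.apache.hadoop.mapreduce.Job
  import org.apache.spark.{SparkConf, SparkContext}

  val sc = new SparkContext(new SparkConf().setAppName("accumulo-rdd"))

  // The Job object is only used as a holder for the input format's configuration
  val job = Job.getInstance()
  AccumuloInputFormat.setConnectorInfo(job, "user", new PasswordToken("secret"))
  AccumuloInputFormat.setZooKeeperInstance(job,
    ClientConfiguration.loadDefault()
      .withInstance("myInstance")
      .withZkHosts("zk1:2181"))
  AccumuloInputFormat.setInputTableName(job, "mytable")
  AccumuloInputFormat.setScanAuthorizations(job, new Authorizations())

  // Each record is an Accumulo (Key, Value) pair
  val rdd = sc.newAPIHadoopRDD(
    job.getConfiguration,
    classOf[AccumuloInputFormat],
    classOf[Key],
    classOf[Value])

  println(rdd.count())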
Hi all,
Is there anything to integrate Spark with Accumulo, or to make Spark
process data stored in Accumulo?
Thanks
Madhvi Gupta