Re: Spark and Accumulo Delegation tokens

2018-03-23 Thread Saisai Shao
It is in the yarn module: "org.apache.spark.deploy.yarn.security.ServiceCredentialProvider". Thanks Jerry
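For reference, in Spark 2.x that trait lives in the spark-yarn module and looks roughly like the following (paraphrased from the Spark source; exact signatures can shift between releases):

    package org.apache.spark.deploy.yarn.security

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.security.Credentials
    import org.apache.spark.SparkConf

    trait ServiceCredentialProvider {
      // Name used in the spark.security.credentials.<serviceName>.enabled config key.
      def serviceName: String

      // Whether tokens are needed at all; Spark's default implementation
      // checks UserGroupInformation.isSecurityEnabled.
      def credentialsRequired(hadoopConf: Configuration): Boolean

      // Obtain tokens, add them to `creds`, and return the next renewal
      // time in milliseconds, if the token is renewable.
      def obtainCredentials(
          hadoopConf: Configuration,
          sparkConf: SparkConf,
          creds: Credentials): Option[Long]
    }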

Re: Spark and Accumulo Delegation tokens

2018-03-23 Thread Jorge Machado
Hi Jerry, where do you see that class in Spark? I only found HadoopDelegationTokenManager, and I don't see any way to add my provider into it: private def getDelegationTokenProviders: Map[String, HadoopDelegationTokenProvider] = { val providers = List(new HadoopFSDelegationTokenProvider(fi
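The quoted snippet is from HadoopDelegationTokenManager in Spark 2.3; the full method reads roughly as follows (reconstructed from that era's source, so details may differ by version):

    private def getDelegationTokenProviders: Map[String, HadoopDelegationTokenProvider] = {
      val providers = List(new HadoopFSDelegationTokenProvider(fileSystems),
        new HiveDelegationTokenProvider,
        new HBaseDelegationTokenProvider)

      // Drop providers disabled via spark.security.credentials.{service}.enabled = false.
      providers
        .filter { p => isServiceEnabled(p.serviceName) }
        .map { p => (p.serviceName, p) }
        .toMap
    }

As the hard-coded List suggests, this class itself has no extension point, which is why the pluggable ServiceCredentialProvider in the yarn module is the relevant hook here.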

Re: Spark and Accumulo Delegation tokens

2018-03-22 Thread Saisai Shao
I think you can build your own Accumulo credential provider, similar to HadoopDelegationTokenProvider, outside of Spark; Spark already provides an interface, "ServiceCredentialProvider", for users to plug in a customized credential provider. Thanks Jerry
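A minimal sketch of such a provider, assuming Spark 2.x on YARN. AccumuloCredentialProvider and the token-acquisition step are hypothetical; a real implementation would obtain a delegation token through Accumulo's client API (e.g. SecurityOperations.getDelegationToken in Accumulo 1.7+) and add it to the passed-in Credentials:

    package com.example.accumulo  // hypothetical package

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.security.{Credentials, UserGroupInformation}
    import org.apache.spark.SparkConf
    import org.apache.spark.deploy.yarn.security.ServiceCredentialProvider

    class AccumuloCredentialProvider extends ServiceCredentialProvider {

      // Toggled via spark.security.credentials.accumulo.enabled
      override def serviceName: String = "accumulo"

      override def credentialsRequired(hadoopConf: Configuration): Boolean =
        UserGroupInformation.isSecurityEnabled

      override def obtainCredentials(
          hadoopConf: Configuration,
          sparkConf: SparkConf,
          creds: Credentials): Option[Long] = {
        // Hypothetical step: fetch an Accumulo delegation token and register
        // it, e.g. creds.addToken(token.getService, token).
        // Return the next renewal time in ms, or None if not renewable here.
        None
      }
    }

Spark discovers these providers through java.util.ServiceLoader, so the class also has to be registered in your jar under META-INF/services/org.apache.spark.deploy.yarn.security.ServiceCredentialProvider, with the implementation's fully qualified class name as the file's single line.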

Re: Spark and accumulo

2015-04-21 Thread andy petrella
Hello Madvi, some work has been done by @pomadchin using the Spark Notebook; maybe you should come to https://gitter.im/andypetrella/spark-notebook and poke him? He made some discoveries that might be helpful to know. You can also poke @lossyrob from Azavea, who did that for geotrellis

Re: Spark and accumulo

2015-04-21 Thread Akhil Das
You can simply use a custom InputFormat (AccumuloInputFormat) with the Hadoop RDD APIs (sc.newAPIHadoopRDD etc.) for that; all you need to do is pass the job configuration. Here's a pretty clean discussion: http://stackoverflow.com/questions/29244530/how-do-i-create-a-spark-rdd-from-accumulo-1-6-in-spark-noteb
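A minimal sketch of that approach against Accumulo 1.6 (the instance name, ZooKeeper hosts, table, and credentials below are placeholders):

    import org.apache.accumulo.core.client.ClientConfiguration
    import org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat
    import org.apache.accumulo.core.client.security.tokens.PasswordToken
    import org.apache.accumulo.core.data.{Key, Value}
    import org.apache.accumulo.core.security.Authorizations
    import org.apache.hadoop.mapreduce.Job
    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("accumulo-read"))

    // Configure AccumuloInputFormat on a throwaway Job, then hand its
    // Configuration to Spark.
    val job = Job.getInstance()
    AccumuloInputFormat.setConnectorInfo(job, "user", new PasswordToken("secret"))
    AccumuloInputFormat.setZooKeeperInstance(job,
      ClientConfiguration.loadDefault()
        .withInstance("myInstance")
        .withZkHosts("zk1:2181"))
    AccumuloInputFormat.setInputTableName(job, "myTable")
    AccumuloInputFormat.setScanAuthorizations(job, new Authorizations())

    // Each record is one Accumulo (Key, Value) pair.
    val rdd = sc.newAPIHadoopRDD(
      job.getConfiguration,
      classOf[AccumuloInputFormat],
      classOf[Key],
      classOf[Value])

    println(rdd.count())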