[ https://issues.apache.org/jira/browse/HIVE-11762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14739647#comment-14739647 ]

Jason Dere commented on HIVE-11762:
-----------------------------------

Whoa, lots of failures... I ran TestSparkCliDriver and see the following error:
{noformat}
2015-09-10 14:14:31,970 INFO  [stderr-redir-1] client.SparkClientImpl (SparkClientImpl.java:run(588)) -
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/crypto/key/KeyProvider
	at org.apache.hadoop.hive.shims.Hadoop23Shims.<clinit>(Hadoop23Shims.java:1058)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:190)
	at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:146)
	at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:141)
	at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:100)
	at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:369)
	at org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)
	at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:146)
	at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.crypto.key.KeyProvider
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	... 19 more
{noformat}
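The `NoClassDefFoundError` above means the Spark driver JVM simply doesn't have `org.apache.hadoop.crypto.key.KeyProvider` (added to hadoop-common in 2.6) on its classpath. A quick way to check that from any JVM is a throwaway probe like the one below — this is not part of Hive, just a hypothetical diagnostic sketch; `ClasspathProbe` and `lookup` are names invented here:

```java
import java.security.CodeSource;

public class ClasspathProbe {
    // Returns where the named class was loaded from,
    // or "missing: <name>" if it is not on the classpath.
    static String lookup(String name) {
        try {
            Class<?> c = Class.forName(name);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // JDK bootstrap classes report a null CodeSource.
            return "found: " + (src != null ? src.getLocation() : "<bootstrap>");
        } catch (ClassNotFoundException e) {
            return "missing: " + name;
        }
    }

    public static void main(String[] args) {
        // The class the Spark driver failed to load in the trace above.
        // On a JVM without hadoop-common 2.6+ this prints "missing: ...".
        System.out.println(lookup("org.apache.hadoop.crypto.key.KeyProvider"));
    }
}
```

Running this with the same classpath the driver gets from spark-submit would confirm which hadoop-common jar (if any) is actually being picked up.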

> TestHCatLoaderEncryption failures when using Hadoop 2.7
> -------------------------------------------------------
>
>                 Key: HIVE-11762
>                 URL: https://issues.apache.org/jira/browse/HIVE-11762
>             Project: Hive
>          Issue Type: Bug
>          Components: Shims, Tests
>            Reporter: Jason Dere
>            Assignee: Jason Dere
>         Attachments: HIVE-11762.1.patch, HIVE-11762.2.patch
>
>
> When running TestHCatLoaderEncryption with -Dhadoop23.version=2.7.0, we get the following error during setup():
> {noformat}
> testReadDataFromEncryptedHiveTableByPig[5](org.apache.hive.hcatalog.pig.TestHCatLoaderEncryption)  Time elapsed: 3.648 sec  <<< ERROR!
> java.lang.NoSuchMethodError: org.apache.hadoop.hdfs.DFSClient.setKeyProvider(Lorg/apache/hadoop/crypto/key/KeyProviderCryptoExtension;)V
> 	at org.apache.hadoop.hive.shims.Hadoop23Shims.getMiniDfs(Hadoop23Shims.java:534)
> 	at org.apache.hive.hcatalog.pig.TestHCatLoaderEncryption.initEncryptionShim(TestHCatLoaderEncryption.java:252)
> 	at org.apache.hive.hcatalog.pig.TestHCatLoaderEncryption.setup(TestHCatLoaderEncryption.java:200)
> {noformat}
> It looks like the argument to DFSClient.setKeyProvider() changed between Hadoop 2.6 and Hadoop 2.7:
> {noformat}
>    @VisibleForTesting
> -  public void setKeyProvider(KeyProviderCryptoExtension provider) {
> -    this.provider = provider;
> +  public void setKeyProvider(KeyProvider provider) {
> {noformat}
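Since Hive is compiled against one Hadoop version but may run against another, a signature change like this can be bridged in the shim layer by resolving whichever overload the runtime Hadoop exposes via reflection. The sketch below illustrates that technique only — `FakeDFSClient` and the two `KeyProvider` stand-in classes are mocks invented here, and the actual HIVE-11762 patch may well solve this differently:

```java
import java.lang.reflect.Method;

// Mock stand-ins for the Hadoop types (hypothetical, for illustration only).
class KeyProvider {}
class KeyProviderCryptoExtension extends KeyProvider {}

// Mimics Hadoop 2.7, where setKeyProvider takes the base KeyProvider type.
class FakeDFSClient {
    KeyProvider provider;
    public void setKeyProvider(KeyProvider p) { this.provider = p; }
}

public class ShimSketch {
    // Try the Hadoop 2.6 signature first, fall back to the 2.7+ one,
    // then invoke whichever was found.
    static void setKeyProviderCompat(Object client, KeyProviderCryptoExtension p)
            throws Exception {
        Method m;
        try {
            // Hadoop 2.6: setKeyProvider(KeyProviderCryptoExtension)
            m = client.getClass().getMethod("setKeyProvider",
                    KeyProviderCryptoExtension.class);
        } catch (NoSuchMethodException e) {
            // Hadoop 2.7+: setKeyProvider(KeyProvider)
            m = client.getClass().getMethod("setKeyProvider", KeyProvider.class);
        }
        m.invoke(client, p);
    }

    public static void main(String[] args) throws Exception {
        FakeDFSClient c = new FakeDFSClient();
        setKeyProviderCompat(c, new KeyProviderCryptoExtension());
        System.out.println(c.provider != null);
    }
}
```

Since `KeyProviderCryptoExtension` extends `KeyProvider`, passing the extension instance satisfies either overload; only the compile-time-linked descriptor differs, which is exactly what the `NoSuchMethodError` in the test was complaining about.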



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
