I am using Hadoop 2.5.0.3 and Spark 1.1. My local Hive version is 0.12.3, whose hcatalog.jar is included in the classpath. The stack trace is as follows:


14/10/28 18:24:24 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
14/10/28 18:24:24 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
14/10/28 18:24:24 INFO retry.RetryInvocationHandler: Exception while invoking getFileInfo of class ClientNamenodeProtocolTranslatorPB over zaniumtan-nn1.tan.ygrid.yahoo.com/98.138.109.23:8020 after 1 fail over attempts. Trying to fail over immediately.

java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "dp-dk-shark-dev51.data.ne1.yahoo.com/10.218.110.34"; destination host is: "zaniumtan-nn1.tan.ygrid.yahoo.com":8020;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
    at org.apache.hadoop.ipc.Client.call(Client.java:1375)
    at org.apache.hadoop.ipc.Client.call(Client.java:1324)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy23.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:707)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy24.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1785)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1068)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
    at org.apache.hadoop.hive.metastore.Warehouse.isDir(Warehouse.java:446)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database_core(HiveMetaStore.java:564)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database(HiveMetaStore.java:602)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:103)
    at com.sun.proxy.$Proxy12.create_database(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createDatabase(HiveMetaStoreClient.java:459)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy16.createDatabase(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.createDatabase(Hive.java:225)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createDatabase(DDLTask.java:3442)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:227)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
    at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:298)
    at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:272)
    at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
    at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
    at org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
    at org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:360)
    at org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:360)
    at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
    at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
    at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
    at org.apache.spark.sql.hive.thriftserver.server.SparkSQLOperationManager$$anon$1.run(SparkSQLOperationManager.scala:172)
    at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:193)
    at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:175)
    at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:150)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:207)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1133)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1118)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:58)
    at org.apache.hive.service.auth.TUGIContainingProcessor$1.run(TUGIContainingProcessor.java:55)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1637)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:526)
    at org.apache.hive.service.auth.TUGIContainingProcessor.process(TUGIContainingProcessor.java:55)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:722)
Caused by: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:657)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1637)
    at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:621)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
    at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:368)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1423)
    at org.apache.hadoop.ipc.Client.call(Client.java:1342)
    ... 71 more
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
    at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:171)
    at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:388)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:702)
    at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:698)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1637)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:697)
    ... 74 more
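
As a side note on the classpath mentioned at the top: the hcatalog jar from the local Hive 0.12.3 install is added roughly as sketched below. This is only an illustration; the jar path and the use of SPARK_CLASSPATH are assumptions about a typical Spark 1.1 setup, not the literal settings on this cluster.

  # Sketch only: the jar path is a placeholder for wherever the local
  # Hive 0.12.3 install keeps its hcatalog jar.
  export SPARK_CLASSPATH="$SPARK_CLASSPATH:/path/to/hive-0.12.3/hcatalog.jar"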




From: Cheng Lian <lian.cs....@gmail.com>
Date: Tuesday, October 28, 2014 at 2:50 AM
To: Du Li <l...@yahoo-inc.com.invalid>
Cc: "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: [SPARK SQL] kerberos error when creating database from beeline/ThriftServer2

Which version of Spark and Hadoop are you using? Could you please provide the 
full stack trace of the exception?

On Tue, Oct 28, 2014 at 5:48 AM, Du Li <l...@yahoo-inc.com.invalid> wrote:
Hi,

I was trying to set up Spark SQL on a private cluster. I configured a hive-site.xml under spark/conf that uses a local metastore, with the warehouse and default FS name set to HDFS on one of my corporate clusters. Then I started the Spark master, a worker, and the Thrift server. However, when creating a database from beeline, I got the following error:

org.apache.hive.service.cli.HiveSQLException: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: java.io.IOException Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "<spark-master-host>"; destination host is: "<HDFS-namenode:port>"; )
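
For context, the hive-site.xml under spark/conf described above looks roughly like the sketch below. The property values are placeholders for this cluster's actual settings (local Derby metastore, HDFS warehouse, and default FS), not a copy of the real file.

  <!-- Minimal sketch of spark/conf/hive-site.xml; all values are placeholders. -->
  <configuration>
    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
    </property>
    <property>
      <name>hive.metastore.warehouse.dir</name>
      <value>hdfs://<HDFS-namenode:port>/user/hive/warehouse</value>
    </property>
    <property>
      <name>fs.defaultFS</name>
      <value>hdfs://<HDFS-namenode:port></value>
    </property>
  </configuration>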

It occurred when Spark was trying to create an HDFS directory under the warehouse in order to create the database. All processes (Spark master, worker, Thrift server, beeline) were run as a user with the right access permissions. My Spark classpaths have /home/y/conf/hadoop at the front. I was able to read and write files under the same directory from the hadoop fs command line and from spark-shell without any issue.
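
For completeness, here is a rough sketch of the kind of commands involved in checking this and reproducing the error; the warehouse path, JDBC URL, and database name are placeholders, and the HADOOP_CONF_DIR export is just one way of putting /home/y/conf/hadoop on the classpath, not necessarily how it is done here.

  # Sketch only; paths, URL, and database name are placeholders.
  klist                                   # confirm the user has a valid Kerberos TGT
  hadoop fs -ls /user/hive/warehouse      # HDFS access from the command line works

  # Start the Thrift server with the same Hadoop client config visible to it:
  export HADOOP_CONF_DIR=/home/y/conf/hadoop
  sbin/start-thriftserver.sh

  # The error is hit when creating a database from beeline:
  beeline -u jdbc:hive2://localhost:10000
  > CREATE DATABASE test_db;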

Any hints on the right way to configure this would be appreciated.

Thanks,
Du
