The stack trace implies the problem is occurring in the accumulo-gc
process. Is that the only process where this is occurring, or is that
just a representative example?

Are you using Kerberos to authenticate to Accumulo (client-server, or
server-server with SASL RPC), or only using Kerberos for Accumulo to
talk to HDFS?

I'm not exactly sure how Hadoop's UserGroupInformation is supposed to
work, but does it work if you manually renew outside the Accumulo
process? Can klist find the credentials? Are you sure the keytab is
configured correctly for the accumulo-gc process or any other process
that fails? Does the process have read access to the keytab file on
the filesystem?
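
In case it helps, here's roughly how I'd check those things from the shell, running as the same OS user the accumulo-gc process runs as (the keytab path and principal below are made-up examples, not your actual values):

```shell
# Hypothetical keytab path and principal -- substitute your own.
KEYTAB=/etc/security/keytabs/accumulo.service.keytab
PRINCIPAL="accumulo/$(hostname -f)@EXAMPLE.COM"

# 1. Can this OS user actually read the keytab file?
ls -l "$KEYTAB"

# 2. Does the keytab contain the expected principal and key version numbers?
klist -kt "$KEYTAB"

# 3. Can a TGT actually be obtained from the KDC using only the keytab?
kinit -kt "$KEYTAB" "$PRINCIPAL"

# 4. Did the TGT land in the credential cache?
klist
```

If step 3 fails outside of Accumulo, the problem is with the keytab, principal, or KDC reachability rather than anything Accumulo-specific.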

The following link may help:
https://docs.oracle.com/javase/8/docs/jre/api/security/jaas/spec/com/sun/security/auth/module/Krb5LoginModule.html
It explains why you're getting the LoginException "Cannot read from
System.in": the login module is falling back to prompting for a
passphrase because it can't find any usable credentials.
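
For background: Krb5LoginModule only prompts when it decides it has no other way to get credentials. A JAAS stanza along these lines (the entry name, keytab path, and principal are placeholders I've invented for illustration) forces keytab-only login and fails fast instead of prompting:

```
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/security/keytabs/accumulo.service.keytab"
  principal="accumulo/host.example.com@EXAMPLE.COM"
  doNotPrompt=true;
};
```

Hadoop's UserGroupInformation normally builds an equivalent configuration internally when the keytab login path is used, so a prompt like this usually suggests the keytab/principal settings never reached the login module.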

On Thu, May 19, 2022 at 11:56 AM Hayes, Phillip <phillip.ha...@cgi.com> wrote:
>
> Hi, we have recently attempted to upgrade a Kerberized Accumulo 1.8.1 
> installation to Accumulo 2.0.1, but have encountered some problems around 
> Kerberos tickets.
>
>
>
> By default we have our krb5.conf files configured with a ticket_lifetime of 
> 24h, but we have noticed during testing that when the ticket-granting ticket 
> (TGT) expires, the process fails to contact the KDC to get a new one.
>
>
>
> The exception raised is GSSException: No valid credentials provided 
> (Mechanism level: Failed to find any Kerberos tgt).
>
>
>
> Full Stacktrace:
>
>
>
> java.io.IOException: DestHost:destPort hadoop:9000 , LocalHost:localPort accumulo.accumulo-network/172.24.0.4:0. Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>     at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
>     at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:913)
>     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:888)
>     at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1616)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1558)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1455)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:242)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:129)
>     at com.sun.proxy.$Proxy18.delete(Unknown Source)
>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete(ClientNamenodeProtocolTranslatorPB.java:655)
>     at jdk.internal.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
>     at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.base/java.lang.reflect.Method.invoke(Method.java:566)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
>     at com.sun.proxy.$Proxy19.delete(Unknown Source)
>     at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:1662)
>     at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:992)
>     at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:989)
>     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:999)
>     at org.apache.accumulo.server.fs.VolumeManagerImpl.deleteRecursively(VolumeManagerImpl.java:214)
>     at org.apache.accumulo.gc.SimpleGarbageCollector$GCEnv.lambda$delete$3(SimpleGarbageCollector.java:339)
>     at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>     at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>     at org.apache.accumulo.fate.util.LoggingRunnable.run(LoggingRunnable.java:35)
>     at java.base/java.lang.Thread.run(Thread.java:829)
> Caused by: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>     at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:798)
>     at java.base/java.security.AccessController.doPrivileged(Native Method)
>     at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
>     at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:752)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:856)
>     at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:414)
>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1677)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1502)
>     ... 25 more
> Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
>     at jdk.security.jgss/com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:222)
>     at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:410)
>     at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:623)
>     at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:414)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:843)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:839)
>     at java.base/java.security.AccessController.doPrivileged(Native Method)
>     at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:839)
>     ... 28 more
> Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
>     at java.security.jgss/sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:162)
>     at java.security.jgss/sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:126)
>     at java.security.jgss/sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:193)
>     at java.security.jgss/sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:218)
>     at java.security.jgss/sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:230)
>     at java.security.jgss/sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:196)
>     at jdk.security.jgss/com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:203)
>
>
>
> Interestingly, when we specify the location of the krb5.conf file with the 
> JVM argument -Djava.security.krb5.conf=/etc/krb5.conf, we then get the 
> following exception:
>
>
>
> javax.security.auth.login.LoginException: Cannot read from System.in
>     at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.promptForName(Krb5LoginModule.java:845)
>
>
>
> Full Stacktrace:
>
>
>
> java.io.IOException: DestHost:destPort hadoop:9000 , LocalHost:localPort accumulo.accumulo-network/172.24.0.4:0. Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))]
>     at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
>     at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:913)
>     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:888)
>     at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1616)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1558)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1455)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:242)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:129)
>     at com.sun.proxy.$Proxy18.delete(Unknown Source)
>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete(ClientNamenodeProtocolTranslatorPB.java:655)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.base/java.lang.reflect.Method.invoke(Method.java:566)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
>     at com.sun.proxy.$Proxy19.delete(Unknown Source)
>     at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:1662)
>     at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:992)
>     at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:989)
>     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:999)
>     at org.apache.accumulo.server.fs.VolumeManagerImpl.deleteRecursively(VolumeManagerImpl.java:214)
>     at org.apache.accumulo.gc.SimpleGarbageCollector$GCEnv.lambda$delete$3(SimpleGarbageCollector.java:339)
>     at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>     at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>     at org.apache.accumulo.fate.util.LoggingRunnable.run(LoggingRunnable.java:35)
>     at java.base/java.lang.Thread.run(Thread.java:829)
> Caused by: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))]
>     at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:798)
>     at java.base/java.security.AccessController.doPrivileged(Native Method)
>     at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
>     at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:752)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:856)
>     at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:414)
>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1677)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1502)
>     ... 26 more
> Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))]
>     at jdk.security.jgss/com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:222)
>     at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:410)
>     at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:623)
>     at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:414)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:843)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:839)
>     at java.base/java.security.AccessController.doPrivileged(Native Method)
>     at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:839)
>     ... 29 more
> Caused by: GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))
>     at java.security.jgss/sun.security.jgss.krb5.Krb5InitCredential.getTgt(Krb5InitCredential.java:384)
>     at java.security.jgss/sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:160)
>     at java.security.jgss/sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:126)
>     at java.security.jgss/sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:193)
>     at java.security.jgss/sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:218)
>     at java.security.jgss/sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:230)
>     at java.security.jgss/sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:196)
>     at jdk.security.jgss/com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:203)
>     ... 38 more
> Caused by: javax.security.auth.login.LoginException: Cannot read from System.in
>     at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.promptForName(Krb5LoginModule.java:845)
>     at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:684)
>     at jdk.security.auth/com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:592)
>     at java.base/javax.security.auth.login.LoginContext.invoke(LoginContext.java:747)
>     at java.base/javax.security.auth.login.LoginContext$4.run(LoginContext.java:672)
>     at java.base/javax.security.auth.login.LoginContext$4.run(LoginContext.java:670)
>     at java.base/java.security.AccessController.doPrivileged(Native Method)
>     at java.base/javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:670)
>     at java.base/javax.security.auth.login.LoginContext.login(LoginContext.java:581)
>     at java.security.jgss/sun.security.jgss.GSSUtil.login(GSSUtil.java:258)
>     at java.security.jgss/sun.security.jgss.krb5.Krb5Util.getInitialTicket(Krb5Util.java:175)
>     at java.security.jgss/sun.security.jgss.krb5.Krb5InitCredential$1.run(Krb5InitCredential.java:376)
>     at java.security.jgss/sun.security.jgss.krb5.Krb5InitCredential$1.run(Krb5InitCredential.java:372)
>     at java.base/java.security.AccessController.doPrivileged(Native Method)
>     at java.security.jgss/sun.security.jgss.krb5.Krb5InitCredential.getTgt(Krb5InitCredential.java:371)
>     ... 45 more
>
>
>
> Any pointers would be gratefully received.
>
>
>
> Thanks,
>
>
>
> Phill
