Hi

I am not sure whether this issue belongs on the Hive or the Ranger list - let's see.

I have successfully set up Hive with Kerberos, enabled the Ranger Hive plugin, and configured Ranger audit to write to HDFS.

After restarting Hive I can see that it acquires Kerberos tickets and that the Ranger parameters are set as I configured them:

STARTUP_MSG: build = git://c66-slave-20176e25-2/grid/0/jenkins/workspace/HDP-build-centos6/bigtop/build/hive/rpm/BUILD/hive-1.2.1.2.3.4.0 -r efb067075854961dfa41165d5802a62ae334a2db; compiled by 'jenkins' on Wed Dec 16 04:01:39 UTC 2015
************************************************************/
2016-07-08 12:24:42,979 INFO [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(405)) - Starting HiveServer2
2016-07-08 12:24:43,340 INFO [main]: security.UserGroupInformation (UserGroupInformation.java:loginUserFromKeytab(965)) - Login successful for user hive/hadoopnn1.estpak...@testhadoop.com using keytab file /etc/security/keytabs/hive.service.keytab
2016-07-08 12:24:43,347 INFO [main]: cli.CLIService (CLIService.java:init(104)) - SPNego httpUGI successfully created.
2016-07-08 12:24:43,493 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStore(590)) - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
...
2016-07-08 12:24:46,554 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:<init>(70)) - AuditProviderFactory: creating..
2016-07-08 12:24:46,557 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(100)) - AuditProviderFactory: initializing..
2016-07-08 12:24:46,585 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.db=false
2016-07-08 12:24:46,585 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.hdfs=true
2016-07-08 12:24:46,585 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: ranger.plugin.hive.policy.source.impl=org.apache.ranger.admin.client.RangerAdminRESTClient
2016-07-08 12:24:46,585 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.db.batch.filespool.dir=/var/log/hive/audit/db/spool
2016-07-08 12:24:46,585 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.is.enabled=true
2016-07-08 12:24:46,585 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.solr.batch.filespool.dir=/var/log/hive/audit/solr/spool
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.db.jdbc.driver=org.postgresql.Driver
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.hive.update.xapolicies.on.grant.revoke=true
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.db.password=crypted
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: ranger.plugin.hive.policy.rest.ssl.config.file=/usr/hdp/current/hive-server2/conf/conf.server/ranger-policymgr-ssl.xml
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.hdfs.dir=hdfs://mycluster/datasource/ranger/audit/hiveServer2/
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.db.jdbc.url=jdbc:postgresql://hadoopnn2.estpak.ee/ranger_audit
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.solr.zookeepers=none
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.hdfs.file.rollover.sec=3600
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.hdfs.batch.filespool.dir=/var/log/hive/audit/hdfs/spool
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.credential.provider.file=jceks://file/etc/ranger/Arendus_hive/cred.jceks
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: ranger.plugin.hive.service.name=Arendus_hive
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.solr.urls=http://solr_host:6083/solr/ranger_audits
2016-07-08 12:24:46,586 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: ranger.plugin.hive.policy.rest.url=http://hadoopnn2.estpak.ee:6080
2016-07-08 12:24:46,587 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: ranger.plugin.hive.policy.cache.dir=/etc/ranger/Arendus_hive/policycache
2016-07-08 12:24:46,587 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.provider.summary.enabled=true
2016-07-08 12:24:46,587 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.solr=false
2016-07-08 12:24:46,587 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.hdfs.subdir=%time:yyyy%/%time:MM%/%time:dd%
2016-07-08 12:24:46,587 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: ranger.plugin.hive.policy.pollIntervalMs=30000
2016-07-08 12:24:46,587 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(129)) - AUDIT PROPERTY: xasecure.audit.destination.db.user=rangerlogger
2016-07-08 12:24:46,587 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(151)) - Audit destination xasecure.audit.destination.hdfs is set to true
2016-07-08 12:24:46,589 INFO [main]: destination.AuditDestination (AuditDestination.java:<init>(36)) - AuditDestination() enter
2016-07-08 12:24:46,590 INFO [main]: provider.BaseAuditHandler (BaseAuditHandler.java:init(85)) - BaseAuditProvider.init()
2016-07-08 12:24:46,590 INFO [main]: provider.BaseAuditHandler (BaseAuditHandler.java:init(90)) - propPrefix=xasecure.audit.destination.hdfs
2016-07-08 12:24:46,590 INFO [main]: provider.BaseAuditHandler (BaseAuditHandler.java:init(102)) - Using providerName from property prefix. providerName=hdfs
2016-07-08 12:24:46,590 INFO [main]: provider.BaseAuditHandler (BaseAuditHandler.java:init(105)) - providerName=hdfs
2016-07-08 12:24:46,591 INFO [main]: destination.HDFSAuditDestination (HDFSAuditDestination.java:init(96)) - logFolder=hdfs://mycluster/datasource/ranger/audit/hiveServer2//%time:yyyy%/%time:MM%/%time:dd%, destName=hdfs
2016-07-08 12:24:46,591 INFO [main]: destination.HDFSAuditDestination (HDFSAuditDestination.java:init(97)) - logFileNameFormat=%app-type%_ranger_audit_%hostname%.log, destName=hdfs
2016-07-08 12:24:46,591 INFO [main]: destination.HDFSAuditDestination (HDFSAuditDestination.java:init(99)) - config={}
2016-07-08 12:24:46,591 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(167)) - xasecure.audit.destination.hdfs.queue is not set. Setting queue to batch for hdfs
2016-07-08 12:24:46,591 INFO [main]: provider.AuditProviderFactory (AuditProviderFactory.java:init(172)) - queue for hdfs is batch
2016-07-08 12:24:46,592 INFO [main]: queue.AuditQueue (AuditQueue.java:init(88)) - BaseAuditProvider.init()
2016-07-08 12:24:46,592 INFO [main]: provider.BaseAuditHandler (BaseAuditHandler.java:init(85)) - BaseAuditProvider.init()
2016-07-08 12:24:46,592 INFO [main]: provider.BaseAuditHandler (BaseAuditHandler.java:init(90)) - propPrefix=xasecure.audit.destination.hdfs.batch
2016-07-08 12:24:46,592 INFO [main]: provider.BaseAuditHandler (BaseAuditHandler.java:init(105)) - providerName=batch
2016-07-08 12:24:46,593 INFO [main]: queue.AuditQueue (AuditQueue.java:init(103)) - File spool is enabled for batch, logFolderProp=/var/log/hive/audit/hdfs/spool, xasecure.audit.destination.hdfs.batch.filespool.dir=false
2016-07-08 12:24:46,595 INFO [main]: queue.AuditFileSpool (AuditFileSpool.java:init(158)) - retryDestinationMS=30000, queueName=batch
2016-07-08 12:24:46,595 INFO [main]: queue.AuditFileSpool (AuditFileSpool.java:init(160)) - fileRolloverSec=86400, queueName=batch
2016-07-08 12:24:46,595 INFO [main]: queue.AuditFileSpool (AuditFileSpool.java:init(162)) - maxArchiveFiles=100, queueName=batch
2016-07-08 12:24:46,595 INFO [main]: queue.AuditFileSpool (AuditFileSpool.java:init(184)) - logFolder=/var/log/hive/audit/hdfs/spool, queueName=batch


For about two days Hive and Ranger work as I expect: after the configured rollover time I can see audit events in HDFS.
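For reference, with xasecure.audit.destination.hdfs.subdir=%time:yyyy%/%time:MM%/%time:dd% (see the startup log above) the audit files land under a dated subdirectory of the destination dir. A small sketch of how the resolved path looks (base path taken from my config; the pattern expansion is mimicked with date):

```shell
# Base dir comes from xasecure.audit.destination.hdfs.dir in my config;
# the %time:...% subdir pattern expands to the current date.
base="hdfs://mycluster/datasource/ranger/audit/hiveServer2"
subdir=$(date +%Y/%m/%d)   # mirrors %time:yyyy%/%time:MM%/%time:dd%
echo "$base/$subdir"
```

So to verify that events arrived I list that day's directory with hdfs dfs -ls.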

But after some time (roughly two days, I think) no more logs arrive in HDFS (Ranger still writes them successfully into the local spool), and in the Hive log I can see that writing to HDFS fails because of the Kerberos ticket:

2016-07-08 11:49:53,290 INFO [hiveServer2.async.summary.batch_hiveServer2.async.summary.batch.hdfs_destWriter]: provider.BaseAuditHandler (BaseAuditHandler.java:logStatus(312)) - Audit Status Log: name=hiveServer2.async.summary.batch.hdfs, interval=01:00.064 minutes, events=1, deferredCount=1, totalEvents=1210, totalSuccessCount=168, totalDeferredCount=1042
2016-07-08 11:49:53,294 INFO [hiveServer2.async.summary.batch_hiveServer2.async.summary.batch.hdfs_destWriter]: destination.HDFSAuditDestination (HDFSAuditDestination.java:createConfiguration(263)) - Returning HDFS Filesystem Config: Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml
2016-07-08 11:49:53,309 INFO [hiveServer2.async.summary.batch_hiveServer2.async.summary.batch.hdfs_destWriter]: destination.HDFSAuditDestination (HDFSAuditDestination.java:getLogFileStream(224)) - Checking whether log file exists. hdfPath=hdfs://mycluster/datasource/ranger/audit/hiveServer2//2016/07/08/hiveServer2_ranger_audit_hadoopnn1.estpak.ee.log, UGI=hive/hadoopnn1.estpak...@testhadoop.com (auth:KERBEROS)
2016-07-08 11:49:53,311 WARN [hiveServer2.async.summary.batch_hiveServer2.async.summary.batch.hdfs_destWriter]: ipc.Client (Client.java:run(680)) - Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
        at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:413)
        at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:558)
        at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:373)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:727)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:723)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:722)
        at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:373)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1493)
        at org.apache.hadoop.ipc.Client.call(Client.java:1397)
        at org.apache.hadoop.ipc.Client.call(Client.java:1358)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
        at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
        at sun.reflect.GeneratedMethodAccessor10.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:252)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
        at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1315)
        at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1311)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1311)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
        at org.apache.ranger.audit.destination.HDFSAuditDestination.getLogFileStream(HDFSAuditDestination.java:226)
        at org.apache.ranger.audit.destination.HDFSAuditDestination.logJSON(HDFSAuditDestination.java:123)
        at org.apache.ranger.audit.queue.AuditFileSpool.sendEvent(AuditFileSpool.java:890)
        at org.apache.ranger.audit.queue.AuditFileSpool.runDoAs(AuditFileSpool.java:838)
        at org.apache.ranger.audit.queue.AuditFileSpool$2.run(AuditFileSpool.java:759)
        at org.apache.ranger.audit.queue.AuditFileSpool$2.run(AuditFileSpool.java:757)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:360)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1637)
        at org.apache.ranger.audit.queue.AuditFileSpool.run(AuditFileSpool.java:765)
        at java.lang.Thread.run(Thread.java:745)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
        ... 39 more


At the same time I can still create new tables and run queries.


After restarting Hive it works again. I suspect there is an issue with Kerberos ticket renewal.
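The failure after roughly two days makes me think of the ticket lifetimes in krb5.conf rather than the keytab itself. A sketch of the two knobs I would check (the file and values below are hypothetical, written to a temp file just for illustration; on a real node inspect /etc/krb5.conf):

```shell
# Hypothetical krb5.conf excerpt -- the real settings live in /etc/krb5.conf.
cat > /tmp/krb5_sample.conf <<'EOF'
[libdefaults]
  ticket_lifetime = 24h
  renew_lifetime = 2d
EOF
# ticket_lifetime: how long a single TGT is valid before it must be renewed.
# renew_lifetime: how long a TGT may keep being renewed before a fresh login
# from the keytab is required. If a long-running thread never re-logs in,
# its HDFS calls start failing once this window closes.
awk '/_lifetime/ {gsub(/ /,""); print}' /tmp/krb5_sample.conf
```

If renew_lifetime is around two days, that would match my symptom: everything works until the renewable window closes, and a restart (which does a fresh keytab login) fixes it again.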


Any hints where to dig?


Br, Margus


--
Margus (margusja) Roo
http://margus.roo.ee
skype: margusja
+372 51 48 780
