Hello Andrey,

Can you check that the keytabs are properly generated and deployed on the Hive host? You could use klist with the -t option, for example.
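A minimal example of what I mean, assuming the default HDP keytab path and that the default realm is set in krb5.conf (adjust the path and principal if your cluster uses different ones):

# klist -kt /etc/security/keytabs/hive.service.keytab
# kinit -kt /etc/security/keytabs/hive.service.keytab hive/$(hostname -f)
# klist

The first command lists the principals and key timestamps stored in the HiveServer2 keytab; the kinit then confirms the keytab can actually be used to obtain a ticket for the hive/_HOST principal, and the final klist shows that ticket.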
Regards,
Loïc


Loïc CHANEL
System Big Data engineer
MS&T - WASABI - Worldline (Villeurbanne, France)

2017-05-04 15:42 GMT+02:00 Markovich <amriv...@gmail.com>:

> I'm still unable to resolve this...
>
> INFO [Thread-17]: thrift.ThriftCLIService (ThriftHttpCLIService.java:run(152)) - Started ThriftHttpCLIService in http mode on port 10001 path=/cliservice/* with 5...500 worker threads
> 2017-05-04 13:40:14,195 INFO [HiveServer2-HttpHandler-Pool: Thread-60]: thrift.ThriftHttpServlet (ThriftHttpServlet.java:doPost(145)) - Could not validate cookie sent, will try to generate a new cookie
> 2017-05-04 13:40:14,198 INFO [HiveServer2-HttpHandler-Pool: Thread-60]: thrift.ThriftHttpServlet (ThriftHttpServlet.java:doKerberosAuth(398)) - Failed to authenticate with http/_HOST kerberos principal, trying with hive/_HOST kerberos principal
> 2017-05-04 13:40:14,199 ERROR [HiveServer2-HttpHandler-Pool: Thread-60]: thrift.ThriftHttpServlet (ThriftHttpServlet.java:doKerberosAuth(406)) - Failed to authenticate with hive/_HOST kerberos principal
> 2017-05-04 13:40:14,199 ERROR [HiveServer2-HttpHandler-Pool: Thread-60]: thrift.ThriftHttpServlet (ThriftHttpServlet.java:doPost(209)) - Error:
> org.apache.hive.service.auth.HttpAuthenticationException: java.lang.reflect.UndeclaredThrowableException
>         at org.apache.hive.service.cli.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:407)
>         at org.apache.hive.service.cli.thrift.ThriftHttpServlet.doPost(ThriftHttpServlet.java:159)
>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
>         at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:565)
>         at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:479)
>         at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:225)
>         at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1031)
>         at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:406)
>         at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:186)
>         at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:965)
>         at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:117)
>         at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:111)
>         at org.eclipse.jetty.server.Server.handle(Server.java:349)
>         at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:449)
>         at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:925)
>         at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:857)
>         at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
>         at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:76)
>         at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:609)
>         at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:45)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.reflect.UndeclaredThrowableException
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
>         at org.apache.hive.service.cli.thrift.ThriftHttpServlet.doKerberosAuth(ThriftHttpServlet.java:404)
>         ... 23 more
> Caused by: org.apache.hive.service.auth.HttpAuthenticationException: Authorization header received from the client is empty.
>         at org.apache.hive.service.cli.thrift.ThriftHttpServlet.getAuthHeader(ThriftHttpServlet.java:548)
>         at org.apache.hive.service.cli.thrift.ThriftHttpServlet.access$100(ThriftHttpServlet.java:74)
>         at org.apache.hive.service.cli.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:449)
>         at org.apache.hive.service.cli.thrift.ThriftHttpServlet$HttpKerberosServerAction.run(ThriftHttpServlet.java:412)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
>         ... 24 more
> 2017-05-04 13:40:14,211 INFO [HiveServer2-HttpHandler-Pool: Thread-60]: thrift.ThriftHttpServlet (ThriftHttpServlet.java:doPost(145)) - Could not validate cookie sent, will try to generate a new cookie
> 2017-05-04 13:40:14,219 INFO [HiveServer2-HttpHandler-Pool: Thread-60]: thrift.ThriftHttpServlet (ThriftHttpServlet.java:doPost(204)) - Cookie added for clientUserName hue
> 2017-05-04 13:40:14,229 INFO [HiveServer2-HttpHandler-Pool: Thread-60]: thrift.ThriftCLIService (ThriftCLIService.java:OpenSession(313)) - Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V7
> 2017-05-04 13:40:14,244 WARN [HiveServer2-HttpHandler-Pool: Thread-60]: thrift.ThriftCLIService (ThriftCLIService.java:OpenSession(327)) - Error opening session:
> org.apache.hive.service.cli.HiveSQLException: Failed to validate proxy privilege of hue for hdfs
>         at org.apache.hive.service.auth.HiveAuthFactory.verifyProxyAccess(HiveAuthFactory.java:396)
>         at org.apache.hive.service.cli.thrift.ThriftCLIService.getProxyUser(ThriftCLIService.java:751)
>         at org.apache.hive.service.cli.thrift.ThriftCLIService.getUserName(ThriftCLIService.java:386)
>         at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:413)
>         at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:316)
>         at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1257)
>         at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1242)
>         at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>         at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>         at org.apache.thrift.server.TServlet.doPost(TServlet.java:83)
>         at org.apache.hive.service.cli.thrift.ThriftHttpServlet.doPost(ThriftHttpServlet.java:206)
>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
>         at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:565)
>         at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:479)
>         at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:225)
>         at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1031)
>         at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:406)
>         at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:186)
>         at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:965)
>         at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:117)
>         at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:111)
>         at org.eclipse.jetty.server.Server.handle(Server.java:349)
>         at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:449)
>         at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:925)
>         at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:857)
>         at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
>         at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:76)
>         at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:609)
>         at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:45)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.hadoop.security.authorize.AuthorizationException: User: hue is not allowed to impersonate hdfs
>         at org.apache.hadoop.security.authorize.DefaultImpersonationProvider.authorize(DefaultImpersonationProvider.java:119)
>         at org.apache.hadoop.security.authorize.ProxyUsers.authorize(ProxyUsers.java:102)
>         at org.apache.hadoop.security.authorize.ProxyUsers.authorize(ProxyUsers.java:116)
>         at org.apache.hive.service.auth.HiveAuthFactory.verifyProxyAccess(HiveAuthFactory.java:392)
>         ... 32 more
> 2017-05-04 13:40:15,654 INFO [HiveServer2-HttpHandler-Pool: Thread-60]: thrift.ThriftCLIService (ThriftCLIService.java:OpenSession(313)) - Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V7
> 2017-05-04 13:40:15,658 WARN [HiveServer2-HttpHandler-Pool: Thread-60]: thrift.ThriftCLIService (ThriftCLIService.java:OpenSession(327)) - Error opening session:
> org.apache.hive.service.cli.HiveSQLException: Failed to validate proxy privilege of hue for hdfs
>         at org.apache.hive.service.auth.HiveAuthFactory.verifyProxyAccess(HiveAuthFactory.java:396)
>         at org.apache.hive.service.cli.thrift.ThriftCLIService.getProxyUser(ThriftCLIService.java:751)
>         at org.apache.hive.service.cli.thrift.ThriftCLIService.getUserName(ThriftCLIService.java:386)
>         at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:413)
>         at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:316)
>         at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1257)
>         at org.apache.hive.service.cli.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1242)
>         at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>         at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>         at org.apache.thrift.server.TServlet.doPost(TServlet.java:83)
>         at org.apache.hive.service.cli.thrift.ThriftHttpServlet.doPost(ThriftHttpServlet.java:206)
>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
>         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
>         at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:565)
>         at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:479)
>         at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:225)
>         at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1031)
>         at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:406)
>         at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:186)
>         at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:965)
>         at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:117)
>         at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:111)
>         at org.eclipse.jetty.server.Server.handle(Server.java:349)
>         at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:449)
>         at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:925)
>         at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:857)
>         at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
>         at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:76)
>         at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:609)
>         at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:45)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.hadoop.security.authorize.AuthorizationException: User: hue is not allowed to impersonate hdfs
>         at org.apache.hadoop.security.authorize.DefaultImpersonationProvider.authorize(DefaultImpersonationProvider.java:119)
>         at org.apache.hadoop.security.authorize.ProxyUsers.authorize(ProxyUsers.java:102)
>         at org.apache.hadoop.security.authorize.ProxyUsers.authorize(ProxyUsers.java:116)
>         at org.apache.hive.service.auth.HiveAuthFactory.verifyProxyAccess(HiveAuthFactory.java:392)
>         ... 32 more
>
> Please give me any ideas where to dig...
>
> Regards,
> Andrey
>
> 2017-04-20 23:04 GMT+03:00 Markovich <amriv...@gmail.com>:
>
>> Hi Hive users,
>>
>> I've got a very strange problem and don't know where to go next, so I'm writing here; maybe someone could help me.
>>
>> I've got HDP 2.5 with Hive 1.2.1000.2.5.0.0-1245 and Hadoop 2.7.3.2.5.0.0-1245, with Kerberos and Ranger enabled.
>> I've installed HUE 3.11 on it, and I'm getting errors like this: *Failed to validate proxy privilege of hue for hdfs* when logging into Hue as user hdfs.
>>
>> I've already added *hadoop.proxyuser.hue.groups=** and *hadoop.proxyuser.hue.hosts=** in core-site.xml, and checked that these settings were applied:
>>
>> # hadoop org.apache.hadoop.conf.Configuration | grep hue
>> <property><name>hadoop.proxyuser.hue.groups</name><value>*</value><source>core-site.xml</source></property>
>> <property><name>hadoop.proxyuser.hue.hosts</name><value>*</value><source>core-site.xml</source></property>
>>
>> I also checked properties like *hive.server2.enable.impersonation* and *hive.server2.enable.doAs*.
>> I logged into beeline and connected to Hive using the hue ticket:
>>
>> # klist
>> Ticket cache: FILE:/tmp/krb5cc_0
>> Default principal: h...@demo.test
>>
>> Valid starting       Expires              Service principal
>> 04/20/2017 19:40:50  04/21/2017 19:40:50  krbtgt/demo.t...@demo.test
>>         renew until 04/27/2017 19:40:50
>>
>> # /usr/hdp/current/hive-client/bin/beeline --verbose
>> !connect jdbc:hive2://drm2.demo.test:10001/default;principal=hive/drm2.demo.t...@demo.test;transportMode=http;httpPath=cliservice;hive.server2.proxy.user=hue
>>
>> 0: jdbc:hive2://drm2.demo.test:10001/defau> set hive.server2.enable.impersonation;
>> Getting log thread is interrupted, since query is done!
>> +-----------------------------------------+--+
>> |                   set                   |
>> +-----------------------------------------+--+
>> | hive.server2.enable.impersonation=true  |
>> +-----------------------------------------+--+
>> 1 row selected (0.144 seconds)
>>
>> 0: jdbc:hive2://drm2.demo.test:10001/defau> set hive.server2.enable.doAs;
>> Getting log thread is interrupted, since query is done!
>> +--------------------------------+--+
>> |              set               |
>> +--------------------------------+--+
>> | hive.server2.enable.doAs=true  |
>> +--------------------------------+--+
>> 1 row selected (0.069 seconds)
>>
>> When I try to use hdfs as the proxy user through beeline, I get:
>> Connecting to jdbc:hive2://drm2.demo.test:10001/default;principal=hive/drm2.demo.t...@demo.test;transportMode=http;httpPath=cliservice;hive.server2.proxy.user=hdfs
>> Enter username for jdbc:hive2://drm2.demo.test:10001/default;principal=hive/drm2.demo.t...@demo.test;transportMode=http;httpPath=cliservice;hive.server2.proxy.user=hdfs:
>> Enter password for jdbc:hive2://drm2.demo.test:10001/default;principal=hive/drm2.demo.t...@demo.test;transportMode=http;httpPath=cliservice;hive.server2.proxy.user=hdfs:
>> Error: Failed to validate proxy privilege of hue for hdfs (state=08S01,code=0)
>> org.apache.hive.service.cli.HiveSQLException: Failed to validate proxy privilege of hue for hdfs
>> ...
>> Caused by: org.apache.hive.service.cli.HiveSQLException: Failed to validate proxy privilege of hue for hdfs
>> ...
>> Caused by: org.apache.hadoop.security.authorize.AuthorizationException: User: hue is not allowed to impersonate hdfs
>>
>> I looked in the Hadoop sources, and this error points to a problem with hadoop.proxyuser.hue.groups.
>> So for some very strange reason Hadoop does not allow user hue to impersonate hdfs or any other user.
>>
>> Where should I dig next? I'm a bit confused.
>>
>> Also, yarn, hive, hdfs and hcat can all impersonate any user, so impersonation itself is working.
>> I've also checked that the Hadoop mapping of Kerberos principals to local users is correct, and it seems to be:
>> # hadoop org.apache.hadoop.security.HadoopKerberosName h...@demo.test
>> Name: h...@demo.test to hue
>>
>> Any ideas or help are welcome. I've been stuck on this problem for 2 days already.
>>
>> Regards,
>> Markovich