[ https://issues.apache.org/jira/browse/HIVE-9934?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14365037#comment-14365037 ]

Chao commented on HIVE-9934:
----------------------------

Found this in the log: 

{noformat}
2015-03-17 04:33:32,728 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) - 2015-03-17 04:33:32,725 INFO  
[pool-1-thread-1] client.RemoteDriver (RemoteDriver.java:call(371)) - Failed to 
run job 681ccfbe-bf9f-491c-a2e7-ad513f62d1dc
2015-03-17 04:33:32,728 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) - java.util.concurrent.ExecutionException: 
Exception thrown by job
2015-03-17 04:33:32,728 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.JavaFutureActionWrapper.getImpl(FutureAction.scala:311)
2015-03-17 04:33:32,728 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.JavaFutureActionWrapper.get(FutureAction.scala:316)
2015-03-17 04:33:32,728 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:364)
2015-03-17 04:33:32,728 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:317)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
java.util.concurrent.FutureTask.run(FutureTask.java:262)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
java.lang.Thread.run(Thread.java:744)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) - Caused by: org.apache.spark.SparkException: 
Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most 
recent failure: Lost task 0.3 in stage 0.0 (TID 3, 
ip-10-182-56-7.ec2.internal): java.io.FileNotFoundException: 
http://10.182.56.7:34690/jars/hive-exec-1.2.0-SNAPSHOT.jar
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1624)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.util.Utils$.doFetchFile(Utils.scala:452)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.util.Utils$.fetchFile(Utils.scala:383)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$6.apply(Executor.scala:350)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$6.apply(Executor.scala:347)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
2015-03-17 04:33:32,729 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:347)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
java.lang.Thread.run(Thread.java:744)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) - 
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) - Driver stacktrace:
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1214)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1203)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1202)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1202)
2015-03-17 04:33:32,730 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
2015-03-17 04:33:32,731 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
2015-03-17 04:33:32,731 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
scala.Option.foreach(Option.scala:236)
2015-03-17 04:33:32,731 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:696)
2015-03-17 04:33:32,731 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1420)
2015-03-17 04:33:32,731 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
akka.actor.Actor$class.aroundReceive(Actor.scala:465)
2015-03-17 04:33:32,731 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(537)) -        at 
org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundReceive(DAGScheduler.scala:137
{noformat}

I don't think this is related to my patch; the failure is the executor being unable to fetch hive-exec-1.2.0-SNAPSHOT.jar, which looks like an environment issue.

> Vulnerability in LdapAuthenticationProviderImpl enables HiveServer2 client to 
> degrade the authentication mechanism to "none", allowing authentication 
> without password
> ----------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-9934
>                 URL: https://issues.apache.org/jira/browse/HIVE-9934
>             Project: Hive
>          Issue Type: Bug
>          Components: Security
>    Affects Versions: 1.1.0
>            Reporter: Chao
>            Assignee: Chao
>         Attachments: HIVE-9934.1.patch, HIVE-9934.2.patch, HIVE-9934.3.patch
>
>
> Vulnerability in LdapAuthenticationProviderImpl enables HiveServer2 client to 
> degrade the authentication mechanism to "none", allowing authentication 
> without password.
> See: http://docs.oracle.com/javase/jndi/tutorial/ldap/security/simple.html
> “If you supply an empty string, an empty byte/char array, or null to the 
> Context.SECURITY_CREDENTIALS environment property, then the authentication 
> mechanism will be "none". This is because the LDAP requires the password to 
> be nonempty for simple authentication. The protocol automatically converts 
> the authentication to "none" if a password is not supplied.”
>  
> Since the LdapAuthenticationProviderImpl.Authenticate method relies on a 
> NamingException being thrown during creation of the initial context, it does 
> not fail when the context result is an “unauthenticated” positive response 
> from the LDAP server. As a result, one can authenticate with HiveServer2 
> using the LdapAuthenticationProviderImpl with only a user name and an empty 
> password.
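
For illustration only (not part of the original report): below is a minimal sketch of the kind of guard that addresses the behavior described above, using a plain JNDI simple bind that rejects empty passwords before contacting the LDAP server. The class and method names are hypothetical and do not reflect Hive's actual implementation.

{code:java}
import java.util.Hashtable;

import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.directory.InitialDirContext;
import javax.security.sasl.AuthenticationException;

// Hypothetical sketch of an LDAP simple-bind authenticator.
// If the password is null or empty, JNDI silently switches the
// authentication mechanism to "none" and the bind can succeed without
// credentials, so the caller must reject empty passwords up front.
public class LdapSimpleBindSketch {

  public static void authenticate(String ldapUrl, String userDn, String password)
      throws AuthenticationException {
    // Guard against the "unauthenticated" positive response: an empty
    // password must be treated as a failed login, never sent to LDAP.
    if (password == null || password.isEmpty()) {
      throw new AuthenticationException("Empty password is not allowed");
    }

    Hashtable<String, Object> env = new Hashtable<>();
    env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
    env.put(Context.PROVIDER_URL, ldapUrl);
    env.put(Context.SECURITY_AUTHENTICATION, "simple");
    env.put(Context.SECURITY_PRINCIPAL, userDn);
    env.put(Context.SECURITY_CREDENTIALS, password);

    try {
      // A NamingException here means the bind, and thus the login, failed.
      new InitialDirContext(env).close();
    } catch (NamingException e) {
      throw new AuthenticationException("LDAP authentication failed", e);
    }
  }
}
{code}

Without the empty-password check, the InitialDirContext construction can succeed because JNDI downgrades the mechanism to "none", which is exactly the unauthenticated positive response described in the issue.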



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
