Found it! (with sweat on my forehead)

The job was actually running on Mesos using a Spark 1.1.0 executor.

I guess there's some incompatibility between the 1.0.2 and 1.1 versions - still quite weird.
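
In case it helps anyone else hitting this: one way to avoid the mismatch is to make the Mesos executors download the exact same Spark build the driver was compiled against, instead of relying on whatever version happens to be installed on the slaves. Roughly something like the sketch below - the master URL, app name and tarball location are placeholders, not our actual setup:

    // Minimal sketch (not the actual job): pin the Mesos executors to the
    // same Spark 1.0.2 distribution as the driver via spark.executor.uri.
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("my-streaming-job")             // placeholder app name
      .setMaster("mesos://zk://zk1:2181/mesos")   // placeholder Mesos master
      // Every executor fetches exactly this build rather than using the
      // Spark version installed on the slave.
      .set("spark.executor.uri",
           "hdfs:///frameworks/spark/spark-1.0.2-bin-hadoop2.tgz")  // placeholder path

    val sc = new SparkContext(conf)

The same thing can be done outside the code via the SPARK_EXECUTOR_URI environment variable (e.g. in spark-env.sh).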

-kr, Gerard.

On Thu, Sep 18, 2014 at 12:29 PM, Gerard Maas <gerard.m...@gmail.com> wrote:

> My Spark Streaming job (running on Spark 1.0.2) stopped working today and
> consistently throws the exception below.
> No code changed on our side, so I'm really puzzled about the cause of the
> issue. It looks like a security issue at the HDFS level. Has anybody seen
> this exception and maybe knows the root cause?
>
> 14/09/18 10:16:27 ERROR UserGroupInformation: PriviledgedActionException as:********** (auth:SIMPLE) cause:java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
> Exception in thread "main" java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1421)
>         at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:52)
>         at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:113)
>         at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:154)
>         at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
> Caused by: java.security.PrivilegedActionException: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>
>
> Any hints?
>
> -kr, Gerard.
>
