Hi Puneet,

File does not exist:
hdfs://localhost:8020/user/opc/.sparkStaging/application_1466711725829_0033/pipeline-lib-0.1.0-SNAPSHOT.jar

indicates a YARN issue: Spark staged that jar in HDFS at submit time, and YARN
is now failing to fetch it from HDFS and copy it to the node's local working
directory (/tmp here).


   1. Check that the class is actually created at compile time (via sbt or
   mvn) and that the jar file exists
   2. Check that the working directories used by YARN have the correct
   permissions
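A quick way to run through those checks is a short shell session. The
application ID, HDFS user, and jar path below are taken from the error message
above and are assumptions for your environment:

```shell
# Staging path from the error message (assumed values; substitute your own).
APP_ID=application_1466711725829_0033
STAGING=hdfs://localhost:8020/user/opc/.sparkStaging/${APP_ID}
echo "staging dir: ${STAGING}"

# 1. Was the jar actually built? (target/ is the usual sbt/mvn output dir)
if [ -f target/pipeline-lib-0.1.0-SNAPSHOT.jar ]; then
  echo "jar built"
fi

# 2. Did Spark upload it to the staging dir, and does the submitting user
#    own the staging parent? (runs only where the hdfs CLI exists)
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfs -ls "${STAGING}"
  hdfs dfs -ls -d /user/opc/.sparkStaging
fi
```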

In yarn-site.xml, check that the following parameter is set:

<property>
    <name>yarn.nodemanager.local-dirs</name>
    <value>/tmp</value>
</property>
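Whatever directory that property points at must exist and be writable by the
user running the NodeManager. A minimal check, assuming /tmp as in the snippet
above and the standard Hadoop daemon scripts:

```shell
# The local dir must be writable by the NodeManager user (often 'yarn').
ls -ld /tmp

# Restart the NodeManager so any yarn-site.xml change is picked up
# (standard Hadoop layout; adjust the path for your distribution):
#   $HADOOP_HOME/sbin/yarn-daemon.sh stop nodemanager
#   $HADOOP_HOME/sbin/yarn-daemon.sh start nodemanager
```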

HTH

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 24 June 2016 at 10:46, Jeff Zhang <zjf...@gmail.com> wrote:

> You might have multiple java servlet jars on your classpath.
>
> On Fri, Jun 24, 2016 at 3:31 PM, Mich Talebzadeh <
> mich.talebza...@gmail.com> wrote:
>
>> Can you please check the YARN log files to see what they say (both the
>> NodeManager and the ResourceManager)?
>>
>> HTH
>>
>> Dr Mich Talebzadeh
>>
>>
>>
>> LinkedIn:
>> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>>
>> On 24 June 2016 at 08:14, puneet kumar <puneetkumar.2...@gmail.com>
>> wrote:
>>
>>>
>>>
>>> I am getting the error below when I submit a Spark job using spark-submit
>>> on YARN. I need quick help on what's going wrong here.
>>>
>>> 16/06/24 01:09:25 WARN AbstractLifeCycle: FAILED 
>>> org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter-791eb5d5: 
>>> java.lang.IllegalStateException: class 
>>> org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter is not a 
>>> javax.servlet.Filter
>>> java.lang.IllegalStateException: class 
>>> org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter is not a 
>>> javax.servlet.Filter
>>>     at 
>>> org.spark-project.jetty.servlet.FilterHolder.doStart(FilterHolder.java:97)
>>>     at 
>>> org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>     at 
>>> org.spark-project.jetty.servlet.ServletHandler.initialize(ServletHandler.java:768)
>>>     at 
>>> org.spark-project.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:265)
>>>     at 
>>> org.spark-project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:717)
>>>     at 
>>> org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>     at 
>>> org.spark-project.jetty.server.handler.HandlerWrapper.doStart(HandlerWrapper.java:95)
>>>     at 
>>> org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>     at 
>>> org.spark-project.jetty.server.handler.HandlerCollection.doStart(HandlerCollection.java:229)
>>>     at 
>>> org.spark-project.jetty.server.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:172)
>>>     at 
>>> org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>     at 
>>> org.spark-project.jetty.server.handler.HandlerWrapper.doStart(HandlerWrapper.java:95)
>>>     at org.spark-project.jetty.server.Server.doStart(Server.java:282)
>>>     at 
>>> org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>>>     at 
>>> org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
>>>     at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
>>>     at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
>>>     at 
>>> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1988)
>>>     at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>>>     at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1979)
>>>     at 
>>> org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
>>>     at org.apache.spark.ui.WebUI.bind(WebUI.scala:137)
>>>     at 
>>> org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
>>>     at 
>>> org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
>>>     at scala.Option.foreach(Option.scala:236)
>>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
>>>     at 
>>> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
>>>
>>>
>>>
>>
>
>
> --
> Best Regards
>
> Jeff Zhang
>
