You can probably do that in Spark's conf too:

spark.hadoop.yarn.timeline-service.enabled=false
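
For example (a sketch, assuming the stock launch scripts; spark.hadoop.* properties are simply forwarded into the Hadoop configuration):

  # pass it when starting the shell
  ./bin/spark-shell --master yarn \
    --conf spark.hadoop.yarn.timeline-service.enabled=false

  # or set it once in conf/spark-defaults.conf
  spark.hadoop.yarn.timeline-service.enabled  false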

On Thu, Jul 28, 2016 at 5:13 PM, Jeff Zhang <zjf...@gmail.com> wrote:
> One workaround is to disable the timeline service in yarn-site.xml:
>
> set yarn.timeline-service.enabled to false in yarn-site.xml
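>
> For example, a minimal yarn-site.xml entry (a sketch; the property name is the one shown in the stack trace's TimelineClient path):
>
>   <property>
>     <name>yarn.timeline-service.enabled</name>
>     <value>false</value>
>   </property>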
>
> On Thu, Jul 28, 2016 at 5:31 PM, censj <ce...@lotuseed.com> wrote:
>>
>> 16/07/28 17:07:34 WARN shortcircuit.DomainSocketFactory: The short-circuit
>> local reads feature cannot be used because libhadoop cannot be loaded.
>> java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
>>   at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:45)
>>   at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:163)
>>   at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>>   at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:150)
>>   at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
>>   at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
>>   at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
>>   at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
>>   at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
>>   at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
>>   at scala.Option.getOrElse(Option.scala:121)
>>   at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
>>   at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
>>   ... 47 elided
>> Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
>>   at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>   ... 60 more
>> <console>:14: error: not found: value spark
>>        import spark.implicits._
>>               ^
>> <console>:14: error: not found: value spark
>>        import spark.sql
>>               ^
>> Welcome to
>> [Spark ASCII-art welcome banner elided]
>>
>> Hi:
>> I use Spark 2.0, but when I run
>> "/etc/spark-2.0.0-bin-hadoop2.6/bin/spark-shell --master yarn", this error
>> appears.
>>
>> /etc/spark-2.0.0-bin-hadoop2.6/bin/spark-submit
>> export YARN_CONF_DIR=/etc/hadoop/conf
>> export HADOOP_CONF_DIR=/etc/hadoop/conf
>> export SPARK_HOME=/etc/spark-2.0.0-bin-hadoop2.6
>>
>>
>> How do I fix this?
>>
>>
>> ===============================
>> Name: cen sujun
>> Mobile: 13067874572
>> Mail: ce...@lotuseed.com
>>
>
>
>
> --
> Best Regards
>
> Jeff Zhang



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
