Most likely your Shark server has not started.
Are you connecting to a cluster or running in local mode?
What is the lowest (root-cause) error on the stack?
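
A quick way to rule out cluster connectivity is to force local mode first (this assumes the standard Shark 0.9.x tarball layout; the file and variable names below come from the usual Shark setup conventions, so treat them as assumptions for your install):

```shell
# conf/shark-env.sh -- run Shark against an in-process local Spark,
# bypassing any cluster master. If the CLI then starts and shows the
# "shark>" prompt, the problem is reaching the cluster, not Shark itself.
export MASTER=local
```

If local mode works, point MASTER back at your cluster (e.g. a spark://host:port URL) and verify the Spark master process is actually up and reachable from this machine.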

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Mon, May 12, 2014 at 2:07 PM, Sophia <sln-1...@163.com> wrote:

> When I run the Shark command line, it comes up like this, and I never see
> the "shark>" prompt. What should I do? The log:
> -------------------------------------
> Starting the Shark Command Line Client
> 14/05/12 16:32:49 WARN conf.Configuration: mapred.max.split.size is
> deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
> 14/05/12 16:32:49 WARN conf.Configuration: mapred.min.split.size is
> deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
> 14/05/12 16:32:49 WARN conf.Configuration: mapred.min.split.size.per.rack is
> deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack
> 14/05/12 16:32:49 WARN conf.Configuration: mapred.min.split.size.per.node is
> deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node
> 14/05/12 16:32:49 WARN conf.Configuration: mapred.reduce.tasks is
> deprecated. Instead, use mapreduce.job.reduces
> 14/05/12 16:32:49 WARN conf.Configuration:
> mapred.reduce.tasks.speculative.execution is deprecated. Instead, use
> mapreduce.reduce.speculative
> 14/05/12 16:32:49 WARN conf.Configuration:
> org.apache.hadoop.hive.conf.LoopingByteArrayInputStream@51f782b8:an attempt
> to override final parameter:
> mapreduce.job.end-notification.max.retry.interval;  Ignoring.
> 14/05/12 16:32:49 WARN conf.Configuration:
> org.apache.hadoop.hive.conf.LoopingByteArrayInputStream@51f782b8:an attempt
> to override final parameter: mapreduce.job.end-notification.max.attempts;
> Ignoring.
>
> Logging initialized using configuration in
>
> jar:file:/root/shark-0.9.1-bin-hadoop2/lib_managed/jars/edu.berkeley.cs.shark/hive-common/hive-common-0.11.0-shark-0.9.1.jar!/hive-log4j.properties
> Hive history
> file=/tmp/root/hive_job_log_root_8413@CHBM220_201405121632_457581193.txt
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
>
> [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
>
> [jar:file:/root/shark-0.9.1-bin-hadoop2/lib_managed/jars/org.slf4j/slf4j-log4j12/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> 1.363: [GC 131072K->11100K(502464K), 0.0087470 secs]
> [ERROR] [05/12/2014 16:33:00.461] [main] [Remoting] Remoting error:
> [Startup timed out] [
> akka.remote.RemoteTransportException: Startup timed out
>         at
> akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
>         at akka.remote.Remoting.start(Remoting.scala:191)
>         at
> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>         at
> akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
>         at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
>         at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>         at
> org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
>         at shark.SharkContext.<init>(SharkContext.scala:42)
>         at shark.SharkContext.<init>(SharkContext.scala:61)
>         at shark.SharkEnv$.initWithSharkContext(SharkEnv.scala:78)
>         at shark.SharkEnv$.init(SharkEnv.scala:38)
>         at shark.SharkCliDriver.<init>(SharkCliDriver.scala:278)
>         at shark.SharkCliDriver$.main(SharkCliDriver.scala:162)
>         at shark.SharkCliDriver.main(SharkCliDriver.scala)
> Caused by: java.util.concurrent.TimeoutException: Futures timed out after
> [10000 milliseconds]
>         at
> scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>         at
> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>         at
> scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>         at
>
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>         at scala.concurrent.Await$.result(package.scala:107)
>         at akka.remote.Remoting.start(Remoting.scala:173)
>         ... 16 more
> ]
> Exception in thread "main" java.util.concurrent.TimeoutException: Futures
> timed out after [10000 milliseconds]
>         at
> scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>         at
> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>         at
> scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>         at
>
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>         at scala.concurrent.Await$.result(package.scala:107)
>         at akka.remote.Remoting.start(Remoting.scala:173)
>         at
> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>         at
> akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
>         at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
>         at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>         at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>         at
> org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
>         at shark.SharkContext.<init>(SharkContext.scala:42)
>         at shark.SharkContext.<init>(SharkContext.scala:61)
>         at shark.SharkEnv$.initWithSharkContext(SharkEnv.scala:78)
>         at shark.SharkEnv$.init(SharkEnv.scala:38)
>         at shark.SharkCliDriver.<init>(SharkCliDriver.scala:278)
>         at shark.SharkCliDriver$.main(SharkCliDriver.scala:162)
>         at shark.SharkCliDriver.main(SharkCliDriver.scala)
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-run-shark-tp5581.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
