Thank you Mich Talebzadeh and Sofia Panagiotidi. I changed my Spark
version to 1.4.1, and now everything works.
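For anyone hitting the same issue, below is a minimal hive-site.xml sketch for pointing Hive at a standalone Spark cluster. The property names (hive.execution.engine, spark.master, spark.executor.memory) come from the Hive on Spark setup guide; the master URL and memory value are illustrative placeholders, not taken from this thread:

```xml
<!-- Minimal Hive on Spark settings (values are illustrative placeholders) -->
<property>
  <name>hive.execution.engine</name>
  <value>spark</value>
</property>
<property>
  <!-- URL of the standalone Spark master; replace host/port with your own -->
  <name>spark.master</name>
  <value>spark://master-host:7077</value>
</property>
<property>
  <name>spark.executor.memory</name>
  <value>1g</value>
</property>
```

Note that Spark must be built without the -Phive profile for Hive on Spark, as the original poster did, and the Hive and Spark versions must be a compatible pairing.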

2016-01-26 16:45 GMT+08:00 kevin <kiss.kevin...@gmail.com>:

> hi, all
>    I tried Hive on Spark with Hive 1.2.1 and Spark 1.5.2. I built Spark
> without -Phive, and I tested the standalone Spark cluster with spark-submit;
> that works fine.
>    But when I use Hive, I can see the Hive on Spark application on the
> Spark web UI; eventually I get this error:
>
>
> 16/01/26 16:23:42 INFO slf4j.Slf4jLogger: Slf4jLogger started
> 16/01/26 16:23:42 INFO Remoting: Starting remoting
> 16/01/26 16:23:42 INFO Remoting: Remoting started; listening on addresses 
> :[akka.tcp://driverPropsFetcher@10.1.3.116:42307]
> 16/01/26 16:23:42 INFO util.Utils: Successfully started service 
> 'driverPropsFetcher' on port 42307.
> Exception in thread "main" akka.actor.ActorNotFound: Actor not found for: 
> ActorSelection[Anchor(akka.tcp://sparkDriver@10.1.3.107:34725/), 
> Path(/user/CoarseGrainedScheduler)]
>       at 
> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
>       at 
> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
>       at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
>       at 
> akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
>       at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:73)
>       at 
> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
>       at 
> akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:120)
>       at 
> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
>       at 
> scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
>       at 
> scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
>       at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:266)
>       at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:89)
>       at 
> akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:935)
>       at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
>       at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:411)
>       at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
>       at akka.actor.ActorCell.invoke(ActorCell.scala:487)
>       at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
>       at akka.dispatch.Mailbox.run(Mailbox.scala:220)
>       at 
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
>       at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>       at 
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>       at 
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>       at 
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>
>
>
> *Could anyone tell me whether this is caused by mismatched Hive and Spark 
> versions? If so, which versions are compatible, or is there some other cause? *
>
>
