Hi all,

I was running HiveFromSpark in yarn-cluster mode. I got the SchemaRDD for a Hive SELECT and tried to run `collect()` on it, but the application got stuck, and I don't know what's wrong. Here is my code:
val sqlStat = s"SELECT * FROM $TABLE_NAME" val result = 
hiveContext.hql(sqlStat) // got the select's result schemaRDDval rows = 
result.collect()  // This is where the application getting stuck
It works fine in yarn-client mode.
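For reference, here is a self-contained sketch of the job (the object name, table name, and printed output are placeholders; it assumes Spark 1.x, where HiveContext.hql and SchemaRDD are the API, matching the snippet above):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveFromSpark {
  def main(args: Array[String]): Unit = {
    val TABLE_NAME = "my_table"  // placeholder table name
    val sc = new SparkContext(new SparkConf().setAppName("HiveFromSpark"))
    val hiveContext = new HiveContext(sc)

    // Run the SELECT through Hive and pull the result back to the driver.
    val result = hiveContext.hql(s"SELECT * FROM $TABLE_NAME")
    val rows = result.collect()  // hangs here in yarn-cluster mode
    rows.take(10).foreach(println)

    sc.stop()
  }
}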
Here is the log:

14/12/09 15:40:58 WARN util.AkkaUtils: Error sending message in 1 attempts
java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:176)
        at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:373)
14/12/09 15:41:31 WARN util.AkkaUtils: Error sending message in 2 attempts
java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:176)
        at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:373)
14/12/09 15:42:04 WARN util.AkkaUtils: Error sending message in 3 attempts
java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:176)
        at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:373)
14/12/09 15:42:07 WARN executor.Executor: Issue communicating with driver in heartbeater
org.apache.spark.SparkException: Error sending message [message = Heartbeat(2,[Lscala.Tuple2;@a810606,BlockManagerId(2, longzhou-hdp1.lz.dscc, 53356, 0))]
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:190)
        at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:373)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:176)
        ... 1 more
14/12/09 15:42:47 WARN util.AkkaUtils: Error sending message in 1 attempts
java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:176)
        at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:373)
14/12/09 15:42:55 ERROR executor.CoarseGrainedExecutorBackend: RECEIVED SIGNAL 15: SIGTERM
14/12/09 15:42:55 DEBUG storage.DiskBlockManager: Shutdown hook called
14/12/09 15:42:55 DEBUG ipc.Client: Stopping client
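Judging from the trace, the warnings come from the executor heartbeat thread (Executor$$anon$1.run calling AkkaUtils.askWithReply), which gives up after spark.akka.askTimeout, 30 seconds by default in Spark 1.x. I haven't verified that a longer timeout helps, but as a sketch, it could be raised on the SparkConf (the value is just an example, not a confirmed fix):

val conf = new SparkConf()
  .setAppName("HiveFromSpark")
  .set("spark.akka.askTimeout", "120")  // seconds; Spark 1.x default is 30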
Thanks.