Anything?
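
From re-reading the trace, my best guess is that the driver exhausted its
retries registering with the Spark master: once the
appclient-registration-retry-thread's pool is shut down, the next scheduled
attempt is rejected with exactly this RejectedExecutionException. In case it
helps anyone reproduce, these are the settings I would double-check from the
Hive CLI (the host name below is a placeholder, not my actual config), along
with whether Hive 1.2.1 and Spark 1.5.1 are a compatible pairing at all,
since Hive 1.2.1 was built against an older Spark line:

```sql
-- Placeholder values for illustration, not a known-good configuration
SET hive.execution.engine=spark;
-- Must match the master URL shown on the Spark master web UI (default port 7077)
SET spark.master=spark://master-host:7077;
```

If spark.master points at the wrong host/port, or the Hive and Spark builds
are incompatible, the driver never registers and dies with this error.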

On Tue, Nov 24, 2015 at 11:49 PM, Dasun Hegoda <dasunheg...@gmail.com>
wrote:

> I'm using apache-hive-1.2.1-bin with spark-1.5.1-bin-hadoop2.6 on top of
> hadoop-2.7.1. Any idea how to fix this error?
>
> On Tue, Nov 24, 2015 at 11:43 PM, Mich Talebzadeh <m...@peridale.co.uk>
> wrote:
>
>> Which version of Hive are you using?
>>
>>
>>
>> Mich Talebzadeh
>>
>>
>>
>> http://talebzadehmich.wordpress.com
>>
>>
>>
>>
>>
>>
>> From: Dasun Hegoda [mailto:dasunheg...@gmail.com]
>> Sent: 24 November 2015 12:01
>> To: user@hive.apache.org
>> Subject: ERROR util.SparkUncaughtExceptionHandler: Uncaught exception in thread Thread
>>
>>
>>
>> Hi,
>>
>>
>>
>> I get the error below when I try to run Hive on Spark. Any idea how to fix
>> this?
>>
>>
>>
>>
>>
>> 15/11/24 06:33:47 ERROR util.SparkUncaughtExceptionHandler: Uncaught
>> exception in thread Thread[appclient-registration-retry-thread,5,main]
>> java.util.concurrent.RejectedExecutionException: Task
>> java.util.concurrent.FutureTask@34d6330a rejected from
>> java.util.concurrent.ThreadPoolExecutor@5ab0f09f[Running, pool size = 1,
>> active threads = 1, queued tasks = 0, completed tasks = 0]
>> at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
>> at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
>> at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
>> at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:110)
>> at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1.apply(AppClient.scala:96)
>> at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1.apply(AppClient.scala:95)
>> at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>> at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>> at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>> at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
>> at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>> at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
>> at org.apache.spark.deploy.client.AppClient$ClientEndpoint.tryRegisterAllMasters(AppClient.scala:95)
>> at org.apache.spark.deploy.client.AppClient$ClientEndpoint.org$apache$spark$deploy$client$AppClient$ClientEndpoint$$registerWithMaster(AppClient.scala:121)
>> at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2$$anonfun$run$1.apply$mcV$sp(AppClient.scala:132)
>> at org.apache.spark.util.Utils$.tryOrExit(Utils.scala:1119)
>> at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2.run(AppClient.scala:124)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> at java.lang.Thread.run(Thread.java:745)
>> 15/11/24 06:33:47 INFO storage.DiskBlockManager: Shutdown hook called
>> 15/11/24 06:33:47 INFO util.ShutdownHookManager: Shutdown hook called
>> 15/11/24 06:33:47 INFO util.ShutdownHookManager: Deleting directory
>> /tmp/spark-dabf9338-0423-4190-83ed-036e9e61c770
>> 15/11/24 06:33:47 INFO util.ShutdownHookManager: Deleting directory
>> /tmp/spark-c97cfee4-8bff-4b7d-8af2-59e9fd56965
>>
>>
>>
>> --
>>
>> Regards,
>>
>> Dasun Hegoda, Software Engineer
>> www.dasunhegoda.com | dasunheg...@gmail.com
>>
>
>
>
>



