[ https://issues.apache.org/jira/browse/HIVE-8991?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14233993#comment-14233993 ]

Rui Li commented on HIVE-8991:
------------------------------

I looked a little more into this. It seems hive-exec is properly added to the 
class path (as the user application jar in {{SparkSubmit}}), and the class 
loader can load {{HiveIgnoreKeyTextOutputFormat}}:
{noformat}
2014-12-04 08:35:33,383 INFO  [stdout-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(384)) - [Loaded 
org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat from 
file:/home/hive/packaging/target/apache-hive-0.15.0-SNAPSHOT-bin/apache-hive-0.15.0-SNAPSHOT-bin/lib/hive-exec-0.15.0-SNAPSHOT.jar]
{noformat}
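As a side note, the same check can be done by hand from inside the remote driver. This is just a debugging sketch; using the thread context class loader here is my assumption about which loader matters:
{code:java}
// Debugging sketch: check whether the class resolves from the thread's
// context class loader (assumption: this is the loader the tasks use).
try {
  Class.forName("org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat",
      false, Thread.currentThread().getContextClassLoader());
  System.out.println("resolvable");
} catch (ClassNotFoundException e) {
  System.out.println("not resolvable: " + e);
}
{code}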
Nevertheless I still get the following error:
{noformat}
2014-12-04 08:32:26,681 INFO  [stderr-redir-1]: client.SparkClientImpl 
(SparkClientImpl.java:run(384)) - java.lang.NoClassDefFoundError: 
org/apache/hadoop/hive/ql/io/HiveIgnoreKeyTextOutputFormat
{noformat}
Besides, the exception is thrown when we try to deserialize the SparkWork in 
the job, which means {{org.apache.hadoop.hive.ql.exec.spark.KryoSerializer}} 
itself has been loaded properly.
I'll do more debugging. I'm wondering whether the error message is inaccurate.
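One failure mode that would fit the symptoms is a class loader mismatch during deserialization: Kryo resolves classes through the loader it is configured with, which may differ from the loader that can see hive-exec. A minimal illustration (hypothetical; this helper and its parameters are made up for the example, not Hive code):
{code:java}
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.io.Input;

public class KryoLoaderDemo {
  // Hypothetical helper: Kryo resolves classes through 'loader'. If 'loader'
  // cannot see hive-exec, deserialization fails even though another loader
  // in the same JVM has already loaded the class.
  static Object deserialize(byte[] bytes, ClassLoader loader) {
    Kryo kryo = new Kryo();
    kryo.setClassLoader(loader);
    return kryo.readClassAndObject(new Input(bytes));
  }
}
{code}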

As for the {{SparkSubmitDriverBootstrapper}} hanging issue, it's because it 
calls {{System.exit}} in a shutdown hook, which causes a deadlock. It's been 
fixed in the latest branch.
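The deadlock is easy to reproduce in isolation (a standalone sketch, not the actual bootstrapper code): {{Runtime.exit}} waits for all shutdown hooks to finish, so a hook that itself calls {{System.exit}} blocks on the exit sequence already in progress and never finishes.
{code:java}
public class ExitInHookDemo {
  public static void main(String[] args) {
    // A shutdown hook that calls System.exit: the call blocks forever,
    // because an exit sequence is already running and holds the JVM's
    // shutdown lock, while that sequence in turn waits for this hook.
    Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
      public void run() {
        System.exit(1); // deadlocks here
      }
    }));
    System.exit(0); // triggers the hook above; the JVM hangs
  }
}
{code}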

> Fix custom_input_output_format [Spark Branch]
> ---------------------------------------------
>
>                 Key: HIVE-8991
>                 URL: https://issues.apache.org/jira/browse/HIVE-8991
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Rui Li
>            Assignee: Rui Li
>         Attachments: HIVE-8991.1-spark.patch
>
>
> After HIVE-8836, {{custom_input_output_format}} fails because hive-it-util 
> is missing from the remote driver's class path.


