[ https://issues.apache.org/jira/browse/HIVE-18831?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16418367#comment-16418367 ]

Sahil Takiar commented on HIVE-18831:
-------------------------------------

[~lirui] that's a good idea. I spent a while trying to get that to work, but I'm 
having some trouble with Kryo. I've attached what I have so far; do any errors 
stick out to you?

The custom serializer is called {{JobResultSerializer}}. I wrote a few tests 
for it in {{TestJobResultSerializer}}; right now 
{{testSerializeNonSerializableObjectWriteObject}} succeeds, but 
{{testSerializeNonSerializableObjectWriteClassAndObject}} fails. I also added an 
integration test in {{TestSparkClient}} called {{testErrorJobNotSerializable}}, 
which fails with the same error.
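
For context, the serializer follows the usual Kryo custom-serializer pattern: instead of serializing the (possibly non-serializable) error graph, it writes a string form of the error and rebuilds a plain exception on read. A simplified sketch of the idea is below; the {{JobResult}} holder and its fields here are placeholders for illustration, not the actual patch code (the real implementation is in the attached patch).

{code}
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.Serializer;
import com.esotericsoftware.kryo.io.Input;
import com.esotericsoftware.kryo.io.Output;

// Placeholder result holder; stands in for whatever carries the task error.
class JobResult {
  Throwable error;

  JobResult(Throwable error) {
    this.error = error;
  }
}

// Sketch of the serializer idea: never serialize the Throwable graph itself,
// only data Kryo can always handle (strings).
class JobResultSerializer extends Serializer<JobResult> {

  @Override
  public void write(Kryo kryo, Output output, JobResult result) {
    Throwable error = result.error;
    output.writeString(error == null ? "" : error.getClass().getName());
    output.writeString(error == null ? "" : String.valueOf(error.getMessage()));
  }

  @Override
  public JobResult read(Kryo kryo, Input input, Class<JobResult> type) {
    String errorClass = input.readString();
    String message = input.readString();
    // the original exception class may not be reconstructable on this side,
    // so wrap its description in a plain RuntimeException
    return new JobResult(new RuntimeException(errorClass + ": " + message));
  }
}
{code}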

Error:

{code}
com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 13994
        at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:137)
        at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:670)
        at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:781)
        at org.apache.hive.spark.client.TestJobResultSerializer.testSerializeNonSerializableObjectWriteClassAndObject(TestJobResultSerializer.java:79)
{code}

Any ideas on how to fix this? I tried registering the classes explicitly, but that 
didn't help. What confuses me is why 
{{testSerializeNonSerializableObjectWriteObject}} works but 
{{testSerializeNonSerializableObjectWriteClassAndObject}} fails.
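
To make the difference between the two tests concrete, here is the shape of the two call pairings, reusing the placeholder classes from the sketch above (again simplified, not the actual test code):

{code}
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.io.Input;
import com.esotericsoftware.kryo.io.Output;

public class PairingSketch {
  public static void main(String[] args) {
    Kryo kryo = new Kryo();
    kryo.register(JobResult.class, new JobResultSerializer());

    JobResult result = new JobResult(new RuntimeException("task failed"));

    // Pairing like testSerializeNonSerializableObjectWriteObject: the reader
    // names the class, so Kryo writes no class ID into the stream.
    Output out1 = new Output(4096);
    kryo.writeObject(out1, result);
    JobResult back1 = kryo.readObject(new Input(out1.toBytes()), JobResult.class);

    // Pairing like testSerializeNonSerializableObjectWriteClassAndObject:
    // writeClassAndObject prepends a registration ID, and readClassAndObject
    // reads that ID back before dispatching to the registered serializer.
    Output out2 = new Output(4096);
    kryo.writeClassAndObject(out2, result);
    JobResult back2 = (JobResult) kryo.readClassAndObject(new Input(out2.toBytes()));

    System.out.println(back1.error + " / " + back2.error);
  }
}
{code}

As far as I understand Kryo, {{readClassAndObject}} expects a registration ID at the front of the stream before it dispatches to the serializer, which is the code path the stack trace above goes through.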

> Differentiate errors that are thrown by Spark tasks
> ---------------------------------------------------
>
>                 Key: HIVE-18831
>                 URL: https://issues.apache.org/jira/browse/HIVE-18831
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Assignee: Sahil Takiar
>            Priority: Major
>         Attachments: HIVE-18831.1.patch, HIVE-18831.2.patch, 
> HIVE-18831.3.patch, HIVE-18831.4.patch, HIVE-18831.6.patch, 
> HIVE-18831.7.patch, HIVE-18831.8.WIP.patch
>
>
> We propagate exceptions from Spark task failures to the client well, but we 
> don't differentiate between errors from HS2 / RSC vs. errors thrown by 
> individual tasks.
> The main motivation is that when the client sees a propagated Spark exception, 
> it's difficult to know which part of the execution threw the exception.



