Hi Marcelo,

 Thanks for the quick response. I understand that I can just write my own
Java classes (I will keep that as a fallback option), but to avoid code
duplication and having to track future changes by hand, I was hoping there
would be a way to use the Spark API classes directly, since it seems like
there should be.

 I registered the Scala module the same way (just in Java instead of Scala):

mapper.registerModule(new DefaultScalaModule());

However, the module does not seem to be registered or used properly. Do you
happen to know whether the above line should work from Java?
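
 For reference, here is roughly the full setup on our side (a simplified
sketch; the class name is made up, and the json would really come from a
Retrofit call against the /api/v1/ endpoints rather than a string literal):

    import com.fasterxml.jackson.databind.DeserializationFeature;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import com.fasterxml.jackson.module.scala.DefaultScalaModule;
    import org.apache.spark.status.api.v1.ApplicationInfo;

    public class SparkApiClientSketch {
        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            // Register the Scala module so Jackson knows about Seq, Option, etc.
            mapper.registerModule(new DefaultScalaModule());
            // Be lenient about fields we have not mapped.
            mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

            String json = "...";  // placeholder for the json fetched from the API
            // This is the call that fails on the attempts: Seq[ApplicationAttemptInfo] field.
            ApplicationInfo info = mapper.readValue(json, ApplicationInfo.class);
            System.out.println(info.name());
        }
    }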



On 9/8/15, 12:55 PM, "Marcelo Vanzin" <van...@cloudera.com> wrote:

>Hi Kevin,
>
>How did you try to use the Scala module? Spark has this code when
>setting up the ObjectMapper used to generate the output:
>
>  mapper.registerModule(com.fasterxml.jackson.module.scala.DefaultScalaModule)
>
>As for supporting direct serialization to Java objects, I don't think
>that was the goal of the API. The Scala API classes are public mostly
>so that API compatibility checks are performed against them. If you
>don't mind the duplication, you could write your own Java POJOs that
>mirror the Scala API, and use them to deserialize the JSON.
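>For example, something roughly like this (an untested sketch; the field names
>and types have to be kept in sync with api.scala, and you'd add whatever
>Jackson annotations you need):
>
>  // ApplicationAttemptInfo.java
>  public class ApplicationAttemptInfo {
>    public String attemptId;
>    public boolean completed;
>    // ... remaining fields mirroring api.scala
>  }
>
>  // ApplicationInfo.java
>  public class ApplicationInfo {
>    public String id;
>    public String name;
>    // Plain java.util.List instead of scala.collection.Seq.
>    public java.util.List<ApplicationAttemptInfo> attempts;
>  }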
>
>
>On Tue, Sep 8, 2015 at 12:46 PM, Kevin Chen <kc...@palantir.com> wrote:
>> Hello Spark Devs,
>>
>>  I am trying to use the new Spark API json endpoints at /api/v1/[path]
>> (added in SPARK-3454).
>>
>>  In order to minimize maintenance on our end, I would like to use
>> Retrofit/Jackson to parse the json directly into the Scala classes in
>> org/apache/spark/status/api/v1/api.scala (ApplicationInfo,
>> ApplicationAttemptInfo, etc…). However, Jackson does not seem to know how
>> to handle Scala Seqs, and will throw an error when trying to parse the
>> attempts: Seq[ApplicationAttemptInfo] field of ApplicationInfo. Our
>> codebase is in Java.
>>
>>  My questions are:
>>
>> 1. Do you have any recommendations on how to easily deserialize Scala
>> objects from json? For example, do you have any current usage examples
>> of SPARK-3454 with Java?
>> 2. Alternatively, are you committed to the json formats of /api/v1/path?
>> I would guess so, because of the ‘v1’, but wanted to confirm. If so, I
>> could deserialize the json into instances of my own Java classes instead,
>> without worrying about changing the class structure later due to changes
>> in the Spark API.
>>
>> Some further information:
>>
>> - The error I am getting with Jackson when trying to deserialize the json
>>   into ApplicationInfo is:
>>   Caused by: com.fasterxml.jackson.databind.JsonMappingException: Can not
>>   construct instance of scala.collection.Seq, problem: abstract types
>>   either need to be mapped to concrete types, have custom deserializer, or
>>   be instantiated with additional type information
>> - I tried using Jackson’s DefaultScalaModule, which seems to have support
>>   for Scala Seqs, but had no luck.
>> - Deserialization works if the Scala class does not have any Seq fields,
>>   and works if the fields are Java Lists instead of Seqs.
>>
>> Thanks very much for your help!
>> Kevin Chen
>>
>
>
>
>-- 
>Marcelo
