Any ideas on how to figure out what is going on when using json4s 3.2.11?
I need to use 3.2.11, but just to see whether things would work I downgraded
to 3.2.10, and things started working.
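
One idea for seeing which json4s actually wins on the classpath (a rough
sketch; any org.json4s class should do) is to log where the class was loaded
from on the driver and executors:

    // prints the jar DefaultFormats was loaded from, e.g. Spark's bundled
    // json4s-jackson 3.2.10 vs. the 3.2.11 shipped in the application jar
    println(org.json4s.DefaultFormats.getClass
      .getProtectionDomain.getCodeSource.getLocation)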


On Wed, Feb 11, 2015 at 11:45 AM, Charles Feduke <charles.fed...@gmail.com>
wrote:

> I was having a similar problem to this trying to use the Scala Jackson
> module yesterday. I tried setting `spark.files.userClassPathFirst` to true
> but I was still having problems due to the older version of Jackson that
> Spark has a dependency on. (I think it's an old org.codehaus version.)
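>
> (For what it's worth, a minimal way to set that flag from code, assuming
> the usual SparkConf API, is:
>
>     new org.apache.spark.SparkConf().set("spark.files.userClassPathFirst", "true")
>
> but as noted it didn't solve the version clash for me.)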
>
> I ended up solving my problem by using Spray JSON (
> https://github.com/spray/spray-json) which has no dependency on Jackson
> and has great control over the JSON rendering process.
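>
> A minimal sketch of the Spray JSON route (field names here are just
> illustrative):
>
>     import spray.json._
>     import DefaultJsonProtocol._
>
>     // a plain Scala map rendered to a compact JSON string, no Jackson involved
>     val json: String = Map("user_id" -> "abc", "event_type" -> "view").toJson.compactPrint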
>
> http://engineering.ooyala.com/blog/comparing-scala-json-libraries - based
> on that I looked for something that didn't rely on Jackson.
>
> Now that I see that there is some success with json4s on Spark 1.2.x I'll
> have to give that a try.
>
> On Wed Feb 11 2015 at 2:32:59 PM Jonathan Haddad <j...@jonhaddad.com>
> wrote:
>
>> Actually, yes, I was using 3.2.11.  I thought I would need the UUID
>> encoder that seems to have been added in that version, but I'm not using
>> it.  I've downgraded to 3.2.10 and it seems to work.
>>
>> I searched through the Spark repo and it looks like it's got 3.2.10 in a
>> pom.  I don't know the first thing about how dependencies are resolved but
>> I'm guessing it's related?
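>>
>> Roughly what my build.sbt looks like after the downgrade (a sketch; the
>> provided scope and exact versions are just what I happen to be using):
>>
>>     libraryDependencies ++= Seq(
>>       "org.apache.spark" %% "spark-core" % "1.2.1" % "provided",
>>       // match the 3.2.10 that Spark's pom pulls in to avoid mixing versions
>>       "org.json4s" %% "json4s-jackson" % "3.2.10"
>>     )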
>>
>> On Wed Feb 11 2015 at 11:20:42 AM Mohnish Kodnani <
>> mohnish.kodn...@gmail.com> wrote:
>>
>>> I was getting a similar error after I upgraded to Spark 1.2.1 from 1.1.1.
>>> Are you by any chance using json4s 3.2.11?
>>> I downgraded to 3.2.10 and that seemed to work, but I didn't spend much
>>> time debugging the issue beyond that.
>>>
>>>
>>>
>>> On Wed, Feb 11, 2015 at 11:13 AM, Jonathan Haddad <j...@jonhaddad.com>
>>> wrote:
>>>
>>>> I'm trying to use the json4s library in a Spark job to push data back
>>>> into Kafka.  Everything was working fine when I was hard-coding a string,
>>>> but now that I'm trying to render a string from a simple map it's failing.
>>>> The code works in the sbt console.
>>>>
>>>> working console code:
>>>> https://gist.github.com/rustyrazorblade/daa50bf05ff0d48ac6af
>>>>
>>>> failing spark job line:
>>>> https://github.com/rustyrazorblade/killranalytics/blob/master/spark/src/main/scala/RawEventProcessing.scala#L114
>>>>
>>>> exception: https://gist.github.com/rustyrazorblade/1e220d87d41cfcad2bb9
>>>>
>>>> I've seen examples of using render / compact when I searched the ML
>>>> archives, so I'm kind of at a loss here.
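>>>>
>>>> For context, the failing line is basically of this shape (simplified; the
>>>> map contents here are made up):
>>>>
>>>>     import org.json4s.JsonDSL._
>>>>     import org.json4s.jackson.JsonMethods._
>>>>
>>>>     // JsonDSL's implicit conversions turn the Map into a JObject before rendering
>>>>     val json: String = compact(render(Map("event_type" -> "view", "user_id" -> "abc")))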
>>>>
>>>> Thanks in advance for any help.
>>>>
>>>> Jon
>>>>
>>>
>>>
