Internally, Spark uses json4s with the Jackson parser, at version 3.2.10
AFAIK, so if you are using Scala it should be available without adding any
dependencies. Version 3.2.11 is already out, but adding it to my app caused
a NoSuchMethod exception, so I would have to shade it; I'm simply staying
on 3.2.10 for now.
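
For what it's worth, here is a minimal sketch of picking a nested field out
of a JSON string with the json4s that ships with Spark; the field names, the
sample document, and the Address case class are just made up for
illustration:

import org.json4s._
import org.json4s.jackson.JsonMethods._

// Hypothetical shape of the nested part of the document
case class Address(city: String, zip: String)

object NestedJsonExample {
  // json4s needs an implicit Formats in scope for extract[]
  implicit val formats = DefaultFormats

  def main(args: Array[String]): Unit = {
    val raw = """{"name":"alice","address":{"city":"Prague","zip":"11000"}}"""
    val json = parse(raw)

    // Drill into the nested object with the \ operator ...
    val city = (json \ "address" \ "city").extract[String]

    // ... or map the whole subtree onto a case class
    val addr = (json \ "address").extract[Address]

    println(s"$city ${addr.zip}")
  }
}

Inside a Spark job the same parse/extract would typically run in a map over
an RDD[String]; Spark SQL's sqlContext.read.json can also handle nested
fields directly, if that is an option for you.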

Regards,
Petr

On Sat, Sep 19, 2015 at 2:45 AM, Ted Yu <yuzhih...@gmail.com> wrote:

> For #1, see this thread: http://search-hadoop.com/m/q3RTti0Thneenne2
>
> For #2, also see:
> examples/src/main/python/hbase_inputformat.py
> examples/src/main/python/hbase_outputformat.py
>
> Cheers
>
> On Fri, Sep 18, 2015 at 5:12 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> For #2, please see:
>>
>> examples/src/main/scala/org/apache/spark/examples/HBaseTest.scala
>>
>> examples/src/main/scala/org/apache/spark/examples/pythonconverters/HBaseConverters.scala
>>
>> In HBase, there is an hbase-spark module which is being polished. It should
>> be available in the HBase 1.3.0 release.
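>>
>> If it helps, here is roughly what those examples boil down to in Scala: a
>> minimal sketch (using the plain TableOutputFormat path, not the hbase-spark
>> API) that writes an RDD as Puts into HBase. The table name 'test', column
>> family 'cf', qualifier 'col', and the sample data are assumptions:
>>
>> import org.apache.hadoop.hbase.HBaseConfiguration
>> import org.apache.hadoop.hbase.client.Put
>> import org.apache.hadoop.hbase.io.ImmutableBytesWritable
>> import org.apache.hadoop.hbase.mapreduce.TableOutputFormat
>> import org.apache.hadoop.hbase.util.Bytes
>> import org.apache.hadoop.mapreduce.Job
>> import org.apache.spark.{SparkConf, SparkContext}
>>
>> object HBaseWriteSketch {
>>   def main(args: Array[String]): Unit = {
>>     val sc = new SparkContext(new SparkConf().setAppName("HBaseWriteSketch"))
>>
>>     // Assumes table 'test' with column family 'cf' already exists
>>     val hbaseConf = HBaseConfiguration.create()
>>     hbaseConf.set(TableOutputFormat.OUTPUT_TABLE, "test")
>>     val job = Job.getInstance(hbaseConf)
>>     job.setOutputFormatClass(classOf[TableOutputFormat[ImmutableBytesWritable]])
>>
>>     // Turn each (rowKey, value) pair into an HBase Put
>>     val puts = sc.parallelize(Seq(("row1", "v1"), ("row2", "v2"))).map {
>>       case (k, v) =>
>>         val put = new Put(Bytes.toBytes(k))
>>         put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(v))
>>         (new ImmutableBytesWritable(Bytes.toBytes(k)), put)
>>     }
>>
>>     puts.saveAsNewAPIHadoopDataset(job.getConfiguration)
>>     sc.stop()
>>   }
>> }
>>
>> To save Spark SQL output the same way, map df.rdd to those
>> (ImmutableBytesWritable, Put) pairs first.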
>>
>> Cheers
>>
>> On Fri, Sep 18, 2015 at 5:09 PM, Cui Lin <icecreamlc...@gmail.com> wrote:
>>
>>> Hello, All,
>>>
>>> Parsing a nested JSON structure is easy using the Java or Python API.
>>> Where can I find a similar way to parse a JSON file using Spark?
>>>
>>> Another question: using Spark SQL, how can I easily save the results
>>> into a NoSQL DB? Any examples? Thanks a lot!
>>>
>>>
>>>
>>> --
>>> Best regards!
>>>
>>> Lin,Cui
>>>
>>
>>
>
