Disregard my last question - my mistake.
I accessed it as a column, not as a row:
jsonData.first.getAs[String]("cty")
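
For reference, a minimal sketch of the difference (assuming the Spark 1.x-era sqlContext and sample path from the thread below; names like firstCty and greet are illustrative):

```scala
// jsonData("cty") returns a Column expression (used to build queries),
// not the data itself. To pull actual values back to the driver,
// collect rows and read fields with getAs:

val jsonData = sqlContext.read.json("/home/eranw/Workspace/JSON/sample")

// First row's "cty" value as a String:
val firstCty: String = jsonData.first.getAs[String]("cty")

// All "cty" values as a local Scala collection:
val allCty: Array[String] =
  jsonData.select("cty").collect().map(_.getAs[String]("cty"))

// The value can then be passed to an ordinary function:
def greet(country: String): String = s"Hello from $country"
println(greet(firstCty))
```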

Eran

On Sun, Dec 20, 2015 at 11:42 AM Eran Witkon <[email protected]> wrote:

> Thanks, that works.
> One other thing -
> I have the following code:
>
> val jsonData = sqlContext.read.json("/home/eranw/Workspace/JSON/sample")
>
> jsonData.show()
> +--------------+----------------+---------------+---------+
> |           cty|             hse|             nm|      yrs|
> +--------------+----------------+---------------+---------+
> |United Kingdom|House of Denmark|           Cnut|1016-1035|
> |United Kingdom| House of Wessex|Edmund Ironside|     1016|
> +--------------+----------------+---------------+---------+
>
> *But when I want to access one of the fields using:*
>
> *jsonData("cty")*
> *I get the name of the field, not the value:*
> res22: org.apache.spark.sql.Column = cty
> The same goes for
> println(cty.toString)
>
> *How do I access the content of the column and pass it as an argument to a
> function?*
>
> *Eran*
>
> On Sun, Dec 20, 2015 at 10:03 AM Alexander Pivovarov <[email protected]>
> wrote:
>
>> Just point the loader to the folder. You do not need the *
>> On Dec 19, 2015 11:21 PM, "Eran Witkon" <[email protected]> wrote:
>>
>>> Hi,
>>> Can I combine multiple JSON files to one DataFrame?
>>>
>>> I tried
>>> val df = sqlContext.read.json("/home/eranw/Workspace/JSON/sample/*")
>>> but I get an empty DF
>>> Eran
>>>
>>
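
On the multiple-files point, the JSON reader accepts a directory path and reads every file in it into one DataFrame, so no wildcard is needed. A glob also works if it matches the files themselves rather than directories. A sketch, assuming the same sample directory as above:

```scala
// Point the reader at the folder itself; all JSON files inside
// are combined into a single DataFrame:
val df = sqlContext.read.json("/home/eranw/Workspace/JSON/sample")

// A glob works too, as long as it matches the JSON files directly:
val df2 = sqlContext.read.json("/home/eranw/Workspace/JSON/sample/*.json")
```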
