Thank you, Michael.
I will try it out tomorrow.
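
For anyone following the thread, Michael's suggestion of passing an explicit schema as the second argument to jsonRDD could look roughly like the sketch below. This assumes a Spark 1.1/1.2-era API (the schema classes later moved to org.apache.spark.sql.types in 1.3), and the field names "id" and "tags" are placeholders, not anything from this thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}
// In Spark 1.1/1.2 the schema type classes are exposed via the sql package;
// in 1.3+ they live in org.apache.spark.sql.types instead.
import org.apache.spark.sql.{SQLContext, StructType, StructField, MapType, StringType}

val sc = new SparkContext(new SparkConf().setAppName("json-map-sketch"))
val sqlContext = new SQLContext(sc)

// Declare the "tags" field (placeholder name) as MapType up front, so
// jsonRDD does not infer a struct from whatever keys it happens to see.
val schema = StructType(Seq(
  StructField("id", StringType, nullable = true),
  StructField("tags", MapType(StringType, StringType), nullable = true)
))

val jsonRdd = sc.textFile("people.json")
val people = sqlContext.jsonRDD(jsonRdd, schema)

// With the explicit schema, the map field should be written to Parquet
// as a map rather than a struct.
people.saveAsParquetFile("people.parquet")
```

This is only a sketch against the pre-1.3 API; it needs a Spark cluster (or local mode) and real input paths to run.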

Daniel

> On Nov 19, 2014, at 21:07, Michael Armbrust <mich...@databricks.com> wrote:
> 
> You can override the schema inference by passing a schema as the second 
> argument to jsonRDD; however, that's not a super elegant solution.  We are 
> considering one option to make this easier here: 
> https://issues.apache.org/jira/browse/SPARK-4476
> 
>> On Tue, Nov 18, 2014 at 11:06 PM, Akhil Das <ak...@sigmoidanalytics.com> 
>> wrote:
>> Something like this?
>> 
>>    import com.fasterxml.jackson.databind.ObjectMapper
>>    import com.fasterxml.jackson.module.scala.DefaultScalaModule
>>    import com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper
>> 
>>    val map_rdd = json_rdd.map(json => {
>>      // Note: this builds one ObjectMapper per record; for large RDDs,
>>      // mapPartitions with a single mapper per partition is cheaper.
>>      val mapper = new ObjectMapper() with ScalaObjectMapper
>>      mapper.registerModule(DefaultScalaModule)
>>      mapper.readValue[Map[String, String]](json)
>>    })
>> 
>> Thanks
>> Best Regards
>> 
>>> On Wed, Nov 19, 2014 at 11:01 AM, Daniel Haviv <danielru...@gmail.com> 
>>> wrote:
>>> Hi,
>>> I'm loading a JSON file into an RDD and then saving that RDD as Parquet.
>>> One of the fields is a map of keys and values, but it is being translated 
>>> and stored as a struct.
>>> 
>>> How can I convert the field into a map?
>>> 
>>> 
>>> Thanks,
>>> Daniel
> 
