You can try this (replace spark with whatever variable your SparkSession
is assigned to): spark.conf.set("spark.sql.jsonGenerator.ignoreNullFields", False)

On Tue, Oct 4, 2022 at 4:55 PM Karthick Nk <kcekarth...@gmail.com> wrote:

> Thanks.
> I am using PySpark in Databricks. I have looked through multiple references
> but couldn't find the exact snippet. Could you share a sample snippet
> showing how to set that property?
>
> My step:
> df = df.selectExpr('to_json(struct(*)) as json_data')
>
> On Tue, Oct 4, 2022 at 10:57 AM Yeachan Park <yeachan...@gmail.com> wrote:
>
>> Hi,
>>
>> There's a config option for this. Try setting it to false in your Spark
>> conf.
>>
>> spark.sql.jsonGenerator.ignoreNullFields
>>
>> On Tuesday, October 4, 2022, Karthick Nk <kcekarth...@gmail.com> wrote:
>>
>>> Hi all,
>>>
>>> I need to convert a PySpark dataframe into JSON.
>>>
>>> While converting, if all row values are null/None for a particular
>>> column, that column gets removed from the output.
>>>
>>> Could you suggest a way to avoid this? I need to convert the dataframe
>>> into JSON with all columns present.
>>>
>>> Thanks
>>>
>>