Well, it's a Databricks question, so it may be better asked in their forum.

You can set cluster-level Spark parameters when you create a new cluster, or add
them later. Go to the cluster page, open a cluster, expand the additional
configuration section, and add your parameters there as key-value pairs
separated by a space. Note that cluster-sizing properties (number of workers,
executor memory, etc.) have to be set this way before the cluster starts;
spark.conf.set in a notebook only affects configurations that are modifiable at
runtime.
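For example, the Spark config box accepts one property per line, with the key
and value separated by a space (the values below are just illustrative, not
recommendations):

```
spark.sql.shuffle.partitions 16
spark.executor.memory 8g
```

These are applied when the cluster starts, so a restart is needed after
changing them.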

On Thu, 16 May 2019 at 11:46 am, Rishi Shah <rishishah.s...@gmail.com>
wrote:

> Hi All,
>
> Any idea?
>
> Thanks,
> -Rishi
>
> On Tue, May 14, 2019 at 11:52 PM Rishi Shah <rishishah.s...@gmail.com>
> wrote:
>
>> Hi All,
>>
>> How can we set spark conf parameters in a databricks notebook? My cluster
>> doesn't take into account any spark.conf.set properties... it creates 8
>> worker nodes (i.e. executors) but doesn't honor the supplied conf
>> parameters. Any idea?
>>
>> --
>> Regards,
>>
>> Rishi Shah
>>
>
>
> --
> Regards,
>
> Rishi Shah
>
-- 
Best Regards,
Ayan Guha