I think having an automatic getter/setter on the spark.conf object seems
reasonable to me.
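
Just to illustrate (not necessarily how the PR wires it up), here is a
minimal sketch of what mapping-style access could look like if it simply
delegates to the existing RuntimeConfig get/set/unset methods; the
DictLikeConf name below is purely hypothetical:

    class DictLikeConf:
        # Hypothetical wrapper exposing spark.conf through the dict protocol.
        def __init__(self, conf):
            self._conf = conf  # the underlying spark.conf (RuntimeConfig)

        def __getitem__(self, key):
            return self._conf.get(key)

        def __setitem__(self, key, value):
            self._conf.set(key, value)

        def __delitem__(self, key):
            self._conf.unset(key)

    # Usage, e.g.:
    #   conf = DictLikeConf(spark.conf)
    #   conf["spark.sql.shuffle.partitions"] = "10"
    #   conf["spark.sql.shuffle.partitions"]   # '10'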

On Thu, Dec 26, 2024 at 9:32 PM Reynold Xin <r...@databricks.com.invalid>
wrote:

> I actually think this might be confusing (just in general adding too many
> different ways to do the same thing is also un-Pythonic).
>
> On Thu, Dec 26, 2024 at 4:58 PM Hyukjin Kwon <gurwls...@apache.org> wrote:
>
>> Hi all,
>>
>> I hope you are all enjoying the holiday season. I just wanted to get
>> some quick feedback on this PR:
>> https://github.com/apache/spark/pull/49297
>>
>> This PR allows you to set/unset SQL configurations in a Pythonic way, e.g.,
>>
>> >>> spark.conf["spark.sql.optimizer.runtime.rowLevelOperationGroupFilter.enabled"] = "false"
>> >>> spark.conf["spark.sql.optimizer.runtime.rowLevelOperationGroupFilter.enabled"]
>> 'false'
>>
>> as pandas supports a similar style of configuration access (
>> https://pandas.pydata.org/docs/user_guide/options.html)
>>
>> Any feedback on this approach would be appreciated.
>>
>> Thanks!
>>
>
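
For comparison, the pandas options page linked above already exposes both a
functional and an attribute style for the same settings (a rough example,
assuming current pandas behaviour):

    import pandas as pd

    pd.set_option("display.max_rows", 100)    # functional style
    pd.options.display.max_rows = 100         # attribute style
    pd.get_option("display.max_rows")         # 100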

-- 
Twitter: https://twitter.com/holdenkarau
Fight Health Insurance: https://www.fighthealthinsurance.com/
Books (Learning Spark, High Performance Spark, etc.):
https://amzn.to/2MaRAG9
YouTube Live Streams: https://www.youtube.com/user/holdenkarau
Pronouns: she/her
