> *From:* Reynold Xin <r...@databricks.com>
> *Date:* Monday, 25 February 2019 at 00:49
> *To:* Felix Cheung <felixcheun...@hotmail.com>
> *Cc:* dev <dev@spark.apache.org>, Sean Owen <sro...@gmail.com>, André
>
> (My 2 c)
>
From: Sean Owen
Sent: Sunday, February 24, 2019 6:59 AM
To: André Mello
Cc: dev
Subject: Re: [DISCUSS][SQL][PySpark] Column name support for SQL functions
I just commented on the PR -- I personally don't think it's worth
removing support for, say, max("foo") over max(col("foo")) or
max($"foo") in Scala. We can make breaking changes in Spark 3 but this
seems like it would unnecessarily break a lot of code. The string arg
is more concise in Python and
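
For reference, a minimal PySpark sketch of the two call styles under discussion; the local SparkSession and the example column "foo" here are illustrative assumptions, not taken from the thread:

    # Minimal sketch: the string form vs. the Column form of the same aggregate.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, max as max_  # alias avoids shadowing builtins.max

    spark = SparkSession.builder.master("local[*]").getOrCreate()
    df = spark.createDataFrame([(1,), (2,), (3,)], ["foo"])

    # Both calls are accepted today and produce the same result;
    # the string argument is shorter and saves the col() import.
    df.select(max_("foo")).show()
    df.select(max_(col("foo"))).show()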