This is an important feature which can unblock several other projects,
including bucket join support for DataSource v2, complete support for
enforcing DataSource v2 distribution requirements on the write path, etc. I
like Ryan's proposals, which look simple and elegant, with nice support for
function
+1 for Liang-chi's.
Thanks Ryan and Wenchen for leading this.
On Sat, Feb 13, 2021 at 12:18 PM, Liang-Chi Hsieh wrote:
> Basically, the proposal makes sense to me and I'd like to support the
> SPIP, as it looks like we have a strong need for this important feature.
>
> Thanks Ryan for working on
Basically, the proposal makes sense to me and I'd like to support the SPIP,
as it looks like we have a strong need for this important feature.
Thanks Ryan for working on this, and I also look forward to Wenchen's
implementation. Thanks for the discussion too.
Actually I think the SupportsInv
I have received a ping about a new blocker: a regression on a temporary
function in CTE - it worked before but is now broken
(https://github.com/apache/spark/pull/31550). Thank you, @Peter Toth.
I tend to treat this as a legitimate blocker. I will cut another RC right
after this fix if we're all good.
+1.
On Sat, Feb 13, 2021 at 10:38 AM Takeshi Yamamuro wrote:
> +1, too. Thanks, Dongjoon!
>
> On 2021/02/13 at 11:07, Xiao Li wrote:
>
>
> +1
>
> Happy Lunar New Year!
>
> Xiao
>
> On Fri, Feb 12, 2021 at 5:33 PM Hyukjin Kwon wrote:
>
>> Yeah, +1 too
>>
>> On Sat, Feb 13, 2021 at 4:49 AM, Dongjoon Hyun wrote:
+1, too. Thanks, Dongjoon!
> On 2021/02/13 at 11:07, Xiao Li wrote:
>
>
> +1
>
> Happy Lunar New Year!
>
> Xiao
>
>> On Fri, Feb 12, 2021 at 5:33 PM Hyukjin Kwon wrote:
>> Yeah, +1 too
>>
>> On Sat, Feb 13, 2021 at 4:49 AM, Dongjoon Hyun wrote:
>>> Thank you, Sean!
>>>
On Fri, Feb 12, 2021 at 11:4
Excited to see our Spark community rallying behind this important feature!
The proposal lays a solid foundation with a minimal feature set and careful
consideration for future optimizations and extensions. Can't wait to see it
lead to more advanced functionality like views with shared custom
functions.
+1
Happy Lunar New Year!
Xiao
On Fri, Feb 12, 2021 at 5:33 PM Hyukjin Kwon wrote:
> Yeah, +1 too
>
> On Sat, Feb 13, 2021 at 4:49 AM, Dongjoon Hyun wrote:
>
>> Thank you, Sean!
>>
>> On Fri, Feb 12, 2021 at 11:41 AM Sean Owen wrote:
>>
>>> Sounds like a fine time to me, sure.
>>>
>>> On Fri, Feb 12,
Yeah, +1 too
On Sat, Feb 13, 2021 at 4:49 AM, Dongjoon Hyun wrote:
> Thank you, Sean!
>
> On Fri, Feb 12, 2021 at 11:41 AM Sean Owen wrote:
>
>> Sounds like a fine time to me, sure.
>>
>> On Fri, Feb 12, 2021 at 1:39 PM Dongjoon Hyun wrote:
>>
>>> Hi, All.
>>>
>>> As of today, `branch-3.0` has 307
I think this proposal is a very good thing, giving Spark a standard way of
looking up and calling UDFs.
I like having ScalarFunction as the API to call the UDFs. It is simple,
yet covers all of the polymorphic type cases well. I think it would also
simplify using the functions in other contexts.
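For concreteness, here is a minimal Scala sketch of what a scalar UDF could look like under the proposed UnboundFunction/BoundFunction/ScalarFunction split. The package, interface, and method names follow the SPIP draft, so they are assumptions about the eventual API rather than code in Spark today, and `IntegerAdd`/`LongAdd` are purely illustrative names:

```
// Sketch only: interfaces as proposed in the FunctionCatalog SPIP draft.
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.connector.catalog.functions.{BoundFunction, ScalarFunction, UnboundFunction}
import org.apache.spark.sql.types.{DataType, LongType, StructType}

// Unbound form: validates the input schema and returns a typed implementation.
object IntegerAdd extends UnboundFunction {
  override def name(): String = "iadd"
  override def description(): String = "iadd(a, b): adds two bigint columns"

  override def bind(inputType: StructType): BoundFunction = inputType.fields match {
    case Array(a, b) if a.dataType == LongType && b.dataType == LongType => LongAdd
    case _ => throw new UnsupportedOperationException("iadd expects (bigint, bigint)")
  }
}

// Bound form: fixed input/result types, evaluated row by row.
object LongAdd extends ScalarFunction[java.lang.Long] {
  override def name(): String = "iadd"
  override def inputTypes(): Array[DataType] = Array(LongType, LongType)
  override def resultType(): DataType = LongType

  override def produceResult(input: InternalRow): java.lang.Long =
    input.getLong(0) + input.getLong(1)
}
```

Resolution of a call like `iadd(a, b)` would presumably bind `IntegerAdd` against the children's schema and then evaluate `produceResult` per row; the SupportsInvoke-style magic method mentioned earlier in the thread could then be layered on top to avoid the per-row boxing.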
I agree that there is a strong need for a FunctionCatalog within Spark to
provide support for shareable UDFs, as well as to move towards more advanced
functionality like views which themselves depend on UDFs, so I support this
SPIP wholeheartedly.
I find both of the proposed UDF APIs to be
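To make the catalog half concrete as well, a toy FunctionCatalog might look like the sketch below. The in-memory map and the `register` helper are hypothetical; only the `loadFunction`/`listFunctions` shape is taken from the proposal:

```
// Sketch only: a toy in-memory FunctionCatalog following the proposed shape.
import scala.collection.JavaConverters._
import org.apache.spark.sql.connector.catalog.Identifier
import org.apache.spark.sql.connector.catalog.functions.{FunctionCatalog, UnboundFunction}
import org.apache.spark.sql.util.CaseInsensitiveStringMap

class InMemoryFunctionCatalog extends FunctionCatalog {
  private var catalogName: String = _
  private val functions =
    new java.util.concurrent.ConcurrentHashMap[Identifier, UnboundFunction]()

  override def initialize(name: String, options: CaseInsensitiveStringMap): Unit = {
    catalogName = name
  }

  override def name(): String = catalogName

  // Hypothetical helper for wiring up functions in tests; not part of the proposal.
  def register(ident: Identifier, fn: UnboundFunction): Unit = functions.put(ident, fn)

  override def listFunctions(namespace: Array[String]): Array[Identifier] =
    functions.keySet().asScala.filter(_.namespace().sameElements(namespace)).toArray

  override def loadFunction(ident: Identifier): UnboundFunction = {
    val fn = functions.get(ident)
    if (fn == null) throw new NoSuchElementException(s"Function not found: $ident")
    fn
  }
}
```

A query such as `SELECT my_catalog.ns.iadd(x, y)` would then presumably resolve by loading the unbound function from the catalog and binding it against the argument types.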
Thank you, Sean!
On Fri, Feb 12, 2021 at 11:41 AM Sean Owen wrote:
> Sounds like a fine time to me, sure.
>
> On Fri, Feb 12, 2021 at 1:39 PM Dongjoon Hyun wrote:
>
>> Hi, All.
>>
>> As of today, `branch-3.0` has 307 patches (including 25 correctness
>> patches) since v3.0.1 tag (released on
Sounds like a fine time to me, sure.
On Fri, Feb 12, 2021 at 1:39 PM Dongjoon Hyun wrote:
> Hi, All.
>
> As of today, `branch-3.0` has 307 patches (including 25 correctness
> patches) since v3.0.1 tag (released on September 8th, 2020).
>
> Since we stabilized branch-3.0 during 3.1.x preparation
Hi, All.
As of today, `branch-3.0` has 307 patches (including 25 correctness
patches) since v3.0.1 tag (released on September 8th, 2020).
Since we have stabilized branch-3.0 during the 3.1.x preparation,
it would be great to start the Apache Spark 3.0.2 release next week.
And I'd like to volunteer
Managed to improve the site building a bit more: with a Gemfile we can pin
Jekyll to an exact version. For this we just have to call Jekyll via `bundle
exec jekyll`.
The PR [1] is open.
[1] https://github.com/apache/spark-website/pull/303
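For anyone trying this locally, the workflow is roughly the following; the Jekyll version below is only a placeholder, not necessarily the one the PR pins:

```
# Sketch of the pinned-Jekyll workflow (the version number is a placeholder).
cat > Gemfile <<'EOF'
source "https://rubygems.org"
gem "jekyll", "4.2.0"
EOF

gem install bundler        # one-time setup
bundle install             # installs the exact Jekyll version from the Gemfile
bundle exec jekyll build   # builds the site with the pinned version
```

With the version pinned this way, everyone regenerates the site with the same Jekyll, so the generated HTML diffs stay small.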
Sure, I will do that, too.
Seems fine to me. How about just regenerating the whole site once with the
latest version and requiring that?
On Fri, Feb 12, 2021 at 7:09 AM attilapiros wrote:
> I ran into the same problem today and tried to find the version where the
> diff is minimal, so I wrote a script:
>
> ```
> #!/bin/zsh
I ran into the same problem today and tried to find the version where the
diff is minimal, so I wrote a script:
```
#!/bin/zsh
versions=('3.7.3' '3.7.2' '3.7.0' '3.6.3' '3.6.2' '3.6.1' '3.6.0' '3.5.2'
'3.5.1' '3.5.0' '3.4.5' '3.4.4' '3.4.3' '3.4.2' '3.4.1' '3.4.0')
for i in $versions; do
gem u