Hi, Das:
Thanks for your answer.
The multiple streaming aggregations I'm talking about are like this:

    df.groupBy("key").agg(min("colA").as("min")).groupBy("min").count()
E.g., the data source is the user login record. There are two fields in my
temp v
Hello,
What do you mean by multiple streaming aggregations? Something like this is
already supported.
    df.groupBy("key").agg(min("colA"), max("colB"), avg("colC"))
But the following is not supported.
    df.groupBy("key").agg(min("colA").as("min")).groupBy("min").count()
In other words, multiple chained aggregations on a stream are not supported.
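For reference, here is a minimal runnable sketch of the supported
single-aggregation case. This is my own illustration, not code from the
thread: it uses the built-in "rate" test source and a made-up key column
instead of a real login stream.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    val spark = SparkSession.builder
      .appName("StreamingAggSketch")
      .master("local[2]")
      .getOrCreate()
    import spark.implicits._

    // Built-in test source that emits (timestamp, value) rows.
    val logins = spark.readStream.format("rate").load()

    // A single groupBy/agg over a stream is supported, even with several
    // aggregate functions at once.
    val perKey = logins
      .withColumn("key", $"value" % 10)   // stand-in for a real key column
      .groupBy("key")
      .agg(min("value").as("min"), max("value").as("max"))

    // Chaining a second aggregation, e.g. perKey.groupBy("min").count(),
    // is the unsupported case and fails analysis on a streaming DataFrame.
    perKey.writeStream.outputMode("complete").format("console").start()
      .awaitTermination()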
Spark 2.3, around January.
0.0 <407216...@qq.com> wrote on Wed., Nov 29, 2017 at 05:08:
> Hi, all:
> Multiple streaming aggregations are not yet supported. When will they be
> supported? Is this in the plan?
>
> Thanks.
>
Hi, all:
Multiple streaming aggregations are not yet supported. When will they be
supported? Is this in the plan?
Thanks.
+1 (non-binding)
RC2 was tested on CentOS, too.
Bests,
Dongjoon.
On Tue, Nov 28, 2017 at 4:35 PM, Hyukjin Kwon wrote:
> +1
>
> 2017-11-29 8:18 GMT+09:00 Henry Robinson :
>
>> (My vote is non-binding, of course).
>>
>> On 28 November 2017 at 14:53, Henry Robinson wrote:
>>
>>> +1, tests all pass for me on Ubuntu 16.04.
I see, thanks for your quick response.
Best regards,
Jerry
2017-11-29 10:45 GMT+08:00 Sean Owen :
> Until the 2.12 build passes tests, no. There is still a real outstanding
> issue with the closure cleaner and serialization of closures as Java 8
> lambdas. I haven't cracked it, and don't think it's simple, but not
> insurmountable.
Until the 2.12 build passes tests, no. There is still a real outstanding
issue with the closure cleaner and serialization of closures as Java 8
lambdas. I haven't cracked it, and don't think it's simple, but not
insurmountable.
The funny thing is most stuff appears to just work without cleaning sa
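To make the issue concrete, here is a tiny self-contained sketch (my
illustration, not a test case from the 2.12 branch) of the kind of closure
the cleaner has to process:

    import org.apache.spark.{SparkConf, SparkContext}

    object ClosureDemo {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("ClosureDemo").setMaster("local[2]"))
        val threshold = 5
        // On Scala 2.11 the lambda below compiles to an anonymous inner
        // class, and Spark's ClosureCleaner can inspect its fields and null
        // out unused outer references before the closure is serialized to
        // executors. On 2.12 it compiles via invokedynamic to a Java 8
        // lambda (serialized as a SerializedLambda), which the existing
        // bytecode-level cleaning does not handle the same way.
        val n = sc.parallelize(1 to 100).filter(_ > threshold).count()
        println(s"count above threshold: $n")
        sc.stop()
      }
    }

Simple closures like this one mostly serialize fine as lambdas anyway; the
open question is handling the cases that still need cleaning.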
Hi Sean,
Two questions about Scala 2.12 for release artifacts.
Are we planning to ship 2.12 artifacts for the Spark 2.3 release? If not, will
we only ship 2.11 artifacts?
Thanks
Jerry
2017-11-28 21:51 GMT+08:00 Sean Owen :
> The Scala 2.12 profile mostly works, but not all tests pass. Use
> -Pscala-2.12 on the command line to build.
+1
2017-11-29 8:18 GMT+09:00 Henry Robinson :
> (My vote is non-binding, of course).
>
> On 28 November 2017 at 14:53, Henry Robinson wrote:
>
>> +1, tests all pass for me on Ubuntu 16.04.
>>
>> On 28 November 2017 at 10:36, Herman van Hövell tot Westerflier <
>> hvanhov...@databricks.com> wrote:
(My vote is non-binding, of course).
On 28 November 2017 at 14:53, Henry Robinson wrote:
> +1, tests all pass for me on Ubuntu 16.04.
>
> On 28 November 2017 at 10:36, Herman van Hövell tot Westerflier <
> hvanhov...@databricks.com> wrote:
>
>> +1
>>
>> On Tue, Nov 28, 2017 at 7:35 PM, Felix Cheung wrote:
+1, tests all pass for me on Ubuntu 16.04.
On 28 November 2017 at 10:36, Herman van Hövell tot Westerflier <
hvanhov...@databricks.com> wrote:
> +1
>
> On Tue, Nov 28, 2017 at 7:35 PM, Felix Cheung
> wrote:
>
>> +1
>>
>> Thanks Sean. Please vote!
>>
>> Tested various scenarios with R package. Ubuntu, Debian, Windows r-devel
>> and release and on r-hub.
more electrical repairs need to be done on the high voltage leads to our
building, and we will be losing power overnight.
this means the PRB builds will not be working as amplab.cs.berkeley.edu
will be down.
timer-based builds will still run normally.
i'll get everything back up and running firs
+1
On Tue, Nov 28, 2017 at 7:35 PM, Felix Cheung
wrote:
> +1
>
> Thanks Sean. Please vote!
>
> Tested various scenarios with R package. Ubuntu, Debian, Windows r-devel
> and release and on r-hub. Verified CRAN checks are clean (only 1 NOTE!) and
> no leaked files (.cache removed, /tmp clean)
>
>
+1
Thanks Sean. Please vote!
Tested various scenarios with R package. Ubuntu, Debian, Windows r-devel
and release and on r-hub. Verified CRAN checks are clean (only 1 NOTE!) and
no leaked files (.cache removed, /tmp clean)
On Sun, Nov 26, 2017 at 11:55 AM Sean Owen wrote:
> Yes it downloads r
The Scala 2.12 profile mostly works, but not all tests pass. Use
-Pscala-2.12 on the command line to build.
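For anyone trying it, a typical full command would be something like the
following; only the -Pscala-2.12 flag is the essential part, the rest is a
standard Maven invocation:

    ./build/mvn -Pscala-2.12 -DskipTests clean package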
On Tue, Nov 28, 2017 at 5:36 AM Ofir Manor wrote:
> Hi,
> As far as I know, Spark does not support Scala 2.12.
> There is ongoing work to refactor / fix the Spark source code to support
> Scala 2.12.
Hi,
As far as I know, Spark does not support Scala 2.12.
There is ongoing work to refactor / fix the Spark source code to support
Scala 2.12 - look for the multiple emails on this list in recent months from
Sean Owen on his progress.
Once Spark supports Scala 2.12, I think the next target would be