+1
+1
From: Ryan Blue
Sent: Tuesday, February 19, 2019 9:34 AM
To: Jamison Bennett
Cc: dev
Subject: Re: [VOTE] SPIP: Identifiers for multi-catalog Spark
+1
On Tue, Feb 19, 2019 at 8:41 AM Jamison Bennett
wrote:
+1 (non-binding)
Jamison Bennett
Cloudera Software Engineer
We are waiting for an update from CRAN. Please hold on.
From: Takeshi Yamamuro
Sent: Tuesday, February 19, 2019 2:53 PM
To: dev
Subject: Re: Missing SparkR in CRAN
Hi, guys
It seems SparkR still can't be found on CRAN. Was there any problem
when resubmitting it?
On Fri, Jan
Thanks Shane!! <3
On Wed, Feb 20, 2019 at 10:13 AM Wenchen Fan wrote:
> Thanks Shane!
>
> On Wed, Feb 20, 2019 at 6:48 AM shane knapp wrote:
>
>> alright, i increased the httpd and proxy timeouts and kicked apache.
>> i'll keep an eye on things, but as of right now we're happily building.
>>
>> On Tue
Thanks Shane!
On Wed, Feb 20, 2019 at 6:48 AM shane knapp wrote:
> alright, i increased the httpd and proxy timeouts and kicked apache. i'll
> keep an eye on things, but as of right now we're happily building.
>
> On Tue, Feb 19, 2019 at 2:25 PM shane knapp wrote:
>
>> aand i had to issue
Hi, guys
It seems SparkR still can't be found on CRAN. Was there any problem
when resubmitting it?
On Fri, Jan 25, 2019 at 1:41 AM Felix Cheung wrote:
> Yes it was discussed on dev@. We are waiting for 2.3.3 to release to
> resubmit.
>
>
> On Thu, Jan 24, 2019 at 5:33 AM Hyukjin Kwon wrote:
>
>> Hi all,
alright, i increased the httpd and proxy timeouts and kicked apache. i'll
keep an eye on things, but as of right now we're happily building.
On Tue, Feb 19, 2019 at 2:25 PM shane knapp wrote:
> aand i had to issue another restart. it's the ever annoying, and
> never quite clear as to why it's happening proxy/502 error.
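For context, the timeout bump shane describes is usually a couple of directives in httpd's reverse-proxy config in front of Jenkins. The sketch below is illustrative only: the values, backend address, and layout are assumptions, not the actual amplab Jenkins configuration.

```
# Illustrative httpd reverse-proxy timeouts for a Jenkins backend.
# Values and backend address are assumptions, not the real config.
Timeout 300
ProxyTimeout 300

<Location "/">
    # retry=0 avoids httpd marking the backend "down" after one failure,
    # which is a common source of lingering 502s after a restart.
    ProxyPass        http://127.0.0.1:8080/ retry=0 timeout=300
    ProxyPassReverse http://127.0.0.1:8080/
</Location>
```

If the backend is slow to answer (e.g. a wedged Jenkins master), requests exceeding these timeouts surface to clients as the 502 proxy errors mentioned in this thread.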
aand i had to issue another restart. it's the ever annoying, and never
quite clear as to why it's happening proxy/502 error.
currently investigating.
On Tue, Feb 19, 2019 at 9:21 AM shane knapp wrote:
> forgot to hit send before i went in to the office: we're back up and
> building!
>
> O
Hi,
We have been using Pyspark's groupby().apply() quite a bit and it has been
very helpful in integrating Spark with our existing pandas-heavy libraries.
Recently, we have found more and more cases where groupby().apply() is not
sufficient. In some cases, we want to group two dataframes by the
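A minimal sketch of the pattern this message refers to: groupby().apply() hands each group to a plain pandas function that returns a pandas DataFrame. The data, column names, and subtract_mean function below are illustrative; the pure-pandas stand-in shows the semantics, and the commented-out call shows the equivalent Spark 2.3+ grouped-map pandas UDF usage.

```python
import pandas as pd

# Per-group transform of the kind passed to PySpark's
# df.groupby("id").apply(...): it receives one group as a
# pandas DataFrame and returns a pandas DataFrame.
def subtract_mean(pdf: pd.DataFrame) -> pd.DataFrame:
    # Center the value column within the group.
    return pdf.assign(v=pdf["v"] - pdf["v"].mean())

# Pure-pandas stand-in to show the semantics locally:
df = pd.DataFrame({"id": [1, 1, 2, 2], "v": [1.0, 3.0, 5.0, 9.0]})
out = df.groupby("id", group_keys=False).apply(subtract_mean)

# The distributed PySpark equivalent (Spark 2.3+ grouped-map API):
#   from pyspark.sql.functions import pandas_udf, PandasUDFType
#   @pandas_udf("id long, v double", PandasUDFType.GROUPED_MAP)
#   def subtract_mean(pdf): ...
#   sdf.groupby("id").apply(subtract_mean)
```

The limitation raised in the thread is that this API only ever sees groups of a single DataFrame; there is no built-in way to hand the function matching groups from two DataFrames at once.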
+1
On Tue, Feb 19, 2019 at 8:41 AM Jamison Bennett
wrote:
> +1 (non-binding)
>
> Jamison Bennett
>
> Cloudera Software Engineer
>
> jamison.benn...@cloudera.com
>
> 515 Congress Ave, Suite 1212 | Austin, TX | 78701
>
>
> On Tue, Feb 19, 2019 at 10:33 AM Maryann Xue
> wrote:
>
>> +1
>>
>
forgot to hit send before i went in to the office: we're back up and
building!
On Tue, Feb 19, 2019 at 8:06 AM shane knapp wrote:
> yep, it got wedged. issued a restart and it should be back up in a few
> minutes.
>
> On Tue, Feb 19, 2019 at 7:32 AM Parth Gandhi
> wrote:
>
>> Yes, it seems to
+1 (non-binding)
Jamison Bennett
Cloudera Software Engineer
jamison.benn...@cloudera.com
515 Congress Ave, Suite 1212 | Austin, TX | 78701
On Tue, Feb 19, 2019 at 10:33 AM Maryann Xue
wrote:
> +1
>
> On Mon, Feb 18, 2019 at 10:46 PM John Zhuge wrote:
>
>> +1
>>
>> On Mon, Feb 18, 2
yep, it got wedged. issued a restart and it should be back up in a few
minutes.
On Tue, Feb 19, 2019 at 7:32 AM Parth Gandhi
wrote:
> Yes, it seems to be down. The unit tests are not getting kicked off.
>
> Regards,
> Parth Kamlesh Gandhi
>
>
> On Tue, Feb 19, 2019 at 8:29 AM Hyukjin Kwon wrote:
Yes, it seems to be down. The unit tests are not getting kicked off.
Regards,
Parth Kamlesh Gandhi
On Tue, Feb 19, 2019 at 8:29 AM Hyukjin Kwon wrote:
> Hi all,
>
> Looks like Jenkins stopped working. Did I miss a thread, or has nobody
> reported this yet?
>
> Thanks!
>
>
>
Hi all,
Looks like Jenkins stopped working. Did I miss a thread, or has nobody
reported this yet?
Thanks!