Congrats and thanks!
From: Hyukjin Kwon
Sent: Wednesday, March 3, 2021 4:09:23 PM
To: Dongjoon Hyun
Cc: Gabor Somogyi ; Jungtaek Lim
; angers zhu ; Wenchen Fan
; Kent Yao ; Takeshi Yamamuro
; dev ; user @spark
Subject: Re: [ANNOUNCE] Announcing Apache Spark
Welcome!
From: Driesprong, Fokko
Sent: Friday, March 26, 2021 1:25:33 PM
To: Matei Zaharia
Cc: Spark Dev List
Subject: Re: Welcoming six new Apache Spark committers
Well deserved all! Welcome!
On Fri, Mar 26, 2021 at 21:21, Matei Zaharia wrote
mailto:matei.zah
, 2021 at 10:19 PM
Subject: CRAN package SparkR
To: Felix Cheung
CC:
Dear maintainer,
Checking this apparently creates the default directory as per
#' @param localDir a local directory where Spark is installed. The directory
#' contains version-specific folder
Any suggestion or comment on this? They are going to remove the package by
6-28
Seems to me that if we have a switch to opt in to the install (not on by
default), or prompt the user in an interactive session, that should be good
enough as user confirmation.
On Sun, Jun 13, 2021 at 11:25 PM Felix Cheung
wrote
user's confirmation when we
> install.spark?
> IIRC, the auto installation is only triggered by interactive shell so
> getting user's confirmation should be fine.
>
> On Fri, Jun 18, 2021 at 2:54 AM, Felix Cheung wrote:
>
>> Any suggestion or comment on this? They are g
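The opt-in behavior floated above could look roughly like the following (a hypothetical Python helper for illustration only; SparkR's actual install.spark is R code and its API may differ): never auto-install in a non-interactive session, and only proceed interactively after an explicit yes.

```python
import sys

def confirm_install(ask=input, interactive=None):
    # Hypothetical sketch: auto-install requires explicit user confirmation,
    # and the prompt is only shown in an interactive session.
    if interactive is None:
        interactive = sys.stdin.isatty()
    if not interactive:
        return False  # non-interactive: the caller must opt in explicitly
    reply = ask("Download and install Spark locally? [y/N] ")
    return reply.strip().lower() in ("y", "yes")

# Simulated sessions (the prompt is stubbed out, so nothing blocks):
print(confirm_install(ask=lambda _: "y", interactive=True))   # True
print(confirm_install(ask=lambda _: "",  interactive=True))   # False
print(confirm_install(interactive=False))                     # False
```

Defaulting to "no" on an empty reply keeps the behavior safe for scripts that accidentally hit the prompt.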
-- Forwarded message -
From: Gregor Seyer
Date: Wed, Oct 20, 2021 at 4:42 AM
Subject: Re: CRAN submission SparkR 3.2.0
To: Felix Cheung , CRAN <
cran-submissi...@r-project.org>
Thanks,
Please add \value to .Rd files regarding exported methods and explain
the functions r
+1 to doc, seed argument would be great if possible
From: Sean Owen
Sent: Monday, September 26, 2022 5:26:26 PM
To: Nicholas Gustafson
Cc: dev
Subject: Re: Why are hash functions seeded with 42?
Oh yeah I get why we love to pick 42 for random things. I'm guessin
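The point of a fixed seed can be illustrated with a toy seeded hash (a pure-Python sketch, not Spark's actual Murmur3 implementation): any fixed seed, 42 included, makes results reproducible across runs, while changing the seed changes every output.

```python
def seeded_hash(data: bytes, seed: int = 42) -> int:
    # Toy FNV-1a-style hash folded with a seed. With a fixed seed the
    # mapping is fully deterministic; a different seed permutes all outputs.
    h = (0xcbf29ce484222325 ^ seed) & 0xFFFFFFFFFFFFFFFF
    for b in data:
        h = ((h ^ b) * 0x100000001B3) & 0xFFFFFFFFFFFFFFFF
    return h

assert seeded_hash(b"spark") == seeded_hash(b"spark")          # reproducible
assert seeded_hash(b"spark", 42) != seeded_hash(b"spark", 43)  # seed matters
```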
Reposting for shane here
[SPARK-27178]
https://github.com/apache/spark/commit/342e91fdfa4e6ce5cc3a0da085d1fe723184021b
Is problematic too and it’s not in the rc8 cut
https://github.com/apache/spark/commits/branch-2.4
(Personally I don’t want to delay 2.4.1 either..)
___
I’m +1 if 3.0
From: Sean Owen
Sent: Monday, March 25, 2019 6:48 PM
To: Hyukjin Kwon
Cc: dev; Bryan Cutler; Takuya UESHIN; shane knapp
Subject: Re: Upgrading minimal PyArrow version to 0.12.x [SPARK-27276]
I don't know a lot about Arrow here, but seems reasonable
SPARK-27276. Shane is also
correct in that newer versions of pyarrow have stopped support for Python 3.4,
so we should probably have Jenkins test against 2.7 and 3.5.
On Mon, Mar 25, 2019 at 9:44 PM Reynold Xin
mailto:r...@databricks.com>> wrote:
+1 on doing this in 3.0.
On Mon, Mar 25, 20
3.4 is end of life but 3.5 is not. From your link:
"we expect to release Python 3.5.8 around September 2019."
From: shane knapp
Sent: Thursday, March 28, 2019 7:54 PM
To: Hyukjin Kwon
Cc: Bryan Cutler; dev; Felix Cheung
Subject: Re: Upgrading minimal PyArrow
(I think the .invalid is added by the list server)
Personally I’d rather everyone just +1 or -1, and not mark votes as binding or non-binding.
It’s really the responsibility of the RM to confirm if a vote is binding.
Mistakes have been made otherwise.
From: Marcelo Vanzin
+1
build source
R tests
R package CRAN check locally, r-hub
From: d_t...@apple.com on behalf of DB Tsai
Sent: Wednesday, March 27, 2019 11:31 AM
To: dev
Subject: [VOTE] Release Apache Spark 2.4.1 (RC9)
Please vote on releasing the following candidate as Apache
Definitely the part on the PR. Thanks!
From: shane knapp
Sent: Thursday, March 28, 2019 11:19 AM
To: dev; Stavros Kontopoulos
Subject: [k8s][jenkins] spark dev tool docs now have k8s+minikube instructions!
https://spark.apache.org/developer-tools.html
search fo
To: Bryan Cutler
Cc: Felix Cheung; Hyukjin Kwon; dev
Subject: Re: Upgrading minimal PyArrow version to 0.12.x [SPARK-27276]
i'm not opposed to 3.6 at all.
On Fri, Mar 29, 2019 at 4:16 PM Bryan Cutler
mailto:cutl...@gmail.com>> wrote:
PyArrow dropping Python 3.4 was mainly due to support
Hi Spark community!
As you know ApacheCon NA 2019 is coming this Sept and it’s CFP is now open!
This is an important milestone as we celebrate 20 years of ASF. We have tracks
like Big Data and Machine Learning among many others. Please submit your
talks/thoughts/challenges/learnings here:
https
I kinda agree it is confusing when a parameter is not used...
From: Ryan Blue
Sent: Thursday, April 11, 2019 11:07:25 AM
To: Bruce Robbins
Cc: Dávid Szakállas; Spark Dev List
Subject: Re: Dataset schema incompatibility bug when reading column partitioned
data
I
And a plug for the Graph Processing track -
A discussion of comparison talk between the various Spark options (GraphX,
GraphFrames, CAPS), or the ongoing work with SPARK-25994 Property Graphs,
Cypher Queries, and Algorithms
Would be great!
From: Felix Cheung
Re shading - same argument I’ve made earlier today in a PR...
(Context- in many cases Spark has light or indirect dependencies but bringing
them into the process breaks users code easily)
From: Michael Heuer
Sent: Thursday, April 18, 2019 6:41 AM
To: Reynold Xi
+1
R tests, package tests on r-hub. Manually check commits under R, doc etc
From: Sean Owen
Sent: Saturday, April 20, 2019 11:27 AM
To: Wenchen Fan
Cc: Spark dev list
Subject: Re: [VOTE] Release Apache Spark 2.4.2
+1 from me too.
It seems like there is support
Just my 2c
If there is a known security issue, we should fix it rather than waiting to
find out whether it actually affects Spark from a black hat, or worse.
I don’t think any of us want to see Spark in the news for this reason.
From: Sean Owen
Sent:
I ran basic tests on R, r-hub etc. LGTM.
+1 (limited - I didn’t get to run other usual tests)
From: Sean Owen
Sent: Wednesday, May 1, 2019 2:21 PM
To: Xiao Li
Cc: dev@spark.apache.org
Subject: Re: [VOTE] Release Apache Spark 2.4.3
+1 from me. There is little cha
You could
df.filter(col("c") === "c1").write.partitionBy("c").save
It could hit some data skew problems but might work for you
From: Burak Yavuz
Sent: Tuesday, May 7, 2019 9:35:10 AM
To: Shubham Chaurasia
Cc: dev; u...@spark.apache.org
Subject: Re: Static parti
+1
I’d prefer to see more of the end goal and how that could be achieved (such as
ETL or SPARK-24579). However, given the rounds and months of discussion, we
have come down to just the public API.
If the community thinks a new set of public APIs is maintainable, I don’t see
any problem with that
We don’t usually reference a future release on the website
> Spark website and state that Python 2 is deprecated in Spark 3.0
I suspect people will then ask when Spark 3.0 is coming out. We might need to
provide some clarity on that.
From: Reynold Xin
Sent: Thur
.
From: shane knapp
Sent: Friday, May 31, 2019 7:38:10 PM
To: Denny Lee
Cc: Holden Karau; Bryan Cutler; Erik Erlandson; Felix Cheung; Mark Hamstra;
Matei Zaharia; Reynold Xin; Sean Owen; Wenchen Fen; Xiangrui Meng; dev; user
Subject: Re: Should python-2 be supported in Spark 3.0?
+1000 ;)
On
So to be clear, min version check is 0.23
Jenkins test is 0.24
I’m ok with this. I hope someone will test 0.23 on releases though before we
sign off?
From: shane knapp
Sent: Friday, June 14, 2019 10:23:56 AM
To: Bryan Cutler
Cc: Dongjoon Hyun; Holden Karau; Hyuk
How about PyArrow?
From: Holden Karau
Sent: Friday, June 14, 2019 11:06:15 AM
To: Felix Cheung
Cc: Bryan Cutler; Dongjoon Hyun; Hyukjin Kwon; dev; shane knapp
Subject: Re: [DISCUSS] Increasing minimum supported version of Pandas
Are there other Python
+1
Glad to see the progress in this space - it’s been more than a year since the
original discussion and effort started.
From: Yinan Li
Sent: Monday, June 17, 2019 7:14:42 PM
To: rb...@netflix.com
Cc: Dongjoon Hyun; Saisai Shao; Imran Rashid; Ilan Filonenko; bo
That’s great!
From: ☼ R Nair
Sent: Saturday, August 24, 2019 10:57:31 AM
To: Dongjoon Hyun
Cc: dev@spark.apache.org ; user @spark
Subject: Re: JDK11 Support in Apache Spark
Finally!!! Congrats
On Sat, Aug 24, 2019, 11:11 A
+1
Run tests, R tests, r-hub Debian, Ubuntu, mac, Windows
From: Hyukjin Kwon
Sent: Wednesday, August 28, 2019 9:14 PM
To: Takeshi Yamamuro
Cc: dev; Dongjoon Hyun
Subject: Re: [VOTE] Release Apache Spark 2.4.4 (RC3)
+1 (from the last blocker PR)
On Thu, Aug 29, 2019
I did review it and solving this problem makes sense. I will comment in the
JIRA.
From: Jungtaek Lim
Sent: Sunday, August 25, 2019 3:34:22 PM
To: dev
Subject: Design review of SPARK-28594
Hi devs,
I have been working on designing SPARK-28594 [1] (though I've s
(Hmm, what is spark-...@apache.org?)
From: Sean Owen
Sent: Tuesday, September 3, 2019 11:58:30 AM
To: Xiao Li
Cc: Tom Graves ; spark-...@apache.org
Subject: Re: maven 3.6.1 removed from apache maven repo
It's because build/mvn only queries ASF mirrors, and the
I’d prefer strict mode and fail fast (analysis check)
Also I like what Alastair suggested about standard clarification.
I think we can re-visit this proposal and restart the vote
From: Ryan Blue
Sent: Friday, September 6, 2019 5:28 PM
To: Alastair Green
Cc: Reyn
+1
From: Thomas graves
Sent: Wednesday, September 4, 2019 7:24:26 AM
To: dev
Subject: [VOTE] [SPARK-27495] SPIP: Support Stage level resource configuration
and scheduling
Hey everyone,
I'd like to call for a vote on SPARK-27495 SPIP: Support Stage level
resour
this is about the test description and not the test file name, right?
if yes, I don’t see a problem.
From: Hyukjin Kwon
Sent: Thursday, November 14, 2019 6:03:02 PM
To: Shixiong(Ryan) Zhu
Cc: dev ; Felix Cheung ;
Shivaram Venkataraman
Subject: Re: Adding JIRA ID as the
1000% with Steve, the org.spark-project hive 1.2 fork will need a solution. It
is old and rather buggy, and it’s been *years*
I think we should decouple hive change from everything else if people are
concerned?
From: Steve Loughran
Sent: Sunday, November 17, 2019 9:
Just to add - hive 1.2 fork is definitely not more stable. We know of a few
critical bug fixes that we cherry picked into a fork of that fork to maintain
ourselves.
From: Dongjoon Hyun
Sent: Wednesday, November 20, 2019 11:07:47 AM
To: Sean Owen
Cc: dev
Subje
; Christopher Crosbie ; Griselda
Cuevas ; Holden Karau ; Mayank Ahuja
; Kalyan Sivakumar ; alfo...@fb.com
; Felix Cheung ; Matt Cheah
; Yifei Huang (PD)
Subject: Re: Enabling fully disaggregated shuffle on Spark
That sounds great!
On Wed, Nov 20, 2019 at 9:02 AM John Zhuge
mailto:jzh
I think it’s a good idea
From: Hyukjin Kwon
Sent: Wednesday, January 15, 2020 5:49:12 AM
To: dev
Cc: Sean Owen ; Nicholas Chammas
Subject: Re: More publicly documenting the options under spark.sql.*
Resending to the dev list for archive purpose:
I think automa
Congrats
From: Jungtaek Lim
Sent: Thursday, June 18, 2020 8:18:54 PM
To: Hyukjin Kwon
Cc: Mridul Muralidharan ; Reynold Xin ;
dev ; user
Subject: Re: [ANNOUNCE] Apache Spark 3.0.0
Great, thanks all for your efforts on the huge step forward!
On Fri, Jun 19, 20
-- Forwarded message -
We are pleased to announce that ApacheCon @Home will be held online,
September 29 through October 1.
More event details are available at https://apachecon.com/acah2020 but
there’s a few things that I want to highlight for you, the members.
Yes, the CFP has
I think pluggable storage in shuffle is essential for k8s GA
From: Holden Karau
Sent: Monday, June 29, 2020 9:33 AM
To: Maxim Gekk
Cc: Dongjoon Hyun; dev
Subject: Re: Apache Spark 3.1 Feature Expectation (Dec. 2020)
Should we also consider the shuffle service ref
Welcome!
From: Nick Pentreath
Sent: Tuesday, July 14, 2020 10:21:17 PM
To: dev
Cc: Dilip Biswal ; Jungtaek Lim
; huaxin gao
Subject: Re: Welcoming some new Apache Spark committers
Congratulations and welcome as Apache Spark committers!
On Wed, 15 Jul 2020 at
+1
From: Holden Karau
Sent: Wednesday, July 22, 2020 10:49:49 AM
To: Steve Loughran
Cc: dev
Subject: Re: Exposing Spark parallelized directory listing & non-locality
listing in core
Wonderful. To be clear the patch is more to start the discussion about how we
What would be the reason for separate git repo?
From: Hyukjin Kwon
Sent: Monday, August 3, 2020 1:58:55 AM
To: Maciej Szymkiewicz
Cc: Driesprong, Fokko ; Holden Karau
; Spark Dev List
Subject: Re: [PySpark] Revisiting PySpark type annotations
Okay, seems like
So IMO maintaining outside in a separate repo is going to be harder. That was
why I asked.
From: Maciej Szymkiewicz
Sent: Tuesday, August 4, 2020 12:59 PM
To: Sean Owen
Cc: Felix Cheung; Hyukjin Kwon; Driesprong, Fokko; Holden Karau; Spark Dev List
Subject: Re
Ok - it took many years to get it first published, so it was hard to get
there.
On Tue, Dec 22, 2020 at 5:45 PM Hyukjin Kwon wrote:
> Adding @Shivaram Venkataraman and @Felix
> Cheung FYI
>
> On Wed, Dec 23, 2020 at 9:22 AM, Michael Heuer wrote:
>
>> Anecdotally, as a projec
-31918 and
> https://issues.apache.org/jira/browse/SPARK-32073.
> I wonder why other releases were not uploaded yet. Do you guys know any
> context or if there is a standing issue on this, @Felix Cheung
> or @Shivaram Venkataraman
> ?
>
> On Wed, Dec 23, 2020 at 11:21 AM, Mridul Mu
consider
> dropping it as Dongjoon initially pointed out.
>
> On Wed, Dec 30, 2020 at 1:59 PM, Felix Cheung wrote:
>
>> Ah, I don’t recall actually - maybe it was just missed?
>>
>> The last message I had, was in June when it was broken by R 4.0.1, which
>> was fixed.
>
-1
(Sorry) spark-2.1.2-bin-hadoop2.7.tgz is missing the R directory, not sure why
yet.
Tested on multiple platforms as a source package (against the 2.1.1 jar); seemed fine
except this WARNING on R-devel
* checking for code/documentation mismatches ... WARNING
Codoc mismatches from documentation obje
To be sure, this is only for JIRA and not for github PR, right?
If so, +1, but I think the access control on JIRA does not necessarily match
the committer list, and is manually maintained, last I heard.
From: Sean Owen
Sent: Wednesday, October 4, 2017 7:51:37 PM
Hmm, sounds like some sort of corruption of the maven directory on the Jenkins
box...
From: Liwei Lin
Sent: Wednesday, October 4, 2017 6:52:54 PM
To: Spark dev list
Subject: Nightly builds for master branch failed
https://amplab.cs.berkeley.edu/jenkins/job/spar
+1
Tested SparkR package manually on multiple platforms and checked different
Hadoop release jar.
And previously tested the last RC on different R releases (see the last RC vote
thread)
I found some differences in bin release jars created by the different options
when running the make-release
Thanks Shane!
From: shane knapp
Sent: Thursday, October 5, 2017 9:14:54 AM
To: Felix Cheung
Cc: Liwei Lin; Spark dev list
Subject: Re: Nightly builds for master branch failed
yep, it was a corrupted jar on amp-jenkins-worker-01. i grabbed a new one from
Thanks Nick, Hyukjin. Yes, this seems to be a longer-standing issue on RHEL
with respect to forking.
From: Nick Pentreath
Sent: Friday, October 6, 2017 6:16:53 AM
To: Hyukjin Kwon
Cc: dev
Subject: Re: [VOTE] Spark 2.1.2 (RC4)
Ah yes - I recall that it was fixed.
Yes - unfortunately something was found after it was published and made
available publicly.
We have a JIRA on this and are working on the best course of action.
_
From: Holden Karau mailto:hol...@pigscanfly.ca>>
Sent: Wednesday, October 25, 2017 1:35 AM
Subject: CRAN
Karau
Cc: Felix Cheung; dev@spark.apache.org
Subject: Re: Kicking off the process around Spark 2.2.1
It would be reasonably consistent with the timing of other x.y.1 releases, and
more release managers sounds useful, yeah.
Note also that in theory the code freeze for 2.3.0 starts in about 2 weeks
For the 2.2.1, we are still working through a few bugs. Hopefully it won't be
long.
From: Kevin Grealish
Sent: Thursday, November 2, 2017 9:51:56 AM
To: Felix Cheung; Sean Owen; Holden Karau
Cc: dev@spark.apache.org
Subject: RE: Kicking off the process a
, 2017 10:38:48 AM
To: Felix Cheung; Kevin Grealish; Sean Owen
Cc: dev@spark.apache.org
Subject: Re: Kicking off the process around Spark 2.2.1
If it’s desired I’d be happy to start on 2.3 once 2.2.1 is finished.
On Thu, Nov 2, 2017 at 10:24 AM Felix Cheung
mailto:felixcheun...@hotmail.com>>
We actually have some immediate needs for custom config for some upcoming
integration tests.
I don't know if such changes are possible in ASF Jenkins but the work is in
progress in RISELab Jenkins :)
From: holden.ka...@gmail.com on behalf of Holden Karau
Sen
13 PM
To: Reynold Xin
Cc: Felix Cheung; Sean Owen; dev@spark.apache.org
Subject: Re: Kicking off the process around Spark 2.2.1
I agree, except in this case we probably want some of the fixes that are going
into the maintenance release to be present in the new feature release (like the
CRAN issue)
Hi!
As we are closing in on the few known issues I think we are ready to tag and
cut the 2.2.1 release.
If you are aware of any issue that you think should go into this release please
feel free to ping me and mark the JIRA as targeting 2.2.1. I will be scrubbing
JIRA in the next few days.
S
Thanks Dongjoon! I will track that.
From: Dongjoon Hyun
Sent: Wednesday, November 8, 2017 7:41:20 PM
To: Holden Karau
Cc: Felix Cheung; dev@spark.apache.org
Subject: Re: Cutting the RC for Spark 2.2.1 release
It's great, Felix!
As of today, `branch-2.2`
).
There should not be any issue targeting 2.2.1 except for SPARK-22042. As it is
not a regression and it seems it might take a while, we won’t be blocking the
release.
_
From: Felix Cheung mailto:felixcheun...@hotmail.com>>
Sent: Wednesday, November 8, 2017 3
g/browse/MVNCENTRAL-1369
Stay tuned.
________
From: Felix Cheung
Sent: Monday, November 13, 2017 12:00:41 AM
To: dev@spark.apache.org
Subject: Re: Cutting the RC for Spark 2.2.1 release
Quick update:
We merged 6 fixes Friday and 7 fixes today (thanks!), since some are
han
Anything that builds with Maven on a clean machine.
It couldn’t connect to the Maven Central repo.
From: Holden Karau
Sent: Monday, November 13, 2017 10:38:03 AM
To: Felix Cheung
Cc: dev@spark.apache.org
Subject: Re: Cutting the RC for Spark 2.2.1 release
Which script
r 13, 2017 10:48 AM
Subject: Re: Cutting the RC for Spark 2.2.1 release
To: Felix Cheung mailto:felixcheun...@hotmail.com>>
Cc: Holden Karau mailto:hol...@pigscanfly.ca>>,
mailto:dev@spark.apache.org>>
I'm not seeing a problem building, myself. However we could change the
Ouch ;) yes that works and RC1 is tagged.
From: Sean Owen
Sent: Monday, November 13, 2017 10:54:48 AM
To: Felix Cheung
Cc: Holden Karau; dev@spark.apache.org
Subject: Re: Cutting the RC for Spark 2.2.1 release
It's repo.maven.apache.org
nterface at repository.apache.org either.
____
From: Felix Cheung
Sent: Monday, November 13, 2017 11:23:44 AM
To: Sean Owen
Cc: Holden Karau; dev@spark.apache.org
Subject: Re: Cutting the RC for Spark 2.2.1 release
Ouch ;) yes that works and RC1 is tagged.
__
Please vote on releasing the following candidate as Apache Spark version
2.2.1. The vote is open until Monday November 20, 2017 at 23:00 UTC and
passes if a majority of at least 3 PMC +1 votes are cast.
[ ] +1 Release this package as Apache Spark 2.2.1
[ ] -1 Do not release this package because
o include this regression of 2.2? It works in 2.1
>>
>> Thanks,
>>
>> Xiao
>>
>>
>>
>> 2017-11-14 22:25 GMT-08:00 Felix Cheung :
>>
>>> Please vote on releasing the following candidate as Apache Spark version
>>> 2.2.1. The vote
ssues.apache.org/jira/browse/SPARK-16845?focusedCommentId=16018840&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-16018840
>
>
> On Wed, Nov 15, 2017 at 12:25 AM Felix Cheung
> wrote:
>
> Please vote on releasing the following candidate as Apache Spar
This vote is cancelled due to a lack of votes.
I’m going to test or track down a few issues (please see the link below for
those targeting this release) and roll RC2 in a few days if we can make
progress.
On Tue, Nov 14, 2017 at 10:25 PM Felix Cheung
wrote:
> Please vote on releasing the follow
Please vote on releasing the following candidate as Apache Spark version
2.2.1. The vote is open until Friday December 1, 2017 at 8:00:00 am UTC and
passes if a majority of at least 3 PMC +1 votes are cast.
[ ] +1 Release this package as Apache Spark 2.2.1
[ ] -1 Do not release this package beca
>
> tar: Error is not recoverable: exiting now
>
> *** RUN ABORTED ***
>
> java.io.IOException: Cannot run program "./bin/spark-submit" (in
> directory "/tmp/test-spark/spark-2.0.2"): error=2, No such file or directory
>
> On Sat, Nov 25, 2017 at 12:
, Nov 25, 2017 at 10:36 AM Felix Cheung
wrote:
> Thanks Sean.
>
> For the second one, it looks like the HiveExternalCatalogVersionsSuite is
> trying to download the release tgz from the official Apache mirror, which
> won’t work unless the release is actually, released?
>
>
is release.
>
> (I committed the change to set -Xss4m for tests consistently, but this
> shouldn't block a release.)
>
>
> On Sat, Nov 25, 2017 at 12:47 PM Felix Cheung
> wrote:
>
>> Ah sorry digging through the history it looks like this is changed
>> rel
This vote passes. Thanks everyone for testing this release.
+1:
Sean Owen (binding)
Herman van Hövell tot Westerflier (binding)
Wenchen Fan (binding)
Shivaram Venkataraman (binding)
Felix Cheung
Henry Robinson
Hyukjin Kwon
Dongjoon Hyun
Kazuaki Ishizaki
Holden Karau
Weichen Xu
0
ting a hand in finishing the release process,
> including copying artifacts in svn. Was there anything else you're waiting
> on someone to do?
>
>
> On Fri, Dec 1, 2017 at 2:10 AM Felix Cheung
> wrote:
>
>> This vote passes. Thanks everyone for testing this rel
should announce the release officially too then.
>>
>> On Wed, Dec 6, 2017 at 5:00 PM Felix Cheung
>> wrote:
>>
>>> I saw the svn move on Monday so I’m working on the website updates.
>>>
>>> I will look into maven today. I will ask if I couldn’t do i
eed to give you all necessary access if you're the
> release manager!
>
>
> On Thu, Dec 14, 2017 at 6:32 AM Felix Cheung
> wrote:
>
>> And I don’t have access to publish python.
>>
>> On Wed, Dec 13, 2017 at 9:55 AM Shivaram Venkataraman <
>> shiva.
+1
I think the earlier we cut a branch the better.
From: Michael Armbrust
Sent: Tuesday, December 19, 2017 4:41:44 PM
To: Holden Karau
Cc: Sameer Agarwal; Erik Erlandson; dev
Subject: Re: Timeline for Spark 2.3
Do people really need to be around for the branch cu
ct: Re: [VOTE] Spark 2.2.1 (RC2)
Hi Felix Cheung:
When will the new version 2.2.1 of the Spark docs be published to the website?
It's still showing version 2.2.0.
--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
-
+1
Thanks for taking on this.
That was my feedback on one of the long comment threads as well. I think we
should have one docker image instead of 3 (also pending in the fork are the
Python and R variants; we should consider having one that we officially
release instead of 9, for example)
How would (2) be uncommon elsewhere?
On Mon, Jan 8, 2018 at 10:16 PM Anirudh Ramanathan
wrote:
> This is with regard to the Kubernetes Scheduler Backend and scaling the
> process to accept contributions. Given we're moving past upstreaming
> changes from our fork, and into getting *new* patches,
+1 hangout
From: Xiao Li
Sent: Wednesday, January 31, 2018 10:46:26 PM
To: Ryan Blue
Cc: Reynold Xin; dev; Wenchen Fen; Russell Spitzer
Subject: Re: data source v2 online meetup
Hi, Ryan,
wow, your Iceberg already used data source V2 API! That is pretty cool! I
Quick questions:
is the search link for SQL functions quite right?
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/search.html?q=app
this file shouldn't be included?
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml
Any idea with sql func docs search result returning broken links as below?
From: Felix Cheung
Sent: Sunday, February 18, 2018 10:05:22 AM
To: Sameer Agarwal; Sameer Agarwal
Cc: dev
Subject: Re: [VOTE] Spark 2.3.0 (RC4)
Quick questions:
is there search link for
These are two separate things:
Do the search result links work for you?
The second is that the dist location we are voting on has a .iml file.
_
From: Sean Owen
Sent: Tuesday, February 20, 2018 2:19 AM
Subject: Re: [VOTE] Spark 2.3.0 (RC4)
To: Felix Cheung
Cc: dev
be in the release)
Thanks!
_
From: Shivaram Venkataraman
Sent: Tuesday, February 20, 2018 2:24 AM
Subject: Re: [VOTE] Spark 2.3.0 (RC4)
To: Felix Cheung
Cc: Sean Owen , dev
FWIW The search result link works for me
Shivaram
On Mon, Feb 19, 2018 at 6:21 PM, Felix
This is a recent change. The HTML file column_math_functions.html should have
the right help content.
What is the problem you are experiencing?
From: Mihály Tóth
Sent: Sunday, February 25, 2018 10:42:50 PM
To: dev@spark.apache.org
Subject: Help needed in R documen
o the same column_math_functions.html ?
Thanks,
Misi
On Sun, Feb 25, 2018, 22:53 Felix Cheung
mailto:felixcheun...@hotmail.com>> wrote:
This is recent change. The html file column_math_functions.html should have the
right help content.
What is the problem you are experiencing?
_
+1
Tested R:
install from package, CRAN tests, manual tests, help check, vignettes check
Filed this https://issues.apache.org/jira/browse/SPARK-23461
This is not a regression so not a blocker of the release.
Tested this on win-builder and r-hub. On r-hub on multiple platforms everything
passed
This sounds like a bug in the documentation of SparkR, doesn't it? Shall I
file a Jira about it?
Locally I ran SPARK_HOME/R/create-docs.sh and it returned successfully.
Unfortunately with the result mentioned above.
Best Regards,
Misi
--------
From: Felix Cheung mailto:
: Tuesday, February 27, 2018 9:13:18 AM
To: Felix Cheung
Cc: Mihály Tóth; dev@spark.apache.org
Subject: Re: Help needed in R documentation generation
Hi,
Earlier, at https://spark.apache.org/docs/latest/api/R/index.html I see
1. sin as a title
2. description describes what sin does
3. usage
, 2018 10:26:23 AM
To: Felix Cheung
Cc: Mihály Tóth; Mihály Tóth; dev@spark.apache.org
Subject: Re: Help needed in R documentation generation
I followed Misi's instructions:
- click on
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs/_site/api/R/index.html
- click on "s" at
Also part of the problem is that the latest news panel is static on each page,
so any new link added changes hundreds of files?
From: holden.ka...@gmail.com on behalf of Holden Karau
Sent: Thursday, March 1, 2018 6:36:43 PM
To: dev
Subject: Using bundler for Je
Congrats and welcome!
From: Dongjoon Hyun
Sent: Friday, March 2, 2018 4:27:10 PM
To: Spark dev list
Subject: Re: Welcoming some new committers
Congrats to all!
Bests,
Dongjoon.
On Fri, Mar 2, 2018 at 4:13 PM, Wenchen Fan
mailto:cloud0...@gmail.com>> wrote:
Con
Instead of using gpg to create the sha512 hash file we could just change to
using sha512sum? That would output the right format, which is in turn verifiable.
From: Ryan Blue
Sent: Friday, March 16, 2018 8:31:45 AM
To: Nicholas Chammas
Cc: Spark dev list
Subject:
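The format difference can be sketched in Python (hypothetical helper; file name illustrative): hashlib produces the same lowercase hex digest that sha512sum prints, so writing a "digest, two spaces, filename" line yields a file that `sha512sum -c` accepts, whereas `gpg --print-md SHA512` emits an uppercase, whitespace-grouped layout that it rejects.

```python
import hashlib

def sha512sum_line(path: str) -> str:
    # Build the "<digest>  <filename>" line that `sha512sum -c` can verify.
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return f"{h.hexdigest()}  {path}"

# Example with a scratch file (name is illustrative, not a real artifact):
with open("artifact.tgz", "wb") as f:
    f.write(b"release bits")
print(sha512sum_line("artifact.tgz"))
```

Saving that line to `artifact.tgz.sha512` would let anyone verify with `sha512sum -c artifact.tgz.sha512`.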
1 - 100 of 252 matches