RE: Contributing to Spark

2018-03-12 Thread Roman Maier
Hi Marco, I get it, thank you. Sincerely, Roman Maier

Re: Contributing to Spark

2018-03-12 Thread Marco Gaido
Hi Roman, welcome to the community. Actually, this is not how it works. If you want to contribute to Spark you can just look for open JIRAs and submit a PR for that. JIRAs are assigned by committers once the PR gets merged. If you want, you can eventually comment on the JIRA that you are working on.

Re: Contributing to Spark needs PySpark build/test instructions

2014-07-21 Thread Nicholas Chammas
That works! Thank you.

Re: Contributing to Spark needs PySpark build/test instructions

2014-07-21 Thread Reynold Xin
I missed that bullet point. I removed that and just pointed it towards the instructions.

Re: Contributing to Spark needs PySpark build/test instructions

2014-07-21 Thread Nicholas Chammas
Looks good. Does sbt/sbt test cover the same tests as /dev/run-tests? I’m looking at step 5 under “Contributing Code”. Someone contributing to PySpark will want to be directed to run something in addition to (or instead of) sbt/sbt test, I believe. Nick

Re: Contributing to Spark needs PySpark build/test instructions

2014-07-21 Thread Reynold Xin
I added an automated testing section: https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-AutomatedTesting Can you take a look to see if it is what you had in mind?

Re: Contributing to Spark needs PySpark build/test instructions

2014-07-21 Thread Nicholas Chammas
For the record, the triggering discussion is here. I assumed that sbt/sbt test covers all the tests required before submitting a patch, and it appears that it doesn’t.

Re: Contributing to Spark

2014-04-09 Thread Reynold Xin
Usually you can just run Spark in local mode on a single machine for most dev/testing. If you want to simulate a cluster locally using multiple Spark worker processes, you can use the undocumented local-cluster mode, e.g. local-cluster[2,1,512] launches two worker processes, each with one core and 512 MB of memory.
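Reading Reynold's example, the bracketed fields appear to be number of workers, cores per worker, and memory per worker in MB. A minimal sketch of a parser for that master-URL string, for illustration only (the function and field names here are my own, not Spark identifiers):

```python
import re

def parse_local_cluster(master: str) -> dict:
    """Parse a master URL of the form local-cluster[workers,cores,memoryMB].

    Field meanings follow the local-cluster[2,1,512] example above:
    two workers, one core each, 512 MB each.
    """
    m = re.fullmatch(r"local-cluster\[(\d+),(\d+),(\d+)\]", master)
    if m is None:
        raise ValueError(f"not a local-cluster master URL: {master!r}")
    workers, cores, memory_mb = map(int, m.groups())
    return {
        "workers": workers,
        "cores_per_worker": cores,
        "memory_per_worker_mb": memory_mb,
    }

print(parse_local_cluster("local-cluster[2,1,512]"))
```

Spark itself parses this string internally; the sketch just makes the three-field layout explicit.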

Re: Contributing to Spark

2014-04-09 Thread Sujeet Varakhedi
Another starter question, which I probably should have asked before: what is the most efficient way to iterate quickly on dev/test? I am currently using a local cluster (via Vagrant and shared folders) and also spark-shell. Sujeet

Re: Contributing to Spark

2014-04-08 Thread Michael Ernest
Ha ha! nice try, sheepherder! ;-)

Re: Contributing to Spark

2014-04-08 Thread Matei Zaharia
Shh, maybe I really wanted people to fix that one issue.

Re: Contributing to Spark

2014-04-08 Thread Aaron Davidson
Matei's link seems to point to a specific starter project as part of the starter list, but here is the list itself: https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20labels%20%3D%20Starter%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)
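The percent-encoded query string in Aaron's link is a JQL filter; decoding it with Python's standard library shows what the search actually matches (the variable names below are illustrative):

```python
from urllib.parse import unquote

# Percent-encoded JQL taken from the starter-issues link above.
encoded = (
    "project%20%3D%20SPARK%20AND%20labels%20%3D%20Starter%20AND%20"
    "status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)"
)

jql = unquote(encoded)
print(jql)
# project = SPARK AND labels = Starter AND status in (Open, "In Progress", Reopened)
```

So the list is simply: open, in-progress, or reopened SPARK issues carrying the Starter label.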

Re: Contributing to Spark

2014-04-07 Thread Matei Zaharia
I’d suggest looking for the issues labeled “Starter” on JIRA. You can find them here: https://issues.apache.org/jira/browse/SPARK-1438?jql=project%20%3D%20SPARK%20AND%20labels%20%3D%20Starter%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened) Matei

Re: Contributing to Spark

2014-04-07 Thread Mukesh G
Hi Sujeet, Thanks. I went through the website and it looks great. Is there a list of items that I can choose from, for contribution? Thanks Mukesh

Re: Contributing to Spark

2014-04-07 Thread Sujeet Varakhedi
This is a good place to start: https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark Sujeet On Mon, Apr 7, 2014 at 9:20 AM, Mukesh G wrote: > Hi, how do I contribute to Spark and its associated projects? Appreciate the help... Thanks, Mukesh