Re: Contributing to PySpark

2016-10-18 Thread Holden Karau
Hi Krishna, Thanks for your interest in contributing to PySpark! I don't personally use either of those IDEs, so I'll leave that part for someone else to answer - but in general you can find the Building Spark documentation at http://spark.apache.org/docs/latest/building-spark.html, which includes note…
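For context, the build described on that page boils down to roughly the following. This is a hedged sketch, not a substitute for the linked documentation: the Maven flags and test-runner paths below reflect a typical Spark checkout and may differ between Spark versions.

```shell
# Sketch of building Spark from source for PySpark development
# (check http://spark.apache.org/docs/latest/building-spark.html
# for the options that apply to your version):
git clone https://github.com/apache/spark.git
cd spark

# Build with the bundled Maven wrapper, skipping tests for speed
./build/mvn -DskipTests clean package

# PySpark's own test runner lives under python/
./python/run-tests
```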

Re: Contributing to pyspark

2015-06-12 Thread Manoj Kumar
1. Yes, because the issues are in JIRA. 2. Nope (at least as far as MLlib is concerned), because most of it consists of thin wrappers around the underlying Scala functions or methods and is not implemented in pure Python. 3. I'm not sure about this. It seems to work fine for me! HTH On Fri, Jun 12, 2015 at…
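To illustrate the point in item 2: PySpark's MLlib classes mostly forward calls across the Py4J bridge to Scala implementations rather than reimplementing the algorithms in Python. The sketch below is a self-contained toy (the class names and the stub backend are invented for illustration, not PySpark source), showing only the shape of the wrapper pattern:

```python
# Toy illustration of the "thin wrapper" pattern used by PySpark MLlib:
# the Python class validates arguments and delegates, while the real
# logic lives on the other side of the bridge (here, a stub).

class ScalaBackendStub:
    """Stands in for the JVM-side implementation reached via Py4J."""

    def train(self, data, iterations):
        # In real PySpark the heavy lifting happens in Scala; the stub
        # just returns a result of the expected shape.
        return {"weights": [0.0] * len(data[0]), "iterations": iterations}


class PythonWrapper:
    """Python-facing API: argument checks, then delegation."""

    def __init__(self, backend):
        self._backend = backend

    def train(self, data, iterations=10):
        if iterations <= 0:
            raise ValueError("iterations must be positive")
        return self._backend.train(data, iterations)


model = PythonWrapper(ScalaBackendStub()).train([[1.0, 2.0]], iterations=5)
```

Because the Python layer is this thin, most algorithmic bugs and improvements belong on the Scala side, which is why pure-Python MLlib contributions are limited.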

Re: Contributing to pyspark

2015-06-11 Thread Manoj Kumar
Hi, Thanks for your interest in PySpark. The first thing is to have a look at the "how to contribute" guide https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark and filter the JIRAs using the label PySpark. If you have your own improvement in mind, you can file a JIRA, d…
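Once a JIRA is picked or filed, the usual flow on the Git side looks roughly like this. A hedged sketch only: the branch name is illustrative, and the contributing guide linked above is authoritative on the details:

```shell
# Work from your own GitHub fork of apache/spark
git clone https://github.com/<your-username>/spark.git
cd spark

# Branch named after the JIRA issue (placeholder number)
git checkout -b SPARK-XXXXX-short-description

# ...make your changes...

# Run Spark's test suite before pushing (this can take a while)
./dev/run-tests

# Push and open a pull request against apache/spark
git push origin SPARK-XXXXX-short-description
```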