Following what Ted said, if you use the `mvn` script inside Spark's `build/` directory, you'll get Zinc for free, which should help speed up build times.
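For example, something along these lines (the `-pl`/`-am` flags are standard Maven options for building only a subset of modules; the module path is just an illustration, adjust it to whatever you're actually changing):

    # full build; build/mvn downloads and starts Zinc for you
    ./build/mvn -DskipTests package

    # rebuild only one module, plus the modules it depends on
    ./build/mvn -DskipTests -pl mllib -am package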
On 5/1/15, 9:45 AM, "Ted Yu" <yuzhih...@gmail.com> wrote:

>Pramod:
>Please remember to run Zinc so that the build is faster.
>
>Cheers
>
>On Fri, May 1, 2015 at 9:36 AM, Ulanov, Alexander <alexander.ula...@hp.com> wrote:
>
>> Hi Pramod,
>>
>> For cluster-like tests you might want to use the same code as in mllib's
>> LocalClusterSparkContext. You can rebuild only the package that you change
>> and then run this main class.
>>
>> Best regards, Alexander
>>
>> -----Original Message-----
>> From: Pramod Biligiri [mailto:pramodbilig...@gmail.com]
>> Sent: Friday, May 01, 2015 1:46 AM
>> To: dev@spark.apache.org
>> Subject: Speeding up Spark build during development
>>
>> Hi,
>> I'm making some small changes to the Spark codebase and trying it out on a
>> cluster. I was wondering if there's a faster way to build than running the
>> package target each time.
>> Currently I'm using: mvn -DskipTests package
>>
>> All the nodes have the same filesystem mounted at the same mount point.
>>
>> Pramod
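For reference, a minimal sketch in the spirit of the local-cluster testing Alexander mentions (the object name and the executor/core/memory numbers below are just placeholders; the real LocalClusterSparkContext lives in mllib's test sources, and local-cluster mode expects a built Spark distribution on the machine):

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative only: spins up an in-process "cluster" of 2 executors,
    // 1 core and 1024 MB each, without needing a real cluster manager.
    object LocalClusterSmokeTest {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setMaster("local-cluster[2, 1, 1024]")
          .setAppName("local-cluster-smoke-test")
        val sc = new SparkContext(conf)
        try {
          // quick sanity check that jobs actually run on the executors
          println(sc.parallelize(1 to 100, 4).sum())
        } finally {
          sc.stop()
        }
      }
    }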