Hi,

I compiled using sbt and it takes less time. Thanks for the tip. I'm able
to run the examples related to MLlib in the pyspark shell (
https://spark.apache.org/docs/latest/mllib-linear-methods.html).
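
As an aside, the linear methods on that page (e.g. LinearRegressionWithSGD
in MLlib 1.x) fit their models by stochastic gradient descent. Here is a
minimal pure-Python sketch of the underlying SGD update for least squares,
just to illustrate the idea; the function and variable names are my own,
not MLlib's:

```python
import random

def sgd_linear_regression(points, step=0.01, iterations=1000, seed=0):
    """Fit y ~ w*x + b by stochastic gradient descent on squared error.

    points: list of (x, y) pairs; returns the fitted (w, b).
    """
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    for _ in range(iterations):
        x, y = rng.choice(points)      # sample one training point
        error = (w * x + b) - y        # prediction error on that point
        w -= step * error * x          # gradient of 0.5*error**2 w.r.t. w
        b -= step * error              # gradient of 0.5*error**2 w.r.t. b
    return w, b

# Noiseless data on the line y = 2x + 1; SGD should recover w ~ 2, b ~ 1.
data = [(float(x), 2.0 * x + 1.0) for x in range(10)]
w, b = sgd_linear_regression(data)
```

MLlib does the same kind of update, but distributed over an RDD and with
features as vectors rather than a single scalar.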

However, I got some errors related to Spark SQL while compiling. Is that
cause for concern? I have posted the errors as a gist here:
https://gist.github.com/MechCoder/7a9c89ee38e194513080

Thanks.


On Sat, Jan 3, 2015 at 11:54 PM, Michael Armbrust <mich...@databricks.com>
wrote:

> I'll add that most of the Spark developers I know use sbt for day-to-day
> development, as it can be much faster for incremental compilation and it has
> several nice features.
>
> On Sat, Jan 3, 2015 at 9:59 AM, Simon Elliston Ball <
> si...@simonellistonball.com> wrote:
>
>> You can use the same build commands, but it's well worth setting up a
>> zinc server if you're doing a lot of builds. That will allow incremental
>> scala builds, which speeds up the process significantly.
>>
>> SPARK-4501 might be of interest too.
>>
>> Simon
>>
>> On 3 Jan 2015, at 17:27, Manoj Kumar <manojkumarsivaraj...@gmail.com>
>> wrote:
>>
>> My question was: once I make changes to a source file, is there another
>>
>> command to rebuild only the changes (because a full build takes a lot of
>> time)?
>>
>> On Sat, Jan 3, 2015 at 10:40 PM, Manoj Kumar <
>> manojkumarsivaraj...@gmail.com> wrote:
>>
>>> Yes, I've built Spark successfully, using the same command
>>>
>>> mvn -DskipTests clean package
>>>
>>> but it succeeded this time because I am no longer working behind a proxy.
>>>
>>> Thanks.
>>>
>>>
>>>
>>
>>
>>
>> --
>> Godspeed,
>> Manoj Kumar,
>> Intern, Telecom ParisTech
>> Mech Undergrad
>> http://manojbits.wordpress.com
>>
>>
>


-- 
Godspeed,
Manoj Kumar,
Intern, Telecom ParisTech
Mech Undergrad
http://manojbits.wordpress.com
