Hi Jakob,

Thanks for your suggestion. I downloaded a pre-built version with Hadoop and
followed your steps.

I posted the result on the forum thread; I'm not sure whether you can see it.

I was wondering whether this means it has been installed successfully, as
there are a number of warning/error messages. Do I need to do anything else?

Thanks,

Aida

Sent from my iPhone

> On 8 Mar 2016, at 22:42, Jakob Odersky <ja...@odersky.com> wrote:
> 
> I've had some issues myself with the user-provided-Hadoop version.
> If you just want to get started, I would recommend downloading
> Spark (pre-built, with any of the Hadoop versions), as Cody suggested.
> 
> A simple step-by-step guide:
> 
> 1. curl -O http://apache.arvixe.com/spark/spark-1.6.0/spark-1.6.0-bin-hadoop2.6.tgz
> 
> 2. tar -xzf spark-1.6.0-bin-hadoop2.6.tgz
> 
> 3. cd spark-1.6.0-bin-hadoop2.6
> 
> 4. ./bin/spark-shell --master local[2]
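> 
> If a command later fails with "no such file or directory", the usual cause is not being at the top of the unpacked directory; note also that start-all.sh sits under sbin/, not bin/, in the binary packages. A hypothetical sanity check between steps 3 and 4:

```shell
# Hypothetical check: run this from inside spark-1.6.0-bin-hadoop2.6.
# In the binary distribution, start-all.sh is under sbin/, not bin/.
for f in bin/spark-shell sbin/start-all.sh; do
  [ -x "$f" ] || echo "missing: $f (are you at the top of the unpacked directory?)"
done
```

> If nothing is reported missing, step 4 should start the shell.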
> 
>> On Tue, Mar 8, 2016 at 2:01 PM, Aida Tefera <aida1.tef...@gmail.com> wrote:
>> Ok, once I downloaded the pre-built version, I created a directory for it
>> and named it Spark.
>> 
>> When I try ./bin/start-all.sh
>> 
>> It comes back with: no such file or directory
>> 
>> When I try ./bin/spark-shell --master local[2]
>> 
>> I get: no such file or directory
>> Failed to find spark assembly, you need to build Spark before running this 
>> program
>> 
>> 
>> 
>> Sent from my iPhone
>> 
>>> On 8 Mar 2016, at 21:50, Cody Koeninger <c...@koeninger.org> wrote:
>>> 
>>> That's what I'm saying: there is no "installing" necessary for
>>> pre-built packages. Just unpack it and change directory into it.
>>> 
>>> What happens when you do
>>> 
>>> ./bin/spark-shell --master local[2]
>>> 
>>> or
>>> 
>>> ./sbin/start-all.sh
>>> 
>>> 
>>> 
>>>> On Tue, Mar 8, 2016 at 3:45 PM, Aida Tefera <aida1.tef...@gmail.com> wrote:
>>>> Hi Cody, thanks for your reply
>>>> 
>>>> I tried "sbt/sbt clean assembly" in the Terminal; somehow I still end up 
>>>> with errors.
>>>> 
>>>> I have looked at the links below; they don't give much detail on how to
>>>> install it before executing "./sbin/start-master.sh".
>>>> 
>>>> Thanks,
>>>> 
>>>> Aida
>>>> Sent from my iPhone
>>>> 
>>>>> On 8 Mar 2016, at 19:02, Cody Koeninger <c...@koeninger.org> wrote:
>>>>> 
>>>>> You said you downloaded a prebuilt version.
>>>>> 
>>>>> You shouldn't have to mess with Maven or build Spark at all. All
>>>>> you need is a JVM, which it looks like you already have installed.
>>>>> 
>>>>> You should be able to follow the instructions at
>>>>> 
>>>>> http://spark.apache.org/docs/latest/
>>>>> 
>>>>> and
>>>>> 
>>>>> http://spark.apache.org/docs/latest/spark-standalone.html
>>>>> 
>>>>> If you want standalone mode (a master and several worker processes on
>>>>> your machine) rather than local mode (a single process on your machine),
>>>>> you need to set up passwordless ssh to localhost:
>>>>> 
>>>>> http://stackoverflow.com/questions/7134535/setup-passphraseless-ssh-to-localhost-on-os-x
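>>>>> 
>>>>> A rough sketch of that setup (assuming OpenSSH and an RSA key; the paths below are the defaults, adjust if yours differ):

```shell
# Generate a passphrase-less key only if none exists yet.
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa -q
# Authorize the key for logins to this machine.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# This should now return without prompting for a password:
ssh localhost true
```

>>>>> On OS X you also need Remote Login enabled under System Preferences > Sharing.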
>>>>> 
>>>>> 
>>>>> 
>>>>> On Tue, Mar 8, 2016 at 12:45 PM, Eduardo Costa Alfaia
>>>>> <e.costaalf...@unibs.it> wrote:
>>>>>> Hi Aida,
>>>>>> The build has detected Maven version 3.0.3. Update to 3.3.3 or later and
>>>>>> try again.
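>>>>>> 
>>>>>> One hypothetical way to check the version on PATH before rebuilding (the comparison is a plain numeric sort, not Maven's own logic):

```shell
# Hypothetical check: is the Maven on PATH at least 3.3.3?
required="3.3.3"
detected="$(mvn -version 2>/dev/null | awk 'NR==1 {print $3}')"
detected="${detected:-0}"   # treat "no mvn found" as too old
oldest="$(printf '%s\n' "$required" "$detected" | sort -t. -k1,1n -k2,2n -k3,3n | head -n1)"
if [ "$oldest" != "$required" ]; then
  echo "Maven $detected is older than $required; upgrade (e.g. brew install maven) before rebuilding"
fi
```

>>>>>> Alternatively, Spark's bundled build/mvn wrapper can fetch a suitable Maven by itself.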
>>>>>> 
>>>>>> On 08/Mar/2016 14:06, "Aida" <aida1.tef...@gmail.com> wrote:
>>>>>>> 
>>>>>>> Hi all,
>>>>>>> 
>>>>>>> Thanks everyone for your responses; really appreciate it.
>>>>>>> 
>>>>>>> Eduardo - I tried your suggestions but ran into some issues, please see
>>>>>>> below:
>>>>>>> 
>>>>>>> ukdrfs01:Spark aidatefera$ cd spark-1.6.0
>>>>>>> ukdrfs01:spark-1.6.0 aidatefera$ build/mvn -DskipTests clean package
>>>>>>> Using `mvn` from path: /usr/bin/mvn
>>>>>>> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
>>>>>>> [INFO] Scanning for projects...
>>>>>>> [INFO]
>>>>>>> ------------------------------------------------------------------------
>>>>>>> [INFO] Reactor Build Order:
>>>>>>> [INFO]
>>>>>>> [INFO] Spark Project Parent POM
>>>>>>> [INFO] Spark Project Test Tags
>>>>>>> [INFO] Spark Project Launcher
>>>>>>> [INFO] Spark Project Networking
>>>>>>> [INFO] Spark Project Shuffle Streaming Service
>>>>>>> [INFO] Spark Project Unsafe
>>>>>>> [INFO] Spark Project Core
>>>>>>> [INFO] Spark Project Bagel
>>>>>>> [INFO] Spark Project GraphX
>>>>>>> [INFO] Spark Project Streaming
>>>>>>> [INFO] Spark Project Catalyst
>>>>>>> [INFO] Spark Project SQL
>>>>>>> [INFO] Spark Project ML Library
>>>>>>> [INFO] Spark Project Tools
>>>>>>> [INFO] Spark Project Hive
>>>>>>> [INFO] Spark Project Docker Integration Tests
>>>>>>> [INFO] Spark Project REPL
>>>>>>> [INFO] Spark Project Assembly
>>>>>>> [INFO] Spark Project External Twitter
>>>>>>> [INFO] Spark Project External Flume Sink
>>>>>>> [INFO] Spark Project External Flume
>>>>>>> [INFO] Spark Project External Flume Assembly
>>>>>>> [INFO] Spark Project External MQTT
>>>>>>> [INFO] Spark Project External MQTT Assembly
>>>>>>> [INFO] Spark Project External ZeroMQ
>>>>>>> [INFO] Spark Project External Kafka
>>>>>>> [INFO] Spark Project Examples
>>>>>>> [INFO] Spark Project External Kafka Assembly
>>>>>>> [INFO]
>>>>>>> [INFO]
>>>>>>> ------------------------------------------------------------------------
>>>>>>> [INFO] Building Spark Project Parent POM 1.6.0
>>>>>>> [INFO]
>>>>>>> ------------------------------------------------------------------------
>>>>>>> [INFO]
>>>>>>> [INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @
>>>>>>> spark-parent_2.10 ---
>>>>>>> [INFO]
>>>>>>> [INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @
>>>>>>> spark-parent_2.10 ---
>>>>>>> [WARNING] Rule 0: org.apache.maven.plugins.enforcer.RequireMavenVersion
>>>>>>> failed with message:
>>>>>>> Detected Maven Version: 3.0.3 is not in the allowed range 3.3.3.
>>>>>>> [INFO]
>>>>>>> ------------------------------------------------------------------------
>>>>>>> [INFO] Reactor Summary:
>>>>>>> [INFO]
>>>>>>> [INFO] Spark Project Parent POM .......................... FAILURE [0.821s]
>>>>>>> [INFO] Spark Project Test Tags ........................... SKIPPED
>>>>>>> [INFO] Spark Project Launcher ............................ SKIPPED
>>>>>>> [INFO] Spark Project Networking .......................... SKIPPED
>>>>>>> [INFO] Spark Project Shuffle Streaming Service ........... SKIPPED
>>>>>>> [INFO] Spark Project Unsafe .............................. SKIPPED
>>>>>>> [INFO] Spark Project Core ................................ SKIPPED
>>>>>>> [INFO] Spark Project Bagel ............................... SKIPPED
>>>>>>> [INFO] Spark Project GraphX .............................. SKIPPED
>>>>>>> [INFO] Spark Project Streaming ........................... SKIPPED
>>>>>>> [INFO] Spark Project Catalyst ............................ SKIPPED
>>>>>>> [INFO] Spark Project SQL ................................. SKIPPED
>>>>>>> [INFO] Spark Project ML Library .......................... SKIPPED
>>>>>>> [INFO] Spark Project Tools ............................... SKIPPED
>>>>>>> [INFO] Spark Project Hive ................................ SKIPPED
>>>>>>> [INFO] Spark Project Docker Integration Tests ............ SKIPPED
>>>>>>> [INFO] Spark Project REPL ................................ SKIPPED
>>>>>>> [INFO] Spark Project Assembly ............................ SKIPPED
>>>>>>> [INFO] Spark Project External Twitter .................... SKIPPED
>>>>>>> [INFO] Spark Project External Flume Sink ................. SKIPPED
>>>>>>> [INFO] Spark Project External Flume ...................... SKIPPED
>>>>>>> [INFO] Spark Project External Flume Assembly ............. SKIPPED
>>>>>>> [INFO] Spark Project External MQTT ....................... SKIPPED
>>>>>>> [INFO] Spark Project External MQTT Assembly .............. SKIPPED
>>>>>>> [INFO] Spark Project External ZeroMQ ..................... SKIPPED
>>>>>>> [INFO] Spark Project External Kafka ...................... SKIPPED
>>>>>>> [INFO] Spark Project Examples ............................ SKIPPED
>>>>>>> [INFO] Spark Project External Kafka Assembly ............. SKIPPED
>>>>>>> [INFO]
>>>>>>> ------------------------------------------------------------------------
>>>>>>> [INFO] BUILD FAILURE
>>>>>>> [INFO]
>>>>>>> ------------------------------------------------------------------------
>>>>>>> [INFO] Total time: 1.745s
>>>>>>> [INFO] Finished at: Tue Mar 08 18:01:48 GMT 2016
>>>>>>> [INFO] Final Memory: 19M/183M
>>>>>>> [INFO]
>>>>>>> ------------------------------------------------------------------------
>>>>>>> [ERROR] Failed to execute goal
>>>>>>> org.apache.maven.plugins:maven-enforcer-plugin:1.4:enforce (enforce-versions)
>>>>>>> on project spark-parent_2.10: Some Enforcer rules have failed. Look above
>>>>>>> for specific messages explaining why the rule failed. -> [Help 1]
>>>>>>> [ERROR]
>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>>> [ERROR]
>>>>>>> [ERROR] For more information about the errors and possible solutions,
>>>>>>> please read the following articles:
>>>>>>> [ERROR] [Help 1]
>>>>>>> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
>>>>>>> ukdrfs01:spark-1.6.0 aidatefera$
>>>>>>> 
>>>>>>> --
>>>>>>> View this message in context:
>>>>>>> http://apache-spark-user-list.1001560.n3.nabble.com/Installing-Spark-on-Mac-tp26397p26431.html
>>>>>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>>>>> 
>>>>>>> ---------------------------------------------------------------------
>>>>>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>>>>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>>>> 
>>>>>> Privacy notice: http://www.unibs.it/node/8155
>> 
>> 

