If you are only interested in getting hands-on with Spark, and not in
building it against a specific version of Hadoop, use one of the bundled
providers like Cloudera.

It gives you a very easy way to install and monitor your services. (I find
installing via Cloudera Manager,
http://www.cloudera.com/content/cloudera/en/downloads.html, super easy.)
They are currently on Spark 1.2.
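
Once spark-shell comes up, a quick sanity check like the one below is a
minimal sketch (it assumes the built-in SparkContext "sc" that the Spark 1.x
shell creates for you) to confirm the install actually works:

    // run inside spark-shell; `sc` is provided by the shell
    val nums = sc.parallelize(1 to 100)          // distribute a small range as an RDD
    val evens = nums.filter(_ % 2 == 0).count()  // count the even numbers
    println(evens)                               // should print 50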

..Manas

On Mon, Mar 30, 2015 at 1:34 PM, vance46 <wang2...@purdue.edu> wrote:

> Hi all,
>
> I'm a newbie trying to set up Spark for my research project on a RedHat
> system. I've downloaded spark-1.3.0.tgz and untarred it, and installed
> Python, Java, and Scala. I've set JAVA_HOME and SCALA_HOME and then tried
> to run "sudo sbt/sbt assembly" according to
> https://docs.sigmoidanalytics.com/index.php/How_to_Install_Spark_on_CentOS6.
> It failed with "sbt command not found". I then tried to start spark-shell
> directly in "./bin" using "sudo ./bin/spark-shell" and still got "command
> not found". I appreciate your help in advance.
>
>
>
