Re: Apache Spark Installation error

2018-05-31 Thread Irving Duran
You probably need "spark-shell" to be recognized as a command in your environment. Maybe try "sudo ln -s /path/to/spark-shell /usr/bin/spark-shell". Have you tried "./spark-shell" from the directory it lives in, to see if it works? Thank You, Irving Duran On Thu, May 31, 2018 at 9:00 AM Remil Mohanan wrote:
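A minimal sketch of the options above, assuming Spark is unpacked under /opt/spark (a hypothetical path; adjust to wherever your distribution actually lives):

    # Option 1: run the shell directly from the unpacked distribution
    cd /opt/spark/bin && ./spark-shell

    # Option 2: symlink it onto the PATH, as suggested above
    sudo ln -s /opt/spark/bin/spark-shell /usr/bin/spark-shell

    # Option 3 (alternative to the symlink): put Spark's bin directory on PATH
    export SPARK_HOME=/opt/spark
    export PATH="$SPARK_HOME/bin:$PATH"

    # Either way, the command should now resolve from any directory
    spark-shell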

Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Ram Krishna
Thanks for the suggestion. Can you suggest where and how I should start from scratch to work on Spark? On Fri, Jun 10, 2016 at 8:18 PM, Holden Karau wrote: > So that's a bit complicated - you might want to start with reading the > code for the existing algorithms and go from there. If yo

Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Holden Karau
So that's a bit complicated - you might want to start with reading the code for the existing algorithms and go from there. If your goal is to contribute the algorithm to Spark you should probably take a look at the JIRA as well as the contributing to Spark guide on the wiki. Also we have a separate

Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Ram Krishna
Hi All, how do I add a new ML algorithm to Spark MLlib? On Fri, Jun 10, 2016 at 12:50 PM, Ram Krishna wrote: > Hi All, > > I am new to this field and want to implement a new ML algorithm using Spark > MLlib. What is the procedure? > > -- > Regards, > Ram Krishna KT -- Regards, Ram Krishna KT

Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Holden Karau
Hi Ram, Not super certain what you are looking to do. Are you looking to add a new algorithm to Spark MLlib for streaming, or use Spark MLlib on streaming data? Cheers, Holden On Friday, June 10, 2016, Ram Krishna wrote: > Hi All, > > I am new to this field, I want to implement a new ML alg

Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Ram Krishna
Hi All, I am new to this field and want to implement a new ML algorithm using Spark MLlib. What is the procedure? -- Regards, Ram Krishna KT

Re: Spark installation

2015-02-10 Thread prabeesh k
Refer to this blog for a step-by-step installation. On 11 February 2015 at 03:42, Mohit Singh wrote: > For a local machine, I don't think there is anything to install. Just unzip and > go to $SPARK_DIR/bin/spark-shell and t

Re: Spark installation

2015-02-10 Thread Mohit Singh
For a local machine, I don't think there is anything to install. Just unzip, go to $SPARK_DIR/bin/spark-shell, and that will open up a REPL... On Tue, Feb 10, 2015 at 3:25 PM, King sami wrote: > Hi, > > I'm new to Spark. I want to install it on my local machine (Ubuntu 12.04) > Could you help me plea
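A minimal sketch of that local setup; the release version and download URL below are illustrative assumptions, not from the thread:

    # Download and unpack a prebuilt Spark release (no installation step needed)
    wget https://archive.apache.org/dist/spark/spark-1.2.1/spark-1.2.1-bin-hadoop2.4.tgz
    tar -xzf spark-1.2.1-bin-hadoop2.4.tgz
    cd spark-1.2.1-bin-hadoop2.4

    # Launch the Scala REPL straight from the unpacked directory
    ./bin/spark-shell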

Spark installation

2015-02-10 Thread King sami
Hi, I'm new to Spark. I want to install it on my local machine (Ubuntu 12.04). Could you please help me install Spark step by step on my machine and run some Scala programs? Thanks

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-27 Thread varun sharma
This works for me: export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m" && mvn -DskipTests clean package

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-24 Thread Vladimir Protsenko
>>> Thanks for the clarification Sean. >>> >>> Best Regards, >>> Guru Medasani >>> >>> > From: so...@cloudera.com >>> > Date: Tue, 23 Dec 2014 15:39:59 + >>> > Subject: Re: Spark Installation Maven Perm

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-24 Thread Sean Owen
> ...doesn't work also. > > Best Regards, > Vladimir Protsenko > > 2014-12-23 19:45 GMT+04:00 Guru Medasani : >> Thanks for the clarification Sean. >> >> Best Regards, >> Guru Medasani >> >> > From: so...@cloudera.com

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-24 Thread Vladimir Protsenko
> Guru Medasani > > > From: so...@cloudera.com > > Date: Tue, 23 Dec 2014 15:39:59 + > > Subject: Re: Spark Installation Maven PermGen OutOfMemoryException > > To: gdm...@outlook.com > > CC: protsenk...@gmail.com; user@spark.apache.org

RE: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Guru Medasani
Thanks for the clarification, Sean. Best Regards, Guru Medasani > From: so...@cloudera.com > Date: Tue, 23 Dec 2014 15:39:59 + > Subject: Re: Spark Installation Maven PermGen OutOfMemoryException > To: gdm...@outlook.com > CC: protsenk...@gmail.com; user@spark.apache.or

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Sean Owen
The text there is actually unclear. In Java 8 you still need to set the max heap size (-Xmx2g); the optional bit is actually the "-XX:MaxPermSize=512M". Java 8 no longer has a separate permanent generation. On Tue, Dec 23, 2014 at 3:32 PM, Guru Medasani wrote: > Hi Vladimir, > > From the link Se
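A sketch of the two settings implied by Sean's note, assuming a Spark 1.x Maven build; the sizes follow the command quoted elsewhere in this thread:

    # Java 7 and earlier: the permanent generation still exists, so cap it
    export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

    # Java 8: no permanent generation; MaxPermSize can be dropped, heap cap stays
    export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"

    mvn -DskipTests clean package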

RE: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Guru Medasani
> Date: Tue, 23 Dec 2014 15:04:42 + > Subject: Re: Spark Installation Maven PermGen OutOfMemoryException > To: protsenk...@gmail.com > CC: user@spark.apache.org > > You might try a little more. The official guidance suggests 2GB: > > https://spark.apache.org/docs/latest/building-spark.html#

RE: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Somnath Pandeya
I think you should use a minimum of 2 GB of memory when building it with Maven. -Somnath -Original Message- From: Vladimir Protsenko [mailto:protsenk...@gmail.com] Sent: Tuesday, December 23, 2014 8:28 PM To: user@spark.apache.org Subject: Spark Installation Maven PermGen

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Sean Owen
> export MAVEN_OPTS=`-Xmx=1500m -XX:MaxPermSize=512m > -XX:ReservedCodeCacheSize=512m` doesn't help. > > What is a straightforward way to start using Spark?

Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Vladimir Protsenko
...? export MAVEN_OPTS=`-Xmx=1500m -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=512m` doesn't help. What is a straightforward way to start using Spark?

Re: Spark Installation

2014-07-08 Thread 田毅
Hi Srikrishna, the reason for this issue is that you uploaded the assembly jar to HDFS twice. Pasting your command here would allow a better diagnosis. 田毅

Re: Spark Installation

2014-07-08 Thread Srikrishna S
Hi All, I tried the make-distribution script and it worked well. I was able to compile the Spark binaries on our CDH5 cluster. Once I compiled Spark, I copied the binaries in the dist folder over to all the other machines in the cluster. However, I ran into an issue while submitting a job in yarn-client mode.

Re: Spark Installation

2014-07-08 Thread Sandy Ryza
Hi Srikrishna, The binaries are built with something like mvn package -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1 -Dyarn.version=2.3.0-cdh5.0.1 -Sandy On Tue, Jul 8, 2014 at 3:14 AM, 田毅 wrote: > try this command: > > make-distribution.sh --hadoop 2.3.0-cdh5.0.0 --with-yarn --with-hive > > 田毅
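Spelled out as a full invocation, that is roughly the following; the -DskipTests flag is an assumption to shorten the build, and the version strings must match your cluster:

    # Build Spark against CDH5's Hadoop and YARN artifacts
    mvn -DskipTests clean package -Pyarn \
      -Dhadoop.version=2.3.0-cdh5.0.1 \
      -Dyarn.version=2.3.0-cdh5.0.1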

Re: Spark Installation

2014-07-08 Thread 田毅
try this command: make-distribution.sh --hadoop 2.3.0-cdh5.0.0 --with-yarn --with-hive 田毅

Re: Spark Installation

2014-07-08 Thread Sean Owen
On Tue, Jul 8, 2014 at 4:07 AM, Srikrishna S wrote: > Hi All, > > Does anyone know what the command-line arguments to mvn are to generate > the pre-built binary for Spark on Hadoop 2 (CDH5)? > > I would like to pull in a recent bug fix in spark-master and rebuild the > binaries in the exact same way

Re: Spark Installation

2014-07-07 Thread Krishna Sankar
Couldn't find any reference to CDH in pom.xml - neither in the profiles nor in hadoop.version. Am also wondering how the CDH-compatible artifact was compiled. Cheers On Mon, Jul 7, 2014 at 8:07 PM, Srikrishna S wrote: > Hi All, > > Does anyone know what the command-line arguments to mvn are to generate > the

Re: Spark Installation

2014-07-07 Thread Jaideep Dhok
Hi Srikrishna, You can use the make-distribution script in Spark to generate the binary. Example - ./make-distribution.sh --tgz --hadoop HADOOP_VERSION The above script calls Maven, so you can look into it to get the exact mvn command too. Thanks, Jaideep On Tue, Jul 8, 2014 at 8:37 AM, Srikris
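For example, a hypothetical CDH5 invocation (the version string is illustrative; substitute your cluster's actual Hadoop version):

    # Produce a .tgz distribution built against a specific Hadoop version
    ./make-distribution.sh --tgz --hadoop 2.3.0-cdh5.0.1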

Spark Installation

2014-07-07 Thread Srikrishna S
Hi All, Does anyone know what the command-line arguments to mvn are to generate the pre-built binary for Spark on Hadoop 2 (CDH5)? I would like to pull in a recent bug fix in spark-master and rebuild the binaries in the exact same way as was used for the one provided on the website. I have tried th

Re: error with cdh 5 spark installation

2014-06-04 Thread Sean Owen
Spark is already part of the distribution, and the core CDH5 parcel. You shouldn't need extra steps unless you're doing something special. It may be that this is the very cause of the error when trying to install over the existing services. On Wed, Jun 4, 2014 at 3:19 PM, chirag lakhani wrote: >

Re: error with cdh 5 spark installation

2014-06-04 Thread Patrick Wendell
Hey Chirag, Those init scripts are part of the Cloudera Spark package (they are not in the Spark project itself) so you might try e-mailing their support lists directly. - Patrick On Wed, Jun 4, 2014 at 7:19 AM, chirag lakhani wrote: > I recently spun up an AWS cluster with cdh 5 using Cloudera

error with cdh 5 spark installation

2014-06-04 Thread chirag lakhani
I recently spun up an AWS cluster with CDH 5 using Cloudera Manager. I am trying to install Spark and simply used the install command, as stated in the CDH 5 documentation: sudo apt-get install spark-core spark-master spark-worker spark-python I get the following error: Setting up spark-master