Sure, thanks Prajod for the detailed steps!
bit1...@163.com
From: prajod.vettiyat...@wipro.com
Date: 2015-06-19 16:56
To: bit1...@163.com; ak...@sigmoidanalytics.com
CC: user@spark.apache.org
Subject: RE: RE: Build spark application into uber jar
Multiple Maven profiles may be the ideal way.
From: bit1...@163.com [mailto:bit1...@163.com]
Sent: 19 June 2015 13:01
To: Prajod S Vettiyattil (WT01 - BAS); Akhil Das
Cc: user
Subject: Re: RE: Build spark application into uber jar
Thanks.
I guess what you mean by "maven build target" is a Maven profile. I added two
profiles: one is LocalRun, the other is ClusterRun. In the ClusterRun profile,
the Spark dependencies are given the scope "provided".
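A minimal sketch of what such a pair of profiles could look like in pom.xml. The profile ids LocalRun and ClusterRun come from the thread; the property name spark.scope and the exact Spark artifact/version are assumptions used only to illustrate switching the dependency scope:

```xml
<!-- Hypothetical pom.xml fragment: switch the Spark dependency scope per profile -->
<properties>
  <!-- default: bundle Spark, so the app runs locally / inside the IDE -->
  <spark.scope>compile</spark.scope>
</properties>

<profiles>
  <profile>
    <id>LocalRun</id>
    <properties>
      <spark.scope>compile</spark.scope>
    </properties>
  </profile>
  <!-- ClusterRun: the cluster supplies Spark at runtime, keep it out of the uber jar -->
  <profile>
    <id>ClusterRun</id>
    <properties>
      <spark.scope>provided</spark.scope>
    </properties>
  </profile>
</profiles>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.3.1</version>
    <scope>${spark.scope}</scope>
  </dependency>
</dependencies>
```

Selecting a profile on the command line, e.g. mvn clean package -PClusterRun, would then build the uber jar (via maven-shade-plugin or similar) without the Spark classes.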
bit1...@163.com
From: prajod.vettiyat...@wipro.com
Date: 2015-06-19 15:22
To: bit1...@163.com; ak...@sigmoidanalytics.com
CC: user@spark.apache.org
Subject: RE: Re: Build spark application into uber jar
Hi,
When running inside the Eclipse IDE, I use another Maven target. Each target
can have its own command line options.
prajod
From: bit1...@163.com [mailto:bit1...@163.com]
Sent: 19 June 2015 12:36
To: Akhil Das; Prajod S Vettiyattil (WT01 - BAS)
Cc: user
Subject: Re: Re: Build spark application into uber jar
Thank you Akhil.
Hmm.. but I am using Maven as the build tool.
@163.com
From: prajod.vettiyat...@wipro.com
Date: 2015-06-19 14:39
To: user@spark.apache.org
Subject: RE: Build spark application into uber jar
> but when I run the application locally, it complains that spark related stuff
> is missing
I use the uber jar option. What do you mean by "locally"? In the Spark scala
shell? In the
This is how I used to build an assembly jar with sbt.
Your build.sbt file would look like this:
import AssemblyKeys._
assemblySettings
name := "FirstScala"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
libraryDepend
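For reference, a complete build.sbt along the same lines might look like the sketch below. This is an assumption-laden reconstruction, not the original message: it presumes sbt 0.13 with the sbt-assembly 0.11.x plugin wired in via project/assembly.sbt, and it marks spark-core as "provided" so the Spark classes stay out of the uber jar when the job is submitted to a cluster:

```scala
// Sketch of a full build.sbt for sbt-assembly 0.11.x.
// Assumption: the plugin is enabled in project/assembly.sbt with
//   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
import AssemblyKeys._

assemblySettings

name := "FirstScala"

version := "1.0"

scalaVersion := "2.10.4"

// "provided": the cluster supplies Spark at runtime, so it is excluded
// from the assembled (uber) jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1" % "provided"
```

Running sbt assembly would then typically produce target/scala-2.10/FirstScala-assembly-1.0.jar, which can be handed to spark-submit.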
From: bit1...@163.com [mailto:bit1...@163.com]
Sent: 19 June 2015 08:11
To: user
Subject: Build spark application into uber jar