Sure, thanks Prajod for the detailed steps!
bit1...@163.com
From: prajod.vettiyat...@wipro.com
Date: 2015-06-19 16:56
To: bit1...@163.com; ak...@sigmoidanalytics.com
CC: user@spark.apache.org
Subject: RE: RE: Build spark application into uber jar
Multiple maven profiles may be the ideal way
From: bit1...@163.com [mailto:bit1...@163.com]
Sent: 19 June 2015 13:01
To: Prajod S Vettiyattil (WT01 - BAS); Akhil Das
Cc: user
Subject: Re: RE: Build spark application into uber jar
Thanks.
I guess what you mean by "maven build target" is a Maven profile. I added two
profiles: one is LocalRun, the other is ClusterRun. In the ClusterRun profile
the Spark dependencies are given the scope "provided".
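A minimal sketch of the two-profile setup described above. The profile names (LocalRun, ClusterRun) come from this thread; the `spark.scope` property name, the Spark artifact, and the version are illustrative assumptions, not something stated in the thread:

```xml
<!-- pom.xml fragment: switch the Spark dependency scope per profile.
     LocalRun keeps Spark on the compile classpath so the app can start
     with master local[*] inside the IDE; ClusterRun marks it provided
     so it stays out of the uber jar (the cluster supplies Spark). -->
<profiles>
  <profile>
    <id>LocalRun</id>
    <properties>
      <spark.scope>compile</spark.scope>
    </properties>
  </profile>
  <profile>
    <id>ClusterRun</id>
    <properties>
      <spark.scope>provided</spark.scope>
    </properties>
  </profile>
</profiles>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.0</version>
    <!-- Scope is resolved from whichever profile is active -->
    <scope>${spark.scope}</scope>
  </dependency>
</dependencies>
```

With this layout, something like `mvn package -PClusterRun` would build the uber jar without Spark classes, while activating LocalRun (or making it the default) lets the application run locally in the IDE.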
bit1...@163.com
From: prajod.vettiyat...@wipro.com
Date: 2015-06-19 15:22
To: bit1...@163.com; ak...@sigmoidanalytics.com
CC: user@spark.apache.org
Subject: RE: Re: Build spark application into uber jar
Hi,
When running inside the Eclipse IDE, I use another maven target. Each target
can have its own command line options.
prajod
From: bit1...@163.com [mailto:bit1...@163.com]
Sent: 19 June 2015 12:36
To: Akhil Das; Prajod S Vettiyattil (WT01 - BAS)
Cc: user
Subject: Re: Re: Build spark application into uber jar
Thank you Akhil.
Hmm.. but I am using Maven as the build tool.
Thank you for the reply.
"Run the application locally" means that I run the application in my IDE with
master as local[*].
When the Spark dependencies are marked as provided, I can't run it locally
because the Spark classes are missing from the classpath.
So, how do you work around this? Thanks!
bit1...@163.com
From: prajod.vettiyat...@wipro.com