Thanks, will try this out and get back...
On Tue, Jun 23, 2015 at 2:30 AM, Tathagata Das wrote:
Try adding the provided scope to these dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.4.0</version>
    <scope>provided</scope>
</dependency>

This prevents these artifacts from being included in the assembly.
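The thread does not say which plugin builds the assembly; assuming it is the maven-shade-plugin (a common choice), a minimal configuration might look like the sketch below. The shade plugin excludes provided-scope dependencies from the shaded jar by default, so marking spark-core and spark-streaming as provided is enough to keep them out.

```
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.3</version>
    <executions>
        <execution>
            <!-- Build the shaded (assembly) jar during "mvn package" -->
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```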
Hi Tathagata,
I am attaching a snapshot of my pom.xml. It would help immensely if I could include max and min values in my mapper phase.
The question is still open at :
http://stackoverflow.com/questions/30902090/adding-max-and-min-in-spark-stream-in-java/30909796#30909796
I see that there is a
Hi Tathagata,
When you say "please mark spark-core and spark-streaming as dependencies," what do you mean?
I have installed the pre-built Spark 1.4 for Hadoop 2.6 from the Spark downloads page. In my Maven pom.xml, I am using version 1.4 as described.
Please let me know how I can fix that.
Thanks
Nipun
I think you may be including a different version of Spark Streaming in your
assembly. Please mark spark-core and spark-streaming as provided
dependencies. Any installation of Spark will automatically provide Spark in
the classpath so you do not have to bundle it.
On Thu, Jun 18, 2015 at 8:44 AM, Nipun wrote:
Hi,
I have the following piece of code, where I am trying to transform a Spark
stream and add the min and max of each RDD to it. However, at run time I get
an error saying the max call does not exist (it compiles properly). I am
using Spark 1.4.
I have added the question to Stack Overflow as well:
http://stackoverflow.com/questions/30902090/adding-max-and-min-in-spark-stream-in-java/30909796#30909796
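For the per-batch max/min itself, Spark Streaming's JavaDStream.reduce can fold each batch down to a single value, e.g. stream.reduce((a, b) -> Math.max(a, b)). Below is a plain-Java sketch of that per-batch reduction logic only (no Spark dependency, so it runs standalone); the class and method names are illustrative, not from the thread.

```java
import java.util.Arrays;
import java.util.List;

public class BatchStats {
    // Stand-in for the reduction Spark Streaming would apply per batch via
    // JavaDStream.reduce((a, b) -> Math.max(a, b)).
    static int maxOf(List<Integer> batch) {
        return batch.stream().reduce(Integer.MIN_VALUE, Math::max);
    }

    // Same idea for the minimum of a batch.
    static int minOf(List<Integer> batch) {
        return batch.stream().reduce(Integer.MAX_VALUE, Math::min);
    }

    public static void main(String[] args) {
        List<Integer> batch = Arrays.asList(3, 7, 1, 9, 4);
        System.out.println("max=" + maxOf(batch)); // max=9
        System.out.println("min=" + minOf(batch)); // min=1
    }
}
```

In the streaming job, each reduce call returns a new JavaDStream whose single element per batch interval is the reduced value, which can then be printed or joined back with the original stream.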