Hi,
I am developing a Java process that will consume data from Kafka using
Apache Spark Streaming.
For this I am using the following:
Java:
openjdk version "11.0.1" 2018-10-16 LTS
OpenJDK Runtime Environment Zulu11.2+3 (build 11.0.1+13-LTS) OpenJDK 64-Bit
Server VM Zulu11.2+3 (build 11.0.1+13-LT
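For readers of the archive, here is a minimal sketch of such a consumer. It is written in Scala to match the other code in this thread (the poster is writing Java), and it assumes the spark-streaming-kafka-0-10 connector; the broker address, topic, and group id are placeholders, not values from this thread.

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaStreamSketch").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // All connection settings below are placeholders, not values from this thread.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "localhost:9092",
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "example-group",
      "auto.offset.reset"  -> "latest"
    )

    // Direct stream: each micro-batch reads its offsets straight from Kafka.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("example-topic"), kafkaParams)
    )

    // Print each micro-batch's message payloads.
    stream.map(_.value).print()

    ssc.start()
    ssc.awaitTermination()
  }
}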
Thank you Sean. Happy Diwali !!
-- Dilip
----- Original message -----
From: Xiao Li
To: user@spark.apache.org
Subject: Happy Diwali everyone!!!
Date: Wed, Nov 7, 2018 3:10 PM
Happy Diwali everyone!!!
comments on why the --files option may be redundant in your case.
Regards,
Dilip Biswal
Tel: 408-463-4980
dbis...@us.ibm.com
From: Giri
To: user@spark.apache.org
Date: 10/15/2015 02:44 AM
Subject: Re: SPARK SQL Error
Hi Ritchard,
Thank you so much again for your input. This
Thanks Sanjay. I will give it a try.
Thanks
Dilip
On Sat, Jan 3, 2015 at 11:25 PM, Sanjay Subramanian <sanjaysubraman...@yahoo.com> wrote:
> so I changed the code to
>
> rdd1InvIndex.join(rdd2Pair).map(str => str._2).groupByKey().map(str => (str._1, str._2.toList)).c
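For readers of the archive, a self-contained sketch of the quoted pipeline. The RDD contents are invented for illustration, and the truncated ".c" at the end of the quote is assumed to be collect():

import org.apache.spark.{SparkConf, SparkContext}

object JoinGroupSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("JoinGroupSketch").setMaster("local[2]"))

    // Toy inputs keyed the same way; the contents are illustrative only.
    val rdd1InvIndex = sc.parallelize(Seq((1, "a"), (1, "b"), (2, "c")))
    val rdd2Pair     = sc.parallelize(Seq((1, "x"), (2, "y")))

    val result = rdd1InvIndex.join(rdd2Pair) // (key, (left, right))
      .map(str => str._2)                    // drop the join key, keep (left, right)
      .groupByKey()                          // group the right values by the left value
      .map(str => (str._1, str._2.toList))   // materialize each group as a List
      .collect()                             // assuming the truncated ".c" was collect()

    result.foreach(println)
    sc.stop()
  }
}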
A simple "sbt assembly" is not working. Is there any other way to include
particular jars with the assembly command?
Regards,
Dilip
On Friday 11 July 2014 12:45 PM, Bill Jay wrote:
I have met similar issues. The reason is probably that spark-streaming-kafka
is not included in the Spark assembly; see the sketch below for the usual fix.
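A minimal build.sbt sketch of that fix, assuming the sbt-assembly plugin: the Spark core jars are scoped "provided" (the cluster supplies them), while the Kafka connector is left unscoped so the fat jar bundles it. The versions and merge strategy here are illustrative assumptions, not values from this thread.

// build.sbt -- a sketch, not a drop-in file; versions are assumptions
name := "kafka-streaming-app"

scalaVersion := "2.10.3"

libraryDependencies ++= Seq(
  // "provided": already on the cluster, so keep them out of the fat jar
  "org.apache.spark" %% "spark-core"      % "1.0.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.0.0" % "provided",
  // the Kafka connector is NOT in the Spark assembly, so bundle it
  "org.apache.spark" %% "spark-streaming-kafka" % "1.0.0"
)

// resolve the duplicate-file errors that often break `sbt assembly`
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}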
Is there anything I am missing?
Thanks,
Dilip
On Friday 11 July 2014 12:02 PM, Akhil Das wrote:
The easiest fix would be to add the Kafka jars to the SparkContext while
creating it, along the lines of the sketch below.
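A minimal sketch of that suggestion; the jar path and name are placeholders, not values from this thread.

import org.apache.spark.{SparkConf, SparkContext}

object AddJarsSketch {
  def main(args: Array[String]): Unit = {
    // Ship the connector jar to the executors when the context is created.
    val conf = new SparkConf()
      .setAppName("kafka-streaming-app")
      .setMaster("local[2]")
      .setJars(Seq("/path/to/spark-streaming-kafka_2.10-1.0.0.jar")) // placeholder path

    val sc = new SparkContext(conf)

    // Alternatively, after the context exists:
    // sc.addJar("/path/to/spark-streaming-kafka_2.10-1.0.0.jar")

    sc.stop()
  }
}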
Thanks
Best Regards
On Fri, Jul 11, 2014 at 4:39 AM, Dilip <dilip_ram...@hotmail.com> wrote:
Hi,
I
+= "Akka Repository" at "http://repo.akka.io/releases/";
resolvers += "Maven Repository" at "http://central.maven.org/maven2/";
sbt package was successful. I also tried sbt "++2.10.3 package" to build it
for my Scala version, but the problem remains the same.
Can anyone help me out here? I've been stuck on this for quite some time now.
Thank You,
Dilip