Check that you are actually getting data from the Kafka producer:
lines.foreachRDD(new Function<JavaRDD<String>, Void>() {
    @Override
    public Void call(JavaRDD<String> rdd) throws Exception {
        List<String> collect = rdd.collect();
        for (String line : collect) {
            System.out.println(line);
        }
        return null;
    }
});
Here's a simple working version.
import com.google.common.collect.Lists;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.a
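The rest of that message was cut off in the digest, so the "working version" itself is not recoverable. The per-batch word-count logic such a job applies (split each line into words, then reduce counts by key, mirroring the FlatMapFunction/Function2 pair imported above) can be sketched in plain Java; the class and method names below are illustrative, not from the original message:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

// Illustrative stand-in for the flatMap + mapToPair + reduceByKey chain
// that a Spark Streaming word count applies to each micro-batch.
public class WordCountSketch {

    public static Map<String, Integer> countWords(Iterable<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            // FlatMapFunction equivalent: one line -> many words
            for (String word : line.split("\\s+")) {
                if (word.isEmpty()) continue;
                // Function2 equivalent: (a, b) -> a + b per key
                counts.merge(word, 1, Integer::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts =
                countWords(Arrays.asList("to be or", "not to be"));
        System.out.println(counts.get("to"));  // 2
        System.out.println(counts.get("be"));  // 2
        System.out.println(counts.get("or"));  // 1
    }
}
```

In the real streaming job the same two functions are handed to `flatMap` and `reduceByKey` on the DStream instead of being run in a loop.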
I am not running locally. The Spark master is:
"spark://:7077"
On Mon, Nov 10, 2014 at 3:47 PM, Tathagata Das
wrote:
> What is the Spark master that you are using? Use local[4], not local,
> if you are running locally.
>
> On Mon, Nov 10, 2014 at 3:01 PM, Something Something
> wrote:
> > I a
What is the Spark master that you are using? Use local[4], not local,
if you are running locally.
On Mon, Nov 10, 2014 at 3:01 PM, Something Something
wrote:
> I am embarrassed to admit but I can't get a basic 'word count' to work under
> Kafka/Spark streaming. My code looks like this. I don't
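For reference, the advice above exists because with a receiver-based Kafka stream, `local` gives the application a single thread, and the Kafka receiver occupies it entirely, so no batches are ever processed and no counts appear. A minimal configuration sketch (Spark 1.x Java API; the app name is illustrative):

```java
import org.apache.spark.SparkConf;

// "local" = 1 thread, fully consumed by the Kafka receiver -> no output.
// "local[4]" (or "local[*]") leaves threads free for batch processing.
SparkConf conf = new SparkConf()
        .setAppName("KafkaWordCount")
        .setMaster("local[4]");   // or "spark://host:7077" on a cluster
```

On a standalone cluster the master URL replaces `local[4]`, but the same rule applies: the executors need more cores than there are receivers.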
I am embarrassed to admit it, but I can't get a basic 'word count' to work
under Kafka/Spark streaming. My code looks like this. I don't see any
word counts in the console output, and I don't see any output in the UI either.
Needless to say, I am a newbie with both 'Spark' and 'Kafka'.
Please help. Thanks.