-- Forwarded message --
From: Shixiong(Ryan) Zhu
Date: Fri, Jan 20, 2017 at 12:06 PM
Subject: Re: Spark streaming app that processes Kafka DStreams produces no output and no error
To: shyla deshpande
That's how KafkaConsumer works right now. It will retry forever for network errors.
There was an issue connecting to Kafka; once that was fixed, the Spark app
works. Hope this helps someone.
Thanks
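Since the consumer just keeps retrying in that situation, the streaming job itself never surfaces the broken connection. One quick way to rule this out is to probe the Kafka broker from the driver and worker hosts before starting the job; a minimal sketch in Scala, where the broker host and port are placeholders:

import java.net.{InetSocketAddress, Socket}

// Placeholder broker address; replace with the actual bootstrap server.
val brokerHost = "10.0.0.12"
val brokerPort = 9092

val socket = new Socket()
try {
  // Fail fast instead of retrying forever the way KafkaConsumer does.
  socket.connect(new InetSocketAddress(brokerHost, brokerPort), 5000)
  println(s"Reachable: $brokerHost:$brokerPort")
} catch {
  case e: Exception =>
    println(s"Cannot reach $brokerHost:$brokerPort -> ${e.getMessage}")
} finally {
  socket.close()
}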
On Mon, Jan 16, 2017 at 7:58 AM, shyla deshpande wrote:
Hello,
I checked the log file on the worker node and don't see any error there.
This is the first time I have been asked to run on such a small cluster. I feel
it's a resources issue, but it would be a great help if somebody could confirm
this or share their experience. Thanks
On Sat, Jan 14, 2017 at 4:01 PM, shyla deshpande wrote:
Hello,
I want to add that I don't even see the Streaming tab in the application UI
on port 4040 when I run it on the cluster.
The cluster on EC2 has 1 master node and 1 worker node.
The cores used on the worker node are 2 of 2 and the memory used is 6GB of 6.3GB.
Can I run a spark streaming job with such limited resources?
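For what it's worth, the resource request can also be made explicit instead of relying on the cluster defaults. This is a sketch only; the master URL, core count, and memory size are placeholders, and the same settings can be passed as spark-submit flags:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Placeholder values sized for a single 2-core, ~6GB worker.
val conf = new SparkConf()
  .setAppName("kafka-dstream-app")
  .setMaster("spark://master-host:7077")   // standalone master on EC2
  .set("spark.cores.max", "2")             // cap at what the single worker offers
  .set("spark.executor.memory", "4g")      // leave headroom under the 6.3GB limit

// Note: the Streaming tab in the UI only shows up after ssc.start() is called.
val ssc = new StreamingContext(conf, Seconds(5))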
Hello,
My Spark Streaming app that reads Kafka topics and prints the DStream works
fine on my laptop, but on the AWS cluster it produces no output and no errors.
Please help me debug.
I am using Spark 2.0.2 and kafka-0-10.
Thanks
The following is the output of the spark streaming app...
17/01/14
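For reference, here is a minimal sketch of the kind of app described above, following the Spark 2.0.2 spark-streaming-kafka-0-10 direct stream API; the broker address, topic name, and group id are placeholders:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object KafkaPrintApp {
  def main(args: Array[String]): Unit = {
    // Master is expected to come from spark-submit (--master spark://...:7077).
    val conf = new SparkConf().setAppName("kafka-print")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // Placeholder Kafka settings.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "broker1:9092",
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "kafka-print-group",
      "auto.offset.reset"  -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Array("my-topic"), kafkaParams))

    // print() writes the first records of each batch to the driver's stdout.
    stream.map(r => (r.key, r.value)).print()

    ssc.start()
    ssc.awaitTermination()
  }
}

Note that print() only goes to the driver's stdout; when the driver runs on the cluster the output may end up in the driver's log rather than the terminal, which can also look like "no output".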