Hi all,

Thanks for the information. I am running Spark Streaming on a YARN cluster,
and the configuration should be correct. I wrote the current code three
months ago following the KafkaWordCount example, and it has been working
ever since. The messages are in JSON format. In fact, this code still
worked a few days ago, but now it is not. Below is my spark-submit
script:
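
For reference, the receiving side of my code follows the KafkaWordCount
example almost verbatim; a rough sketch is below (the ZooKeeper quorum,
group id, and topic name are placeholders, not my real values):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

val sparkConf = new SparkConf().setAppName("KafkaJsonConsumer")
val ssc = new StreamingContext(sparkConf, Seconds(10))

// topic -> number of receiver threads for that topic
val topicMap = Map("my-topic" -> 1)
// createStream yields (key, message) pairs; keep only the JSON body
val lines = KafkaUtils.createStream(ssc, "zkhost:2181", "my-group", topicMap)
  .map(_._2)

lines.print()
ssc.start()
ssc.awaitTermination()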


SPARK_BIN=/home/hadoop/spark/bin/
# $1: application jar, $2: number of executors, $3-$5: application arguments
# (--master yarn-cluster already implies cluster deploy mode, so
#  --deploy-mode cluster is redundant but harmless)
$SPARK_BIN/spark-submit \
     --class com.test \
     --master yarn-cluster \
     --deploy-mode cluster \
     --verbose \
     --driver-memory 20G \
     --executor-memory 20G \
     --executor-cores 6 \
     --num-executors $2 \
     $1 $3 $4 $5
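
For context, the script takes the application jar, the executor count, and
then three arguments passed through to the application, e.g. (the script
name and all values here are made up, not my real ones):

./submit.sh my-streaming-app.jar 10 arg1 arg2 arg3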

Thanks!

Bill

On Wed, Nov 12, 2014 at 4:53 PM, Shao, Saisai <saisai.s...@intel.com> wrote:

>  Did you configure the Spark master as local? It should be local[n] with
> n > 1 for local mode. Besides, there is a Kafka word count example in the
> Spark Streaming examples; you can try that. I've tested it with the latest
> master, and it's OK.
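>
> For illustration, a minimal local-mode setup would look like this (the
> app name is just a placeholder):
>
> import org.apache.spark.SparkConf
> import org.apache.spark.streaming.{Seconds, StreamingContext}
>
> // local[2]: one thread runs the Kafka receiver, the other processes
> // batches. A plain "local" master gives the receiver the only thread,
> // so batches never run and it looks like no data arrives.
> val conf = new SparkConf().setMaster("local[2]").setAppName("KafkaTest")
> val ssc = new StreamingContext(conf, Seconds(10))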
>
> Thanks
>
> Jerry
>
> From: Tobias Pfeiffer [mailto:t...@preferred.jp]
> Sent: Thursday, November 13, 2014 8:45 AM
> To: Bill Jay
> Cc: u...@spark.incubator.apache.org
> Subject: Re: Spark streaming cannot receive any message from Kafka
>
> Bill,
>
>   However, now that I am using Spark 1.1.0, the Spark Streaming job
> cannot receive any messages from Kafka. I have not made any changes to
> the code.
>
> Do you see any suspicious messages in the log output?
>
> Tobias