xtrahotsauce wrote:
> I had this same problem as well. I ended up just adding the necessary code
> to KafkaUtil and compiling my own Spark jar. Something like this for the
> "raw" stream:
> 
>   def createRawStream(
>       jssc: JavaStreamingContext,
>       kafkaParams: JMap[String, String],
>       topics: JMap[String, JInt]
>     ): JavaPairDStream[Array[Byte], Array[Byte]] = {
>     new KafkaInputDStream[Array[Byte], Array[Byte], DefaultDecoder, DefaultDecoder](
>       jssc.ssc,
>       kafkaParams.toMap,
>       Map(topics.mapValues(_.intValue()).toSeq: _*),
>       StorageLevel.MEMORY_AND_DISK_SER_2)
>   }


I had this same problem, and this solution worked for me too, so thanks for
posting it!
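
In case it helps anyone else, here is roughly how I am calling it from the
driver. This is only a sketch of my setup; the topic name, ZooKeeper address,
group id, and the "KafkaUtil" object (the recompiled class holding the quoted
method) are all placeholders from my own project:

  import java.util.{HashMap => JHashMap}
  import org.apache.spark.streaming.Duration
  import org.apache.spark.streaming.api.java.JavaStreamingContext

  val jssc = new JavaStreamingContext("local[2]", "raw-kafka-test", new Duration(2000))

  // Kafka consumer config -- the address and group id are placeholders
  val kafkaParams = new JHashMap[String, String]()
  kafkaParams.put("zookeeper.connect", "localhost:2181")
  kafkaParams.put("group.id", "raw-kafka-test")

  // topic -> number of consumer threads reading it
  val topics = new JHashMap[String, java.lang.Integer]()
  topics.put("my-topic", 2)

  // KafkaUtil is my patched object containing createRawStream
  val rawStream = KafkaUtil.createRawStream(jssc, kafkaParams, topics)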

One question... what is this line doing?

> Map(topics.mapValues(_.intValue()).toSeq: _*),

It appears to be converting the incoming Map[String, Integer] into another
Map[String, Integer], so I'm not seeing the purpose of it. Help? (I'm a bit
of a Scala newbie.)
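
For what it's worth, here is my attempt to trace the types in the REPL (this
assumes scala.collection.JavaConversions is in scope, which is presumably how
the Java map gets a mapValues method in the first place; the map contents are
made up):

  import scala.collection.JavaConversions._

  val topics = new java.util.HashMap[String, java.lang.Integer]()
  topics.put("my-topic", 2)

  // the implicit conversion wraps the Java map as a scala.collection.mutable.Map,
  // and mapValues unboxes each java.lang.Integer into a scala Int
  val view = topics.mapValues(_.intValue())

  // splatting the pairs into Map(...) builds a scala.collection.immutable.Map[String, Int]
  val copied = Map(view.toSeq: _*)

So if I'm reading it right, it crosses from a Java map to an immutable Scala
map, but I'd still like to understand why the explicit rebuild through toSeq
is needed.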



