val topics="first"

Shouldn't it be val topics = Set("first")?
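Right — in the Spark 1.x direct-stream API the topics argument is a Set[String], not a comma-separated String. A minimal sketch of the corrected call, reusing the names from the quoted code (ssc and the broker address are assumed to be set up as in that snippet):

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.streaming.kafka.KafkaUtils

val kafkaParams = Map[String, String]("metadata.broker.list" -> "localhost:9091")

// topics must be a Set[String]; passing a plain String only matches the
// receiver-based createStream overloads, not createDirectStream
val topicSet = "first".split(",").toSet

val messages = KafkaUtils.createDirectStream[String, String,
  StringDecoder, StringDecoder](ssc, kafkaParams, topicSet)
```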

On Sun, Sep 20, 2015 at 1:01 PM, Petr Novak <oss.mli...@gmail.com> wrote:

> val topics="first"
>
> shouldn't it be val topics = Set("first") ?
>
> On Sat, Sep 19, 2015 at 10:07 PM, kali.tumm...@gmail.com <
> kali.tumm...@gmail.com> wrote:
>
>> Hi ,
>>
>> I am trying to develop the same code in IntelliJ IDEA and I am hitting
>> the same issue. Is there any workaround?
>>
>> Error in IntelliJ: cannot resolve symbol createDirectStream
>>
>> import kafka.serializer.StringDecoder
>> import org.apache.log4j.{Level, Logger}
>> import org.apache.spark._
>> import org.apache.spark.streaming._
>> import org.apache.spark.streaming.kafka._
>> import com.datastax.spark.connector.streaming._
>>
>> object SparkKafkaOffsetTest {
>>
>>   def main(args: Array[String]): Unit = {
>>     //Logger.getLogger("org").setLevel(Level.WARN)
>>     //Logger.getLogger("akka").setLevel(Level.WARN)
>>
>>     // "local" gives the app a single core; a streaming job needs more,
>>     // so use local[*]
>>     val conf = new SparkConf()
>>       .setMaster("local[*]")
>>       .setAppName("KafkaOffsetStreaming")
>>       .set("spark.executor.memory", "1g")
>>     val sc = new SparkContext(conf)
>>     val ssc = new StreamingContext(sc, Seconds(2))
>>
>>     val zkQuorum = "localhost:2181"   // typo fixed: was zkQuorm
>>     val group = "test-group"
>>     val topics = "first"
>>     val numThreads = 1
>>     val broker = "localhost:9091"
>>     val kafkaParams = Map[String, String]("metadata.broker.list" -> broker)
>>
>>     // the receiver-based createStream takes a Map of topic -> thread count
>>     val topicMap = topics.split(",").map((_, numThreads)).toMap
>>     //val lineMap = KafkaUtils.createStream(ssc, zkQuorum, group, topicMap)
>>
>>     // createDirectStream takes a Set[String] of topic names, not a String
>>     val topicSet = topics.split(",").toSet
>>     val messages = KafkaUtils.createDirectStream[String, String,
>>       StringDecoder, StringDecoder](ssc, kafkaParams, topicSet)
>>
>>
>>
>>     // Signature template from the Spark docs:
>>     //   val directKafkaStream = KafkaUtils.createDirectStream[
>>     //     [key class], [value class], [key decoder class], [value decoder class]](
>>     //     streamingContext, [map of Kafka parameters], [set of topics to consume])
>>
>>   }
>>
>> }
>>
>> Thanks
>> Sri
>>
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Kafka-createDirectStream-issue-tp23456p24749.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>
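On the original "cannot resolve symbol createDirectStream" error: KafkaUtils lives in a separate module from spark-streaming, so in IntelliJ that symbol typically fails to resolve when the spark-streaming-kafka artifact is missing from the project. A hedged sbt sketch (the version number is an assumption — match it to your Spark version):

```scala
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % "1.5.0" % "provided",
  // KafkaUtils.createDirectStream is in this separate artifact
  "org.apache.spark" %% "spark-streaming-kafka" % "1.5.0"
)
```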
