Hi All,

I am running a simple command in spark-shell, like the one below. It's a piece of
Structured Streaming code.

val lines = (spark
   .readStream
   .format("kafka")
   .option("kafka.bootstrap.servers", "localhost:9092")
   .option("subscribe", "test")
   .load()
   .selectExpr("CAST(value AS STRING)")
   .as[String]
    )
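
For completeness, even evaluating the snippet above already fails (the error below shows up when the kafka source is looked up in load()), so I have not got as far as actually starting a query. What I would run next is just a console sink for testing, roughly this sketch (the sink and output mode are only placeholders):

// Sketch only: how I would consume `lines` once the kafka source resolves.
// The console sink is just for local testing.
val query = lines.writeStream
  .format("console")
  .outputMode("append")
  .start()

query.awaitTermination()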

I also downloaded spark-streaming-kafka-0-10-assembly_2.11-2.1.0.jar and
placed it in the jars folder, re-ran spark-shell, and executed the above command,
but no luck. I am getting the following error:

java.lang.ClassNotFoundException: Failed to find data source: kafka. Please
find packages at http://spark.apache.org/third-party-projects.html
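
In case it matters, I assumed that placing the jar in the jars folder is equivalent to passing it explicitly when launching the shell, i.e. something like (path shortened):

spark-shell --jars spark-streaming-kafka-0-10-assembly_2.11-2.1.0.jar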

Please suggest if I am missing anything here. I am running Spark on Windows.



