> libraryDependencies ++= Seq(
>   "org.apache.spark" %% "spark-core" % "1.1.0",
>   "org.apache.spark" %% "spark-streaming" % "1.1.0",
>   "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0"
> )
424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 17 more
On Tue, Oct 14, 2014 at 12:05 AM, Akhil Das wrote:
> Just make sure you have the same version of spark-streaming-kafka
> <http://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka_2.10>
> jar an
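Akhil's advice above amounts to keeping every Spark artifact on a single version so the streaming-kafka jar matches spark-core and spark-streaming. A minimal sbt sketch, assuming Scala 2.10 (to match the `_2.10` artifact suffix in the linked repository); the version number is illustrative:

```scala
// build.sbt sketch: pin one version value so all Spark artifacts agree.
val sparkVersion = "1.1.0"

scalaVersion := "2.10.4"  // matches the _2.10 suffix of the published jars

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"            % sparkVersion,
  "org.apache.spark" %% "spark-streaming"       % sparkVersion,
  "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion
)
```

Using one `val` for the version makes it impossible for the three artifacts to drift apart when upgrading.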
Hi
Can anyone share a project as a sample? I tried a couple of days ago but
couldn't make it work. It looks like it's due to some Kafka dependency issue.
I'm using sbt-assembly.
Thanks
Gary
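With sbt-assembly, Kafka builds often fail because the Spark and Kafka jars ship duplicate files (manifests, license files) that the assembly step refuses to merge. A common workaround is a merge strategy that discards the conflicting metadata; this is a sketch assuming the sbt-assembly 0.11.x plugin syntax of that era, and the plugin version shown is illustrative:

```scala
// project/assembly.sbt (hypothetical version):
// addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

// build.sbt additions:
import sbtassembly.Plugin._
import AssemblyKeys._

assemblySettings

// Discard conflicting META-INF entries and keep the first copy of
// everything else when merging dependency jars into the fat jar.
mergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}
```

Discarding all of `META-INF` is a blunt instrument (it also drops service files), but it is usually enough to get a first fat jar building.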
Hello
I'm trying to connect to Kafka in the Spark shell, but it failed as below. Could
you take a look at what I missed?
scala> val kafkaStream = KafkaUtils.createStream(ssc,
"test-vip.snc1:2181", "test_spark", Map("user-test"->1))
error: bad symbolic reference. A signature in KafkaUtils.class refers to
term
Hello
I'm new to Spark and I couldn't get the SimpleApp example to run on my MacBook.
I suspect it's related to network configuration. Could anyone take a look?
Thanks.
14/09/14 10:10:36 INFO Utils: Fetching
http://10.63.93.115:59005/jars/simple-project_2.11-1.0.jar to
/var/folders/3p/l2d9ljnx7f99q8hmms3wpcg4
Hello
I'm new to Spark and playing around, but I ran into the following error. Could
anyone help with it?
Thanks
Gary
scala> c
res15: org.apache.spark.rdd.RDD[String] = FlatMappedRDD[7] at flatMap at
:23
scala> group
res16: org.apache.spark.rdd.RDD[(String, Iterable[String])] =
MappedValuesRDD[5] a
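The REPL output above shows a `flatMap` producing an `RDD[String]` and a grouping step producing an `RDD[(String, Iterable[String])]`. The same shapes can be reproduced on plain Scala collections, which is a handy way to sanity-check a pipeline before running it on an RDD; the sample input here is made up:

```scala
// Plain-Scala sketch of the two pipeline stages shown in the REPL:
// flatMap flattens lines into tokens, groupBy pairs each key with
// the group of elements that share it. Input data is illustrative.
val lines = Seq("a b", "b c")

// Analogue of rdd.flatMap(_.split(" ")): a flat Seq[String] of tokens.
val words = lines.flatMap(_.split(" "))

// Analogue of grouping tokens by the token itself:
// a Map[String, Seq[String]] instead of RDD[(String, Iterable[String])].
val grouped = words.groupBy(identity)

println(words)        // all tokens in order
println(grouped("b")) // every occurrence of "b"
```

The collection and RDD APIs deliberately mirror each other here, so logic debugged this way usually transfers to the cluster version unchanged.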
Thanks Andrew. How did you do it?
On Thu, Aug 7, 2014 at 10:20 PM, Andrew Ash wrote:
> Yes, I've done it before.
>
>
> On Thu, Aug 7, 2014 at 10:18 PM, Gary Zhao wrote:
>
>> Hello
>>
>> Is it possible to use spark-cassandra-connector in spark-shell?
>>
>> Thanks
>> Gary
>>
>
>
Hello
Is it possible to use spark-cassandra-connector in spark-shell?
Thanks
Gary
Hello
I'm trying to modify the Spark sample app to integrate with Cassandra; however,
I saw an exception when submitting the app. Does anyone know why it happens?
Exception in thread "main" java.lang.NoClassDefFoundError:
com/datastax/spark/connector/rdd/reader/RowReaderFactory
at SimpleApp.main(SimpleApp.sc
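A `NoClassDefFoundError` for `RowReaderFactory` means the connector classes were on the compile classpath but not on the classpath at submit time. One way to address it is to declare the connector in sbt and then either build a fat jar (e.g. with sbt-assembly) or pass the connector jar to `spark-submit --jars`. A sketch, with illustrative version numbers:

```scala
// build.sbt sketch: spark-core can be "provided" if the cluster ships it;
// the connector must actually end up in the submitted jar or on --jars.
libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-core"                % "1.1.0" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0"
)
```

Marking Spark itself as `provided` keeps the fat jar small while still shipping the connector classes the driver and executors need at runtime.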