You need to use 2.0.0-M2-s_2.11 since Spark 2.0 is compiled with Scala 2.11
by default.
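Assuming the connector is fetched from Spark Packages at launch time, the coordinate would follow the Spark Packages naming scheme, along these lines (verify the exact version against the repository):

```
pyspark --packages datastax:spark-cassandra-connector:2.0.0-M2-s_2.11
```

The `-s_2.11` suffix selects the build of the connector compiled against Scala 2.11, matching the prebuilt Spark 2.0 binaries.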
> [...] any conflicts with cached or older versions. I only have the
> SPARK_HOME environment variable set (of the env variables related to
> Spark and Python).
>
> --
> *From:* Russell Spitzer
> *To:* Trivedi Amit; "user@spark.apache.org" <user@spark.apache.org>
> *Sent:* Wednesday, September 14, 2016 11:24 PM
> *Subject:* Re: Write to Cassandra table from pyspark fails with scala
> reflect error
Spark 2.0 defaults to Scala 2.11, so if you didn't build it yourself you
need the 2.11 artifact for the Spark Cassandra Connector.
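As an aside, one quick way to confirm which Scala version a Spark build uses from pyspark is to ask the JVM through pyspark's internal py4j gateway (a diagnostic trick relying on the non-public `_jvm` attribute, not a supported API):

```python
# Run inside the pyspark shell, where sc is the SparkContext.
# scala.util.Properties.versionString returns e.g. "version 2.11.8".
print(sc._jvm.scala.util.Properties.versionString())
# A 2.11.x result means you need the _2.11 / -s_2.11 connector artifact.
```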
On Wed, Sep 14, 2016 at 7:44 PM Trivedi Amit wrote:
> Hi,
>
> I am testing a pyspark program that will read from a csv file and write
> data into a Cassandra table. I am using pyspark with
> spark-cassandra-connector 2.10:2.0.0-M3. I am using Spark v2.0.0.
>
> While executing the command below:
>
> df.write.format("org.apache.spark.sql.cassandra").mode('append').o[...]
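The quoted message is cut off at `.o`, and the rest of the original call is not recoverable; what follows is only a sketch of how such a write is typically completed, with `my_table` and `my_keyspace` as placeholder names that do not come from the original message:

```python
# Hedged sketch; "my_table" and "my_keyspace" are placeholders, not the
# poster's actual names. Standard DataFrameWriter chain for the connector.
df.write \
    .format("org.apache.spark.sql.cassandra") \
    .mode('append') \
    .options(table="my_table", keyspace="my_keyspace") \
    .save()
```

With a Scala 2.10 connector artifact on a Scala 2.11 Spark build, a call like this is where the scala.reflect mismatch surfaces.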