Oops, I think that should fix it. I am going to try it now.
Great catch! I feel like an idiot.
On Fri, Feb 17, 2017 at 10:02 AM, Russell Spitzer wrote:
Great catch Anastasios!
On Fri, Feb 17, 2017 at 9:59 AM Anastasios Zouzias wrote:
Hey,
Can you try with the 2.11 spark-cassandra-connector? You just reported that
you use spark-cassandra-connector*_2.10*-2.0.0-RC1.jar
Best,
Anastasios
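
For reference, a minimal sketch of how the matching 2.11 artifact could be pulled in, assuming an sbt build; the scalaVersion value and the --packages coordinate below are illustrative, not taken from the original post:

// build.sbt sketch: %% appends the Scala binary suffix, so this resolves
// spark-cassandra-connector_2.11 when building against Scala 2.11.x.
scalaVersion := "2.11.8"  // illustrative; use whatever 2.11.x your Spark build uses

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0-RC1"

// Equivalent coordinate for spark-shell / spark-submit:
//   --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-RC1
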
On Fri, Feb 17, 2017 at 6:40 PM, kant kodali wrote:
Hi,
val df = spark.read.format("org.apache.spark.sql.cassandra").options(Map(
"table" -> "hello", "keyspace" -> "test" )).load()
This line works fine. I can see it actually pulled the table schema from
Cassandra. However, when I do df.count I get the error below.
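
For context, here is a self-contained sketch of the same read plus count, assuming a SparkSession built from scratch; the connection host, app name, and object name are placeholders, not details from the original job:

import org.apache.spark.sql.SparkSession

object CassandraCountSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder host/app name; point spark.cassandra.connection.host at your cluster.
    val spark = SparkSession.builder()
      .appName("cassandra-count-sketch")
      .config("spark.cassandra.connection.host", "127.0.0.1")
      .getOrCreate()

    // Same read as above: the table schema is resolved here, before any rows are scanned.
    val df = spark.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("table" -> "hello", "keyspace" -> "test"))
      .load()

    // count() is what actually triggers the Cassandra scan, which is where the
    // reported error shows up when the connector's Scala version does not match Spark's.
    println(df.count())

    spark.stop()
  }
}
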
I am using the following jars.