Re: IllegalArgumentException while saving RDD to cassandra using Spark Cassandra Connector

2018-07-22 Thread M Singh
not valid (see below). I tried to trace the code in the connector and it appears that the row size (3 below) is different from the column count (which turns out to be 1). I am trying to follow the example from https://github.com/datastax/spark-cassandra-connector/blob/master/doc/2_loading.md with
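A minimal sketch of the arity mismatch described above (the table, keyspace, and column names here are assumptions, not from the thread): when saving an RDD of tuples with `saveToCassandra`, the tuple arity must match the number of columns passed in `SomeColumns`.

```scala
import com.datastax.spark.connector._

// Hypothetical schema (not from the thread):
//   CREATE TABLE test.customer (id bigint PRIMARY KEY, name text, email text)

val rdd = sc.parallelize(Seq((1L, "Alice", "alice@example.com")))

// Broken: 3 tuple fields but only 1 target column -> size-mismatch error
// rdd.saveToCassandra("test", "customer", SomeColumns("id"))

// Working: one named column per tuple field, in order
rdd.saveToCassandra("test", "customer", SomeColumns("id", "name", "email"))
```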

IllegalArgumentException while saving RDD to cassandra using Spark Cassandra Connector

2018-07-18 Thread M Singh
turns out to be 1). I am trying to follow the example from https://github.com/datastax/spark-cassandra-connector/blob/master/doc/2_loading.md with a customer table having two more fields than just the id mentioned in the example. In the case of the example I think it will work because it has only 1 column

Re: How to register cassandra custom codecs in spark? (https://github.com/datastax/spark-cassandra-connector/issues/1173)

2018-04-15 Thread Dinesh Joshi
Hi Revanth, I took a quick look and don't think you can override existing mappings. You should see a warning like this - 18/04/15 19:50:27 WARN CodecRegistry: Ignoring codec CustomTSTypeCodec [date <-> java.lang.Long] because it collides with previously registered codec CustomTSTypeCodec [date <

Re: How to register cassandra custom codecs in spark? (https://github.com/datastax/spark-cassandra-connector/issues/1173)

2018-04-14 Thread Dinesh Joshi
Hi Revanth, How do you register the custom codec? Do you get any errors? Have you tried using a pre-existing codec? It would be helpful if you can give more information. Dinesh On Saturday, April 14, 2018, 7:29:30 PM PDT, Revanth Reddy wrote: Hi Team, I want to write a custom cassand

How to register cassandra custom codecs in spark? (https://github.com/datastax/spark-cassandra-connector/issues/1173)

2018-04-14 Thread Revanth Reddy
Hi Team, I want to write a custom Cassandra codec and use that codec in my Spark application while reading data from a Cassandra table. Basically, custom codecs are used to convert one column type to another while reading from Cassandra. For example, I have a timestamp column in c
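As driver-side background (this is the Java driver 3.x `TypeCodec` API; how to wire a registry into the connector's session factory is precisely the open question in issue #1173, and the class/field names below are assumptions): a custom codec extends `TypeCodec` and is registered with a `CodecRegistry`.

```scala
import com.datastax.driver.core.{Cluster, CodecRegistry, DataType, ProtocolVersion, TypeCodec}
import java.nio.ByteBuffer

// Hypothetical codec mapping Cassandra timestamp <-> java.lang.Long
// (null handling omitted for brevity).
class TimestampToLongCodec
    extends TypeCodec[java.lang.Long](DataType.timestamp(), classOf[java.lang.Long]) {
  private val inner = TypeCodec.timestamp() // delegate to the built-in Date codec
  override def serialize(v: java.lang.Long, pv: ProtocolVersion): ByteBuffer =
    inner.serialize(new java.util.Date(v), pv)
  override def deserialize(b: ByteBuffer, pv: ProtocolVersion): java.lang.Long =
    inner.deserialize(b, pv).getTime
  override def parse(s: String): java.lang.Long = inner.parse(s).getTime
  override def format(v: java.lang.Long): String = inner.format(new java.util.Date(v))
}

// Registering with the driver directly:
val cluster = Cluster.builder()
  .addContactPoint("127.0.0.1")
  .withCodecRegistry(new CodecRegistry().register(new TimestampToLongCodec))
  .build()
```

Note the warning quoted earlier in the thread: the registry ignores a codec that collides with an already-registered mapping, so existing mappings cannot be overridden this way.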

RE: Spark-Cassandra connector

2016-06-22 Thread Joaquin Alzola
>Your best bet for a response will be on the spark-cassandra-connector mailing >list: >https://groups.google.com/a/lists.datastax.com/forum/#!forum/spark-connector-user Didn't know about that list. Thanks Carl.

Re: Spark-Cassandra connector

2016-06-22 Thread Carl Yeksigian
Hi Joaquin, Your best bet for a response will be on the spark-cassandra-connector mailing list: https://groups.google.com/a/lists.datastax.com/forum/#!forum/spark-connector-user Hope you find your answer. -Carl On Wed, Jun 22, 2016 at 4:58 AM, Joaquin Alzola wrote: > Hi List > >

Spark-Cassandra connector

2016-06-22 Thread Joaquin Alzola
Hi List, I am trying to install the Spark-Cassandra connector through Maven or sbt, but neither works. Both of them try to connect to the Internet (to which I have no connection) to download certain files. Is there a way to install the files manually? I downloaded from the Maven repository
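For an offline setup, one common approach (the file name and version below are placeholders, not from the thread) is to download the jar on a connected machine and install it into the local repository by hand with Maven's `install:install-file` goal:

```shell
# Install a manually downloaded connector jar into the local ~/.m2 repository.
# File, version, and artifactId are placeholders; match the artifacts you downloaded.
mvn install:install-file \
  -Dfile=spark-cassandra-connector_2.11-1.5.0-M1.jar \
  -DgroupId=com.datastax.spark \
  -DartifactId=spark-cassandra-connector_2.11 \
  -Dversion=1.5.0-M1 \
  -Dpackaging=jar
```

Subsequent builds can then run with `mvn -o` (offline mode) so Maven resolves everything from the local repository.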

Issue with protobuff and Spark cassandra connector

2015-11-16 Thread Cassa L
Hi, Has anyone used Protobuf with the spark-cassandra connector? I am using protobuff-3.0-beta with spark-1.4 and cassandra-connector-2.10. I keep getting "Unable to find proto buffer class" in my code. I checked the version of the protobuf jar and it is loaded as 3.0-beta in the classpath. Pr
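"Unable to find proto buffer class" often points to a classloading conflict with the (older) protobuf that Spark itself bundles. One workaround worth trying (a sketch; the jar name is a placeholder, but `userClassPathFirst` is a real, experimental Spark setting) is to prefer the application's jars over Spark's bundled dependencies:

```shell
# Prefer the application's classpath over Spark's bundled dependencies,
# so the app's protobuf version wins over the one Spark ships.
spark-submit \
  --conf spark.executor.userClassPathFirst=true \
  --conf spark.driver.userClassPathFirst=true \
  --jars protobuf-java-3.0.0-beta.jar \
  my-app.jar
```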

spark-cassandra-connector not present on Maven Central

2015-09-18 Thread Renato Perini
https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector-java_2.11/1.5.0-M1/spark-cassandra-connector-java_2.11-1.5.0-M1.jar https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_2.11/1.5.0-M1/spark-cassandra-connector_2.11-1.5.0-M1.jar https://repo1

Spark Cassandra Connector for Python

2015-04-09 Thread mwiewiorski
Hi, At https://github.com/datastax/spark-cassandra-connector I see that you are extending the API that Spark provides for interacting with RDDs to leverage some native Cassandra features. We are using Apache Cassandra together with PySpark to do some analytics, and since we have community version

How to autogenerate timestamp with Spark Cassandra connector

2014-11-11 Thread Shing Hing Man
Hi, I am trying to insert into the following column family using the Spark Cassandra connector. CREATE TABLE myks.mycf ( id bigint, msg text, type text, ts timestamp, PRIMARY KEY (id, msg) ) Is there a way to have the ts field automatically generated: // dataRdd is of Type RDD[(Int,String
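Cassandra does not auto-populate a regular timestamp column, so one straightforward approach (a sketch; the RDD element types are assumed from the truncated snippet) is to stamp each row in Spark before writing:

```scala
import com.datastax.spark.connector._

// Hypothetical: add the current time to each row before saving,
// since Cassandra will not generate a value for a regular column.
// dataRdd assumed to be RDD[(Long, String, String)] of (id, msg, type)
val stamped = dataRdd.map { case (id, msg, tpe) =>
  (id, msg, tpe, new java.util.Date())
}
stamped.saveToCassandra("myks", "mycf", SomeColumns("id", "msg", "type", "ts"))
```

Note this records the driver/executor clock at map time, which is distinct from Cassandra's server-side write time (readable via `WRITETIME(...)` in CQL).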