Source code check for the Java version:
https://github.com/datastax/spark-cassandra-connector/blob/master/spark-cassandra-connector-java/src/main/java/com/datastax/spark/connector/RDDJavaFunctions.java#L26

It uses the RDDFunctions from the Scala code, so yes, it's the Java driver again.
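For reference, both APIs end up on the same driver-backed read/write path. A minimal Scala sketch of typical connector usage (the keyspace, table, and host here are hypothetical, and this assumes the connector 1.x-era API):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._ // brings in cassandraTable / saveToCassandra

object ConnectorSketch {
  def main(args: Array[String]): Unit = {
    // The connection.host setting points the connector's underlying
    // Java driver at the cluster -- no Thrift involved.
    val conf = new SparkConf()
      .setAppName("connector-sketch")
      .set("spark.cassandra.connection.host", "127.0.0.1") // hypothetical host
    val sc = new SparkContext(conf)

    // Read: cassandraTable is added by an implicit conversion on SparkContext.
    val rows = sc.cassandraTable("test_ks", "words") // hypothetical keyspace/table

    // Write: saveToCassandra is added by the implicit RDDFunctions --
    // the same Scala class that the Java RDDJavaFunctions delegates to.
    sc.parallelize(Seq(("foo", 1), ("bar", 2)))
      .saveToCassandra("test_ks", "words", SomeColumns("word", "count"))

    sc.stop()
  }
}
```

Since the Java wrapper just forwards to this Scala code, the Java API inherits the same Java-driver transport.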


On Wed, Sep 10, 2014 at 8:09 PM, DuyHai Doan <doanduy...@gmail.com> wrote:

> "As far as I know, the Datastax connector uses Thrift to connect Spark
> with Cassandra, although Thrift is already deprecated; could someone confirm
> this point?"
>
> --> the Scala connector uses the latest Java driver, so no, there is no
> Thrift there.
>
>  For the Java version I'm not sure; I haven't looked into it, but I think
> it also uses the new Java driver.
>
>
> On Wed, Sep 10, 2014 at 7:27 PM, Francisco Madrid-Salvador <
> pmad...@stratio.com> wrote:
>
>> Hi Oleg,
>>
>> Stratio Deep is just a library you must include in your Spark deployment
>> so it doesn't guarantee any high availability at all. To achieve HA you
>> must use Mesos or any other 3rd party resource manager.
>>
>> Stratio Deep doesn't currently support PySpark, just Scala and Java.
>> Perhaps in the future...
>>
>> It should be ready for production use, but as always, please test first
>> in a testing environment ;-)
>>
>> As far as I know, the Datastax connector uses Thrift to connect Spark
>> with Cassandra, although Thrift is already deprecated; could someone confirm
>> this point?
>>
>> Paco
>>
>
>
