You can use https://github.com/datastax/spark-cassandra-connector to integrate
Cassandra with Spark SQL.
The docs are still in progress, but for now see:
https://github.com/datastax/spark-cassandra-connector/blob/master/spark-cassandra-connector/src/main/scala/org/apache/spark/sql/cassandra/CassandraSQLContext.sca
I believe DataStax is working on tighter integration here, but until that is
ready you can use the applySchema API. Basically, you convert the
CassandraTable into an RDD of Row objects using a .map(), and then
call applySchema (provided by SQLContext) to get a SchemaRDD.
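A rough sketch of what that looks like, using the Spark 1.1-era API (SchemaRDD, applySchema). The keyspace/table name `test.users` and its columns are hypothetical; the SparkContext is assumed to already be configured with your Cassandra connection settings:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql._                // Row, StructType, StructField, etc. (Spark 1.1 aliases)
import com.datastax.spark.connector._        // adds sc.cassandraTable

val sc: SparkContext = ???                   // assumed: configured with spark.cassandra.connection.host
val sqlContext = new SQLContext(sc)

// 1. Read the Cassandra table as an RDD of CassandraRow
//    (hypothetical table test.users with columns: name text, age int)
val cassandraRdd = sc.cassandraTable("test", "users")

// 2. Convert each CassandraRow into a Spark SQL Row with .map()
val rowRdd = cassandraRdd.map(r => Row(r.getString("name"), r.getInt("age")))

// 3. Describe the schema and call applySchema to get a SchemaRDD
val schema = StructType(Seq(
  StructField("name", StringType, nullable = true),
  StructField("age", IntegerType, nullable = true)))
val schemaRdd = sqlContext.applySchema(rowRdd, schema)

// 4. Register it and query with Spark SQL
schemaRdd.registerTempTable("users")
sqlContext.sql("SELECT name FROM users WHERE age > 30").collect()
```

Note the column names and types in the .map() and the StructType have to line up by position, since Row is just an ordered tuple of values.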