Mohammed,

This doesn't really answer your question, but I'm working on a new REST
server that allows people to submit SQL queries over REST, which get
executed via Spark SQL. It's based on what I started here:
http://brianoneill.blogspot.com/2015/05/spark-sql-against-cassandra-example.html
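
For anyone following along, the core of that approach is simply Spark SQL
running over Cassandra tables via the spark-cassandra-connector. A rough
sketch (Spark 1.x-era API; the contact point, keyspace, and table names are
placeholders, not anything from the post):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.cassandra.CassandraSQLContext

object SparkSqlOverCassandra {
  def main(args: Array[String]): Unit = {
    // Point the connector at a Cassandra contact point (placeholder host)
    val conf = new SparkConf()
      .setAppName("spark-sql-over-cassandra")
      .setMaster("local[*]")
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext(conf)

    // CassandraSQLContext exposes Cassandra tables to Spark SQL as keyspace.table
    val csc = new CassandraSQLContext(sc)
    val df = csc.sql("SELECT * FROM my_keyspace.my_table LIMIT 10")
    df.collect().foreach(println)

    sc.stop()
  }
}

The REST server then just takes the SQL string off the HTTP request and hands
it to that context.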

I assume you need JDBC connectivity specifically?

-brian

---
Brian O'Neill 
Chief Technology Officer
Health Market Science, a LexisNexis Company
215.588.6024 Mobile • @boneill42 <http://www.twitter.com/boneill42>




From:  Mohammed Guller <moham...@glassbeam.com>
Reply-To:  <user@cassandra.apache.org>
Date:  Thursday, May 28, 2015 at 8:26 PM
To:  "user@cassandra.apache.org" <user@cassandra.apache.org>
Subject:  RE: Spark SQL JDBC Server + DSE

Anybody out there using DSE + Spark SQL JDBC server?
 

Mohammed
 

From: Mohammed Guller [mailto:moham...@glassbeam.com]
Sent: Tuesday, May 26, 2015 6:17 PM
To: user@cassandra.apache.org
Subject: Spark SQL JDBC Server + DSE
 
Hi –
As I understand it, the Spark SQL Thrift/JDBC server cannot be used with
open-source C*; only DSE supports the Spark SQL JDBC server.
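
(For concreteness, by "JDBC server" I mean the HiveServer2-compatible Thrift
endpoint that reporting tools and custom clients connect to roughly like the
sketch below; the host, port, and table name are placeholders.)

import java.sql.DriverManager

object SparkSqlJdbcClient {
  def main(args: Array[String]): Unit = {
    // The Spark SQL Thrift server speaks the HiveServer2 protocol,
    // so the standard Hive JDBC driver works against it
    Class.forName("org.apache.hive.jdbc.HiveDriver")
    val conn = DriverManager.getConnection(
      "jdbc:hive2://localhost:10000/default", "", "")
    try {
      val stmt = conn.createStatement()
      val rs = stmt.executeQuery("SELECT count(*) FROM my_table")
      while (rs.next()) println(rs.getLong(1))
    } finally {
      conn.close()
    }
  }
}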
 
We would like to find out how many organizations are using this combination.
If you do use DSE + the Spark SQL JDBC server, it would be great if you could
share your experience. For example, what kind of issues have you run into?
How is the performance? What reporting tools are you using?
 
Thank you!
 
Mohammed 
 

