Hi Jerome,
This is cool. It would be great if you could share more details about how you 
finally got your setup to work. For example, what additional libraries/jars are 
you using, and how are you configuring the Thrift server to use those jars to 
communicate with Cassandra?
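
For instance, are you launching the Thrift server with the Spark Cassandra 
Connector assembly passed via --jars, roughly like the sketch below? (The jar 
path/version and the Cassandra seed address are just placeholders on my part.)

    ./sbin/start-thriftserver.sh --master spark://<master-host>:7077 \
      --jars /path/to/spark-cassandra-connector-assembly-1.1.0.jar \
      --conf spark.cassandra.connection.host=<cassandra-seed-node> \
      --hiveconf hive.server2.thrift.bind.host=0.0.0.0 \
      --hiveconf hive.server2.thrift.port=10000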

In addition, how are you mapping Hive tables to Cassandra column families (CFs) in beeline? 
It would be great if you could share an example beeline session right from the 
beginning.
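
For example, something along the lines of the sketch below, starting from the 
connect step, plus whatever DDL (if any) you run to register the Cassandra 
tables. (The JDBC URL and table name here are only placeholders.)

    $ ./bin/beeline
    beeline> !connect jdbc:hive2://localhost:10000
    0: jdbc:hive2://localhost:10000> SHOW TABLES;
    0: jdbc:hive2://localhost:10000> SELECT * FROM my_cassandra_table LIMIT 10;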

Thanks.
Mohammed

From: Ashic Mahtab [mailto:as...@live.com]
Sent: Thursday, November 20, 2014 10:15 AM
To: jererc; u...@spark.incubator.apache.org
Subject: RE: tableau spark sql cassandra

Hi Jerome,
I've been trying to get this working as well...

Where are you specifying the Cassandra parameters (e.g. seed nodes, consistency 
levels, etc.)?

-Ashic.
> Date: Thu, 20 Nov 2014 10:34:58 -0700
> From: jer...@gmail.com<mailto:jer...@gmail.com>
> To: u...@spark.incubator.apache.org<mailto:u...@spark.incubator.apache.org>
> Subject: Re: tableau spark sql cassandra
>
> Well, after many attempts I can now successfully run the thrift server using
> root@cdb-01:~/spark# ./sbin/start-thriftserver.sh --master
> spark://10.194.30.2:7077 --hiveconf hive.server2.thrift.bind.host 0.0.0.0
> --hiveconf hive.server2.thrift.port 10000
>
> (the command had been failing because of the --driver-class-path $CLASSPATH
> parameter, which I guess was setting spark.driver.extraClassPath), and I can
> now get the Cassandra data using beeline!
>
> However, the table's values show up as null in Tableau, but that's another
> problem ;)
>
> Best,
> Jerome
