See near the bottom of this post:
http://tobert.github.io/post/2014-07-15-installing-cassandra-spark-stack.html
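
If it helps, the gist of the steps quoted below is just getting those jars onto the
shell's classpath before it starts. A minimal sketch, assuming a Spark 1.x spark-shell
and the same placeholder jar paths as in the quoted list (on older builds you can
export SPARK_CLASSPATH instead):

  bin/spark-shell --driver-class-path \
    xxx/cassandra-all-2.0.6.jar:xxx/cassandra-thrift-2.0.6.jar:xxx/libthrift-0.9.1.jar:xxx/cassandra-driver-spark_2.10-1.0.0-SNAPSHOT.jar:xxx/cassandra-java-driver-2.0.2/cassandra-driver-core-2.0.2.jar:xxx/cassandra-java-driver-2.0.2/cassandra-driver-dse-2.0.2.jar

After that, the import and sc.cassandraTable calls quoted below should resolve in the REPL.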



On Fri, Aug 8, 2014 at 2:00 AM, chutium <teng....@gmail.com> wrote:

> try adding the following jars to the classpath:
>
>
> xxx/cassandra-all-2.0.6.jar:xxx/cassandra-thrift-2.0.6.jar:xxx/libthrift-0.9.1.jar:xxx/cassandra-driver-spark_2.10-1.0.0-SNAPSHOT.jar:xxx/cassandra-java-driver-2.0.2/cassandra-driver-core-2.0.2.jar:xxx/cassandra-java-driver-2.0.2/cassandra-driver-dse-2.0.2.jar
>
> then, in spark-shell:
>
> import org.apache.spark.{SparkConf, SparkContext}
> val conf = new SparkConf(true).set("cassandra.connection.host", "your-cassandra-host")
> val sc = new SparkContext("local[1]", "cassandra-driver", conf)
> import com.datastax.driver.spark._
> sc.cassandraTable("db1", "table1").select("key").count


-- 
Thomas Nieborowski
510-207-7049 mobile
510-339-1716 home
