We use vanilla Spark right now. Here is an Ansible script to install Spark: https://github.com/seregasheypak/ansible-vagrant-dse-spark/blob/master/roles/spark_configuration/tasks/main.yml
and Cassandra DSE: https://github.com/seregasheypak/ansible-vagrant-dse-spark/blob/master/roles/dse-install/tasks/main.yml

2015-04-22 20:11 GMT+02:00 Sebastian Estevez <sebastian.este...@datastax.com>:

> Oh sorry, Jay mentioned DSE and you said you already use it.
>
> I think the answer to your question is Brian's response.
>
> These are the DSE docs if you want to read about it:
>
> http://docs.datastax.com/en/datastax_enterprise/4.6/datastax_enterprise/spark/sparkTOC.html
>
> On Apr 22, 2015 2:05 PM, "Serega Sheypak" <serega.shey...@gmail.com> wrote:
>
>> What is "embedded" Spark? Where can I read about it?
>> Right now we just install Spark 1.2 built for Hadoop 2.4 and use it to
>> query data from Cassandra.
>>
>> 2015-04-22 19:56 GMT+02:00 Sebastian Estevez <sebastian.este...@datastax.com>:
>>
>>> There is no supported way to replace the embedded Spark that comes in
>>> DSE with something else.
>>>
>>> However, you could probably read or write from/to DSE / Cassandra from a
>>> Cloudera Spark cluster using the open source DataStax connector. Are you
>>> looking for a particular feature that is not available in Spark 1.1?
>>>
>>> On Apr 22, 2015 1:50 PM, "Serega Sheypak" <serega.shey...@gmail.com> wrote:
>>>
>>>> We already use it. We would like to use Spark from the Cloudera distribution.
>>>> Should that work?
>>>>
>>>> 2015-04-22 19:43 GMT+02:00 Jay Ken <jaytechg...@gmail.com>:
>>>>
>>>>> There is an Enterprise Edition from DataStax where they have Spark and
>>>>> Cassandra integration:
>>>>>
>>>>> http://www.datastax.com/what-we-offer/products-services/datastax-enterprise
>>>>>
>>>>> Thanks,
>>>>> Jay
>>>>>
>>>>> On Wed, Apr 22, 2015 at 6:41 AM, Serega Sheypak <serega.shey...@gmail.com> wrote:
>>>>>
>>>>>> Hi, are Cassandra and Spark from Cloudera compatible?
>>>>>> Where can I find these compatibility notes?
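
For what it's worth, Sebastian's suggestion of using the open source DataStax connector from a non-DSE Spark cluster looks roughly like the sketch below. This is only a minimal Scala example under some assumptions: Spark 1.2 with a matching spark-cassandra-connector_2.10 artifact on the classpath, and the contact host, keyspace name ("my_keyspace") and table name ("my_table") are placeholders, not anything from this thread:

    // Minimal sketch: read a Cassandra table from a standalone (non-DSE) Spark job
    // using the open source spark-cassandra-connector.
    import org.apache.spark.{SparkConf, SparkContext}
    import com.datastax.spark.connector._   // adds cassandraTable() / saveToCassandra()

    object CassandraReadExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("cassandra-read-example")
          // Point the connector at one Cassandra contact node; the host is a placeholder.
          .set("spark.cassandra.connection.host", "10.0.0.1")

        val sc = new SparkContext(conf)

        // Keyspace and table names here are hypothetical, for illustration only.
        val rows = sc.cassandraTable("my_keyspace", "my_table")
        println(s"row count: ${rows.count()}")

        sc.stop()
      }
    }

The rough workflow would be to submit this with spark-submit against the Cloudera cluster, shipping the connector jar with the job. Whether a given Cloudera Spark release lines up with a connector release is exactly the compatibility question raised at the start of the thread, so I'd check the connector's version compatibility table before committing to it.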