Hi,
Are there any cluster-specific prerequisites for running Spark on Cassandra?

I created two DCs, DC1 and DC2. DC1 has two Cassandra nodes with vnodes.
I created two nodes in DC2 with the Murmur3 partitioner and set num_tokens: 1.
I enabled Hadoop and Spark and started DSE.
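
In case it helps, this is roughly what I set (the paths assume a package install of DSE; please correct me if something else is expected):

    # cassandra.yaml on the two DC2 nodes
    partitioner: org.apache.cassandra.dht.Murmur3Partitioner
    num_tokens: 1

    # /etc/default/dse on the DC2 nodes, before starting DSE
    HADOOP_ENABLED=1
    SPARK_ENABLED=1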

I can verify that Hadoop started because the JobTracker page shows up, but
I don't know how to verify Spark.
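Would something like this be the right way to check it? (The 'dse spark' REPL and port 7080 for the Spark Master web UI are my assumptions from the defaults.)

    # on one of the DC2 nodes
    dse spark                      # should drop into a Spark shell if the master is up
    # Spark Master web UI: http://<dc2-node>:7080/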
I also see a lot of this in the logs:

Activating plugin: com.datastax.bdp.plugin.ExternalProcessAuthPlugin
 INFO [main] 2014-08-08 17:31:14,492 PluginManager.java (line 232) No enough available nodes to start plugin com.datastax.bdp.plugin.ExternalProcessAuthPlugin. Trying once again...


When I run the command 'dse shark', it just hangs.

Am I doing something wrong?

Thanks
