RIAK TS installed nodes not connecting

2016-09-13 Thread Agtmaal, Joris van
Hi, I'm new to Riak and followed the installation instructions to get it working on an AWS cluster (3 nodes). So far I've been able to use Riak from PySpark (Zeppelin) to create/read/write tables, but I would like to use DataFrames directly from Spark, using the Spark-Riak Connector. When following…
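
For reference, reading a Riak TS table straight into a Spark DataFrame goes through the connector's data source. A minimal PySpark sketch, assuming Spark 1.6-era contexts and that the connector jar is already on the classpath; the host, port, and table name below are placeholders rather than values from this thread:

    from pyspark import SparkConf, SparkContext
    from pyspark.sql import SQLContext

    # Placeholder host:pb_port; point this at one of the cluster nodes.
    conf = (SparkConf()
            .setAppName("riak-ts-dataframe-example")
            .set("spark.riak.connection.host", "10.0.0.1:8087"))
    sc = SparkContext(conf=conf)
    sqlContext = SQLContext(sc)

    # Read an existing TS table through the connector's DataFrame data source.
    df = (sqlContext.read
          .format("org.apache.spark.sql.riak")
          .load("weather_station_data"))   # example table name
    df.printSchema()
    df.show(10)

    # Writing back to an existing TS table uses the same format.
    (df.write
       .format("org.apache.spark.sql.riak")
       .mode("append")
       .save("weather_station_data"))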

Re: Riak 2.1.3 - Multiple indexes created by Solr for the same Riak object

2016-09-13 Thread Magnus Kessler
On 11 September 2016 at 02:27, Weixi Yen wrote: > Sort of a unique case, my app was under heavy stress and one of my riak nodes got backed up (other 4 nodes were fine). I think this caused Riak.update to create an extra index in Solr for the same object when users began running .update on…
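
One way to confirm whether Solr really holds more than one document for the same Riak object is to query the search index and group the results by the _yz_rk (Riak key) field. A hedged sketch using the official Python client; the index name, query, and row limit are placeholders, not details from this thread:

    from collections import Counter
    import riak

    client = riak.RiakClient(pb_port=8087, protocol='pbc')

    # Pull a batch of documents from the search index and count docs per Riak key.
    results = client.fulltext_search('my_index', '*:*', rows=10000)
    keys = [doc['_yz_rk'] for doc in results['docs']]
    dupes = {k: n for k, n in Counter(keys).items() if n > 1}
    print("Riak keys with more than one Solr document:", dupes)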

Re: RIAK TS installed nodes not connecting

2016-09-13 Thread Stephen Etheridge
Hi Joris, I have looked at the tutorial you have been following, but I confess I am confused. In the example you are following I do not see where the Spark and SQL contexts are created. I use PySpark through the Jupyter notebook and I have to specify a path to the connector when invoking the Jupyter…
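
For anyone hitting the same point: in a notebook the connector jar has to be on the classpath before the SparkContext and SQLContext are created. A minimal sketch, assuming a locally downloaded connector jar (the path is only an example):

    import os

    # Must run before the first pyspark import so the JVM picks up the jar.
    os.environ["PYSPARK_SUBMIT_ARGS"] = (
        "--jars /path/to/spark-riak-connector_2.10-1.6.0-uber.jar pyspark-shell"
    )

    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext(appName="riak-ts-jupyter")
    sqlContext = SQLContext(sc)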

Re: RIAK TS installed nodes not connecting

2016-09-13 Thread Alex Moore
Joris, One thing to check: since you are using a downloaded jar, are you using the uber jar that contains all the dependencies? http://search.maven.org/remotecontent?filepath=com/basho/riak/spark-riak-connector_2.10/1.6.0/spark-riak-connector_2.10-1.6.0-uber.jar Thanks, Alex
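
If hand-managing the jar becomes awkward, the Maven coordinate the uber jar is published under can be handed to Spark's resolver instead, letting it fetch the connector and its dependencies automatically. A sketch of that variant, again set before the first pyspark import:

    import os

    # Resolve the connector from Maven Central instead of pointing at a local jar.
    os.environ["PYSPARK_SUBMIT_ARGS"] = (
        "--packages com.basho.riak:spark-riak-connector_2.10:1.6.0 pyspark-shell"
    )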