Storing spark processed output to Database asynchronously.

2015-05-20 Thread Gautam Bajaj
Hi, From my understanding of Spark Streaming, I created a Spark entry point, for continuous UDP data, using: SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount"); JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(1)); JavaReceiverInputDStr
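
The snippet above is cut off at the input stream declaration. Below is a minimal sketch of how such an entry point is typically completed, assuming a hypothetical custom Receiver<String> named UdpReceiver for the UDP source (not part of the original message):

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class NetworkWordCount {
        public static void main(String[] args) throws Exception {
            SparkConf conf = new SparkConf()
                    .setMaster("local[2]")
                    .setAppName("NetworkWordCount");
            // Batch interval taken from the snippet above; note that new Duration(1)
            // means 1 millisecond, which is unusually small for a batch interval.
            JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(1));

            // The truncated line most likely declares a receiver-based input stream.
            // UdpReceiver is a hypothetical Receiver<String> listening on UDP port 8060.
            JavaReceiverInputDStream<String> lines = jssc.receiverStream(new UdpReceiver(8060));

            lines.print();   // placeholder action; the real job would process the stream here

            jssc.start();
            jssc.awaitTermination();
        }
    }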

Re: Storing spark processed output to Database asynchronously.

2015-05-21 Thread Gautam Bajaj
o help either. The "work" is just going to keep piling up as many > many async jobs even though your batch processing times will be low as that > processing time is not going to reflect how much of overall work is pending > in the system. > > On Wed, May 20, 2015 at 10:28 PM,
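
The warning here is that firing off asynchronous writes keeps the reported batch time low while unfinished work accumulates. A rough sketch of the synchronous alternative it points toward, using the Spark 1.x Java function interfaces (org.apache.spark.api.java.function.Function and VoidFunction) and a hypothetical DatabaseClient, continuing from the lines stream sketched above:

    lines.foreachRDD(new Function<JavaRDD<String>, Void>() {
        @Override
        public Void call(JavaRDD<String> rdd) {
            rdd.foreachPartition(new VoidFunction<Iterator<String>>() {
                @Override
                public void call(Iterator<String> records) {
                    // One connection per partition, opened on the worker.
                    DatabaseClient client = DatabaseClient.connect();   // hypothetical client
                    while (records.hasNext()) {
                        client.write(records.next());   // blocks until the record is persisted
                    }
                    client.close();
                }
            });
            return null;
        }
    });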

Re: Storing spark processed output to Database asynchronously.

2015-05-22 Thread Gautam Bajaj
This is just a friendly ping, just to remind you of my query. Also, is there a possible explanation/example of the usage of AsyncRDDActions in Java? On Thu, May 21, 2015 at 7:18 PM, Gautam Bajaj wrote: > I am receiving data at UDP port 8060 and doing processing on it using Spark > and s
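
On the AsyncRDDActions question: in the Java API the asynchronous actions are exposed directly on JavaRDD as countAsync, collectAsync, foreachAsync and foreachPartitionAsync, each returning a JavaFutureAction. A short sketch, again with a hypothetical DatabaseClient and a Java 8 lambda, that launches the write asynchronously but keeps the future so the pending work can be awaited or monitored rather than fired and forgotten:

    // Inside foreachRDD(...): start the write without blocking the batch.
    JavaFutureAction<Void> future = rdd.foreachPartitionAsync(records -> {
        DatabaseClient client = DatabaseClient.connect();   // hypothetical client
        while (records.hasNext()) {
            client.write(records.next());
        }
        client.close();
    });
    // e.g. future.get() to wait for completion, or future.isDone() / future.cancel(true).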

Re: Using Neo4j with Apache Spark

2015-03-12 Thread Gautam Bajaj
Alright, I have also asked this question in StackOverflow: http://stackoverflow.com/questions/28896898/using-neo4j-with-apache-spark The code there is pretty neat. On Thu, Mar 12, 2015 at 4:55 PM, Tathagata Das wrote: > I am not sure if you realized but the code snippet is pretty mangled up in

Re: Using Neo4j with Apache Spark

2015-03-12 Thread Gautam Bajaj
On Thu, Mar 12, 2015 at 12:58 AM, Gautam Bajaj > wrote: > >> Alright, I have also asked this question in StackOverflow: >> http://stackoverflow.com/questions/28896898/using-neo4j-with-apache-spark >> >> The code there is pretty neat. >> >> On Thu, Mar 12, 2015

Re: Using Neo4j with Apache Spark

2015-03-12 Thread Gautam Bajaj
GraphDatabaseService > objects > // Use it to send the whole partition to Neo4j > // Destroy the object or release it to the pool > }) > > > On Thu, Mar 12, 2015 at 1:15 AM, Gautam Bajaj > wrote: > >> Neo4j is running externally. It has nothing to do
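
The pattern quoted here is foreachPartition with a reusable pool of GraphDatabaseService handles, so the non-serializable Neo4j object is created or borrowed on the worker and never captured in the closure. A sketch of that shape, where Neo4jPool and writeRecord are hypothetical stand-ins for the actual pooling and write logic:

    // Inside foreachRDD(...): one pooled Neo4j handle per partition.
    rdd.foreachPartition(records -> {
        GraphDatabaseService db = Neo4jPool.borrow();   // obtained on the worker, never serialized
        try {
            while (records.hasNext()) {
                writeRecord(db, records.next());        // use it to send the whole partition to Neo4j
            }
        } finally {
            Neo4jPool.release(db);                      // release the object back to the pool
        }
    });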

Re: Using Neo4j with Apache Spark

2015-03-12 Thread Gautam Bajaj
object and therefore not serialize > anything. > > On Thu, Mar 12, 2015 at 3:46 AM, Gautam Bajaj > wrote: > >> Here: https://gist.github.com/d34th4ck3r/0c99d1e9fa288e0cc8ab >> >> I'll add the flag and send you stack trace, I have meetings now. >> >> On

Re: Using Neo4j with Apache Spark

2015-03-13 Thread Gautam Bajaj
: > Well, that's why I had also suggested using a pool of the GraphDBService > objects :) > Also present in the programming guide link I had given. > > TD > > > On Thu, Mar 12, 2015 at 7:38 PM, Gautam Bajaj > wrote: > >> Thanks a ton! That worked. >> >

Error connecting to localhost:8060: java.net.ConnectException: Connection refused

2015-01-15 Thread Gautam Bajaj
Hi, I'm new to Apache Storm. I'm receiving data at my UDP port 8060, and I want to capture it and perform some operations in real time, for which I'm using Spark Streaming. While the code seems to be correct, I get the following output: https://gist.github.com/d34th4ck3r/0e88896eac864d6d7193 I'm
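
One common cause of this particular error is that the built-in socketTextStream receiver opens a TCP connection, so pointing it at a port where only UDP datagrams arrive fails with ConnectException: Connection refused. If that is what is happening here, one option is a custom receiver that reads from a DatagramSocket; below is a sketch of such a receiver (the hypothetical UdpReceiver referenced in the first message's example above):

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import org.apache.spark.storage.StorageLevel;
    import org.apache.spark.streaming.receiver.Receiver;

    public class UdpReceiver extends Receiver<String> {
        private final int port;

        public UdpReceiver(int port) {
            super(StorageLevel.MEMORY_ONLY());
            this.port = port;
        }

        @Override
        public void onStart() {
            // Receive on a separate thread so onStart() returns quickly.
            new Thread(this::receive, "udp-receiver-" + port).start();
        }

        @Override
        public void onStop() {
            // The receive loop checks isStopped() and exits on its own.
        }

        private void receive() {
            try (DatagramSocket socket = new DatagramSocket(port)) {
                byte[] buffer = new byte[65507];   // maximum UDP payload size
                while (!isStopped()) {
                    DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                    socket.receive(packet);        // blocks until a datagram arrives
                    store(new String(packet.getData(), 0, packet.getLength(), "UTF-8"));
                }
            } catch (Exception e) {
                restart("Error receiving UDP data on port " + port, e);
            }
        }
    }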