Re: running Spark Streaming just once and stop it

2014-06-12 Thread Ravi Hemnani
Hey, I did sparkcontext.addstreaminglistener(streaminglistener object) in my code and I am able to see some stats in the logs, but I can't see anything in the web UI. How do I add the Streaming tab to the web UI? I need queueing delays and related information.
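
For reference, a minimal sketch (not from the original thread) of pulling queueing delay out of a StreamingListener, assuming a StreamingContext named ssc; it logs delays rather than adding anything to the web UI, and the listener/variable names are illustrative:

    import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

    // Logs how long each batch waited in the queue before processing started.
    val delayListener = new StreamingListener {
      override def onBatchCompleted(batch: StreamingListenerBatchCompleted): Unit = {
        val info = batch.batchInfo
        println(s"batch ${info.batchTime}: " +
          s"schedulingDelay=${info.schedulingDelay.getOrElse(-1L)} ms, " +
          s"processingDelay=${info.processingDelay.getOrElse(-1L)} ms")
      }
    }
    ssc.addStreamingListener(delayListener)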

Spark workers keep getting disconnected(Keep dying) from the cluster.

2014-05-16 Thread Ravi Hemnani
Hey, I am facing a weird issue. My Spark workers keep dying every now and then, and in the master logs I keep seeing messages like the following: 14/05/14 10:09:24 WARN Master: Removing worker-20140514080546-x.x.x.x-50737 because we got no heartbeat in 60 seconds 14/05/14 14:18:41 WARN Master: Removi
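
For context, the 60-second window in those log lines is the standalone master's worker timeout (spark.worker.timeout). A hedged sketch of raising it on the master via conf/spark-env.sh; the value is illustrative and does not address whatever is causing the missed heartbeats:

    # conf/spark-env.sh on the master (illustrative value)
    # Give workers up to 120s of missed heartbeats before the master removes them.
    export SPARK_MASTER_OPTS="-Dspark.worker.timeout=120"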

RE: JMX with Spark

2014-04-25 Thread Ravi Hemnani
Can you share your working metrics.properties? I want remote JMX enabled, so I need to use the JMXSink to monitor my Spark master and workers. But what parameters need to be defined, such as host and port? Your config would help.
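
For reference, a minimal metrics.properties sketch enabling the JMX sink; the remote host/port are not set in metrics.properties but on the JVM, and the port and security flags below are illustrative:

    # conf/metrics.properties
    # Report metrics from all instances (master, worker, driver, executor) over JMX.
    *.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink

    # Remote JMX access is a JVM setting, e.g. in conf/spark-env.sh for the daemons
    # (port and security settings are illustrative):
    # export SPARK_DAEMON_JAVA_OPTS="-Dcom.sun.management.jmxremote \
    #   -Dcom.sun.management.jmxremote.port=9999 \
    #   -Dcom.sun.management.jmxremote.authenticate=false \
    #   -Dcom.sun.management.jmxremote.ssl=false"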

Re: How to use FlumeInputDStream in spark cluster?

2014-03-21 Thread Ravi Hemnani
I'll start with the Kafka implementation. Thanks for all the help. On Mar 21, 2014 7:00 PM, "anoldbrain [via Apache Spark User List]" <ml-node+s1001560n2994...@n3.nabble.com> wrote: > It is my understanding that there is no way to make FlumeInputDStream work > in a cluster environment with the curre
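
For reference, a minimal sketch of the Kafka route mentioned above, assuming a StreamingContext named ssc; the ZooKeeper quorum, consumer group and topic name are placeholders:

    import org.apache.spark.streaming.kafka.KafkaUtils

    val zkQuorum = "zk-host:2181"          // placeholder ZooKeeper quorum
    val group    = "spark-consumer"        // placeholder consumer group
    val topics   = Map("events" -> 1)      // topic -> number of receiver threads

    // Receiver-based Kafka stream; each record is a (key, message) pair.
    val lines = KafkaUtils.createStream(ssc, zkQuorum, group, topics).map(_._2)
    lines.count().print()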

Re: How to use FlumeInputDStream in spark cluster?

2014-03-21 Thread Ravi Hemnani
On 03/21/2014 06:17 PM, anoldbrain [via Apache Spark User List] wrote: > the actual , which in turn causes the 'Fail to bind to ...' > error. This comes naturally because the slave that is running the code > to bind to : has a different ip. I ran sudo ./run-example org.apache.spark.streaming.exa

Re: How to use FlumeInputDStream in spark cluster?

2014-03-21 Thread Ravi Hemnani
On 03/21/2014 06:17 PM, anoldbrain [via Apache Spark User List] wrote: > the actual , which in turn causes the 'Fail to bind to ...' > error. This comes naturally because the slave that is running the code > to bind to : has a different ip. So if we run the code on the slave where we are sending
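
For what it's worth, a minimal sketch of the Flume receiver setup being discussed, assuming a StreamingContext named ssc; "worker-host" and 7781 are placeholders, and the point of the thread is that this address must be the worker the receiver actually runs on, which is also where the Flume Avro sink has to send:

    import org.apache.spark.streaming.flume.FlumeUtils

    // The receiver binds on the worker it is scheduled on, so this host/port must
    // match the address the Flume Avro sink points at.
    val flumeStream = FlumeUtils.createStream(ssc, "worker-host", 7781)
    flumeStream.count().map(c => s"Received $c flume events.").print()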

Re: How to use FlumeInputDStream in spark cluster?

2014-03-21 Thread Ravi Hemnani
Hey, even I am getting the same error. I am running sudo ./run-example org.apache.spark.streaming.examples.FlumeEventCount spark://:7077 7781 and getting no events in Spark Streaming. --- Time: 1395395676000 ms -

Using flume to create stream for spark streaming.

2014-03-10 Thread Ravi Hemnani
Hey, I am using the following Flume flow: Flume agent 1, consisting of RabbitMQ -> source, file -> channel, Avro -> sink, sending data to a slave node of the Spark cluster. Flume agent 2, on a slave node of the Spark cluster, consisting of Avro -> source, file -> channel; now for the sink I tried Avro, HDFS, file
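
For reference, a hedged sketch of what agent 2's configuration might look like in Flume properties syntax; the agent, channel and sink names, hosts and ports are all illustrative:

    # Agent 2 on the Spark slave: avro source -> file channel -> avro sink
    agent2.sources  = avroIn
    agent2.channels = fileCh
    agent2.sinks    = toSpark

    agent2.sources.avroIn.type = avro
    agent2.sources.avroIn.bind = 0.0.0.0
    agent2.sources.avroIn.port = 4545
    agent2.sources.avroIn.channels = fileCh

    agent2.channels.fileCh.type = file

    # The sink must point at the host/port where the Spark Flume receiver binds.
    agent2.sinks.toSpark.type = avro
    agent2.sinks.toSpark.hostname = spark-worker-host
    agent2.sinks.toSpark.port = 7781
    agent2.sinks.toSpark.channel = fileCh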