Hi,

I have a Spark Streaming driver job in which I register a
streaming listener, like so:

JavaStreamingContext jssc = new JavaStreamingContext(sparkConf,
   Durations.milliseconds(params.getBatchDurationMillis()));
jssc.addStreamingListener(new JobListener(jssc));

where JobListener is defined like so:

private static class JobListener implements StreamingListener {

    private JavaStreamingContext jssc;

    JobListener(JavaStreamingContext jssc) {
        this.jssc = jssc;
    }

    @Override
    public void onBatchCompleted(StreamingListenerBatchCompleted batchCompleted) {
        System.out.println(">> Batch completed.");
        jssc.stop(true);
        System.out.println(">> The job has been stopped.");
    }
    // ...
}

I never see onBatchCompleted being triggered.  Am I doing
something wrong?

In this particular case, I'm trying to implement bulk-ingest-type
logic where the first batch is all we're interested in (reading from a
Kafka topic with offset reset set to "smallest").
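In case it helps, here is the shape of what I'm after, reduced to plain Java so it compiles standalone. StreamingContextLike and FirstBatchStopper below are hypothetical stand-ins for JavaStreamingContext and my listener, not Spark APIs; the one difference from my real code is that stop() is handed off to a separate thread, in case calling stop() synchronously from the listener callback's own thread is part of the problem:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class StopAfterFirstBatch {

    // Hypothetical stand-in for JavaStreamingContext, so the sketch is self-contained.
    interface StreamingContextLike {
        void stop(boolean stopSparkContext);
    }

    static class FirstBatchStopper {
        private final StreamingContextLike jssc;
        private final CountDownLatch stopped = new CountDownLatch(1);

        FirstBatchStopper(StreamingContextLike jssc) {
            this.jssc = jssc;
        }

        // Would be called from the listener's onBatchCompleted. The stop is
        // handed off to a separate thread so the callback thread itself is
        // never blocked waiting for the context to shut down.
        void onBatchCompleted() {
            new Thread(() -> {
                jssc.stop(true);
                stopped.countDown();
            }, "stop-streaming-context").start();
        }

        // Lets the caller wait until the stop thread has finished.
        boolean awaitStop(long timeoutMillis) throws InterruptedException {
            return stopped.await(timeoutMillis, TimeUnit.MILLISECONDS);
        }
    }

    public static void main(String[] args) throws Exception {
        final boolean[] stopCalled = {false};
        FirstBatchStopper stopper =
                new FirstBatchStopper(stopSparkContext -> stopCalled[0] = true);
        stopper.onBatchCompleted();
        System.out.println("stopped=" + stopper.awaitStop(5000)
                + " stopCalled=" + stopCalled[0]);
    }
}
```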




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/StreamingListener-anyone-tp23140.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
