Re: Expected behavior of metrics after Pipeline cancellation

2017-11-13 Thread Paul Gerver
> It doesn't make any mention of differences based on final pipeline state.
> So I would interpret it to mean counters should also be available for
> canceled pipelines.

Good point! Thanks!

On 2017-11-13 11:19, Scott Wegner wrote:
> The javadoc for the Metrics utility states [1]: "It is runner-dependent
> whether Metrics are accessible during pipeline execution or only after
> jobs have completed."

Re: Expected behavior of metrics after Pipeline cancellation

2017-11-13 Thread Scott Wegner
The javadoc for the Metrics utility states [1]: "It is runner-dependent whether Metrics are accessible during pipeline execution or only after jobs have completed." It doesn't make any mention of differences based on final pipeline state, so I would interpret it to mean counters should also be available for canceled pipelines.
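The pattern under discussion can be sketched as follows. This is a minimal, hedged example of the Beam Java Metrics API: a `DoFn` that increments a counter, and a helper that queries the counter from the `PipelineResult` after the job ends. The namespace and counter name (`"my-namespace"`, `"elements"`) and the class names are illustrative, not from the thread; whether the query succeeds after cancellation is, as quoted above, runner-dependent.

```java
import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.metrics.Counter;
import org.apache.beam.sdk.metrics.MetricNameFilter;
import org.apache.beam.sdk.metrics.MetricQueryResults;
import org.apache.beam.sdk.metrics.MetricResult;
import org.apache.beam.sdk.metrics.Metrics;
import org.apache.beam.sdk.metrics.MetricsFilter;
import org.apache.beam.sdk.transforms.DoFn;

public class MetricsExample {

  // A DoFn that counts every element it processes.
  static class CountingFn extends DoFn<String, String> {
    private final Counter elements = Metrics.counter("my-namespace", "elements");

    @ProcessElement
    public void processElement(ProcessContext c) {
      elements.inc();
      c.output(c.element());
    }
  }

  // Query the counter from a finished (or canceled) pipeline. Whether the
  // value is still available after cancellation depends on the runner.
  static void printElementCount(PipelineResult result) {
    MetricQueryResults metrics = result.metrics().queryMetrics(
        MetricsFilter.builder()
            .addNameFilter(MetricNameFilter.named("my-namespace", "elements"))
            .build());
    for (MetricResult<Long> counter : metrics.getCounters()) {
      System.out.println(counter.getName() + ": " + counter.getAttempted());
    }
  }
}
```

Note that `getAttempted()` reads the attempted value; runners that support committed metrics also expose `getCommitted()`.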

Re: [DISCUSS] Move away from Apache Maven as build tool

2017-11-13 Thread Lukasz Cwik
There has been plenty of time for comments on the PR and the approach. So far, Ken Knowles has provided the most feedback on the PR. Ken, would you like to finish the review?

On Fri, Nov 10, 2017 at 1:22 PM, Romain Manni-Bucau wrote:
> This is only a setup thing and better to not break the mast…

Re: Apache Beam and Spark

2017-11-13 Thread Jean-Baptiste Onofré
Hi,

My target is to have Spark 2.x support in Beam 2.3.0.

Regards
JB

On 11/13/2017 12:22 PM, Nishu wrote:
> Hi Jean, thanks for your response. So when can we expect Spark 2.x
> support for the Spark runner? Thanks, Nishu
>
> On Mon, Nov 13, 2017 at 11:53 AM, Jean-Baptiste Onofré wrote:
>> Hi, Regardi…

Re: Apache Beam and Spark

2017-11-13 Thread Nishu
Hi Jean,

Thanks for your response. So when can we expect Spark 2.x support for the Spark runner?

Thanks,
Nishu

On Mon, Nov 13, 2017 at 11:53 AM, Jean-Baptiste Onofré wrote:
> Hi,
>
> Regarding your question:
>
> 1. Not yet, but as you might have seen on the mailing list, we have a PR
> about Spark 2.x support.

Re: Apache Beam and Spark

2017-11-13 Thread Jean-Baptiste Onofré
Hi,

Regarding your question:

1. Not yet, but as you might have seen on the mailing list, we have a PR about Spark 2.x support.
2. We have additional triggers supported and in progress. GroupByKey and Accumulator are also supported.
3. No, I did the change to both allow you to define the def…

Apache Beam and Spark

2017-11-13 Thread Nishu
Hi Team,

I am writing a streaming pipeline in Apache Beam using the Spark runner.

Use case: joining multiple Kafka streams using windowed collections. I use GroupByKey to group the events based on a common business key, and that output is used as input for the join operation. Pipeline run on direct runn…
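The windowed-join pattern Nishu describes can be sketched in Beam Java with `Window.into` plus `CoGroupByKey`, which is the standard group-by-key-then-join idiom. The stream names (`orders`, `payments`), the 5-minute window size, and the use of `String` values are all assumptions for illustration; the thread does not specify them.

```java
import org.apache.beam.sdk.transforms.join.CoGbkResult;
import org.apache.beam.sdk.transforms.join.CoGroupByKey;
import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TupleTag;
import org.joda.time.Duration;

public class WindowedJoinExample {

  // Hypothetical join of two keyed streams (e.g. read from Kafka and keyed
  // by a common business key upstream of this method).
  static PCollection<KV<String, CoGbkResult>> join(
      PCollection<KV<String, String>> orders,
      PCollection<KV<String, String>> payments) {

    // Apply the same fixed windowing to both inputs so that elements with
    // the same key land in the same window and can be joined.
    PCollection<KV<String, String>> windowedOrders = orders.apply(
        Window.<KV<String, String>>into(FixedWindows.of(Duration.standardMinutes(5))));
    PCollection<KV<String, String>> windowedPayments = payments.apply(
        Window.<KV<String, String>>into(FixedWindows.of(Duration.standardMinutes(5))));

    TupleTag<String> orderTag = new TupleTag<>();
    TupleTag<String> paymentTag = new TupleTag<>();

    // CoGroupByKey groups values from both streams by key within each window,
    // which is the GroupByKey-then-join pattern described in the message above.
    return KeyedPCollectionTuple.of(orderTag, windowedOrders)
        .and(paymentTag, windowedPayments)
        .apply(CoGroupByKey.create());
  }
}
```

A downstream `DoFn` would then read each side of the join from the `CoGbkResult` via its `TupleTag`.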