Re: Issue with counter metrics for large number of keys

2019-01-17 Thread Jamie Grier
+1 to what Zhenghua said. I think you're abusing the metrics system. Instead, just do a stream.keyBy().sum() and then write a Sink that does something with the data -- for example, push it to your metrics system if you wish. In my experience, though, many metrics systems don't like that sort of thing.
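The keyBy().sum() approach Jamie describes keeps one running total per key inside the job, and a custom Sink then pushes those totals wherever you like. A minimal plain-Java sketch of that per-key counting logic (not actual Flink code -- the real version would be stream.keyBy(...).sum(...) followed by a SinkFunction; the class and method names here are hypothetical):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class KeyedSumSketch {

    // Stand-in for what keyBy(key).sum(...) maintains per key: a running
    // total that a custom Sink could later push to a metrics backend.
    public static Map<String, Long> keyedSum(List<String> keys) {
        Map<String, Long> totals = new HashMap<>();
        for (String key : keys) {
            totals.merge(key, 1L, Long::sum); // count occurrences per key
        }
        return totals;
    }

    public static void main(String[] args) {
        Map<String, Long> totals = keyedSum(List.of("a", "b", "a", "a", "c"));
        System.out.println(totals.get("a")); // 3
    }
}
```

The point of the design is that the per-key state lives in the job's keyed state (scaled across TaskManagers) rather than in the metrics registry, so 100K keys cost keyed state, not 100K reported metrics.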

Re: Issue with counter metrics for large number of keys

2019-01-16 Thread Zhenghua Gao
So what you want is the count for every key? Why didn't you use a count aggregation?

Re: Issue with counter metrics for large number of keys

2019-01-16 Thread Gaurav Luthra
Hi Ken, Thanks for your inputs again. I will wait for the Flink folks to come back to me with a suggestion for implementing 100K unique counters. For the time being, I will make the number of counter metrics a configurable parameter in my application, so the user will know what he is trying to do.
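Gaurav's workaround -- a configurable cap on how many counters the application will create -- might look roughly like this. This is a sketch, not Flink code: LongAdder stands in for Flink's Counter, and the maxCounters parameter name is hypothetical. (The size check here is not atomic with the insert, so the cap can be exceeded slightly under heavy concurrency; that is usually acceptable for a safety limit.)

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;

public class BoundedCounters {
    private final int maxCounters;
    private final ConcurrentHashMap<String, LongAdder> counters = new ConcurrentHashMap<>();

    public BoundedCounters(int maxCounters) {
        this.maxCounters = maxCounters;
    }

    /** Increments the counter for key; refuses to create new counters past the cap. */
    public boolean inc(String key) {
        LongAdder c = counters.get(key);
        if (c == null) {
            if (counters.size() >= maxCounters) {
                return false; // cap reached; caller can log or drop the increment
            }
            c = counters.computeIfAbsent(key, k -> new LongAdder());
        }
        c.increment();
        return true;
    }

    public long get(String key) {
        LongAdder c = counters.get(key);
        return c == null ? 0L : c.sum();
    }

    public static void main(String[] args) {
        BoundedCounters b = new BoundedCounters(2);
        b.inc("a");
        b.inc("a");
        b.inc("b");
        System.out.println(b.inc("c")); // false: cap of 2 distinct keys reached
        System.out.println(b.get("a")); // 2
    }
}
```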

Re: Issue with counter metrics for large number of keys

2019-01-16 Thread Ken Krugler
Hi Gaurav, I’ve used a few hundred counters before without problems. My concern about > 100K unique counters is that you wind up generating load (and maybe memory issues) for the JobManager. E.g., with Hadoop’s metrics system, trying to go much beyond 1000 counters could cause significant problems.

Issue with counter metrics for large number of keys

2019-01-16 Thread Gaurav Luthra
I want a new counter for every key of my windowed stream, and I want the same counter to be incremented when the same key appears multiple times in incoming events. So I will execute the below code for every incoming event: getRuntimeContext().getMetricGroup().counter(myKey).inc(); But the above code fails when
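Whatever the exact failure here, note that the snippet re-registers a counter on every event: counter(name) creates and registers a new metric each time it is called, and registering a duplicate name is (to my knowledge) at best a logged warning in Flink. The usual pattern is to register each counter once and cache it per key. A plain-Java sketch of that caching, with a hypothetical minimal Counter standing in for org.apache.flink.metrics.Counter:

```java
import java.util.HashMap;
import java.util.Map;

public class PerKeyCounters {

    // Minimal stand-in for org.apache.flink.metrics.Counter (hypothetical).
    public static final class Counter {
        private long count;
        public void inc() { count++; }
        public long getCount() { return count; }
    }

    private final Map<String, Counter> cache = new HashMap<>();

    // Register the counter for a key once and reuse it on later events,
    // instead of calling getMetricGroup().counter(myKey) per event.
    public Counter counterFor(String key) {
        return cache.computeIfAbsent(key, k -> new Counter());
    }

    public static void main(String[] args) {
        PerKeyCounters c = new PerKeyCounters();
        c.counterFor("a").inc();
        c.counterFor("a").inc();
        System.out.println(c.counterFor("a").getCount()); // 2
    }
}
```

Caching fixes the re-registration problem, but it does not address Ken's scalability concern: 100K cached counters are still 100K metrics reported upstream.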