It would be a dream to have an easy-to-use dynamic metric system AND a
reliable counting system (accumulator-like) in Spark...
Thanks
Roberto
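The "reliable counting system (accumulator-like)" Roberto mentions does exist in Spark today in a basic form: named accumulators. A minimal sketch (using the standard `SparkContext.longAccumulator` API; the record values and the `badRecords` name are illustrative):

```scala
import org.apache.spark.sql.SparkSession

object AccumulatorExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("accumulator-demo")
      .getOrCreate()
    val sc = spark.sparkContext

    // A named LongAccumulator shows up in the Spark UI. Updates made
    // inside an *action* are applied exactly once per successful task,
    // which is the "reliable counting" behavior discussed in the thread.
    val badRecords = sc.longAccumulator("badRecords")

    sc.parallelize(Seq("1", "2", "oops", "4")).foreach { s =>
      if (scala.util.Try(s.toInt).isFailure) badRecords.add(1)
    }

    println(s"bad records: ${badRecords.value}")
    spark.stop()
  }
}
```

Note the caveat that motivates the "reliable" wish: accumulator updates made inside *transformations* (e.g. `map`) can be applied more than once if a task is retried; only updates inside actions carry the exactly-once guarantee.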
On Tue, May 7, 2019 at 3:54 AM Saisai Shao wrote:
I think the main reason why that was not merged is that Spark itself
doesn't have such a requirement, and the metrics system is mainly used by
Spark itself. Most of the needs come from custom sources/sinks, but
Spark's MetricsSystem is not designed as a public API.
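To illustrate the distinction Saisai draws: the *sinks* of the metrics system are configurable by users through `metrics.properties` (pointed to by `spark.metrics.conf`), even though registering custom *sources* has no public API. A sketch of such a configuration, using the documented `CsvSink` and `JmxSink` classes (the directory path is illustrative):

```
# metrics.properties — passed to Spark via spark.metrics.conf
# Write metrics from all instances to CSV files every 10 seconds.
*.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
*.sink.csv.period=10
*.sink.csv.unit=seconds
*.sink.csv.directory=/tmp/spark-metrics

# Additionally expose driver metrics over JMX.
driver.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink
```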
I think we could revisit or im…
Hi Saisai,
Thanks a lot for the link! This is exactly what I need.
Just curious, why has this PR not been merged? It seems to implement a
rather natural requirement.
There are a number of use cases which can benefit from this feature, e.g.
- collecting business metrics based on the data's attrib…
I remember there was a PR about doing a similar thing (
https://github.com/apache/spark/pull/18406). From my understanding, this
seems like a quite specific requirement; it may require code changes to
support your needs.
Thanks
Saisai
On Sat, May 4, 2019 at 4:44 PM Sergey Zhemzhitsky wrote:
> Hello Spark User