I wrote up a simple metric sink for Spark that publishes metrics to a Kafka broker. Each metric is published as a JSON message, with the metric name as the message key.
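
For anyone curious about the general shape of this, the core publishing step can be sketched roughly as below. This is an illustration, not the code from the repo: the Dropwizard MetricRegistry iteration, the topic handling, and the naive JSON rendering are simplified stand-ins, and it assumes kafka-clients and metrics-core are on the classpath.

    // Rough sketch only -- not the actual sink in the repo.
    import java.util.Properties
    import scala.collection.JavaConverters._
    import com.codahale.metrics.MetricRegistry
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

    object KafkaMetricPublishSketch {
      def publishGauges(registry: MetricRegistry, brokers: String, topic: String): Unit = {
        val props = new Properties()
        props.put("bootstrap.servers", brokers)
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        val producer = new KafkaProducer[String, String](props)
        try {
          registry.getGauges.asScala.foreach { case (name, gauge) =>
            // one JSON message per metric, keyed by the metric name
            val json = s"""{"name": "$name", "value": "${gauge.getValue}"}"""
            producer.send(new ProducerRecord[String, String](topic, name, json))
          }
          producer.flush()
        } finally {
          producer.close()
        }
      }
    }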
https://github.com/erikerlandson/spark-kafka-sink

Build it with "(x)sbt assembly" and make sure the resulting jar file is available on the classpath for the relevant Spark components. A quick-start demo is included in the readme file.

Cheers!
Erik
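
P.S. Custom sinks are wired in through Spark's metrics configuration (conf/metrics.properties). A hypothetical entry might look like the following; the actual sink class name and option names come from the repo's readme, so treat these as placeholders:

    # hypothetical example -- check the readme for the real class and options
    *.sink.kafka.class=org.apache.spark.metrics.sink.KafkaSink
    *.sink.kafka.broker=localhost:9092
    *.sink.kafka.topic=spark-metrics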