Hi,

> Forwarding Spark Event Logs to identify critical events like job start,
> executor failures, job failures etc. to ElasticSearch via log4j. However I
> could not find any way to forward event logs via log4j configuration. Is
> there any other recommended approach to track these application events?
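One way to do that: register a custom SparkListener that relays the interesting scheduler events to a log4j logger, and let a log4j appender ship them to ElasticSearch. A rough, untested sketch (the class name, logger name, and log messages are my own illustration; the appender configuration is separate):

```scala
import org.apache.log4j.Logger
import org.apache.spark.scheduler._

// Sketch only: forwards selected Spark scheduler events to a log4j
// logger. The logger name "spark.events" is illustrative; an
// ElasticSearch-bound appender would be configured in log4j properties.
class ElasticEventListener extends SparkListener {
  private val log = Logger.getLogger("spark.events")

  override def onJobStart(jobStart: SparkListenerJobStart): Unit =
    log.info(s"Job ${jobStart.jobId} started with ${jobStart.stageInfos.size} stages")

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
    jobEnd.jobResult match {
      case JobSucceeded => log.info(s"Job ${jobEnd.jobId} succeeded")
      case _            => log.error(s"Job ${jobEnd.jobId} failed")
    }

  override def onExecutorRemoved(removed: SparkListenerExecutorRemoved): Unit =
    log.warn(s"Executor ${removed.executorId} removed: ${removed.reason}")

  override def onApplicationEnd(end: SparkListenerApplicationEnd): Unit =
    log.info(s"Application ended at ${end.time}")
}
```

You can register it programmatically with `sc.addSparkListener(new ElasticEventListener)`, or (if the class has a zero-arg constructor and is on the classpath) declaratively via the `spark.extraListeners` configuration property.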
I'd use the SparkListener API
(http://spark.apache.org/docs/latest/api/scala/org/apache/spark/scheduler/SparkListener.html).

> 2 - For Spark streaming jobs, is there any way to identify that data from
> Kafka is not consumed for whatever reason, or the offsets are not
> progressing as expected, and also forward that to ElasticSearch via log4j
> for monitoring?

I think the SparkListener API would help here too.

Pozdrawiam,
Jacek Laskowski
----
https://about.me/JacekLaskowski
"The Internals Of" Online Books <https://books.japila.pl/>
Follow me on https://twitter.com/jaceklaskowski

On Wed, Jan 13, 2021 at 5:15 PM raymond.tan <raymond.chiew....@gmail.com> wrote:

> Hello here, I am new to Spark and am trying to add some monitoring for
> Spark applications, specifically to handle the situations below:
>
> 1 - Forwarding Spark Event Logs to identify critical events like job
> start, executor failures, job failures etc. to ElasticSearch via log4j.
> However I could not find any way to forward event logs via log4j
> configuration. Is there any other recommended approach to track these
> application events?
>
> 2 - For Spark streaming jobs, is there any way to identify that data from
> Kafka is not consumed for whatever reason, or the offsets are not
> progressing as expected, and also forward that to ElasticSearch via log4j
> for monitoring?
>
> Thanks,
> Raymond
> ------------------------------
> Sent from the Apache Spark User List mailing list archive
> <http://apache-spark-user-list.1001560.n3.nabble.com/> at Nabble.com.
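P.S. On the second point: if the job uses Structured Streaming, a StreamingQueryListener can watch per-batch progress (including the per-source offsets reported for Kafka) and log a warning when nothing is being consumed. A hedged, untested sketch; the class and logger names and the "zero input rows" heuristic are my own, and for the older DStream API the analogous hook would be a StreamingListener instead:

```scala
import org.apache.log4j.Logger
import org.apache.spark.sql.streaming.StreamingQueryListener
import org.apache.spark.sql.streaming.StreamingQueryListener._

// Sketch only: logs query progress so a log4j appender can forward it,
// and warns when a micro-batch consumed no rows (e.g. offsets stalled).
class OffsetProgressListener extends StreamingQueryListener {
  private val log = Logger.getLogger("spark.streaming.progress")

  override def onQueryStarted(event: QueryStartedEvent): Unit =
    log.info(s"Query ${event.id} started")

  override def onQueryProgress(event: QueryProgressEvent): Unit = {
    val p = event.progress
    if (p.numInputRows == 0)
      log.warn(s"Query ${p.id}: no rows consumed in batch ${p.batchId}")
    else
      log.info(s"Query ${p.id}: ${p.numInputRows} rows, " +
        s"endOffsets=${p.sources.map(_.endOffset).mkString(",")}")
  }

  override def onQueryTerminated(event: QueryTerminatedEvent): Unit =
    event.exception.foreach(e =>
      log.error(s"Query ${event.id} terminated with error: $e"))
}
```

Registered with `spark.streams.addListener(new OffsetProgressListener)` on the SparkSession before starting the query.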