Hi, I am working on a streaming use case where I need to run multiple Spark Streaming applications concurrently and measure their throughput and latencies. The Spark UI provides all of these statistics, but if I want to run more than 100 applications at the same time, I have no good way to aggregate them: opening 100 UI windows and collecting the data by hand is not practical. If you could point me to a way to collect these statistics programmatically, I can write a script to drive my experiment. Any help is greatly appreciated. Thanks in advance.
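For context, the kind of thing I was imagining is a script that polls Spark's monitoring REST API (which serves the same data the UI renders) on each driver and averages the results. A rough sketch of what I have in mind, assuming each driver UI is reachable at a known host:port, and using the /api/v1/applications and .../streaming/statistics endpoints from the Spark monitoring docs (field names like avgInputRate and avgTotalDelay would need to be verified against the Spark version in use):

```python
import json
from urllib.request import urlopen

def fetch_streaming_stats(ui_base):
    """Fetch streaming statistics for every application on one driver UI.

    ui_base: e.g. "http://driver-host:4040" (assumed reachable).
    Returns a dict mapping app id -> statistics dict.
    """
    apps = json.load(urlopen(f"{ui_base}/api/v1/applications"))
    stats = {}
    for app in apps:
        url = f"{ui_base}/api/v1/applications/{app['id']}/streaming/statistics"
        stats[app["id"]] = json.load(urlopen(url))
    return stats

def aggregate(stats_by_app):
    """Average throughput (records/s) and end-to-end latency (ms) across apps.

    stats_by_app: dict of app id -> statistics dict containing the
    (assumed) fields avgInputRate and avgTotalDelay.
    """
    n = len(stats_by_app)
    rates = [s["avgInputRate"] for s in stats_by_app.values()]
    delays = [s["avgTotalDelay"] for s in stats_by_app.values()]
    return {
        "mean_input_rate": sum(rates) / n,
        "mean_total_delay_ms": sum(delays) / n,
    }
```

The aggregation part can be checked offline with sample data; the fetch part of course needs running applications, so I would welcome any corrections on the endpoints or field names.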
Regards,
Sitakanta Mishra
