I have a silly question: do multiple Spark jobs running on YARN have any impact on each other? For example, if the traffic on one streaming job increases sharply, does that affect a second job? Will it slow the second job down, or have any other consequences?
I have enough resources (memory, cores) for both jobs in the same cluster.

Thanks,
Vaibhav

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-multiple-Spark-Jobs-on-Yarn-Client-mode-tp27364.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
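For context, one way to keep two jobs from contending on YARN is to pin each job's resources explicitly at submit time (and optionally route them to separate scheduler queues). A rough sketch, where the application names, queue names, and sizes are purely illustrative and not taken from the original post:

```shell
# Job 1: streaming job, capped at 4 executors of 2 GB / 2 cores each.
# The --queue name assumes a "streaming" queue has been configured in
# the YARN scheduler; adjust to whatever queues your cluster defines.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --queue streaming \
  --num-executors 4 \
  --executor-memory 2g \
  --executor-cores 2 \
  streaming_job.py

# Job 2: batch job, capped independently on its own queue, so a spike
# in the streaming job's traffic cannot take executors away from it.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --queue batch \
  --num-executors 4 \
  --executor-memory 2g \
  --executor-cores 2 \
  batch_job.py
```

With fixed `--num-executors` (i.e. dynamic allocation off) and enough cluster capacity for both caps, the jobs run in separate YARN containers and should not starve each other of memory or cores, though they can still share machine-level resources such as disk and network.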