Hello Experts

I have a Spark Streaming application (DStream) on Spark 3.0.2 with Scala
2.12. The application reads from about 20 different Kafka topics into a
single stream; within each batch I filter the RDD per topic and write the
results to Cassandra.
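
To make the shape of the job concrete, here is a simplified sketch of the
processing loop (topic names, keyspace/table names, and the tableFor helper
below are illustrative placeholders, not the actual code):

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}
import com.datastax.spark.connector._

object TopicFanout {
  def main(args: Array[String]): Unit = {
    // Assumes spark.cassandra.connection.host is set via spark-submit
    val conf = new SparkConf().setAppName("kafka-to-cassandra")
    val ssc  = new StreamingContext(conf, Seconds(30))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "broker:9092",
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "my-consumer-group",
      "auto.offset.reset"  -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val topics = Seq("topic1", "topic2") // ~20 topics in the real job

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](topics, kafkaParams)
    )

    stream.foreachRDD { rdd =>
      // One filter pass per topic over the same batch RDD,
      // each subset written to its own Cassandra table
      topics.foreach { topic =>
        rdd.filter(_.topic == topic)
          .map(r => (r.key, r.value))
          .saveToCassandra("my_keyspace", tableFor(topic), SomeColumns("key", "value"))
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }

  // Hypothetical topic -> table name mapping
  def tableFor(topic: String): String = topic.replace('-', '_')
}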

I see a steady increase in executor memory over the hours until it reaches
the maximum allocated memory, and then it stays at that value. No matter how
much memory I allocate to the executors, the same pattern appears, so I
suspect a memory leak.
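
For reference, I was planning to capture an executor heap dump for offline
analysis; a minimal sketch of how I would enable that via the executor JVM
options, assuming the dump path is writable on the workers (the path is a
placeholder):

import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("kafka-to-cassandra")
  // Standard JVM flags: dump the executor heap if it hits OutOfMemoryError
  .set("spark.executor.extraJavaOptions",
    "-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp")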

Any guidance you may be able to provide on how to debug this would be highly
appreciated.

Thanks in advance
Regards
Kiran
