Hello Everyone,

I am having trouble running Spark applications after I shut down and
restart my EC2 instances. Everything else seems to work except Spark.
When I try running a simple Spark application, like sc.parallelize(), I
get a message that the HDFS NameNode is in safe mode.

Has anyone else run into this? Is there a proper protocol I should be
following when shutting down my Spark nodes?

Thank you!
