Hi All,

I'm running a simple job in Spark that reads a file from HDFS, processes
the lines, and saves the processed lines back to HDFS. All three stages
complete correctly and I can see the processed file on HDFS.

But in the Spark UI, the worker state is shown as "killed", and I can't
find any exceptions in the logs.

What could be going wrong?

...
// Expand each input line into zero or more processed lines
val newLines = lines.flatMap(line => process(line))
newLines.saveAsTextFile(hdfsPath)
...
def process(line: String): Array[String] = {
...
  Array(str1, str2)
}
...
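
For reference, here's a minimal self-contained sketch of the kind of job
described above, assuming a standard Spark 1.x setup. The application name,
the HDFS paths, and the body of process are hypothetical placeholders,
since the original snippet elides them:

import org.apache.spark.{SparkConf, SparkContext}

object ProcessLines {
  // Hypothetical processing step standing in for the elided body above:
  // split each line on commas and return the pieces.
  def process(line: String): Array[String] =
    line.split(",")

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ProcessLines")
    val sc = new SparkContext(conf)

    // Placeholder paths for the actual HDFS input and output locations.
    val lines = sc.textFile("hdfs:///input/data.txt")
    val newLines = lines.flatMap(line => process(line))
    newLines.saveAsTextFile("hdfs:///output/processed")

    // Shut the context down cleanly once the job is done.
    sc.stop()
  }
}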

~Sarath.
