Hello all,
I'm hoping someone can give me some direction for troubleshooting this issue.
I'm trying to write from Spark on a Hortonworks (Cloudera) HDP cluster. I ssh
directly to the first datanode and run PySpark with the following command;
however, it always fails no matter what size I s
How so?
From: Mich Talebzadeh
Sent: Wednesday, May 19, 2021 5:45 PM
To: Clay McDonald
Cc: user@spark.apache.org
Subject: Re: PySpark Write File Container exited with a non-zero exit code 143
Hi Clay,
The parameters you are passing are not valid.
pyspark --conf
Still get the same error with "pyspark --conf queue=default --conf
executor-memory=24G"
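For reference, a sketch of what pyspark actually accepts for these two settings (the queue name and memory sizes below are just the values from the quoted command, not recommendations): `queue` and `executor-memory` are not valid `--conf` keys, because `--conf` takes fully qualified Spark properties. They are either dedicated launcher flags or `spark.*` property names.

```shell
# Option 1: dedicated spark-submit/pyspark flags
pyspark --queue default --executor-memory 24G

# Option 2: the equivalent fully qualified Spark properties via --conf
pyspark \
  --conf spark.yarn.queue=default \
  --conf spark.executor.memory=24g
```

Note also that exit code 143 means the container received SIGTERM, which on YARN is commonly the ResourceManager killing a container that exceeded its memory allocation, so raising `spark.executor.memoryOverhead` may be worth trying as well.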
From: ayan guha
Sent: Thursday, May 20, 2021 12:23 AM
To: Clay McDonald
Cc: Mich Talebzadeh ; user@spark.apache.org
Subject: Re: PySpark Write File Container exited with a non-zero exit code 143