Do you want the job to fail when the command exits with a non-zero code?

You could set checkCode to True:
http://spark.apache.org/docs/latest/api/python/pyspark.html?highlight=pipe#pyspark.RDD.pipe

Otherwise, you could have run.sh write its exit status to stdout and process 
it downstream along with the regular output.
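To illustrate both approaches without a Spark cluster, here is a sketch using plain subprocess, which mimics what pipe() does per partition. The shell fragments stand in for run.sh and are hypothetical; with checkCode=True, Spark similarly raises an exception when the piped command returns non-zero.

```python
import subprocess

# Approach 1: fail fast on a non-zero exit code (what checkCode=True
# does in rdd.pipe: the job errors out if the command fails).
try:
    subprocess.run(["sh", "-c", "exit 1"], check=True)
    status = 0
except subprocess.CalledProcessError as e:
    status = e.returncode  # 1

# Approach 2: have the script append its exit status to stdout and
# parse it out, analogous to run.sh ending with: echo "EXIT:$?"
proc = subprocess.run(
    ["sh", "-c", 'echo "some output"; false; echo "EXIT:$?"'],
    capture_output=True, text=True,
)
lines = proc.stdout.splitlines()
exit_line = [l for l in lines if l.startswith("EXIT:")][0]
```

With the second approach the exit marker travels through pipe() as an ordinary output record, so you can filter it out (or act on it) after collect().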


_____________________________
From: Xuchen Yao <yaoxuc...@gmail.com>
Sent: Friday, February 10, 2017 11:18 AM
Subject: Getting exit code of pipe()
To: <user@spark.apache.org>


Hello Community,

I have the following Python code that calls an external command:

rdd.pipe('run.sh', env=os.environ).collect()

run.sh can exit with status 0 or 1; how can I get that exit code from 
Python? Thanks!

Xuchen

