Maybe set spark.hadoop.validateOutputSpecs=false?
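For reference, spark.hadoop.validateOutputSpecs is a real Spark property (default true) that controls the "output directory already exists" check performed by saveAsHadoopFile and its variants. A minimal sketch of setting it, written in Java rather than PySpark and with an illustrative app name; it can equally be passed on the command line as --conf spark.hadoop.validateOutputSpecs=false to spark-submit:

import org.apache.spark.sql.SparkSession;

public class HFileAppendConfigSketch {
    public static void main(String[] args) {
        // Disable output-spec validation so saveAsHadoopFile /
        // saveAsNewAPIHadoopFile does not fail when the target directory
        // already exists. Use with care: Spark can no longer warn about
        // accidentally overwriting existing data.
        SparkSession spark = SparkSession.builder()
                .appName("hfile-append-example") // illustrative name, not from the original mail
                .config("spark.hadoop.validateOutputSpecs", "false")
                .getOrCreate();

        // ... build the KeyValue pairs and write HFiles here ...

        spark.stop();
    }
}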
From: Gautham Acharya
Sent: March 15, 2020, 3:23
To: user@spark.apache.org
主题: [PySpark] How to write HFiles as an 'append' to the same directory?
I have a process in Apache Spark that attempts to write HFiles to S3.
Hi all,
When using SparkLauncher to start an application in YARN client mode,
sparkAppHandle#stop() cannot stop the application.
SparkLauncher launcher = new SparkLauncher()
.setAppName("My Launcher")
.setJavaHome("/usr/bin/hadoop/software/java")
.setSparkHome("/usr/bin/hadoop/
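For context, a rough sketch of the usual pattern: obtain a SparkAppHandle from startApplication(), wait until the handle has connected to the launched application, then call stop(); stop() is best-effort, and kill() is the harder fallback. The Spark home, jar path, and class name below are placeholders, not taken from the snippet above.

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LauncherStopSketch {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                .setAppName("My Launcher")
                .setSparkHome("/opt/spark")           // placeholder path
                .setMaster("yarn")
                .setDeployMode("client")
                .setAppResource("/path/to/app.jar")   // placeholder jar
                .setMainClass("com.example.Main")     // placeholder class
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle h) {
                        System.out.println("state -> " + h.getState());
                    }
                    @Override
                    public void infoChanged(SparkAppHandle h) { }
                });

        // stop() only reaches the driver after the launcher has connected to
        // the running application, so wait for an app id (or a final state).
        while (handle.getAppId() == null && !handle.getState().isFinal()) {
            Thread.sleep(1000);
        }

        handle.stop();   // best-effort request to shut the application down
        // handle.kill(); // forcibly terminates the launched child process if stop() fails
    }
}

In client mode the launched child process is the driver itself, so kill() terminates the driver and, with it, the YARN application.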