Hi Tim,
I think you can try setting the option spark.sql.files.ignoreCorruptFiles to
true. With that option enabled, Spark jobs will continue to run when they
encounter corrupted files, and the contents that have been read will still
be returned.
The CSV/JSON data sources support the PERMISSIVE parse mode, which keeps
malformed records instead of failing the query.
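For anyone who wants to try this, here is a rough, untested sketch of both
approaches (the app name and input path are placeholders):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("corrupt-files-demo").getOrCreate()

    // Skip files that cannot be read at all; rows from files already
    // read successfully are still returned.
    spark.conf.set("spark.sql.files.ignoreCorruptFiles", "true")

    // For CSV/JSON, PERMISSIVE mode routes malformed records into a
    // designated column instead of failing the query.
    val df = spark.read
      .option("mode", "PERMISSIVE")
      .option("columnNameOfCorruptRecord", "_corrupt_record")
      .json("/data/events/*.json")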
/facepalm
Here we go: https://issues.apache.org/jira/browse/SPARK-27093
Tim
Thanks Xiao, it's good to have that validated.
I've created a ticket here: https://issues.apache.org/jira/browse/AVRO-2342
Hi Spark Devs,
We're processing a large number of Avro files with Spark and found that the
Avro reader lacks the tolerance for malformed or truncated files that the
JSON reader provides. Currently the Avro reader throws an exception when it
encounters any bad or truncated record in an Avro file, causing the entire
job to fail.
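To make the gap concrete, here is a rough, untested sketch (paths are
placeholders). The JSON reader can absorb bad records through its mode
option, while the Avro reader has nothing equivalent:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().getOrCreate()

    // JSON: a malformed record is tolerated under PERMISSIVE mode.
    val jsonDf = spark.read
      .option("mode", "PERMISSIVE")
      .json("/data/events/*.json")

    // Avro: no mode option exists; a truncated or corrupt block throws
    // at read time and fails the whole job.
    val avroDf = spark.read.format("avro").load("/data/events/*.avro")
    avroDf.count()  // triggers the scan; a single bad file kills the job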