Hi,
I am trying to read multiple Parquet files in Spark SQL. In one directory there
are two files, one of which is corrupted. While trying to read these files,
Spark SQL throws an exception for the corrupted file.
val newDataDF =
sqlContext.read.parquet("/data/testdir/data1.parquet","/data/testdir/corru
I am getting the error for the following code snippet:
object SparkTaskTry extends Logging {
  /**
   * Extends the normal Try constructor to allow TaskKilledExceptions
   * to propagate
   */
  def apply[T](r: => T): Try[T] =
    try scala.util.Success(r) catch {
      case e: TaskKilledException => throw e
      case scala.util.control.NonFatal(e) => scala.util.Failure(e)
    }
}
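The pattern above can be sketched in plain Scala without a Spark dependency. This is a minimal, self-contained illustration only: the hypothetical `FatalTaskError` stands in for Spark's TaskKilledException, and `SafeTry` stands in for the SparkTaskTry object, so the selective rethrow behaviour can be seen in isolation.

```scala
import scala.util.{Try, Success, Failure}
import scala.util.control.NonFatal

// Placeholder for Spark's TaskKilledException (assumed name, for illustration only)
class FatalTaskError(msg: String) extends Exception(msg)

object SafeTry {
  // Like Try.apply, but lets FatalTaskError propagate instead of
  // capturing it in a Failure
  def apply[T](r: => T): Try[T] =
    try Success(r) catch {
      case e: FatalTaskError => throw e
      case NonFatal(e)       => Failure(e)
    }
}
```

With this wrapper, an ordinary exception is captured, e.g. `SafeTry(1 / 0)` yields a `Failure(ArithmeticException)`, while `SafeTry { throw new FatalTaskError("killed") }` rethrows instead of returning a `Failure`.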