MaxGekk commented on code in PR #49833: URL: https://github.com/apache/spark/pull/49833#discussion_r1945027764
##########
core/src/main/scala/org/apache/spark/util/HadoopFSUtils.scala:
##########
@@ -250,8 +250,15 @@ private[spark] object HadoopFSUtils extends Logging {
         "method" -> u.getStackTrace.head.getMethodName))
     }
 
-    val filteredStatuses =
-      statuses.filterNot(status => shouldFilterOutPathName(status.getPath.getName))
+    val filteredStatuses = {
+      try {
+        statuses.filterNot(status => shouldFilterOutPathName(status.getPath.getName))
+      } catch {
+        case e: Exception =>
+          logError(s"Failed to filter out path names from ${path.toString}", e)
+          throw SparkException.internalError(s"Unexpected statuses for path ${path.toString}", e)

Review Comment:
   Can users hit this error? If so, we shouldn't use `internalError`. How do you reproduce the issue?
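   For context on the distinction being raised: `SparkException.internalError` raises the reserved `INTERNAL_ERROR` condition, which is meant for bugs in Spark itself, whereas failures that users can trigger are normally raised with a named error condition registered in Spark's error registry. Below is a minimal sketch of that alternative, reusing `statuses`, `path`, and `shouldFilterOutPathName` from the hunk above; the condition name `UNEXPECTED_FILE_STATUS` and its `path` parameter are hypothetical and are not part of this PR.

   ```scala
   import org.apache.spark.SparkException

   // Hypothetical sketch: surface a named, user-facing error condition instead
   // of an internal error. "UNEXPECTED_FILE_STATUS" and its "path" parameter
   // are invented for illustration; they would need to be registered in the
   // error registry with a matching message template before this could run.
   val filteredStatuses = {
     try {
       statuses.filterNot(status => shouldFilterOutPathName(status.getPath.getName))
     } catch {
       case e: Exception =>
         logError(s"Failed to filter out path names from ${path.toString}", e)
         throw new SparkException(
           errorClass = "UNEXPECTED_FILE_STATUS",            // hypothetical condition name
           messageParameters = Map("path" -> path.toString), // filled into the message template
           cause = e)
     }
   }
   ```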