Hi,

We are trying to run a Spark application using spark-submit on Windows 8.1.
The application runs successfully to completion on MacOS 10.10 and on
Ubuntu Linux. On Windows, we get the error messages below: at shutdown,
Spark fails to delete a temporary directory that it created.

How do we solve this problem?

Thanks,
arun

5/04/07 10:55:14 ERROR Utils: Exception while deleting Spark temp dir:
C:\Users\JOSHMC~1\AppData\Local\Temp\spark-339bf2d9-8b89-46e9-b5c1-404caf9d3cd7\userFiles-62976ef7-ab56-41c0-a35b-793c7dca31c7
java.io.IOException: Failed to delete:
C:\Users\JOSHMC~1\AppData\Local\Temp\spark-339bf2d9-8b89-46e9-b5c1-404caf9d3cd7\userFiles-62976ef7-ab56-41c0-a35b-793c7dca31c7
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:932)
        at org.apache.spark.util.Utils$$anon$4$$anonfun$run$1$$anonfun$apply$mcV$sp$2.apply(Utils.scala:181)
        at org.apache.spark.util.Utils$$anon$4$$anonfun$run$1$$anonfun$apply$mcV$sp$2.apply(Utils.scala:179)
        at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
        at org.apache.spark.util.Utils$$anon$4$$anonfun$run$1.apply$mcV$sp(Utils.scala:179)
        at org.apache.spark.util.Utils$$anon$4$$anonfun$run$1.apply(Utils.scala:177)
        at org.apache.spark.util.Utils$$anon$4$$anonfun$run$1.apply(Utils.scala:177)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1617)
        at org.apache.spark.util.Utils$$anon$4.run(Utils.scala:177)