> I think Rostyslav is using a DFS which logs at warn/error if you try to
> delete a directory that isn't there, so is seeing warning messages that
> nobody else does
Yep, you are correct.
> Rostyslav - like I said, I'd be curious as to which DFS/object store you
> are working with
Unfortunately, I am
I think Rostyslav is using a DFS which logs at warn/error if you try to delete
a directory that isn't there, so is seeing warning messages that nobody else
does
Rostyslav - like I said, I'd be curious as to which DFS/object store you are
working with, as it is behaving slightly differently from HDFS.
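For illustration only, the behaviour being described would amount to something
like the wrapper below (a hypothetical sketch, not any vendor's actual code):
delete() warns whenever the target path is already gone, whereas stock HDFS and
the local filesystem just return false silently in that case.

import org.apache.hadoop.fs.{FileSystem, FilterFileSystem, Path}
import org.slf4j.LoggerFactory

class WarnOnMissingDeleteFs(underlying: FileSystem) extends FilterFileSystem(underlying) {
  private val log = LoggerFactory.getLogger(classOf[WarnOnMissingDeleteFs])

  override def delete(path: Path, recursive: Boolean): Boolean = {
    if (!underlying.exists(path)) {
      // This is the extra noise a user would see when both the AM and the
      // Client try to remove the same staging directory.
      log.warn(s"delete() called on non-existent path: $path")
    }
    super.delete(path, recursive)
  }
}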
On 16 Jan 2017, at 12:51, Rostyslav Sotnychenko <r.sotnyche...@gmail.com> wrote:
Thanks all!
I was using another DFS instead of HDFS, which was logging an error when
fs.delete got called on a non-existent path.
Really? Whose DFS, if you don't mind me asking? I'm surprised they logged th
> Shouldn't we add some check to Client?
>
>
> Thanks,
> Rostyslav
-----
Liang-Chi Hsieh | @viirya
Spark Technology Center
http://www.spark.tc/
scala> org.apache.hadoop.fs.FileSystem.getLocal(sc.hadoopConfiguration)
res0: org.apache.hadoop.fs.LocalFileSystem =
org.apache.hadoop.fs.LocalFileSystem@3f84970b
scala> res0.delete(new org.apache.hadoop.fs.Path("/tmp/does-not-exist"), true)
res3: Boolean = false
Does that explain your confusion?
Are you actually seeing a problem or just questioning the code?
I have never seen a situation where there's a failure because of that
part of the current code.
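For what it's worth, the double delete is harmless on stock Hadoop filesystems
precisely because of that contract. A minimal sketch (the path below is made
up) that can be pasted into the same spark-shell:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

val fs = FileSystem.getLocal(new Configuration())
val staging = new Path("/tmp/spark-staging-demo")   // hypothetical staging dir

fs.mkdirs(staging)
val firstDelete  = fs.delete(staging, true)   // true: the directory existed and was removed
val secondDelete = fs.delete(staging, true)   // false: already gone, no exception is thrown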
On Fri, Jan 13, 2017 at 3:24 AM, Rostyslav Sotnychenko wrote:
> Hi all!
>
> I am a bit confused why Spark AM and Client are both trying to delete
> Staging Directory.
Hi all!
I am a bit confused why Spark AM and Client are both trying to delete
Staging Directory.
https://github.com/apache/spark/blob/branch-2.1/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala#L1110
https://github.com/apache/spark/blob/branch-2.1/yarn/src/main/scala/org/apache/spark
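The check asked about earlier in the thread ("Shouldn't we add some check to
Client?") might look roughly like the sketch below; the helper name is made up
and this is only an illustration of guarding the delete with an existence test,
not an actual Spark patch:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path

def cleanupStagingDirIfPresent(stagingDirPath: Path, hadoopConf: Configuration): Unit = {
  val fs = stagingDirPath.getFileSystem(hadoopConf)
  // Only attempt the delete if the other side has not already removed the
  // directory, so filesystems that log on deleting a missing path stay quiet.
  if (fs.exists(stagingDirPath)) {
    fs.delete(stagingDirPath, true)
  }
}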