I forgot to say: I am using bin/spark-shell, spark-1.0.2. That host has Scala 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_11).
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/wholeTextFiles-not-working-with-HDFS-tp7490p12678.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
I had the same issue with spark-1.0.2-bin-hadoop*1*, and indeed the issue
seems related to Hadoop1. When switching to using
spark-1.0.2-bin-hadoop*2*, the issue disappears.
That worked for me as well. I was using Spark 1.0 compiled against Hadoop 1.0; switching to Spark 1.0.1 compiled against Hadoop 2 fixed it.
I have the same issue.

val a = sc.textFile("s3n://MyBucket/MyFolder/*.tif")
a.first

works perfectly fine, but

val d = sc.wholeTextFiles("s3n://MyBucket/MyFolder/*.tif")
d.first

does not work and gives the following error message:

java.io.FileNotFoundException
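For reference, the two calls above have different return types, which is easy to trip over: textFile yields one element per line, while wholeTextFiles yields one (path, content) pair per file. A minimal sketch for the spark-shell (paths are hypothetical; sc is the shell's SparkContext), using a local directory to sidestep the Hadoop-1 issue this thread identifies:

```scala
// Spark shell sketch; directory path is hypothetical.
// textFile: RDD[String], one element per line across all matched files.
val lines = sc.textFile("file:///tmp/data/*.txt")

// wholeTextFiles: RDD[(String, String)], one (filename, fileContent) pair per file.
val files = sc.wholeTextFiles("file:///tmp/data/*.txt")

println(lines.first())                // first line of the first file
val (path, content) = files.first()   // whole first file
println(s"$path has ${content.length} characters")
```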
I can write one if you'll point me to where I need to write it.
Hi Sguj and littlebird,
I'll try to fix it tomorrow evening and the day after tomorrow, because I am now busy preparing a talk (slides) for tomorrow. Sorry for the inconvenience. Would you mind filing an issue on the Spark JIRA?

2014-06-17 20:55 GMT+08:00 Sguj :
> I didn't fix the issue so much as work around it.
I didn't fix the issue so much as work around it. I was running my cluster
locally, so using HDFS was just a preference. The code worked with the local
file system, so that's what I'm using until I can get some help.
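A minimal sketch of that workaround from the spark-shell (the directory path is hypothetical): point wholeTextFiles at the local file system via the file:// scheme instead of HDFS.

```scala
// Workaround sketch: read from the local file system (file:// scheme)
// instead of HDFS until wholeTextFiles works against HDFS.
// The directory path is hypothetical.
val local = sc.wholeTextFiles("file:///home/me/data")
local.take(1).foreach { case (path, content) =>
  println(s"read $path (${content.length} chars)")
}
```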
Hi, I have the same exception. Can you tell me how you fixed it? Thank you!
My exception stack looks about the same:

java.io.FileNotFoundException: File /user/me/target/capacity-scheduler.xml does not exist.
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:397)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFil
Hi Sguj,
Could you give me the exception stack? I tested it on my laptop and found that it gets the wrong FileSystem: it should be DistributedFileSystem, but it finds RawLocalFileSystem. If we get the same exception stack, I'll try to fix it.

Here is my exception stack:

java.io.FileNotFoundException
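One way to check the wrong-FileSystem diagnosis from the spark-shell is to ask Hadoop which FileSystem implementation it resolves for a given URI. A sketch (the hdfs:// URI is hypothetical; sc.hadoopConfiguration is the SparkContext's Hadoop configuration):

```scala
import java.net.URI
import org.apache.hadoop.fs.FileSystem

// Ask Hadoop which FileSystem class serves this URI. With a correctly
// configured Hadoop-2 build this should print
// org.apache.hadoop.hdfs.DistributedFileSystem for an hdfs:// path;
// the bug in this thread shows up as RawLocalFileSystem being used instead.
val uri = new URI("hdfs://namenode:9000/user/me/data") // hypothetical URI
val fs = FileSystem.get(uri, sc.hadoopConfiguration)
println(fs.getClass.getName)
```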