Hi
I was using saveAsTextFile earlier and it worked fine. After we migrated to
Spark 1.0, I started getting the following error:
java.lang.ClassNotFoundException:
org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1
        java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        java.net.URLClassLoader$1.run(URLClassLoader.java:355)

Hence I changed my code as follows:

import org.apache.hadoop.io.{NullWritable, Text}
import org.apache.hadoop.mapred.TextOutputFormat

x.map(x => (NullWritable.get(), new Text(x.toString)))
  .saveAsHadoopFile[TextOutputFormat[NullWritable, Text]](path)
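For context, a minimal self-contained sketch of that workaround, runnable in local mode. The object name, sample data, and output path are placeholders of mine, not from the original job; the `SparkContext._` import is needed on Spark 1.0 to pick up the pair-RDD implicits.

```scala
import org.apache.hadoop.io.{NullWritable, Text}
import org.apache.hadoop.mapred.TextOutputFormat
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._  // pair-RDD implicits on Spark 1.x

object SaveSketch {
  // Writes each record as one line of text via the old Hadoop API,
  // emitting (NullWritable, Text) pairs so only the value is serialized.
  def run(path: String): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("save-sketch").setMaster("local[2]"))
    try {
      val x = sc.parallelize(Seq("a", "b", "c"))
      x.map(v => (NullWritable.get(), new Text(v)))
        .saveAsHadoopFile[TextOutputFormat[NullWritable, Text]](path)
    } finally sc.stop()
  }
}
```

In the real job `path` would be the `s3://` (or `s3n://`) URI; local mode is only for checking the save path compiles and runs.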

After this change, I am facing a problem when I write very large data to S3.
It occurs only on some partitions: for example, when writing 240 partitions,
it might succeed for 156 files and then start throwing a Bad Digest error,
after which it hangs.

Please advise.

Regards,
lmk



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Bad-Digest-error-while-doing-aws-s3-put-tp10036p10780.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
