Ok, digging a bit into Spark, I think I got it:
sc.newAPIHadoopFile("s3n://missingPattern/*",
    EmptiableTextInputFormat.class, LongWritable.class, Text.class,
    sc.hadoopConfiguration())
  .map(new Function<Tuple2<LongWritable, Text>, String>() {
      @Override public String call(Tuple2<LongWritable, Text> arg0) throws Exception {
          return arg0._2().toString();
      }
  });
Is that the right way to ignore 0 files?
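
(EmptiableTextInputFormat itself isn't shown in this thread; below is a minimal sketch of one way it could be implemented, assuming it simply extends the new-API TextInputFormat and swallows the InvalidInputException that Hadoop throws when a glob matches 0 files:)

import java.io.IOException;
import java.util.Collections;
import java.util.List;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.lib.input.InvalidInputException;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

// Treat a glob that matches 0 files as an empty input instead of an error.
public class EmptiableTextInputFormat extends TextInputFormat {
    @Override
    protected List<FileStatus> listStatus(JobContext job) throws IOException {
        try {
            return super.listStatus(job);
        } catch (InvalidInputException e) {
            // "Input Pattern ... matches 0 files": produce no splits at all.
            return Collections.emptyList();
        }
    }
}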
Thanks
Regards,
Laurent T
----- Original Message -----
From: "Mayur Rustagi"
To: "laurent thoulon"
Sent: Wednesday, May 21, 2014 13:51:46
Subject: Re: Ignoring S3 0 files exception
You can try newAPIHadoopFile on the Spark context. Should be able to c...
No one has any idea? It's really troublesome; it seems like I have no way to
catch errors while an action is being processed and just ignore them. Here's
a bit more detail on what I'm doing:
JavaRDD<String> a = sc.textFile("s3n://" + missingFilenamePattern);
JavaRDD<String> b = sc.textFile("s3n://" + existingFilenamePattern);