Re: Spark GraphFrame ConnectedComponents

2017-01-09 Thread Ankur Srivastava
with delete. Could you by chance run just the delete to see if it fails: FileSystem.get(sc.hadoopConfiguration) .delete(new Path(somepath), true) -- From: Ankur Srivastava Sent: Thursday, January 5, 2…
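The isolated delete test suggested in this message can be sketched as a small standalone snippet. This is a sketch, not the thread's verbatim code: `somepath` is a placeholder for the real checkpoint directory, and a live SparkContext `sc` is assumed.

```scala
import org.apache.hadoop.fs.{FileSystem, Path}

// Run just the delete that ConnectedComponents performs internally,
// to see whether it fails outside the algorithm as well.
val somepath = "s3n://my-bucket/checkpoint-dir" // hypothetical path

// Note: FileSystem.get(conf) resolves the *default* filesystem
// (fs.defaultFS), not the filesystem the path belongs to.
val fs = FileSystem.get(sc.hadoopConfiguration)
fs.delete(new Path(somepath), true) // recursive delete
```

If fs.defaultFS is file:///, this resolves the local filesystem and the delete on an s3n:// path fails, which matches the "Wrong FS … expected: file:///" trace later in the thread.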

Re: Spark GraphFrame ConnectedComponents

2017-01-06 Thread Steve Loughran
FileSystem.get(sc.hadoopConfiguration) .delete(new Path(somepath), true) From: Ankur Srivastava <ankur.srivast...@gmail.com> Sent: Thursday, January 5, 2017 10:05:03 AM To: Felix Cheung Cc: user@spark.apache.org Subject:

Re: Spark GraphFrame ConnectedComponents

2017-01-05 Thread Joseph Bradley
:45:59 PM To: Felix Cheung; d...@spark.apache.org Cc: user@spark.apache.org Subject: Re: Spark GraphFrame ConnectedComponents Adding DEV mailing list to see if this is a defect with ConnectedComponent or if they can recommend any solution. Thanks

Re: Spark GraphFrame ConnectedComponents

2017-01-05 Thread Felix Cheung
From: Ankur Srivastava Sent: Thursday, January 5, 2017 3:45:59 PM To: Felix Cheung; d...@spark.apache.org Cc: user@spark.apache.org Subject: Re: Spark GraphFrame ConnectedComponents Adding DEV mailing list to see if this is a defect with ConnectedComponent or if they can recommend any solution

Re: Spark GraphFrame ConnectedComponents

2017-01-05 Thread Ankur Srivastava
FileSystem.get(sc.hadoopConfiguration) .delete(new Path(somepath), true) -- From: Ankur Srivastava Sent: Thursday, January 5, 2017 10:05:03 AM To: Felix Cheung Cc: user@spark.apache.org Subject:…

Re: Spark GraphFrame ConnectedComponents

2017-01-05 Thread Ankur Srivastava
hadoop.fs.FileSystem. Is the URL scheme for s3n registered? Does it work when you try to read from s3 from Spark? _____ From: Ankur Srivastava Sent: Wednesday, January 4, 2017 9:23 PM Subject: Re…
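The question of whether the s3n scheme is registered can be checked directly. A minimal sketch, assuming a live SparkContext `sc`, a hypothetical bucket name, and the s3n jars (hadoop-aws/jets3t in Hadoop of that era) on the classpath:

```scala
import java.net.URI
import org.apache.hadoop.fs.FileSystem

val conf = sc.hadoopConfiguration

// If the s3n scheme is registered, this resolves to the native S3
// filesystem implementation; otherwise it throws an IOException
// along the lines of "No FileSystem for scheme: s3n".
val fs = FileSystem.get(new URI("s3n://my-bucket/"), conf) // hypothetical bucket
println(fs.getClass.getName)

// The implementation can also be pinned explicitly in the configuration:
conf.set("fs.s3n.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
```

Since the thread later confirms reads and writes to S3 work, the scheme itself is registered; the problem is which filesystem instance the delete is issued against.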

Re: Spark GraphFrame ConnectedComponents

2017-01-05 Thread Felix Cheung
Felix Cheung Cc: user@spark.apache.org Subject: Re: Spark GraphFrame ConnectedComponents Yes it works to read the vertices and edges data from S3 location and is also able to write the checkpoint files to S3. It only fails when deleting the data and that is because it tries to use the default file system…
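The behavior described here (reads and writes to S3 succeed, only the delete fails) points at the delete resolving the default filesystem rather than the one the path belongs to. A hedged sketch of the usual workaround, using `Path.getFileSystem` so the filesystem is derived from the path's own scheme; `somepath` is a placeholder:

```scala
import org.apache.hadoop.fs.{FileSystem, Path}

val somepath = "s3n://my-bucket/checkpoint-dir" // hypothetical path

// The failing form: FileSystem.get(conf) returns the default filesystem
// (file:/// here), which then rejects the s3n:// path in checkPath.
// val wrongFs = FileSystem.get(sc.hadoopConfiguration)

// Deriving the filesystem from the path's URI picks the s3n
// implementation regardless of what fs.defaultFS is set to.
val path = new Path(somepath)
val fs = path.getFileSystem(sc.hadoopConfiguration)
fs.delete(path, true)
```

The same effect can be had by setting fs.defaultFS to the S3 scheme, but resolving the filesystem from the path is the more targeted fix because it leaves the default filesystem alone for everything else.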

Re: Spark GraphFrame ConnectedComponents

2017-01-05 Thread Ankur Srivastava
from s3 from Spark? _____ From: Ankur Srivastava Sent: Wednesday, January 4, 2017 9:23 PM Subject: Re: Spark GraphFrame ConnectedComponents To: Felix Cheung Cc: This is the exact trace from the driver logs Exception in thread "main"…

Re: Spark GraphFrame ConnectedComponents

2017-01-05 Thread Felix Cheung
Wednesday, January 4, 2017 9:23 PM Subject: Re: Spark GraphFrame ConnectedComponents To: Felix Cheung <felixcheun...@hotmail.com> Cc: user@spark.apache.org This is the exact trace from the driver logs Exception in thread "main" java.lang.IllegalArgumentException…

Re: Spark GraphFrame ConnectedComponents

2017-01-04 Thread Ankur Srivastava
This is the exact trace from the driver logs Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: s3n:///8ac233e4-10f9-4eb3-aa53-df6d9d7ea7be/connected-components-c1dbc2b0/3, expected: file:/// at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645) at org.apache.hado…

Re: Spark GraphFrame ConnectedComponents

2017-01-04 Thread Ankur Srivastava
Hi I am rerunning the pipeline to generate the exact trace, I have below part of trace from last run: Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: s3n://, expected: file:/// at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:642) at org.apache.hadoop.fs.RawLo…

Re: Spark GraphFrame ConnectedComponents

2017-01-04 Thread Felix Cheung
Do you have more of the exception stack? From: Ankur Srivastava Sent: Wednesday, January 4, 2017 4:40:02 PM To: user@spark.apache.org Subject: Spark GraphFrame ConnectedComponents Hi, I am trying to use the ConnectedComponent algorithm of GraphFrames but by de…
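GraphFrames' connectedComponents checkpoints intermediate results, which is where the checkpoint directory (and the failing delete) enters the picture. A hedged sketch of the typical setup; the bucket name and the toy graph are illustrative only, and the graphframes package is assumed to be on the classpath:

```scala
import org.apache.spark.sql.SparkSession
import org.graphframes.GraphFrame

val spark = SparkSession.builder().appName("cc-example").getOrCreate()

// connectedComponents checkpoints (and later cleans up) intermediate
// data under this directory, so it must live on a filesystem that
// Spark can both write to and delete from.
spark.sparkContext.setCheckpointDir("s3n://my-bucket/cc-checkpoints") // hypothetical bucket

val vertices = spark.createDataFrame(Seq((1L, "a"), (2L, "b"), (3L, "c"))).toDF("id", "name")
val edges = spark.createDataFrame(Seq((1L, 2L))).toDF("src", "dst")
val g = GraphFrame(vertices, edges)

// Each row of `result` carries a `component` id per vertex; vertices
// 1 and 2 share a component, vertex 3 sits in its own.
val result = g.connectedComponents.run()
result.select("id", "component").show()
```

The cleanup of that checkpoint directory is the delete the rest of this thread is debugging: it is issued against the default filesystem, which is why it only breaks when the checkpoint directory is on S3 while fs.defaultFS is file:///.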