RE: How to support dependency jars and files on HDFS in standalone cluster mode?

2015-06-14 Thread Dong Lei
Created https://issues.apache.org/jira/browse/SPARK-8369 and I'm working on a PR. Thanks, Dong Lei

RE: How to support dependency jars and files on HDFS in standalone cluster mode?

2015-06-10 Thread Dong Lei
… step fails. Even if I can make the first step work (using option 1), it seems the classpath on the driver is not set correctly. Thanks, Dong Lei
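For context on the driver-classpath symptom: in standalone cluster mode the driver JVM's classpath is controlled by settings such as spark.driver.extraClassPath, which expects paths readable on the machine where the driver actually runs; a raw hdfs:// URI there cannot be loaded by the JVM, which matches the symptom described above. A minimal sketch, with a placeholder local path (not from the original thread):

    import org.apache.spark.SparkConf

    // Sketch only: the jar path below is a placeholder for wherever the worker
    // has downloaded the dependency. The JVM classpath cannot resolve hdfs://
    // URIs, so the entry has to be a path that exists locally on the driver node.
    val conf = new SparkConf()
      .setAppName("hdfs-deps-driver-classpath")
      .set("spark.driver.extraClassPath", "/path/to/local/copy/of/dep.jar")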

RE: How to support dependency jars and files on HDFS in standalone cluster mode?

2015-06-10 Thread Dong Lei
Thanks Cheng. If I do not use --jars, how can I tell Spark to look for the jars (and files) on HDFS? Do you mean the driver will not need to set up an HTTP file server for this scenario, and the workers will fetch the jars and files from HDFS directly? Thanks, Dong Lei
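To make the scenario concrete, here is a minimal sketch of the configuration being discussed: pointing spark.jars and spark.files at HDFS locations so executors would fetch the dependencies from HDFS rather than from the driver's HTTP file server. The hdfs:// paths are illustrative placeholders, not paths from the original thread:

    import org.apache.spark.SparkConf

    // Illustrative configuration only; the hdfs:// paths are made-up placeholders.
    // The question in the thread is whether workers can fetch these directly from
    // HDFS, removing the need for the driver-side HTTP file server.
    val conf = new SparkConf()
      .setAppName("hdfs-dependency-example")
      .set("spark.jars",  "hdfs:///apps/myapp/lib/dep1.jar,hdfs:///apps/myapp/lib/dep2.jar")
      .set("spark.files", "hdfs:///apps/myapp/conf/app.properties")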

How to support dependency jars and files on HDFS in standalone cluster mode?

2015-06-10 Thread Dong Lei
This sounds more reasonable than option 1 for downloading files. But this way I need to read "spark.jars" and "spark.files" in downloadUserJar or DriverRunner.start and replace them with local paths. How can I do that? Do you have a more elegant solution, or is there a plan to support this in the future? Thanks, Dong Lei
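One way to picture the rewrite being asked about: a small helper that walks the comma-separated "spark.jars" value, copies any hdfs:// entries to a local working directory with the Hadoop FileSystem API, and returns the value rewritten to local paths. This is a hypothetical sketch (localizeJars is not an existing Spark method), not the approach the Spark maintainers settled on:

    import java.io.File
    import java.net.URI
    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}

    // Hypothetical helper: download each hdfs:// entry of a comma-separated
    // "spark.jars"/"spark.files" value into workDir and return the value with
    // those entries rewritten to local paths. Non-HDFS entries pass through.
    def localizeJars(jarsConf: String, workDir: File): String = {
      val hadoopConf = new Configuration()
      jarsConf.split(",").filter(_.nonEmpty).map { entry =>
        val uri = new URI(entry)
        if (uri.getScheme == "hdfs") {
          val src = new Path(uri)
          val localFile = new File(workDir, src.getName)
          FileSystem.get(uri, hadoopConf)
            .copyToLocalFile(src, new Path(localFile.getAbsolutePath))
          localFile.getAbsolutePath      // rewritten to a local path
        } else {
          entry                          // leave local / http entries as-is
        }
      }.mkString(",")
    }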