Looking at SparkSubmit#addJarToClasspath():

    uri.getScheme match {
      case "file" | "local" =>
        ...
      case _ =>
        printWarning(s"Skip remote jar $uri.")

It seems the hdfs scheme is not recognized, so the jar is skipped rather than
added to the classpath, which is why the class can't be found afterwards.
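
As a possible workaround, you could copy the jar from HDFS to the local
filesystem before submitting. Below is a minimal sketch using the standard
Hadoop FileSystem API; the paths are taken from the report below and the
FetchJar object name is only for illustration:

    import java.net.URI

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}

    object FetchJar {
      def main(args: Array[String]): Unit = {
        // Source jar in HDFS (path from the report below) and a local target.
        val src = new URI(
          "hdfs://localhost:9000/user/hdfs/jars/simple-project-1.0-SNAPSHOT.jar")
        val dst = new Path("/tmp/simple-project-1.0-SNAPSHOT.jar")

        // Open the filesystem named by the URI and copy the jar to local disk.
        val fs = FileSystem.get(src, new Configuration())
        fs.copyToLocalFile(new Path(src), dst)
        fs.close()

        // The local copy can then be passed to spark-submit, e.g.:
        //   ./bin/spark-submit --class com.example.SimpleApp --master local \
        //     /tmp/simple-project-1.0-SNAPSHOT.jar
      }
    }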

FYI

On Thu, Feb 26, 2015 at 6:09 PM, dilm <dmend...@exist.com> wrote:

> I'm trying to run a Spark application using bin/spark-submit. When I
> reference my application jar on my local filesystem, it works. However,
> when I copy my application jar to a directory in HDFS, I get the
> following
> exception:
>
> Warning: Skip remote jar
> hdfs://localhost:9000/user/hdfs/jars/simple-project-1.0-SNAPSHOT.jar.
> java.lang.ClassNotFoundException: com.example.SimpleApp
>
> Here's the command:
>
> $ ./bin/spark-submit --class com.example.SimpleApp --master local
> hdfs://localhost:9000/user/hdfs/jars/simple-project-1.0-SNAPSHOT.jar
>
> I'm using Hadoop version 2.6.0 and Spark version 1.2.1.
>
> In the official documentation, it states that: "application-jar: Path to
> a bundled jar including your application and all dependencies. The URL
> must be globally visible inside of your cluster, for instance, an
> *hdfs:// path* or a file:// path that is present on all nodes." So I'm
> thinking this might be a valid bug?
>
