Hi Gerard,

We're using the Spark Job Server in production, built from the GitHub master
branch and running against a recent Spark 1.0 snapshot, so it definitely
works.  I'm afraid the only time we've seen a similar error was an
unfortunate case of PEBKAC <http://en.wikipedia.org/wiki/User_error>.

First and foremost, have you tried doing an unzip -l on the JAR uploaded to
the server, i.e.
"/tmp/spark-jobserver/filedao/data/test-2014-05-22T18:44:09.254+02:00.jar",
to make sure the class is where you're expecting it to be?
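
For example, something along these lines (just a rough sketch, using the JAR
path from your logs):

  unzip -l /tmp/spark-jobserver/filedao/data/test-2014-05-22T18:44:09.254+02:00.jar \
    | grep WordCountExample

If the class is packaged the way the classPath parameter expects, you should
see entries along the lines of spark/jobserver/WordCountExample.class (plus
WordCountExample$.class for the companion object). jar tf works just as well
if you only have a JDK on the box.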

It's not uncommon for a package statement to be dropped or left stale when
moving classes around in an IDE like Eclipse, in which case the class ends up
somewhere other than where the classPath parameter says it is.
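
For reference, a minimal job the server can find looks roughly like this.
It's a sketch from memory of the API on master (the SparkJob trait,
SparkJobValidation and SparkJobValid come from the job server itself, and
"input.string" is just an illustrative config key), so treat it as a shape to
compare against rather than gospel:

  package spark.jobserver

  import com.typesafe.config.Config
  import org.apache.spark.SparkContext

  // The package + object name must match the classPath parameter
  // (spark.jobserver.WordCountExample) exactly. The server tries the
  // companion object first, which is the WordCountExample$ you can
  // see in your logs.
  object WordCountExample extends SparkJob {

    // Cheap sanity check run before the job itself.
    override def validate(sc: SparkContext, config: Config): SparkJobValidation =
      SparkJobValid

    // The actual work: count words in the "input.string" config value.
    override def runJob(sc: SparkContext, config: Config): Any =
      sc.parallelize(config.getString("input.string").split(" ").toSeq)
        .countByValue()
  }

If the package line is missing or still says something else, the classPath
spark.jobserver.WordCountExample won't resolve even though the JAR uploads
and the class loader adds it happily.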

Best,

Michael




Michael Cutler
Founder, CTO

Mobile: +44 789 990 7847
Email:  mich...@tumra.com <mich...@tumra.com>
Web:    tumra.com <http://tumra.com/?utm_source=signature&utm_medium=email>

Visit us at our offices in Chiswick Park <http://goo.gl/maps/abBxq>
Registered in England & Wales, 07916412. VAT No. 130595328


On 22 May 2014 18:25, Gerard Maas <gerard.m...@gmail.com> wrote:

> Hi,
>
> I'm starting to explore the Spark Job Server contributed by Ooyala [1],
> running from the master branch.
>
> I started by developing and submitting a simple job, and the JAR check gave
> me errors on a seemingly good jar. I disabled the fingerprint checking on
> the jar and could then submit it, but when I tried to submit the job, it
> could not find its classpath. Therefore I decided to take a couple of
> steps back and go through the example in the docs.
>
> Using the (Hello)WordCount example, the upload is OK and the jar is in the UI
> as well, but when I submit the job, I get the same classPath not found error
> as before:
>
> 19:07 $ curl -d ""
> 'localhost:8090/jobs?appName=test&classPath=spark.jobserver.WordCountExample'
> {
>   "status": "ERROR",
>   "result": "classPath spark.jobserver.WordCountExample not found"
> }
>
> I'm not sure where it goes wrong.  Here's what seems to be the relevant
> snippet in the server logs:
>
> [2014-05-22 19:17:28,891] INFO  .apache.spark.SparkContext []
> [akka://JobServer/user/context-
> supervisor/666d021a-spark.jobserver.WordCountExample] - Added JAR
> /tmp/spark-jobserver/filedao/data/test-2014-05-22T18:44:09.254+02:00.jar at
> http://172.17.42.1:37978/jars/test-2014-05-22T18:44:09.254+02:00.jar with
> timestamp 1400779048891
> [2014-05-22 19:17:28,891] INFO  util.ContextURLClassLoader []
> [akka://JobServer/user/context-supervisor/666d021a-spark.jobserver.WordCountExample]
> - Added URL
> file:/tmp/spark-jobserver/filedao/data/test-2014-05-22T18:44:09.254+02:00.jar
> to ContextURLClassLoader
> [2014-05-22 19:17:28,891] INFO  spark.jobserver.JarUtils$ []
> [akka://JobServer/user/context-supervisor/666d021a-spark.jobserver.WordCountExample]
> - Loading object spark.jobserver.WordCountExample$ using loader
> spark.jobserver.util.ContextURLClassLoader@5deae1b7
> [2014-05-22 19:17:28,892] INFO  spark.jobserver.JarUtils$ []
> [akka://JobServer/user/context-supervisor/666d021a-spark.jobserver.WordCountExample]
> - Loading class spark.jobserver.WordCountExample using loader
> spark.jobserver.util.ContextURLClassLoader@5deae1b7
>
> ***** all OK until here and then ...*****
>
> [2014-05-22 19:17:28,892] INFO  ocalContextSupervisorActor []
> [akka://JobServer/user/context-supervisor] - Shutting down context
> 666d021a-spark.jobserver.WordCountExample
>
> Any ideas? Something silly I might be doing?  BTW, I'm running in dev mode
> using sbt and the default (local) config.
>
> -kr, Gerard.
>
>
> [1] https://github.com/ooyala/spark-jobserver
>
