Ah, of course, there is no application jar in spark-shell, so it seems there is no workaround for this at the moment. We will look into a fix shortly, but for now you will have to create an application and use spark-submit (or use spark-shell on Linux).
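For reference, a rough sketch of what that application could look like (untested; MyApp, Greeter, and the jar path are placeholder names I made up, not anything from your setup):

    // MyApp.scala -- minimal application wrapping the simple class
    // that previously lived in my.jar (Greeter is a stand-in for it)
    import org.apache.spark.{SparkConf, SparkContext}

    class Greeter extends Serializable {
      def greet(name: String): String = "Hello, " + name
    }

    object MyApp {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("MyApp"))
        val greeter = new Greeter
        // Use the class inside a job to verify it is on the classpath
        sc.parallelize(Seq("Alexander", "Andrew"))
          .map(greeter.greet)
          .collect()
          .foreach(println)
        sc.stop()
      }
    }

Package it into a jar (e.g. with sbt package) and run it on Windows with something like:

    bin\spark-submit.cmd --class MyApp --master local[*] target\scala-2.10\myapp_2.10-0.1.jar

(the exact jar path depends on your build). Because the class ships inside the application jar itself, spark-submit puts it on the classpath without relying on the broken --jars handling.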
2014-06-11 10:42 GMT-07:00 Ulanov, Alexander <alexander.ula...@hp.com>:

> Could you elaborate on this? I don't have an application, I just use the
> spark shell.
>
> *From:* Andrew Or [mailto:and...@databricks.com]
> *Sent:* Wednesday, June 11, 2014 9:40 PM
> *To:* user@spark.apache.org
> *Subject:* Re: Adding external jar to spark-shell classpath in spark 1.0
>
> This is a known issue: https://issues.apache.org/jira/browse/SPARK-1919.
> We haven't found a fix yet, but for now you can work around this by
> including your simple class in your application jar.
>
> 2014-06-11 10:25 GMT-07:00 Ulanov, Alexander <alexander.ula...@hp.com>:
>
> Hi,
>
> I am currently using Spark 1.0 locally on Windows 7. I would like to use
> classes from an external jar in the spark-shell. I followed the
> instructions in:
> http://mail-archives.apache.org/mod_mbox/spark-user/201402.mbox/%3CCALrNVjWWF6k=c_jrhoe9w_qaacjld4+kbduhfv0pitr8h1f...@mail.gmail.com%3E
>
> I have set ADD_JARS="my.jar" SPARK_CLASSPATH="my.jar" in spark-shell.cmd,
> but this didn't work.
>
> I also tried running "spark-shell.cmd --jars my.jar --driver-class-path
> my.jar --driver-library-path my.jar" and it didn't work either.
>
> I cannot load any class from my jar into the spark shell. Btw, my.jar
> contains a simple Scala class.
>
> Best regards,
> Alexander