Update:
The issue in my previous post was solved:
I had to change the sbt file name from .sbt to build.sbt.
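For anyone hitting the same thing: sbt only picks up `*.sbt` build files in the project root, so a file without the `.sbt` extension (or named only `.sbt`) is silently ignored. A minimal sketch of what a working `build.sbt` might look like for this thread's example — the name, versions, and scopt coordinates below are illustrative assumptions, not taken from the original post:

```scala
// build.sbt (must live in the project root and end in .sbt)
name := "someclass"

version := "1.0"

scalaVersion := "2.10.4"

// scopt must be declared explicitly; Spark does not ship it for user code
libraryDependencies += "com.github.scopt" %% "scopt" % "3.2.0"
```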
-
Thanks!
-Caron
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/scopt-OptionParser-tp8436p20581.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Hi,
I ran into the same problem posted in this thread earlier when I tried to
write my own program:
"$ spark-submit --class "someClass" --master local[4]
target/scala-2.10/someclass_2.10-1.0.jar " gives me
"Exception in thread "main" java.lang.NoClassDefFoundError:
scopt/OptionParser"
I tried yo
Thanks for posting the solution! You can also append `% "provided"` to
the `spark-mllib` dependency line and remove `spark-core` (because
spark-mllib already depends on spark-core) to make the assembly jar
smaller. -Xiangrui
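Xiangrui's suggestion, sketched as a `libraryDependencies` block. This is an illustrative assumption (version numbers and scopt coordinates are mine, not from the thread): `spark-mllib` is marked `"provided"` so the cluster's own Spark jars supply it at runtime and it stays out of the assembly, while scopt, which Spark does not provide, is bundled normally:

```scala
// Hypothetical build.sbt fragment: spark-core is omitted because
// spark-mllib already pulls it in transitively.
libraryDependencies ++= Seq(
  // "provided": compiled against, but excluded from the assembly jar
  "org.apache.spark" %% "spark-mllib" % "1.0.2" % "provided",
  // bundled into the assembly, which fixes the NoClassDefFoundError
  "com.github.scopt" %% "scopt" % "3.2.0"
)
```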
On Fri, Aug 8, 2014 at 10:05 AM, SK wrote:
> I was using sbt package when I got this error. Then I switched to using
> sbt assembly and that solved the issue.
I was using sbt package when I got this error. Then I switched to using sbt
assembly and that solved the issue. To run "sbt assembly", you need to have
a file called plugins.sbt in the project/ directory (under the project root),
and it has the following line:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
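Putting the pieces of this fix together, the build-and-run sequence might look like the following. This is a sketch under the assumptions used elsewhere in the thread; the jar name depends on your project name and the sbt-assembly defaults, so check `target/scala-2.10/` for the actual filename:

```shell
# project/plugins.sbt must contain the addSbtPlugin line above.
# Build a fat jar that bundles scopt (unlike plain `sbt package`):
sbt assembly

# Submit the assembly jar; scopt/OptionParser is now on the classpath:
spark-submit --class "someClass" --master "local[4]" \
  target/scala-2.10/someclass-assembly-1.0.jar
```

The root cause is that `sbt package` produces a jar containing only your own classes, so third-party dependencies like scopt are missing at runtime; `sbt assembly` folds them in.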
Hi,
I tried to develop some code to use Logistic Regression, following the code
in BinaryClassification.scala in examples/mllib. My code compiles, but at
runtime it complains that the scopt/OptionParser class cannot be found. I have
the following import statement in my code:
import scopt.OptionParser
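For context, the scopt 3.x usage pattern that BinaryClassification.scala follows looks roughly like the sketch below. The app name, `Params` fields, and option names here are my own illustrative assumptions, not the original poster's code; the point is that the import alone is not enough at runtime unless the scopt jar is actually on the classpath:

```scala
import scopt.OptionParser

object MyApp {
  // Default values double as the parser's starting configuration.
  case class Params(input: String = "", numIterations: Int = 100)

  def main(args: Array[String]): Unit = {
    val parser = new OptionParser[Params]("MyApp") {
      head("MyApp: a minimal scopt 3.x sketch.")
      opt[Int]("numIterations")
        .text("number of iterations")
        .action((x, c) => c.copy(numIterations = x))
      arg[String]("<input>")
        .required()
        .text("input path to labeled examples")
        .action((x, c) => c.copy(input = x))
    }

    parser.parse(args, Params()) match {
      case Some(params) => println(s"running with $params")
      case None         => sys.exit(1) // scopt has already printed usage
    }
  }
}
```

Compiling succeeds because scopt is on the compile classpath, but a plain `sbt package` jar does not include it, which is exactly why the error only appears at runtime.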