I have used Breeze fine with the Scala shell:

scala -cp ./target/spark-mllib_2.10-1.3.0-SNAPSHOT.jar:/Users/v606014/.m2/repository/com/github/fommil/netlib/core/1.1.2/core-1.1.2.jar:/Users/v606014/.m2/repository/org/jblas/jblas/1.2.3/jblas-1.2.3.jar:/Users/v606014/.m2/repository/org/scalanlp/breeze_2.10/0.10/breeze_2.10-0.10.jar:/Users/v606014/.m2/repository/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar:/Users/v606014/.m2/repository/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1.jar org.apache.spark.mllib.optimization.QuadraticMinimizer 100 1 1.0 0.99

For spark-shell, my assumption is that the equivalent classpath options should work fine; note that spark-shell takes --jars (and --driver-class-path) rather than -cp.
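As a rough sketch (the jar paths and versions below are assumptions based on the local Maven repository layout above, not verified), passing both breeze and its commons-math3 dependency to spark-shell might look like:

```shell
# Hypothetical invocation; adjust paths/versions to your local ~/.m2 repository.
# --jars ships the jars to the executors (comma-separated list);
# --driver-class-path puts them on the driver classpath (colon-separated).
JARS=/Users/v606014/.m2/repository/org/scalanlp/breeze_2.10/0.10/breeze_2.10-0.10.jar,\
/Users/v606014/.m2/repository/org/apache/commons/commons-math3/3.2/commons-math3-3.2.jar

spark-shell --jars "$JARS" \
  --driver-class-path "${JARS//,/:}"
```

The `${JARS//,/:}` expansion just rewrites the comma-separated --jars list into the colon-separated form the driver classpath expects.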

On Thu, Nov 27, 2014 at 9:15 AM, Dean Jones <dean.m.jo...@gmail.com> wrote:

> Hi,
>
> I'm trying to use the breeze library in the spark scala shell, but I'm
> running into the same problem documented here:
>
> http://apache-spark-user-list.1001560.n3.nabble.com/org-apache-commons-math3-random-RandomGenerator-issue-td15748.html
>
> As I'm using the shell, I don't have a pom.xml, so the solution
> suggested in that thread doesn't work for me. I've tried the
> following:
>
> - adding commons-math3 using the "--jars" option
> - adding both breeze and commons-math3 using the "--jar" option
> - using the "spark.executor.extraClassPath" option on the cmd line as
> follows: --conf "spark.executor.extraClassPath=commons-math3-3.2.jar"
>
> None of these are working for me. Any thoughts on how I can get this
> working?
>
> Thanks,
>
> Dean.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
