Hi Xiang,

This error also appears in client mode (perhaps the situation you were
referring to, the one that worked, was local mode?). However, the
error is expected and is not a bug.

This line in your snippet:
    object Main extends A[String] { //...
is, after desugaring, equivalent to:
    object Main extends A[String]()(Env.spark.implicits.newStringEncoder) { //...
Essentially, when the singleton object `Main` is initialised, it
evaluates all of its superclass's arguments, i.e. it calls
`Env.spark.implicits.newStringEncoder`. Since your `main` method is
also defined in this object, the object is initialised as soon as your
application starts, that is, before a Spark session has been started.
The "problem" is that encoders require an active session, and hence
you have an initialisation-order problem. (You can reproduce the
problem simply by defining a
`val x = Env.spark.implicits.newStringEncoder` in your singleton
object; see the sketch below.)
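
For illustration, a minimal sketch of that reproduction (`Repro` is a
made-up name; `Env.spark` is assumed to be the session from your
snippet):

    // Hypothetical minimal reproduction; assumes Env.spark as in your snippet.
    object Repro {
      // Evaluated during object initialisation, i.e. before main() runs
      // and, on a submitted application, before any session is active:
      val x = Env.spark.implicits.newStringEncoder

      def main(args: Array[String]): Unit = {
        // never reached: initialisation of Repro fails first
      }
    }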

The error message is odd and not very helpful (I think this is due to
the way Spark uses ClassLoaders internally when running a submitted
application), but it isn't a bug in Spark.

In local mode you don't experience the issue because a session is
started the first time the session builder is accessed via
`Env.spark`.

Aside from the errors you're getting, there's another subtlety in your
snippet that may bite you later: adding `T : Encoder` to your
superclass has no effect as long as you also import
`Env.spark.implicits._`, since the import already supplies the same
encoders.
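
One way to sidestep the initialisation-order problem is to require the
encoder in the methods that actually need it instead of in the
constructor, so nothing is resolved until `main` runs and a session
exists. A rough sketch (`toDataset` is a hypothetical helper; the
other names follow your snippet):

    import org.apache.spark.sql.{Dataset, Encoder}

    abstract class A[T] {
      // The encoder is demanded at the call site, not during object
      // initialisation:
      def toDataset(xs: Seq[T])(implicit enc: Encoder[T]): Dataset[T] =
        Env.spark.createDataset(xs)
    }

    object Main extends A[String] {
      def main(args: Array[String]): Unit = {
        // Resolved here, after a session has been started:
        import Env.spark.implicits._
        toDataset(Seq("a", "b")).show()
      }
    }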

best,
--Jakob


On Sat, Sep 17, 2016 at 8:26 PM, Xiang Gao <qasdfgtyu...@gmail.com> wrote:
> Yes. Besides, if you change the "T : Encoder" to "T", it's OK too.
>
