That's a great idea, and it's a real pain point for some users. However, this can't be fully solved at compile time, because what actually needs to be serialized is only determined at runtime.
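To illustrate the runtime nature of the problem, here is a minimal plain-Scala sketch (no Spark; the `Doubler`/`DoublerObj` names are made up for the example). A closure that calls a method of an ordinary class captures `this`, so serializing it drags the whole instance along and fails unless the class is `Serializable` — which is roughly the check Spark performs before shipping a task. A singleton object is reached through a static reference instead, so nothing extra is captured:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

object SerializableClosures {
  // Roughly what happens when a task is shipped: try to
  // Java-serialize the closure, and report whether it succeeded.
  def isSerializable(f: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream).writeObject(f)
      true
    } catch {
      case _: NotSerializableException => false
    }

  // A plain (non-Serializable) class: a lambda calling one of its
  // methods captures `this`, making the closure unserializable.
  class Doubler {
    def double(x: Int): Int = x * 2
    def makeClosure: Int => Int = x => double(x) // captures `this`
  }

  // A singleton object is accessed statically, so a lambda calling
  // into it captures nothing and serializes fine.
  object DoublerObj {
    def double(x: Int): Int = x * 2
  }

  def main(args: Array[String]): Unit = {
    val fromClass  = new Doubler().makeClosure
    val fromObject = (x: Int) => DoublerObj.double(x)

    println(isSerializable(fromClass))  // false: closure captured a Doubler
    println(isSerializable(fromObject)) // true: no captured instance
  }
}
```

The compiler can't flag the first case in general, because whether a given closure ends up being serialized depends on which code path actually runs.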
There are some efforts in Scala to help users avoid mistakes like this. One example project, on the more research-oriented side, is Spores: http://docs.scala-lang.org/sips/pending/spores.html

On Sun, Nov 16, 2014 at 4:12 PM, jay vyas <jayunit100.apa...@gmail.com> wrote:
> This is more a curiosity than an immediate problem.
>
> Here is my question: I ran into this easily solved issue
> http://stackoverflow.com/questions/22592811/task-not-serializable-java-io-notserializableexception-when-calling-function-ou
> recently. The solution was to replace my "class" with a scala singleton,
> which i guess is readily serializable.
>
> So it's clear that spark needs to serialize objects which carry the driver
> methods for an app, in order to run... but I'm wondering, maybe there is
> a way to change or update the spark API to catch unserializable spark apps
> at compile time?
>
> --
> jay vyas