Hi,
Is there any way to prepare a Spark executor, like the setup and cleanup
methods we implement in MapReduce?
In my case, I need this prepare step to initialize a StaticParser based on
the environment (dev or production). Then I can use this StaticParser
directly on the executor, like this:
object Sta
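The snippet above is cut off, but one common way to get per-executor setup in Spark is lazy initialization inside a singleton object. A minimal sketch of that idea, assuming the poster's StaticParser takes the environment name (the class, the `app.env` property, and all method names here are illustrative, not from the original message):

```scala
// Hypothetical StaticParser: construction depends on the environment.
class StaticParser(env: String) {
  def parse(s: String): String = s"[$env] parsed: $s"
}

object StaticParserHolder {
  // A `lazy val` in an object is initialized on first access within
  // each JVM, so each executor builds its own parser exactly once --
  // this plays the role of MapReduce's setup().
  lazy val parser: StaticParser = {
    val env = sys.props.getOrElse("app.env", "dev") // assumed config source
    new StaticParser(env)
  }
}
```

In a transformation such as `rdd.map(x => StaticParserHolder.parser.parse(x))`, only the reference to the object is captured by the closure; the parser itself is constructed on the executor, so it never needs to be serializable.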
We now have a way to work around this.
For classes that can't easily be made serializable, we wrap the class in a
Scala object.
For example:
class A {} // This class is not serializable.

object AHolder {
  private val a: A = new A()
  def get: A = a
}
This wo
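A fleshed-out sketch of the holder pattern described above (the class body and method names are illustrative). Because a Scala object is initialized lazily, once per JVM, `AHolder.get` constructs `A` on each executor rather than shipping an instance from the driver:

```scala
// A stands in for a class we cannot make serializable.
class A {
  def greet(name: String): String = s"hello, $name"
}

object AHolder {
  // Created on first call to `get` in each JVM (driver or executor).
  private lazy val a: A = new A()
  def get: A = a
}

// In a Spark closure, only the stable reference to AHolder is
// captured, not an instance of A, so no NotSerializableException:
//   rdd.map(x => AHolder.get.greet(x))
```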
Does this mean that every class I use in Spark must be serializable? Even
the classes that I depend on?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/java-io-NotSerializableException-Of-dependent-Java-lib-tp1973p2006.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.