This worked great. Thanks a lot
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Java-API-Serialization-Issue-tp1460p3178.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
I can suggest two things:
1. When creating the worker or submitting the task, make sure you are not
capturing any unwanted external class resource (one that is not used in the
closure and is not serializable).
2. If that is ensured and you still get an issue from a 3rd party library,
you can make that 3rd party variable transient.
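To illustrate the second suggestion, here is a minimal, self-contained sketch using plain `java.io` serialization (the same mechanism that triggers Spark's NotSerializableException). The `ThirdPartyClient` and `Worker` classes are hypothetical stand-ins: the non-serializable third-party object is held in a `transient` field so it is skipped during serialization, and re-created lazily after deserialization.

```java
import java.io.*;

public class TransientDemo {
    // Hypothetical stand-in for a non-serializable third-party class.
    static class ThirdPartyClient {
        String connect() { return "connected"; }
    }

    // The function object shipped to workers must be Serializable;
    // marking the third-party field transient keeps it out of the stream.
    static class Worker implements Serializable {
        private transient ThirdPartyClient client;

        String process(String input) {
            if (client == null) {              // re-create lazily on the worker side
                client = new ThirdPartyClient();
            }
            return client.connect() + ":" + input;
        }
    }

    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        Worker w = new Worker();
        w.process("warm-up");                  // client initialized locally
        byte[] bytes = serialize(w);           // succeeds: transient field is skipped
        System.out.println("serialized " + bytes.length + " bytes");

        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            Worker copy = (Worker) ois.readObject();
            // client is null after deserialization and gets re-created on first use
            System.out.println(copy.process("data"));
        }
    }
}
```

Without the `transient` keyword, `serialize(w)` would throw `java.io.NotSerializableException` for `ThirdPartyClient` once the field is non-null, which is exactly the failure mode reported below.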
I am also facing the same problem. I have implemented Serializable for my
own code, but the exception is thrown from third party libraries over which
I have no control.
Exception in thread "main" org.apache.spark.SparkException: Job aborted:
Task not serializable: java.io.NotSerializableException: (li