Upgrading from spark-1.0.2-hadoop2 to spark-1.1.0-hadoop1 fixed my problem.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-run-SimpleApp-as-regular-Java-app-tp13695p14570.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
o class loader
I've been assuming that my conf.setJars() is the proper way to provide my
code to Spark.
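For context, that setJars() approach can be sketched as below. This is a minimal sketch, not code from this thread: the app name, master URL, and jar path are placeholder assumptions. SparkConf.setJars() lists the jars to ship to the cluster, which the executors then add to their class loader.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Sketch of a standalone Java driver configuring its own jars.
// The master URL and jar path are placeholders, not values from the thread.
public class SimpleApp {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("SimpleApp")
                .setMaster("spark://master-host:7077")              // placeholder master URL
                .setJars(new String[] { "target/simple-app.jar" }); // jar containing this class
        JavaSparkContext sc = new JavaSparkContext(conf);
        // Job code goes here; the jar listed above is shipped to the
        // executors and added to their class loader at runtime.
        sc.stop();
    }
}
```

This is configuration only; whether it works also depends on the Spark/Hadoop build matching the cluster, as noted earlier in the thread.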
Thanks!
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1004)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1872)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)

at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1970)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1894)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
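The frames above come from the JDK's built-in object deserialization. A minimal, self-contained sketch of that readObject path follows; the Point class and its values are hypothetical stand-ins for whatever job class failed to deserialize, not code from this thread.

```java
import java.io.*;

// Sketch of the deserialization path in the quoted stack trace:
// ObjectInputStream.readObject walks readOrdinaryObject -> readSerialData ->
// defaultReadFields while rebuilding the object. A class-version or classpath
// mismatch between driver and worker surfaces as an exception in those frames.
public class SerializationDemo {
    static class Point implements Serializable {
        private static final long serialVersionUID = 1L; // pin so both sides agree
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    static byte[] toBytes(Object o) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bos.toByteArray();
    }

    static Object fromBytes(byte[] b) {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(b))) {
            return ois.readObject(); // origin of the quoted stack frames
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        Point p = (Point) fromBytes(toBytes(new Point(3, 4)));
        System.out.println(p.x + "," + p.y); // prints 3,4 when the classes match
    }
}
```

When the class bytes on both ends match (same serialVersionUID, same fields), the round trip succeeds; when they don't, readObject throws from exactly the frames quoted above.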
It seems like a serialization problem because there