You could also try to put transform in a companion object.
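Something roughly like this (a sketch only; class and method names are made up since your actual code isn't shown in the thread):

    import org.apache.spark.streaming.dstream.DStream

    class MyJob {
      def process(lines: DStream[String]): DStream[Int] =
        // Referencing a companion-object method does not capture `this`,
        // so only the small function value is serialized, not the class.
        lines.map(MyJob.parse)
    }

    object MyJob {
      // Lives on the companion object: no enclosing instance to drag along.
      def parse(line: String): Int = line.length
    }
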
On Fri, 8 Apr 2016 16:48, mpawashe [via Apache Spark User List] wrote:
> The class declaration is already marked Serializable ("with Serializable")
The class declaration is already marked Serializable ("with Serializable")
You can declare your class Serializable, as Spark will want to serialize the
whole class.
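A minimal sketch of what that means (names here are illustrative, not from the original post): using an instance method inside a DStream operation captures `this`, so the whole enclosing class gets serialized and must therefore be Serializable.

    import org.apache.spark.streaming.dstream.DStream

    // Illustrative only: because `tag` is an instance method used inside
    // map(), Spark ships the whole enclosing instance to the executors,
    // so the class has to extend (or mix in) Serializable.
    class MyStreamingJob(prefix: String) extends Serializable {
      def tag(line: String): String = prefix + line

      def process(lines: DStream[String]): DStream[String] =
        lines.map(tag) // uses this.tag => `this` gets serialized
    }
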
Hi. I am using Scala 2.10.4 and 1.6.0 for the Spark-related dependencies. I am
also using spark-streaming-kafka and including kafka (0.8.1.1), which apparently
is needed for the deserializers.
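For reference, the corresponding sbt dependencies look roughly like this (a sketch; the exact artifact names are my assumption for a Scala 2.10 build):

    // build.sbt (sketch)
    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming"       % "1.6.0" % "provided",
      "org.apache.spark" %% "spark-streaming-kafka" % "1.6.0",
      "org.apache.kafka" %% "kafka"                 % "0.8.1.1"
    )
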
> On Apr 4, 2016, at 6:18 PM, Ted Yu wrote:
>
> bq. I'm on version 2.10 for spark
>
> The above is the Scala version.
bq. I'm on version 2.10 for spark
The above is the Scala version.
Can you give us the Spark version?
Thanks
On Mon, Apr 4, 2016 at 2:36 PM, mpawashe wrote:
> Hi all,
>
> I am using the Spark Streaming API (I'm on version 2.10 for spark and
> streaming), and I am running into a function serialization issue (a
> NotSerializableException).
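(The original code isn't shown in full, but the usual shape of the problem this thread is about looks something like the following sketch; this is a hypothetical reconstruction of the failing pattern, not the poster's actual code.)

    import org.apache.spark.streaming.dstream.DStream

    class StreamProcessor {                  // note: not Serializable
      def clean(line: String): String = line.trim

      def run(lines: DStream[String]): DStream[String] =
        // `clean` is an instance method, so the closure captures `this`;
        // Spark's closure cleaner rejects the task with "Task not
        // serializable", caused by java.io.NotSerializableException.
        lines.map(line => clean(line))
    }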