I'm writing a Clojure DSL for Spark. I use Kryo to serialize my Clojure
functions, and for efficiency I hook into Spark's Kryo serializer. To do
that I get a SerializerInstance from SparkEnv and call its serialize and
deserialize methods. I was able to work around the ClassTag requirement by
constructing a ClassTag object in Clojure, but it's less than ideal.
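
For reference, the workaround looks roughly like this (a sketch against
Spark 1.0's SparkEnv and SerializerInstance internals; the helper names
are just illustrative):

    ;; Sketch of the workaround: build a ClassTag in Clojure and pass it
    ;; explicitly to Spark's SerializerInstance.
    (import '(org.apache.spark SparkEnv)
            '(scala.reflect ClassTag$))

    ;; A ClassTag for java.lang.Object -- the least specific tag, which is
    ;; all we can offer for arbitrary Clojure functions.
    (def object-class-tag
      (.apply ClassTag$/MODULE$ Object))

    (defn kryo-serialize
      "Serialize obj with Spark's configured serializer; returns a ByteBuffer."
      [obj]
      (-> (SparkEnv/get) .serializer .newInstance
          (.serialize obj object-class-tag)))

    (defn kryo-deserialize
      "Deserialize a ByteBuffer produced by kryo-serialize."
      [byte-buffer]
      (-> (SparkEnv/get) .serializer .newInstance
          (.deserialize byte-buffer object-class-tag)))

Arities of serialize/deserialize that defaulted the ClassTag would let
callers skip that boilerplate entirely.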


On Sun, Jun 1, 2014 at 4:25 PM, Matei Zaharia <matei.zaha...@gmail.com>
wrote:

> BTW, passing a ClassTag tells the Serializer at compile time what type of
> object is being serialized, which allows for more efficient serializers
> (especially on streams).
>
> Matei
>
> On Jun 1, 2014, at 4:24 PM, Matei Zaharia <matei.zaha...@gmail.com> wrote:
>
> > Why do you need to call Serializer from your own program? It's an
> > internal developer API so ideally it would only be called to extend
> > Spark. Are you looking to implement a custom Serializer?
> >
> > Matei
> >
> > On Jun 1, 2014, at 3:40 PM, Soren Macbeth <so...@yieldbot.com> wrote:
> >
> >> https://github.com/apache/spark/blob/v1.0.0/core/src/main/scala/org/apache/spark/serializer/Serializer.scala#L64-L66
> >>
> >> These changes to SerializerInstance make it really gross to call
> >> serialize and deserialize from non-Scala languages. I'm not sure what
> >> the purpose of a ClassTag is, but some additional arities that don't
> >> require ClassTags would help a ton.
> >
>
>
