Hi Jakob,
I had multiple versions of Spark installed on my machine. The code now
works without issues in spark-shell and the IDE. I have verified this
with Spark 1.6 and 2.0.
Cheers,
Kabeer.
On Mon, 3 Oct, 2016 at 7:30 PM, Jakob Odersky wrote:
Hi Kabeer,
which version of Spark are you using?
Hi Kabeer,
which version of Spark are you using? I can't reproduce the error in
the latest Spark master.
regards,
--Jakob
I have had a quick look at Maciej's query. I see different behaviour
when running the piece of code in spark-shell versus running it as a
Spark app.
1. While running in the spark-shell, I see the serialization error that
Maciej has reported.
2. But while running the sa
Thanks guys.
This is not a big issue in general. It is more of an annoyance, and it
can be rather confusing when encountered for the first time.
On 09/29/2016 02:05 AM, Jakob Odersky wrote:
> I agree with Sean's answer, you can check out the relevant serializer
> here
> https://github.com/twitter/chill/blob/develop/chill-scala/src/main/scala/com/twitter/chill/Traversable.scala
I agree with Sean's answer, you can check out the relevant serializer
here
https://github.com/twitter/chill/blob/develop/chill-scala/src/main/scala/com/twitter/chill/Traversable.scala
On Wed, Sep 28, 2016 at 3:11 AM, Sean Owen wrote:
> My guess is that Kryo specially handles Maps generically or relies on
> some mechanism that does.
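
For anyone following that link, here is a simplified, self-contained sketch
(not chill's actual TraversableSerializer) of why an element-by-element
serializer that rebuilds the collection through a plain builder drops the
withDefaultValue wrapper:

// Simplified illustration only: write out the entries, rebuild with a
// plain builder. This is roughly what an element-wise serializer does,
// and the Map.WithDefault wrapper never survives the rebuild.
object BuilderRoundTrip {
  def roundTrip[K, V](m: Map[K, V]): Map[K, V] = {
    val builder = Map.newBuilder[K, V] // fresh builder, knows nothing about defaults
    m.foreach(builder += _)            // only key/value pairs are copied
    builder.result()                   // plain Map, not Map.WithDefault
  }

  def main(args: Array[String]): Unit = {
    val withDefault = Map(1 -> "a").withDefaultValue("missing")
    println(withDefault(42))           // prints "missing"

    val rebuilt = roundTrip(withDefault)
    println(rebuilt.get(42))           // prints None; rebuilt(42) would throw
  }
}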
My guess is that Kryo specially handles Maps generically or relies on
some mechanism that does, and it happens to iterate over all
key/values as part of that, and of course there aren't actually any
key/values in the map. Java serialization is a much more literal
(expensive) field-by-field serialization.
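
To see the contrast Sean describes, one can push the same kind of map
through Spark's JavaSerializer, which copies the object graph literally
and keeps the default. This is a sketch against the serializer API of
that era, not code from the original thread:

import org.apache.spark.SparkConf
import org.apache.spark.serializer.JavaSerializer

object JavaSerializerKeepsDefault {
  def main(args: Array[String]): Unit = {
    // Java serialization copies the whole Map.WithDefault wrapper,
    // so the default value should survive the round trip.
    val ser = new JavaSerializer(new SparkConf()).newInstance()
    val m = Map(1 -> "a").withDefaultValue("missing")

    val roundTripped = ser.deserialize[Map[Int, String]](ser.serialize(m))
    println(roundTripped(42)) // expected: "missing"
  }
}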
Hi everyone,
I suspect there is no point in submitting a JIRA to fix this (not a
Spark issue?), but I would like to know if this problem is documented
anywhere. Somehow Kryo is losing the default value during serialization:
scala> import org.apache.spark.{SparkContext, SparkConf}
import org.apache.spark.{SparkContext, SparkConf}
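
A minimal reproduction of the reported behaviour along these lines (an
illustrative sketch, not the original snippet) can be pasted into
spark-shell:

import org.apache.spark.SparkConf
import org.apache.spark.serializer.KryoSerializer

// Round-trip a map with a default value through Spark's KryoSerializer.
val ser = new KryoSerializer(new SparkConf()).newInstance()
val m = Map(1 -> "a").withDefaultValue("missing")

m(42)                        // "missing" before serialization
val roundTripped = ser.deserialize[Map[Int, String]](ser.serialize(m))
roundTripped.get(42)         // None on affected versions; the default is gone
// roundTripped(42)          // would then throw NoSuchElementException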