>>> });
>>>
>>> Long numberCalls = totalCounts.value();
>>
>> I believe the mistake is to pass the accumulator to the function rather
>> than letting the function find the accumulator - I do this in this case
>> by using a final local variable
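[Editor's note: a minimal, self-contained sketch of why this advice helps, in plain Java without Spark; the class and method names below are mine, not from the thread. A closure that reads an instance field implicitly captures the enclosing object (which, like a class holding a SparkContext, may not be serializable), while a closure that reads a final local captures only that local.]

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class ClosureCapture {

    public interface SerializableRunnable extends Runnable, Serializable {}

    // Stand-in for a driver class: not serializable, the way a class
    // holding a SparkContext is not.
    public static class Driver {
        private final int[] counter = {0};  // stand-in for an accumulator field

        // Reading the instance field makes the lambda capture `this`,
        // so serializing it drags the whole (non-serializable) Driver along.
        public SerializableRunnable closureOverField() {
            return () -> counter[0]++;
        }

        // Copying the field into a final local first means the lambda
        // captures only the local array, not the enclosing instance.
        public SerializableRunnable closureOverLocal() {
            final int[] local = counter;
            return () -> local[0]++;
        }
    }

    // Returns true if the object survives Java serialization.
    public static boolean serializes(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        Driver d = new Driver();
        System.out.println(serializes(d.closureOverField()));  // false
        System.out.println(serializes(d.closureOverLocal()));  // true
    }
}
```

This is the same failure mode as the Task not serializable errors later in the thread: it is not the accumulator that fails to serialize, but the enclosing object the closure accidentally captures.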
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-tp17263p19579.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:73)
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:164)
... 14 more

Seems that there is a problem with mapPartitions ...

Thanks for your suggestion,

--
View this message in context:
http://apache-spark-user-list.
to see a real application that uses accumulators.
Otherwise, you have to change their code such that the above issue does not
appear anymore.

Best,

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-tp17263p19567.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
If I put the accumulator inside the for loop, everything will work fine. I
guess the bug is that an accumulator can be applied to JUST one RDD.

Still another undocumented 'feature' of Spark that no one from the people
who maintain Spark is willing to solve or at least to tell us about ...

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-tp17263p17372.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
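[Editor's note: a hedged, non-runnable-as-is sketch of the workaround described in that message, assuming the Spark 1.x Java API of the era; the input paths and variable names are made up. Creating the accumulator inside the loop registers a fresh accumulator for each RDD action:]

```java
import org.apache.spark.Accumulator;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.VoidFunction;

// Requires Spark 1.x on the classpath; shown for illustration only.
public class PerRddAccumulator {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local", "per-rdd-accumulator");
        String[] inputPaths = {"in1.txt", "in2.txt"};  // made-up inputs

        for (String path : inputPaths) {
            // Created inside the loop, per the workaround above: each RDD
            // action gets its own freshly registered accumulator.
            final Accumulator<Integer> count = sc.accumulator(0);
            sc.textFile(path).foreach(new VoidFunction<String>() {
                public void call(String line) {
                    count.add(1);
                }
            });
            System.out.println(path + ": " + count.value());
        }
        sc.stop();
    }
}
```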
Sorry, I forgot to say that this gives the above error just when run on a
cluster, not in local mode.
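[Editor's note: this local-vs-cluster difference is consistent with how closures are shipped. In local mode the task runs in the driver's JVM and mutates the driver's own objects; on a cluster each executor works on a deserialized copy. A self-contained sketch in plain Java (no Spark; names are mine) of why updates to an ordinary captured variable are lost on a cluster, which is the problem accumulators exist to solve:]

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class LostUpdates {
    // Serialize and deserialize, roughly what happens when a closure is
    // shipped from the driver to an executor.
    @SuppressWarnings("unchecked")
    public static <T extends Serializable> T roundTrip(T value) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            ObjectOutputStream out = new ObjectOutputStream(bytes);
            out.writeObject(value);
            out.flush();
            ObjectInputStream in =
                new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()));
            return (T) in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        int[] counter = {0};

        // "Local mode": the task shares the JVM, so it sees the original.
        counter[0]++;

        // "Cluster mode": the executor gets a deserialized copy, and its
        // updates never reach the driver's variable.
        int[] executorCopy = roundTrip(counter);
        executorCopy[0]++;

        System.out.println(counter[0]);       // 1, not 2
        System.out.println(executorCopy[0]);  // 2
    }
}
```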
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-tp17263p17277.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
http://apache-spark-user-list.1001560.n3.nabble.com/Accumulators-Task-not-serializable-java-io-NotSerializableException-org-apache-spark-SparkContext-td17262.html

http://apache-spark-user-list.1001560.n3.nabble.com/NullPointerException-when-using-Accumulators-on-cluster-td17261.html

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-tp17263.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.