That's an interesting question for which I do not know the answer.
Probably a question for someone with more knowledge of the internals
of the shell interpreter...
On Mon, Nov 24, 2014 at 2:19 PM, aecc wrote:
> Ok, great, I'm gonna do it that way, thanks :). However I still don't
> understand…
Ok, great, I'm gonna do it that way, thanks :). However, I still don't
understand why this object should be serialized and shipped.
aaa.s and sc are both the same object, org.apache.spark.SparkContext@1f222881.
However, this:
aaa.s.parallelize(1 to 10).filter(_ == myNumber).count
needs to be serialized.
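(A sketch of the usual shell-side workaround, reusing the aaa and myNumber
names from this thread; untested here. Referencing a shell-level val from a
closure captures the REPL wrapper object that also holds aaa, whereas a val
that is local to a block is captured by value, so nothing else comes along:

    val count = {
      val n = myNumber   // local val: the closure captures just this Int
      aaa.s.parallelize(1 to 10).filter(_ == n).count
    }
)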
On Mon, Nov 24, 2014 at 1:56 PM, aecc wrote:
> I checked sqlContext, they use it in the same way I would like to use my
> class: they make the class Serializable with the field transient. Does this
> somehow affect the whole pipeline of data movement? I mean, will I get
> performance issues when doing this b…
Yes, I'm running this in the shell. In my compiled jar it works perfectly;
the issue is that I need to do this in the shell.
Are there any workarounds available?
I checked sqlContext, they use it in the same way I would like to use my
class: they make the class Serializable with the field transient. Does this
somehow affect the whole pipeline of data movement? I mean, will I get
performance issues when doing this b…
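(On the performance question: @transient fields are simply skipped by Java
serialization, so the pattern adds nothing to what gets shipped with each
task; the field just comes back null if the object is ever deserialized on an
executor. A tiny compiled-code sketch with made-up names (Holder, payload);
not a shell paste, since classes defined in the shell carry an extra outer
reference:

    import java.io.{ByteArrayOutputStream, ObjectOutputStream}

    class Holder(@transient val payload: Array[Byte]) extends Serializable

    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(new Holder(new Array[Byte](1 << 20)))  // holds a 1 MB array
    oos.close()
    println(bos.size())  // tiny: the @transient field was never written
)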
Hello,
On Mon, Nov 24, 2014 at 12:07 PM, aecc wrote:
> This is the stacktrace:
>
> org.apache.spark.SparkException: Job aborted due to stage failure: Task not
> serializable: java.io.NotSerializableException: $iwC$$iwC$$iwC$$iwC$AAA
> - field (class "$iwC$$iwC$$iwC$$iwC", name: "aaa", typ…
If, instead of using myNumber, I use the literal value 5, the exception is
not thrown. E.g.:
aaa.s.parallelize(1 to 10).filter(_ == 5).count
works perfectly.
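(That difference is consistent with how Scala compiles closures: a literal is
baked into the function's bytecode, so the closure is self-contained, while a
shell-level val is really a field of the REPL's generated wrapper object.
Roughly:

    aaa.s.parallelize(1 to 10).filter(_ == 5).count
    // closure body: x == 5  -> nothing external captured

    aaa.s.parallelize(1 to 10).filter(_ == myNumber).count
    // closure body: x == $iwC...this.myNumber  -> the wrapper instance (which
    // also holds the non-serializable aaa field) must be serialized too
)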
Marcelo Vanzin wrote:
> Do you expect to be able to use the spark context on the remote task?
Not at all. What I want to create is a wrapper around the SparkContext, to be
used only on the driver node.
I would like this "AAA" wrapper to have several attributes, such as the
SparkContext and other con…
Do you expect to be able to use the spark context on the remote task?
If you do, that won't work. You'll need to rethink what it is you're
trying to do, since SparkContext is not serializable and it doesn't
make sense to make it so. If you don't, you could mark the field as
@transient.
But the tw…
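(A minimal sketch of that second option, reusing the AAA name and the shell's
sc from this thread; illustrative, untested:

    import org.apache.spark.SparkContext

    // Serializable so references from shell-generated closures don't blow up,
    // while the @transient context is dropped rather than shipped:
    class AAA(@transient val s: SparkContext) extends Serializable

    val aaa = new AAA(sc)
    aaa.s.parallelize(1 to 10).count  // driver-side use works as before
)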