Re: Using Spark Context as an attribute of a class cannot be used

2014-11-24 Thread Marcelo Vanzin
That's an interesting question for which I do not know the answer. Probably a question for someone with more knowledge of the internals of the shell interpreter...

On Mon, Nov 24, 2014 at 2:19 PM, aecc wrote:
> Ok, great, I'm gonna do it that way, thanks :). However I still don't
> understand ...

Re: Using Spark Context as an attribute of a class cannot be used

2014-11-24 Thread aecc

Re: Using Spark Context as an attribute of a class cannot be used

2014-11-24 Thread Marcelo Vanzin
On Mon, Nov 24, 2014 at 1:56 PM, aecc wrote:
> I checked sqlContext; they use it in the same way I would like to use my
> class: they make the class Serializable with the context marked transient. Does
> this somehow affect the whole pipeline of data movement? I mean, will I get
> performance issues when doing this ...
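For reference, a minimal sketch of the pattern under discussion, reusing the names that appear later in the thread (AAA, s, myNumber); the original class definition never appears in these previews, so the body here is an assumption. The class itself is Serializable, but the SparkContext field is marked @transient, so it is simply skipped whenever a closure drags the instance into serialization:

    import org.apache.spark.SparkContext

    // Sketch only: AAA, s and myNumber are names used in the thread; the real
    // class body is not shown in the archive.
    class AAA(@transient val s: SparkContext) extends Serializable {
      val myNumber = 5

      def countMatches(): Long = {
        val n = myNumber  // copy the field to a local so the closure captures only an Int
        s.parallelize(1 to 10).filter(_ == n).count()
      }
    }

Because the field is transient, nothing extra is serialized with each task, so this pattern by itself should not add serialization overhead.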

Re: Using Spark Context as an attribute of a class cannot be used

2014-11-24 Thread aecc

Re: Using Spark Context as an attribute of a class cannot be used

2014-11-24 Thread Marcelo Vanzin
Hello,

On Mon, Nov 24, 2014 at 12:07 PM, aecc wrote:
> This is the stacktrace:
>
> org.apache.spark.SparkException: Job aborted due to stage failure: Task not
> serializable: java.io.NotSerializableException: $iwC$$iwC$$iwC$$iwC$AAA
> - field (class "$iwC$$iwC$$iwC$$iwC", name: "aaa", typ...
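A hypothetical spark-shell session that produces this kind of trace (the original definitions never appear in these previews, so the shape of AAA here is a guess). Every line typed into the shell is compiled into a nested wrapper object ($iwC$$iwC$...), and a closure that refers to anything defined on such a line pulls the whole wrapper, including its aaa field, into serialization:

    // Hypothetical reproduction; AAA, aaa, s and myNumber are names from the
    // thread, but their definitions are assumed.
    class AAA(val s: org.apache.spark.SparkContext)   // note: not Serializable
    val aaa = new AAA(sc)                             // sc is the shell's SparkContext
    val myNumber = 5

    // Fails with "Task not serializable": the `_ == myNumber` closure captures
    // the shell's $iwC wrapper, which also holds `aaa`, and AAA cannot be
    // serialized.
    aaa.s.parallelize(1 to 10).filter(_ == myNumber).count()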

Re: Using Spark Context as an attribute of a class cannot be used

2014-11-24 Thread aecc
If, instead of using myNumber, I use the literal value 5, the exception is not thrown. E.g.:

    aaa.s.parallelize(1 to 10).filter(_ == 5).count

works perfectly.
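The literal works because `_ == 5` captures nothing from the shell environment, so there is nothing beyond the function itself to serialize. One way to keep a named value without capturing the shell wrapper (a sketch, not taken from the thread) is to bind it to a local inside a block, so the closure captures only that value:

    // Sketch: `n` is a local of the block, so the closure captures only this Int,
    // not the wrapper object that also holds `aaa`.
    {
      val n = myNumber
      aaa.s.parallelize(1 to 10).filter(_ == n).count()
    }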

Re: Using Spark Context as an attribute of a class cannot be used

2014-11-24 Thread aecc
", ) - field (class "org.apache.spark.rdd.FilteredRDD", name: "f", type: "interface scala.Function1") - root object (class "org.apache.spark.rdd.FilteredRDD", FilteredRDD[3] at filter at :20) at org.apache.spark.scheduler.DAGSched

Re: Using Spark Context as an attribute of a class cannot be used

2014-11-24 Thread Marcelo Vanzin
> ...ts about how to solve this issue and how I can work around it? I'm
> actually developing an API that will need to use this SparkContext several
> times in different locations, so it will need to be accessible.
>
> Thanks a lot for the cooperation
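One way to keep a SparkContext reachable from many parts of an API without tripping over serialization is to keep it off any object that ends up inside a closure, for example by passing it explicitly to driver-side entry points. A hypothetical sketch (MatchCounter and countEqualTo are invented for illustration):

    import org.apache.spark.SparkContext

    // Hypothetical API entry point: the SparkContext only ever lives in
    // driver-side method parameters, so no serialized object carries it.
    object MatchCounter {
      def countEqualTo(sc: SparkContext, target: Int): Long =
        sc.parallelize(1 to 10).filter(_ == target).count()
    }

    // Usage from the shell: MatchCounter.countEqualTo(sc, 5)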

Using Spark Context as an attribute of a class cannot be used

2014-11-24 Thread aecc