You can move the definition of `t` upward.
My example is still valid.
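What makes this confusing is not the RDD mutating (RDDs are immutable, as Ted says) but that transformations are lazy and the closure captures the variable `t` by reference, so whatever value `t` holds when an action finally runs the closure is the value used. A minimal sketch of the same effect using a plain Scala lazy view, no Spark required (`t`, `lazily`, `forced` are illustrative names):

```scala
// A lazy view captures the var `t` by reference; nothing is computed
// until the view is forced, so the value of `t` at forcing time wins.
var t = 2
val lazily = (1 to 3).view.map(x => x * t) // transformation defined, not run
t = 10                                     // reassign before forcing
val forced = lazily.toList                 // evaluation happens here
assert(forced == List(10, 20, 30))         // the updated t was used, not 2
```

The same reasoning explains why two actions (e.g. `first()` and `collect()`) can appear to disagree: each action re-runs the closure, picking up the current value of any captured variable.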


On Mon, May 9, 2016 at 1:46 PM, Ted Yu  wrote:

> Using spark-shell, I was not allowed to define the map() without declaring
> t first:
>
> scala> rdd = rdd.map(x => x*t)
> <console>:26: error: not found: value t
>        rdd = rdd.map(x => x*t)
>                             ^
>
> On Mon, May 9, 2016 at 4:19 AM, Daniel Haviv  wrote:
>
>> How come that for the first() function it calculates an updated value and
>> for collect it doesn't ?
>>
>>
>>
>> On Sun, May 8, 2016 at 4:17 PM, Ted Yu  wrote:
>>
>>> I don't think so.
>>> RDD is immutable.
>>>
>>> > On May 8, 2016, at 2:14 AM, Sisyphuss wrote:
>>> >
>>> > <
>>> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26898/09.png>
>>> >
>>> >
>>> >
>>> > Sent from the Apache Spark User List mailing list archive at
>>> Nabble.com.
>>> >
>>> > ---------------------------------------------------------------------
>>> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> > For additional commands, e-mail: user-h...@spark.apache.org
>>> >
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>>
>>
>
