collect() returns the contents of the RDD back to the driver as a local
value. Where is your local variable?
Try
val result = rdd.map(x => x + 1).collect()
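For instance, a minimal sketch (assuming spark is the session you already
created in the notebook):

val rdd = spark.sparkContext.parallelize(Seq(1, 2, 4))
// collect() materializes the RDD on the driver, so hold the result
// in a local value
val result: Array[Int] = rdd.map(x => x + 1).collect()
result.foreach(println)  // prints 2, 3, 5

The Array(2, 3, 5) now lives in the driver JVM, not on the executors.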
regards,
Apostolos
On 21/2/20 21:28, Nikhil Goyal wrote:
Hi all,
I am trying to use the almond Scala kernel to run a Spark session on
Jupyter. I am using Scala version 2.12.8. I am creating the Spark session
with master set to YARN.
This is the code:
val rdd = spark.sparkContext.parallelize(Seq(1, 2, 4))
rdd.map(x => x + 1).collect()
Exception:
java.lang.ClassCastException: ...
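(For reference, the message above does not show how the session was
created. In almond that step typically looks like the sketch below; the
NotebookSparkSession helper comes from the almond-spark library, and the
version numbers here are illustrative, not taken from this thread:)

import $ivy.`org.apache.spark::spark-sql:2.4.4`
import $ivy.`sh.almond::almond-spark:0.6.0`
import org.apache.spark.sql._

// NotebookSparkSession makes classes defined in the notebook
// available to the executors, unlike a plain SparkSession
val spark = NotebookSparkSession.builder()
  .master("yarn")
  .getOrCreate()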