Hi Abdulfattah,
Make sure you have enough resources available when you submit the application; it looks like Spark is waiting for sufficient resources to be allocated.
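For example, you can make the resource request explicit at submit time (a sketch; the sizes, class name, and jar are placeholders, but the flags are standard spark-submit options):

```shell
# Placeholder sizes -- tune these to what the cluster actually has free.
# If the cluster cannot satisfy the request, the app sits waiting for resources.
spark-submit \
  --master yarn \
  --num-executors 4 \
  --executor-memory 2g \
  --executor-cores 2 \
  --class com.example.MyApp \
  my-app.jar
```

Comparing these numbers against the free memory and cores shown in the cluster UI usually reveals why the application is stuck in a waiting state.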
Best,
Khwunchai Jaengsawang
Email: khwuncha...@ku.th
Mobile: +66 88 228 1715
LinkedIn <https://linkedin.com/in/khwunchai> | Github <https://github.com/khwunchai>
Convert the data into a pair RDD and use reduceByKey().
Example:
val pairs = data.map(row => (row(1), row(2))).reduceByKey(_ + _)
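For illustration, here is a self-contained sketch of the same key-and-reduce pattern using plain Scala collections, so it runs without a cluster (the rows, column positions, and the .toInt for a numeric sum are assumptions, not from the original thread). On an RDD, map followed by reduceByKey performs the same per-key aggregation, but distributed:

```scala
object PairReduceSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical rows: (id, category, amount) -- column layout is made up.
    val data = Seq(
      Array("1", "fruit", "3"),
      Array("2", "fruit", "5"),
      Array("3", "veg", "2")
    )

    // Key by column 1, take column 2 as a numeric value, then sum per key.
    // Shape matches rdd.map(row => (row(1), row(2).toInt)).reduceByKey(_ + _).
    val sums = data
      .map(row => (row(1), row(2).toInt))
      .groupBy(_._1)
      .map { case (key, pairs) => (key, pairs.map(_._2).sum) }

    sums.toSeq.sortBy(_._1).foreach { case (k, v) => println(s"$k,$v") }
  }
}
```

Note that without the .toInt, _ + _ on the raw String values would concatenate rather than add, so convert the value column to a numeric type before reducing.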
Best,
Khwunchai Jaengsawang
> On Mar 4, 2560 B
r and the whole job fails. I tried using flatMap with a try statement, but it still failed. Is there any way to handle this?
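A common pattern for this is flatMap combined with scala.util.Try(...).toOption, which silently drops records that fail to parse instead of failing the whole job. A minimal sketch with plain Scala collections (the input lines and the toInt parse are illustrative; on an RDD the flatMap call is written the same way):

```scala
import scala.util.Try

object SkipBadRecords {
  def main(args: Array[String]): Unit = {
    val lines = Seq("1", "2", "oops", "4")

    // Try(...).toOption yields None for records that throw during parsing,
    // and flatMap discards the Nones -- bad records are skipped, the job
    // continues. On Spark: rdd.flatMap(line => Try(line.toInt).toOption)
    val parsed = lines.flatMap(line => Try(line.toInt).toOption)

    println(parsed.mkString(","))  // prints 1,2,4
  }
}
```

A plain try/catch inside map does not help unless every code path returns a value; wrapping the parse in Try and flatMapping over the Option is the idiomatic way to both catch the exception and drop the record.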
Regards,
Khwunchai Jaengsawang