Hi, All.
Unfortunately, there is an ongoing discussion about the correctness of the new decimal handling.
Although we fixed one correctness issue on master and backported it partially to 3.0/2.4, it turns out that more patches are needed to make it complete.
Please see https://github.com/apache/spark/pull/29125 for the details of that discussion.
I will agree that the side effects of using Futures in driver code tend to
be tricky to track down.
If you forget to clear the job description and job group information, the LocalProperties on the SparkContext remain intact, and SparkContext#submitJob makes sure to pass down the localProperties.
Why do you need this in Spark itself, or can you just use a Future in your driver code?
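For concreteness, here is a minimal sketch of the "Future in your driver code" pattern being suggested, together with the job-group bookkeeping mentioned above. The object name, sample data, and output path are made up for illustration; the point is that the job group and description live in the SparkContext's thread-local properties and are picked up from whichever thread actually submits the job.

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration.Duration
import org.apache.spark.{SparkConf, SparkContext}

object AsyncSaveFromDriver {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("async-save-sketch").setMaster("local[*]"))
    implicit val ec: ExecutionContext = ExecutionContext.global

    val rdd = sc.parallelize(1 to 1000)

    // Run the blocking save on a separate thread so the driver can keep going.
    val save: Future[Unit] = Future {
      // Set the group/description on the thread that submits the job:
      // submitJob passes down whatever local properties this thread holds.
      sc.setJobGroup("async-save", "asynchronous saveAsTextFile")
      try {
        rdd.saveAsTextFile("/tmp/async-save-output") // hypothetical output path
      } finally {
        // Forgetting this leaves the stale group/description attached to any
        // later job submitted from this (pooled, reused) thread.
        sc.clearJobGroup()
      }
    }

    // ... other driver work can happen here while the save is in flight ...

    Await.result(save, Duration.Inf)
    sc.stop()
  }
}
```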
On Fri, Aug 7, 2020 at 9:01 AM Antonin Delpeuch (lists) wrote:
Hi all,
Following my request on the user mailing list [1], there does not seem
to be any simple way to save RDDs to the file system in an asynchronous
way. I am looking into implementing this, so I am first checking whether
there is consensus around the idea.
The goal would be to add methods such as asynchronous counterparts of the existing save operations on RDDs.
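To make that concrete, here is a sketch of the kind of method this could add. The name saveAsTextFileAsync and the enrichment-class wrapper are purely illustrative, not part of any existing or agreed-upon API; a real implementation inside Spark would presumably return a FutureAction, like the existing countAsync/collectAsync in AsyncRDDActions, so the job can be cancelled.

```scala
import scala.concurrent.{ExecutionContext, Future}
import org.apache.spark.rdd.RDD

object AsyncSaveOps {
  // Illustrative only: enrich RDD with a hypothetical asynchronous save.
  implicit class RichSaveRDD[T](rdd: RDD[T]) {
    /** Hypothetical async counterpart to RDD#saveAsTextFile. */
    def saveAsTextFileAsync(path: String)(implicit ec: ExecutionContext): Future[Unit] =
      Future {
        rdd.saveAsTextFile(path)
      }
  }
}

// Possible usage from driver code:
//   import AsyncSaveOps._
//   implicit val ec = scala.concurrent.ExecutionContext.global
//   val done = rdd.saveAsTextFileAsync("/tmp/output") // returns immediately
//   done.foreach(_ => println("save finished"))
```

As the reply above points out, wrapping the save in a Future on the caller side is already possible today; what a built-in method would mainly add is integration with Spark's FutureAction machinery and the job group/description handling.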