Hi,
I'm not sure if this is the right place to raise this, if not hopefully you can
direct me to the right place.
I believe I have discovered a bug when loading
MultilayerPerceptronClassificationModel in Spark 3.0.0 (Scala 2.12), which I
have tested and can confirm is not present in at least Spark 2.
Yeah that's a bug, I can reproduce it. Can you open a JIRA?
It works in Scala, so it must be an issue with the Python wrapper. The
serialized model is fine; it's the loading back that fails.
I think it's because MultilayerPerceptronParams extends HasSolver,
which defaults to 'auto', but doesn't seem to fully
this will be happening tomorrow... today is Meeting Hell Day[tm].
On Tue, Jul 7, 2020 at 1:59 PM shane knapp ☠ wrote:
> i wasn't able to get to it today, so i'm hoping to squeeze in a quick trip
> to the colo tomorrow morning. if not, then first thing thursday.
>
> --
> Shane Knapp
> Computer
Thanks Shane!
BTW, it's getting serious, e.g. https://github.com/apache/spark/pull/28969.
The tests have not been able to pass for 7 days. Hopefully restarting the
machines will make the current situation better :-)
Separately, I am working on a PR to run the Spark tests in GitHub Actions.
We could hopef