Glad to hear. Could you please share your solution on the user mailing
list? -Xiangrui
On Mon, Jul 18, 2016 at 2:26 AM Alger Remirata wrote:
> Hi Xiangrui,
>
> We have now solved the problem. Thanks for all the tips you've given.
>
> Best Regards,
>
> Alger
>
> On Thu, Jul 14, 2016 at 2:43 AM, A
(+user@spark. Please copy user@ so other people can see and help.)
The error message means an MLlib jar is on the classpath, but it does not
contain ALS$StandardNNLSSolver. So either the modified jar was not deployed
to the workers, or an unmodified MLlib jar sits ahead of it on the
classpath.
This seems like a deployment or dependency issue. Please check the
following:
1. That no unmodified Spark jars are on the classpath (e.g., ones that
already existed on the cluster or were pulled in by other packages).
2. That the modified jars were actually deployed to both the master and the
slave nodes.
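A quick way to run both checks from the command line (the jar path and
name below are illustrative; substitute your own build output):

```shell
# 1. Verify the deployed jar actually contains the modified solver class.
#    If this prints nothing, the class was not compiled into the jar.
jar tf /path/to/spark-mllib_2.11-modified.jar | grep 'ALS\$StandardNNLSSolver'

# 2. Prepend the modified jar to the classpath on both the driver and the
#    executors, so it wins over any MLlib jar bundled with the cluster.
spark-submit \
  --conf spark.driver.extraClassPath=/path/to/spark-mllib_2.11-modified.jar \
  --conf spark.executor.extraClassPath=/path/to/spark-mllib_2.11-modified.jar \
  your-app.jar
```

spark.{driver,executor}.extraClassPath entries are prepended to the JVM
classpath, which is what lets a modified jar shadow an unmodified one.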
On Tue, Jul 5, 2016 at 12: