Could you tell us the Spark version you used?
We have fixed this bug in Spark 1.6.2 and Spark 2.0; please upgrade to
one of these versions and retry.
If this issue still exists, please let us know.
Thanks
Yanbo
2016-07-12 11:03 GMT-07:00 Pasquinell Urbani <
pasquinell.urb...@exalitica.com>:
> In the forum mentioned above the following solution is suggested:
> The problem is in lines 113 and 114 of QuantileDiscretizer.scala and
> can be fixed by changing line 113 like so:
>
> before: val requiredSamples = math.max(numBins * numBins, 1)
> after:  val requiredSamples = math.max(numBins * numBins, 1.0)
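>
> For context on why changing the literal to a Double helps, here is a
> minimal sketch, assuming (as in Spark 1.6's
> QuantileDiscretizer.getSampledInput) that the adjacent line 114
> computes the sampling fraction as
> math.min(requiredSamples / totalSamples, 1.0); the variable names
> below are illustrative, not Spark's:
>
> object SamplingFractionDemo {
>   def main(args: Array[String]): Unit = {
>     val numBins = 10
>     val totalSamples = 1000000L  // row count of a large DataFrame
>
>     // Before the fix: requiredSamples is an Int and totalSamples a
>     // Long, so the division is integral and truncates to 0, meaning
>     // effectively nothing gets sampled.
>     val requiredSamplesInt = math.max(numBins * numBins, 1)
>     println(math.min(requiredSamplesInt / totalSamples, 1.0))  // 0.0
>
>     // After the fix: the Double literal promotes the whole expression,
>     // so the division is floating-point and the fraction is non-zero.
>     val requiredSamplesDouble = math.max(numBins * numBins, 1.0)
>     println(math.min(requiredSamplesDouble / totalSamples, 1.0))  // 1.0E-4
>   }
> }
>
> In other words, on any dataset with more rows than requiredSamples the
> integer version yields a sampling fraction of exactly zero, which would
> explain the bad behavior on large inputs.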