Switching the sides worked (I tried that shortly after sending the mail).
Thanks for the fast response :)
On 26.05.2015 22:26, Stephan Ewen wrote:
Hi,
What can I do to give Flink more memory when running it from my IDE? I'm
getting the following exception:
Caused by: java.lang.RuntimeException: Hash join exceeded maximum number
of recursions, without reducing partitions enough to be memory resident.
Probable cause: Too many duplicate keys.
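For reference, a minimal sketch of how one can hand extra managed memory to the embedded local runtime from the IDE, by passing a custom Configuration when creating the local environment. The "taskmanager.memory.size" key (managed memory in MB) is an assumption about the config key honored by this Flink version; check the version's ConfigConstants to be sure:

```java
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.configuration.Configuration;

public class LocalMemorySetup {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumption: this key sets the TaskManager's managed memory (in MB)
        // for the embedded local cluster started by the IDE run.
        conf.setInteger("taskmanager.memory.size", 1024);
        ExecutionEnvironment env =
                ExecutionEnvironment.createLocalEnvironment(conf);
        // ... build and execute the job on 'env' as usual ...
    }
}
```

As the rest of the thread explains, though, more memory only masks the symptom when the real issue is duplicate keys on the build side of the join.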
If you have this case, giving more memory is fighting a symptom, rather
than a cause.
If you really have that many duplicates in the data set (and you have not
just a bad implementation of "hashCode()"), then try the following:
1) Reverse hash join sides. Duplicates hurt only on the build side, not
on the probe side.
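To see why the side matters, here is a minimal plain-Java sketch of a hash join (not Flink's internal implementation): the build side is loaded into an in-memory hash table, while the probe side is streamed past it one record at a time. Duplicate keys on the build side inflate the table (and in Flink's case prevent recursive partitioning from ever making it memory resident), whereas duplicates on the probe side cost nothing extra:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class HashJoinSketch {

    /** Joins (key, value) pairs: 'build' goes into a hash table, 'probe' streams past it. */
    static List<String> hashJoin(List<String[]> build, List<String[]> probe) {
        // Build phase: the whole build side must fit in memory.
        // Duplicate keys here make the lists under each key grow.
        Map<String, List<String>> table = new HashMap<>();
        for (String[] rec : build) {
            table.computeIfAbsent(rec[0], k -> new ArrayList<>()).add(rec[1]);
        }
        // Probe phase: records are streamed through one at a time,
        // so duplicate keys on this side never accumulate in memory.
        List<String> out = new ArrayList<>();
        for (String[] rec : probe) {
            for (String v : table.getOrDefault(rec[0], Collections.emptyList())) {
                out.add(rec[0] + ":" + v + "," + rec[1]);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // 'small' is duplicate-free; 'big' holds many duplicates of key "a".
        List<String[]> small = Arrays.asList(
                new String[]{"a", "1"}, new String[]{"b", "2"});
        List<String[]> big = new ArrayList<>();
        for (int i = 0; i < 5; i++) {
            big.add(new String[]{"a", "x" + i});
        }
        // Building on the duplicate-free side keeps the hash table small;
        // swapping the arguments would pile all duplicates into the table.
        System.out.println(hashJoin(small, big).size());
    }
}
```

This is what "switching the sides" achieves: the side with heavy key skew becomes the probe input instead of the build input.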