Hi Flavio,

These kinds of exceptions generally arise from the fact that you ran out of `user` memory. You can try to increase that a bit. In your flink-conf.yaml try adding:

# The memory fraction allocated to the system; the rest goes to user code
taskmanager.memory.fraction: 0.4

This will give 0.6 of the memory to the user and 0.4 to the system.
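For reference, the relevant part of the flink-conf.yaml would then look roughly like this (the heap sizes and slot count below are just placeholder values, adjust them to your own setup):

jobmanager.heap.mb: 1024
# Total heap for each TaskManager
taskmanager.heap.mb: 4096
taskmanager.numberOfTaskSlots: 4
# Keep roughly 40% of the managed heap for Flink's internal use (sorting, hash tables)
# and leave the remaining 60% for user code
taskmanager.memory.fraction: 0.4

With a 4096 MB TaskManager heap, for example, roughly 40% would be kept by Flink and the remaining 60% would be available to your user functions.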
Tell me if that helped.

Andra

On Thu, Aug 20, 2015 at 12:02 PM, Flavio Pompermaier <pomperma...@okkam.it> wrote:

> Hi to all,
>
> I tried to run my Gelly job on Flink 0.9-SNAPSHOT and I was getting an
> EOFException, so I tried on 0.10-SNAPSHOT and now I have the following
> error:
>
> Caused by: java.lang.RuntimeException: Memory ran out. Compaction failed.
> numPartitions: 32 minPartition: 73 maxPartition: 80 number of overflow
> segments: 0 bucketSize: 570 Overall memory: 102367232 Partition memory:
> 81100800 Message: null
>     at org.apache.flink.runtime.operators.hash.CompactingHashTable.insertRecordIntoPartition(CompactingHashTable.java:465)
>     at org.apache.flink.runtime.operators.hash.CompactingHashTable.insertOrReplaceRecord(CompactingHashTable.java:414)
>     at org.apache.flink.runtime.operators.hash.CompactingHashTable.buildTableWithUniqueKey(CompactingHashTable.java:325)
>     at org.apache.flink.runtime.iterative.task.IterationHeadPactTask.readInitialSolutionSet(IterationHeadPactTask.java:211)
>     at org.apache.flink.runtime.iterative.task.IterationHeadPactTask.run(IterationHeadPactTask.java:272)
>     at org.apache.flink.runtime.operators.RegularPactTask.invoke(RegularPactTask.java:354)
>     at org.apache.flink.runtime.taskmanager.Task.run(Task.java:581)
>     at java.lang.Thread.run(Thread.java:745)
>
> Probably I'm doing something wrong, but I can't understand how to estimate
> the required memory for my Gelly job.
>
> Best,
> Flavio