Hi Robert,
I forgot to follow up on this error.
The root cause was an OOM caused by Jena RDF serialization, which was
making the entire job fail.
I also created a thread on StackOverflow, let's see (
https://stackoverflow.com/questions/29660894/jena-thrift-serialization-oom-due-to-gc-overhead
).
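
For context, a minimal sketch of one possible mitigation (untested, assuming
Jena 3.x package names and Flink's registerTypeWithKryoSerializer hook; the
serializer class itself is made up for illustration) would be to register a
custom Kryo serializer so each Jena Model is shipped as a compact, length-prefixed
RDF Thrift byte[] instead of being walked field-by-field by Kryo:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;

import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.riot.Lang;
import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.riot.RDFFormat;

import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.Serializer;
import com.esotericsoftware.kryo.io.Input;
import com.esotericsoftware.kryo.io.Output;

// Hypothetical serializer (not part of Flink or Jena): writes each Model as a
// length-prefixed RDF Thrift byte[] so Kryo never traverses Jena's internal
// graph structures.
public class JenaModelKryoSerializer extends Serializer<Model> {

    @Override
    public void write(Kryo kryo, Output output, Model model) {
        // Serialize the model to RDF Thrift in memory, then length-prefix it
        // so read() knows exactly how many bytes belong to this record.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        RDFDataMgr.write(bos, model, RDFFormat.RDF_THRIFT);
        byte[] bytes = bos.toByteArray();
        output.writeInt(bytes.length, true);
        output.writeBytes(bytes);
    }

    @Override
    public Model read(Kryo kryo, Input input, Class<Model> type) {
        int length = input.readInt(true);
        byte[] bytes = input.readBytes(length);
        Model model = ModelFactory.createDefaultModel();
        RDFDataMgr.read(model, new ByteArrayInputStream(bytes), Lang.RDFTHRIFT);
        return model;
    }

    public static void main(String[] args) {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        // Use the serializer above whenever a Model crosses the wire.
        env.getConfig().registerTypeWithKryoSerializer(Model.class, JenaModelKryoSerializer.class);
        // ... build and execute the job as usual ...
    }
}

Treat this as a starting point rather than the actual fix.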

Best,
Flavio

On Wed, Apr 15, 2015 at 9:18 PM, Flavio Pompermaier <pomperma...@okkam.it>
wrote:

> Yes Robert,
> Unfortunately I discovered that the error was caused by Phoenix just a
> little bit after sending that mail.
>
> The error is generated in the finalize() method of the Phoenix MemoryManager,
> so it seems somehow related to GC.
> I re-ran the experiment logging to a file so I can investigate the error more
> deeply. In Eclipse I was seeing just that error and nothing else because of
> the console buffer limit.
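>
> To show why I think it's tied to GC, here's a purely illustrative sketch of
> the finalizer pattern (not the actual Phoenix code; the class and field names
> are made up): a chunk that is garbage-collected without free() having been
> called logs the warning from its finalizer, so the message only appears when
> the GC runs finalizers on leaked chunks.
>
> class TrackedChunk {
>     private long size;  // bytes still held by this chunk; 0 once freed
>
>     TrackedChunk(long size) {
>         this.size = size;
>     }
>
>     void free() {
>         size = 0;  // well-behaved callers release the memory explicitly
>     }
>
>     @Override
>     protected void finalize() throws Throwable {
>         try {
>             if (size > 0) {
>                 // Never freed: the finalizer runs at GC time, which is why
>                 // the warning seems tied to garbage collection.
>                 System.err.println("Orphaned chunk of " + size
>                         + " bytes found during finalize");
>             }
>         } finally {
>             super.finalize();
>         }
>     }
> }
>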
> I'll keep you up to date, and maybe I could share my Phoenix-Flink
> integration code, as I did for MongoDB!
>
> Thanks,
> Flavio
> On Apr 15, 2015 8:33 PM, "Robert Metzger" <rmetz...@apache.org> wrote:
>
>> Hey Flavio,
>>
>> I was not able to find the String "Orphaned chunk" in the Flink code
>> base.
>>
>> However, I found it here:
>> https://github.com/apache/phoenix/blob/master/phoenix-core/src/main/java/org/apache/phoenix/memory/GlobalMemoryManager.java#L157
>> Maybe you've sent the message to the wrong mailing list?
>> If you're accessing something from/with Phoenix using Flink, please give
>> us the full stack trace to further investigate the issue.
>>
>>
>>
>>
>>
>> On Wed, Apr 15, 2015 at 4:18 PM, Flavio Pompermaier <pomperma...@okkam.it
>> > wrote:
>>
>>> Hi to all,
>>>
>>> another error today :(
>>> My job ended with a lot of "Orphaned chunk of XXXX bytes found during
>>> finalize".
>>> What could be the cause of this error?
>>>
>>> Best,
>>> Flavio
>>>
>>
>>
