[ https://issues.apache.org/jira/browse/NIFI-3126?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16568635#comment-16568635 ]

Otto Fowler commented on NIFI-3126:
-----------------------------------

[~OlavJ]

The [ForkRecord |
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.7.1/org.apache.nifi.processors.standard.ForkRecord/index.html]
processor should work for what you are trying to do. The record readers used
for JSON in NiFi do not read the whole document into memory, and the ForkRecord
processor has split functionality.

Would using that be an option for you?

> A large JSON file consisting of an array of many json elements can cause an 
> out of memory error if passed to SplitJSON
> ----------------------------------------------------------------------------------------------------------------------
>
>                 Key: NIFI-3126
>                 URL: https://issues.apache.org/jira/browse/NIFI-3126
>             Project: Apache NiFi
>          Issue Type: Bug
>          Components: Core Framework
>    Affects Versions: 1.0.0
>            Environment: RHEL 6.5 - single instance of NiFi
>            Reporter: Olav Jordens
>            Assignee: Otto Fowler
>            Priority: Major
>
> Create a flowfile containing a large JSON array (large = many elements in the 
> array) with many attributes. Do this in such a way that a set of flowfiles 
> corresponding to the individual JSON elements in the array (with all the 
> attributes) would require more RAM than is available in your NiFi JVM. Now feed 
> this flowfile into a SplitJson processor. I did not see an OOM error in the 
> logs, but saw very strange behavior: the whole NiFi instance hangs.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)