Arsalan,

What version of NiFi are you using? I ran with what will become NiFi 1.1.0, using your JSONTest.txt and SplitJson with $.Records, and I got 10 flow files on the "split" relationship, none to failure. Are you sure what is coming in from GetHTTP is JSON with the schema you expect?
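If you want to double-check the content outside of NiFi, a quick standalone test with the Jayway JsonPath library (which I believe is what SplitJson uses under the covers) will tell you whether $.Records actually resolves against what GetHTTP is fetching. This is just a sketch, and "JSONTest.txt" is a placeholder for whatever content you are actually feeding in:

    import com.jayway.jsonpath.JsonPath;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;

    public class JsonPathCheck {
        public static void main(String[] args) throws Exception {
            // Read the same content you are feeding into SplitJson
            // ("JSONTest.txt" here is just a placeholder file name).
            String json = new String(Files.readAllBytes(Paths.get("JSONTest.txt")), "UTF-8");

            // SplitJson evaluates this same kind of path; if this throws
            // PathNotFoundException, the content has no top-level "Records" array.
            List<Object> records = JsonPath.read(json, "$.Records");
            System.out.println("Found " + records.size() + " records");
        }
    }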
For the PutSQL problem, it looks like it's throwing a BatchUpdateException, but there are no flow files associated with that failure. What DB are you using? I see backticks so I'm guessing MySQL; can you try removing those (and/or the semicolon at the end) to see if that clears it up? If so, I suspect there's a logic bug in PutSQL where we can get that BatchUpdateException without having flow file(s) to match it to. I've put a rough JDBC sketch at the bottom of this mail, below the quoted text, to show what I mean.

Regards,
Matt

On Thu, Oct 27, 2016 at 10:19 AM, Arsalan Siddiqi <arsalan_sidd...@live.com> wrote:
> Thanks for the help. Unfortunately there is a problem. I have used the
> SplitJson processor with $.Records to split the record array. I get the
> following warning:
>
> WARN [Timer-Driven Process Thread-3] o.a.nifi.processors.standard.SplitJson
> SplitJson[id=0157101a-f11b-1e48-29c6-bf7c07ddd553] JsonPath $['Records']
> could not be found for FlowFile
> StandardFlowFileRecord[uuid=ab2412a5-6fab-42b7-9f9a-31870607ba5c,claim=StandardContentClaim
> [resourceClaim=StandardResourceClaim[id=1477564943159-1, container=default,
> section=1], offset=56802, length=136],offset=0,name=adabas,size=136]
>
> To make sure the JSON was correct I tested it at http://jsonpath.com/ and I
> do see that using $.Records gives me the array. The surprising thing is that
> the incoming flow file is split correctly and new flow files are generated,
> but they are routed to the failure relationship. Why does it route to failure?
>
> You can check the data in the attached text file. The very first split file
> from the failure relationship looks like:
>
> {"ISN":200,"AA":"50005900","AB":{"AC":"GASTON","AD":"","AE":"BODIN"},"AF":"S","AG":"M","AH":713043,"AW":[{"AX":19990801,"AY":19990831}]}
>
> which is correct and what I wanted. I have carried on and made my flow work
> with the failure relationship, but can you point out what is wrong?
> <http://apache-nifi-developer-list.39713.n7.nabble.com/file/n13752/splitjson.png>
> JSONTest.txt
> <http://apache-nifi-developer-list.39713.n7.nabble.com/file/n13752/JSONTest.txt>
>
> Lastly, I use PutSQL to insert the data into MySQL and I get the following error:
>
> 2016-10-27 17:41:43,066 ERROR [Timer-Driven Process Thread-5] o.apache.nifi.processors.standard.PutSQL
> PutSQL[id=f11b1e51-102b-1157-b108-c9ea7dead3ca] PutSQL[id=f11b1e51-102b-1157-b108-c9ea7dead3ca]
> failed to process session due to java.lang.IndexOutOfBoundsException: Index: 1, Size: 1:
> java.lang.IndexOutOfBoundsException: Index: 1, Size: 1
> 2016-10-27 17:41:43,067 ERROR [Timer-Driven Process Thread-5] o.apache.nifi.processors.standard.PutSQL
> java.lang.IndexOutOfBoundsException: Index: 1, Size: 1
>     at java.util.ArrayList.rangeCheck(ArrayList.java:653) ~[na:1.8.0_101]
>     at java.util.ArrayList.get(ArrayList.java:429) ~[na:1.8.0_101]
>     at org.apache.nifi.processors.standard.PutSQL.onTrigger(PutSQL.java:304) ~[na:na]
>     at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) ~[nifi-api-1.0.0.jar:1.0.0]
>     at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1064) ~[nifi-framework-core-1.0.0.jar:1.0.0]
>     at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-1.0.0.jar:1.0.0]
>     at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-1.0.0.jar:1.0.0]
>     at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132) [nifi-framework-core-1.0.0.jar:1.0.0]
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_101]
>     at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_101]
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_101]
>     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_101]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_101]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_101]
>     at java.lang.Thread.run(Thread.java:745) [na:1.8.0_101]
>
> The input to the PutSQL processor is multiple flow files of the form:
>
> INSERT INTO periodic_group
> (`ISN`,
> `AX`,
> `AY`)
> VALUES ('204',
> '19980101',
> '19980115');
>
> Half of the files were loaded successfully, but exactly half failed and were
> not even pulled from the incoming connection.
>
> Thank you,
> Arsalan
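P.S. Here is the JDBC sketch I mentioned above, to illustrate the logic bug I suspect in PutSQL. This is rough standalone code, not the actual processor code, and the connection details, table, and values are just placeholders. The point is that when one statement in a JDBC batch fails, the driver throws a BatchUpdateException whose getUpdateCounts() array is allowed to stop at the first failure, so it can be shorter than the number of statements queued. If a processor then uses a statement position to look up the corresponding flow file in a parallel list, it can walk past the end of that list, which would line up with the "Index: 1, Size: 1" you're seeing:

    import java.sql.BatchUpdateException;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;

    public class BatchFailureSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details -- point this at your own MySQL instance.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost:3306/test", "user", "password");
                 Statement stmt = conn.createStatement()) {

                // One "flow file" per statement, tracked in a parallel list, the way a
                // processor might keep statements and flow files in step.
                // Note: no backticks and no trailing semicolons in the statements.
                List<String> flowFileIds = new ArrayList<>();
                String[] inserts = {
                    "INSERT INTO periodic_group (ISN, AX, AY) VALUES ('204', '19980101', '19980115')",
                    "INSERT INTO periodic_group (ISN, AX, AY) VALUES ('205', 'bad-value', NULL)"
                };
                for (int i = 0; i < inserts.length; i++) {
                    stmt.addBatch(inserts[i]);
                    flowFileIds.add("flowfile-" + i);
                }

                try {
                    stmt.executeBatch();
                } catch (BatchUpdateException e) {
                    // Per the JDBC spec, getUpdateCounts() may stop at the first failed
                    // statement, so it can be shorter than the batch itself.
                    int[] counts = e.getUpdateCounts();
                    System.out.println("statements queued = " + inserts.length
                            + ", update counts returned = " + counts.length
                            + ", flow files tracked = " + flowFileIds.size());
                    // Indexing the flow file list with a position that assumes one update
                    // count per statement is the kind of mismatch that produces
                    // java.lang.IndexOutOfBoundsException: Index: 1, Size: 1.
                }
            }
        }
    }

As an aside, note that the statements in the sketch have no backticks and no trailing semicolons; that's the form I'd try feeding PutSQL while we dig into this.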