I think XPath in PostgreSQL is fast enough. I am dumping the raw XML file
into a table and then generating the data from it. I want all of the data to
stay consistent, which is why I am using a database.
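Roughly, the staging step looks like this (the column names and the XPath
expressions are just placeholders for my real schema):

CREATE TABLE xmltest (
    id   serial PRIMARY KEY,
    data xml NOT NULL
);

-- one output row per <item> element in the document
SELECT (xpath('/item/name/text()',  item))[1]::text AS name,
       (xpath('/item/value/text()', item))[1]::text AS value
FROM   xmltest,
       unnest(xpath('/root/item', data)) AS item;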

I am planning to use triggers to generate the contents of table1_master,
table2_master, and so on (a rough sketch follows the diagram):

master -> table1_master
       -> table2_master
       -> table3_master -> table1_table3
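Something along these lines is what I had in mind for the trigger; the
columns of master and table1_master and the XPath expressions here are only
placeholders:

CREATE OR REPLACE FUNCTION populate_masters() RETURNS trigger AS $$
BEGIN
    INSERT INTO table1_master (master_id, name)
    SELECT NEW.id,
           (xpath('/item/name/text()', item))[1]::text
    FROM   unnest(xpath('/root/item', NEW.data)) AS item;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER master_to_tables
    AFTER INSERT ON master
    FOR EACH ROW EXECUTE PROCEDURE populate_masters();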




On Sun, Jan 29, 2017 at 9:45 PM, David G. Johnston <
david.g.johns...@gmail.com> wrote:

> On Saturday, January 28, 2017, Rita <rmorgan...@gmail.com> wrote:
>
>> After xmltest has been populated, I can run xpath and unnest to get my
>> data into rows, but I would like to store that result in another table. I
>> am guessing I should look into triggers for something like that?
>>
>
> I suspect that using xpath in the database is not the right tool for doing
> what you describe.  Whatever middleware layer receives the XML should be
> considered for the logic of deserialization and storage to the database in
> normalized form.  If you do want something like that in the database I'd
> probably write a volatile function that receives the xml and does whatever it
> needs to do.  I cannot imagine the indirection of a trigger would be
> gainful here.
>
> In particular, at scale you'd probably be better off with using
> a streaming parser instead of a DOM one.
>
> David J.
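For comparison, I understand the function-based approach suggested above to
look roughly like this (again, all table, column, and XPath names are
placeholders):

CREATE OR REPLACE FUNCTION load_xml(doc xml) RETURNS void AS $$
BEGIN
    INSERT INTO master (data) VALUES (doc);

    INSERT INTO table1_master (name)
    SELECT (xpath('/item/name/text()', item))[1]::text
    FROM   unnest(xpath('/root/item', doc)) AS item;
END;
$$ LANGUAGE plpgsql VOLATILE;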




-- 
--- Get your facts first, then you can distort them as you please.--
