Bulk Insert/Update Scenario
I am setting up a data processing pipeline whose end result is stored in Postgres. I have not been a heavy DB user in general and had a question about how best to handle a bulk insert/update scenario with Postgres. Here is my use case:

* I periodically get a file with thousands of entries (lines).
* I process each entry (line) from the file, and the data is split and stored across different Postgres tables. Some tables have foreign keys on other tables. There is "no" straight mapping from an entry in the file to the Postgres tables.
* Some of the data may be updates to existing rows in the Postgres tables, while the rest may be inserts.
* I would like to ensure atomicity (either all rows get stored in all tables, or nothing is stored on a failure from Postgres).
* I would also like to make sure there are no concurrency issues in case two different processes try to perform the above at the same time.
* Ideally, I would want to avoid individual upserts after processing every single entry (line) from the file.

I thought this would be a fairly common use case. What is the best way to handle the above? What performance issues should I keep in mind, and what are the pitfalls? I tried looking around for articles on such a use case - any pointers would be greatly appreciated.

By the way, the application is in Python running in Apache Spark and can use any Python libraries that can help simplify the above. Thanks in advance.
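[Editor's note] The atomicity requirement above boils down to running every statement inside a single transaction, with INSERT ... ON CONFLICT handling the insert-or-update choice per row. A minimal runnable sketch follows; it uses sqlite3 purely as a serverless stand-in, since SQLite 3.24+ accepts the same INSERT ... ON CONFLICT DO UPDATE syntax as Postgres (with psycopg2 the pattern is identical), and the `inventory` table is made up for illustration:

```python
import sqlite3

# sqlite3 is used here only so the sketch runs without a server;
# SQLite 3.24+ supports Postgres-style INSERT ... ON CONFLICT DO UPDATE.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inventory (sku TEXT PRIMARY KEY, qty INTEGER)")

# "a" appears twice: the first occurrence inserts, the second updates.
rows = [("a", 1), ("b", 2), ("a", 5)]

with conn:  # one transaction: all rows commit together, or none do on error
    conn.executemany(
        """INSERT INTO inventory (sku, qty) VALUES (?, ?)
           ON CONFLICT (sku) DO UPDATE SET qty = excluded.qty""",
        rows,
    )

result = dict(conn.execute("SELECT sku, qty FROM inventory ORDER BY sku"))
# result == {"a": 5, "b": 2}
```

Because the whole batch runs in one transaction, a constraint violation on any row rolls back every table touched so far, which covers the all-or-nothing requirement; row-level locking from ON CONFLICT handles two concurrent loaders upserting the same keys.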
Re: Bulk Insert/Update Scenario
Thanks Jordan. One more question I had: is there any way to avoid doing individual INSERT ... ON CONFLICT statements? I was thinking about dumping everything into a TEMP table and using that as the source for INSERT ... ON CONFLICT. However, I was not sure how to get thousands of rows from my Python application into the TEMP table in one shot. Or are there any better alternatives? Thanks.

On Thu, Jan 4, 2018 at 12:43 PM, Jordan Deitch wrote:

> Hi Mana,
>
> A starting point would be reading about the batch upsert functionality:
> https://www.postgresql.org/docs/current/static/sql-insert.html
>
> You would do something like:
> INSERT INTO table ON CONFLICT update...
>
> This operation would be atomic. You can also look into deferrable
> constraints such that you would perform all your insert / update
> operations in a transaction block and accommodate for the constraints.
>
> I hope this helps to get you on the right track!
>
> Thanks,
> Jordan Deitch
> http://id.rsa.pub
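[Editor's note] The usual answer to "thousands of rows into a TEMP table in one shot" is COPY FROM STDIN, which psycopg2 exposes as `cursor.copy_expert`. A sketch of the staging pattern asked about above, assuming a psycopg2 connection and a made-up `inventory (sku, qty)` table; `rows_to_tsv` is a hypothetical helper, and real data would need tab/newline escaping (or `csv.writer`):

```python
import io


def rows_to_tsv(rows):
    """Serialize rows into a tab-separated in-memory buffer for COPY.

    Assumes values contain no tabs or newlines; production code should
    escape them or generate CSV and use COPY ... WITH (FORMAT csv).
    """
    buf = io.StringIO()
    for row in rows:
        buf.write("\t".join(str(col) for col in row) + "\n")
    buf.seek(0)
    return buf


def bulk_upsert(conn, rows):
    """COPY rows into a TEMP staging table, then merge in one statement.

    `conn` is assumed to be an open psycopg2 connection.
    """
    with conn.cursor() as cur:
        # A TEMP table is private to this session, so two concurrent
        # loaders never see each other's staging data.
        cur.execute(
            "CREATE TEMP TABLE staging (LIKE inventory) ON COMMIT DROP"
        )
        # Single round trip for the whole batch, far faster than
        # per-row INSERTs.
        cur.copy_expert("COPY staging (sku, qty) FROM STDIN",
                        rows_to_tsv(rows))
        cur.execute(
            """INSERT INTO inventory (sku, qty)
               SELECT sku, qty FROM staging
               ON CONFLICT (sku) DO UPDATE SET qty = EXCLUDED.qty"""
        )
    conn.commit()  # COPY and merge succeed or fail as one unit
```

Since psycopg2 opens a transaction implicitly, the COPY and the merge are atomic together, and `ON COMMIT DROP` cleans up the staging table automatically.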