I am setting up a data processing pipeline whose end result gets stored
in Postgres. I have not been a heavy DB user in general, and I had a question
about how best to handle a bulk insert/update scenario with Postgres.
Here is my use case:
* I get a file with thousands of entries (lines) periodically
Thanks Jordan.
One more question I had was: is there any way to avoid doing individual
INSERT ... ON CONFLICT statements? I was thinking about dumping everything
into a TEMP table and using that as the source for a single INSERT ...
ON CONFLICT. However, I was not sure how to get thousands of rows from my
Python application into the TEMP table efficiently.
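For anyone following along, here is a minimal sketch of that approach using psycopg2's `copy_expert` to stream rows into a TEMP table, then upserting from it in one statement. The table name `items`, its columns, and the `id` conflict key are hypothetical stand-ins for whatever your real schema is; the connection is assumed to come from `psycopg2.connect(...)`.

```python
import csv
import io

# Hypothetical schema for illustration only; substitute your real table/columns.
TABLE = "items"
COLUMNS = ("id", "name", "qty")

def rows_to_csv_buffer(rows):
    """Serialize rows into an in-memory CSV suitable for COPY ... FROM STDIN."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerows(rows)
    buf.seek(0)
    return buf

# One set-based upsert from the TEMP table into the target table.
UPSERT_SQL = f"""
    INSERT INTO {TABLE} ({", ".join(COLUMNS)})
    SELECT {", ".join(COLUMNS)} FROM tmp_{TABLE}
    ON CONFLICT (id) DO UPDATE
        SET name = EXCLUDED.name,
            qty  = EXCLUDED.qty;
"""

def bulk_upsert(conn, rows):
    """COPY rows into a TEMP table, then upsert them in a single statement.

    `conn` is assumed to be a psycopg2 connection.
    """
    with conn.cursor() as cur:
        # TEMP table matches the target's shape and vanishes at COMMIT.
        cur.execute(
            f"CREATE TEMP TABLE tmp_{TABLE} "
            f"(LIKE {TABLE} INCLUDING DEFAULTS) ON COMMIT DROP;"
        )
        # COPY is far faster than thousands of individual INSERTs.
        cur.copy_expert(
            f"COPY tmp_{TABLE} ({', '.join(COLUMNS)}) "
            f"FROM STDIN WITH (FORMAT csv)",
            rows_to_csv_buffer(rows),
        )
        cur.execute(UPSERT_SQL)
    conn.commit()
```

The key design point is that COPY does the bulk transfer (one round trip, no per-row parsing overhead), and the conflict handling happens server-side in a single set-based statement rather than once per row.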