Hi All,

I have several third-party living datasets (accessible via web services)
that I need to replicate to an internal postgres instance (ETL processes).

My question is: given that my script/model has obtained a fresh copy of the
dataset, what would you recommend I use to synchronize the newly downloaded
data with the older data already in the Postgres database? In other words,
how do I sync/reconcile the database?

I have looked at the Export to postgres
<https://docs.qgis.org/3.22/en/docs/user_manual/processing_algs/qgis/database.html#export-to-postgresql>,
but this doesn't seem like the right approach. It can overwrite (i.e. drop
and recreate) a table, but that is not a suitable option here. It also
doesn't document what it does if it finds a duplicate primary
key/constraint (perhaps it does an upsert, or maybe it just fails)?

Alternatively, there is a PostgreSQL execute SQL
<https://docs.qgis.org/3.22/en/docs/user_manual/processing_algs/qgis/database.html#postgresql-execute-sql>
algorithm,
but that requires me to hand-code all the SQL statements in my script.
Doable, but not elegant.
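(For what it's worth, if I do end up hand-coding the SQL, PostgreSQL's
native INSERT ... ON CONFLICT DO UPDATE (upsert) would keep the
reconciliation down to one statement per table. A minimal sketch of a
helper that builds such a statement; the table and column names here are
made up for illustration, and the resulting string could be fed to the
execute SQL algorithm or run with a driver like psycopg2:)

```python
def build_upsert(table, columns, key_columns):
    """Build an INSERT ... ON CONFLICT DO UPDATE (upsert) statement with
    %s placeholders, one per column, so it can be executed once per
    feature of the freshly downloaded dataset.

    Note: table/column names are interpolated directly, so they must come
    from trusted code, not from user input.
    """
    col_list = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))
    # On a key collision, overwrite the non-key columns with the new values.
    updates = ", ".join(
        f"{c} = EXCLUDED.{c}" for c in columns if c not in key_columns
    )
    return (
        f"INSERT INTO {table} ({col_list}) VALUES ({placeholders}) "
        f"ON CONFLICT ({', '.join(key_columns)}) DO UPDATE SET {updates}"
    )

# Hypothetical table of monitoring stations keyed on station_id:
sql = build_upsert("stations", ["station_id", "name", "geom"], ["station_id"])
```

(The returned statement would then be run for each downloaded row, e.g.
with psycopg2's cursor.executemany(sql, rows). Rows deleted upstream
would still need separate handling, such as a DELETE ... WHERE NOT IN
against the fresh key set.)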

Advice would be greatly appreciated.

Thanks for reading,
Andrew



_______________________________________________
Qgis-user mailing list
[email protected]
List info: https://lists.osgeo.org/mailman/listinfo/qgis-user
Unsubscribe: https://lists.osgeo.org/mailman/listinfo/qgis-user