On Tue, Dec 12, 2017 at 2:26 PM Peter Eisentraut <peter.eisentr...@2ndquadrant.com> wrote:

> On 12/12/17 13:03, Jeremy Finzel wrote:
> > To be clear, what I mean is batch updating a large set of data in small
> > pieces so as to avoid things like lock contention and replication lag.
> > Sometimes these have a driving table containing the source data to apply
> > to a destination table via a key column; other times it is as simple as
> > setting a single value across a huge table.
> >
> > I would love instead to have a Postgres extension that uses background
> > workers to accomplish this, especially if it were part of core.  Before
> > I venture into writing something like this, would an extension along
> > these lines ever be considered appropriate for Postgres core?
>
> I don't see what the common ground between different variants of this
> use case would be.  Aren't you basically just looking to execute a
> use-case-specific stored procedure in the background?
>
> --
> Peter Eisentraut              http://www.2ndQuadrant.com/
> PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services


The common ground is that some column in some table needs to be bulk
updated. I may not be explaining it well, but in our environment we have
done hundreds of these backfills using a generic framework, so I'm not sure
which part of the need you are questioning. We have had to build a worker
to accomplish this because it can't be done as a SQL script alone.

I’m not sure what you mean by a stored procedure in the background. Since
the work would not be a single transaction, it doesn’t fit as a stored
procedure, at least in Postgres, where a function runs in a single
transaction; see the sketch below.
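
For concreteness, a minimal client-side sketch of the kind of loop I mean
(Python with psycopg2; the table, columns, connection string, and batch
size are all made up for illustration, and a real backfill worker would
also track progress and handle retries):

    import time

    import psycopg2

    BATCH_SIZE = 5000   # rows per transaction; tuned per workload
    PAUSE = 0.5         # seconds between batches so replicas can catch up

    conn = psycopg2.connect("dbname=mydb")  # hypothetical connection string

    with conn.cursor() as cur:
        while True:
            # Update one batch, chosen by primary key; SKIP LOCKED avoids
            # stalling behind concurrent writers (simplified: rows that
            # stay locked could be missed and would need a retry pass).
            cur.execute("""
                UPDATE big_table
                   SET flag = true
                 WHERE id IN (SELECT id
                                FROM big_table
                               WHERE flag IS DISTINCT FROM true
                               ORDER BY id
                               LIMIT %s
                                 FOR UPDATE SKIP LOCKED)
            """, (BATCH_SIZE,))
            done = cur.rowcount == 0
            conn.commit()   # one transaction per batch
            if done:
                break
            time.sleep(PAUSE)

    conn.close()

The commit after every batch is the whole point: locks are held only
briefly and replicas get a chance to catch up, which is exactly what a
single-transaction function can't give us.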

Sorry if I’m misunderstanding.

Thanks,
Jeremy
