Hi Folks,

I’m writing a little utility for dumping parts of tables into files that will 
later be slurped into a second primary instance. The source is our referential 
data (big, and under heavy load when data is added); the target is a smaller, 
portable copy that drives our web app and is less prone to lag.

Yes, a replication strategy could work, but since the web app’s copy is so much 
smaller (about 10% of the size) I thought a partial snapshot would be easier to 
manage.

I have SQL that does this with \copy (select * from <table> where…) … and that 
works fine. But it would be nice to run the \copy commands in parallel, so I 
was thinking of writing a background worker.
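
For concreteness, here is roughly what one of those exports looks like when 
driven from Python with psycopg2 rather than straight psql (a sketch only; the 
connection string, table name, filter, and file name are made-up placeholders):

    # Minimal sketch of one partial-table export via psycopg2's
    # copy_expert(); all names here are hypothetical placeholders.
    import psycopg2

    def dump_subset(dsn, table, where, out_path):
        # COPY ... TO STDOUT streams the selected rows to the client,
        # which writes them into a local CSV file.
        sql = f"COPY (SELECT * FROM {table} WHERE {where}) TO STDOUT WITH CSV HEADER"
        with psycopg2.connect(dsn) as conn:
            with conn.cursor() as cur, open(out_path, "w") as f:
                cur.copy_expert(sql, f)

    dump_subset("dbname=refdata", "widgets", "web_visible", "widgets.csv")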

Never having done that before, I’m curious:
1) Is a background worker that I can execute in parallel appropriate for this 
job?
2) Are there non-trivial examples of background workers out there to learn 
from?
3) Will running multiple \copy commands in parallel provide any benefit? Since 
pg_dump and pg_restore have options to run multiple jobs in parallel, I assumed 
it should (a sketch of the client-side version I have in mind follows below).
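
For illustration, the kind of client-side parallelism I mean is just fanning 
the existing \copy commands out over several psql processes (again, the 
database name, tables, filters, and paths are placeholders):

    # Hypothetical sketch: run several \copy exports concurrently by
    # launching one psql process per table from a client script.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    JOBS = [
        ("widgets", "web_visible", "widgets.csv"),
        ("gadgets", "web_visible", "gadgets.csv"),
    ]

    def run_copy(table, where, out_path):
        meta = rf"\copy (SELECT * FROM {table} WHERE {where}) TO '{out_path}' WITH CSV HEADER"
        # Each psql invocation opens its own session, so the COPYs
        # genuinely run side by side.
        subprocess.run(["psql", "-d", "refdata", "-c", meta], check=True)

    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(run_copy, *job) for job in JOBS]
        for f in futures:
            f.result()  # surface any psql failure

If something that simple is enough, it would sidestep the background-worker 
machinery entirely.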

Thanks

Joe
