Hi Wong,
On one occasion I had to load 600 million records, and the most viable
and safest option was to generate load plans (batches) and run them
through a massively parallelized process, because for each batch we
audited that everything was correct. A rough sketch of the idea follows.
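
In Postgres that could look roughly like the sketch below. It is only a
minimal illustration, assuming a hypothetical source table big_source
with a bigint id column and a target partitioned table big_target with
the same column layout; each disjoint id range is run in its own session
or worker so the inserts proceed in parallel, and each batch is counted
before moving on to the next range:

    -- run one statement like this per worker, each on a disjoint id range
    INSERT INTO big_target
    SELECT * FROM big_source
    WHERE id >= 0 AND id < 100000000;

    -- audit the batch before starting the next range
    SELECT count(*) FROM big_target
    WHERE id >= 0 AND id < 100000000;

Keeping the ranges disjoint avoids duplicate rows and lets you rerun or
verify any single batch on its own if something goes wrong.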

Regards,
JRBM

On Mon, Oct 14, 2024 at 14:59, Wong, Kam Fook (TR Technology) (<
kamfook.w...@thomsonreuters.com>) wrote:

> I am trying to copy a table (Postgres) that is close to 1 billion rows
> into a partitioned table (Postgres) within the same DB. What is the fastest
> way to copy the data? The table has 37 columns, some of which are
> text data types.
>
> Thank you
> Kam Fook Wong
>
