Re: How to Copy/Load 1 billion rows into a Partitioned Table Fast

2024-10-14 Thread David Rowley
On Tue, 15 Oct 2024 at 06:59, Wong, Kam Fook (TR Technology) wrote:
> I am trying to copy a table (Postgres) that is close to 1 billion rows into a
> partitioned table (Postgres) within the same DB. What is the fastest way to
> copy the data? This table has 37 columns, some of which are text data types.

Re: How to Copy/Load 1 billion rows into a Partitioned Table Fast

2024-10-14 Thread Juan Rodrigo Alejandro Burgos Mella
Hi Wong, On one occasion I had to upload 600 million records, and the most viable and safest option was to generate load plans and run them through a massively parallelized process (for each process we audited that everything was loaded correctly). Regards, JRBM On Mon, 14 Oct 2024 at 14:59, Wong, Kam Fook (TR Technology) wrote:
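The "audited that everything was correct" step can be sketched as a per-batch row-count reconciliation between source and target. This is a hypothetical illustration, not the poster's actual code; the `batch_id` column, the table names, and the cursor wiring are all assumptions:

```python
def audit_batches(src_counts, tgt_counts):
    """Return the batch ids whose source and target row counts differ.

    Both arguments map batch_id -> row count; a batch missing from the
    target counts as 0 rows loaded.
    """
    return [b for b in src_counts if tgt_counts.get(b, 0) != src_counts[b]]

def batch_counts(cur, table, batch_col="batch_id"):
    """Count rows per batch in one scan; `cur` is an open psycopg2 cursor.

    Table and column names here are hypothetical stand-ins.
    """
    cur.execute(f"SELECT {batch_col}, count(*) FROM {table} GROUP BY 1")
    return dict(cur.fetchall())
```

After a parallel load, `audit_batches(batch_counts(cur, "src"), batch_counts(cur, "tgt"))` returning an empty list is the "everything was correct" signal; any batch it names can be reloaded in isolation.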

Re: How to Copy/Load 1 billion rows into a Partitioned Table Fast

2024-10-14 Thread Muhammad Usman Khan
Hi, There are many methods to achieve this; one of them is the pg_bulkload utility described in a previous email, but I have always preferred Python multiprocessing, which I find more efficient. Below is the code, which you can modify as per your requirements: import multiprocessing import psycopg2
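The archive preview cuts the script off after the imports. A minimal sketch of the multiprocessing approach it describes, assuming a source table `src`, a partitioned target `tgt_partitioned`, a contiguous integer `id` key, and a hypothetical DSN `dbname=mydb` (none of these names come from the original message):

```python
import multiprocessing

def id_ranges(min_id, max_id, n_chunks):
    """Split [min_id, max_id] into contiguous (lo, hi) ranges, one per worker task."""
    step = -(-(max_id - min_id + 1) // n_chunks)  # ceiling division
    return [(lo, min(lo + step - 1, max_id))
            for lo in range(min_id, max_id + 1, step)]

def copy_range(bounds):
    """Copy one key range from src into the partitioned target in its own transaction."""
    import psycopg2  # imported in the worker so the module loads without the driver
    lo, hi = bounds
    conn = psycopg2.connect("dbname=mydb")  # hypothetical DSN
    with conn, conn.cursor() as cur:       # commits on success, rolls back on error
        cur.execute(
            "INSERT INTO tgt_partitioned SELECT * FROM src "
            "WHERE id BETWEEN %s AND %s", (lo, hi))
    conn.close()

def run_parallel(workers=8, chunks=200):
    """Fan the ranges out over a process pool; tune workers to CPU and I/O capacity."""
    with multiprocessing.Pool(workers) as pool:
        pool.map(copy_range, id_ranges(1, 1_000_000_000, chunks))
```

Each worker copies one contiguous key range in its own transaction, so a failed chunk can be retried without disturbing the others; making chunks much more numerous than workers keeps the pool busy even when ranges vary in cost.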

Re: How to Copy/Load 1 billion rows into a Partitioned Table Fast

2024-10-14 Thread Durgamahesh Manne
On Mon, 14 Oct 2024 at 23:29, Wong, Kam Fook (TR Technology) <kamfook.w...@thomsonreuters.com> wrote:
> I am trying to copy a table (Postgres) that is close to 1 billion rows
> into a partitioned table (Postgres) within the same DB. What is the fastest
> way to copy the data? This table has 37 columns, some of which are text data types.

How to Copy/Load 1 billion rows into a Partitioned Table Fast

2024-10-14 Thread Wong, Kam Fook (TR Technology)
I am trying to copy a table (Postgres) that is close to 1 billion rows into a partitioned table (Postgres) within the same DB. What is the fastest way to copy the data? This table has 37 columns, some of which are text data types. Thank you, Kam Fook Wong