On Wed, Aug 7, 2024 at 4:39 PM Lok P <loknath...@gmail.com> wrote:

> Hello all,
> We have a Postgres table which is range partitioned on a timestamp column,
> with a total size of ~3TB, holding ~5 billion rows across ~150 daily
> partitions, and there are ~140+ columns in the table. This table is also a
> child of another partitioned table, and partition creation is handled
> through the pg_partman extension on this table.
>
> We have a requirement to modify the existing column lengths as below.
> Doing it directly through a single ALTER command will probably scan and
> rewrite the whole table, which may take hours of run time.
>
> So we are trying to understand from experts what the best possible way is
> to tackle such changes in a Postgres database, and whether we may see any
> side effects considering this table is a child of another one and also has
> a dependency on the pg_partman extension.
>
> two of the columns from varchar(20) to numeric(3)
> one of the columns from varchar(20) to varchar(2)
> one of the columns from Number(10,2) to Numeric(8,2)
>
>
>

Others may correct me, but I think if you don't have FKs defined on these
columns, you can do the below.


-- Alter the table to add the new columns; this is very fast (seconds), as it
-- only updates the system catalogs.

ALTER TABLE tab1
  ADD COLUMN new_column1 numeric(3),
  ADD COLUMN new_column2 varchar(2);
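
Before back-populating, it may be worth a quick check for values that won't
fit the new types. A rough sketch, using the same hypothetical column names
and assuming old_column1 holds integer-like text destined for numeric(3) and
old_column2 is text destined for varchar(2):

-- Flag values that would fail or be truncated by the casts below.
SELECT old_column1, old_column2
FROM tab1
WHERE old_column1 !~ '^-?[0-9]{1,3}$'
   OR length(old_column2) > 2
LIMIT 10;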


-- Back-populate the data partition-wise and commit, if it's really needed

UPDATE tab1_part1 SET new_column1 = CAST(old_column1 AS numeric(3)),
  new_column2 = CAST(old_column2 AS varchar(2));
COMMIT;
UPDATE tab1_part2 SET new_column1 = CAST(old_column1 AS numeric(3)),
  new_column2 = CAST(old_column2 AS varchar(2));
COMMIT;
UPDATE tab1_part3 SET new_column1 = CAST(old_column1 AS numeric(3)),
  new_column2 = CAST(old_column2 AS varchar(2));
COMMIT;
.....
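
With ~150 partitions, rather than hand-writing one UPDATE per partition you
could loop over the children from the catalog. A rough sketch, assuming
PostgreSQL 11+ (so a top-level DO block may issue COMMIT) and the same
hypothetical column names as above:

DO $$
DECLARE
    part regclass;
BEGIN
    FOR part IN
        SELECT inhrelid::regclass
        FROM pg_inherits
        WHERE inhparent = 'tab1'::regclass
    LOOP
        EXECUTE format(
            'UPDATE %s
                SET new_column1 = old_column1::numeric(3),
                    new_column2 = old_column2::varchar(2)
              WHERE new_column1 IS NULL',   -- makes the loop restartable
            part);
        COMMIT;  -- commit partition by partition to keep transactions small
    END LOOP;
END $$;

Driving the same statements from a script works just as well. Either way,
these updates still touch every row, so going partition by partition mainly
limits transaction size and lock duration rather than avoiding the rewrite.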


-- Alter the table to drop the old columns; this is also very fast (seconds),
-- as it only marks them dropped in the system catalogs.
ALTER TABLE tab1 DROP COLUMN old_column1, DROP COLUMN old_column2;
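
If the application expects the original column names, a final metadata-only
rename would typically follow (again using the hypothetical names from above;
on a partitioned table the rename propagates to the children):

ALTER TABLE tab1 RENAME COLUMN new_column1 TO old_column1;
ALTER TABLE tab1 RENAME COLUMN new_column2 TO old_column2;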
