> On Mar 19, 2020, at 3:36 PM, pabloa98 <pablo...@gmail.com> wrote:
>
> Hello,
>
> My schema requires a counter for each combination of 2 values. Something like:
>
> CREATE TABLE counter(
>     "group" INT NOT NULL,  -- group is a reserved word, so it must be quoted
>     element INT NOT NULL,
>     seq_number INT NOT NULL DEFAULT 0,
>     PRIMARY KEY ("group", element)
> );
>
> For each entry in counter, i.e. for each (group, element) pair, the model
> requires a seq_number.
>
> If I use a table like "counter", I could still have collisions between
> two concurrent transactions. I need true sequence behavior. Is that
> possible with a table like "counter", where the counter is incremented
> outside the transaction so that it behaves like a sequence, without race
> conditions between concurrent transactions?
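>
> For illustration, the table-based approach I have in mind is an atomic
> increment like this (just a sketch against the table above; $1 and $2
> stand for the pair being updated):
>
> UPDATE counter
>    SET seq_number = seq_number + 1
>  WHERE "group" = $1 AND element = $2
> RETURNING seq_number;
>
> As I understand it, this is atomic per row, but the row lock is held
> until the transaction commits, so concurrent transactions on the same
> (group, element) pair serialize instead of proceeding independently the
> way they would with nextval().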
>
> The other option is to create a sequence for each new (group, element)
> pair using triggers. There are millions of pairs, so this approach would
> generate millions of sequences.
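>
> Something along these lines, for example (a sketch; the function and
> trigger names are made up, and the generated sequences are named
> seq_<group>_<element>):
>
> CREATE OR REPLACE FUNCTION create_pair_seq() RETURNS trigger AS $$
> BEGIN
>     -- create a dedicated sequence the first time a pair appears
>     EXECUTE format('CREATE SEQUENCE IF NOT EXISTS seq_%s_%s',
>                    NEW."group", NEW.element);
>     RETURN NEW;
> END;
> $$ LANGUAGE plpgsql;
>
> CREATE TRIGGER counter_pair_seq
>     AFTER INSERT ON counter
>     FOR EACH ROW EXECUTE FUNCTION create_pair_seq();
>
> -- later, for the pair (1, 2): SELECT nextval('seq_1_2');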
>
> How would a PostgreSQL database behave with millions of sequences in a
> schema? Would its performance degrade? Is there any other negative impact?
>
> Regards
>
> Pablo
>
To clarify, are you hoping for consecutive numbers as each row is added to
the table, i.e. “serial”?
What is the intention of “seq_number”?