I'd restructure the table by splitting it into perhaps 100 or so inherited
tables (or more). In my experience, Postgres doesn't handle that many rows in
a single table efficiently; my target is to keep each table under about 100
million rows. I slice them up based on the common query patterns, usually by
some ID number modulo 100, rather than by the date ranges that most tutorials
suggest.
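
For illustration, here is a minimal sketch of the kind of layout I mean,
reusing the nodes/geom names from your message. The node_id column, the
point type, the bucket count, and the child-table names are just
placeholders, and a real setup also needs a trigger or application-side
routing to send inserts to the right child:

  -- Parent table; children inherit its columns
  CREATE TABLE nodes (
      node_id  bigint NOT NULL,
      geom     geography(Point, 4326)
  );

  -- One child per modulo bucket (repeat, or generate, for 0..99)
  CREATE TABLE nodes_p00 (CHECK (node_id % 100 = 0)) INHERITS (nodes);
  CREATE TABLE nodes_p01 (CHECK (node_id % 100 = 1)) INHERITS (nodes);
  -- ...

  -- Each child gets its own GiST index, so each build covers roughly
  -- 45 million rows instead of 4.5 billion and can be redone on its own
  CREATE INDEX nodes_p00_geom_idx ON nodes_p00 USING gist (geom);
  CREATE INDEX nodes_p01_geom_idx ON nodes_p01 USING gist (geom);

One caveat: the planner can generally only skip children via constraint
exclusion when the query repeats the modulo predicate, e.g.
WHERE node_id = 123456 AND node_id % 100 = 56.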

On Thu, Jan 15, 2015 at 7:44 AM, Daniel Begin <jfd...@hotmail.com> wrote:

> Hi, I'm trying to create an index on coordinates (geography type) over a
> large table (4.5 billion records) using GiST...
>
> CREATE INDEX nodes_geom_idx ON nodes USING gist (geom);
>
> The command ran for 5 days until my computer stopped because of a power
> outage! Before restarting the index creation, I am asking the community if
> there are ways to shorten the time it took the first time :-)
>
> Any idea?
>
> Daniel
>
