You should look at table partitioning. That is, you make a master
table and then make a table for each state that inherits from the
master. That way you can query each state individually, or you can
query the whole country if need be.
http://www.postgresql.org/docs/current/static/ddl-partitioning.html
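For example, here's a minimal sketch of the inheritance approach
(assuming PostGIS for the geometry column; the table and column names
like ag_parcels and state_fips are just illustrative, not from your
data):

-- Master table; it holds no rows itself.
CREATE TABLE ag_parcels (
    gid        serial,
    state_fips char(2) NOT NULL,
    geom       geometry
);

-- One child table per state, constrained to that state's rows.
CREATE TABLE ag_parcels_co (
    CHECK (state_fips = '08')
) INHERITS (ag_parcels);

CREATE TABLE ag_parcels_wy (
    CHECK (state_fips = '56')
) INHERITS (ag_parcels);

-- Spatial index on each child table.
CREATE INDEX ag_parcels_co_geom_idx ON ag_parcels_co USING gist (geom);
CREATE INDEX ag_parcels_wy_geom_idx ON ag_parcels_wy USING gist (geom);

-- With constraint exclusion on, a query against the master that
-- filters on state_fips only scans the matching child tables.
SET constraint_exclusion = partition;
SELECT count(*) FROM ag_parcels WHERE state_fips = '08';

-- No state filter = the whole country.
SELECT count(*) FROM ag_parcels;

You'd load each state's data into its own child table, and the CHECK
constraints let the planner skip the children it doesn't need.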
On 7/28/2010 12:09 PM, Bill Thoen wrote:
I'm building a national database of agricultural information and one
of the layers is a bit more than a gigabyte per state. That's 1-2
million records per state, with a multipolygon geometry, and I've got
about 40 states' worth of data. I'm trying to store everything in a
single PG table. What I'm concerned about is whether performance will
be terrible if I combine every state into one big table, even with
indexes. On the other hand, if I store the data in several smaller
tables, then if a user zooms in on a multi-state region, I've got to
build or find a much more complicated way to query multiple tables.
So I'm wondering, should I be concerned about building a single
national-size table (possibly 80-100 GB) for all these records, or
should I keep the tables smaller and hope there's something like
ogrtindex out there for PG tables? What do you all recommend in this
case? I just moved over to Postgres to handle big files, but I don't
know its limits. With a background working with MS Access and bitter
memories of what happens when you get near Access' two gigabyte
database size limit, I'm a little nervous about these much bigger files.
So I'd appreciate anyone's advice here.
TIA,
- Bill Thoen