On Tue, 28 Apr 2009, Tim Uckun wrote:
> Does anybody know if there is a sample database or text files I can
> import to do some performance testing? I would like to have tables with
> tens of millions of records if possible.
There is a utility that ships with PostgreSQL named pgbench that includes
a data generator: pgbench -i creates and populates its benchmark tables,
and the -s scale factor sets the size (each scale unit is 100,000 rows in
the accounts table, so -s 100 gives you ten million rows).
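A typical session might look like this (the database name is a
placeholder, and the run options are just an example):

```shell
# create an empty database to benchmark against (name is arbitrary)
createdb pgbench_test

# initialize the pgbench tables; -s 100 => 100 * 100,000 = 10,000,000
# rows in the accounts table
pgbench -i -s 100 pgbench_test

# run a benchmark: 8 clients, 1000 transactions each
pgbench -c 8 -t 1000 pgbench_test
```

These commands need a running PostgreSQL server, so adjust connection
options (-h, -p, -U) to your setup.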
In response to Tim Uckun :
> Thanks I'll try something like that.
>
> I guess I can create some random dates or something for other types of
> fields too.
Sure, dates for instance:
test=*# select (current_date + random() * 1000 * '1day'::interval)::date from
generate_series(1,10);
    date
------------
(10 rows of random dates within the next 1000 days; output omitted)
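The same random() trick covers other column types as well; for instance
(the column aliases are just illustrative):

```sql
-- random integers between 1 and 1000
select (1 + random() * 999)::int as n from generate_series(1,10);

-- random text: hash a random number
select md5(random()::text) as txt from generate_series(1,10);

-- random booleans
select random() < 0.5 as flag from generate_series(1,10);
```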
>
> > I would like to have tables with tens of millions of records if possible.
>
> It is easy to create such a table:
>
> test=# create table huge_data_table as select s, md5(s::text) from
> generate_series(1,10) s;
Thanks I'll try something like that.
I guess I can create some random dates or something for other types of
fields too.
In response to Tim Uckun :
> Does anybody know if there is a sample database or text files I can import to
> do some performance testing?
>
> I would like to have tables with tens of millions of records if possible.
It is easy to create such a table:
test=# create table huge_data_table as select s, md5(s::text) from
generate_series(1,10) s;
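Scaled up to the tens of millions of rows asked for, the same statement
would be (the row count here is picked arbitrarily; expect it to take a
while and use a few GB of disk):

```sql
create table huge_data_table as
select s, md5(s::text) from generate_series(1, 20000000) s;
```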
Hello Tim,
you can create this yourself very easily, e.g. if you have a table
CREATE TABLE test1
(
    a_int serial NOT NULL,
    a_text character varying(200),
    dt timestamp without time zone DEFAULT now(),
    primary key (a_int)
);
create a bunch of data with something like:

insert into test1 (a_text)
select md5(s::text) from generate_series(1, 10000000) s;
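Since dt defaults to now(), every row would get the load time; to spread
the timestamps out, you could combine this with the interval trick above
(a sketch, with the row count and date range chosen arbitrarily):

```sql
-- a_int fills itself from the serial sequence; dt gets a random
-- timestamp somewhere in the past year instead of its default
insert into test1 (a_text, dt)
select md5(s::text),
       now() - random() * '365 days'::interval
from generate_series(1, 1000000) s;
```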
Does anybody know if there is a sample database or text files I can import
to do some performance testing?
I would like to have tables with tens of millions of records if possible.