Thanks Dim; I was not aware of pgloader. This, and the other suggestions,
have helped a lot; thanks everyone.

--rick

On Mon, Mar 29, 2010 at 7:41 AM, Dimitri Fontaine <dfonta...@hi-media.com> wrote:

> Rick Casey <caseyr...@gmail.com> writes:
>
> > So, I am wondering if there is any way to optimize this process? I have
> > been using Postgres for several years, but have never had to partition
> > or optimize it for files of this size until now.
> > Any comments or suggestions would be most welcome from this excellent
> > forum.
>
> The pgloader tool will import your data as batches of N lines; you get
> to say how many lines you want to consider in each transaction. Plus,
> you can have more than one Python thread importing your big file, either
> sharing one writer and having the other threads do the parsing and
> COPY, or having N independent threads each doing the reading/parsing/COPY.
>
>  http://pgloader.projects.postgresql.org/
>
> Hope this helps,
> --
> dim
>
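
For anyone finding this in the archives: the batching approach Dimitri
describes above boils down to loading the file in chunks, with one COPY and
one commit per chunk, so a problem row only costs you that batch rather than
the whole load. Below is a minimal sketch of that idea using psycopg2; it is
not pgloader itself, and the table name, file name, connection string, and
batch size are placeholders to adjust for your setup.

    # Minimal sketch of the batched-COPY idea (not pgloader itself).
    # Table name, file name, connection string, and batch size are placeholders.
    import io
    import itertools
    import psycopg2

    BATCH_SIZE = 10000                       # lines per transaction
    conn = psycopg2.connect("dbname=mydb")   # adjust for your environment

    with open("big_file.csv") as f:          # assumed: CSV with no header row
        while True:
            batch = list(itertools.islice(f, BATCH_SIZE))
            if not batch:
                break
            buf = io.StringIO("".join(batch))
            cur = conn.cursor()
            cur.copy_expert("COPY big_table FROM STDIN WITH CSV", buf)
            conn.commit()                    # one commit per batch
            cur.close()

    conn.close()

pgloader layers reject handling (bad rows are logged instead of aborting the
batch) and the multi-threaded reading Dimitri mentions on top of this, so for
a one-off bulk load of a file this size it is still the better tool than
rolling your own loop.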



-- 
----------------------------------------------------------------------------
Rick Casey :: caseyr...@gmail.com :: 303.345.8893
