Re: Importing several thousand records into a database

2006-06-13 Thread Eric Walstad
On Tuesday 13 June 2006 11:44, keukaman wrote:
> I'd like to import several thousand records, in CSV format, into an
> existing postgres table. Does anyone know a utility that would allow me
> to do this?

Hey keukaman, I had to struggle with a system of regularly importing around 1.5 million records.

Re: Importing several thousand records into a database

2006-06-13 Thread Joshua D. Drake
keukaman wrote:
> I'd like to import several thousand records, in CSV format, into an
> existing postgres table. Does anyone know a utility that would allow me
> to do this?

If you are literally only doing a couple of thousand, I would use the CSV module for Python and create a single transaction.
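A minimal sketch of this suggestion (CSV module, one transaction for the whole batch). The table, column names, and inline CSV data are made up for illustration, and sqlite3 stands in for Postgres so the sketch runs anywhere; with Postgres the pattern is the same through a DB-API driver such as psycopg.

```python
import csv
import io
import sqlite3

# Hypothetical CSV data; in practice you'd open("records.csv") instead.
CSV_DATA = "name,qty\nwidget,4\ngadget,7\n"

# sqlite3 is only a stand-in here so the example is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, qty INTEGER)")

reader = csv.DictReader(io.StringIO(CSV_DATA))
with conn:  # one transaction: commits on success, rolls back on error
    conn.executemany(
        "INSERT INTO items (name, qty) VALUES (:name, :qty)",
        reader,
    )
```

Wrapping the whole `executemany` in a single transaction avoids the per-row commit overhead that makes naive imports slow.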

Re: Importing several thousand records into a database

2006-06-13 Thread aaloy
Python itself could do it. Of course ;) Just be sure not to create a transaction for every record, as it would be terribly slow. One transaction every 5,000 records works for me on an import of ~150,000 records. I did it with the DB API. To be honest I haven't tested directly with Django. Best
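The batching idea above can be sketched with the DB API's explicit transaction commands. Again sqlite3 is a stand-in so the sketch runs anywhere; the batch size of 5,000 is aaloy's figure, and the row source is a dummy range in place of a real CSV reader.

```python
import sqlite3

BATCH_SIZE = 5000  # one commit every 5,000 rows, per aaloy's suggestion

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (n INTEGER)")
conn.isolation_level = None  # manage transactions by hand
cur = conn.cursor()

cur.execute("BEGIN")
for i, row in enumerate(range(12000), start=1):  # stand-in for CSV rows
    cur.execute("INSERT INTO items (n) VALUES (?)", (row,))
    if i % BATCH_SIZE == 0:
        cur.execute("COMMIT")
        cur.execute("BEGIN")
cur.execute("COMMIT")  # flush the final partial batch
```

Committing in batches keeps each transaction small without paying a commit per row.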

Re: Importing several thousand records into a database

2006-06-13 Thread Don Arbow
The psql utility will do that for you. You can specify the field and record separator on the command line. You can prefix your data with the COPY command and it will get slurped right into the database. Note that the default value for null columns is \N, but you can specify a different value.
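A small sketch of the "prefix your data with COPY" idea: convert CSV text into a tab-separated block headed by a COPY statement, with empty fields turned into Postgres's default \N null marker and a terminating \. line, ready to pipe into psql. The table name and data are made up, and this simple version does not escape tabs or backslashes inside field values.

```python
import csv
import io


def csv_to_copy_payload(csv_text, table):
    """Turn CSV text (with a header row) into a COPY ... FROM stdin block.

    Fields become tab-separated, empty fields become \\N (Postgres's
    default null marker), and the block ends with the \\. terminator.
    """
    rows = csv.reader(io.StringIO(csv_text))
    header = next(rows)
    lines = [f"COPY {table} ({', '.join(header)}) FROM stdin;"]
    for row in rows:
        lines.append("\t".join(f if f != "" else r"\N" for f in row))
    lines.append("\\.")
    return "\n".join(lines) + "\n"


payload = csv_to_copy_payload("name,qty\nwidget,4\ngadget,\n", "items")
```

Feeding the resulting text to `psql yourdb` would run the COPY; for real data with embedded tabs or backslashes you would need proper escaping, or CSV-mode COPY.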

Importing several thousand records into a database

2006-06-13 Thread keukaman
I'd like to import several thousand records, in CSV format, into an existing postgres table. Does anyone know a utility that would allow me to do this?