Re: [GENERAL] function SETOF return type with variable columns?

2008-08-22 Thread James Neff
Merlin Moncure wrote: On Wed, Aug 20, 2008 at 12:59 PM, James Neff <[EMAIL PROTECTED]> wrote: Greetings, Is it possible to have a function with a return type of SETOF that has a variable number of return columns? On Wed, Aug 20, 2008 at 10:08 PM, Tom Lane <[EMAIL PROTECTED]>

[GENERAL] function SETOF return type with variable columns?

2008-08-20 Thread James Neff
Greetings, Is it possible to have a function with a return type of SETOF that has a variable number of return columns? The input parameter for this function will be a string containing a number of codes separated by a tilde character. I would like to have one output column for each of these codes.
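
The replies are cut off above, but the way this is usually handled is RETURNS SETOF record: the function's column set is not truly decided at run time, the caller supplies a column definition list in the query instead. A minimal sketch, assuming a hypothetical some_table with a code column (neither appears in the original thread):

    CREATE OR REPLACE FUNCTION code_counts(p_codes text)
    RETURNS SETOF record AS $$
    DECLARE
        rec record;
    BEGIN
        -- Split the tilde-separated input and return one row per matching code.
        FOR rec IN
            SELECT code, count(*) AS cnt
            FROM   some_table
            WHERE  code = ANY (string_to_array(p_codes, '~'))
            GROUP  BY code
        LOOP
            RETURN NEXT rec;
        END LOOP;
        RETURN;
    END;
    $$ LANGUAGE plpgsql;

    -- The caller names the columns and types at call time:
    SELECT * FROM code_counts('A~B~C') AS t(code text, cnt bigint);

If the goal is literally one column per code (a pivot), the crosstab functions in contrib/tablefunc are the usual tool, but they too require the column list to be spelled out in each query.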

Re: [GENERAL] question: knopixx and postgresql on flash drive

2007-03-23 Thread James Neff
Mark wrote: I would like to use postgresql with knopixx (sounds like a simple idea :-)) and I would like to get the full version of postgresql stored on a flash drive. I remember I've seen postgresql tar files before, but do not recall the location - can anybody point me to it? Also, how big (in MB) is postgresql

Re: [GENERAL] Practical question.

2007-03-16 Thread James Neff
louis gonzales wrote: Is it better to have 1 monolithic table and have to search it, or small individual tables but many of them? Ron Johnson wrote: Yes, 1 large table. This is what RDBMSs were designed for.

[GENERAL] pg_dump and very slow database

2007-01-10 Thread James Neff
Greetings, I started a process last night that is reading records from one table, applying some business rules (validation), and then moving some data to another table. I also had a cron job, which I had forgotten about, that ran last night. It simply executes pg_dump: /usr/local/pgsql/bin/pg_dump
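
The validate-and-move step described here is typically an INSERT ... SELECT followed by a DELETE inside one transaction. A rough sketch with invented table and column names (the thread does not show the real schema):

    BEGIN;

    -- Copy the rows that pass the (stand-in) business rule ...
    INSERT INTO records_clean (id, payload)
    SELECT id, payload
    FROM   records_staging
    WHERE  payload IS NOT NULL;

    -- ... and remove them from the staging table.
    DELETE FROM records_staging
    WHERE  payload IS NOT NULL;

    COMMIT;

A concurrent pg_dump only takes shared locks, so it should not block this, but it does compete for I/O, which is the likely source of the slowdown being asked about.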

Re: [GENERAL] Database versus filesystem for storing images

2007-01-05 Thread James Neff
"... and Moses said unto them, 'The eleventh commandment : thou shalt store images in a database!'..." What if you had another database where you stored just the images and not back it up if you don't want to? As an application developer, I like the idea of storing files and images in the d

Re: [GENERAL] About auto_increment

2007-01-02 Thread James Neff
Yesh wrote: Hi, I need to know how to increment a primary key field automatically at run time. If you use the "serial" data type, the database does this for you automatically and you don't have to worry about it. Is this the data type you're using? -- James Neff
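
A quick illustration of the "serial" advice (table and column names are made up): the column is backed by an implicit sequence, so each INSERT receives the next value without the application doing anything.

    CREATE TABLE members (
        id   serial PRIMARY KEY,   -- auto-incremented via an implicit sequence
        name text NOT NULL
    );

    INSERT INTO members (name) VALUES ('Alice');  -- id 1 assigned automatically
    INSERT INTO members (name) VALUES ('Bob');    -- id 2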

[GENERAL] psql script error handling

2006-12-29 Thread James Neff
I have an SQL script that I am trying to execute in the psql client on the database server itself. The script is just a bunch (hundreds of thousands) of INSERT statements. I don't know how, but I seem to have bad characters throughout my file, and when I run the script it will of course error out complaining.
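
The fix chosen in the thread is cut off above, but psql itself has a few settings that help with a huge INSERT script like this. A general sketch (the file name and LATIN1 are just example placeholders):

    -- Stop at the first failed statement instead of ploughing through the rest:
    \set ON_ERROR_STOP on

    -- If the file runs inside a transaction block, roll back only the failing
    -- statement instead of losing everything done so far:
    \set ON_ERROR_ROLLBACK on

    -- "Bad characters" are often an encoding mismatch between the file and the database:
    \encoding LATIN1

    -- Run the script (hypothetical file name):
    \i inserts.sql

The same can be done from the shell with psql -v ON_ERROR_STOP=1 -f inserts.sql.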

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread James Neff
Frank Finner wrote: In Java, assuming you have a Connection c, you simply say "c.commit();" after doing some action on the database. After every commit, the transaction will be executed and closed, and a new one opened, which runs until the next commit. Regards, Frank. That did it, thanks!
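
Frank's advice is JDBC-specific; at the SQL level the same idea is grouping the inserts into explicit transactions so each batch is committed as a unit. A sketch with invented column values, since the real data_archive layout isn't shown in the thread:

    BEGIN;
    INSERT INTO data_archive VALUES (1, 'first row of this batch');
    INSERT INTO data_archive VALUES (2, 'second row of this batch');
    -- ... a few thousand rows per batch is a common choice ...
    COMMIT;   -- everything above becomes durable in one go

    BEGIN;    -- the next batch runs in a fresh transaction
    INSERT INTO data_archive VALUES (3, 'first row of the next batch');
    COMMIT;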

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread James Neff
Joshua D. Drake wrote: You need to vacuum during the inserts :) I ran a vacuum during the INSERTs and it seemed to help a little, but it's still relatively slow compared to the first 2 million records. Any other ideas? Thanks, James
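
What "vacuum during the inserts" can look like in practice, run from a second session while the load is going on (the table name comes from the related message below):

    -- Reclaim dead space and refresh planner statistics while the load runs:
    VACUUM ANALYZE data_archive;

    -- On 8.1 and later the autovacuum daemon can be enabled instead
    -- (postgresql.conf: autovacuum = on).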

Re: [GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread James Neff
Joshua D. Drake wrote: Also, as you are running 8.2 you can use multi-valued inserts... INSERT INTO data_archive VALUES (...), (...), (...) Would this speed things up? Or is that just another way to do it? Thanks, James
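
The multi-row VALUES form (new in 8.2) looks like this; the values are placeholders, since the table definition isn't shown in the thread:

    INSERT INTO data_archive VALUES
        (1, 'row one'),
        (2, 'row two'),
        (3, 'row three');

It does tend to be faster than one INSERT per row, because the parse, plan, and network round-trip overhead is paid once per statement rather than once per row.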

[GENERAL] slow speeds after 2 million rows inserted

2006-12-29 Thread James Neff
Greetings, I've got a Java application that is reading data from a flat file and inserting it into a table. The first 2 million rows (each file contained about 1 million lines) went pretty fast: less than 40 minutes to insert into the database. After that, the insert speed is slow. I think I may
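
Not raised in the truncated replies above, but worth knowing for flat-file loads of this size: PostgreSQL's COPY command (or psql's \copy when the file lives on the client machine) loads an entire file in one statement and avoids most per-row overhead. The path and CSV format here are assumptions:

    COPY data_archive FROM '/path/to/flatfile.csv' WITH CSV;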