Re: [GENERAL] importing a messy text file

2014-04-30 Thread Michael Paquier
On Thu, May 1, 2014 at 1:30 AM, bricklen wrote:
> On Wed, Apr 30, 2014 at 1:07 AM, Willy-Bas Loos wrote:
>> Hi,
>>
>> I have a 56GB textfile that I want to import into postgres.
>> The file is tab delimited and not quoted.
>
> Would Pgloader be an option? http://tapoueh.org/pgloader/

Or

Re: [GENERAL] WAL Replication + PITR

2014-04-30 Thread Michael Paquier
On Tue, Apr 29, 2014 at 5:44 PM, basti wrote:
> is it possible to have WAL Replication and Point-in-Time Recovery like
> follows:
>
> DB-Master -- (WAL to Slave) --> DB-Slave
>    |
>    |---> (PITR to another Server)

Yes. Be sure to take periodic base backups of the master and an archive o
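For readers following along, the master-side settings this kind of setup usually involves look roughly like the sketch below; the archive path is a placeholder, not something from the thread:

```ini
# postgresql.conf on the master (placeholder paths)
wal_level = hot_standby      # enough WAL detail for a standby (9.x naming)
max_wal_senders = 3          # allow streaming connections from the slave
archive_mode = on
# copy each finished WAL segment to the archive used for PITR
archive_command = 'test ! -f /archive/%f && cp %p /archive/%f'
```

Periodic base backups (e.g. with pg_basebackup) plus this WAL archive are what make point-in-time recovery on the third server possible.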

Re: [GENERAL] SQL query runs fine on one platform (FreeBSD), but hangs on another (Windows)

2014-04-30 Thread David Noel
> There is nothing wrong with LATERALs, they just have no business being
> used here. Sorry for the noise.

Ah. No trouble. In fact I'm glad you chimed in -- it motivated me to learn about laterals, so now I know some new SQL syntax!

Re: [GENERAL] arrays of rows and dblink

2014-04-30 Thread Joe Conway
On 04/30/2014 12:52 PM, Torsten Förtsch wrote:
> Hi,
>
> we have the ROW type and we have arrays. We also can create arrays
> of rows like:
>
> select array_agg(r) from (values (1::int, 'today'::timestamp,
> 'a'::text), (2, 'yesterday', 'b')) r(a,b,c

Re: [GENERAL] Vacuuming strategy

2014-04-30 Thread Sergey Konoplev
On Wed, Apr 30, 2014 at 10:40 AM, Elanchezhiyan Elango wrote:
> log_autovacuum_min_duration = 0
>
> autovacuum_vacuum_scale_factor = 0
> autovacuum_vacuum_threshold = 4
> autovacuum_analyze_scale_factor = 0
> autovacuum_analyze_threshold = 4

I don't think it is a good idea to set scale fa
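Rather than changing these settings globally, the same effect can be scoped to the high-churn tables with per-table storage parameters. A sketch — the table name and threshold values here are made up for illustration:

```sql
-- per-table settings override the global autovacuum_* GUCs for this table only
ALTER TABLE stats_5min SET (
    autovacuum_vacuum_scale_factor  = 0,
    autovacuum_vacuum_threshold     = 40000,
    autovacuum_analyze_scale_factor = 0,
    autovacuum_analyze_threshold    = 40000
);
```

This keeps the rest of the cluster on the default, less aggressive behaviour.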

Re: [GENERAL] arrays of rows and dblink

2014-04-30 Thread David G Johnston
Torsten Förtsch wrote
> On 30/04/14 20:19, David G Johnston wrote:
>> ISTM that you have to "CREATE TYPE ..." as appropriate then
>>
>> ... tb ( col_alias type_created_above[] )
>>
>> There is only so much you can do with anonymous types (which is what the
>> ROW construct creates; ROW is not
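The "CREATE TYPE ..." suggestion from this thread, spelled out as a sketch; the type name `abc` is invented for illustration:

```sql
-- a named composite type, so the array element type is no longer anonymous
CREATE TYPE abc AS (a int, b timestamp, c text);

-- aggregate rows into an array of the named type
SELECT array_agg(ROW(a, b, c)::abc)
FROM (VALUES (1::int, 'today'::timestamp, 'a'::text),
             (2, 'yesterday', 'b')) r(a, b, c);

-- and as a column type in a result definition, e.g. for dblink:
-- ... tb ( col_alias abc[] )
```

With a named type, the array can be referenced in column definition lists, which the anonymous ROW construct does not allow.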

Re: [GENERAL] arrays of rows and dblink

2014-04-30 Thread Torsten Förtsch
On 30/04/14 20:19, David G Johnston wrote:
> ISTM that you have to "CREATE TYPE ..." as appropriate then
>
> ... tb ( col_alias type_created_above[] )
>
> There is only so much you can do with anonymous types (which is what the ROW
> construct creates; ROW is not a type but an expression anchor

Re: [GENERAL] arrays of rows and dblink

2014-04-30 Thread David G Johnston
Torsten Förtsch wrote
> Hi,
>
> we have the ROW type and we have arrays. We also can create arrays of
> rows like:
>
> select array_agg(r)
> from (values (1::int, 'today'::timestamp, 'a'::text),
>              (2, 'yesterday', 'b')) r(a,b,c);
> array_agg
> ---

[GENERAL] arrays of rows and dblink

2014-04-30 Thread Torsten Förtsch
Hi,

we have the ROW type and we have arrays. We also can create arrays of rows like:

select array_agg(r)
from (values (1::int, 'today'::timestamp, 'a'::text),
             (2, 'yesterday', 'b')) r(a,b,c);
 array_agg
-

Re: [GENERAL] Vacuuming strategy

2014-04-30 Thread Elanchezhiyan Elango
>
> Each table has 4 indices. The updates are to the indexed columns.
>
> Here is the schema of a table: http://pastebin.com/ffu0dUjS All tables
> have this same schema except that some tables don't have a port column and
> so will have one less index

What indexes exist? Are the updates to inde

Re: [GENERAL] Vacuuming strategy

2014-04-30 Thread Elanchezhiyan Elango
I missed answering this one:

> Is the 5GB for the table plus indexes, or just the table itself?

No, it's not including the indices. Including indices it's actually 17GB!!

On Wed, Apr 30, 2014 at 10:40 AM, Elanchezhiyan Elango wrote:
> Sergey,
> Thanks for the aggressive settings. I have li

Re: [GENERAL] Vacuuming strategy

2014-04-30 Thread Elanchezhiyan Elango
Sergey,
Thanks for the aggressive settings. I have listed some settings I am planning to try below. Please review and let me know your feedback.

Francisco,
Thanks for the partitioning idea. I used to have the tables partitioned. But now that I have moved to a schema where data is split across abou

Re: [GENERAL] SQL query runs fine on one platform (FreeBSD), but hangs on another (Windows)

2014-04-30 Thread Vik Fearing
On 04/30/2014 01:08 PM, David Noel wrote:
>> For 9.3, you can write that as:
>>
>> select p.*, s.NoOfSentences
>> from page p,
>>      lateral (select count(*) as NoOfSentences
>>               from sentence s
>>               where s."PageURL" = p."URL") s
>> where "Classification" like case ... e

Re: [GENERAL] Escape double-quotes in text[]?

2014-04-30 Thread Raymond O'Donnell
On 30/04/2014 17:32, David G Johnston wrote:
> Raymond O'Donnell wrote
>> On 29/04/2014 22:54, David G Johnston wrote:
>>> Raymond O'Donnell wrote
>>>> Hi all,
>>>>
>>>> Probably a silly question, but I'm having trouble figuring out the
>>>> answer... if I'm constructing a string representation of

[GENERAL] WAL Replication + PITR

2014-04-30 Thread basti
Hello,

is it possible to have WAL Replication and Point-in-Time Recovery like follows:

DB-Master -- (WAL to Slave) --> DB-Slave
   |
   |---> (PITR to another Server)

Thanks for any help!
Basti

P.S. Is there a tutorial somewhere that describes the steps to take when the master has crashed?

Re: [GENERAL] Escape double-quotes in text[]?

2014-04-30 Thread David G Johnston
Raymond O'Donnell wrote
> On 29/04/2014 22:54, David G Johnston wrote:
>> Raymond O'Donnell wrote
>>> Hi all,
>>>
>>> Probably a silly question, but I'm having trouble figuring out the
>>> answer... if I'm constructing a string representation of a value to go
>>> into a text[] column, and one of t

Re: [GENERAL] importing a messy text file

2014-04-30 Thread bricklen
On Wed, Apr 30, 2014 at 1:07 AM, Willy-Bas Loos wrote:
>
> Hi,
>
> I have a 56GB textfile that I want to import into postgres.
> The file is tab delimited and not quoted.
>

Would Pgloader be an option? http://tapoueh.org/pgloader/

Re: [GENERAL] Escape double-quotes in text[]?

2014-04-30 Thread Raymond O'Donnell
On 29/04/2014 22:54, David G Johnston wrote:
> Raymond O'Donnell wrote
>> Hi all,
>>
>> Probably a silly question, but I'm having trouble figuring out the
>> answer... if I'm constructing a string representation of a value to go
>> into a text[] column, and one of the text literals includes
>> dou

Re: [GENERAL] Vacuuming strategy

2014-04-30 Thread Jeff Janes
On Tue, Apr 29, 2014 at 4:59 PM, Elanchezhiyan Elango wrote:
> Hi,
>
> I need help on deciding my vacuuming strategy. I need to know if I ever
> need to do 'vacuum full' for my tables.
>
> Tables1: Following is the query pattern on 4 high-traffic tables in my
> database:
> 1. Every 5 minutes about

Re: [GENERAL] Ingres to Postgres migration

2014-04-30 Thread Michael Bostock
Wow, thanks for the swift response and a lot of detailed advice! I will look further into what you have suggested and see what bits I can take forward with our migration.

Thanks a lot,
Mike

On 30 April 2014 15:14, Andy Colson wrote:
> On 4/30/2014 8:34 AM, Jeff Johnston wrote:
>
>> We moved fr

Re: [GENERAL] Ingres to Postgres migration

2014-04-30 Thread Andy Colson
On 4/30/2014 8:34 AM, Jeff Johnston wrote:
> We moved from Oracle to PostgreSQL recently and just used the \copy
> command with csv files. To get the data out of Oracle we wrote a small
> PHP script to connect to Oracle and then generate the csv files. The way
> it works is each csv file represented a da

Re: [GENERAL] Altering array(composite-types) without breaking code when inserting them and similar questions

2014-04-30 Thread Merlin Moncure
On Tue, Apr 29, 2014 at 2:40 PM, Dorian Hoxha wrote:
> So:
>
> drop function
> alter type: add column
> create the function again with the new default argument, in a transaction?

Yeah -- something like that. Try it out (carefully). AIUI, in most cases the function execution will be inlined so the perf

Re: [GENERAL] Ingres to Postgres migration

2014-04-30 Thread Jeff Johnston
We moved from Oracle to PostgreSQL recently and just used the \copy command with csv files. To get the data out of Oracle we wrote a small PHP script to connect to Oracle and then generate the csv files. The way it works is each csv file represented a database table. The copy command is amazingly f
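The \copy step described above looks roughly like this; the file path, table name, and CSV options are placeholders, not details from the migration:

```sql
-- in psql: \copy is client-side COPY, so the csv file lives on the client
\copy customers FROM '/tmp/customers.csv' WITH (FORMAT csv, HEADER true)
```

Because the file is read by psql rather than the server process, this works even when the csv files were generated on a machine without filesystem access to the database server.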

Re: [GENERAL] importing a messy text file

2014-04-30 Thread Rob Sargentg
On 04/30/2014 03:50 AM, Willy-Bas Loos wrote:

On Wed, Apr 30, 2014 at 11:06 AM, Rémi Cura wrote:

> with latest version 9.3 you can use a copy from with a program as argument.
> You can use the unix | pipe for most of the stuff.
>
> If you have an older ver

[GENERAL] Ingres to Postgres migration

2014-04-30 Thread Michael Bostock
Hi,

We are using an old version of Ingres (3.0.1) and I have been asked to investigate the possibility of migrating to PostgreSQL (9.3.4). I was just wondering if there is any general advice you can give me, or if there are any details/tools that will help with the migration.

Thanks,
Mike

Re: [GENERAL] Logging authentication requests with GSSAPI

2014-04-30 Thread Stephen Frost
Joshua,

* Joshua Warburton (j.warbur...@irax.com) wrote:
> I'm authenticating to postgres using GSSAPI and (for audit reasons)
> I need to be able to log the principal name that connects as well as
> the username it is mapped to. Is there any way I can get postgres to
> log this without cranking u

[GENERAL] Logging authentication requests with GSSAPI

2014-04-30 Thread Joshua Warburton
Hello,

I'm authenticating to postgres using GSSAPI and (for audit reasons) I need to be able to log the principal name that connects as well as the username it is mapped to. Is there any way I can get postgres to log this without cranking up the log level for everything?

Thanks very much,
J
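Stephen's reply is truncated in this archive, but the knobs usually involved here are connection logging plus an ident map. A hedged sketch — the map name and realm are invented, and whether the raw principal (as opposed to the mapped role) appears in the log line depends on the server version:

```ini
# postgresql.conf: log each successful connection (user, database, host)
log_connections = on
```

The principal-to-role mapping itself lives in pg_hba.conf and pg_ident.conf, along these lines:

```ini
# pg_hba.conf:  ... gss map=krbmap
# pg_ident.conf (MAPNAME  SYSTEM-USERNAME        PG-USERNAME):
# krbmap         /^(.*)@EXAMPLE\.COM$   \1
```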

Re: [GENERAL] SQL query runs fine on one platform (FreeBSD), but hangs on another (Windows)

2014-04-30 Thread Tomas Vondra
On 30 April 2014, 10:46, David Noel wrote:
> Very strange. I ran the query and it seemed slow, so I rewrote it with
> a join instead. Using the join it finished in 800ms. The query using the
> lateral finished in more than a minute. I guess I need to do some
> analysis on those queries to figure out why

Re: [GENERAL] Security Issues: Allowing Clients to Execute SQL in the Backend.

2014-04-30 Thread Geoff Montee
They don't apply server-wide. They apply to that user's session:

http://www.postgresql.org/docs/8.4/interactive/sql-set.html

Geoff

On Wed, Apr 30, 2014 at 6:19 AM, Hello World wrote:
>> SET statement_timeout=0;
>> SET work_mem=1024GB;
>
> I just realized about the SET command.
>
> Isn't i
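To illustrate the session-versus-default distinction being made here (the role name is hypothetical):

```sql
-- session-local: affects only the current connection, reverts on disconnect
SET statement_timeout = '30s';

-- a per-role default, applied to future sessions of that role
ALTER ROLE webuser SET statement_timeout = '30s';
```

A plain SET never outlives the session that issued it, which is why one user cannot change the behaviour other users see.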

Re: [GENERAL] SQL query runs fine on one platform (FreeBSD), but hangs on another (Windows)

2014-04-30 Thread David Noel
> For 9.3, you can write that as:
>
> select p.*, s.NoOfSentences
> from page p,
>      lateral (select count(*) as NoOfSentences
>               from sentence s
>               where s."PageURL" = p."URL") s
> where "Classification" like case ... end
> order by "PublishDate" desc
> limit 100;
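For comparison, a join form of the same query — which the thread reports ran much faster — might look like the sketch below. The LIKE pattern stands in for the elided CASE expression, and the GROUP BY assumes "URL" is the primary key of page:

```sql
-- same result via a grouped left join instead of a correlated LATERAL subquery
select p.*, count(s."PageURL") as "NoOfSentences"
from page p
left join sentence s on s."PageURL" = p."URL"
where p."Classification" like 'something%'  -- placeholder for the CASE ... END
group by p."URL"   -- assumes "URL" is page's primary key (9.1+ allows p.* then)
order by p."PublishDate" desc
limit 100;
```

Whether the planner treats the two forms differently is exactly the kind of thing EXPLAIN ANALYZE on both versions would show.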

Re: [GENERAL] Vacuuming strategy

2014-04-30 Thread Francisco Olarte
Hi:

On Wed, Apr 30, 2014 at 1:59 AM, Elanchezhiyan Elango wrote:
> I need help on deciding my vacuuming strategy. I need to know if I ever need
> to do 'vacuum full' for my tables.
>
> Tables1: Following is the query pattern on 4 high-traffic tables in my
> database:
> 1. Every 5 minutes about 50

Re: [GENERAL] Security Issues: Allowing Clients to Execute SQL in the Backend.

2014-04-30 Thread Hello World
>> SET statement_timeout=0;
>> SET work_mem=1024GB;

I just realized about the SET command. Isn't it weird that any user can set parameters such as these, and that they will apply server-wide? To all future sessions?

I noticed that some of the parameters can only be set by superusers, and some require re-st

Re: [GENERAL] importing a messy text file

2014-04-30 Thread Willy-Bas Loos
On Wed, Apr 30, 2014 at 11:06 AM, Rémi Cura wrote:
> with latest version 9.3 you can use a copy from with a program as
> argument.
> You can use the unix | pipe for most of the stuff.
>
> If you have an older version you can use the mkfifo command to "trick"
> postgres into believing it will cop

Re: [GENERAL] Security Issues: Allowing Clients to Execute SQL in the Backend.

2014-04-30 Thread Albe Laurenz
Hello World wrote:
> Denial of service is indeed a problem. Is there a way to limit the execution
> time of a request?

Yes, setting statement_timeout. But if a client can execute arbitrary statements, those could also be statements like:

SET statement_timeout=0;
SET work_mem=1024GB;

> I'm using

Re: [GENERAL] importing a messy text file

2014-04-30 Thread Willy-Bas Loos
On Wed, Apr 30, 2014 at 11:03 AM, Alberto Cabello Sánchez wrote:
>
> What about using "cut" to strip the extra fields?

Wow, I didn't know "cut", this looks promising. Thanks.

--
Willy-Bas Loos

Re: [GENERAL] importing a messy text file

2014-04-30 Thread Rémi Cura
Hey,

with the latest version, 9.3, you can use COPY FROM with a program as argument. I had a similar problem (an extra space at the end of each line), so I used sed in a pipeline to remove it before feeding it to the database. You can use the unix | pipe for most of the stuff.

If you have an older versi
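The 9.3 feature Rémi mentions, sketched with made-up file and table names (the sed expression assumes GNU sed, where \t means a tab in the replacement pattern):

```sql
-- 9.3+: COPY runs the command server-side and reads its standard output;
-- here sed strips a single trailing tab from each line before COPY sees it
COPY messy_table FROM PROGRAM 'sed -e ''s/\t$//'' /path/to/big_file.txt'
    WITH (FORMAT text);
```

Note that FROM PROGRAM executes on the database server as the postgres OS user, so the input file must be readable there.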

Re: [GENERAL] importing a messy text file

2014-04-30 Thread Alberto Cabello Sánchez
On Wed, 30 Apr 2014 10:47:12 +0200 Willy-Bas Loos wrote:
> Alberto, it would be hard to use sed( s/^I$// ), because there is no
> explicit NULL value and there are many NULLs in the last column.
> So I can't be sure how many tabs should be at the end of each line.

Ok, I understand (I think). Wh

Re: [GENERAL] SQL query runs fine on one platform (FreeBSD), but hangs on another (Windows)

2014-04-30 Thread David Noel
The FreeBSD system is running 9.3, the Windows systems are running 9.2. I am waiting on the output from the other developer.

On Tue, Apr 29, 2014 at 8:46 AM, Tom Lane wrote:
> David Noel writes:
>> Both queries are run from a Java project using the latest JDBC driver.
>> The PostgreSQL Server ve

Re: [GENERAL] importing a messy text file

2014-04-30 Thread Willy-Bas Loos
Alberto, it would be hard to use sed( s/^I$// ), because there is no explicit NULL value and there are many NULLs in the last column. So I can't be sure how many tabs should be at the end of each line.

Yes, Karsten, maybe scripting would be the easiest way to fix this; I would then probably insert
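The "load raw, then split" approach hinted at here could look like the sketch below; the table and column names are invented, and it assumes every data row really has the same number of tab-separated fields once the stray trailing tab is ignored:

```sql
-- stage each raw line as a single text value; pick a COPY delimiter
-- that never occurs in the data so the whole line lands in one column
CREATE TABLE raw_lines (line text);
-- \copy raw_lines FROM '/path/file.txt' WITH (DELIMITER E'\x01')

-- then split on tabs inside the database, where NULL handling is explicit
INSERT INTO real_table (col1, col2, col3)
SELECT parts[1], parts[2], parts[3]
FROM (SELECT string_to_array(line, E'\t') AS parts FROM raw_lines) t;
```

Out-of-range array subscripts yield NULL, so rows that are short a trailing field simply get NULL in the last column.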

Re: [GENERAL] SQL query runs fine on one platform (FreeBSD), but hangs on another (Windows)

2014-04-30 Thread David Noel
Very strange. I ran the query and it seemed slow, so I rewrote it with a join instead. Using the join it finished in 800ms. The query using the lateral finished in more than a minute. I guess I need to do some analysis on those queries to figure out why there was such a vast difference in performance.

Re: [GENERAL] Security Issues: Allowing Clients to Execute SQL in the Backend.

2014-04-30 Thread Chris Travers
On Wed, Apr 30, 2014 at 12:32 AM, Hello World wrote:
> Hello!
>
> I'm developing a web application that needs to display data from a
> postgres backend.
>
> The most convenient way for the app to get the data is by expressing the
> request in SQL.
>
> I'm thinking about the following architecture

Re: [GENERAL] importing a messy text file

2014-04-30 Thread Alberto Cabello Sánchez
On Wed, 30 Apr 2014 10:07:09 +0200 Willy-Bas Loos wrote:
> Hi,
>
> I have a 56GB textfile that I want to import into postgres.

!!!

> The problem is this:
> There is a tab after the last column, in many but not all records.

You could clean it up with something like s/^I$// , couldn't you?

--
Al

Re: [GENERAL] importing a messy text file

2014-04-30 Thread Karsten Hilbert
On Wed, Apr 30, 2014 at 10:07:09AM +0200, Willy-Bas Loos wrote:
> it would probably not be
> feasible to manually add tabs for every record that misses one, although I
> don't know how many that would be.

Why not a) let a script do that, or b) remove tabs from the ends of lines as needed? All in all

Re: [GENERAL] Security Issues: Allowing Clients to Execute SQL in the Backend.

2014-04-30 Thread Rory Campbell-Lange
On 30/04/14, Hello World (worldani...@gmail.com) wrote:
> I'm developing a web application that needs to display data from a postgres
> backend.
>
> The most convenient way for the app to get the data is by expressing the
> request in SQL.
>
> I'm thinking about the following architecture
>
> [

[GENERAL] importing a messy text file

2014-04-30 Thread Willy-Bas Loos
Hi,

I have a 56GB textfile that I want to import into postgres. The file is tab delimited and not quoted. I deleted the header with the column names (using sed) so that I could use COPY with the non-csv text type (because some of the text values contain quotes). I had some minor trouble with the
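A hedged sketch of the trailing-tab cleanup that the rest of this thread converges on; the file names are invented, and the sed expression assumes GNU sed (where \t matches a tab):

```shell
# a sample file: 3 tab-delimited columns, one line with a stray trailing tab
printf 'a\tb\tc\t\nd\te\tf\n' > /tmp/messy.txt

# strip at most one trailing tab per line so every row has the same field count
sed 's/\t$//' /tmp/messy.txt > /tmp/clean.txt

cat /tmp/clean.txt
```

The cleaned file can then be fed to COPY, or the sed step can run inside the pipeline itself so the 56GB never needs a second copy on disk.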

Re: [GENERAL] Security Issues: Allowing Clients to Execute SQL in the Backend.

2014-04-30 Thread Hello World
Hello,

Thank you very much. Denial of service is indeed a problem. Is there a way to limit the execution time of a request?

I'm using libpq to communicate with the server.

PS. I've just taken a look; it seems I could do some asynchronous queries, time them, then cancel them if they take too lon

Re: [GENERAL] Security Issues: Allowing Clients to Execute SQL in the Backend.

2014-04-30 Thread Albe Laurenz
Hello World wrote:
> Given this, are there any other security issues about letting client
> applications execute arbitrary SQL commands on the backend database?

There shouldn't be any security problems; just be careful that you don't give the user more permissions than you want to. But a user

[GENERAL] Security Issues: Allowing Clients to Execute SQL in the Backend.

2014-04-30 Thread Hello World
Hello!

I'm developing a web application that needs to display data from a postgres backend.

The most convenient way for the app to get the data is by expressing the request in SQL.

I'm thinking about the following architecture

[ App/Client ] -> query in SQL ---> [Web server] ---> same SQL q