Thanks all, I will be looking into it.

Kind regards,
Henk
On 16 Jun 2012, at 18:23, Edson Richter wrote:

> On 16/06/2012 12:59, h...@101-factory.eu wrote:
>> Thanks, I thought about splitting the file, but that did not work out well.
>>
>> So we receive 2 files every 30 seconds and need to import them as fast as possible.
On 16/06/2012 12:59, h...@101-factory.eu wrote:

Thanks, I thought about splitting the file, but that did not work out well.

So we receive 2 files every 30 seconds and need to import them as fast as possible.

We do not run Java currently, but maybe it's an option. Are you willing to share your code?

Also, I was thinking of using Perl for it.

Henk
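
For what it's worth, a minimal shell sketch of the splitting idea, since shell is what the thread is already using; GNU split and psql are assumed, and the file path, chunk count, table name, and connection settings are all made-up placeholders:

  #!/bin/sh
  # Hypothetical sketch: cut one incoming file into 8 line-aligned chunks
  # (GNU split's -n l/8) and load them concurrently, one COPY per chunk.
  FILE=/data/incoming/feed.csv
  split -n l/8 "$FILE" /tmp/chunk_
  for c in /tmp/chunk_*; do
    psql -h localhost -d mydb \
         -c "\copy measurements FROM '$c' WITH (FORMAT csv)" &
  done
  wait             # block until every loader has finished
  rm -f /tmp/chunk_*

Because COPY batches many rows per statement, a handful of such chunk loaders usually beats a large number of per-row inserts over the same connection pool.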
On 16/06/2012 12:04, h...@101-factory.eu wrote:

Hi there,

I am trying to import large data files into pg.

For now I used the xargs Linux command to spawn a process for each line of the file, using the maximum number of available connections. We use pgpool as the connection pool to the database, and so try to maximize the concurrent data import of the file.
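
A rough sketch of this line-by-line xargs approach, with everything beyond the xargs/psql mechanics assumed: the input path and table are placeholders, -P 20 stands in for "maximum available connections", and each line is taken to be a trusted, pre-quoted VALUES tuple (raw interpolation like this breaks on quotes and is unsafe for untrusted input):

  #!/bin/sh
  # Hypothetical sketch: GNU xargs starts one psql per input line,
  # at most 20 at a time (tune -P to the pgpool pool size).
  xargs -d '\n' -P 20 -I {} \
    psql -h localhost -d mydb -c "INSERT INTO measurements VALUES ({});" \
    < /data/incoming/feed.csv

Each line paying for its own connection and INSERT statement is a lot of per-row overhead, which is why the chunked COPY sketch earlier in the thread tends to scale better than simply raising the connection count.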