On 2019-06-17 16:01:57 -0300, Leandro Guimarães wrote:
> I've installed all dependencies, but when I try to "make install" in
> the pg_bulkload folder I have some errors like this:
>
> In file included from pgut/pgut.h:24:0,
> from pgut/pgut-fe.h:13,
> from pg_bulklo
On 6/17/19 12:01 PM, Leandro Guimarães wrote:
I've installed all dependencies, but when I try to "make install" in the
pg_bulkload folder I have some errors like this:
In file included from pgut/pgut.h:24:0,
from pgut/pgut-fe.h:13,
from pg_bulkload.c:17:
/usr/in
I've installed all dependencies, but when I try to "make install" in the
pg_bulkload folder I have some errors like this:
In file included from pgut/pgut.h:24:0,
from pgut/pgut-fe.h:13,
from pg_bulkload.c:17:
/usr/include/postgresql/internal/pqexpbuffer.h:149:13: err
On 6/17/19 10:04 AM, Leandro Guimarães wrote:
Hi Adrian,
Yes, that's the problem!
I'm now testing pg_bulkload, but I'm facing some issues installing
it on PostgreSQL 9.4.
The issues would be?
Leandro Guimarães
--
Adrian Klaver
adrian.kla...@aklaver.com
Hi Adrian,
Yes, that's the problem!
I'm now testing pg_bulkload, but I'm facing some issues installing it
on PostgreSQL 9.4.
Leandro Guimarães
On Mon, Jun 17, 2019 at 1:22 PM Adrian Klaver
wrote:
> On 6/17/19 9:06 AM, Leandro Guimarães wrote:
> Please reply to list also.
> Ccing lis
On 6/17/19 9:22 AM, Adrian Klaver wrote:
On 6/17/19 9:06 AM, Leandro Guimarães wrote:
Please reply to list also.
Ccing list.
Ugh, my bad again.
They are UNIQUE:
CONSTRAINT unique_const_value_20190501_45 UNIQUE (customer_id,
date_time, indicator_id, element_id),
I've made a mistake typing "che
On 6/17/19 9:06 AM, Leandro Guimarães wrote:
Please reply to list also.
Ccing list.
Ugh, my bad again.
They are UNIQUE:
CONSTRAINT unique_const_value_20190501_45 UNIQUE (customer_id,
date_time, indicator_id, element_id),
I've made a mistake typing "check constraint" before because these are
p
On 6/17/19 8:14 AM, Leandro Guimarães wrote:
Hi Adrian,
You are right, these fields are in CHECK CONSTRAINTS and they are
not formally defined as Primary Keys.
Alright. Two things:
1) If you are thinking of them as keys, why not make them a PK or a
UNIQUE index?
2) Still not clear
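For reference, promoting those four columns to a real key, as suggested above, is a single DDL statement. The constraint name below is the one quoted elsewhere in the thread; the table name is assumed for illustration:

```sql
-- Table name assumed; columns and constraint name are from the thread.
ALTER TABLE values_20190501_45
    ADD CONSTRAINT unique_const_value_20190501_45
    UNIQUE (customer_id, date_time, indicator_id, element_id);
```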
Hi Adrian,
You are right, these fields are in CHECK CONSTRAINTS and they are not
formally defined as Primary Keys.
Thanks!
Leandro Guimarães
On Sat, Jun 15, 2019 at 10:45 AM Adrian Klaver
wrote:
> On 6/14/19 7:24 PM, Leandro Guimarães wrote:
> > Hi Tim, thanks for your answer!
> >
> > The
On 6/14/19 7:24 PM, Leandro Guimarães wrote:
Hi Tim, thanks for your answer!
The columns were just examples, but let me explain the database
structure, the fields in *bold are the keys*:
*customer_id integer*
*date_time timestamp*
*indicator_id integer*
*element_id integer*
indicator_value dou
Hi Adrian,
I'll take a look about pg_bulkload, but I populate the database via a
Java application with JDBC.
I'll try the query you kindly sent to me!
Thanks!
Leandro Guimarães
On Fri, Jun 14, 2019 at 6:59 PM Adrian Klaver
wrote:
> On 6/14/19 2:04 PM, Leandro Guimarães wrote:
> > Hi,
> >
Hi Tim, thanks for your answer!
The columns were just examples, but let me explain the database structure,
the fields in *bold are the keys*:
*customer_id integer*
*date_time timestamp*
*indicator_id integer*
*element_id integer*
indicator_value double precision
The table is partitioned per day a
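On 9.4, which predates declarative partitioning (PostgreSQL 10), per-day partitioning is done with table inheritance, and uniqueness can only be enforced per child table. A minimal sketch of that layout, with table names assumed and the columns taken from the message above:

```sql
-- Parent table (name assumed for illustration; columns from the thread).
CREATE TABLE indicator_values (
    customer_id     integer,
    date_time       timestamp,
    indicator_id    integer,
    element_id      integer,
    indicator_value double precision
);

-- One child table per day: the CHECK constraint confines the partition
-- to its day, and the UNIQUE constraint enforces the key within this
-- child only -- there is no cross-partition uniqueness on 9.4.
CREATE TABLE indicator_values_20190501 (
    CHECK (date_time >= '2019-05-01' AND date_time < '2019-05-02'),
    CONSTRAINT unique_const_value_20190501
        UNIQUE (customer_id, date_time, indicator_id, element_id)
) INHERITS (indicator_values);
```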
Leandro Guimarães writes:
> Hi,
>
> I have a scenario with a large table and I'm trying to insert it via a
> COPY command with a CSV file.
>
> Everything works, but sometimes my source .csv file has data duplicated
> in the already-populated table. If I add a check constraint and try t
On 6/14/19 2:04 PM, Leandro Guimarães wrote:
Hi,
I have a scenario with a large table and I'm trying to insert it via
a COPY command with a CSV file.
Everything works, but sometimes my source .csv file has data
duplicated in the already-populated table. If I add a check constraint
Hi,
I have a scenario with a large table and I'm trying to insert it via a
COPY command with a CSV file.
Everything works, but sometimes my source .csv file has data duplicated
in the already-populated table. If I add a check constraint and try to
run the COPY command I have an error tha
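One common workaround on 9.4, which predates INSERT ... ON CONFLICT (added in 9.5), is to COPY into a staging table and then insert only the keys not already present. A sketch under assumed table and file names, using the four key columns named in the thread:

```sql
-- Staging table with the same layout as the target (name assumed).
CREATE TEMP TABLE staging (LIKE indicator_values);

-- Load the raw CSV; duplicates are harmless here because staging
-- carries no constraints.
COPY staging FROM '/path/to/data.csv' WITH (FORMAT csv);

-- Insert only new keys. DISTINCT ON also collapses duplicates inside
-- the CSV itself (an arbitrary row per key, since there is no ORDER BY).
INSERT INTO indicator_values (customer_id, date_time, indicator_id,
                              element_id, indicator_value)
SELECT DISTINCT ON (s.customer_id, s.date_time, s.indicator_id, s.element_id)
       s.customer_id, s.date_time, s.indicator_id, s.element_id,
       s.indicator_value
FROM staging s
WHERE NOT EXISTS (
    SELECT 1
    FROM indicator_values t
    WHERE t.customer_id  = s.customer_id
      AND t.date_time    = s.date_time
      AND t.indicator_id = s.indicator_id
      AND t.element_id   = s.element_id
);
```

With per-day inheritance partitioning, the anti-join is cheapest when run against the single day's child table rather than the parent.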
16 matches