I'm sorry, but I attached the incorrect file the first time!
We are using MS Access as a frontend to import many water quality records into
PostgreSQL 8.0.2 on Red Hat Enterprise Linux 3.
When importing a table of 40,000+ records, something goes wrong with the
timestamp field, but only for certain records.
The records are listed below.


MS Access Details:

Data to import from the local MS Access table into the linked ODBC PostgreSQL table:
ID  original_deployment_code  date_time            water_temp  depth_m  salinity   sp_cond    do_per     do_mg_l  ph
1   MLM20010327               2001-04-01 02:00:00  17.860001   1.49352  30.1       46.299999  80.400002  6.42     7.64
2   MLM20010327               2001-04-01 02:30:00  17.700001   1.61544  30.200001  46.400002  78.699997  6.3      7.64
3   MLM20010327               2001-04-01 03:00:00  17.67       1.64592  30.200001  46.400002  72.800003  5.84     7.62
4   MLM20010327               2001-04-01 03:30:00  17.639999   1.524    30.1       46.299999  79.300003  6.36     7.61

Here's the SQL run from Access:
INSERT INTO public_wq ( original_deployment_code, collection_id, date_time, 
water_temp, ph, sp_cond, salinity, do_per, do_mg_l, depth_m )
SELECT original_deployment_code, c.id, lwq.date_time, water_temp, ph, sp_cond, 
salinity, do_per, do_mg_l, depth_m
FROM DD5 AS lwq INNER JOIN public_collection AS c ON 
lwq.original_deployment_code=c.original_collection_code;

I get a unique constraint violation.
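
Before chasing the timestamps, it may be worth ruling out join fan-out: if
original_collection_code is not unique in the collection table, the INNER JOIN
duplicates source rows and so duplicates (collection_id, date_time) keys. A
quick server-side check (a sketch, using the names from the DDL below):

SELECT original_collection_code, count(*)
FROM public.collection
GROUP BY original_collection_code
HAVING count(*) > 1;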

Here are the PostgreSQL table details:

CREATE TABLE "public"."wq" (
  "id" SERIAL,
  "collection_id" INTEGER NOT NULL,
  "date_time" TIMESTAMP(0) WITH TIME ZONE NOT NULL,
  "original_deployment_code" VARCHAR(20),
  "water_temp" NUMERIC(28,6),
  "depth_m" NUMERIC(28,6),
  "salinity" NUMERIC(28,6),
  "sp_cond" NUMERIC(28,6),
  "do_per" NUMERIC(28,6),
  "do_mg_l" NUMERIC(28,6),
  "ph" NUMERIC(28,6),
  "turbidity" NUMERIC(28,6),
  "chlorophyll_ug_l" NUMERIC(28,6),
  "orp_mv" NUMERIC(28,6),
  "reviewed" BOOLEAN DEFAULT false NOT NULL,
  "date_entered" TIMESTAMP(0) WITH TIME ZONE NOT NULL,
  "date_updated" TIMESTAMP(0) WITH TIME ZONE NOT NULL,
  "entered_by" VARCHAR(50) NOT NULL,
  "updated_by" VARCHAR(50) NOT NULL,
  CONSTRAINT "wq_pkey" PRIMARY KEY("id"),
  CONSTRAINT "fk_collection" FOREIGN KEY ("collection_id")
    REFERENCES "public"."collection"("id")
    ON DELETE NO ACTION
    ON UPDATE CASCADE
    NOT DEFERRABLE
) WITH OIDS;

COMMENT ON TABLE "public"."wq"
IS 'This is continuous wq.  Unique constraint is on collection and date_time.  
Do we need fields for raw and interpolated values???';

COMMENT ON COLUMN "public"."wq"."collection_id"
IS 'fk to collection table.  ';

COMMENT ON COLUMN "public"."wq"."date_time"
IS 'Date that the sample was taken.  This is usually in 30 minute increments 
per collection.';

COMMENT ON COLUMN "public"."wq"."original_deployment_code"
IS '???';

COMMENT ON COLUMN "public"."wq"."do_per"
IS 'Calculate this in a trigger?';

COMMENT ON COLUMN "public"."wq"."reviewed"
IS 'This defaults to false and must be set manually in order to verify entries. 
 Possibly setup a view to filter these out.';

COMMENT ON COLUMN "public"."wq"."date_entered"
IS 'Triggered value upon insert.  Will use current_timestamp unless specified.';

COMMENT ON COLUMN "public"."wq"."date_updated"
IS 'Triggered value upon update.  Will use current_timestamp.';

COMMENT ON COLUMN "public"."wq"."entered_by"
IS 'Triggered value upon insert.  Will use current_user unless otherwise 
specified.';

COMMENT ON COLUMN "public"."wq"."updated_by"
IS 'Triggered value upon update.  Will use current_user.';

CREATE UNIQUE INDEX "wq_unique_key" ON "public"."wq"
  USING btree ("collection_id", "date_time");
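
With this index, any two imported rows that end up sharing a collection_id and
date_time will collide. After an unconstrained import (as in RESULTS below),
the colliding keys can be listed directly; a sketch against the schema above:

SELECT collection_id, date_time, count(*)
FROM public.wq
GROUP BY collection_id, date_time
HAVING count(*) > 1;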

CREATE TRIGGER "new_wq_trig" BEFORE INSERT
ON "public"."wq" FOR EACH ROW
EXECUTE PROCEDURE "public"."new_record_logger"();

CREATE TRIGGER "update_wq_trig" BEFORE UPDATE
ON "public"."wq" FOR EACH ROW
EXECUTE PROCEDURE "public"."update_record_logger"();


****new_record_logger trigger****
CREATE OR REPLACE FUNCTION "public"."new_record_logger"() RETURNS trigger AS $$
BEGIN
     -- fill the audit columns on insert unless the client supplied them
     if new.date_entered is null then
        new.date_entered := current_timestamp;
     end if;
     if new.entered_by is null then
        new.entered_by := current_user;
     end if;
     if new.date_updated is null then
        new.date_updated := current_timestamp;
     end if;
     if new.updated_by is null then
        new.updated_by := current_user;
     end if;
     return new;
END;
$$ LANGUAGE plpgsql;
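
For what it's worth, the trigger can be smoke-tested by hand (a hypothetical
psql session; assumes a collection row with id 1982 exists, as in the results
below):

-- hypothetical test row; 1982 is the collection id seen in the results
INSERT INTO public.wq (collection_id, date_time)
VALUES (1982, '2001-04-01 04:00:00');

-- the BEFORE INSERT trigger should have filled all four audit columns
SELECT date_entered, date_updated, entered_by, updated_by
FROM public.wq
ORDER BY id DESC
LIMIT 1;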


RESULTS:

Taking the unique index off allows the import to complete, and the inserted
data is below. Notice that 02:00 and 02:30 changed to 03:00 and 03:30 during
the insert:

id      collection_id  date_time            original_deployment_code  water_temp  depth_m  salinity   sp_cond    do_per     do_mg_l  ph
204414  1982           2001-04-01 03:30:00  MLM20010327               17.639999   1.524    30.1       46.299999  79.300003  6.36     7.61
204413  1982           2001-04-01 03:00:00  MLM20010327               17.67       1.64592  30.200001  46.400002  72.800003  5.84     7.62
204412  1982           2001-04-01 03:30:00  MLM20010327               17.700001   1.61544  30.200001  46.400002  78.699997  6.3      7.64
204411  1982           2001-04-01 03:00:00  MLM20010327               17.860001   1.49352  30.1       46.299999  80.400002  6.42     7.64

(For all four rows: turbidity, chlorophyll_ug_l, and orp_mv are NULL; reviewed
is 0/false; date_entered and date_updated are 2005-07-27 14:28:39; entered_by
and updated_by are ddabney.)

I'm assuming this is a bug.
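
One way to narrow down where the shift happens is to cast the same literals on
the server itself, taking Access and the ODBC driver out of the picture (a
hypothetical psql session on the 8.0.2 box):

SHOW timezone;
-- if the shift reproduces here, it comes from the server's conversion of the
-- literal into TIMESTAMP WITH TIME ZONE, not from the import path
SELECT '2001-04-01 02:00:00'::timestamptz;
SELECT '2001-04-01 02:30:00'::timestamptz;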


