For what it's worth, I've been storing MARC recs as varchars for a while
with no problems.  The only caveat is records that exceed the maximum
field size - for those I just split the data and store the rest in an
"overflow" table: just a record-id field, another varchar field for the
leftover data, and a sequence number, on the off chance that a record
exceeds *two* varchar max lengths...
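
Just to make the idea concrete, the layout looks roughly like this (the
table names, column names, and the 8000-character limit are made up for
the example, not my actual schema):

use strict;
use warnings;
use DBI;

# Illustrative DSN -- adjust for your own database.
my $dbh = DBI->connect('dbi:Pg:dbname=marcdb', 'user', 'password',
                       { RaiseError => 1 });

# Main table: one row per record, holding the first chunk of raw MARC.
$dbh->do(q{
    CREATE TABLE marc_main (
        rec_id   integer PRIMARY KEY,
        raw_marc varchar(8000)
    )
});

# Overflow table: one row per extra chunk, in sequence order.
$dbh->do(q{
    CREATE TABLE marc_overflow (
        rec_id   integer,          -- points back at marc_main.rec_id
        seq      integer,          -- 1, 2, ... if one overflow row isn't enough
        raw_marc varchar(8000),    -- the leftover data
        PRIMARY KEY (rec_id, seq)
    )
});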

I've got a couple of routines to do the splitting (and un-splitting,
when you retrieve the record).... I can pass you the code, if you're
interested (it's designed for PostgreSQL, but doesn't use any
Postgres-specific stuff, so it should be easy to convert for use with
MySQL).
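
To give you the flavor in the meantime, here's a stripped-down sketch of
how such routines can work (not the real code - it reuses the made-up
tables from the sketch above, and $MAX stands in for whatever your real
varchar limit turns out to be):

use strict;
use warnings;

my $MAX = 8000;    # stand-in for the real field-size limit

sub store_record {
    my ($dbh, $rec_id, $data) = @_;
    my @chunks = unpack "(a$MAX)*", $data;    # cut into $MAX-byte pieces

    # First chunk goes in the main table...
    $dbh->do('INSERT INTO marc_main (rec_id, raw_marc) VALUES (?, ?)',
             undef, $rec_id, shift @chunks);

    # ...and anything left over goes into the overflow table, in order.
    my $seq = 0;
    for my $chunk (@chunks) {
        $dbh->do('INSERT INTO marc_overflow (rec_id, seq, raw_marc)
                  VALUES (?, ?, ?)', undef, $rec_id, ++$seq, $chunk);
    }
}

sub fetch_record {
    my ($dbh, $rec_id) = @_;

    my ($data) = $dbh->selectrow_array(
        'SELECT raw_marc FROM marc_main WHERE rec_id = ?', undef, $rec_id);

    my $extra = $dbh->selectall_arrayref(
        'SELECT raw_marc FROM marc_overflow WHERE rec_id = ? ORDER BY seq',
        undef, $rec_id);
    $data .= $_->[0] for @$extra;    # stitch the chunks back together

    return $data;
}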

-David Christensen

> -----Original Message-----
> From: bargioni [mailto:[EMAIL PROTECTED] 
> Sent: Tuesday, January 03, 2006 1:54 AM
> To: perl4lib@perl.org
> Subject: Re: Problem cleaning up MARC for MySQL 
> 
> 
> 
> On 02/gen/06, at 16:34, Dennis Boone wrote:
> 
> >> I am trying to clean up a raw MARC record for insertion in a MySQL 
> >> database.  Here is my code:
> >>
> >> $rec = $rec->rawdata();
> >> $rec = $dbh->quote($rec);
> >>
> >> I get the following error back:
> >>
> >> DBD::mysql::st execute failed: You have an error in your SQL syntax.
> >> Check the manual that corresponds to your MySQL server version for 
> >> the right syntax to use near '02852cam 2200325Ia 
> >> 45e0001000800000005001700008008004100025010 at ./zlite.pl line 73, 
> >> <STDIN> line 3.
> >
> > Unless the quote() call is the one producing this error, it would
> > help to see the actual SQL statement and the prepare and execute
> > calls, as well as the field definitions.
> 
> The error seems to be produced by the execute statement, not the
> quote one. Anyway, trying to insert binary data into PostgreSQL
> fields of type text or varchar causes the same error. This is
> another reason why I suggested using a blob yesterday.
> SB
> 
> 
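
One more note on the blob suggestion in the quoted message above: if you
do go that route, binding the raw record through a placeholder instead of
quote()-ing it into the SQL string sidesteps the quoting problem
entirely. A rough sketch, with a made-up marc_blob table ($rec being the
record object from the original post):

use strict;
use warnings;
use DBI;
use DBD::Pg qw(:pg_types);    # only needed for the PostgreSQL/bytea case

sub store_raw {
    my ($dbh, $rec_id, $rec) = @_;

    my $sth = $dbh->prepare(
        'INSERT INTO marc_blob (rec_id, raw_marc) VALUES (?, ?)');
    $sth->bind_param(1, $rec_id);
    $sth->bind_param(2, $rec->rawdata(), { pg_type => PG_BYTEA });  # bytea column
    $sth->execute();
}

# On MySQL, against a BLOB column, a plain
#     $sth->execute($rec_id, $rec->rawdata());
# (no bind_param type hint) normally does the job.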
