Is there an equivalent function for bytea columns that works like lo_import?
Alternatively, is there a way to copy from a large object to a bytea column from SQL? Or maybe someone has another way of attacking this problem. I've got some Perl code that does this:

    undef $/;
    $data = <FHFOR89MBFILE>;
    $sth = $dbh->prepare("insert into data (bigbyteacolumn) values (?)");
    $sth->bind_param(1, $data, DBI::SQL_BINARY);
    $sth->execute;

This has worked fine for a while with file sizes around 10MB. However, now I have someone who wants to use it for a file that's 89MB, and it's taking up about 500MB of memory before crashing. I'm trying to find a less memory-hungry way of handling this, even if it's just a temporary hack for this one file.

I think what's happening is that Perl reads in the 89MB, and then either Perl or the driver converts it into a fully escaped string for transfer, and this is where the problem is occurring.

Any ideas?

Thanks,
Jonathan Bartlett

---------------------------(end of broadcast)---------------------------
TIP 9: the planner will ignore your desire to choose an index scan if
your joining column's datatypes do not match
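[Editor's note: the original post ends above. One possible route for the "copy a large object into bytea from SQL" part of the question — a sketch only, assuming a PostgreSQL version new enough to have lo_get() (9.4+), and using a hypothetical file path and OID — is to import the file server-side as a large object and then convert it in SQL, keeping the 89MB out of the client process entirely:]

    -- Sketch, assumes PostgreSQL 9.4+ (lo_get); path and OID are hypothetical.
    -- 1. Import the file on the server side into a large object (returns its OID):
    SELECT lo_import('/tmp/bigfile.dat');
    -- 2. Copy the large object's contents into the bytea column
    --    (substitute the OID returned by the previous step for 16409):
    INSERT INTO data (bigbyteacolumn) SELECT lo_get(16409);
    -- 3. Drop the now-redundant large object:
    SELECT lo_unlink(16409);

[Note that lo_import() reads the file with the server's permissions from the server's filesystem, so this only works when the file is accessible to the database server process.]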