Scott Marlowe wrote:
> On 10/4/07, Mike Ginsburg <[EMAIL PROTECTED]> wrote:
>> export it. Memory limit is a major concern, but the query for one row
>> returns a result set too large and PHP fails.
>
> If this is true, and a single db row makes PHP exceed its memory
> limit just by returning it, you've done something very, very wrong in
> your design.
This is for the export only. Since it is an export of ~50,000
registrants, it takes some time to process. We also have load-balanced
web servers, so unless I want to create identical processes on all
webservers, or write some crazy script to scp it across the board,
storing it as a text file is not an option. I realize that my way of
doing it is flawed, which is the reason I came here for advice. The CSV
contains data from approximately 15 tables, several of which are
many-to-ones, making joins a little tricky. My thought was to do all of
the processing in the background, store the results in the DB, and
allow the requester to download it at their convenience.

Would it be a good idea to create a temporary table that stores all of
the export data in it, broken out by rows and columns, and when download
time comes, query from there?
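
Something like this is what I'm picturing -- a rough sketch only, and the
table and column names (export_rows, export_id, row_num, csv_line) plus the
connection details are placeholders, not our real schema. The background job
fills a staging table with one finished CSV line per row, and the download
script pulls it back in chunks so PHP never holds the whole file at once:

<?php
// Rough sketch: staging table holds one finished CSV line per row,
// keyed by export job.  All names and the connection string are placeholders.
$db = pg_connect('host=dbhost dbname=registrants user=web');

// One-time setup for the staging table.
pg_query($db, '
    CREATE TABLE export_rows (
        export_id integer NOT NULL,
        row_num   integer NOT NULL,
        csv_line  text    NOT NULL,
        PRIMARY KEY (export_id, row_num)
    )');

// Download script: stream the export in chunks so the whole CSV is never
// held in PHP memory at once.
$exportId = (int) $_GET['export_id'];
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="registrants.csv"');

$chunk  = 1000;   // rows per round trip
$offset = 0;
do {
    $res = pg_query_params($db,
        'SELECT csv_line FROM export_rows
          WHERE export_id = $1
          ORDER BY row_num
          LIMIT $2 OFFSET $3',
        array($exportId, $chunk, $offset));
    $n = pg_num_rows($res);
    while ($row = pg_fetch_row($res)) {
        echo $row[0], "\n";
    }
    pg_free_result($res);
    flush();
    $offset += $chunk;
} while ($n == $chunk);
?>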
> You'd probably be better served using either a plain text file system
> to store these things, or large objects in postgresql.
>
> But if you're stuffing ~8 megs worth of csv text data into a single
> row*, you're probably not using a very relational layout of your data.
> And you're losing all the advantages (checking your data for
> consistency and such) that a relational db could give you.
>
> * Note that I'm assuming a few things up there: PHP uses roughly 2:1
> memory to store the data it's holding, so if your memory_limit is set
> to 16 MB, I'm assuming your result set is 8 MB or larger.
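
If I went the large-object route suggested above, I imagine it would look
roughly like this -- the exports table, its columns, the export id, and the
connection details are guesses on my part, not an existing setup. Keeping the
CSV in a large object means every load-balanced web server can reach it
through the database, and pg_lo_read_all() streams it to the client rather
than buffering it in PHP:

<?php
// Sketch of the large-object approach.  "exports", "csv_oid" and the
// connection string are assumptions for illustration only.
$db = pg_connect('host=dbhost dbname=registrants user=web');

// Background job: import the generated CSV file and remember its OID.
pg_query($db, 'BEGIN');
$oid = pg_lo_import($db, '/tmp/registrants.csv');
pg_query_params($db,
    'INSERT INTO exports (export_id, csv_oid) VALUES ($1, $2)',
    array(42, $oid));
pg_query($db, 'COMMIT');

// Download page: look up the OID and stream the large object to the client.
pg_query($db, 'BEGIN');   // large-object reads must run inside a transaction
$res = pg_query_params($db,
    'SELECT csv_oid FROM exports WHERE export_id = $1', array(42));
$oid = pg_fetch_result($res, 0, 0);

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="registrants.csv"');
$lo = pg_lo_open($db, $oid, 'r');
pg_lo_read_all($lo);      // echoes the contents straight through in chunks
pg_lo_close($lo);
pg_query($db, 'COMMIT');
?>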
Mike Ginsburg
Collaborative Fusion, Inc.
[EMAIL PROTECTED]
412-422-3463 x4015