There's another, simpler way if you have limited access to the server.
Do the query by hand: get the datasource for your model, execute the
query with the query() method, and iterate over the result you get back
from it. To dump this to a file when the data is potentially big, create
a new file and append each chunk of data (or each line) while iterating
over the query result, then close the file. If you build one big string
and try to file_put_contents() it into a file, you'll eventually end up
with the same kind of problem, only this time it will be the memory
limit.
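
Something along these lines, as a rough sketch (the Post model, file
path and batch size are made up here, and with the MySQL driver each row
from query() comes back keyed by table name, so adjust for your own
schema):

$fp = fopen(TMP . 'posts.csv', 'w');
$limit = 500;
$page = 0;
do {
    $offset = $page * $limit;
    $rows = $this->Post->query(
        "SELECT id, title, created FROM posts LIMIT {$limit} OFFSET {$offset}"
    );
    foreach ($rows as $row) {
        // write one line per row instead of building one huge string
        fputcsv($fp, $row['posts']);
    }
    $page++;
} while (!empty($rows));
fclose($fp);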


On Feb 5, 2:51 pm, AD7six <andydawso...@gmail.com> wrote:
> On Feb 5, 5:27 pm, Dan <grip...@gmail.com> wrote:
>
> > Thanks for the replies and suggestions everyone. When I have a chance
> > I will dig deeper into this. I've implemented a workaround that is
> > fine for now. I was mainly curious if this was a known issue or just
> > poor implementation on my part. Pete's suggestion may be the solution,
> > I'll know after I try it.
>
> > As I said the actual queries are correct and add no overhead. The
> > overhead is in __mergeHasMany's building of the result array. I don't
> > see how doing a custom query would solve this and may result in a mal-
> > formatted result array. This is a very basic query.
>
> What you should have discovered by now, is that doing *any* kind of
> loop in php when you don't need to is the wrong approach.
>
> Here's what I would suggest is the best approach for generating csv
> database dumps:
>
> "The SELECT ... INTO OUTFILE statement is intended primarily to let
> you very quickly dump a table to a text file on the server machine."
> Taken from http://dev.mysql.com/doc/refman/5.1/en/select.html
>
> Ergo, anything at all that you want to dump, which you can generate
> via a query, should be dumped using this approach.
>
> If the db isn't mysql, it should still be possible to do the
> equivalent of:
> mysql -e "SELECT ..." >  file_name
>
> If it *is* necessary to use php logic in the act of generating the
> report data there are 2.1 obvious choices:
> 1) write a loop to process the data, write to a tmp table and use the
> above approach
> 2) write a batch-loop process, e.g.
>    while ($results = $this->find('pending', array('limit' => 100))) {
>        ......
>        foreach ($results as $result) {}
>        $this->commit();
>    }
>
> expecting to be able to find massive amounts of data and then loop
> over them is inevitably going to end in tears.
>
> hth,
>
> AD
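
(For anyone who goes the INTO OUTFILE route instead: a sketch only, and
MySQL-specific. The table and path below are made up, the file ends up
on the database server's filesystem, and the MySQL user needs the FILE
privilege.)

$this->Post->query(
    "SELECT id, title, created FROM posts " .
    "INTO OUTFILE '/tmp/posts.csv' " .
    "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\\n'"
);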