It might be worth checking out http://github.com/ice799/memprof or
perftools.rb by tmm1.
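If getting those gems onto Heroku is a hassle, a crude stdlib-only census can still point at the leaking class: snapshot object counts with ObjectSpace every few batches and watch which class keeps growing. This is just a sketch (memprof gives real allocation traces), but ObjectSpace.each_object works on 1.8 too:

```ruby
# Crude object census using stdlib ObjectSpace — no native gems needed,
# and it works on 1.8 as well. memprof/perftools.rb give far more detail.
def object_census(top = 5)
  counts = Hash.new(0)
  ObjectSpace.each_object { |obj| counts[obj.class] += 1 }
  # Return the `top` most common classes as [class, count] pairs.
  counts.sort_by { |_, n| -n }.first(top)
end

GC.start
before = object_census
# ... run one find_each batch here ...
GC.start
after = object_census
# Classes whose counts keep climbing batch after batch, even across
# GC runs, are the leak suspects.
```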

Hope that helps.
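P.S. It's also worth double-checking that the loop never holds more than one batch of rows in memory. The leak-resistant shape looks roughly like the sketch below — `each_row` is a stand-in for the real find_each call (I haven't seen the full stack), and in the real job you'd hand `file.path` to Paperclip/S3 afterwards:

```ruby
require 'tempfile'
require 'csv'  # on 1.8 you'd use the fastercsv gem instead

# Stand-in for Model.find_each: yields one row at a time so the caller
# never has the whole dataset in memory.
def each_row
  5_000.times { |i| yield ["name #{i}", "user#{i}@example.com"] }
end

def export_csv
  file = Tempfile.new(['export', '.csv'])
  csv = CSV.new(file)
  # Write each row to disk immediately; never append rows to an array,
  # since that keeps every row live for the whole job.
  each_row { |row| csv << row }
  file.flush
  file
end

file = export_csv
# file.path is now ready to upload
```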

On Thu, 2010-01-14 at 17:24 -0800, Michael wrote:
> The differences between your machine and Heroku could be the Ruby
> version. I've encountered a number of little quirks running 1.8.7
> locally but running 1.8.6 on Heroku. Try to simulate the same
> environment on your machine or EC2 and see if the leak occurs.
> 
> Good luck, those problems are always a pain.
> 
> On Jan 14, 4:00 pm, Ryan Heneise <[email protected]> wrote:
> > I have a long-running task that generates a CSV file for exporting.
> > Here's how it works:
> > 1. An Export object is created, and the job is queued in DJ
> > 2. The process iterates over the dataset with ActiveRecord::find_each
> > and writes the result to a tempfile.
> > 3. The tempfile is uploaded to S3 with Paperclip
> > 4. The export record is saved
> >
> > This works for small-ish datasets, say less than 8,000 rows. However,
> > I have a memory leak somewhere that causes the memory usage for that
> > process to quickly balloon over 400MB, and the process is eventually
> > killed off. The problem is that I just can't find where the memory is
> > leaking.
> >
> > I've created a little debugging routine that updates the Export record
> > every 100 rows with the progress and the memory usage. Here's a link
> > to the Export model code: http://gist.github.com/277544
> >
> > This all works fine on my local machine - there's no memory leak, and
> > the memory stays at a constant 120 MB throughout the export process.
> > So I can't figure out if it's a problem with my code, or a gem that's
> > being used on Heroku, or what...
> >
> > So basically I was wondering if anyone else has experienced memory
> > leak issues like this, whether any of it looks familiar, and if so,
> > how you solved it?
