[S3tools-general] Memory usage on large buckets

2009-08-17 Thread Jean Jordaan
Hi there

I'm deleting a bucket with many objects like this:

  $ ./s3cmd --verbose --progress rb --force s3://BUCKETNAME

Memory use is at 1.8GB resident now.

s3cmd version 0.9.9

--
jean . .. //\\\oo///\\ --

Re: [S3tools-general] Memory usage on large buckets

2009-08-17 Thread Michal Ludvig
Hi Jean,

> I'm deleting a bucket with many objects like this:
> $ ./s3cmd --verbose --progress rb --force s3://BUCKETNAME
>
> Memory use is at 1.8GB resident now.

I know this is an issue for users with extremely large buckets. For now,
before the memory utilisation is optimised, I suggest to delete the
objects in chunks.
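In practice that means issuing several bounded s3cmd del calls rather than
one bucket-wide rb --force. A rough sketch, not from the thread (the exact
prefixes depend on how your object names start, so "somePrefix" below is
only a placeholder):

  # Peek at the key naming, then delete one bounded prefix at a time
  # instead of the whole bucket in a single pass.
  s3cmd ls s3://BUCKETNAME | head               # see how object names start
  s3cmd del -r s3://BUCKETNAME/somePrefix*      # repeat per prefix until empty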

Re: [S3tools-general] Memory usage on large buckets

2009-08-17 Thread Jean Jordaan
Hi Michal

> I know this is an issue for users with extremely large buckets.

Ah, good to know others are also feeling the pain ;-)

> For now, before the memory utilisation is optimised, I suggest to
> delete the objects in chunks,

Everything is in one massive bucket. The objects are smallish files.

Re: [S3tools-general] Memory usage on large buckets

2009-08-17 Thread Michal Ludvig
Jean Jordaan wrote:
> When s3cmd reports
> File s3://BUCKETNAME/1cc78b9209b129e9ab52a7a532ec4213dc7a7b05-667535 deleted

You can still use the method I suggested:

  for PREFIX_INT in $(seq 0 255); do
      PREFIX=$(printf "%02x" ${PREFIX_INT})
      s3cmd del -r "s3://BUCKETNAME/${PREFIX}*"
  done

That'll remove the objects in 256 smaller chunks, one two-character hex
prefix at a time.
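Once those per-prefix passes have emptied the bucket, removing the bucket
itself (Jean's original goal) should be a single cheap call. A minimal
follow-up sketch, assuming the loop above has run to completion:

  # After all 256 prefix passes finish, the bucket should be empty,
  # so removing it no longer has to hold every key in memory.
  s3cmd rb s3://BUCKETNAME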