Hi Sebastian,

> I sync approx. 1 million files (50 GB in total) using the --skip-existing 
> option to achieve reasonable run times, but the memory usage goes up to 
> 1.6 GB. Have other people experienced similarly high memory usage? Is 
> there anything that could be done about it? Any way to trace which 
> variable gets blown up? I believe I traced it down to when there are 
> many local files involved.

I admit I haven't done much optimization on the memory usage front. Both
local and remote file lists contain a lot of information about each
object so that we can compare the lists easily and decide what to upload
/ download.
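
To give a feeling for where the bytes go, here is a rough sketch (not
s3cmd's actual structures, just a hypothetical entry with similar fields)
of how quickly a per-file dict adds up:

# Hypothetical per-file entry, roughly the kind of metadata a sync
# needs for comparing local and remote lists.
import sys

entry = {
    'full_name': '/home/user/data/some/fairly/deep/path/file-000001.dat',
    'size': 12345,
    'mtime': 1234567890,
    'md5': 'd41d8cd98f00b204e9800998ecf8427e',
    'uid': 1000,
    'gid': 1000,
    'mode': 0o100644,
}

# sys.getsizeof() counts only the dict itself, so add the values too.
total = sys.getsizeof(entry) + sum(sys.getsizeof(v) for v in entry.values())
print("approx bytes per entry:", total)              # a few hundred bytes
print("for 1 million entries: ~%d MB" % (total * 10**6 // 2**20))

So even a modest dict per file gets you into the hundreds of megabytes
before any temporaries are counted.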

However, if 1 million files eats up 1.6 GB of memory, each object takes
roughly 1.6 kB. That's too much indeed. Perhaps we keep references to
some temporary objects so the garbage collector can't remove them, or
something like that.
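
If you want to test that theory yourself, a few lines using the gc module,
dropped in at the suspect point after the local list is built (you'd have
to pick that spot in the s3cmd source yourself), will show which object
types dominate the heap:

# Force a collection, then count live objects by type; the top entries
# hint at which structure is blowing up.
import gc
from collections import Counter

gc.collect()
counts = Counter(type(o).__name__ for o in gc.get_objects())
for name, n in counts.most_common(10):
    print(name, n)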

I'm sure the local/remote lists could be optimized: some things removed
entirely and others computed only when needed.
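
As an illustration of the direction rather than an actual patch: holding
each entry in a __slots__ class instead of a dict, and computing expensive
fields like the MD5 lazily, would shrink the per-file footprint quite a bit.
The class and field names below are made up for the example:

import hashlib

class FileEntry(object):
    __slots__ = ('full_name', 'size', 'mtime', '_md5')

    def __init__(self, full_name, size, mtime):
        self.full_name = full_name
        self.size = size
        self.mtime = mtime
        self._md5 = None          # only filled in if the compare needs it

    @property
    def md5(self):
        # Compute the checksum on first access instead of for every file.
        if self._md5 is None:
            with open(self.full_name, 'rb') as f:
                self._md5 = hashlib.md5(f.read()).hexdigest()
        return self._md5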

The first step, obviously, is to create a filesystem hierarchy with ~1
million files and upload it to S3 ;-)
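
Something like this throwaway script (hypothetical layout: 1000 directories
of 1000 tiny files each) should be enough to reproduce the memory behaviour
locally before pointing "s3cmd sync" at it; tiny files keep the tree small
even though the real data set is 50 GB:

import os

root = 'testtree'
for d in range(1000):
    dirname = os.path.join(root, 'dir%04d' % d)
    os.makedirs(dirname, exist_ok=True)
    for f in range(1000):
        # ~50 bytes per file; the file count, not the size, is what
        # stresses the local/remote lists.
        with open(os.path.join(dirname, 'file%04d.dat' % f), 'w') as fh:
            fh.write('x' * 50)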

Michal
