Hello,

I have been trialling Ant as the driver for a large-scale build. The preparation step before the build involves copying and unzipping >100,000 files spread across >20,000 directories. When I use Ant's built-in copy task with filesets that select large parts of this tree, a long time is spent building the list of files to copy, and holding that list also consumes a lot of memory. That, at least, is my understanding of how Ant handles filesets after browsing the source.
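To make the behaviour concrete: as far as I can tell from the source, resolving a fileset boils down to something like the snippet below, using Ant's DirectoryScanner (simplified and written from memory, so details may be off).

    import java.io.File;
    import org.apache.tools.ant.DirectoryScanner;

    public class ScanDemo {
        public static void main(String[] args) {
            DirectoryScanner ds = new DirectoryScanner();
            ds.setBasedir(new File("/path/to/tree"));    // >20,000 directories under here
            ds.setIncludes(new String[] { "**/*" });
            // scan() walks the whole tree before any copying starts...
            ds.scan();
            // ...and the complete match set comes back as a single array,
            // so memory grows with the number of selected files.
            String[] files = ds.getIncludedFiles();
            System.out.println(files.length + " files selected");
        }
    }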
Is there any way to avoid the high memory usage and the time spent building this list? Has there ever been any consideration of refactoring the way Ant processes filesets and similar constructs so that each selected file is processed as it is found, in an iterative fashion, rather than building the complete list first and then processing it?

thanks
paul
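P.S. To make the suggestion concrete, the kind of pattern I have in mind is roughly the sketch below. The FileCallback interface and the hard-coded ".zip" filter are invented purely for illustration; nothing like this exists in Ant as far as I know. The point is just that each match is handed off as soon as the walk finds it, so nothing larger than one path needs to be held at a time.

    import java.io.File;

    public class IterativeScanSketch {

        // Hypothetical callback, invented for illustration only.
        public interface FileCallback {
            void process(File file);
        }

        // Walk the tree depth-first and hand each matching file to the
        // callback as soon as it is found, instead of accumulating an
        // array of all matches first.
        public static void walk(File dir, FileCallback callback) {
            File[] entries = dir.listFiles();
            if (entries == null) {
                return;
            }
            for (File entry : entries) {
                if (entry.isDirectory()) {
                    walk(entry, callback);
                } else if (entry.getName().endsWith(".zip")) {   // stand-in for include patterns
                    callback.process(entry);
                }
            }
        }

        public static void main(String[] args) {
            walk(new File("/path/to/tree"), new FileCallback() {
                public void process(File file) {
                    System.out.println("would copy/unzip " + file);
                }
            });
        }
    }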