Does anyone have any good examples of normalizing paths in files? I have files that have paths like this in them:
../../foo.htm

I process these to make them fully-qualified, like this:

/abc/def/ghi/jkl/mno/../../foo.htm

Now I want to normalize the fully-qualified paths, removing the "../" parts, to get a result like this:

/abc/def/ghi/foo.htm

I've found I can use <replaceregexp> like this to collapse one "dir/../" pair:

<replaceregexp match="${util.includeTagStartRegExp}(.*?)/([^/.]*?)/\.\./(.*?)${util.includeTagEndRegExp}"
               replace="${util.includeTagStartRegExp}\1/\3${util.includeTagEndRegExp}"
               flags="sg"
               byline="false">

If I enclose this in a for loop (sketched below), I can call it multiple times to remove multiple "dir/../" pairs. However, I don't like hardcoding the number of loop iterations. Because actually scanning the files for the maximum number of "../" components would be complicated, the simplest loop count for me to use is the maximum depth of the directory tree. Is there a way to compute this?

I've tried looping over the directory tree, getting the directories, and stripping out everything that isn't a "/" (my rough attempt is at the end of this message). However, I can't figure out how, inside a for loop, to determine which of these strings is the longest, so that I know what my loop count should be.
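For reference, the loop I have now looks roughly like the sketch below. It assumes the ant-contrib <for> task has been defined via <taskdef>, and the hardcoded iteration list and the fileset are placeholders for my real values:

<taskdef resource="net/sf/antcontrib/antlib.xml"/>

<!-- each pass collapses one "dir/../" pair in every matched file;
     the list "1,2,3,4,5" hardcodes five passes, which is exactly
     what I would like to avoid -->
<for list="1,2,3,4,5" param="pass">
  <sequential>
    <replaceregexp match="${util.includeTagStartRegExp}(.*?)/([^/.]*?)/\.\./(.*?)${util.includeTagEndRegExp}"
                   replace="${util.includeTagStartRegExp}\1/\3${util.includeTagEndRegExp}"
                   flags="sg" byline="false">
      <fileset dir="pages" includes="**/*.htm"/>
    </replaceregexp>
  </sequential>
</for>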
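The closest I've come to computing the depth is something like the sketch below, using <dirset>, <pathconvert> and a <script> task (so it needs a JavaScript engine available to Ant, e.g. via JSR 223 or BSF); ${src.dir} is a placeholder for my real source root. I'd welcome a cleaner, script-free way of doing this:

<dirset id="tree.dirs" dir="${src.dir}" includes="**/*"/>
<pathconvert property="tree.dir.list" refid="tree.dirs" pathsep=";" dirsep="/"/>

<script language="javascript"><![CDATA[
  // split the converted list and count the "/" separators in each entry;
  // counting them in the absolute paths overestimates the tree depth a
  // little, but that should still be safe to use as a loop count
  var dirs = project.getProperty("tree.dir.list").split(";");
  var max = 0;
  for (var i = 0; i < dirs.length; i++) {
    var depth = dirs[i].split("/").length - 1;
    if (depth > max) max = depth;
  }
  project.setProperty("max.depth", "" + max);
]]></script>

<echo message="deepest directory level: ${max.depth}"/>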