On Fri, 2005-10-07 at 20:12 +0200, Danilo Šegan wrote:
> Today at 5:20, Owen Taylor wrote:
[...]
> It does do significant disk work: it basically checks out the entire
> GNOME CVS, runs "intltool-update -p" and then "msgmerge" on every
> single PO file in the GNOME CVS repository (sometimes for multiple
> branches), and creates hundreds of static .html files containing
> statistics.
Creation of "hundreds of static .html files" seems like a prehistoric thing to do. Would it be too complicated to create a few xml files with the statistics data (or maybe insert it in a database) and then generate the html dinamically (to avoid too much intensive db access, these could be cached upon generation to avoid duplication). The flow would be like this: -process generates data (xml or sql) -user accesses a page first time (eg. /gnome-2.12/es/desktop/) -php generates the page from data and stores it in a temp dir -other user visits the same page -it gets shown from the temp dir (no duplicate generation, no db overhead) -temp dir gets erased every time the process runs Questions: -Pages that are never visited never get generated, is this a pro or a con? -Is this really less cpu intensive? I don't know if the sum of cpu time for all the pages that are visited (for the first time) is less than what we have now. Anyway, it's just an idea, I would volunteer to help with this if possible. Cheers, Lucas [...] -- Lucas Vieites Fariña <[EMAIL PROTECTED]> Web: <http://www.asixinformatica.com/users/lucas/> Blog: <http://www.asixinformatica.com/blog/> _______________________________________________ gnome-i18n mailing list gnome-i18n@gnome.org http://mail.gnome.org/mailman/listinfo/gnome-i18n