Arash Esbati <[email protected]> writes:

Hi Arash,
> I was playing with the regexps in AUCTeX and, before coming to the
> actual thing I wanted to test, I took an arbitrary, not so complex
> .tex file (around 1180 lines) and eval'ed these two forms:
>
>   (let ((gc-cons-threshold 800000))
>     (benchmark-run 15 (TeX-normal-mode)))
>
>   (let ((gc-cons-threshold most-positive-fixnum))
>     (benchmark-run 15 (TeX-normal-mode)))
>
> The results are:
>
>   (0.984757 13 0.517633)
>
>   (0.564401 0 0.0)
>
> Preventing GC has a large impact on parsing.  I'd like to hear from
> people with large, complex files if they could also run the test on
> them and report back their findings.

These are the results with the largest and most complex LaTeX document
I've ever written:

  (7.79938455 43 3.516914559)

  (4.881446238 0 0.0)

> And is there any major restriction why we shouldn't use this in order
> to speed up parsing?

The above effectively forbids GC during parsing, and I'm not sure what
would happen when a very large file is parsed on a system with little
memory.  Maybe it would be better to use a gc-cons-percentage value of
0.5 or something.  With that, I get

  (5.124124372000001 6 0.5745011899999923)

which is not much slower but doesn't forbid GC altogether.

Bye,
Tassilo
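
[Editor's note: the gc-cons-percentage figure above was presumably obtained
with a form analogous to the two gc-cons-threshold forms quoted earlier;
the exact binding is not shown in the message, so the following is a
reconstruction, not a quote:]

  ;; Presumed benchmark form for the gc-cons-percentage result above,
  ;; mirroring the gc-cons-threshold forms quoted earlier in the thread.
  (let ((gc-cons-percentage 0.5))
    (benchmark-run 15 (TeX-normal-mode)))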
