Great post, Felix!  Very well said.


What does this mean? It means that jQuery is
nowhere near as slow as the final test results make it appear (26x slower than
mootools). It means that mootools got the performance lead in some specific
selector (and does well in general) which is given way too much "weight" by
the test itself.

I'm also questioning how far one can even go in terms of benchmarking
selector engines. I mean, everybody has different needs. Most of the time I
use very simple selectors, which jQuery runs very fast according to the test.
So I'd actually be willing to lose performance on the more complicated
selectors if that allows the more common ones to run faster. What's missing
in my eyes is a survey or analysis of common selector usage that could be
used to weight the different selector results.
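To make the idea concrete, here's a minimal sketch of how such a usage-weighted score might look. All selectors, weights, and timings below are made-up placeholders, not real survey data or real benchmark numbers:

```javascript
// Hypothetical usage frequencies (would come from a real survey of
// selector usage in the wild -- these numbers are invented).
const usageWeights = { "#id": 0.6, ".class": 0.35, "ul li a.nav": 0.05 };

// Hypothetical per-selector timings in ms (invented, not real results).
const timesMs = {
  jquery:   { "#id": 0.1, ".class": 0.4, "ul li a.nav": 5.0 },
  mootools: { "#id": 0.4, ".class": 0.5, "ul li a.nav": 0.2 },
};

// Weighted average: slow times on rare selectors count for little,
// while times on common selectors dominate the score.
function weightedScore(times, weights) {
  return Object.keys(weights)
    .reduce((sum, sel) => sum + times[sel] * weights[sel], 0);
}

for (const [lib, times] of Object.entries(timesMs)) {
  console.log(lib, weightedScore(times, usageWeights).toFixed(2));
}
```

With a scheme like this, a library that is slow only on rare, complicated selectors would no longer be penalized as heavily as a plain per-selector sum penalizes it.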
