2011/12/20 David Gerard <dger...@gmail.com>:
> On 20 December 2011 01:16, Tom Morris <t...@tommorris.org> wrote:
>
>> Under your metric, in this scenario, the edits of a sysop and an
>> experienced user, or later the WikiProject editors, would not be
>> chosen as the high-quality stable version.
>
> Yao did in fact mention that other factors would need consideration.
>
> And being able to pick a hole doesn't make the algorithm useless -
> Google certainly went past simple page rank very early on.
>
> The question is if Yao's algorithm has markedly better results than
> just picking the latest. This would warrant investigation, at the
> least.
It is just 2-3 hours of work to select 100-200 random articles, check
their history, and evaluate whether this idea would really work...
IMHO rather not. I just checked 10 random articles in the English
Wikipedia and found that the current versions are usually better than
the most stable ones. It is quite common that the last stable version
of an article is followed only by a run of bot-made edits, so at the
very least bot-made edits should not be taken into consideration when
choosing the "most stable" version.

--
Tomek "Polimerek" Ganicz
http://pl.wikimedia.org/wiki/User:Polimerek
http://www.ganicz.pl/poli/
http://www.cbmm.lodz.pl/work.php?id=29&title=tomasz-ganicz
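[Editor's note: a minimal sketch of such a spot-check is appended below, in case anyone wants to reproduce it. It uses the public MediaWiki API (list=random and prop=revisions), treats any username ending in "bot" as a bot (a crude heuristic and entirely an assumption here, not something proposed in the thread), caps the history at the last 500 revisions, and defines the "most stable" revision as the non-bot revision that survived longest before the next non-bot edit. All of these choices are placeholders for whatever Yao's actual metric would be.]

"""Rough spot-check of the "most stable revision" idea (a sketch, not the
original poster's script). For a few random English Wikipedia articles it
pulls the revision history, ignores edits whose username ends in "bot"
(a crude heuristic), and reports the revision that went unchanged by
non-bot edits for the longest time."""

from datetime import datetime

import requests

API = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "stable-revision-spot-check/0.1 (example script)"}


def random_titles(n=10):
    """Fetch n random main-namespace article titles."""
    params = {
        "action": "query", "format": "json",
        "list": "random", "rnnamespace": 0, "rnlimit": n,
    }
    data = requests.get(API, params=params, headers=HEADERS).json()
    return [page["title"] for page in data["query"]["random"]]


def revisions(title, limit=500):
    """Fetch up to `limit` most recent revisions (timestamp + user)."""
    params = {
        "action": "query", "format": "json",
        "prop": "revisions", "titles": title,
        "rvprop": "timestamp|user", "rvlimit": limit,
    }
    data = requests.get(API, params=params, headers=HEADERS).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])


def is_bot(rev):
    """Crude heuristic: treat usernames ending in 'bot' as bots."""
    return rev.get("user", "").lower().endswith("bot")


def most_stable(revs):
    """Return (revision, days unchanged) for the non-bot revision that
    stood the longest before the next non-bot edit."""
    human = sorted((r for r in revs if not is_bot(r)),
                   key=lambda r: r["timestamp"])  # oldest first
    best, best_age = None, 0.0
    for older, newer in zip(human, human[1:]):
        t0 = datetime.fromisoformat(older["timestamp"].replace("Z", "+00:00"))
        t1 = datetime.fromisoformat(newer["timestamp"].replace("Z", "+00:00"))
        age = (t1 - t0).total_seconds() / 86400
        if age > best_age:
            best, best_age = older, age
    return best, best_age


if __name__ == "__main__":
    for title in random_titles(10):
        stable, age = most_stable(revisions(title))
        when = stable["timestamp"] if stable else "n/a"
        print(f"{title}: most stable non-bot revision {when}, "
              f"unchanged for {age:.1f} days")

[Comparing that revision with the current text for quality still takes a human eye, which is where the 2-3 hours of manual checking would go.]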