Wolfgang Keller wrote:
> The way I understand this, resuming a generator causes less overhead than the
> initial overhead of a function call.
I don't have Python 2.4 handy, but it doesn't seem to be true in 2.3. I'm not
very proficient with generators though, so maybe I'm doing something stupid
here...

>>> from __future__ import generators
>>> def f():
...     return 1
...
>>> def g():
...     while 1:
...         yield 1
...
>>> it = g()
>>> import time
>>> def t(c, n):
...     start = time.time()
...     for i in xrange(n):
...         c()
...     print time.time()-start
...
>>> t(f, 1000000)
0.277699947357
>>> t(f, 1000000)
0.279093980789
>>> t(f, 1000000)
0.270813941956
>>> t(it.next, 1000000)
0.297060966492
>>> t(it.next, 1000000)
0.263942956924
>>> t(it.next, 1000000)
0.293347120285

For reference (the loop overhead alone):

>>> def t0(c, n):
...     start = time.time()
...     for i in xrange(n):
...         pass
...     print time.time()-start
...
>>> t0(it.next, 1000000)
0.0523891448975

Maybe the ratio is completely different in a newer Python than 2.3.4 (RH EL3
standard install). Or maybe it's very different if there are plenty of local
variables etc. in f / g.

-- 
http://mail.python.org/mailman/listinfo/python-list
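For anyone wanting to repeat the comparison on a current interpreter, here is a rough Python 3 sketch of the same micro-benchmark using the stdlib timeit module. Note that xrange and it.next are Python 2 spellings; below they become range and it.__next__. Absolute timings are machine-dependent, so no numbers are claimed here.

    import timeit

    def f():
        # plain function call
        return 1

    def g():
        # infinite generator; each next() resumes the suspended frame
        while True:
            yield 1

    it = g()
    n = 1_000_000

    # timeit accepts any zero-argument callable
    t_func = timeit.timeit(f, number=n)          # n fresh function calls
    t_gen = timeit.timeit(it.__next__, number=n) # n generator resumptions

    print("function call:     %.3fs" % t_func)
    print("generator resume:  %.3fs" % t_gen)

Passing the bound methods directly to timeit (rather than timing a for-loop by hand) also removes most of the loop overhead that the t0 baseline above had to subtract out.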