Does work. Thank you both very much! Now that I have thousands of queued/backlogged tasks in the scheduler, I noticed that my regular tasks, which are of higher priority, are put on hold until everything else gets processed. Maybe it would be a good idea to have a priority field on a task? (just a thought)
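A possible interim workaround while there is no priority field: the web2py scheduler lets each task carry a group_name, and a worker can be limited to specific groups, so a dedicated worker can keep the regular tasks flowing while another one chews through the backlog. A minimal sketch, assuming the standard Scheduler API; the task functions and group names below are made up:

# models/scheduler.py -- sketch only, assuming the usual web2py setup
# where db is already defined in an earlier model file
from gluon.scheduler import Scheduler

def process_backlog_item(item_id):
    # heavy, low-priority backlog work goes here
    return 'done'

def regular_task():
    # time-sensitive, higher-priority work goes here
    return 'done'

scheduler = Scheduler(db)

# queue backlog work into its own group ...
scheduler.queue_task(process_backlog_item, pvars=dict(item_id=42),
                     group_name='backlog')
# ... and keep the regular tasks in a separate group
scheduler.queue_task(regular_task, group_name='regular')

Then start one worker per group (if I remember the -K syntax correctly, something like python web2py.py -K yourapp:regular and python web2py.py -K yourapp:backlog), so the backlog can never starve the regular tasks.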
On Fri, Oct 19, 2012 at 5:11 PM, Niphlod <niph...@gmail.com> wrote:

> it's missing the outer loop.
>
> _last_id = 0
> _items_per_page = 1000
> while True:
>     rows = db(db.table.id > _last_id).select(limitby=(0, _items_per_page),
>                                               orderby=db.table.id)
>     if len(rows) == 0:
>         break
>     for row in rows:
>         # do something
>         _last_id = row.id
>
> Should work.
>
> On Friday, October 19, 2012 10:52:06 PM UTC+2, Adi wrote:
>>
>> i put it exactly as it is, but it stopped working after 1000 records...
>> will double check again.
>>
>> On Fri, Oct 19, 2012 at 3:47 PM, Vasile Ermicioi <elf...@gmail.com> wrote:
>>>
>>> _last_id = 0
>>> _items_per_page = 1000
>>> for row in db(db.table.id > _last_id).select(limitby=(0, _items_per_page),
>>>                                               orderby=db.table.id):
>>>     # do something
>>>     _last_id = row.id
>>>
>>> you don't need to change anything to load all data, this code is
>>> loading everything in slices as you need,
>>> all records are ordered by id, and the next query will load the next
>>> _items_per_page items
>>> db.table.id > _last_id will skip all previous records

--
Thanks,
Adnan

video: http://vimeo.com/24653283
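For reference, the pattern from the thread can also be wrapped in a reusable generator. This is just a sketch built on the same assumptions as the code above (a DAL table with an auto-increment id field; db.table and page_size are placeholders):

# Sketch: wrap Niphlod's outer-loop pattern in a generator so callers can
# simply iterate, while only page_size rows are held in memory at a time.
def iter_by_id(db, table, page_size=1000):
    last_id = 0
    while True:
        rows = db(table.id > last_id).select(limitby=(0, page_size),
                                             orderby=table.id)
        if len(rows) == 0:
            break
        for row in rows:
            yield row
            last_id = row.id

# usage:
# for row in iter_by_id(db, db.table):
#     do_something(row)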