New submission from Марк Коренберг <socketp...@gmail.com>:

I want a new function introduced in itertools. Something like this, but more 
efficient, and implemented in C:

=======================
from itertools import chain, islice
from typing import Iterable, TypeVar

T = TypeVar('T')  # pylint: disable=invalid-name


def batches(items: Iterable[T], num: int) -> Iterable[Iterable[T]]:
    items = iter(items)
    while True:
        try:
            # Pull one item eagerly so we stop cleanly when the input is exhausted.
            first_item = next(items)
        except StopIteration:
            break
        # Chain that item with up to num - 1 more from the same iterator.
        yield chain((first_item,), islice(items, num - 1))
=======================

Splits a large iterable into lazy chunks of a fixed size (except possibly the 
last one). Similar to `groupby`, but starts a new group based on the group size 
rather than a key function.
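A minimal usage sketch (the definition is repeated without annotations so the snippet runs on its own):

```python
from itertools import chain, islice

def batches(items, num):
    # Same logic as the proposal above.
    items = iter(items)
    while True:
        try:
            first_item = next(items)
        except StopIteration:
            break
        yield chain((first_item,), islice(items, num - 1))

# Each chunk must be consumed before advancing to the next one,
# since all chunks share the same underlying iterator.
print([list(chunk) for chunk in batches(range(7), 3)])
# [[0, 1, 2], [3, 4, 5], [6]]
```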

For example, when passing many records to a database, passing them one by one 
is obviously too slow, while passing all the records at once may increase 
latency. So a good solution is to pass, say, 1000 records per transaction. The 
same applies to REST API batches.
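To illustrate the database use case, here is a sketch with sqlite3's `executemany` (the table and column names are made up for the example; `batches()` is repeated so the snippet runs standalone):

```python
import sqlite3
from itertools import chain, islice

def batches(items, num):
    # batches() from the proposal above.
    items = iter(items)
    while True:
        try:
            first_item = next(items)
        except StopIteration:
            break
        yield chain((first_item,), islice(items, num - 1))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (value INTEGER)")

rows = ((i,) for i in range(2500))  # lazy source of 2500 rows
for batch in batches(rows, 1000):
    with conn:  # one transaction per batch of up to 1000 rows
        conn.executemany("INSERT INTO records (value) VALUES (?)", batch)

print(conn.execute("SELECT COUNT(*) FROM records").fetchone()[0])  # 2500
```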

P.S. Yes, I saw the `grouper` recipe at 
https://docs.python.org/3/library/itertools.html#itertools-recipes, but it is 
not optimal for big `n` values.
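For comparison, the `grouper` recipe from the itertools docs looks roughly like this; it materializes every group as a full n-tuple up front and pads the last group with a fillvalue, which is what makes it awkward for large `n`:

```python
from itertools import zip_longest

def grouper(iterable, n, fillvalue=None):
    # n references to the same iterator, so zip_longest pulls n items per group
    args = [iter(iterable)] * n
    return zip_longest(*args, fillvalue=fillvalue)

print(list(grouper('ABCDE', 3)))
# [('A', 'B', 'C'), ('D', 'E', None)] -- last group is padded,
# and each group is an eagerly built n-tuple rather than a lazy chunk
```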

----------
components: Library (Lib)
messages: 413061
nosy: socketpair
priority: normal
severity: normal
status: open
title: Feature: itertools: add batches
type: enhancement
versions: Python 3.11

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue46718>
_______________________________________