Antal Nemes <thoneyva...@gmail.com> added the comment:

Thanks for sharing the discussion above; I did not know this had been discussed earlier. Indeed, my request does not come from a real-world example. I ran into this problem while solving an online coding challenge that also measures performance: I got the right answer, it just took too long to compute. In the crafted test case the stop condition could have occurred within the first few elements of the input iterator, but execution took longer because all of the elements were exhausted up front.

I could pass the challenge by adding my own lazy_combinations, which is basically a wrapper around itertools.combinations. I came up with the version below, which is of course incomplete for production use. It is not too difficult, but it was not entirely straightforward either.

import itertools

def lazy_combinations(g, n):
    try:
        # The first n-1 elements alone cannot form an n-combination yet.
        known_elements = []
        for _ in range(n - 1):
            known_elements.append(next(g))
        while True:
            # Every newly consumed element completes combinations with the
            # elements seen so far, so results are yielded before the input
            # iterator is exhausted.
            final_element = next(g)
            for head in itertools.combinations(known_elements, n - 1):
                next_element = head + (final_element,)
                yield next_element
            known_elements.append(final_element)
    except StopIteration:
        pass

What I would like to say is that demand for such behavior might be out there. I understand that such a feature would add complexity to the core. However, if it is not part of the core, the complexity does not disappear; it just manifests elsewhere: users may need to spend time debugging and write some nontrivial code as a workaround.
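
For illustration, here is a minimal sketch of the difference in behavior. The counting helper and the range(10**6) input are made up for this example, and lazy_combinations refers to the definition above: itertools.combinations exhausts the whole input before producing its first pair, while the wrapper yields as soon as it has seen enough elements.

import itertools

def counting(source, counter):
    # Made-up helper: counts how many items the consumer pulls from source.
    for item in source:
        counter[0] += 1
        yield item

consumed = [0]
eager = itertools.combinations(counting(range(10**6), consumed), 2)
next(eager)          # first pair: (0, 1)
print(consumed[0])   # 1000000 -- the whole input was read first

consumed = [0]
lazy = lazy_combinations(counting(range(10**6), consumed), 2)
next(lazy)           # first pair: (0, 1)
print(consumed[0])   # 2 -- only two items were needed

Note that the wrapper yields pairs in a different order than itertools.combinations, which is one of the reasons it is not production-ready.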
Thanks for sharing the discussion above. I did not know this was discussed earlier. Indeed, I do not come from a real world example. I ran into this problem while solving an online coding challenge that also measures performance. I got the right answer, just took too long time to calculate. In the crafted testcase the stop condition could have occurred within the first few elements of the input iterator, but the execution took longer because of exhausting all elements. I could pass the challenge by adding my own lazy_combination, that is basically a wrapper around itertools.combinations. I came up with this version, which is of course incomplete for production. Not too difficult, but it was not straightforward either. def lazy_combinations(g, n): try: known_elements = [] for i in range(n-1): known_elements.append(next(g)) while True: final_element = next(g) for i in itertools.combinations(known_elements, n-1): next_element = i+(final_element,) yield next_element known_elements.append(final_element) except StopIteration: pass What I would like to say is the demand for such behavior might be out there. I understand that such feature would induce complexity for the core. However, if this is not part of the core, the complexity does not disappear, just manifests elsewhere: users might need to spend some time with the debugging, and also write some nontrivial code as a workaround. ---------- _______________________________________ Python tracker <rep...@bugs.python.org> <https://bugs.python.org/issue37671> _______________________________________ _______________________________________________ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com