Fast pythonic way to process a huge integer list

Steven D'Aprano steve at pearwood.info
Thu Jan 7 04:25:48 EST 2016


On Thu, 7 Jan 2016 01:36 pm, high5storage at gmail.com wrote:

> 
> I have a list of 163,840 integers. What is a fast & pythonic way to
> process this list in 1,280 chunks of 128 integers?


py> from itertools import izip_longest
py> def grouper(iterable, n, fillvalue=None):
...     "Collect data into fixed-length chunks or blocks"
...     # grouper('ABCDEFG', 3, 'x') --> ABC DEF Gxx
...     args = [iter(iterable)] * n
...     return izip_longest(fillvalue=fillvalue, *args)
...
py> alist = range(163840)
py> count = 0
py> for block in grouper(alist, 128):
...     assert len(list(block)) == 128
...     count += 1
...
py> count
1280


This was almost instantaneous on my computer. 163840 isn't a very large
number of ints.
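For anyone on Python 3, the same recipe should work with
itertools.zip_longest (izip_longest was renamed there); roughly:

    from itertools import zip_longest

    def grouper(iterable, n, fillvalue=None):
        "Collect data into fixed-length chunks or blocks"
        # grouper('ABCDEFG', 3, 'x') --> ABC DEF Gxx
        args = [iter(iterable)] * n
        return zip_longest(*args, fillvalue=fillvalue)

    alist = range(163840)   # a range object; grouper only needs an iterable
    count = sum(1 for block in grouper(alist, 128))
    print(count)            # 1280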




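And if the 163,840 integers are already sitting in a plain list rather
than a lazy iterable, ordinary slicing is another pythonic option; a
minimal sketch:

    alist = list(range(163840))
    # assumes len(alist) is an exact multiple of 128;
    # otherwise the final chunk comes out shorter
    chunks = [alist[i:i + 128] for i in range(0, len(alist), 128)]
    assert len(chunks) == 1280
    assert all(len(chunk) == 128 for chunk in chunks)
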
-- 
Steven



