scalable bottleneck

Sayth Renshaw flebber.crue at gmail.com
Wed Apr 3 21:03:16 EDT 2019


On Thursday, 4 April 2019 10:51:35 UTC+11, Paul Rubin wrote:
> Sayth Renshaw writes:
> >     for x in range(max_root):
> > 1) Do you see a memory bottleneck here? If so, what is it?
> > 2) Can you think of a way to fix the memory bottleneck?
> 
> In Python 2, range(n) creates a list in memory.  To avoid that you would
> use xrange instead of range.  In Python 3, range(n) is a lazy sequence
> that produces values on demand, so it never builds the whole list.
> 
> Either way, fetch_squares creates a list of squares and not an iterator.
> 
> > for square in fetch_squares(MAX):
> >     do_something_with(square)
> 
> Here you're creating that list and then throwing it away.  So you have the
> right idea with:
> 
>     def supply_squares(max_root):
>         for x in range(max_root):
>             yield x * x
> 
> But supply_squares returns a generator, which means you have to iterate
> through it.
> 
>     print(supply_squares(MAX))
> 
> will print something like
> 
>    <generator object supply_squares at 0x0000000004DEAC00>
> 
> which is what you saw.  You want:
> 
>    for square in supply_squares(MAX):
>       print(square)

Ah thanks for your help. Happy I identified the correct bottleneck too. 
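
Putting it together, here's roughly what I'll use (just a sketch; I'm assuming MAX is the number of roots to square and do_something_with is whatever consumes each value):

    def supply_squares(max_root):
        # Yield one square at a time instead of building a full list,
        # so memory use stays flat no matter how big max_root gets.
        for x in range(max_root):
            yield x * x

    MAX = 10

    for square in supply_squares(MAX):
        print(square)   # or do_something_with(square)

That way only one square exists at a time, which is where the memory saving comes from.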

Cheers

Sayth


