scalable bottleneck

Sayth Renshaw flebber.crue at gmail.com
Wed Apr 3 17:42:10 EDT 2019


I received this question in an email, as part of a newsletter.

def fetch_squares(max_root):
    squares = []
    for x in range(max_root):
        squares.append(x ** 2)
    return squares

MAX = 5

for square in fetch_squares(MAX):
    do_something_with(square)

1) Do you see a memory bottleneck here? If so, what is it?
2) Can you think of a way to fix the memory bottleneck?

I want to know if I am trying to solve the correct bottleneck.
I am thinking that the bottleneck is building the whole list in memory only to iterate over that same list afterwards, roughly doubling the work.

Is that the correct problem to solve?
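As a rough sketch of why the list is the memory cost (the sizes in the comments are approximate and will vary by platform and Python version):

```python
import sys

def fetch_squares(max_root):
    # Builds and stores every square up front.
    return [x ** 2 for x in range(max_root)]

def supply_squares(max_root):
    # Yields one square at a time; nothing is stored.
    for x in range(max_root):
        yield x ** 2

squares_list = fetch_squares(1_000_000)
squares_gen = supply_squares(1_000_000)

print(sys.getsizeof(squares_list))  # several megabytes for a million entries
print(sys.getsizeof(squares_gen))   # a couple of hundred bytes, regardless of max_root
```

(`sys.getsizeof` only measures the container itself, but it is enough to show that the generator's footprint does not grow with `max_root`.)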

If it is, then I thought the best way is to supply the numbers on the fly with a generator.

def supply_squares(max_root):
    for x in range(max_root):
        yield x ** 2

MAX = 5


So then I set up a loop and do whatever is needed. At this point I am only creating generator objects. But is this the correct way to go? It is more of an "am I thinking correctly?" question.

item = 0
while item < MAX:
    print(supply_squares(item))
    item += 1
    
<generator object supply_squares at 0x0000000004DEAC00>
<generator object supply_squares at 0x0000000004DEAC00>
<generator object supply_squares at 0x0000000004DEAC00>
<generator object supply_squares at 0x0000000004DEAC00>
<generator object supply_squares at 0x0000000004DEAC00>
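For what it's worth, calling a generator function only creates the generator object (which is what gets printed above); the body does not run until the generator is iterated. A minimal sketch of driving the generator with a for loop instead:

```python
def supply_squares(max_root):
    # Yields each square lazily, one value per iteration step.
    for x in range(max_root):
        yield x ** 2

MAX = 5

# Iterating the generator pulls values out one at a time.
for square in supply_squares(MAX):
    print(square)  # prints 0, 1, 4, 9, 16 on successive lines
```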

Thanks

Sayth



More information about the Python-list mailing list