Possibly Pythonic Tail Call Optimization (TCO/TRE)

Joonas Liik liik.joonas at gmail.com
Thu Jul 16 14:23:45 EDT 2015


On 16 July 2015 at 20:49, Chris Angelico <rosuav at gmail.com> wrote:
>
> This sounds like a denial-of-service attack. If you can state that no
> reasonable document will ever have more than 100 levels of nesting,
> then you can equally state that cutting the parser off with a tidy
> exception if it exceeds 100 levels is safe.
>
This particular example does have that kind of smell; my bad for
being careless with examples.

But what if it's not a denial-of-service attack, though? What if it's just unusual data?

How about this: you run a genetic algorithm to optimise something,
and you store a log of your progress as a tree structure.
Then you have a script that traverses that tree for some reason,
perhaps to analyse the history and tune the parameters of future
searches. When the problem is complex it might well take thousands
of generations to reach the desired outcome.

But over time the log grows longer and longer.

When the script is first written it is probably tested on some
rather small logs, but they grow over time, so eventually your
program will implode on completely reasonable input.

Notice that the depth of this tree grows linearly with the number of
generations, rather than as ~log(n) of its size, so it will not hit
a natural ceiling short of finding a satisfying solution.

Whether that limit is 1k or 8k, there is no single limit that is
reasonable for all use cases, and the range over which you can vary
it is rather restricted.
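
Granted, the existing knob can at least be scoped per use case.
Something like this hypothetical helper (not in the standard
library) would let each call site state its own acceptable depth:

import sys
from contextlib import contextmanager

@contextmanager
def recursion_limit(n):
    # hypothetical helper: raise the limit for one call site only
    old = sys.getrecursionlimit()
    sys.setrecursionlimit(n)
    try:
        yield
    finally:
        sys.setrecursionlimit(old)

def recurse(n):
    return n and recurse(n - 1)

with recursion_limit(20000):
    recurse(5000)    # deeper than the default 1000, scoped to this block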


(Note: the genetic-algorithm example above has the issue that the
algorithm itself is potentially expensive to run, so it might never
reach the 1k limit in practice, but that does not mean there are no
other problems sharing this property. What I want to convey here is
that not all tree traversal has a natural depth limit, and there
are cases where the limit will be hit even with completely natural
inputs. A sketch of that shape follows.)
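
To make it concrete, a minimal sketch (all names made up for
illustration; one log node per generation, so depth grows linearly
with the data):

import sys

class LogNode:
    # one node per generation, each pointing back at its parent
    def __init__(self, generation, parent=None):
        self.generation = generation
        self.parent = parent

def chain_length(node):
    # the natural recursive traversal: one stack frame per generation
    if node is None:
        return 0
    return 1 + chain_length(node.parent)

tip = None
for gen in range(10000):    # 10000 generations is perfectly plausible
    tip = LogNode(gen, tip)

try:
    chain_length(tip)
except RuntimeError:    # RecursionError in Python 3.5+
    print("blew the default limit of", sys.getrecursionlimit())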

>
> Partly because infinite recursion requires infinite storage too. If it
> truly is tail calls, then you can turn it into a while loop and not
> have the limit; otherwise, you run the risk of blowing out your RAM.

A valid argument, though 100MB of stack space vs. infinite stack
space is still very much a distinction worth making.
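
The tail-call half of that is fair: a true tail call can be
rewritten mechanically as a loop. A quick sketch, with made-up
names:

def sum_to_rec(n, acc=0):
    if n == 0:
        return acc
    return sum_to_rec(n - 1, acc + n)    # the call is in tail position

def sum_to_loop(n, acc=0):
    # the same logic rewritten as a loop: constant stack depth
    while n != 0:
        n, acc = n - 1, acc + n
    return acc

sum_to_loop(10**6)    # fine; the recursive version dies near n = 1000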

For a long-running program it might not even be a big issue if some
of the stack (the very bottom) is swapped to disk, as the top will
be nice and warm in the cache. And yes, such a program is not
exactly common, but just because it uses a lot of memory does not
mean it has "frozen". It is the job of the programmer to say how
much heap his program can use, and it is also his job to say how
much stack space is acceptable.

Also, compare:

def munch_memory_1(s):
    # one extra stack frame *and* one longer string per call
    return munch_memory_1(s + "munch much memory!")

munch_memory_1("")

--vs--
def munch_memory_2(s):
    # the loop version grows only the heap; stack depth stays constant
    tmp = s[:]
    while True:
        tmp += "munch much memory!"
    return tmp  # will never reach this

munch_memory_2("")

Both versions eat memory without bound; the recursive one merely
adds a stack frame on top of each string, so running out of RAM is
hardly unique to recursion.


>> Having a user defined maximum stack limit might be a good idea, eg if
>> my stack takes up 100MB its prolly broke, but it should be my duty as
>> a programmer to specify such a limit, not have it inflicted upon me
>> (worse, in a manner that cannot be changed!).
>
> Actually, it is up to you. There's a default, but you can change it.
>
>>>> def recurse(n): return n and recurse(n-1)
> ...
>>>> recurse(998)
> 0
>>>> recurse(999)
> (throws RuntimeError)
>>>> sys.getrecursionlimit()
> 1000
>>>> sys.setrecursionlimit(5)
>>>> recurse(5)
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "<stdin>", line 1, in recurse
>   File "<stdin>", line 1, in recurse
>   File "<stdin>", line 1, in recurse
>   File "<stdin>", line 1, in recurse
> RuntimeError: maximum recursion depth exceeded
>>>> sys.setrecursionlimit(5000)
>>>> recurse(5000)
> (throws RuntimeError with a gigantic traceback)
>>>> sys.setrecursionlimit(50000)
>>>> recurse(50000)
> Segmentation fault

...as I recall from a certain Stack Overflow post, the practical
limit was about 8000 and possibly varies between platforms.
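
For what it's worth, pushing past the segfault quoted above means
raising two knobs, not one: the interpreter's recursion limit and
the C stack of the thread doing the recursing. A sketch (the sizes
are guesses and vary by platform and build):

import sys
import threading

def recurse(n):
    return n and recurse(n - 1)

def main():
    sys.setrecursionlimit(100000)
    recurse(50000)    # segfaults in the main thread, as quoted above
    print("survived")

# stack_size() affects only threads started afterwards, not the
# main thread; 64MB is a guess that works on common Linux builds
threading.stack_size(64 * 1024 * 1024)
t = threading.Thread(target=main)
t.start()
t.join()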


