[Cython] Segfault with large cdef'd list

Stefan Behnel stefan_ml at behnel.de
Sun Jan 7 05:18:09 EST 2018


Robert Bradshaw wrote on 07.01.2018 at 09:48:
> On Sat, Jan 6, 2018 at 10:57 PM, Dan Stromberg wrote:
>> I'm getting a weird segfault from a tiny function (SSCCE) using cython
>> with python 2.7.  I'm seeing something similar with cython and python
>> 3.5, though I did not create an SSCCE for 3.5.
>>
>> This same code used to work with slightly older cythons and pythons,
>> and a slightly older version of Linux Mint.
>>
>> The code is at http://stromberg.dnsalias.org/svn/why-is-python-slow/trunk
>> (more specifically at
>> http://stromberg.dnsalias.org/svn/why-is-python-slow/trunk/tst.pyx )
>>
>> In short, cdef'ing a list of doubles with about a million elements,
>> and using only the 0th element once, segfaults - but cdef'ing a
>> slightly smaller array does not segfault under otherwise identical
>> conditions.
>>
>> Any suggestions?  Does Cython have a limit on the max size of a stack frame?
>>
> Cython itself doesn't impose any limits, but it does inherit whatever
> limit exists in the C compiler and runtime. The variance may be due to
> whatever else happens to be placed on the stack.
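
For concreteness, the pattern in question boils down to roughly the
following (a hypothetical sketch based on the description above, not the
exact code from the URL):

    def main():
        # ~1,000,000 doubles = ~8 MB allocated on the C stack, which is
        # right around the default 8 MB stack limit on many Linux systems
        cdef double data[1000000]
        data[0] = 1.0
        return data[0]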

Let me add that I wouldn't consider it a good idea to allocate large chunks
of memory on the stack. If it's meant to hold substantial amounts of data
(which also suggests that there is a substantial amount of processing
and/or copying involved), it's probably also worth a [PyMem_]malloc() call.
Heap allocation allows you to respond to allocation failures with a
MemoryError rather than the crash you are seeing now. How much stack space
is left at any given point is user-controlled through call depth and
recursion, which makes a large stack allocation a somewhat easy target for
overflow.
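
A minimal sketch of what the heap-allocated variant could look like
(assuming the usage really is just fill-and-read; adapt the names and
sizes to the real code):

    from cpython.mem cimport PyMem_Malloc, PyMem_Free

    def main():
        cdef double* data = <double*> PyMem_Malloc(1000000 * sizeof(double))
        if data == NULL:
            # allocation failed: raise a clean Python exception
            # instead of crashing
            raise MemoryError()
        try:
            data[0] = 1.0
            return data[0]
        finally:
            PyMem_Free(data)  # release the heap block in all cases

The try/finally ensures the memory is released even if the processing code
raises.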

Stefan

