[Numpy-discussion] numpy.any segfaults for large object arrays

Charles R Harris charlesr.harris at gmail.com
Mon Mar 24 17:19:22 EDT 2008


On Mon, Mar 24, 2008 at 2:00 PM, Bruce Southey <bsouthey at gmail.com> wrote:

> Hi,
> True. I noticed on my system (with 8 GB of memory) that 9999 works
> but 10000 does not.
> Also, a 2-dimensional array crashes if the size is large enough:
> large_m = numpy.vstack((large_none, large_none))
>
> Bruce
>
>
> Martin Manns wrote:
> > Bruce Southey <bsouthey at gmail.com> wrote:
> >> Hi,
> >
> >> This also crashes with numpy 1.0.4 under Python 2.5.1. I am guessing it
> >> may be due to numpy.any() not understanding None.
> >>
> >
> > I doubt that because I get the segfault for all kinds of object arrays
> > that I try out:
> >
> > ~$ python
> > Python 2.4.5 (#2, Mar 12 2008, 00:15:51)
> > [GCC 4.2.3 (Debian 4.2.3-2)] on linux2
> > Type "help", "copyright", "credits" or "license" for more information.
> >
> >>>> import numpy
> >>>> small_obj = numpy.array([1]*10**3, dtype="O")
> >>>> numpy.any(small_obj)
> > True
> >
> >>>> large_obj = numpy.array([1]*10**6, dtype="O")
> >>>> numpy.any(large_obj)
> > Segmentation fault
> > ~$ python
> >
> >>>> import numpy
> >>>> large_strobj = numpy.array(["Yet another string."]*10**6, dtype="O")
> >>>> numpy.any(large_strobj)
> > Segmentation fault
> >
> > Martin
> >
> >
>

Maybe we are forgetting to check the return value of malloc, or overlooking
failures in Python's allocation functions.
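
For illustration, here is a minimal C sketch of the kind of check that
might be missing (a hypothetical helper, not NumPy's actual code):

#include <stdlib.h>

/* Hypothetical helper: allocate scratch space for a reduction over
 * n object pointers.  On failure, report the error to the caller
 * instead of letting it dereference NULL and segfault. */
static void **alloc_obj_scratch(size_t n)
{
    void **buf = malloc(n * sizeof(*buf));
    if (buf == NULL) {
        /* In NumPy's C code this would be PyErr_NoMemory() plus an
         * early return, so the caller raises MemoryError in Python. */
        return NULL;
    }
    return buf;
}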

Chuck