[Numpy-discussion] advanced indexing bug with huge arrays?
Sturla Molden
sturla at molden.no
Tue Jan 24 03:37:26 EST 2012
On 24.01.2012 09:21, Sturla Molden wrote:
> randomkit.c handles C long correctly, I think. There are different codes
> for 32 and 64 bit C long, and buffer sizes are size_t.
distributions.c takes C longs as parameters, e.g. for the binomial
distribution. mtrand.pyx handles this correctly, but it can give an
unexpected overflow error on 64-bit Windows:
In [1]: np.random.binomial(2**31, .5)
---------------------------------------------------------------------------
OverflowError Traceback (most recent call last)
C:\Windows\system32\<ipython-input-1-000aa0626c42> in <module>()
----> 1 np.random.binomial(2**31, .5)
C:\Python27\lib\site-packages\numpy\random\mtrand.pyd in
mtrand.RandomState.binomial (numpy\random\mtrand\mtrand.c:13770)()
OverflowError: Python int too large to convert to C long
On systems where C long is 64 bits, this does not produce an error.
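The platform dependence is easy to demonstrate without calling into mtrand at all. As a minimal sketch (using only ctypes; the variable names are mine, not NumPy's), this checks whether 2**31 fits in the platform's C long, which is exactly the conversion that fails above:

```python
import ctypes

# Width of the platform's C long: 32 bits on 64-bit Windows (LLP64 model),
# 64 bits on most 64-bit Unix systems (LP64 model).
c_long_bits = ctypes.sizeof(ctypes.c_long) * 8
c_long_max = 2 ** (c_long_bits - 1) - 1

n = 2 ** 31
# On LLP64 platforms n exceeds LONG_MAX, so converting this Python int
# to a C long raises the OverflowError shown in the traceback above.
fits = n <= c_long_max
print("C long is %d bits; 2**31 fits: %s" % (c_long_bits, fits))
```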
This raises the question of whether randomkit.c and distributions.c
should also be changed to use npy_intp for consistency across all
platforms.
(I assume we are not supporting 16-bit NumPy, in which case we would
need C long there...)
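For what it's worth, npy_intp is pointer-sized, which is why it sidesteps the problem: it is 64 bits on every 64-bit platform, including Windows. A quick sketch comparing the two widths:

```python
import ctypes
import numpy as np

# np.intp mirrors npy_intp and is always pointer-sized, so its itemsize
# is 8 on any 64-bit build -- Windows included.
print(np.dtype(np.intp).itemsize)

# C long, by contrast, is 4 bytes on 64-bit Windows but 8 on 64-bit Unix.
print(ctypes.sizeof(ctypes.c_long))
```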
Sturla