[Numpy-discussion] last call for fixes for numpy 1.8.1rc1

Julian Taylor jtaylor.debian at googlemail.com
Fri Feb 28 07:52:18 EST 2014


Performance should not be impacted as long as we stay on the stack; it
just moves the stack pointer a bit further.
E.g. nditer and einsum use temporary stack arrays of this type for their
initialization:
int op_axes_arrays[NPY_MAXARGS][NPY_MAXDIMS]; // both limits are 32 currently
The resulting nditer structure is then allocated on the heap and sized
according to the actual number of arguments it received.
So I'm more worried about running out of stack space, though the limit
is usually 8 MB, so taking 128 kB for a short while should be OK.
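
As a rough illustration of where that stack usage comes from (a sketch only:
the op_axes_arrays temporary mirrors the one named above, while the
iterator_setup_sketch function, the main driver and the exact byte totals are
assumptions, not numpy's actual setup code):

#include <stdio.h>

/* Limits as of numpy 1.8; raising NPY_MAXARGS is what is being discussed. */
#define NPY_MAXDIMS 32
#define NPY_MAXARGS 32   /* e.g. 256 under the proposal */

static void iterator_setup_sketch(void)
{
    /* Stack temporary of the kind used during nditer/einsum setup.
     * Its size is NPY_MAXARGS * NPY_MAXDIMS * sizeof(int), so it grows
     * linearly with NPY_MAXARGS; growth of this kind is behind the
     * 32 kB -> 128 kB stack-footprint figure quoted in the thread. */
    int op_axes_arrays[NPY_MAXARGS][NPY_MAXDIMS];

    printf("one temporary: %zu bytes\n", sizeof(op_axes_arrays));
}

int main(void)
{
    iterator_setup_sketch();
    return 0;
}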

On 28.02.2014 13:32, Francesc Alted wrote:
> Well, what numexpr is using is basically NpyIter_AdvancedNew:
> 
> https://github.com/pydata/numexpr/blob/master/numexpr/interpreter.cpp#L1178
> 
> and nothing else.  If NPY_MAXARGS could be increased just for that, and
> without breaking the ABI, then fine.  If not, we will have to wait until
> 1.9, I am afraid.
> 
> On the other hand, increasing the temporary arrays in nditer from 32 kB
> to 128 kB is a bit worrying, but we should probably run some benchmarks
> and see how much performance would be compromised (if any).
> 
> Francesc
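
For reference, a minimal sketch of the kind of NpyIter_AdvancedNew call in
question (an illustration under assumptions, not numexpr's actual code: the
make_iter helper and the particular iterator flags are placeholders; only
NpyIter_AdvancedNew itself is the real API being discussed):

#include <Python.h>
#include <numpy/arrayobject.h>

/* Build a buffered iterator over nop read-only operands.  The constructor
 * rejects the request once nop exceeds NPY_MAXARGS (32 in numpy 1.8),
 * which is the limit numexpr runs into. */
static NpyIter *
make_iter(PyArrayObject **ops, int nop)
{
    npy_uint32 op_flags[NPY_MAXARGS];
    int i;

    for (i = 0; i < nop && i < NPY_MAXARGS; i++) {
        op_flags[i] = NPY_ITER_READONLY;
    }
    return NpyIter_AdvancedNew(nop, ops,
                               NPY_ITER_BUFFERED | NPY_ITER_EXTERNAL_LOOP,
                               NPY_KEEPORDER, NPY_SAME_KIND_CASTING,
                               op_flags, NULL,   /* dtypes inferred */
                               -1, NULL, NULL,   /* no op_axes/itershape */
                               0);               /* default buffer size */
}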
> 
> On 2/28/14, 1:09 PM, Julian Taylor wrote:
>> Hm, increasing it for PyArrayMapIterObject would break the public ABI.
>> While nobody should be using this part of the ABI, it's not appropriate
>> for a bugfix release.
>> Note that, as things currently stand, we will break this ABI in numpy
>> 1.9.dev anyway for the indexing improvements.
>>
>> Though for nditer and some other functions we could change it, if that's
>> enough.
>> It would bump some temporary arrays of nditer from 32 kB to 128 kB. I
>> think that would still be fine, but it's getting to the point where we
>> should move them onto the heap.
>>
>> On 28.02.2014 12:41, Francesc Alted wrote:
>>> Hi Julian,
>>>
>>> Any chance that NPY_MAXARGS could be increased to something more than
>>> the current value of 32?  There is a discussion about this in:
>>>
>>> https://github.com/numpy/numpy/pull/226
>>>
>>> but I think that, as Charles was suggesting, just increasing NPY_MAXARGS
>>> to something more reasonable (say 256) should be enough for a long while.
>>>
>>> This issue considerably limits the number of operands in numexpr
>>> expressions, and hence affects other projects that depend on it, like
>>> PyTables or pandas.  See for example this bug report:
>>>
>>> https://github.com/PyTables/PyTables/issues/286
>>>
>>> Thanks,
>>> Francesc
>>>
>>> On 2/27/14, 9:05 PM, Julian Taylor wrote:
>>>> hi,
>>>>
>>>> We want to start preparing the release candidate for the bugfix release
>>>> 1.8.1rc1 this weekend; I'll start preparing the changelog tomorrow.
>>>>
>>>> So if you want a certain issue fixed, please scream now or, better, create
>>>> a pull request/patch against the maintenance/1.8.x branch.
>>>> Please only propose bugfixes: no enhancements (unless they are really,
>>>> really simple), new features, or invasive changes.
>>>>
>>>> I just finished my list of issues I want backported to numpy 1.8
>>>> (gh-4390, gh-4388). Please check whether your issue is already included in
>>>> these PRs. I'm probably still going to add gh-4284 after some thought
>>>> tomorrow.
>>>>
>>>> Cheers,
>>>> Julian
> 
> 



