[Numpy-discussion] Why is the truth value of ndarray not simply size>0 ?

Neil Martinsen-Burrell nmb at wartburg.edu
Mon Sep 7 09:46:35 EDT 2009


On 2009-09-07 07:11, Robert wrote:
> Is there a reason why ndarray truth tests (except scalars)
> deviate from the convention of other Python iterables
> (list, array.array, str, dict, ...)?
>
> Furthermore, there is a surprisingly strange exception for arrays
> of size 1 (!= scalars).

Historically, NumPy's predecessors used "not equal to zero" as the 
meaning for truth (consistent with numerical types in Python).  However, 
this introduces an ambiguity, since both any(a != 0) and all(a != 0) are 
reasonable interpretations of the truth value of a sequence of numbers. 
NumPy refuses to guess and raises a ValueError instead.  For arrays with 
a single element there is no ambiguity, so NumPy does the (numerically) 
ordinary thing and uses that element's truth value.
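
To make that concrete, here is a small illustrative snippet (the array 
`a` is just an example, not taken from your post):

    import numpy as np

    a = np.array([1, 0, 2])

    # A multi-element array refuses to guess its own truth value:
    try:
        bool(a)
    except ValueError as exc:
        print(exc)  # "... is ambiguous. Use a.any() or a.all()"

    # Say explicitly which interpretation you mean:
    print((a != 0).any())  # True  -- at least one element is nonzero
    print((a != 0).all())  # False -- not every element is nonzero

    # A size-1 array is unambiguous, so it behaves like its single value:
    print(bool(np.array([0])))  # False
    print(bool(np.array([7])))  # True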

The ndarray type available in Numpy is not conceptually an extension of 
Python's iterables.  If you'd like to help other Numpy users with this 
issue, you can edit the documentation in the online documentation editor 
at http://docs.scipy.org/numpy/docs/numpy-docs/user/index.rst
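
For contrast with the usual container convention, another illustrative 
snippet:

    import numpy as np

    # A non-empty Python list is always truthy, regardless of contents:
    print(bool([0, 0]))          # True  -- "non-empty" is what matters
    # An ndarray does not follow that convention:
    print(bool(np.array([0])))   # False -- reflects its single value
    # bool(np.array([0, 0])) raises ValueError: truth value is ambiguous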

-Neil
