[SciPy-dev] numpy - dual.py problems

Fernando Perez Fernando.Perez at colorado.edu
Sun Jan 8 11:44:25 EST 2006


Arnd Baecker wrote:
> Hi Travis,
> 
> On Sat, 7 Jan 2006, Travis Oliphant wrote:

>>>Later on Travis' wrote:
>>>"""The solution is that now to get at functions that are in both numpy and
>>>scipy and you want the scipy ones first and default to the numpy ones if
>>>scipy is not installed,  there is a numpy.dual  module that must be
>>>loaded separately that contains all the overlapping functions."""
>>>
>>>I think this is fine, if a user does this in his own code,
>>>but I have found the following `from numpy.dual import`s within numpy
>>> core/defmatrix.py:        from numpy.dual import inv
>>> lib/polynomial.py:    from numpy.dual import eigvals, lstsq
>>> lib/mlab.py:from numpy.dual import eig, svd
>>> lib/function_base.py:    from numpy.dual import i0
>>>
>>
>>BTW, these are all done inside of a function call.
> 
> 
> Yes - I saw that.
> 
> 
>>I want to be able
>>to use the special.i0 method when its available inside numpy (for kaiser
>>window).  I want to be able to use a different inverse for matrix
>>inverses and better eig and svd for polynomial root finding.
> 
> 
> Are these numerically better, or just faster?
> 
> I very much understand your point of view on this
> (until Fernando's mail I would have silently agreed ;-).
> On the other hand, I think Fernando's point stands:
> the mere installation of scipy
> will implicitly change the behaviour of numpy,
> without the user being aware of this
> or having asked for the change.
> 
> Now, it could be that this works fine in 99.9% of
> the cases, but if it does not, it might
> be very hard to track down.
> 
> So I am still thinking that something like a
>   numpy.enable_scipy_functions()
> might be a better approach.

[...]

>>So, I don't see this concept of enhancing internal functions going
>>away.  Now, I don't see the current  numpy.dual approach as the
>>*be-all*.  I think it can be improved on.   In fact, I suppose some
>>mechanism for registering replacement functions should be created
>>instead of giving special place to SciPy.   SciPy could then call these
>>functions.  This could all be done inside of numpy.dual.  So, I think
>>the right structure is there....
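
A registration mechanism along those lines might look like the sketch below. To be clear, everything here is hypothetical: the `set_default`/`register_func`/`get_func` names and the dictionary-based lookup are only an illustration of the idea, not an actual numpy.dual API:

```python
# Hypothetical sketch of a registry that could live inside numpy.dual.
# All names here are illustrative, not the real API.

_defaults = {}      # numpy's own implementations, keyed by function name
_replacements = {}  # faster/better versions registered by e.g. scipy

def set_default(name, func):
    """Record numpy's own implementation of `name`."""
    _defaults[name] = func

def register_func(name, func):
    """Let an external package (e.g. scipy) offer a replacement,
    but only for functions numpy has declared overridable."""
    if name not in _defaults:
        raise ValueError("%s is not an overridable function" % name)
    _replacements[name] = func

def restore_func(name):
    """Drop any registered replacement, falling back to numpy's version."""
    _replacements.pop(name, None)

def get_func(name):
    """Return the registered replacement if there is one, else the default."""
    return _replacements.get(name, _defaults[name])
```

With this shape, scipy (or any package) would call `register_func` at import time, and numpy-internal code would look functions up through `get_func` instead of importing them from a special scipy-aware module.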
> 
> 
> Anyway, sorry if I am wasting your time with this
> discussion, I don't feel too strongly about this point
> (especially after the version check),
> maybe Fernando would like to add something  -
> also I have to move on to other stuff
> (but whom do I tell that ;-).

Well, I do think that having code like (current SVN):

abdul[numpy]> egrep -r 'from numpy.dual' * | grep -v '\.svn/'
core/defmatrix.py:        from numpy.dual import inv
dual.py:#  Usage  --- from numpy.dual import fft, inv
lib/function_base.py:    from numpy.dual import i0
lib/polynomial.py:    from numpy.dual import eigvals, lstsq
lib/mlab.py:from numpy.dual import eig, svd
random/mtrand/mtrand.pyx:        from numpy.dual import svd


sort of defeats the whole purpose of dual, doesn't it?  dual is meant to 
isolate the contributions from full scipy, so that the _existence_ of scipy 
isn't a hidden side-effect for numpy.  If this is the case, then I think there 
shouldn't be code in numpy ever doing 'from dual import...'.  Otherwise, we 
might as well go back to simply writing 'from scipy import ...' as before, no?

Just as not all packages are auto-loaded in scipy (see Pearu's response on 
that today; you have to call scipy.pkgload() if you want the whole thing in 
memory), I think numpy should be strict about the no-hidden-side-effects 
policy suggested by dual.  Providing a numpy.load_dual() or 
numpy.enable_scipy() or something similar would be OK, but I think it should 
be done explicitly.

Note that if an explicit call is made, it should set a global (numpy-level) 
flag, so that any code can check for this condition:

In numpy.__init__, we should have

scipy_dual_loaded = False

def dual_load():
    global scipy_dual_loaded
    from dual import ...
    scipy_dual_loaded = True


This will at least let you check whether this thing was called by other 
libraries you may be importing.   I am trying to ensure that we have a 
mechanism for tracking this kind of side effect, because the call could be 
made by code you didn't write yourself.  With this, at least you can do 
something like:

if _something_weird and numpy.scipy_dual_loaded:
    print 'scipy effects, check for conflicts in that direction'

Ultimately, I think that this should be reversible, with a dual_unload() 
matching routine, but that's icing on the cake.  I do feel that at least the 
explicit dual_load() is the technically correct solution, even at a (minor) 
loss of convenience.

Cheers,

f



