[SciPy-user] Bundling numpy/scipy with applications

Johann Cohen-Tanugi cohen at slac.stanford.edu
Tue Nov 13 02:12:04 EST 2007


David Cournapeau wrote:
> Johann Cohen-Tanugi wrote:
>   
>> David Cournapeau wrote:
>>     
>>> Michael Hearne wrote:
>>>
>>>> I'm creating a Python application that is internal to my organization, 
>>>> but will be installed on both Linux and Mac OS X machines.  The 
>>>> application depends heavily on a number of "non-pure" modules (those 
>>>> that have C/C++/FORTRAN components), like numpy, scipy, gdal, etc.
>>>>
>>>> What is the most "pythonic" way to bundle these types of modules 
>>>> inside my application?
>>>>
>>>> I've investigated distutils and setuptools, and I don't see an easy 
>>>> way with those tools to include build instructions for packages built 
>>>> with autoconf tools.
>>>>
>>>> Is my only recourse to write a custom install script that runs 
>>>> "configure; make; make install" from a shell?
>>>>
>>>>
>>> You have two problems: building and packaging. For packaging, autoconf 
>>> will not help you much beyond providing some coherence between packages, I 
>>> think; distutils/setuptools have packaging tools, but I don't know how 
>>> good they are.
>>>
>>> On Linux, the packaging system depends on the distribution: since the 
>>> application is only internal, the problem is greatly simplified, because 
>>> you are likely to target only one format (rpm, deb, etc.). On Mac OS X, 
>>> there are also tools to build .pkg and .mpkg packages (an .mpkg is a set 
>>> of .pkg), available freely on the Apple developer website; I have not used 
>>> them much, but they look simple and easy to use for command-line packages.
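[For illustration: a minimal setup.py along these lines, assuming plain distutils; the package name "myapp" and its contents are made up. With such a file, "python setup.py bdist_rpm" can produce an rpm and "python setup.py bdist_dumb" a plain tarball, which is roughly what the distutils packaging tools offer out of the box.]

    # minimal setup.py sketch -- "myapp" and the version number are hypothetical
    from distutils.core import setup

    setup(
        name="myapp",
        version="0.1",
        packages=["myapp"],           # pure-Python package directory
        requires=["numpy", "scipy"],  # metadata only; distutils does not fetch dependencies
    )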
>>>
>>> You could hack something in distutils to call configure and so on, but I 
>>> don't think it would be pleasant. Building packages with 
>>> distutils/setuptools is easy and does not need much configuration. So I 
>>> think it is more natural to use a top-level tool that calls 
>>> distutils/setuptools than to use distutils as the top-level tool. For a 
>>> pythonic approach, you could consider build tools such as scons or waf, 
>>> both written in Python.
>>>
>>> But frankly, I would not worry too much about being pythonic: since some 
>>> tools will be shell-based anyway (autotools, packaging tools), and you 
>>> don't have to care about Windows, using make and shell may actually be 
>>> easier.
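[For illustration: a rough sketch of such a top-level driver written in Python; the install prefix and the source directory names are placeholders, not part of garnumpy or any existing tool.]

    # hypothetical top-level build driver calling autotools and distutils builds
    import subprocess

    PREFIX = "/opt/myapp"   # placeholder install prefix

    def build_autotools(srcdir):
        # the classic configure; make; make install sequence
        subprocess.check_call(["./configure", "--prefix=" + PREFIX], cwd=srcdir)
        subprocess.check_call(["make"], cwd=srcdir)
        subprocess.check_call(["make", "install"], cwd=srcdir)

    def build_distutils(srcdir):
        # let distutils/setuptools drive its own build and install
        subprocess.check_call(["python", "setup.py", "install",
                               "--prefix=" + PREFIX], cwd=srcdir)

    if __name__ == "__main__":
        build_autotools("gdal-1.4.2")      # autotools-based dependency (placeholder)
        build_distutils("numpy-1.0.3.1")   # distutils-based dependency (placeholder)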
>>>
>>> For the build part, you may take a look at my garnumpy package: it is 
>>> essentially a set of rules for GNU makefiles, and it can build numpy + 
>>> scipy with a configurable set of dependencies (ATLAS, NETLIB 
>>> BLAS/LAPACK, fftw, etc.). It can build both distutils- and 
>>> autotools-based packages.
>>>
>>> http://www.ar.media.kyoto-u.ac.jp/members/david/archives/garnumpy/garnumpy-0.4.tbz2
>>>
>>> I used it successfully on Linux and Cygwin, so I would say it should work 
>>> on Mac OS X without too much trouble. The only thing likely to be a pain 
>>> is fat (Universal) binaries. I use it to build a totally self-contained 
>>> numpy/scipy installation, which is a first step toward packaging. If you 
>>> think it can be useful to you, don't hesitate to ask questions; there is 
>>> also a bzr archive if you want access to the development history of the 
>>> tool.
>>>
>>> cheers,
>>>
>>> David
>>>
>> Hi David,
>> I am having a hell of a time with lapack/atlas. 
>>     
> That's because they are far from trivial to build correctly. As always 
> with build problems, it is never really complicated, but the issues are 
> often difficult to track down. To make things worse, some Linux 
> distributions (including Fedora and SUSE) have shipped bogus packages for 
> blas and lapack. That's why I did the garnumpy thing in the first place: to 
> quickly build blas/lapack correctly, so that I can test things more easily.
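[For illustration: a quick check of which blas/lapack a numpy build actually picked up, before blaming anything else; numpy.show_config() prints the libraries recorded at build time.]

    # small diagnostic: which BLAS/LAPACK was numpy built against, and does it behave?
    import numpy
    numpy.show_config()            # prints the blas/lapack library names and paths

    # exercise lapack through numpy; a bogus library tends to crash or return nonsense
    from numpy import linalg, random
    a = random.rand(50, 50)
    print(linalg.det(a))           # should print a finite number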
>   
>> I already posted the 
>> issue I currently have with scipy after following the build documentation 
>> on the scipy web site (see 
>> http://projects.scipy.org/pipermail/scipy-user/2007-November/014506.html). 
>> Besides, the octave build also seems to have serious issues with my 
>> lapack/atlas build.
>>     
> Which distribution are you using?
>   
>> So I immediately downloaded your package, but I seem to have a problem 
>> with the numpy download:
>> make[3]: Leaving directory 
>> `/data1/sources/python/garnumpy-0.4/platform/numpy'
>> # Change default path when looking for libs to fake dir,
>> # so we can set everything by env variables
>> cd work/main.d/numpy-1.0.3.1 && 
>> PYTHONPATH=/home/cohen/garnumpyinstall/lib/python2.5/site-packages:/home/cohen/garnumpyinstall/lib/python2.5/site-packages/gtk-2.0 
>> /usr/bin/python \
>>         setup.py config_fc --fcompiler=gnu config
>> Traceback (most recent call last):
>>   File "setup.py", line 90, in <module>
>>     setup_package()
>>   File "setup.py", line 60, in setup_package
>>     from numpy.distutils.core import setup
>>   File 
>> "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/__init__.py", 
>> line 39, in <module>
>>     import core
>>   File 
>> "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/__init__.py", 
>> line 8, in <module>
>>     import numerictypes as nt
>>   File 
>> "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/numerictypes.py", 
>> line 83, in <module>
>>     from numpy.core.multiarray import typeinfo, ndarray, array, empty, dtype
>> ImportError: No module named multiarray
>> make[2]: *** [configure-custom] Error 1
>> make[2]: Leaving directory 
>> `/data1/sources/python/garnumpy-0.4/platform/numpy'
>> make[1]: *** [../../platform/numpy/cookies/main.d/install] Error 2
>> make[1]: Leaving directory 
>> `/data1/sources/python/garnumpy-0.4/platform/scipy'
>> make: *** [imgdep-main] Error 2
>>
>>     
> This is really strange, I have never seen this error. This is what I 
> would do:
>     - remove the garnumpyinstall directory
>     - go into bootstrap/lapack and do make install
>     - go into platform/numpy and do make install
> At this point, you should have an installed numpy in garnumpyinstall: 
> test it (import numpy; numpy.test(level = 9999, verbosity = 9999)). If 
> this works, then go into platform/scipy. The errors should be clearer.
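[For illustration: the verification step above as a small script, with the site-packages path taken from the build log earlier in this thread.]

    # make sure Python picks up the numpy installed under garnumpyinstall,
    # then run its test suite as suggested above
    import sys
    sys.path.insert(0, "/home/cohen/garnumpyinstall/lib/python2.5/site-packages")

    import numpy
    print(numpy.__file__)                      # should point into garnumpyinstall
    numpy.test(level=9999, verbosity=9999)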
>   
Doing "make install" in platform/numpy fails in exactly the same way:
[cohen at localhost numpy]$ make install
[===== NOW BUILDING:    numpy-1.0.3.1   =====]
        [fetch] complete for numpy.
        [checksum] complete for numpy.
        [extract] complete for numpy.
        [patch] complete for numpy.
# Change default path when looking for libs to fake dir,
# so we can set everything by env variables
cd work/main.d/numpy-1.0.3.1 && 
PYTHONPATH=/home/cohen/garnumpyinstall/lib/python2.5/site-packages:/home/cohen/garnumpyinstall/lib/python2.5/site-packages/gtk-2.0 
/usr/bin/python \
        setup.py config_fc --fcompiler=gnu config
Traceback (most recent call last):
  File "setup.py", line 90, in <module>
    setup_package()
  File "setup.py", line 60, in setup_package
    from numpy.distutils.core import setup
  File 
"/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/__init__.py", 
line 39, in <module>
    import core
  File 
"/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/__init__.py", 
line 8, in <module>
    import numerictypes as nt
  File 
"/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/numerictypes.py", 
line 83, in <module>
    from numpy.core.multiarray import typeinfo, ndarray, array, empty, dtype
ImportError: No module named multiarray
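[A guess at the cause, not a confirmed diagnosis: "No module named multiarray" usually means Python is resolving "numpy" to a tree whose C extensions have not been built, for instance a stale copy sitting earlier on the path. Running something like the snippet below with the same PYTHONPATH, but outside the source tree, shows which numpy gets picked up first, without executing its __init__.py.]

    # locate the "numpy" Python would import first, without importing it
    import imp

    fileobj, pathname, description = imp.find_module("numpy")
    print(pathname)   # a stale or half-built tree here would explain the ImportError
    if fileobj is not None:   # for a package this is None, but close it if not
        fileobj.close()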

>> Any idea? Your package is a great idea in any case. I would actually love 
>> to see it work, of course, but it would also be nice to be able to point it 
>> to already existing source directories for numpy and scipy (for instance 
>> from a previous svn checkout). I have not read the documentation of your 
>> package yet, so the option might actually be there....
>>     
> You cannot use svn checkouts, because that would prevent the patching from 
> working, and there is also a checksum on the downloads: this is a major 
> limitation (but not so major once you realize that the only things you 
> really want to use from svn are numpy and scipy; you can still reuse 
> blas/lapack/fftw/atlas, and that's my main use when testing on different 
> platforms).
>
> You can reuse downloaded archives, though:
>
> make garchive
>
> It will download all the tarballs (you cannot choose a subset, 
> unfortunately) and put them into a directory that is reused 
> automatically afterwards. This means that even if you do make clean 
> anywhere in the source tree, you won't have to download the same things 
> over and over.
>
> cheers,
>
> David
>



