From bedouglas at earthlink.net Thu Nov 1 00:45:07 2007
From: bedouglas at earthlink.net (bruce)
Date: Wed, 31 Oct 2007 20:45:07 -0800
Subject: [SciPy-user] SciPY/NumPY - rhel4 installation issues....
Message-ID: <067e01c81c41$fc024bc0$0301a8c0@tmesa.com>

Hi.

I'm trying to install numpy/scipy on a RHEL system with Python v2.4.3. I'm following the information from various sites. My goal is to be able to successfully get numpy/scipy up and running, with a few basic tests.

I'm using the tar files for:

drwxr-xr-x  6 root  root     4096 Oct 31 19:30 .
drwxr-xr-x 32 root  root     4096 Oct 31 12:26 ..
drwxr-xr-x 12 root  root     4096 Oct 31 18:29 ATLAS
-rw-r--r--  1 root  root  3192987 Oct 31 12:35 atlas3.8.0.tar.gz
drwxr-x---  8 13686 731      4096 Oct 31 18:11 lapack-3.1.1
-rw-r--r--  1 root  root 10407595 Feb 26  2007 lapack-3.1.1.tgz
drwxrwxr-x  4 test  test     4096 Oct 31 19:43 numpy-1.0.3.1
-rw-r--r--  1 root  root  1500255 Oct 31 12:49 numpy-1.0.3.1.tar.gz
-rw-r--r--  1 root  root       54 Oct 31 12:27 readme.txt
drwxrwxr-x  3 test  test     4096 Sep 22 01:14 scipy-0.6.0
-rw-r--r--  1 root  root  6572954 Oct 31 12:52 scipy-0.6.0.tar.gz

I believe I've successfully downloaded/built/installed the ATLAS/LAPACK apps. I'm using a RHEL4 system with a single 1.8GHz processor.

I did a:
"python setup.py install" for numpy, and it appeared to finish.

I then did a:
"python setup.py install" for scipy, and got the following err:

[root at mfgtest3 scipy-0.6.0]# python setup.py install
Traceback (most recent call last):
  File "setup.py", line 53, in ?
    setup_package()
  File "setup.py", line 28, in setup_package
    from numpy.distutils.core import setup
  File "/usr/lib/python2.4/site-packages/numpy/__init__.py", line 39, in ?
    import core
  File "/usr/lib/python2.4/site-packages/numpy/core/__init__.py", line 5, in ?
    import multiarray
ImportError: /usr/lib/python2.4/site-packages/numpy/core/multiarray.so: undefined symbol: PyUnicodeUCS2_FromUnicode

I can't find anything via google to indicate what's the cause...
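For reference, the undefined PyUnicodeUCS2_FromUnicode symbol is the classic sign of a Unicode-width mismatch: the multiarray.so was compiled against a "narrow" (UCS2) Python, while the interpreter importing it is a "wide" (UCS4) build, which is typically what Red Hat ships. A quick diagnostic sketch (the .so path below is taken from the traceback, not verified here):

```python
import sys

# A "narrow" (UCS2) Python reports 65535; a "wide" (UCS4) build,
# typically what Red Hat ships, reports 1114111.
print(sys.maxunicode)

# To confirm which width the extension was built for, inspect its
# undefined symbols from a shell, e.g.:
#   nm -D /usr/lib/python2.4/site-packages/numpy/core/multiarray.so | grep PyUnicodeUCS
```

If the widths disagree, numpy was built with a different Python than the one importing it, and rebuilding numpy with the system interpreter (after removing its build directory) should clear the error.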
Any thoughts/comments would be seriously helpful!! Could this be an issue of incompatible versions between numpy/scipy??

thanks
-sam

From parvel.gu at gmail.com Thu Nov 1 02:08:38 2007
From: parvel.gu at gmail.com (Parvel Gu)
Date: Thu, 1 Nov 2007 14:08:38 +0800
Subject: [SciPy-user] Generating random variables in a joint normal distribution?
In-Reply-To: <4725637C.7030207@gmail.com>
References: <4725487E.40509@gmail.com> <4725637C.7030207@gmail.com>
Message-ID:

Hi,

Thanks so much for the reply. I could go further with this problem. Here is the follow-up issue:

In my understanding, successive calls to random.multivariate_normal produce a series of random variables which all follow the joint normal distribution. But then, how do I reset the random generation in each iteration?

Assume that I have to simulate something for 50 iterations and then take the average value. In each iteration I need a series of random pairs which follows the joint normal distribution, and the iterations are expected to be independent of each other. Thus:

In iteration 1:
    serial = []
    in sub loop:
        serial.append(random.multivariate_normal(m, cov))
    ...

And in iteration 2, I want a fresh series which should not be affected by the previous one. Is there any problem with emptying the list and then still calling random.multivariate_normal(m, cov)? In my understanding, if there is no refresh or something performed on the internal random generator, random.multivariate_normal will continue to generate variables in the context of the previous ones.

Br,
Parvel

From robert.kern at gmail.com Thu Nov 1 04:09:00 2007
From: robert.kern at gmail.com (Robert Kern)
Date: Thu, 01 Nov 2007 03:09:00 -0500
Subject: [SciPy-user] Generating random variables in a joint normal distribution?
In-Reply-To:
References: <4725487E.40509@gmail.com> <4725637C.7030207@gmail.com>
Message-ID: <4729899C.8090904@gmail.com>

Parvel Gu wrote:
> Hi,
>
> Thanks so much for the reply. I could go further with this problem. Here
> is the follow-up issue:
>
> In my understanding, successive calls to random.multivariate_normal
> produce a series of random variables which all follow the joint normal
> distribution. But then, how do I reset the random generation in each
> iteration?
>
> Assume that I have to simulate something for 50 iterations and then take
> the average value. In each iteration I need a series of random pairs
> which follows the joint normal distribution, and the iterations are
> expected to be independent of each other. Thus:
>
> In iteration 1:
>     serial = []
>     in sub loop:
>         serial.append(random.multivariate_normal(m, cov))
>     ...
>
> And in iteration 2, I want a fresh series which should not be affected
> by the previous one. Is there any problem with emptying the list and
> then still calling random.multivariate_normal(m, cov)? In my
> understanding, if there is no refresh or something performed on the
> internal random generator, random.multivariate_normal will continue to
> generate variables in the context of the previous ones.

Just keep sampling. The samples will be as independent of each other as you can possibly get them. Reseeding the generator (the only thing I can imagine that you meant by "refresh or something") will only make things worse.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
  -- Umberto Eco

From lxander.m at gmail.com Thu Nov 1 09:13:28 2007
From: lxander.m at gmail.com (Alexander Michael)
Date: Thu, 1 Nov 2007 09:13:28 -0400
Subject: [SciPy-user] SciPY/NumPY - rhel4 installation issues....
In-Reply-To: <067e01c81c41$fc024bc0$0301a8c0@tmesa.com>
References: <067e01c81c41$fc024bc0$0301a8c0@tmesa.com>
Message-ID: <525f23e80711010613g73959bdaw98d31f1288f75654@mail.gmail.com>

On 11/1/07, bruce wrote:
> I then did a:
> "python setup.py install" for scipy, and got the following err:
>
> [root at mfgtest3 scipy-0.6.0]# python setup.py install
> Traceback (most recent call last):
>   File "setup.py", line 53, in ?
>     setup_package()
>   File "setup.py", line 28, in setup_package
>     from numpy.distutils.core import setup
>   File "/usr/lib/python2.4/site-packages/numpy/__init__.py", line 39, in ?
>     import core
>   File "/usr/lib/python2.4/site-packages/numpy/core/__init__.py", line 5, in ?
>     import multiarray
> ImportError: /usr/lib/python2.4/site-packages/numpy/core/multiarray.so: undefined symbol: PyUnicodeUCS2_FromUnicode
>
> I can't find anything via google to indicate what's the cause...
>
> Any thoughts/comments would be seriously helpful!!

This could be a result of your numpy installation being incomplete (likely due to not being correctly linked to blas/lapack). What happens when you try:

python -c "import numpy"

on its own? Does it fail with the same error? If so, you need to focus on numpy. One subtlety to be aware of is that you need to remove the build directory in your numpy directory whenever you need to rebuild numpy.

HTH,
Alex

From berthe.loic at gmail.com Thu Nov 1 10:27:38 2007
From: berthe.loic at gmail.com (LB)
Date: Thu, 01 Nov 2007 07:27:38 -0700
Subject: [SciPy-user] Bug in scipy.integrate.odeint ?
Message-ID: <1193927258.315014.184820@50g2000hsm.googlegroups.com>

Hi,

I find the behaviour of scipy.integrate.odeint very strange:

>>> from numpy import *
>>> from scipy import integrate
>>>
>>> def f(x, t):
...     return cos(x)
...
>>>
>>> X0 = array([0.0])
>>> X0_ini = X0.copy()
>>> t = linspace(0, 20., 200)
>>> X = integrate.odeint(f, X0, t)
>>> X0 == X0_ini
array([False], dtype=bool)
>>> X0 == X[-1]
array([True], dtype=bool)

Why does odeint modify X0? Is there any reason for this? To me it really looks like a bug, and it could easily lead to bugs if there's no mention of this in the docstring.

--
LB

From cookedm at physics.mcmaster.ca Thu Nov 1 10:56:19 2007
From: cookedm at physics.mcmaster.ca (David M. Cooke)
Date: Thu, 1 Nov 2007 10:56:19 -0400
Subject: [SciPy-user] Bug in scipy.integrate.odeint ?
In-Reply-To: <1193927258.315014.184820@50g2000hsm.googlegroups.com>
References: <1193927258.315014.184820@50g2000hsm.googlegroups.com>
Message-ID:

On Nov 1, 2007, at 10:27 , LB wrote:
> Hi,
>
> I find the behaviour of scipy.integrate.odeint very strange:
>>>> from numpy import *
>>>> from scipy import integrate
>>>>
>>>> def f(x, t):
> ...     return cos(x)
> ...
>>>>
>>>> X0 = array([0.0])
>>>> X0_ini = X0.copy()
>>>> t = linspace(0, 20., 200)
>>>> X = integrate.odeint(f, X0, t)
>>>> X0 == X0_ini
> array([False], dtype=bool)
>>>> X0 == X[-1]
> array([True], dtype=bool)
>
> Why does odeint modify X0? Is there any reason for this? To me it
> really looks like a bug, and it could easily lead to bugs if there's
> no mention of this in the docstring.

I'm guessing you've got an old version of scipy installed, as it was changed in February to use a copy of the input arguments instead.

--
|>|\/|< /------------------------------------------------------------------\
|David M. Cooke   http://arbutus.physics.mcmaster.ca/dmc/
|cookedm at physics.mcmaster.ca

From bradford.n.cross at gmail.com Thu Nov 1 13:29:16 2007
From: bradford.n.cross at gmail.com (Bradford Cross)
Date: Thu, 1 Nov 2007 10:29:16 -0700
Subject: [SciPy-user] Installing TimeSeries
Message-ID:

Install works OK for MaskedArray but fails for TimeSeries with the following output:

bcross at ubuntu:~/maskedarray$ sudo python setup.py install
Password:
running install
running build
running config_fc
running build_py
running install_lib
running install_data
running install_egg_info
Removing /usr/lib/python2.5/site-packages/maskedarray-0.0.0.egg-info
Writing /usr/lib/python2.5/site-packages/maskedarray-0.0.0.egg-info
bcross at ubuntu:~/maskedarray$ cd
bcross at ubuntu:~$ cd timeseries/
bcross at ubuntu:~/timeseries$ sudo python setup.py install
Warning: Assuming default configuration (./lib/{setup_lib,setup}.py was not found)
Appending timeseries.lib configuration to timeseries
Ignoring attempt to set 'name' (from 'timeseries' to 'timeseries.lib')
Warning: Assuming default configuration (./tests/{setup_tests,setup}.py was not found)
Appending timeseries.tests configuration to timeseries
Ignoring attempt to set 'name' (from 'timeseries' to 'timeseries.tests')
running install
running build
running config_fc
running build_src
building extension "timeseries.cseries" sources
running build_py
running build_ext
customize UnixCCompiler
customize UnixCCompiler using build_ext
building 'timeseries.cseries' extension
compiling C sources
C compiler: gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O2 -Wall -Wstrict-prototypes -fPIC
compile options: '-I/usr/lib/python2.5/site-packages/numpy/core/include/numpy -I./include -I/usr/lib/python2.5/site-packages/numpy/core/include -I/usr/include/python2.5 -c'
gcc: ./src/cseries.c
In file included from ./src/cseries.c:1:
./include/c_lib.h:4:20: error: Python.h: No such file or directory
./include/c_lib.h:5:26: error: structmember.h: No such file or directory
In
file included from /usr/lib/python2.5/site-packages/numpy/core/include/numpy/arrayobject.h:14, from ./include/c_lib.h:6, from ./src/cseries.c:1: /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:91:2: error: #error Must use Python with unicode enabled. In file included from /usr/lib/python2.5/site-packages/numpy/core/include/numpy/arrayobject.h:14, from ./include/c_lib.h:6, from ./src/cseries.c:1: /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:918: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:919: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'npy_uintp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1009: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1010: error: expected ')' before '*' token /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1012: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1012: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1013: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1023: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1023: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1025: error: expected declaration specifiers or '...' 
before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1025: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1026: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1028: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1031: error: expected ')' before '*' token /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1034: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1036: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1037: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1037: error: expected declaration specifiers or '...' before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1039: error: expected declaration specifiers or '...' 
before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1044: error: expected specifier-qualifier-list before 'npy_intp' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1057: error: expected specifier-qualifier-list before 'PyArray_GetItemFunc' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1159: error: expected specifier-qualifier-list before 'PyObject_HEAD' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1191: error: expected specifier-qualifier-list before 'PyObject' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1201: error: expected specifier-qualifier-list before 'PyObject_HEAD' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1227: error: expected specifier-qualifier-list before 'PyObject_HEAD' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1234: error: expected declaration specifiers or '...' 
before 'PyObject' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1367: error: expected specifier-qualifier-list before 'PyObject_HEAD' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1505: error: expected specifier-qualifier-list before 'PyObject_HEAD' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1562: error: expected specifier-qualifier-list before 'PyObject_HEAD' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1762: error: expected specifier-qualifier-list before 'npy_intp' In file included from /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1777, from /usr/lib/python2.5/site-packages/numpy/core/include/numpy/arrayobject.h:14, from ./include/c_lib.h:6, from ./src/cseries.c:1: /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:397: error: 'NULL' undeclared here (not in a function) /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h: In function '_import_array': /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:945: error: 'PyObject' undeclared (first use in this function) /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:945: error: (Each undeclared identifier is reported only once /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:945: error: for each function it appears in.) 
/usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:945: error: 'numpy' undeclared (first use in this function) /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:945: warning: implicit declaration of function 'PyImport_ImportModule' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:946: error: 'c_api' undeclared (first use in this function) /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:948: warning: implicit declaration of function 'PyObject_GetAttrString' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:949: warning: implicit declaration of function 'Py_DECREF' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:950: warning: implicit declaration of function 'PyCObject_Check' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:951: warning: implicit declaration of function 'PyCObject_AsVoidPtr' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:958: warning: implicit declaration of function 'PyErr_Format' /usr/lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h:958: error: 'PyExc_RuntimeError' undeclared (first use in this function) In file included from ./src/cseries.c:1: ./include/c_lib.h: At top level: ./include/c_lib.h:19: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_lib.h:20: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_lib.h:21: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_lib.h:22: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_lib.h:23: error: expected ')' before '*' token ./include/c_lib.h:24: error: expected ')' before '*' token ./include/c_lib.h:26: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_lib.h:28: error: expected ')' before 
'*' token In file included from ./src/cseries.c:2: ./include/c_dates.h:103: error: expected ')' before '*' token ./include/c_dates.h:109: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_dates.h:110: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_dates.h:113: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_dates.h:114: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_dates.h:115: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_dates.h:116: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_dates.h:118: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_dates.h:119: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_dates.h:121: error: expected ')' before '*' token In file included from ./src/cseries.c:3: ./include/c_tseries.h:6: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_tseries.h:8: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_tseries.h:9: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_tseries.h:10: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_tseries.h:11: error: expected '=', ',', ';', 'asm' or '__attribute__' before '*' token ./include/c_tseries.h:13: error: expected ')' before '*' token ./src/cseries.c:5: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'cseries_methods' ./src/cseries.c:55: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'initcseries'
error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/lib/python2.5/site-packages/numpy/core/include/numpy -I./include -I/usr/lib/python2.5/site-packages/numpy/core/include
-I/usr/include/python2.5 -c ./src/cseries.c -o build/temp.linux-i686-2.5/src/cseries.o" failed with exit status 1

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From zelbier at gmail.com Thu Nov 1 14:06:29 2007
From: zelbier at gmail.com (Olivier)
Date: Thu, 01 Nov 2007 18:06:29 -0000
Subject: [SciPy-user] Bug in scipy.sparse
In-Reply-To:
References:
Message-ID: <1193940389.563473.128880@d55g2000hsg.googlegroups.com>

Nobody seems to care much about sparse arrays in scipy....

One reason why I tremendously prefer arrays over matrices is that it is extremely easy to multiply all the rows or columns of a matrix by a vector of coefficients: you just do M*V to multiply column-wise, or M*V.reshape(-1,1) to multiply row-wise. I hate having to create a diagonal matrix for such a simple task.

Is nobody interested in having sparse arrays in scipy?

== Olivier

On Oct 31, 12:01 pm, "Olivier Verdier" wrote:
> The first thing I'm surprised about with scipy.sparse is that it uses the
> matrix type instead of the array type. This is very unfortunate, in my
> opinion.
> The bug is that a sparse matrix won't work correctly with `dot` and `array`:
>
> spmat = speye(3,3)
> vec = array([1,0,0])
>
> dot(spmat, vec)
> returns:
> array([ (0, 0) 1.0
>   (1, 1) 1.0
>   (2, 2) 1.0,
>   (0, 0) 0.0
>   (1, 1) 0.0
>   (2, 2) 0.0,
>   (0, 0) 0.0
>   (1, 1) 0.0
>   (2, 2) 0.0], dtype=object)
>
> Ouch!!
>
> The right result may however be obtained with spmat * vec.
>
> This is a pity because any code that works with arrays or matrices in
> general and uses `dot` to do the matrix vector multiplication will be
> *broken* with sparse matrices.
>
> How reliable is scipy.sparse? Is there any plan to make it more compatible
> with the array type? Behave like the array type? How can I help?
>
> == Olivier
>
> _______________________________________________
> SciPy-user mailing list
> SciPy-u...
at scipy.orghttp://projects.scipy.org/mailman/listinfo/scipy-user From s.mientki at ru.nl Thu Nov 1 14:16:54 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Thu, 01 Nov 2007 19:16:54 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> Message-ID: <472A1816.6060902@ru.nl> Matthieu Brucher wrote: > > > 2007/10/31, Paul Barrett >: > > No, you are not dreaming. I also commented on this possibility > several years ago to some people in the SciPy community. > Unfortunately, I have not had the time or motivation to act on it. It > would certainly be nice to have a basic package to show people. > You'll need to create a set of APIs so that you can connect to > instrumentation. It shouldn't be too hard now that the interfaces > have been standardized. > > > Sorry, it was in fact Vision and Viper I saw in the past as well. Where can I find Viper ? Google hates those common names ;-) > The idea is great but the application, IMHO, is not quite optimal : > - even for the tutorial, the rotate bloc should have two inputs, the > image and the rotation angle (this last factor could be given by the > result of another algorithm) maybe 3, also one for the interpolation method ? I would also make the major controls (for models with not too many controls), in this case the rotation angle directly available with a small slider on the visual component. > - what about multithreading in this model ? > - what about flow control ? (which is really a huge issue that is > lacking in ITK for instance) > - is there a way to create a bunch of blocks easily ? > Does LabView supports these ? > All this has consequences on the interface. Besides, it would be very > great to use Traits as input and output blocks. I Don't know Traits, so can't judge that. 
cheers, Stef From millman at berkeley.edu Thu Nov 1 14:19:52 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Thu, 1 Nov 2007 11:19:52 -0700 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <472A1816.6060902@ru.nl> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <472A1816.6060902@ru.nl> Message-ID: On Nov 1, 2007 11:16 AM, Stef Mientki wrote: > Where can I find Viper ? http://www.scripps.edu/~sanner/python/viper/index.html -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From matthieu.brucher at gmail.com Thu Nov 1 14:56:47 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Thu, 1 Nov 2007 19:56:47 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <472A1816.6060902@ru.nl> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <472A1816.6060902@ru.nl> Message-ID: > > maybe 3, also one for the interpolation method ? > I would also make the major controls (for models with not too many > controls), > in this case the rotation angle directly available with a small slider > on the visual component. Exactly. > - what about multithreading in this model ? > > - what about flow control ? (which is really a huge issue that is > > lacking in ITK for instance) > > - is there a way to create a bunch of blocks easily ? > > Does LabView supports these ? I don't know, but I know that it is what I need and what we want to build more or less in the lab where I'm doing my PhD. The algorithms take very long to run, and there are multiple calculations that can be run in parallel. This induces multithreading, flow control (actually not very complicated once a flow control instruction is thought of as another block) and ease of creation (perhaps a simple xml file that would be parsed to create a meta-class that would then be instantiated).
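Matthieu's parsed-XML-to-class idea can be sketched in plain Python with `type()` — the spec dict, block name, and stub behaviour below are purely illustrative, not part of any existing package:

```python
# Hypothetical block description, as it might come out of a parsed XML file.
block_spec = {"name": "Rotate", "inputs": ["image", "angle"]}

def run(self, **kwargs):
    # A real block would do actual work here; this stub just echoes its inputs.
    return dict((k, kwargs.get(k)) for k in self.inputs)

# Build the block class dynamically with type(name, bases, namespace).
BlockClass = type(block_spec["name"], (object,),
                  {"inputs": block_spec["inputs"], "run": run})

block = BlockClass()
result = block.run(image="lena.png", angle=30)
```

A library of blocks could then be nothing more than a directory of such description files, instantiated on demand.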
And something else is relevent : saving results after each block. -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 -------------- next part -------------- An HTML attachment was scrubbed... URL: From stef.mientki at gmail.com Thu Nov 1 15:04:20 2007 From: stef.mientki at gmail.com (stef mientki) Date: Thu, 01 Nov 2007 20:04:20 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <472A1816.6060902@ru.nl> Message-ID: <472A2334.3080109@gmail.com> Jarrod Millman wrote: > On Nov 1, 2007 11:16 AM, Stef Mientki wrote: > >> Where can I find Viper ? >> > > http://www.scripps.edu/~sanner/python/viper/index.html > > if Vision is Viper : choice = False :-) cheers, Stef From millman at berkeley.edu Thu Nov 1 15:13:04 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Thu, 1 Nov 2007 12:13:04 -0700 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <472A2334.3080109@gmail.com> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <472A1816.6060902@ru.nl> <472A2334.3080109@gmail.com> Message-ID: On Nov 1, 2007 12:04 PM, stef mientki wrote: > Jarrod Millman wrote: > > On Nov 1, 2007 11:16 AM, Stef Mientki wrote: > >> Where can I find Viper ? > >> > > http://www.scripps.edu/~sanner/python/viper/index.html > > > if Vision is Viper : > choice = False > :-) I believe Viper was renamed Vision. You can see the legacy of Viper still in the URL path and it is referenced in their publications at the bottom of the page. 
-- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From mattknox_ca at hotmail.com Thu Nov 1 15:21:06 2007 From: mattknox_ca at hotmail.com (Matt Knox) Date: Thu, 1 Nov 2007 19:21:06 +0000 (UTC) Subject: [SciPy-user] Installing TimeSeries References: Message-ID: > ./include/c_lib.h:4:20: error: Python.h: No such file or directory I am mostly a windows guy, but I *believe* you may need to install some python development package (which is probably available through the standard package repository menu in Ubuntu). Might be called something like "python-devel". If that doesn't work, perhaps one of the many linux folks on this mailing list could help you. Once you get it working, let me know if you need any help using it or have any suggestions, etc. - Matt From stefan at sun.ac.za Thu Nov 1 16:13:02 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 1 Nov 2007 22:13:02 +0200 Subject: [SciPy-user] Bug in scipy.sparse In-Reply-To: <1193940389.563473.128880@d55g2000hsg.googlegroups.com> References: <1193940389.563473.128880@d55g2000hsg.googlegroups.com> Message-ID: <20071101201302.GB14429@mentat.za.net> On Thu, Nov 01, 2007 at 06:06:29PM -0000, Olivier wrote: > Nobody seems to care much about sparse arrays in scipy.... One reason > why i tremendously prefer arrays over matrices is that it is extremely > easy to multiply all the rows or columns of a matrix by a vector of > coefficients. You just do M*V to multiply column-wise or > M*V.reshape(-1,1) to multiply row-wise. > > I hate having to create a diagonal matrix for such a simple task. > > Nobody is interested in having sparse arrays in scipy? It's not that we aren't interested, but someone has to write it. The sparse matrix base classes need to support dot, element-wise multiplication and broadcasting. Then it is a simple matter of subclassing and overriding * to get matrix functionality. 
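The array-style row/column scaling Olivier asks for is ordinary NumPy broadcasting on dense arrays — a small sketch of the behaviour sparse arrays would ideally share (dense arrays only, not the scipy.sparse API of that era):

```python
import numpy as np

M = np.arange(6, dtype=float).reshape(2, 3)  # small 2x3 "matrix"
col_coefs = np.array([10.0, 100.0, 1000.0])  # one coefficient per column
row_coefs = np.array([2.0, 3.0])             # one coefficient per row

# Broadcasting: scale every column j by col_coefs[j], every row i by row_coefs[i].
by_cols = M * col_coefs
by_rows = M * row_coefs.reshape(-1, 1)

# The diagonal-matrix formulation Olivier wants to avoid gives the same result.
assert np.allclose(by_cols, M.dot(np.diag(col_coefs)))
assert np.allclose(by_rows, np.diag(row_coefs).dot(M))
```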
Nathan has really done a lot to improve the underlying engine, but the Python end of things could do with a good refactoring. Cheers St?fan From nmarais at sun.ac.za Fri Nov 2 09:51:13 2007 From: nmarais at sun.ac.za (Neilen Marais) Date: Fri, 2 Nov 2007 13:51:13 +0000 (UTC) Subject: [SciPy-user] Bug in scipy.sparse References: Message-ID: Hi Olivier On Wed, 31 Oct 2007 12:01:50 +0100, Olivier Verdier wrote: > The first thing I'm surprised about with scipy.sparse is that it uses > the matrix type instead of the array type. This is very unfortunate, in > my opinion. > The bug is that a sparse matrix won't work correctly with `dot` and > `array`: > > This is a pity because any code that works with arrays or matrices in > general and uses `dot` to do the matrix vector multiplication will be > *broken* with sparse matrices. Yeah, I agree this is a bit irritating sometimes. OTOH you usually know when you are dealing with sparse matrices since they tend to be fairly specific in their application. > > How reliable is scipy.sparse? Is there any plan to make it more > compatible with the array type? Behave like the array type? How can I > help? I find it to be quite reliable for my work (i.e. WorksForMe), but I've been using spmat.matvec() since the beginning. I use SVN versions of scipy/numpy since the sparse bits seem to improve faster than the release cycles come around. Of course you can always help by reporting bugs and making patches for things you think could be better. There's a lot to be done. The trick is not to be shy, if your patches are OK they'll get applied quickly, and the worst that can happen is that someone will say no ;P Regards Neilen From gael.varoquaux at normalesup.org Fri Nov 2 10:08:21 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Fri, 2 Nov 2007 15:08:21 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? 
In-Reply-To: References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> Message-ID: <20071102140821.GA1517@clipper.ens.fr> On Wed, Oct 31, 2007 at 11:51:07PM +0100, Matthieu Brucher wrote: > 2007/10/31, Paul Barrett <[1]pebarrett at gmail.com>: > No, you are not dreaming. I also commented on this possibility > several years ago to some people in the SciPy community. > Unfortunately, I have not had the time or motivation to act on it. It > would certainly be nice to have a basic package to show people. > You'll need to create a set of APIs so that you can connect to > instrumentation. It shouldn't be too hard now that the interfaces > have been standardized. > Sorry, it was in fact Vision and Viper I saw in the past as well. > The idea is great but the application, IMHO, is not quite optimal : > - even for the tutorial, the rotate bloc should have two inputs, the image > and the rotation angle (this last factor could be given by the result of > another algorithm) > - what about multithreading in this model ? > - what about flow control ? (which is really a huge issue that is lacking > in ITK for instance) > - is there a way to create a bunch of blocks easily ? > All this has consequences on the interface. Besides, it would be very > great to use Traits as input and output blocks. Enthought has a very promising deveolppement in this direction, that uses inspection of a Python script to build the diagram. IMHO it is much more powerful than vision because most of the work is done automaticaly. Look at Eric Jones' presentation at this year's Scipy to read a bit more about it. Vision is really impressive. I am just worried that it is hard to reuse. It is an empire of its own. As a general note on the mailing list, there are a lot of impressive projects, but often they are centered on themselves, and do not provide a good interface and reusable components. 
Recently Prabhu and I have been working on Mayavi2 to make it more reusable, turning it into an engine and widget that can be decoupled from the UI. This kind of approach is very important to avoid duplication of effort. Concerning vision, there are at least 3 or 4 projects doing Visual programming for scientific Python purposes. It tears my heart to see so many different efforts not sharing code, but the reason is simply that each project is hard to reuse. Ah, yes, I agree that Traits would make a perfect basis for building a reusable visual programming model. Traits do need better packaging and deployment, but that is a problem much easier to solve than building a new GUI dialog engine. My 2 cents, Gael From unpingco.jose at gmail.com Fri Nov 2 10:11:14 2007 From: unpingco.jose at gmail.com (Jose Unpingco) Date: Fri, 2 Nov 2007 07:11:14 -0700 Subject: [SciPy-user] communications toolbox for scientific python? Message-ID: Is there anything like the Matlab communications toolbox in scientific python? Otherwise, is there anybody out there interested in putting one together? Thanks. -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthieu.brucher at gmail.com Fri Nov 2 10:11:58 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Fri, 2 Nov 2007 15:11:58 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <20071102140821.GA1517@clipper.ens.fr> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> Message-ID: > > Enthought has a very promising deveolppement in this direction, that uses > inspection of a Python script to build the diagram. IMHO it is much more > powerful than vision because most of the work is done automaticaly. Look > at Eric Jones' presentation at this year's Scipy to read a bit more about > it. Do you have a link to the slides perhaps ?
As a general note on the mailing list, there are a lot of impressive > projects, but often they are centered on themselves, and do not provide a > good interface and reusable components. Recently Prabhu and I have been > working on Mayavi2 to make it more reusable, turning it into an engine > and widget that can be decoupled from the UI. This kind of approach is > very important to avoid duplication of effort. Concerning vision, there > are at least 3 or 4 projects doing Visual programming for scientific > Python purposes. It tears my heart to see that many different efforts not > sharing code, but the reason is simply that each project is hard to > reuse. Exactly. This means a common way of describing the blocks/plugins, a common library to interact wth the user -> traits ?, ... An, yes, I agree that traits would do a perfect basis to build a reusable > visual progamming model. Traits do need a better packaging and > deployement, but that is a problem much easier to solve than build a new > GUI dialog engine. > > My 2cents, > > Gael > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From s.mientki at ru.nl Fri Nov 2 14:31:53 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Fri, 02 Nov 2007 19:31:53 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? 
In-Reply-To: <20071102140821.GA1517@clipper.ens.fr> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> Message-ID: <472B6D19.50409@ru.nl> hi Gael, Gael Varoquaux wrote: > On Wed, Oct 31, 2007 at 11:51:07PM +0100, Matthieu Brucher wrote: > >> 2007/10/31, Paul Barrett <[1]pebarrett at gmail.com>: >> > > >> No, you are not dreaming. I also commented on this possibility >> several years ago to some people in the SciPy community. >> Unfortunately, I have not had the time or motivation to act on it. It >> would certainly be nice to have a basic package to show people. >> You'll need to create a set of APIs so that you can connect to >> instrumentation. It shouldn't be too hard now that the interfaces >> have been standardized. >> > > >> Sorry, it was in fact Vision and Viper I saw in the past as well. >> The idea is great but the application, IMHO, is not quite optimal : >> - even for the tutorial, the rotate bloc should have two inputs, the image >> and the rotation angle (this last factor could be given by the result of >> another algorithm) >> - what about multithreading in this model ? >> - what about flow control ? (which is really a huge issue that is lacking >> in ITK for instance) >> - is there a way to create a bunch of blocks easily ? >> > > >> All this has consequences on the interface. Besides, it would be very >> great to use Traits as input and output blocks. >> > > Enthought has a very promising deveolppement in this direction, that uses > inspection of a Python script to build the diagram. IMHO it is much more > powerful than vision because most of the work is done automaticaly. Look > at Eric Jones' presentation at this year's Scipy to read a bit more about > it. > > Vision is really impressive. I am just worried that it is hard to reuse. > It is an empire of its own. 
> > As a general note on the mailing list, there are a lot of impressive > projects, but often they are centered on themselves, and do not provide a > good interface and reusable components. Recently Prabhu and I have been > working on Mayavi2 to make it more reusable, turning it into an engine > and widget that can be decoupled from the UI. This kind of approach is > very important to avoid duplication of effort. Concerning vision, there > are at least 3 or 4 projects doing Visual programming for scientific > Python purposes. Well I'm very interested to see these projects ! > It tears my heart to see that many different efforts not > sharing code, but the reason is simply that each project is hard to > reuse. > > An, yes, I agree that traits would do a perfect basis to build a reusable > visual progamming model. Traits do need a better packaging and > deployement, but that is a problem much easier to solve than build a new > GUI dialog engine. > > My 2cents, > > Gael > I'm not sure how to react on this message, I think Enthought has done an incredible job, in making the Scipy Enthought Edition, ... ... and probably Traits is remarkable development, but ... ... I don't understand one bit of it !! So it's either much too high brow for me, or it's brought too much high brow. I can't find a few simple examples that I can download, with the guarantee that my current Scipy environment is untouched. I took a very quick look at one of your powerpoints presentations, more than 95% text ?? are we taking about a UI ?? Gael, forget my words, make it more open and easier to step into. cheers, Stef Mientki From anand.prabhakar.patil at gmail.com Fri Nov 2 20:03:25 2007 From: anand.prabhakar.patil at gmail.com (Anand Patil) Date: Fri, 2 Nov 2007 17:03:25 -0700 Subject: [SciPy-user] Memory quirk Message-ID: <2bc7a5a50711021703n6d5e138fq8c10124dec711b77@mail.gmail.com> Hi all, The following script leaks memory when an array with dtype=object is used instead of a list. 
Is this a bug, or a behavior I should be expecting? Any help is much appreciated. Thanks, Anand

from numpy import *
import time
import gc

class MyObject(object):
    def __init__(self):
        self.data = zeros(100000, dtype=float)

        # This leaks memory...
        self.stufflist = array([self], dtype=object)

        # This doesn't
        # self.stufflist = [self]

for i in xrange(10000):
    q=MyObject()
    time.sleep(.01)
    gc.collect()

-------------- next part -------------- An HTML attachment was scrubbed... URL: From anand.prabhakar.patil at gmail.com Fri Nov 2 21:40:23 2007 From: anand.prabhakar.patil at gmail.com (Anand Patil) Date: Fri, 2 Nov 2007 18:40:23 -0700 Subject: [SciPy-user] Another memory quirk Message-ID: <2bc7a5a50711021840r29580f44o793624072f2e2ef3@mail.gmail.com> Hi all, Not sure if this is related to my last post but the script below leaks memory like mad... PLEASE DON'T RUN IT, I don't want to be responsible for crashing your computer! Is there any safe way to store a ravelled view of an array in a subclass like this? I'm using the ravelled view inside a very tight loop & would rather not spend time calling ravel() if possible. Many thanks in advance, Anand

from numpy import *
from PyMC import *

class MyArray(ndarray):
    pass

for i in xrange(1000):
    A=zeros((1000,1000),dtype=float)
    A = A.view(MyArray)

    # If I uncomment this and run it, a huge memory leak ensues.
    # A._ravelled = A.ravel()

-------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant at enthought.com Sat Nov 3 02:55:35 2007 From: oliphant at enthought.com (Travis E. Oliphant) Date: Sat, 03 Nov 2007 01:55:35 -0500 Subject: [SciPy-user] [Fwd: Re: Memory quirk] Message-ID: <472C1B67.1000703@enthought.com> -------------- next part -------------- An embedded message was scrubbed... From: "Travis E. Oliphant" Subject: Re: [SciPy-user] Memory quirk Date: Sat, 03 Nov 2007 01:13:32 -0500 Size: 1391 URL: From oliphant at enthought.com Sat Nov 3 02:57:49 2007 From: oliphant at enthought.com (Travis E.
Oliphant) Date: Sat, 03 Nov 2007 01:57:49 -0500 Subject: [SciPy-user] Another memory quirk In-Reply-To: <2bc7a5a50711021840r29580f44o793624072f2e2ef3@mail.gmail.com> References: <2bc7a5a50711021840r29580f44o793624072f2e2ef3@mail.gmail.com> Message-ID: <472C1BED.4060904@enthought.com> Anand Patil wrote: > Hi all, > > Not sure if this is related to my last post but the script below leaks > memory like mad... PLEASE DON'T RUN IT, I don't want to be responsible > for crashing your computer! > > Is there any safe way to store a ravelled view of an array in a > subclass like this? I'm using the ravelled view inside a very tight > loop & would rather not spend time calling ravel() if possible. > > Many thanks in advance, > Anand Here you have created another cyclic reference. A.ravel() returns a new array object which references the old one (in the .base attribute). In this case you are storing this new array object into the ._ravelled attribute of the old one. Thus, the old MyArray object (the one returned from the view statement) is being referenced indirectly by-itself through its "_ravelled" attribute. The reference count never goes to zero on the MyArray object and these cycles just blossom off each holding on to their original ~8 MB array. You've just shown two good examples where object cycles using arrays can hurt you. It's an important point. It may be possible to add cyclic garbage collection to NumPy objects (and it's sub-classes), but it won't be trivial). In this case, just keep the original array reference and set the _ravelled attribute to a raveled version of the original array: B = zeros((1000,1000), float) A = B.view(MyArray) A._ravelled = B.ravel() This way A is not referencing itself through one of its attributes and will be deleted once A gets re-assigned. 
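Travis's suggestion can be checked directly. A runnable restatement (using the modern `np.` namespace; the star imports and the PyMC import from the original script are not needed) — `_ravelled` now points at the base ndarray rather than at the subclass instance, so no reference cycle is created:

```python
import numpy as np

class MyArray(np.ndarray):
    pass

B = np.zeros((1000, 1000), dtype=float)  # plain ndarray, kept as the base
A = B.view(MyArray)
A._ravelled = B.ravel()  # view of B, not of A: A never references itself

# The ravelled attribute still shares memory with B, so it remains a cheap view.
A._ravelled[0] = 42.0
assert B[0, 0] == 42.0
assert np.shares_memory(A._ravelled, B)
```

Once `A` is re-assigned, nothing references it any more and its memory is freed immediately by reference counting, without needing the cyclic garbage collector.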
-Travis > > > from numpy import * > from PyMC import * > > class MyArray(ndarray): > pass > > for i in xrange(1000): > > A=zeros((1000,1000),dtype=float) > > A = A.view(MyArray) > > # If I uncomment this and run it, a huge memory leak ensues. > # A._ravelled = A.ravel() > > ------------------------------------------------------------------------ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From gael.varoquaux at normalesup.org Sat Nov 3 10:25:35 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sat, 3 Nov 2007 15:25:35 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> Message-ID: <20071103142535.GD1956@clipper.ens.fr> On Fri, Nov 02, 2007 at 03:11:58PM +0100, Matthieu Brucher wrote: > Look at Eric Jones' presentation at this year's Scipy to read a > bit more about it. > Do you have a link to the slides perhaps ? Sorry, I was on the road, with very poor internet access, so I wrote the e-mail a bit quickly. The link is http://scipy.org/SciPy2007/TutorialMaterials. Ga?l From gael.varoquaux at normalesup.org Sat Nov 3 10:52:33 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sat, 3 Nov 2007 15:52:33 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <472B6D19.50409@ru.nl> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> Message-ID: <20071103145233.GE1956@clipper.ens.fr> On Fri, Nov 02, 2007 at 07:31:53PM +0100, Stef Mientki wrote: > > Concerning vision, there are at least 3 or 4 projects doing Visual > > programming for scientific Python purposes. > Well I'm very interested to see these projects ! OK, now I have to name them, huho. 
So we have Vision, which is indeed impressive, but really built around custom tools. It would be a large amount of work to abstract the visual programming part out of the rest, but it is probably possible. Then we have Orange, which does not seem to be fully written in Python, but certainly uses Python (http://www.ailab.si/orange/screenshots.asp) Elefant also uses a visual programming engine. See http://scipy.org/SciPy2007/Presentations for Christfried Webers' slides on Elefant, or their main webpage ( http://elefant.developer.nicta.com.au/ ). I talked to Christfried at SciPy, and he definitely seemed open to collaborations. And we have Enthought's approach. If you want a glimpse of the UI, you can look at page 24 of Eric's slides. The two interesting things about Enthought's approach are that they reuse their know-how in layout engines to generate the dialogs (I am not too sure it is pure Traits, but it seems linked), and, most interesting, that they use standard Python functions to build their visual blocks, and do not need descriptions of input and output. Obviously this imposes a lot of restrictions on the way the functions behave (basically pure functional-like behavior, where side effects are banned), and last time I talked with them, they had problems adding flow control to this. But they are using this engine in a commercial app, which means that there is money going into it to improve it, so it is definitely worth a look. Gaël From nmarais at sun.ac.za Sat Nov 3 11:50:42 2007 From: nmarais at sun.ac.za (Neilen Marais) Date: Sat, 3 Nov 2007 15:50:42 +0000 (UTC) Subject: [SciPy-user] communications toolbox for scientific python? References: Message-ID: Hi, On Fri, 02 Nov 2007 07:11:14 -0700, Jose Unpingco wrote: > Is there anything like the Matlab communications toolbox in scientific > python? > > Otherwise, is there anybody out there interested in putting one > together? I'm not personally familiar with what the Matlab comms toolbox can do.
I think a good contribution would be to describe the functionality that exists in that toolbox in a way that doesn't reference Matlab. The scipy wiki would no doubt be a good place to do that. It would then be much easier for individual bits to get implemented. I might even find that some of those bits overlap with my needs, giving my selfish little self incentive to work on them :) Regards Neilen Regards Neilen > http://projects.scipy.org/mailman/listinfo/scipy-user From anand.prabhakar.patil at gmail.com Sat Nov 3 13:31:20 2007 From: anand.prabhakar.patil at gmail.com (Anand Patil) Date: Sat, 3 Nov 2007 10:31:20 -0700 Subject: [SciPy-user] Memory quirk Message-ID: <2bc7a5a50711031031m486818bua66d34eb2fdea3c3@mail.gmail.com> Hi Travis, thanks for your help. I think I get it but your response to my first posting didn't come through, would you mind resending? Also, if possible it would be nice to have this information in Guide to NumPy. I had no idea numpy did its own garbage collecting, so I had no clue where to start looking. Cheers Anand ------------------------------ > > Message: 4 > Date: Sat, 03 Nov 2007 01:55:35 -0500 > From: "Travis E. Oliphant" > Subject: [SciPy-user] [Fwd: Re: Memory quirk] > To: SciPy Users List > Message-ID: <472C1B67.1000703 at enthought.com> > Content-Type: text/plain; charset="iso-8859-1" > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant at enthought.com Sat Nov 3 15:52:05 2007 From: oliphant at enthought.com (Travis E. Oliphant) Date: Sat, 03 Nov 2007 14:52:05 -0500 Subject: [SciPy-user] Memory quirk In-Reply-To: <2bc7a5a50711021703n6d5e138fq8c10124dec711b77@mail.gmail.com> References: <2bc7a5a50711021703n6d5e138fq8c10124dec711b77@mail.gmail.com> Message-ID: <472CD165.8000800@enthought.com> Anand Patil wrote: > Hi all, > > The following script leaks memory when an array with dtype=object is > used instead of a list. Is this a bug, or a behavior I should be > expecting? 
Any help is much appreciated. It's not really a bug, just an unsupported use of NumPy object arrays which do not participate in Python's cyclic garbage collector. To add support would require more re-factoring than it has been worth to anybody at this point. It would require handling object array allocations differently so that they participate in the garbage collection strategy and it may not be possible (I'm not sure if an object can sometimes have its memory allocated using its own allocators and sometimes have them allocated using PyObject_GC_New and still participate in the garbage collection cycle). -Travis From s.mientki at ru.nl Sat Nov 3 18:52:52 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Sat, 03 Nov 2007 23:52:52 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <20071103145233.GE1956@clipper.ens.fr> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> Message-ID: <472CFBC4.1090802@ru.nl> Gael Varoquaux wrote: > On Fri, Nov 02, 2007 at 07:31:53PM +0100, Stef Mientki wrote: > >>> Concerning vision, there are at least 3 or 4 projects doing Visual >>> programming for scientific Python purposes. >>> > > >> Well I'm very interested to see these projects ! >> > > OK, now I have to name them, huho. Thanks very much Gael, it's very good to have such an overview. > > > So we have Vision, that is indeed impressive, but really built around > custom tools. It would be a large amount of work to abstract out the > visual programming part of the rest, but it is probably possible. > Yes I took a quick look, and it doesn't seem easy to extract the Vision part from the total package.
> Then we have orange, that dies no seem to be fully written in Python, but > certainly uses Python (http://www.ailab.si/orange/screenshots.asp) > > Looks nice, but is very dedicated to statistics (and Qt-based :-( , so I don't know if we can build a hardware control loop in it. If you get more library modules, it will be hard to manage them. > Elefant also uses a visual programming engine. see > http://scipy.org/SciPy2007/Presentations for Christfried Webers' slides > on elefant, or their main webpage ( > http://elefant.developer.nicta.com.au/ ). I talked to Christfried at > scipy, and he definitly seemed open for collaborations. > Looks very nice, also statistical oriented, graphing is almost the same as ogl-like.py. Not yet suited for windows :-( > And we have enthought's approach. If you want to have a glimpse at the > UI, you can look on page 24 of Eric's slides. couldn't find that :-( > The two interesting things > about enthought's approach is that they reuse their knowhow in layout > engine to generate the dialogs (I am not to sure it is pure traits, but it > seems linked), and, most interesting, that they use standard python > functions to build their visual blocks, and do not need descriptions of > input and output. Obviously this imposes a lot of restriction on the way > the functions behave (basically pure functionnal-like behavior, where > side effects are banned), and last time I talked with them, they had > problems adding flow control to this. But they are using this engine in a > commercial app, which means that there is money going in it to improve > it, so it is definetaly worth the look. > I don't think any of the mentioned packages is capable of (pseudo-) realtime programming, so there might be room for one more package ;-) cheers, Stef From gael.varoquaux at normalesup.org Sat Nov 3 19:07:01 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sun, 4 Nov 2007 00:07:01 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? 
In-Reply-To: <472CFBC4.1090802@ru.nl> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> <472CFBC4.1090802@ru.nl> Message-ID: <20071103230701.GL6725@clipper.ens.fr> On Sat, Nov 03, 2007 at 11:52:52PM +0100, Stef Mientki wrote: > I don't think any of the mentioned packages is capable of (pseudo-) > realtime programming, What do you mean by that. Reaction to the changes as they happen, a bit like in labview. I think enthought's pacakage would be the one to look at, as a reactive interaction with the user was one of their design goals. > so there might be room for one more package ;-) No, you mean, it would be interesting to extend an existing package :->. Ga?l From s.mientki at ru.nl Sat Nov 3 20:37:28 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Sun, 04 Nov 2007 01:37:28 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <20071103230701.GL6725@clipper.ens.fr> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> <472CFBC4.1090802@ru.nl> <20071103230701.GL6725@clipper.ens.fr> Message-ID: <472D1448.2090901@ru.nl> Gael Varoquaux wrote: > On Sat, Nov 03, 2007 at 11:52:52PM +0100, Stef Mientki wrote: > >> I don't think any of the mentioned packages is capable of (pseudo-) >> realtime programming, >> > > What do you mean by that. Reaction to the changes as they happen, a bit > like in labview. I think enthought's pacakage would be the one to look > at, as a reactive interaction with the user was one of their design > goals. > What I mean, is for "realtime" control loops. 
Let's say I sample data at something like 20 kHz, want to perform some calculations on it, display it and feed the manipulated data back to my hardware, how am I going to let the data flow: - sample by sample, much too slow - chunking, that will work, but do these environments support that ? I do the latter in Scipy, but with a Delphi GUI ;-) , that works quite well. In the radio world there seems to be an even much faster approach (real time demodulation). >> so there might be room for one more package ;-) >> > > No, you mean, it would be interesting to extend an existing package :->. > It's always funny to see that "others" know better what "I" mean ;-) Well Yes and No, seeing the mentioned examples, these are my quick thoughts (and it's not all about taste): - I like the Elefant / SciCos graphical view - I like the rectangle connections of Vision - Libraries should be organized in a tree, which consists of an auto-organizing part and a user-configurable part; this tree should have full drag and drop and be fully editable - Dataflow should be selectable per block: sample-based, chunked or sequential - Each block should be directly editable, which will automatically create a new device (unless ..) - A block can be created by pure code, flow chart, graphical state machine or any combination of these, but also from an electrical circuit diagram, a pneumatic circuit, a physical design or a virtual world design. - In sequential mode, partial execution should be possible, which means that all intermediate results should be stored - The user interface of the final application should be controlled completely, independent of the functional program - Not the deep nesting of LabView, everything should be available in at most 2 clicks - Code snippets should be available in a user-organizable way ok, I'm dreaming again ;-) but on the other hand, most of the parts are available somewhere in Python !
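Stef's "chunked" dataflow option is easy to sketch: blocks become generators that consume and produce fixed-size chunks, so the per-sample Python overhead is paid once per chunk instead of once per sample. A purely illustrative sketch (the block names and chunk size are invented here, not the API of any existing package):

```python
# Toy chunked pipeline: each block consumes and yields fixed-size
# chunks; hypothetical interface, not any real package's API.
CHUNK = 512  # samples per chunk (~25 ms at 20 kHz)

def source(n_chunks):
    # stand-in for a hardware acquisition block
    for i in range(n_chunks):
        yield [float(i * CHUNK + j) for j in range(CHUNK)]

def gain(chunks, g):
    # a per-chunk processing block
    for c in chunks:
        yield [g * x for x in c]

def sink(chunks):
    # stand-in for the block feeding data back to hardware
    out = []
    for c in chunks:
        out.extend(c)
    return out

data = sink(gain(source(4), 2.0))
```

Because everything flows a chunk at a time, display and feedback blocks can be slotted into the same chain without ever buffering the whole signal.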
cheers, Stef From gael.varoquaux at normalesup.org Sat Nov 3 20:52:13 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sun, 4 Nov 2007 01:52:13 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <472D1448.2090901@ru.nl> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> <472CFBC4.1090802@ru.nl> <20071103230701.GL6725@clipper.ens.fr> <472D1448.2090901@ru.nl> Message-ID: <20071104005213.GA17516@clipper.ens.fr> On Sun, Nov 04, 2007 at 01:37:28AM +0100, Stef Mientki wrote: > What I mean, is for "realtime" control loops. > Let's say I sample data at something like 20 kHz, > want to perform some calculations on it, > display it and feed the manipulated data back to my hardware, > how am I going to let the data flow: > - sample by sample, much too slow > - chunking, that will work, but do these environments support that ? > I do the latter in Scipy, but with a Delphi GUI ;-) , that works quite well. > In the radio world there seems to be an even much faster approach (real > time demodulation). I have done this kind of stuff, and I don't think the current status of the different Python visual programming engines is good enough to do this, unless you want really low timescales for your feedback loop. I am not saying it is impossible, I am just saying there is a lot of work to be done. The problem is that unless your time scales are really slow, you will want some reasonably fine control on what is happening in your loop.
My 2 cents, Ga?l From mark at mitre.org Sat Nov 3 21:08:45 2007 From: mark at mitre.org (Mark Heslep) Date: Sat, 03 Nov 2007 21:08:45 -0400 Subject: [SciPy-user] nullclines for nonlinear ODEs In-Reply-To: <88e473830710300926l66c350c3k72b7fda9e6fd5c31@mail.gmail.com> References: <88e473830710300926l66c350c3k72b7fda9e6fd5c31@mail.gmail.com> Message-ID: <472D1B9D.5050003@mitre.org> John Hunter wrote: > In a workshop on scientific computing in python I ran with Andrew > Straw last weekend at the Claremont Colleges, I presented a unit on > ODEs, numerically integrating the Lotka-Volterra equations, drawing a > direction field, plotting a trajectory in the phase plane, and > plotting the nullclines. For the nullclines, I used matplotlib's > contouring routine with levels=[0], but I was wondering if there was a > better/more direct route. > > Anyone care to comment on scipy's useability for solving ODEs versus, say, Mathematica or Matlab? I'm not very familiar w/ either of the 'M' packages, but may have them forced on me soon unless I'm able to say, no, we are better off w/ scipy for user metric 1, 2, ... If scipy comes up short in some way, sign me up as able and willing to help. Apologies for hijacking your thread, your example is perfect. Mark From mark at mitre.org Sat Nov 3 22:34:08 2007 From: mark at mitre.org (Mark Heslep) Date: Sat, 03 Nov 2007 22:34:08 -0400 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: References: Message-ID: <472D2FA0.3010001@mitre.org> > > The animation, all figures and all calculations were done in Python > (+Scilab+Pylab), but obviously this is not a programming paper and Python > is a research tool here (which I think is the goal of the creators of the > wonderful language and libraries). Unfortunately there is only code for > Matlab - just a few one-liners. In HRV (heart rate variability) research > most people use Matlab and hardly anyone even knows about Python. 
I am > trying to advertise Python, but the paper would not have been accepted with > code in Python. > > Of course the paper clearly says that the calculations were done in Python. > > Jarek > > Given that you felt for the sake of publication you must use Matlab, was it worth it then to use scipy to begin with? That is, if you knew you would need to convert to Matlab, was scipy worth the trouble? From evanlezar at gmail.com Sun Nov 4 04:37:42 2007 From: evanlezar at gmail.com (Evan Lezar) Date: Sun, 4 Nov 2007 11:37:42 +0200 Subject: [SciPy-user] resizing a dok_matrix Message-ID: Hi I have been using a dok_matrix from scipy 0.5.3 to implement re-sizable sparse matrices of which the final size is not known initially. This was done by adjusting the shape property of an object inherited from the dok_matrix class. My problem is that in scipy 0.6 the dok_matrix is no longer re-sizable in this way. Is there a specific reason for this? Any suggestions will be welcomed. Thanks Evan -- visit http://randomestrandom.blogspot.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From samrobertsmith at gmail.com Sun Nov 4 05:08:21 2007 From: samrobertsmith at gmail.com (linda.s) Date: Sun, 4 Nov 2007 02:08:21 -0800 Subject: [SciPy-user] about MLab Message-ID: <1d987df30711040208l53e36652q909252d34ed6fcd1@mail.gmail.com> I have quite a few codes written in Numeric. MLab comes with Numeric too. I got confused when I began to use numpy because it does not have MLab and does not have some same functions in Numeric. is there any smooth way to change the code developed in Numeric to numpy? 
Thanks, Linda From C.J.Lee at tnw.utwente.nl Sun Nov 4 06:06:31 2007 From: C.J.Lee at tnw.utwente.nl (C.J.Lee at tnw.utwente.nl) Date: Sun, 4 Nov 2007 12:06:31 +0100 Subject: [SciPy-user] about MLab References: <1d987df30711040208l53e36652q909252d34ed6fcd1@mail.gmail.com> Message-ID: <9E7F7554631C0047AB598527A4E9569B6F8102@tnx4.dynamic.tnw.utwente.nl> Some Numeric functionality can be obtained as follows: The old Numeric routine names and behavior for functions in the add-on packages UserArray, Matrix, MLab, LinearAlgebra, FFT, RandomArray, RNG and RNG.Statistics have been moved to numpy.oldnumeric.<module>, where <module> is user_array, matrix, mlab, linear_algebra, fft, random_array, rng, and rng_stats respectively. Hope this helps Cheers Chris -----Original Message----- From: scipy-user-bounces at scipy.org on behalf of linda.s Sent: Sun 11/4/2007 11:08 AM To: SciPy Users List Subject: [SciPy-user] about MLab I have quite a few codes written in Numeric. MLab comes with Numeric too. I got confused when I began to use numpy because it does not have MLab and does not have some of the same functions as Numeric. is there any smooth way to change the code developed in Numeric to numpy?
Thanks, Linda _______________________________________________ SciPy-user mailing list SciPy-user at scipy.org http://projects.scipy.org/mailman/listinfo/scipy-user From samrobertsmith at gmail.com Sun Nov 4 06:33:04 2007 From: samrobertsmith at gmail.com (linda.s) Date: Sun, 4 Nov 2007 03:33:04 -0800 Subject: [SciPy-user] about MLab In-Reply-To: <9E7F7554631C0047AB598527A4E9569B6F8102@tnx4.dynamic.tnw.utwente.nl> References: <1d987df30711040208l53e36652q909252d34ed6fcd1@mail.gmail.com> <9E7F7554631C0047AB598527A4E9569B6F8102@tnx4.dynamic.tnw.utwente.nl> Message-ID: <1d987df30711040333w5aaa7fe1r49fa6ccd109d6f87@mail.gmail.com> >>> numpy.oldnumeric.mlab Traceback (most recent call last): File "<stdin>", line 1, in <module> AttributeError: 'module' object has no attribute 'oldnumeric' On Nov 4, 2007 3:06 AM, <C.J.Lee at tnw.utwente.nl> wrote: > Some Numeric functionality can be obtained as follows: > > The old Numeric routine names and behavior for functions in the add-on packages UserArray, Matrix, MLab, LinearAlgebra, FFT, RandomArray, RNG and RNG.Statistics have been moved to numpy.oldnumeric.<module>, where <module> is user_array, matrix, mlab, linear_algebra, fft, random_array, rng, and rng_stats respectively > > Hope this helps > Cheers > Chris > > > > -----Original Message----- > From: scipy-user-bounces at scipy.org on behalf of linda.s > Sent: Sun 11/4/2007 11:08 AM > To: SciPy Users List > Subject: [SciPy-user] about MLab > > I have quite a few codes written in Numeric. > MLab comes with Numeric too. > I got confused when I began to use numpy because it does not have MLab > and does not have some of the same functions as Numeric. > is there any smooth way to change the code developed in Numeric to numpy?
Thanks, > Linda > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From jaropis at gazeta.pl Sun Nov 4 06:50:36 2007 From: jaropis at gazeta.pl (jaropis) Date: Sun, 04 Nov 2007 12:50:36 +0100 Subject: [SciPy-user] Subject: Re: Scientific Python publications References: <472D2FA0.3010001@mitre.org> Message-ID: > Given that you felt for the sake of publication you must use Matlab, was > it worth it then to use scipy to begin with? That is, if you knew you > would need to convert to Matlab, was scipy worth the trouble? A very good question. I use scipy because my research is multidisciplinary and multicenter. This means that I have to write code that will be used in various places all around the world. I cannot expect all my friends to buy Matlab. The second reason is that I do not own a copy of Matlab - I used the copy owned by my University only to test the code. If I had wanted to use Matlab for this paper I would have had to spend weeks at the University. Using Python I could work at home. The third reason is probably the most important: Python is a real programming language and therefore it is much more convenient. I get medical time series in various formats, usually some kind of text files, and python is great at processing them and unifying all the formats I come across to the single format which is used by my software. Everyone who has ever tried to process gigabytes of text files with Matlab knows that this is extremely difficult. It is also much easier to write small scripts that have a specific function and send them over the net to those who need them.
As I said - the use of Matlab was necessary to include the code for the various descriptors introduced in the paper - unfortunately Scipy code would not have been accepted (I know this for a fact). The Python software used in this paper was much more complex than the Matlab one-liners - I had to process the recordings and do the calculations. Also, I had to write a separate program to produce the animation. Of course this software was not converted to Matlab - just the mathematical expressions. Jarek From matthieu.brucher at gmail.com Sun Nov 4 06:58:15 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Sun, 4 Nov 2007 12:58:15 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <472D1448.2090901@ru.nl> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> <472CFBC4.1090802@ru.nl> <20071103230701.GL6725@clipper.ens.fr> <472D1448.2090901@ru.nl> Message-ID: > > > What do you mean by that. Reaction to the changes as they happen, a bit > > like in labview. I think enthought's package would be the one to look > > at, as a reactive interaction with the user was one of their design > > goals. > What I mean, is for "realtime" control loops. > Let's say I sample data at something like 20 kHz, > want to perform some calculations on it, > display it and feed the manipulated data back to my hardware, > how am I going to let the data flow: > - sample by sample, much too slow > - chunking, that will work, but do these environments support that ? > I do the latter in Scipy, but with a Delphi GUI ;-) , that works quite > well. > In the radio world there seems to be an even much faster approach (real > time demodulation). > This needs, IMHO, a special package. Usual algorithms are not realtime capable.
Real-time arises when there is a time notion, and this needs the use of chunks and thus a special interface. For instance in audio, the standard is hard to use, and there are a lot of pitfalls when implementing a host that respects the different latencies. Matthieu -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From C.J.Lee at tnw.utwente.nl Sun Nov 4 07:06:23 2007 From: C.J.Lee at tnw.utwente.nl (C.J.Lee at tnw.utwente.nl) Date: Sun, 4 Nov 2007 13:06:23 +0100 Subject: [SciPy-user] about MLab References: <1d987df30711040208l53e36652q909252d34ed6fcd1@mail.gmail.com><9E7F7554631C0047AB598527A4E9569B6F8102@tnx4.dynamic.tnw.utwente.nl> <1d987df30711040333w5aaa7fe1r49fa6ccd109d6f87@mail.gmail.com> Message-ID: <9E7F7554631C0047AB598527A4E9569B6F8103@tnx4.dynamic.tnw.utwente.nl> I would guess that means you have an incomplete install of numpy. Check that in the site-packages directory that numpy/oldnumeric/mlab.py exists. If it doesn't then you need to reinstall numpy I can't check on my linux or windows machine right now, but it certainly works on my Mac OS X box Cheers Chris -----Original Message----- From: scipy-user-bounces at scipy.org on behalf of linda.s Sent: Sun 11/4/2007 12:33 PM To: SciPy Users List Subject: Re: [SciPy-user] about MLab >>> numpy.oldnumeric.mlab Traceback (most recent call last): File "", line 1, in AttributeError: 'module' object has no attribute 'oldnumeric' On Nov 4, 2007 3:06 AM, wrote: > Some Numeric functionality can be obtained as follows: > > The old Numeric routine names and behavior for functions in the add-on packages UserArray, Matrix, MLab, LinearAlgebra, FFT, RandomArray, RNG and RNG.Statistics have been moved to numpy.oldnumeric. 
where is user_array, matrix, mlab, linear_algebra, fft, random_array, rng, and rng_stats respectively > > Hope this helps > Cheers > Chris > > > > -----Original Message----- > From: scipy-user-bounces at scipy.org on behalf of linda.s > Sent: Sun 11/4/2007 11:08 AM > To: SciPy Users List > Subject: [SciPy-user] about MLab > > I have quite a few codes written in Numeric. > MLab comes with Numeric too. > I got confused when I began to use numpy because it does not have MLab > and does not have some same functions in Numeric. > is there any smooth way to change the code developed in Numeric to numpy? > Thanks, > Linda > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > _______________________________________________ SciPy-user mailing list SciPy-user at scipy.org http://projects.scipy.org/mailman/listinfo/scipy-user From matthieu.brucher at gmail.com Sun Nov 4 07:57:33 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Sun, 4 Nov 2007 13:57:33 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <472D1448.2090901@ru.nl> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> <472CFBC4.1090802@ru.nl> <20071103230701.GL6725@clipper.ens.fr> <472D1448.2090901@ru.nl> Message-ID: > > - A block can be created by pure code, flow chart, graphical state > machine or any combination of these, > but also from an electrical circuit diagram, a pneumatic circuit, a > physical design or a virtual world design. > The Simulink twin in Python ? 
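Matthieu's "Simulink twin" quip hints at how small the core of such a tool is: a table of blocks with named input and output wires, evaluated in dependency order. A toy sketch (all block and wire names here are made up for illustration, not an existing package's API):

```python
# Each block: (function, input wire names, output wire name).
# Hypothetical design sketch of Simulink-style wiring.
blocks = {
    "sum":  (lambda a, b: a + b, ("in1", "in2"), "s"),
    "gain": (lambda s: 3 * s,    ("s",),         "out"),
}

def run(signals, order):
    # evaluate blocks in (pre-sorted) dependency order,
    # writing each block's result onto its output wire
    for name in order:
        fn, inputs, output = blocks[name]
        signals[output] = fn(*(signals[i] for i in inputs))
    return signals

sig = run({"in1": 2, "in2": 5}, ["sum", "gain"])
```

A real tool would topologically sort the wiring instead of taking `order` by hand, but the underlying data model is the same.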
-- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthieu.brucher at gmail.com Sun Nov 4 08:01:02 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Sun, 4 Nov 2007 14:01:02 +0100 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: References: <472D2FA0.3010001@mitre.org> Message-ID: > > As I said - the use of Matlab was necessary to include the code for the > various descriptors introduced in the paper - unfortunately Scipy code > would not have been accepted (I know this for a fact). The Python software > used in this paper was much more complex than the Matlab one-liners - I > had > to process the recordings and do the calculations. Also, I had to write a > separate program to produce the animation. Of course this software was > not > converted to Matlab - just the mathematical expressions. > In my field, the same problem arises with a Matlab toolbox, SPM. You can't write an article without using this toolbox, especially if you want to publish it in NeuroImage (the editor in chief is one of the writers of the toolbox). Matthieu -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From et.gaudrain at free.fr Sun Nov 4 08:03:13 2007 From: et.gaudrain at free.fr (Etienne Gaudrain) Date: Sun, 04 Nov 2007 14:03:13 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ?
In-Reply-To: References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> <472CFBC4.1090802@ru.nl> <20071103230701.GL6725@clipper.ens.fr> <472D1448.2090901@ru.nl> Message-ID: <472DC311.40606@free.fr> Hi, I'm a Python newbie, and I'm not particularly familiar with LabView... However, I know a soft/language that could be interesting to mention in this discussion. The FAUST language is a functional language for audio processing. The language describes a block diagram (that can be generated in SVG from the code). One particularity is that the "FAUST compiler" reduces the process (using lambda-calculus) and then translates it in C++. In addition, the C++ code can then be wrapped with a GUI wrapper (GTK, wx...) and a sound system wrapper (ALSA, OSS...). The goal is cleary to interact with users during realtime processing. I thought a few months ago that it could be just wonderful to be able to use FAUST code in Python just like done with Weave. The developers of FAUST have developed an online version so you can test it here: http://faust.grame.fr/. -Etienne Matthieu Brucher a ?crit : > > > What do you mean by that. Reaction to the changes as they > happen, a bit > > like in labview. I think enthought's pacakage would be the one > to look > > at, as a reactive interaction with the user was one of their design > > goals. > What I mean, is for "realtime" control loops. > Let's say I sample data at something like 20 kHz, > want to perform some calculations on it, > display it and feed the manipulated data back to my hardware, > how am I going to let the data flow: > - sample by sample, much too slow > - chunking, that will work, but do these environments support that ? > I do the latter in Scipy, but with a Delphi GUI ;-) , that works > quit well. 
> In the radio world there seems to be a even much faster approach (real > time demodulation). > > > This needs, IMHO, a special package. Usual algorithms are not realtime > capable. > Real-time arises when there is a time notion, and this needs the use > of chunks and thus a special interface. For instance in audio, the > standard is hard to use, and there are a lot of pitfalls when > implementing a host that respects the different latencies. > > Matthieu > > -- > French PhD student > Website : http://miles.developpez.com/ > Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 > LinkedIn : http://www.linkedin.com/in/matthieubrucher > ------------------------------------------------------------------------ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Etienne Gaudrain Universite Claude Bernard LYON 1 CNRS - UMR5020, Neurosciences Sensorielles, Comportement, Cognition 50, avenue Tony Garnier 69366 LYON Cedex 07 FRANCE T?l : 04 37 28 74 85 Fax : 04 37 28 76 01 Page web equipe : http://olfac.univ-lyon1.fr/unite/equipe-02/equipe-2-f.html ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From aisaac at american.edu Sun Nov 4 08:12:34 2007 From: aisaac at american.edu (Alan G Isaac) Date: Sun, 4 Nov 2007 08:12:34 -0500 Subject: [SciPy-user] about MLab In-Reply-To: <1d987df30711040333w5aaa7fe1r49fa6ccd109d6f87@mail.gmail.com> References: <1d987df30711040208l53e36652q909252d34ed6fcd1@mail.gmail.com><9E7F7554631C0047AB598527A4E9569B6F8102@tnx4.dynamic.tnw.utwente.nl><1d987df30711040333w5aaa7fe1r49fa6ccd109d6f87@mail.gmail.com> Message-ID: On Sun, 4 Nov 2007, "linda.s" apparently wrote: > >>> numpy.oldnumeric.mlab > Traceback (most recent call last): > File "", line 1, in > AttributeError: 'module' object has no attribute > 'oldnumeric' >>> from numpy.oldnumeric import mlab hth, Alan Isaac From matthieu.brucher at gmail.com Sun Nov 4 08:23:15 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Sun, 4 Nov 2007 14:23:15 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <472CFBC4.1090802@ru.nl> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> <472CFBC4.1090802@ru.nl> Message-ID: > > > Elefant also uses a visual programming engine. see > > http://scipy.org/SciPy2007/Presentations for Christfried Webers' slides > > on elefant, or their main webpage ( > > http://elefant.developer.nicta.com.au/ ). I talked to Christfried at > > scipy, and he definitly seemed open for collaborations. > > > Looks very nice, also statistical oriented, graphing is almost the same > as ogl-like.py. > Not yet suited for windows :-( > Why is that ? If this is because no binaries are available for Windows, this can be done. Matthieu -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From s.mientki at ru.nl Sun Nov 4 09:31:24 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Sun, 04 Nov 2007 15:31:24 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> <472CFBC4.1090802@ru.nl> <20071103230701.GL6725@clipper.ens.fr> <472D1448.2090901@ru.nl> Message-ID: <472DD7BC.4070104@ru.nl> Matthieu Brucher wrote: > > - A block can be created by pure code, flow chart, graphical state > machine or any combination of these, > but also from an electrical circuit diagram, a pneumatic circuit, a > physical design or a virtual world design. > > > The Simulink twin in Python ? Simulinh AND PowerSim ;-) Stef From s.mientki at ru.nl Sun Nov 4 10:10:21 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Sun, 04 Nov 2007 16:10:21 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> <472CFBC4.1090802@ru.nl> Message-ID: <472DE0DD.6090900@ru.nl> Matthieu Brucher wrote: > > > Elefant also uses a visual programming engine. see > > http://scipy.org/SciPy2007/Presentations for Christfried Webers' > slides > > on elefant, or their main webpage ( > > http://elefant.developer.nicta.com.au/ ). I talked to Christfried at > > scipy, and he definitly seemed open for collaborations. > > > Looks very nice, also statistical oriented, graphing is almost the > same > as ogl-like.py. > Not yet suited for windows :-( > > > Why is that ? If this is because no binaries are available for > Windows, this can be done. 
Probably yes, but not within my capabilities ;-) cheers, Stef From unpingco.jose at gmail.com Sun Nov 4 10:13:06 2007 From: unpingco.jose at gmail.com (Jose Unpingco) Date: Sun, 4 Nov 2007 07:13:06 -0800 Subject: [SciPy-user] Scipy + Vision = LabView Message-ID: All: I put the tutorial together, and I plan to do more along these lines. My interest is in using VISION as a framework for a detailed communications simulation, although up to this point, VISION has been primarily used as a tool in chemistry and biology. I plan to write certain communications and signal processing components based upon scipy and integrate them into VISION as a distinct block set (i.e. component library). The developers of VISION are situated nearby and I am looking forward to that collaboration. Progress will be announced on the VISION mailing list (http://mgldev.scripps.edu/mailman/listinfo/vision) and on my blog (http://www.osc.edu/blogs/index.php/sip), but not on this list to avoid polluting it. As always, I appreciate your criticism. --Jos? From s.mientki at ru.nl Sun Nov 4 10:20:22 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Sun, 04 Nov 2007 16:20:22 +0100 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: References: <472D2FA0.3010001@mitre.org> Message-ID: <472DE336.5060007@ru.nl> Matthieu Brucher wrote: > > As I said - the use of Matlab was necessary to include the code > for the > various descriptors introduced in the paper - unfortunately Scipy > code > would not have been accepted (I know this for a fact). The Python > software > used in this paper was much more complex than the Matlab > one-liners - I had > to process the recordings and do the calculations. Also, I had to > write a > speparate program to produce the animation. Of course this > software was not > converted to Matlab - just the mathematical expressions. > > > In my field, the same problem arises with a Matlab toolbox, SPM. 
You > can't write an article without using this toolbox, especially if you > want to publish it in NeuroImage (the editor in chief is one of the > writter of the toolbox). This is really SAD: you're doing medical research, investigating and describing some physiological phenomena, well I can assume they are interested what algorithm you used, to verify your conclusions, but it's absolutely absurd, they make it obligatory that you write your manuscript with a pen of a certain brand !! It's even more absurd, if you realize that in MatLab the algorithms are unknown, so you can't verify them in case of weird results. ( I had a few a couple of years ago, and the answer of MathWorks was "try to get a job at MathWorks" ;-) But rethinking this attitude, maybe it's not absurd at all: because there are unknown errors in MatLab, they are unable to verify your results if you did the analysis in SciPy ;-) ;-) cheers, Stef Mientki From matthew.brett at gmail.com Sun Nov 4 10:39:49 2007 From: matthew.brett at gmail.com (Matthew Brett) Date: Sun, 4 Nov 2007 07:39:49 -0800 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: <472DE336.5060007@ru.nl> References: <472D2FA0.3010001@mitre.org> <472DE336.5060007@ru.nl> Message-ID: <1e2af89e0711040739i394bce23r4145f041bb2032d4@mail.gmail.com> Hi, > > In my field, the same problem arises with a Matlab toolbox, SPM. You > > can't write an article without using this toolbox, especially if you > > want to publish it in NeuroImage (the editor in chief is one of the > > writter of the toolbox). > This is really SAD: > you're doing medical research, investigating and describing some > physiological phenomena, > well I can assume they are interested what algorithm you used, > to verify your conclusions, > but it's absolutely absurd, they make it obligatory that you write your > manuscript with a pen of a certain brand !! 
Hmm - I'm in the same field, and trying to work out a way to fix this with: http://neuroimaging.scipy.org The problem is not as silly as it seems. Neuroimaging is moderately complex, and it's difficult for reviewers to understand what has been done if you have used a variety of software packages. The advantage of SPM, in this case, is that the code for the neuroimaging part is open source (GPL), it covers the whole range of processing (more or less), and it's fairly easy to extend. The disadvantage is that - you need matlab - and that matlab is so clunky as a programming language that it deals very badly with the increasing need for complexity, meaning it is harder and harder to develop in SPM as time goes by. Our thought was that the only way out of this was to develop a system that would allow you to call the core algorithms of SPM within python, and develop new ones that you could validate against SPM (and other tools) as the standard. But it's hard work getting that right, because our community is so fragmented. Best, Matthew From et.gaudrain at free.fr Sun Nov 4 10:57:29 2007 From: et.gaudrain at free.fr (Etienne Gaudrain) Date: Sun, 04 Nov 2007 16:57:29 +0100 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: <472DE336.5060007@ru.nl> References: <472D2FA0.3010001@mitre.org> <472DE336.5060007@ru.nl> Message-ID: <472DEBE9.4030905@free.fr> Hi, I understand you consider this is absurd. In a way, it is. And I agree that it is not pleasant to have to go back to Matlab when you moved up to Python. However, scientific publications have to meet a few standards, and besides the infighting and abuse of power of some editors, we have to consider that a scientific publication: - must be archived and archives must be reachable - must be readable/understandable by the community - must describe things that one could reproduce The last two points are, for now, not good for Python. Most of the community still doesn't know that Python exists.
Then they consider themselves unable to understand Python code (even though anybody who can read Matlab can actually read Python). Worse, they don't have Python installed on their computer, so they can't run the .py. Ok, this is no big deal, since Python can be freely downloaded. But publications don't come along with a tutorial on how to install Python... (and Numpy... and Scipy... and Pylab... and...) Finally, what is most important in choosing between Python and Matlab is that Matlab is far more broadly used in the community than Python. For now. In scientific publication, the most common language must be used, even if it is not optimal. Look, we (try to) write papers in English, while French is far more elegant :-P . So, since Matlab is the standard, editors will require source code in Matlab. But The Mathworks has recently made some mistakes in Matlab distributions (broke some backward compatibility, for example), and a part of the community is moving to Python. So we can hope that since Scipy development looks stable, and since the documentation of Python/Numpy/Scipy is reliable enough, the community will largely move to Python. And will make substantial economies of money and time :-D . Regards, -Etienne Stef Mientki a écrit : > Matthieu Brucher wrote: > >> As I said - the use of Matlab was necessary to include the code >> for the >> various descriptors introduced in the paper - unfortunately Scipy >> code >> would not have been accepted (I know this for a fact). The Python >> software >> used in this paper was much more complex than the Matlab >> one-liners - I had >> to process the recordings and do the calculations. Also, I had to >> write a >> separate program to produce the animation. Of course this >> software was not >> converted to Matlab - just the mathematical expressions. >> >> >> In my field, the same problem arises with a Matlab toolbox, SPM.
You >> can't write an article without using this toolbox, especially if you >> want to publish it in NeuroImage (the editor in chief is one of the >> writers of the toolbox). >> > This is really SAD: > you're doing medical research, investigating and describing some > physiological phenomena, > well I can assume they are interested in what algorithm you used, > to verify your conclusions, > but it's absolutely absurd, they make it obligatory that you write your > manuscript with a pen of a certain brand !! > > It's even more absurd, if you realize > that in MatLab the algorithms are unknown, > so you can't verify them in case of weird results. > ( I had a few a couple of years ago, > and the answer of MathWorks was "try to get a job at MathWorks" ;-) > > But rethinking this attitude, maybe it's not absurd at all: > because there are unknown errors in MatLab, > they are unable to verify your results if you did the analysis in SciPy > ;-) ;-) > > cheers, > Stef Mientki > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > -- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Etienne Gaudrain Universite Claude Bernard LYON 1 CNRS - UMR5020, Neurosciences Sensorielles, Comportement, Cognition 50, avenue Tony Garnier 69366 LYON Cedex 07 FRANCE Tél : 04 37 28 74 85 Fax : 04 37 28 76 01 Page web equipe : http://olfac.univ-lyon1.fr/unite/equipe-02/equipe-2-f.html ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -------------- next part -------------- An HTML attachment was scrubbed...
URL: From et.gaudrain at free.fr Sun Nov 4 11:02:49 2007 From: et.gaudrain at free.fr (Etienne Gaudrain) Date: Sun, 04 Nov 2007 17:02:49 +0100 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: <1e2af89e0711040739i394bce23r4145f041bb2032d4@mail.gmail.com> References: <472D2FA0.3010001@mitre.org> <472DE336.5060007@ru.nl> <1e2af89e0711040739i394bce23r4145f041bb2032d4@mail.gmail.com> Message-ID: <472DED29.1030101@free.fr> Is there any attempt to merge this project with http://neuralensemble.org/ ? -Etienne Matthew Brett a écrit : > Hi, > > Hmm - I'm in the same field, and trying to work out a way to fix this with: > > http://neuroimaging.scipy.org > > The problem is not as silly as it seems. Neuroimaging is moderately > complex, and it's difficult for reviewers to understand what has been > done if you have used a variety of software packages. The advantage > of SPM, in this case, is that the code for the neuroimaging part is > open source (GPL), it covers the whole range of processing (more or > less), and it's fairly easy to extend. The disadvantage is that - you > need matlab - and that matlab is so clunky as a programming language > that it deals very badly with the increasing need for complexity, > meaning it is harder and harder to develop in SPM as time goes by. > Our thought was that the only way out of this was to develop a system > that would allow you to call the core algorithms of SPM within python, > and develop new ones that you could validate against SPM (and other > tools) as the standard. But it's hard work getting that right, > because our community is so fragmented.
> > Best, > > Matthew > > Matthew > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > -- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Etienne Gaudrain Universite Claude Bernard LYON 1 CNRS - UMR5020, Neurosciences Sensorielles, Comportement, Cognition 50, avenue Tony Garnier 69366 LYON Cedex 07 FRANCE Tél : 04 37 28 74 85 Fax : 04 37 28 76 01 Page web equipe : http://olfac.univ-lyon1.fr/unite/equipe-02/equipe-2-f.html ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ From nicolas.pettiaux at ael.be Sun Nov 4 11:09:57 2007 From: nicolas.pettiaux at ael.be (Nicolas Pettiaux) Date: Sun, 4 Nov 2007 17:09:57 +0100 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: References: Message-ID: 2007/10/29, jaropis : > > Maybe Jarek could send us or post on his website a copy of a previous > > version of the paper. These are not covered usually by the (abusive) > > copyright of the Journal > > No problem - the copyright agreement gives me the right to post it on my > website, so here it is: > > http://www.if.uz.zgora.pl/~jaropis/geomasy much thanks, Nicolas > The animation, all figures and all calculations were done in Python > (+Scilab+Pylab), but obviously this is not a programming paper and Python > is a research tool here (which I think is the goal of the creators of the > wonderful language and libraries). Unfortunately there is only code for > Matlab - just a few one-liners. In HRV (heart rate variability) research > most people use Matlab and hardly anyone even knows about Python. I am > trying to advertise Python, but the paper would not have been accepted with > code in Python. such a dependence on the language is very sad.
Equally sad, IMHO, is the fact that in a scientific paper the author does not have to publish the code, or at least give the other researchers access to read and rerun it to check the results, as this is one of the very purposes of publishing. Nicolas -- Nicolas Pettiaux - email: nicolas.pettiaux at ael.be Utiliser des formats ouverts et des logiciels libres - http://www.passeralinux.org. Pour la bureautique, les seuls formats ISO sont ceux de http://fr.openoffice.org From matthew.brett at gmail.com Sun Nov 4 11:41:36 2007 From: matthew.brett at gmail.com (Matthew Brett) Date: Sun, 4 Nov 2007 08:41:36 -0800 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: <472DED29.1030101@free.fr> References: <472D2FA0.3010001@mitre.org> <472DE336.5060007@ru.nl> <1e2af89e0711040739i394bce23r4145f041bb2032d4@mail.gmail.com> <472DED29.1030101@free.fr> Message-ID: <1e2af89e0711040841h362f1fa1v94340e28a0707fa0@mail.gmail.com> Hi, On 11/4/07, Etienne Gaudrain wrote: > Is there any attempt to merge this project with http://neuralensemble.org/ ? No, not so far - but neural network modelling and neuroimaging are very different areas, and there's very little overlap in the algorithms used. Neuroimaging is mainly image processing for example. Best, Matthew From oliphant at enthought.com Sun Nov 4 11:48:09 2007 From: oliphant at enthought.com (Travis E. Oliphant) Date: Sun, 04 Nov 2007 10:48:09 -0600 Subject: [SciPy-user] about MLab In-Reply-To: <1d987df30711040208l53e36652q909252d34ed6fcd1@mail.gmail.com> References: <1d987df30711040208l53e36652q909252d34ed6fcd1@mail.gmail.com> Message-ID: <472DF7C9.8020305@enthought.com> linda.s wrote: > I have quite a few codes written in Numeric. > MLab comes with Numeric too. > I got confused when I began to use numpy because it does not have MLab > and does not have some of the same functions as Numeric. > is there any smooth way to change the code developed in Numeric to numpy?
> You should also read the first chapters of the book "Guide to NumPy" which are available on-line for immediate download. There is a comprehensive description of the differences between Numeric and NumPy as well as tips for compatibility. You must also import numpy.oldnumeric before it is available: import numpy.oldnumeric numpy.oldnumeric.mlab You will notice that the module is simply a re-naming module. All of the functionality is in numpy but under possibly different names. All functions in Numeric are available in NumPy (as far as I know). If you encounter a situation where that is not true, please let me know. -Travis From matthieu.brucher at gmail.com Sun Nov 4 11:48:07 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Sun, 4 Nov 2007 17:48:07 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <472DE0DD.6090900@ru.nl> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> <472CFBC4.1090802@ru.nl> <472DE0DD.6090900@ru.nl> Message-ID: I'm searching for a way of using the GUI, but I haven't found it ATM, even for Linux. I noticed that it uses its own "OGL", which explains the better look and feel compared to Viper and OGL-based applications. Matthieu > Why is that ? If this is because no binaries are available for > > Windows, this can be done. > Probably yes, but not within my capabilities ;-) > cheers, > Stef > Matthieu -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed...
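[Editor's note: Travis's oldnumeric pointer above amounts to the following migration pattern. This is a minimal sketch, not code from the thread; the commented MLab calls and the array values are illustrative, and on NumPy releases of this era the `numpy.oldnumeric` compatibility module simply re-exported this functionality under the old Numeric/MLab names.]

```python
import numpy

# Old Numeric / MLab style       ->  plain numpy equivalent
# a = MLab.eye(3)                    a = numpy.eye(3)
# m = MLab.mean(x)                   m = numpy.mean(x)
a = numpy.eye(3)                  # 3x3 identity matrix
x = numpy.array([1.0, 2.0, 3.0])  # example data
m = numpy.mean(x)                 # arithmetic mean
```

`import numpy.oldnumeric` keeps the old names importable for transitional code; new code should use the `numpy` names directly, since the compatibility module was slated for eventual removal.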
URL: From matthieu.brucher at gmail.com Sun Nov 4 11:52:59 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Sun, 4 Nov 2007 17:52:59 +0100 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: <1e2af89e0711040739i394bce23r4145f041bb2032d4@mail.gmail.com> References: <472D2FA0.3010001@mitre.org> <472DE336.5060007@ru.nl> <1e2af89e0711040739i394bce23r4145f041bb2032d4@mail.gmail.com> Message-ID: Hi, It's even worse. If you design a new algorithm with results that cannot be verified with SPM, you can't publish anything (worse : the neurologists do not want to publish results with your algorithms). The fact that SPM is GPL is not relevant for me. What bothers me is that there is a conflict of interest between the SPM team and the editorial board of some journals and the fact that even if you demonstrated that your algorithms are correct, you can't publish something that SPM cannot corroborate (knowing that SPM relies on some heavy hypotheses that are not even correct). Matthieu PS : I really hope nipy will go on growing and that people at my lab will collaborate, at last, even if it is not for mainstream algorithms (SPMs, ...) 2007/11/4, Matthew Brett : > > Hi, > > > > In my field, the same problem arises with a Matlab toolbox, SPM. You > > > can't write an article without using this toolbox, especially if you > > > want to publish it in NeuroImage (the editor in chief is one of the > > > writers of the toolbox). > > This is really SAD: > > you're doing medical research, investigating and describing some > > physiological phenomena, > > well I can assume they are interested in what algorithm you used, > > to verify your conclusions, > > but it's absolutely absurd, they make it obligatory that you write your > > manuscript with a pen of a certain brand !! > > Hmm - I'm in the same field, and trying to work out a way to fix this > with: > > http://neuroimaging.scipy.org > > The problem is not as silly as it seems.
Neuroimaging is moderately > complex, and it's difficult for reviewers to understand what has been > done if you have used a variety of software packages. The advantage > of SPM, in this case, is that the code for the neuroimaging part is > open source (GPL), it covers the whole range of processing (more or > less), and it's fairly easy to extend. The disadvantage is that - you > need matlab - and that matlab is so clunky as a programming language > that it deals very badly with the increasing need for complexity, > meaning it is harder and harder to develop in SPM as time goes by. > Our thought was that the only way out of this was to develop a system > that would allow you to call the core algorithms of SPM within python, > and develop new ones that you could validate against SPM (and other > tools) as the standard. But it's hard work getting that right, > because our community is so fragmented. > > Best, > > Matthew > > Matthew > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From jh at physics.ucf.edu Sun Nov 4 11:53:01 2007 From: jh at physics.ucf.edu (Joe Harrington) Date: Sun, 04 Nov 2007 11:53:01 -0500 Subject: [SciPy-user] surface plotting Message-ID: <1194195181.5621.364.camel@glup.physics.ucf.edu> I'm in search of either Python or an open-source, standalone image viewer that can visualize surface plots. A surface plot is a rendering of a 2D image as a 3D surface using the image values as the Z coordinate. The new wiremesh in matplotlib is way too slow to be of practical use. 
It takes over a minute to plot a 1024**2 image, and for my purposes I need to move the surface around and look at it from different angles, which requires many re-renderings. Previously, I had used IDL, which has two routines called surface and shade_surface. The first is a wiremesh, the second is the same but adds shading according to viewing angle. These routines ran at decent speeds in IDL even when they were introduced about 20 years ago. IRAF also has them (or at least the wiremesh), but its size and klunkiness make it a poor solution as well. I'm not sure what's up with matplotlib, but this is a very new feature and it just might not be doing its thing in the most efficient way. In the long run, perhaps the source code for the IRAF routine could be a guide for doing it efficiently. In the short run, any ideas? It would make a good cookbook recipe. --jh-- From matthieu.brucher at gmail.com Sun Nov 4 11:54:50 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Sun, 4 Nov 2007 17:54:50 +0100 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: <472DEBE9.4030905@free.fr> References: <472D2FA0.3010001@mitre.org> <472DE336.5060007@ru.nl> <472DEBE9.4030905@free.fr> Message-ID: Well, and in some communities, if your work is done in Matlab and not in C or C++, your work is not even looked at... (not even talking about the fact that generally you don't even say if your code is in X or Y) Matthieu 2007/11/4, Etienne Gaudrain : > > Hi, > > I understand you consider this is absurd. In a way, it is. And I agree > that it is not pleasant to have to go back to Matlab when you moved up to > Python. 
> > However, scientific publications have to meet a few standards, and besides > the infighting and abuse of power of some editors, we have to consider that > a scientific publication: > - must be archived and archives must be reachable > - must be readable/understandable by the community > - must describe things that one could reproduce > > The last two points are, for now, not good for Python. Most of the > community still doesn't know that Python exists. Then they consider themselves > unable to understand Python code (even though anybody who can read Matlab can > actually read Python). Worse, they don't have Python installed on their > computer, so they can't run the .py. Ok, this is no big deal, since Python can > be freely downloaded. But publications don't come along with a tutorial on > how to install Python... (and Numpy... and Scipy... and Pylab... and...) > > Finally, what is most important in choosing between Python and Matlab > is that Matlab is far more broadly used in the community than Python. For > now. In scientific publication, the most common language must be used, even > if it is not optimal. Look, we (try to) write papers in English, while > French is far more elegant :-P . > > So, since Matlab is the standard, editors will require source code in > Matlab. But The Mathworks has recently made some mistakes in Matlab > distributions (broke some backward compatibility, for example), and a part of > the community is moving to Python. So we can hope that since Scipy > development looks stable, and since the documentation of Python/Numpy/Scipy > is reliable enough, the community will largely move to Python. And will make > substantial economies of money and time :-D .
> > Regards, > -Etienne > > > > Stef Mientki a écrit : > Matthieu Brucher wrote: > As I said - the use of Matlab was necessary to include the code > for the > various descriptors introduced in the paper - unfortunately Scipy > code > would not have been accepted (I know this for a fact). The Python > software > used in this paper was much more complex than the Matlab > one-liners - I had > to process the recordings and do the calculations. Also, I had to > write a > separate program to produce the animation. Of course this > software was not > converted to Matlab - just the mathematical expressions. > > > In my field, the same problem arises with a Matlab toolbox, SPM. You > can't write an article without using this toolbox, especially if you > want to publish it in NeuroImage (the editor in chief is one of the > writers of the toolbox). > > This is really SAD: > you're doing medical research, investigating and describing some > physiological phenomena, > well I can assume they are interested in what algorithm you used, > to verify your conclusions, > but it's absolutely absurd, they make it obligatory that you write your > manuscript with a pen of a certain brand !! > > It's even more absurd, if you realize > that in MatLab the algorithms are unknown, > so you can't verify them in case of weird results.
> ( I had a few a couple of years ago, > and the answer of MathWorks was "try to get a job at MathWorks" ;-) > > But rethinking this attitude, maybe it's not absurd at all: > because there are unknown errors in MatLab, > they are unable to verify your results if you did the analysis in SciPy > ;-) ;-) > > cheers, > Stef Mientki > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > > -- > ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ > Etienne Gaudrain > Universite Claude Bernard LYON 1 > CNRS - UMR5020, Neurosciences Sensorielles, Comportement, Cognition > 50, avenue Tony Garnier > 69366 LYON Cedex 07 > FRANCE > Tél : 04 37 28 74 85 > Fax : 04 37 28 76 01 > Page web equipe : http://olfac.univ-lyon1.fr/unite/equipe-02/equipe-2-f.html > ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From gael.varoquaux at normalesup.org Sun Nov 4 12:19:04 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sun, 4 Nov 2007 18:19:04 +0100 Subject: [SciPy-user] surface plotting In-Reply-To: <1194195181.5621.364.camel@glup.physics.ucf.edu> References: <1194195181.5621.364.camel@glup.physics.ucf.edu> Message-ID: <20071104171904.GB29007@clipper.ens.fr> On Sun, Nov 04, 2007 at 11:53:01AM -0500, Joe Harrington wrote: > I'm in search of either Python or an open-source, standalone image > viewer that can visualize surface plots. A surface plot is a rendering > of a 2D image as a 3D surface using the image values as the Z > coordinate.
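[Editor's note: the kind of surface plot Joe describes — a 2D image rendered as a 3D surface with pixel values as the Z coordinate — can be sketched with matplotlib's 3D toolkit. This is an illustrative example on a small, made-up grid; it does not address the thread's performance concern with 1024x1024 images or interactive rotation.]

```python
import os
import matplotlib
matplotlib.use("Agg")  # headless backend, no display needed
import matplotlib.pyplot as plt
import numpy as np
from mpl_toolkits.mplot3d import Axes3D  # registers the "3d" projection

# Stand-in "image": pixel values become the Z coordinate.
ny, nx = 32, 32
y, x = np.mgrid[0:ny, 0:nx]
z = np.sin(x / 5.0) * np.cos(y / 5.0)

fig = plt.figure()
ax = fig.add_subplot(111, projection="3d")
ax.plot_wireframe(x, y, z, rstride=2, cstride=2)  # wiremesh rendering
fig.savefig("surface_demo.png")
png_size = os.path.getsize("surface_demo.png")
```

matplotlib's 3D support was brand new at the time of this thread, which is consistent with the speed complaint; VTK-based tools render large grids much faster.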
Mayavi2 can do this, and has a scripting interface that should be easy to use for what you want to do. You can have a look at https://svn.enthought.com/enthought/attachment/wiki/MayaVi/Sciy2007_mlab_slides.pdf?format=raw for instance. Hopefully you should find all the info you need to install mayavi2 on https://svn.enthought.com/enthought/wiki/MayaVi, if not just ask. HTH, Gaël From matthew.brett at gmail.com Sun Nov 4 12:21:54 2007 From: matthew.brett at gmail.com (Matthew Brett) Date: Sun, 4 Nov 2007 17:21:54 +0000 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: References: <472D2FA0.3010001@mitre.org> <472DE336.5060007@ru.nl> <1e2af89e0711040739i394bce23r4145f041bb2032d4@mail.gmail.com> Message-ID: <1e2af89e0711040921vb19c7cdq9126ab1b6dd0162f@mail.gmail.com> Hi, > It's even worse. If you design a new algorithm with results that cannot be > verified with SPM, you can't publish anything (worse : the neurologists do > not want to publish results with your algorithms). Yes - I know what you mean. What we are trying to do is collect some support across the imaging community for NiPy as a valid alternative. When it's functional (ahem). Several of the team are well known in the imaging community, so I think, if we produce something functional, and publish on it, it will be hard to argue against using it for published work. Especially if you can show that using SPM (via wrappers) gives the same (similar) result. The problem is momentum.
We're very close to critical mass now, but somehow we haven't succeeded in pooling our efforts enough to really break the back of the work, Matthew From C.J.Lee at tnw.utwente.nl Sun Nov 4 12:28:12 2007 From: C.J.Lee at tnw.utwente.nl (C.J.Lee at tnw.utwente.nl) Date: Sun, 4 Nov 2007 18:28:12 +0100 Subject: [SciPy-user] Subject: Re: Scientific Python publications References: <472D2FA0.3010001@mitre.org><472DE336.5060007@ru.nl><1e2af89e0711040739i394bce23r4145f041bb2032d4@mail.gmail.com> Message-ID: <9E7F7554631C0047AB598527A4E9569B6F8104@tnx4.dynamic.tnw.utwente.nl> Perhaps a compromise (for the moment) is to provide both: minimal Matlab scripting to allow publication, with the full Python code also available. That way the phd students learn that there is an alternative available and the matlab *only* attitude will die out with retirements. This way you meet the requirements for the peer review and you can grow the python neuroimaging community. It might also be interesting to attempt publishing results which show the differences in functionality between the matlab based code and python based code, particularly if you can find a test case where the matlab toolbox is definitively wrong. (I am not involved in neuroimaging so I can't really say anything about the feasibility of this) Cheers Chris -----Original Message----- From: scipy-user-bounces at scipy.org on behalf of Matthieu Brucher Sent: Sun 11/4/2007 5:52 PM To: SciPy Users List Subject: Re: [SciPy-user] Subject: Re: Scientific Python publications Hi, It's even worse. If you design a new algorithm with results that cannot be verified with SPM, you can't publish anything (worse : the neurologists do not want to publish results with your algorithms). The fact that SPM is GPL is not relevant for me.
What bothers me is that there is a conflict of interest between the SPM team and the editorial board of some journals and the fact that even if you demonstrated that your algorithms are correct, you can't publish something that SPM cannot corroborate (knowing that SPM relies on some heavy hypotheses that are not even correct). Matthieu PS : I really hope nipy will go on growing and that people at my lab will collaborate, at last, even if it is not for mainstream algorithms (SPMs, ...) 2007/11/4, Matthew Brett : > > Hi, > > > > In my field, the same problem arises with a Matlab toolbox, SPM. You > > > can't write an article without using this toolbox, especially if you > > > want to publish it in NeuroImage (the editor in chief is one of the > > > writers of the toolbox). > > This is really SAD: > > you're doing medical research, investigating and describing some > > physiological phenomena, > > well I can assume they are interested in what algorithm you used, > > to verify your conclusions, > > but it's absolutely absurd, they make it obligatory that you write your > > manuscript with a pen of a certain brand !! > > Hmm - I'm in the same field, and trying to work out a way to fix this > with: > > http://neuroimaging.scipy.org > > The problem is not as silly as it seems. Neuroimaging is moderately > complex, and it's difficult for reviewers to understand what has been > done if you have used a variety of software packages. The advantage > of SPM, in this case, is that the code for the neuroimaging part is > open source (GPL), it covers the whole range of processing (more or > less), and it's fairly easy to extend. The disadvantage is that - you > need matlab - and that matlab is so clunky as a programming language > that it deals very badly with the increasing need for complexity, > meaning it is harder and harder to develop in SPM as time goes by.
> Our thought was that the only way out of this was to develop a system > that would allow you to call the core algorithms of SPM within python, > and develop new ones that you could validate against SPM (and other > tools) as the standard. But it's hard work getting that right, > because our community is so fragmented. > > Best, > > Matthew > > Matthew > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher From matthew.brett at gmail.com Sun Nov 4 13:53:22 2007 From: matthew.brett at gmail.com (Matthew Brett) Date: Sun, 4 Nov 2007 18:53:22 +0000 Subject: [SciPy-user] Subject: Re: Scientific Python publications In-Reply-To: <9E7F7554631C0047AB598527A4E9569B6F8104@tnx4.dynamic.tnw.utwente.nl> References: <472D2FA0.3010001@mitre.org> <472DE336.5060007@ru.nl> <1e2af89e0711040739i394bce23r4145f041bb2032d4@mail.gmail.com> <9E7F7554631C0047AB598527A4E9569B6F8104@tnx4.dynamic.tnw.utwente.nl> Message-ID: <1e2af89e0711041053j4262a1b5t829a2003793110c3@mail.gmail.com> Hi, On 11/4/07, C.J.Lee at tnw.utwente.nl wrote: > Perhaps a compromise (for the moment) is to provide both: minimal Matlab scripting to allow publication, with the full Python code also available. That way the phd students learn that there is an alternative available and the matlab *only* attitude will die out with retirements. This way you meet the requirements for the peer review and you can grow the python neuroimaging community. Yes, that was more or less exactly our pitch to the NIH in the grant that got funded recently.
> It might also be interesting to attempt publishing results which show the differences in functionality between the matlab based code and python based code, particularly if you can find a test case where the matlab toolbox is definitively wrong. (I am not involved in neuroimaging so I can't really say anything about the feasibility of this) I should say, in fairness, that the algorithms in SPM are generally very good, and I have found the main maintainer of SPM to be very generous and open-minded. For example, he wrote a very helpful and thoughtful letter of recommendation for our grant. For me, my main problem with SPM is matlab, and the very great difficulty of building a productive architecture with matlab. I'm quite sure we can build something much more fruitful in python, and would like to provide that as a service to the community, including the SPM community, of which I'm a long-time member. Matthew From karl.young at ucsf.edu Sun Nov 4 14:23:47 2007 From: karl.young at ucsf.edu (Young, Karl) Date: Sun, 4 Nov 2007 11:23:47 -0800 Subject: [SciPy-user] Subject: Re: Scientific Python publications References: <472D2FA0.3010001@mitre.org><472DE336.5060007@ru.nl><1e2af89e0711040739i394bce23r4145f041bb2032d4@mail.gmail.com><472DED29.1030101@free.fr> <1e2af89e0711040841h362f1fa1v94340e28a0707fa0@mail.gmail.com> Message-ID: <9D202D4E86A4BF47BA6943ABDF21BE78039F099B@EXVS06.net.ucsf.edu> >> On 11/4/07, Etienne Gaudrain wrote: >> Is there any attempt to merge this project with http://neuralensemble.org/ ? > No, not so far - but neural network modelling and neuroimaging are > very different areas, and there's very little overlap in the > algorithms used. Neuroimaging is mainly image processing for example. Which is not to say that it might not be interesting at some future point. E.g. some recent modeling work by Sporns et al. re.
using a detailed map of the Macaque brain to generate a dynamic network model that reproduced patterns seen in resting state fMRI is drawing these areas a little closer (as if there wasn't already enough on the plates of NiPy folks !). From haase at msg.ucsf.edu Sun Nov 4 15:22:14 2007 From: haase at msg.ucsf.edu (Sebastian Haase) Date: Sun, 4 Nov 2007 21:22:14 +0100 Subject: [SciPy-user] surface plotting In-Reply-To: <20071104171904.GB29007@clipper.ens.fr> References: <1194195181.5621.364.camel@glup.physics.ucf.edu> <20071104171904.GB29007@clipper.ens.fr> Message-ID: On Nov 4, 2007 6:19 PM, Gael Varoquaux wrote: > On Sun, Nov 04, 2007 at 11:53:01AM -0500, Joe Harrington wrote: > > I'm in search of either Python or an open-source, standalone image > > viewer that can visualize surface plots. A surface plot is a rendering > > of a 2D image as a 3D surface using the image values as the Z > > coordinate. > > Mayavi2 can do this, and has a scripting interface that should be easy to > use for what you want to do. You can have a look at > https://svn.enthought.com/enthought/attachment/wiki/MayaVi/Sciy2007_mlab_slides.pdf?format=raw > for instance. > > Hopefully you should find all the info you need to install mayavi2 on > https://svn.enthought.com/enthought/wiki/MayaVi, if not just ask. > Hi Gaël, I've been following MayaVi for a long time. And I'm looking forward to installing the new (wx based version of) MayaVi (mayavi2). The page you referred to, however, lists the dependencies, which is just daunting !! VTK by itself is considered to be a difficult install... (you can't do anything about that). But now, what is TVTK ? (Does it depend on other parts of Enthought ?) How do you get Traits, without having to get ALL the rest of Enthought ? Last but not least, I'm not a big fan of setuptools: googling for it turns up mostly scipy and some other projects - but no "official endorsement" by python.org !
Some people warn that it might download files and install them on your system without warning .... In my particular case, I don't have system admin permissions, and it seems to require (or at least, work best) when writing to the systems site-packages folder. Don't get me wrong: I'm not criticising, I'm very interested in most of the above (especially Traits, BTW). So far I just could not manage to start using any of them: neither on Linux (debian), nor on Windows. Any help is very much appreciated. -Sebastian Haase From zunzun at zunzun.com Sun Nov 4 16:44:24 2007 From: zunzun at zunzun.com (James Phillips) Date: Sun, 4 Nov 2007 16:44:24 -0500 Subject: [SciPy-user] surface plotting In-Reply-To: <1194195181.5621.364.camel@glup.physics.ucf.edu> References: <1194195181.5621.364.camel@glup.physics.ucf.edu> Message-ID: <268756d30711041344x17c2230y8d60a0b536e36e02@mail.gmail.com> I use VTK to make VRML, which you can then rotate around and view using a VRML viewer. If you use Debian or Ubuntu, a quick "sudo apt-get install python-vtk" will install Python bindings and all dependencies. http://www.vtk.org/ Here's a quick synopsis from IBM that's fairly recent, they discuss MayaVi which also does what you're looking for - and for debian/ubuntu, "sudo apt-get install mayavi": http://www.ibm.com/developerworks/linux/library/l-datavistools/ Since MayaVi uses VTK, the examples use many of the VTK examples: http://mayavi.sourceforge.net/screenshots/index.html James On 11/4/07, Joe Harrington wrote: > > I'm in search of either Python or an open-source, standalone image > viewer that can visualize surface plots. -------------- next part -------------- An HTML attachment was scrubbed... URL: From eike.welk at gmx.net Sun Nov 4 17:13:27 2007 From: eike.welk at gmx.net (Eike Welk) Date: Sun, 04 Nov 2007 23:13:27 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? 
In-Reply-To: <472D1448.2090901@ru.nl> References: <4728FFC8.2070007@ru.nl> <20071103230701.GL6725@clipper.ens.fr> <472D1448.2090901@ru.nl> Message-ID: <200711042313.28683.eike.welk@gmx.net> On Sunday 04 November 2007 01:37, Stef Mientki wrote: > What I mean, is for "realtime" control loops. > Let's say I sample data at something like 20 kHz, > want to perform some calculations on it, > display it and feed the manipulated data back to my hardware, For realtime and high speed there is also Gnuradio. http://www.gnu.org/software/gnuradio/ It is organized as a flow-graph, with the nodes written in C++. It works in the MHz range. The graph is constructed in Python and there is some facility to transfer results back to Python. It has no graphical front end. I didn't try Gnuradio however. I read only some of the documentation: http://www.gnu.org/software/gnuradio/doc/exploring-gnuradio.html#software If a graphical front end would be written, a wrapper for Gnuradio and a method to transfer data to/from Numpy would be great. Regards, Eike. From gael.varoquaux at normalesup.org Sun Nov 4 17:20:32 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sun, 4 Nov 2007 23:20:32 +0100 Subject: [SciPy-user] surface plotting In-Reply-To: References: <1194195181.5621.364.camel@glup.physics.ucf.edu> <20071104171904.GB29007@clipper.ens.fr> Message-ID: <20071104222032.GD29007@clipper.ens.fr> On Sun, Nov 04, 2007 at 09:22:14PM +0100, Sebastian Haase wrote: > The page you referred to, however, lists the dependencies, which is > just daunting !! > VTK by itself is considered to be a difficult install... (you can't do > nothing about that). I agree. this is quite bad. We are working on a Debian package, so this will settle the problem for Debian and Ubuntu. > But now, what is > TVTK ? (Does it depend on other parts of Enthough ?) > How do you get Traits, without having to get ALL the rest of Enthought ? 
> Last but not least, I'm not a big fan of setuptools: googling for it > turns up mostly scipy and some other projects - but no "official > endorsement" by python.org ! Some people warn that it might download > files and install them on your system without warning .... I agree completely with you. My trust in setuptools has gone down the more I have used it. You can install mayavi by getting the three tarballs on http://code.enthought.com/downloads/source/ets2.6/, and building them. The install is just a question of copying the resulting build dirs in your PYTHONPATH, so that is simple. You might not have all the dependencies for the build. I think the easiest is to to install the Debian packages if you are under debian. They are listed on the install page. If you are not under Debian (or alike), just state your system, and we might be able to help you along a bit more. > In my particular case, I don't have system admin permissions, and it > seems to require (or at least, work best) when writing to the systems > site-packages folder. setuptools works very well when you don't have admin permissions, but it doesn't solve the fact that you need vtk and swig installed. You can give it a switch to tell it to write to another folder than the systems site-packages. > Don't get me wrong: I'm not criticising, I'm very interested in most > of the above (especially Traits, BTW). So far I just could not manage > to start using any of them: neither on Linux (debian), nor on Windows. Under windows the simplest way to install mayavi in to install Python using enstaller: http://code.enthought.com/enstaller/ , under Debian we are working on Debian packages. On a more general note, I have only recently realized that there was a packaging problem. We are slowly working on improving that, but after I finish my thesis (end of the year), I'll start some heavier work on improving this, in particular providing Debians packages, and hopefully binaries for other operating systems. 
Currently I am under deadline pressure. If you use the tarballs on http://code.enthought.com/downloads/source/ets2.6/ you will need all the dependencies, but no internet access, and building is separated from installing. If you can only get the run-time dependencies, please state precisely your system, and we can try building a binary tarball for you. You will need: VTK >= 4.4 numpy >= 1.01 wxPython 2.6.x all the rest is either compile-time dependencies, or included in the bigs tarballs. HTH, Ga?l From s.mientki at ru.nl Sun Nov 4 17:47:13 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Sun, 04 Nov 2007 23:47:13 +0100 Subject: [SciPy-user] [ANN] Scope_Plot, another plot library for real time signals. Message-ID: <472E4BF1.2050905@ru.nl> hello, I justed finished, another plot library, called Scope_Plot, based on wxPython. Scope_Plot is special meant for displaying real time signals, and therefor has some new functionalities: - signal selection - each signal has it's own scale, - moving erase block - measurement cursor ans should be a lot faster than MatPlot, Plot and FloatCanvas, at least for real time signals (because it only draws the changes). An animated demo can be seen here (2 MB, 2:10): http://stef.mientki.googlepages.com/jalspy_scope.html A description of the library, as used in an application, can be found here: http://oase.uci.kun.nl/~mientki/data_www/pic/jalspy/jalspy_scope.html And finally the file (and a few necessary libs) can be found here: http://oase.uci.kun.nl/~mientki/download/Scope_Plot.zip The library has a main section which contains a simple demo, the animated demo and the application description shows a more complex signal organization. 
cheers, Stef Mientki From fredmfp at gmail.com Sun Nov 4 19:10:45 2007 From: fredmfp at gmail.com (fred) Date: Mon, 05 Nov 2007 01:10:45 +0100 Subject: [SciPy-user] surface plotting In-Reply-To: References: <1194195181.5621.364.camel@glup.physics.ucf.edu> <20071104171904.GB29007@clipper.ens.fr> Message-ID: <472E5F85.8050806@gmail.com> On Sun, Nov 04, 2007 at 09:22:14PM +0100, Sebastian Haase wrote: > The page you referred to, however, lists the dependencies, which is > just daunting !! > VTK by itself is considered to be a difficult install... (you can't do > nothing about that). What do you mean here ? Which difficulties did you encounter ? From tarball sources ? from your packages manager ? On which platforms ? -- http://scipy.org/FredericPetit From prabhu at aero.iitb.ac.in Sun Nov 4 23:58:17 2007 From: prabhu at aero.iitb.ac.in (Prabhu Ramachandran) Date: Mon, 05 Nov 2007 10:28:17 +0530 Subject: [SciPy-user] surface plotting In-Reply-To: <20071104222032.GD29007@clipper.ens.fr> References: <1194195181.5621.364.camel@glup.physics.ucf.edu> <20071104171904.GB29007@clipper.ens.fr> <20071104222032.GD29007@clipper.ens.fr> Message-ID: <472EA2E9.7000206@aero.iitb.ac.in> Gael Varoquaux wrote: > On Sun, Nov 04, 2007 at 09:22:14PM +0100, Sebastian Haase wrote: >> The page you referred to, however, lists the dependencies, which is >> just daunting !! >> VTK by itself is considered to be a difficult install... (you can't do >> nothing about that). > > I agree. this is quite bad. We are working on a Debian package, so this > will settle the problem for Debian and Ubuntu. VTK is packaged for Debian and on FC7. > I agree completely with you. My trust in setuptools has gone down the > more I have used it. I'm afraid I don't quite share that sentiment as regards being a user of setuptools/easy_install. Too often the version of setuptools installed in a distribution is old with bugs. 
Right now if you are on ubuntu, feisty/gutsy or Debian it is almost trivial to install things using the binary eggs. As Fred suggested, the best place for help on installation issues is the enthought-dev list. cheers, prabhu From timmichelsen at gmx-topmail.de Mon Nov 5 08:19:56 2007 From: timmichelsen at gmx-topmail.de (Timmie) Date: Mon, 5 Nov 2007 13:19:56 +0000 (UTC) Subject: [SciPy-user] time series analysis Message-ID: Dear list members, I am looking for some hints and recommendations. I want to use python to analyse and evaluate (measurement) time series. Therefore I am looking for some modules and documentation or tutorials which could help me in * filtering time series for a cetain criteria * reduce them to a lower time resolution: e.g. from houly values to daily mean values * filling data gaps for days or hours without valid data using statistical methods like regression * general data quality and trend assessment I found a conversion of the |Stat ("pipe-stat") programs written by Gary Perlman into a python module: pstat.py - http://www.nmr.mgh.harvard.edu/Neural_Systems_Group/gary/python/pstat.py This can be used as a start. But I am interested in what you could recommend. Thanks. Kind regards, Timmie From gael.varoquaux at normalesup.org Mon Nov 5 10:50:40 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Mon, 5 Nov 2007 16:50:40 +0100 Subject: [SciPy-user] Smoothing of a signal In-Reply-To: <353163.97628.qm@web52306.mail.re2.yahoo.com> References: <353163.97628.qm@web52306.mail.re2.yahoo.com> Message-ID: <20071105155040.GE21399@clipper.ens.fr> Hi, On Mon, Nov 05, 2007 at 07:20:35AM -0800, Marc Beck wrote: > I am a graduate assistant at the Space Science Center at Morehead State > University. I am currently writing a program to convert raw data from our > 21m antenna into a *.FITS image using numpy and scipy. I need to convolve > the pixels of my image. 
I have three arrays: one for the x- coordinate, > one for the y- coordinate and one for the brightness of the pixel. I need > to feed those three arrays into a function to convolve the pixels and I > need to get the same three arrays out with the new values, or the rest of > the program does not work. Someone from the python forum referred me to > some code you wrote at [1]http://www.scipy.org/Cookbook/SignalSmooth > but I don't understand which variable is what and how to integrate this > into my program. Please explain to me how I can make my program work. I > hope that I have described clearly enough what I am planning to do. Your are better off asking the mailing list for these kind of question. I am terribly busy currently (important deadlines) and almost deleted you message before realising it had been sent only to me. Is you data on a regular grid ? You currently have an (X, Y, V) data layout, and it would be much easier to have a 2D array as a layout for your data, where the row and column index would give the X and Y position (scaled so that they fit in consecutive integers). With some luck you can convert you V array to a properly-shaped 2D array simply using "reshape". This correpsonds to the Z array in the code on the page you reference. Hope this helps, Ga?l From pgmdevlist at gmail.com Mon Nov 5 11:03:07 2007 From: pgmdevlist at gmail.com (Pierre GM) Date: Mon, 5 Nov 2007 11:03:07 -0500 Subject: [SciPy-user] time series analysis In-Reply-To: References: Message-ID: <200711051103.07410.pgmdevlist@gmail.com> Timmie, > I want to use python to analyse and evaluate (measurement) time series. The package TimeSeries was designed to perform the kind of operations you want (filtering, frequency conversion, handling missing data...). It is currently available in the sandbox of scipy. 
More info here: http://www.scipy.org/SciPyPackages/TimeSeries > * filling data gaps for days or hours without valid data using statistical > methods like regression This one might be a tad trickier. You may want to give pyloess a try, it's a wrapper around the dloess functions. Also available in the scipy sandbox. From gael.varoquaux at normalesup.org Mon Nov 5 13:45:13 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Mon, 5 Nov 2007 19:45:13 +0100 Subject: [SciPy-user] Kernel density approximation Message-ID: <20071105184513.GK21399@clipper.ens.fr> Hi, What is the recommended way to compute a kernel density approximation of a probability distribution function given a sample data ? My problem is two dimensional, each data element is made of a pair of numbers. I currently use an histogram2d, but I would like to use a smooth kernel, such as a gaussian, and most important, I would like the algorithm to do the work for me as far as choosing the bandwidth :->. And I forgot to say that the function should be fast, as it is used in a Monte Carlo algorithm, and is called many times. Yes, I know, I want the Moon, on a stick, please :->. Thanks, Ga?l From zpincus at stanford.edu Mon Nov 5 13:58:36 2007 From: zpincus at stanford.edu (Zachary Pincus) Date: Mon, 5 Nov 2007 13:58:36 -0500 Subject: [SciPy-user] Kernel density approximation In-Reply-To: <20071105184513.GK21399@clipper.ens.fr> References: <20071105184513.GK21399@clipper.ens.fr> Message-ID: This might be a start: scipy.stats.kde.gaussian_kde I use it regularly for my n-d KDE needs. I think it has all of the moon/stick features required. Zach On Nov 5, 2007, at 1:45 PM, Gael Varoquaux wrote: > Hi, > > What is the recommended way to compute a kernel density > approximation of > a probability distribution function given a sample data ? My > problem is > two dimensional, each data element is made of a pair of numbers. 
> > I currently use an histogram2d, but I would like to use a smooth > kernel, > such as a gaussian, and most important, I would like the algorithm > to do > the work for me as far as choosing the bandwidth :->. > > And I forgot to say that the function should be fast, as it is used > in a > Monte Carlo algorithm, and is called many times. > > Yes, I know, I want the Moon, on a stick, please :->. > > Thanks, > > Ga?l > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From gael.varoquaux at normalesup.org Mon Nov 5 14:11:01 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Mon, 5 Nov 2007 20:11:01 +0100 Subject: [SciPy-user] Kernel density approximation In-Reply-To: References: <20071105184513.GK21399@clipper.ens.fr> Message-ID: <20071105191101.GL21399@clipper.ens.fr> On Mon, Nov 05, 2007 at 01:58:36PM -0500, Zachary Pincus wrote: > This might be a start: > scipy.stats.kde.gaussian_kde wow ! Cool. I don't know how I managed to miss that, it was right under my eyes, where I was looking, with a sensible name. > I use it regularly for my n-d KDE needs. I think it has all of the > moon/stick features required. Scipy really is a large arm useful for grabbing the moon and putting it on a stick. Once again scipy rocks ! Thanks Zachary for the pointer, and thanks scipy devs for the function. Ga?l From timmichelsen at gmx-topmail.de Mon Nov 5 14:24:24 2007 From: timmichelsen at gmx-topmail.de (Timmie) Date: Mon, 5 Nov 2007 19:24:24 +0000 (UTC) Subject: [SciPy-user] time series analysis References: <200711051103.07410.pgmdevlist@gmail.com> Message-ID: Thanks for your answer. > The package TimeSeries was designed to perform the kind of operations you want > (filtering, frequency conversion, handling missing data...). It is currently > available in the sandbox of scipy. More info here: > http://www.scipy.org/SciPyPackages/TimeSeries Wow, this looks great. 
But a little complex ;-) Well, one could write functions for common tasks that fascilitate it a bit... > > * filling data gaps for days or hours without valid data using statistical > > methods like regression > > This one might be a tad trickier. You may want to give pyloess a try, it's a > wrapper around the dloess functions. Also available in the scipy sandbox. Any idea when there will be a first binary release of all this together with maskedarray? Has anyone else done this kind of things? Maybe with a interface to R? Regards, Timmie From pgmdevlist at gmail.com Mon Nov 5 14:37:33 2007 From: pgmdevlist at gmail.com (Pierre GM) Date: Mon, 5 Nov 2007 14:37:33 -0500 Subject: [SciPy-user] time series analysis In-Reply-To: References: <200711051103.07410.pgmdevlist@gmail.com> Message-ID: <200711051437.34362.pgmdevlist@gmail.com> On Monday 05 November 2007 14:24:24 Timmie wrote: > Wow, this looks great. But a little complex ;-) > Well, one could write functions for common tasks that fascilitate it a > bit... You'd be surprised of how easy it is to use after a while. The conversion functions are quite useful. Please don't hesitate to contact em off-list if you have some special requests > Any idea when there will be a first binary release of all this together > with maskedarray? Nope. The inclusion of maskedarray in numpy is still in the air, timeseries may become a scikit sooner or later. > Has anyone else done this kind of things? What kind of things ? > Maybe with a interface to R? I started that way (with the rpy package), but decided to stick to pure Python to avoid surprises... From william.ratcliff at gmail.com Mon Nov 5 15:28:55 2007 From: william.ratcliff at gmail.com (william ratcliff) Date: Mon, 5 Nov 2007 15:28:55 -0500 Subject: [SciPy-user] weave and masked arrays? Message-ID: <827183970711051228r77c2c871ne26c7abb893def23@mail.gmail.com> Does anyone know if scipy.weave supports numpy masked arrays? 
Thanks, William From s.mientki at ru.nl Mon Nov 5 15:37:24 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Mon, 05 Nov 2007 21:37:24 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? part 2 In-Reply-To: <4728FFC8.2070007@ru.nl> References: <4728FFC8.2070007@ru.nl> Message-ID: <472F7F04.1050905@ru.nl> and here is the second part of the Vision demo: http://www.osc.edu/~unpingco/Tutorial_5Nov.html I find this a very useful discussion, not only for the good overview, but also for what is possible now and will be possible in the future. It also started me thinking: "why do so many people (including me until now) find these graphical design packages, like Vision / LabView so attractive / easy ? " OK schematic designs in general read more easily than a piece of text, for most people, watching TV is much more relaxing and attractive than reading a book. But the main reason seems to me "a priori knowledge" When I saw the second demo the "LabView" shivers came over me: it's becoming too complex, there's too much behind the blocks that can't be seen. Looking back at the first demo, I could also write that in one line of text (if I had the right libraries): Display_Image ( Rotate_Image ( Image_From_File ( 'demo.png' ) ) ) This line of code is easier to write then placing the blocks and connecting them in Vision. Still I (and possible many others) find the Vision approach much more intuitive and much easier. But there is another major difference than the graphics alone: in the text version I need to know - the exact name of each item: is it Rotate_Image, rotate_image, SetRotateImage, or ... - the exact number of inputs and outputs - the range and units of each input parameter: is rotation in 360 degrees / 400 degrees or in radians ? So I think we not necessarily need a graphical user interface, but good libraries with easy to understand functions, a priori given ranges and units, good help files etc. 
So maybe not "graphical" is the magic word, but moreover "bricks" like Lego uses them ;-) Like to know what others think, because I'm still very interested in getting non-programmers from LabView to SciPy, and until now I couldn't convince any of the current LabView teachers / users :-( cheers, Stef From jdh2358 at gmail.com Mon Nov 5 16:09:34 2007 From: jdh2358 at gmail.com (John Hunter) Date: Mon, 5 Nov 2007 15:09:34 -0600 Subject: [SciPy-user] Scipy + Vision = LabView ? part 2 In-Reply-To: <472F7F04.1050905@ru.nl> References: <4728FFC8.2070007@ru.nl> <472F7F04.1050905@ru.nl> Message-ID: <88e473830711051309v1556e3d5t579ef79a2f5e5e14@mail.gmail.com> On Nov 5, 2007 2:37 PM, Stef Mientki wrote: > Display_Image ( Rotate_Image ( Image_From_File ( 'demo.png' ) ) ) > So maybe not "graphical" is the magic word, but moreover "bricks" like > Lego uses them ;-) I think for the most part we have the bricks: it's just that they are scattered in many places w/o comprehensive documentation that illustrates how to find them and put them together from scipy.misc import imread, imshow, imrotate imshow(imrotate(imread('moonlanding.jpg'), 20)) but how many know to look in scipy.misc? JDH From gael.varoquaux at normalesup.org Mon Nov 5 17:16:43 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Mon, 5 Nov 2007 23:16:43 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? part 2 In-Reply-To: <472F7F04.1050905@ru.nl> References: <4728FFC8.2070007@ru.nl> <472F7F04.1050905@ru.nl> Message-ID: <20071105221643.GB28991@clipper.ens.fr> On Mon, Nov 05, 2007 at 09:37:24PM +0100, Stef Mientki wrote: > So I think we not necessarily need a graphical user interface, > but good libraries with easy to understand functions, > a priori given ranges and units, good help files etc. 
> So maybe not "graphical" is the magic word, but moreover "bricks" like > Lego uses them ;-) I think we need a good IDE, with integrated help/calltips, a good help, a library browsers, a interactive debug for exploring program flow... I wonder if SPE has a reusable API that we could use to build this API, along with envisage, ipython, once it has a wx front end, MPL,... I must admit this is the dream plan that I would really like to see happen in the next few years, and this is a lot what my pitch about reusable components was about. If we have set of tools, all based on WxWidget (just choosing WxWidget because it already has a lot of tools available, and looks windowsish under windows), that we could assemble for a good IDE (that means better than Matlab, it wouldn't be to hard to do), something that would meld pycrust, ipython, MPL, mayavi (the tree on the side is a good UI feature IMHO, that could be generalized), SPE... I don't think the planets are yet aligned to start coding such a tools, we are missing key components (e.g. wx front end to ipython), but if all the people working on a tools think hard how they can make it a reusable as possible, as easy to integrate seamlessly in an application, then we'll get there in a year or two. Ga?l From oliphant at enthought.com Mon Nov 5 17:50:37 2007 From: oliphant at enthought.com (Travis E. Oliphant) Date: Mon, 05 Nov 2007 16:50:37 -0600 Subject: [SciPy-user] weave and masked arrays? In-Reply-To: <827183970711051228r77c2c871ne26c7abb893def23@mail.gmail.com> References: <827183970711051228r77c2c871ne26c7abb893def23@mail.gmail.com> Message-ID: <472F9E3D.3020605@enthought.com> william ratcliff wrote: > Does anyone know if scipy.weave supports numpy masked arrays? > Not directly. You should pass the data and mask arrays separately to a weave subroutine. -Travis O. 
From mattknox_ca at hotmail.com Mon Nov 5 22:47:41 2007 From: mattknox_ca at hotmail.com (Matt Knox) Date: Tue, 6 Nov 2007 03:47:41 +0000 (UTC) Subject: [SciPy-user] time series analysis References: <200711051103.07410.pgmdevlist@gmail.com> <200711051437.34362.pgmdevlist@gmail.com> Message-ID: > > Wow, this looks great. But a little complex > > > Well, one could write functions for common tasks that fascilitate it a > > bit... If you have any ideas for simplifying/improving things, we are certainly open to suggestions and would love the feedback. Being a sandbox package currently, there is no better time then now to get your ideas incorporated into the timeseries module. > > Any idea when there will be a first binary release of all this together > > with maskedarray? > > Nope. The inclusion of maskedarray in numpy is still in the air, timeseries > may become a scikit sooner or later. The timeseries package won't likely be moving anywhere until maskedarray moves somewhere else since it is dependent on it. I won't personally be providing any binaries while it is still in the sandbox either, but I am happy to provide advice on how to compile it on Windows. Assuming maskedarray eventually moves into the core of numpy, my preference would be to put the timeseries module right into the main scipy trunk as I believe it is general purpose enough to warrant inclusion in scipy, and many people have expressed interest in it. But much discussion and debate will have to take place before that will happen. Looking ahead really long term, I think built in support in matplotlib for TimeSeries objects (along the lines of the "plotlib" timeseries sub module) would be an improvement over the current approach to time series plotting in matplotlib too. But I have not discussed such things with any of the matplotlib developers and have no idea how they feel about that, and it is too early to discuss that yet anyway. 
- Matt From s.mientki at ru.nl Tue Nov 6 03:26:37 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Tue, 06 Nov 2007 09:26:37 +0100 Subject: [SciPy-user] Scipy + Vision = LabView ? In-Reply-To: <472CFBC4.1090802@ru.nl> References: <4728FFC8.2070007@ru.nl> <40e64fa20710311536v68f351bbq4cbc7a836e7620bf@mail.gmail.com> <20071102140821.GA1517@clipper.ens.fr> <472B6D19.50409@ru.nl> <20071103145233.GE1956@clipper.ens.fr> <472CFBC4.1090802@ru.nl> Message-ID: <4730253D.5050105@ru.nl> >> >> So we have Vision, that is indeed impressive, but really built around >> custom tools. It would be a large amount of work to abstract out the >> visual programming part of the rest, but it is probably possible. >> >> > Yes I took a quick look, and doesn't seems easy to extract the Vision > part form the total package. > > I asked a question about this in the Vision discussion list, and according to the answer (given below) it shouldn't be too difficult:. I tried this solution, but couldn't get it working, because there seems something wrong with my tcl-package, and now I remember, I have seen that before ;-) cheers, Stef Vision can be easily extracted from the MGLTools package but it requires some additional packages. MGLTools packages list critical and non-critical dependencies at the end of their __init__.py file. Critical dependencies are the packages that are required for Vision to work at all. Non critical dependencies will not prevent the package from working however, some functionality will be missing. 
The latest release of MGLTools (1.4.6) runs with python 2.4 It needs some 3rd party packages: Pmw, Numeric (23.8 or 24.2) Inside MGLTools, the critical dependencies for Vision are: Vision, NetworkEditor, mglutil, Support an easy solution is to move thees 4 packages inside /usr/lib/python2.4/site-packages/ then you can launch python import Vision Vision.runVision() guillaume vareille Het UMC St Radboud staat geregistreerd bij de Kamer van Koophandel in het handelsregister onder nummer 41055629. The Radboud University Nijmegen Medical Centre is listed in the Commercial Register of the Chamber of Commerce under file number 41055629. From timmichelsen at gmx-topmail.de Tue Nov 6 03:58:18 2007 From: timmichelsen at gmx-topmail.de (Timmie) Date: Tue, 6 Nov 2007 08:58:18 +0000 (UTC) Subject: [SciPy-user] time series analysis References: <200711051103.07410.pgmdevlist@gmail.com> <200711051437.34362.pgmdevlist@gmail.com> Message-ID: > > > Wow, this looks great. But a little complex > > > > > Well, one could write functions for common tasks that fascilitate it a > > > bit... I believe you. Time is the matter. > If you have any ideas for simplifying/improving things, we are certainly open > to suggestions and would love the feedback. Being a sandbox package currently, > there is no better time then now to get your ideas incorporated into the > timeseries module. Well, since I am still a Python beginner somehow and still need to know how to use it efficiently for my data analysis I will not really be able to contribute to core. At my current state of py knowledge I can alredy write my own simple modules consisting of functions that I find useful. Some things I can imagine are the following: create a tools directory under the timeseries tree. 
then there we could place things like * common frequency conversions: reduce to hourly values * error checking of measurement data: statistically and logically > > > Any idea when there will be a first binary release of all this together > > > with maskedarray? > > > > Nope. The inclusion of maskedarray in numpy is still in the air, timeseries > > may become a scikit sooner or later. Is it safe for me to replace/patch my current maskarray? > The timeseries package won't likely be moving anywhere until maskedarray moves > somewhere else since it is dependent on it. I won't personally be providing > any binaries while it is still in the sandbox either, but I am happy to > provide advice on how to compile it on Windows. I have experience in compiling on linux but have to work on a windows box. Therefore an advice on that would be useful. Is there a possibility to subscibe to SVN to get an email on chnages? Thanks for your responses. Timmie From mattknox_ca at hotmail.com Tue Nov 6 06:38:17 2007 From: mattknox_ca at hotmail.com (Matt Knox) Date: Tue, 6 Nov 2007 11:38:17 +0000 (UTC) Subject: [SciPy-user] time series analysis References: <200711051103.07410.pgmdevlist@gmail.com> <200711051437.34362.pgmdevlist@gmail.com> Message-ID: > Some things I can imagine are the following: > create a tools directory under the timeseries tree. There is a "lib" sub-directory for stuff that falls outside the core Date/TimeSeries classes. It currently includes a sub-module for "moving functions" (moving average, etc...), and interpolation. > * common frequency conversions: reduce to hourly values Frequency conversions are simple to do using the "convert" method of the TimeSeries class. Here is an example converting an hourly frequency series to daily... >>> import numpy as np >>> import maskedarray as ma >>> import timeseries as ts >>> h = ts.time_series(np.arange(50, dtype=np.float32), start_date=ts.today ('hourly')) >>> h timeseries([ 0. 1. 2. 3. 4. 5. 6. 7. 8. 9. 10. 11. 12. 13. 14. 
15. 16. 17. 18. 19. 20. 21. 22. 23. 24. 25. 26. 27. 28. 29. 30. 31. 32. 33. 34. 35. 36. 37. 38. 39. 40. 41. 42. 43. 44. 45. 46. 47. 48. 49.], dates = [06-Nov-2007 06:00 ... 08-Nov-2007 07:00], freq = H) >>> d = h.convert('daily') >>> d timeseries( [[-- -- -- -- -- -- 0.0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 15.0 16.0 17.0] [18.0 19.0 20.0 21.0 22.0 23.0 24.0 25.0 26.0 27.0 28.0 29.0 30.0 31.0 32.0 33.0 34.0 35.0 36.0 37.0 38.0 39.0 40.0 41.0] [42.0 43.0 44.0 45.0 46.0 47.0 48.0 49.0 -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --]], dates = [06-Nov-2007 ... 08-Nov-2007], freq = D) >>> d_avg = h.convert('daily', ma.average) >>> d_avg timeseries([ 8.5 29.5 45.5], dates = [06-Nov-2007 ... 08-Nov-2007], freq = D) =============================================== If any of the above seems mysterious, let me know and I can offer a more detailed explanation. > * error checking of measurement data: statistically and logically Some data error checking algorithms could be useful, yes. I won't likely be working on them in the near future though. > Is it safe for me to replace/patch my current maskarray? Generally speaking, the maskedarray package is mostly backwards compatible with the current numpy.ma package , and thus the api is very stable so you should be able to update it without any problems. > I have experience in compiling on linux but have > to work on a windows box. > Therefore an advice on that would be useful. If you are using Python 2.5 on windows, it is easy to do using mingw. If you are using an earlier version of Python, Visual studio 2003 is the easiest way to go. > Is there a possibility to subscibe to SVN to get an email on chnages? I don't know of any svn clients that have a built in way to do this, but it likely wouldn't be difficult to write your own script to do this since subversion has a command line interface. 
- Matt From elr1979 at gmail.com Tue Nov 6 08:41:17 2007 From: elr1979 at gmail.com (Eduardo Rodrigues) Date: Tue, 6 Nov 2007 10:41:17 -0300 Subject: [SciPy-user] Solve References: <200711051103.07410.pgmdevlist@gmail.com><200711051437.34362.pgmdevlist@gmail.com> Message-ID: <000001c8207b$cebe66b0$a300a8c0@rodrigues> Hi, I am starting in Python. I usually use Maple and I am with a problem to do the same project in Python. I would like to solve an equation like x+y+1=0 for x. In Maple I use "solve(x+y+1=0,x);". Do exist a similar command in Python/NumPy/SciPy? Regards, Eduardo. From mnandris at btinternet.com Tue Nov 6 08:42:43 2007 From: mnandris at btinternet.com (Michael Nandris) Date: Tue, 6 Nov 2007 13:42:43 +0000 (GMT) Subject: [SciPy-user] Solve In-Reply-To: <000001c8207b$cebe66b0$a300a8c0@rodrigues> Message-ID: <965265.30328.qm@web86504.mail.ird.yahoo.com> These examples for solving equations are in scipy_tutorial.pdf from scipy import mat, linalg """ Solves three simultaneous equations: x + 3y + 5z = 10 2x + 5y + z = 8 2x + 3y + 8z =3 """ XYparameters = mat('[1 3 5; 2 5 1; 2 3 8]') constants = mat('[10; 8; 3]') print linalg.solve( XYparameters, constants ) # array([[-9.28], # [ 5.16], # [ 0.76]]) """ Solves two simultaneous equations: 5x + 2y - 9 = 0 3x + 4y - 4 = 0 """ XYparameters = mat('[5 2; 3 4]') constants = mat('[9; 4]') print linalg.solve( XYparameters, constants ) # should print (2,-0.5) Eduardo Rodrigues wrote: Hi, I am starting in Python. I usually use Maple and I am with a problem to do the same project in Python. I would like to solve an equation like x+y+1=0 for x. In Maple I use "solve(x+y+1=0,x);". Do exist a similar command in Python/NumPy/SciPy? Regards, Eduardo. _______________________________________________ SciPy-user mailing list SciPy-user at scipy.org http://projects.scipy.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From Fabrice.Silva at crans.org Tue Nov 6 08:53:54 2007 From: Fabrice.Silva at crans.org (Fabrice Silva) Date: Tue, 06 Nov 2007 14:53:54 +0100 Subject: [SciPy-user] Solve In-Reply-To: <965265.30328.qm@web86504.mail.ird.yahoo.com> References: <000001c8207b$cebe66b0$a300a8c0@rodrigues> <965265.30328.qm@web86504.mail.ird.yahoo.com> Message-ID: <1194357234.3135.26.camel@localhost> On Tuesday 06 November 2007 at 13:42 +0000, Michael Nandris wrote: > These examples for solving equations are in scipy_tutorial.pdf > Solves three simultaneous equations: > x + 3y + 5z = 10 > 2x + 5y + z = 8 > 2x + 3y + 8z = 3 > """ > print linalg.solve( XYparameters, constants ) > Eduardo Rodrigues wrote: > Hi, I am starting in Python. I usually use Maple and I am with > a problem to do the same project in Python. > I would like to solve an equation like x+y+1=0 for x. In Maple > I use "solve(x+y+1=0,x);". > Do exist a similar command in Python/NumPy/SciPy? Isn't the request about symbolic computations, like in Maple? -- Fabrice Silva From gael.varoquaux at normalesup.org Tue Nov 6 09:01:03 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Tue, 6 Nov 2007 15:01:03 +0100 Subject: [SciPy-user] Solve In-Reply-To: <000001c8207b$cebe66b0$a300a8c0@rodrigues> References: <000001c8207b$cebe66b0$a300a8c0@rodrigues> Message-ID: <20071106140103.GQ21399@clipper.ens.fr> On Tue, Nov 06, 2007 at 10:41:17AM -0300, Eduardo Rodrigues wrote: > Hi, I am starting in Python. I usually use Maple and I am with a problem to > do the same project in Python. > I would like to solve an equation like x+y+1=0 for x. In Maple I use > "solve(x+y+1=0,x);". > Do exist a similar command in Python/NumPy/SciPy? Do you want a numerical result, given the value of y, or do you want a formal result? The numerical value can be calculated using scipy: for linear equations, with the method shown by Michael Nandris; for nonlinear equations, you should look at scipy.optimize.zeros.
If you want a formal result, or a functional result, you can use sympy: from sympy import solve, Symbol x = Symbol('x') y = Symbol('y') f = solve(x+y+1, x) # f gives you the symbolic result F = lambda v: f.subs(y, v) F is a function that returns the exact expression of the root of you equation. If you have several roots, than you need to do something like: F = lambda v: [r.subs(y, v) for r in f] Currently sympy is still yound, and won't be able to solve everythin you throw at it. HTH, Ga?l From elr1979 at gmail.com Tue Nov 6 11:31:39 2007 From: elr1979 at gmail.com (Eduardo Rodrigues) Date: Tue, 6 Nov 2007 13:31:39 -0300 Subject: [SciPy-user] Solve References: <965265.30328.qm@web86504.mail.ird.yahoo.com> Message-ID: <005501c82092$84736c10$a300a8c0@rodrigues> If I want to solve an equation with linalg.solve I must know the equation, right? >From help: "solve(a, b) Return the solution of a*x = b" But if I don't know - or if I don't want to look - the equation for x, how can I solve it? I will import equations from a file. []'s. ----- Original Message ----- From: Michael Nandris To: SciPy Users List Sent: Tuesday, November 06, 2007 10:42 AM Subject: Re: [SciPy-user] Solve These examples for solving equations are in scipy_tutorial.pdf from scipy import mat, linalg """ Solves three simultaneous equations: x + 3y + 5z = 10 2x + 5y + z = 8 2x + 3y + 8z =3 """ XYparameters = mat('[1 3 5; 2 5 1; 2 3 8]') constants = mat('[10; 8; 3]') print linalg.solve( XYparameters, constants ) # array([[-9.28], # [ 5.16], # [ 0.76]]) """ Solves two simultaneous equations: 5x + 2y - 9 = 0 3x + 4y - 4 = 0 """ XYparameters = mat('[5 2; 3 4]') constants = mat('[9; 4]') print linalg.solve( XYparameters, constants ) # should print (2,-0.5) Eduardo Rodrigues wrote: Hi, I am starting in Python. I usually use Maple and I am with a problem to do the same project in Python. I would like to solve an equation like x+y+1=0 for x. In Maple I use "solve(x+y+1=0,x);". 
Do exist a similar command in Python/NumPy/SciPy? Regards, Eduardo. _______________________________________________ SciPy-user mailing list SciPy-user at scipy.org http://projects.scipy.org/mailman/listinfo/scipy-user ------------------------------------------------------------------------------ _______________________________________________ SciPy-user mailing list SciPy-user at scipy.org http://projects.scipy.org/mailman/listinfo/scipy-user -------------- next part -------------- An HTML attachment was scrubbed... URL: From lbolla at gmail.com Tue Nov 6 10:45:39 2007 From: lbolla at gmail.com (lorenzo bolla) Date: Tue, 6 Nov 2007 16:45:39 +0100 Subject: [SciPy-user] Solve In-Reply-To: <005501c82092$84736c10$a300a8c0@rodrigues> References: <965265.30328.qm@web86504.mail.ird.yahoo.com> <005501c82092$84736c10$a300a8c0@rodrigues> Message-ID: <80c99e790711060745i47039a76k91b4d898083d1274@mail.gmail.com> Maybe I'm misunderstanding, but it seems you look at solve(a,b) as a function to return the solution of the linear symbolic equation: a * x = b (whose solution is: x = b / a), with x scalar. Instead, solve(a,b) gives the numerical solution of the linear system a * x = b, with x and b 1d-arrays and a 2d-array. For examples, something like: 2 * x1 + 3 * x2 = 4 4 * x1 + 5 * x2 = 6 where x = [x1, x2], b = [4, 6] and a = [[2, 3], [4, 5]]. solve(a,b) returns [-1, 2]. If from the file you import a string with the symbolic representation of you equation, and you want to solve it symbolically, stay with simpy. hth, L. On 11/6/07, Eduardo Rodrigues wrote: > > If I want to solve an equation with linalg.solve I must know the > equation, right? > > From help: > "solve(a, b) > Return the solution of a*x = b" > > But if I don't know - or if I don't want to look - the equation for x, > how can I solve it? > I will import equations from a file. > > []'s. 
> > > > > > ----- Original Message ----- > *From:* Michael Nandris > *To:* SciPy Users List > *Sent:* Tuesday, November 06, 2007 10:42 AM > *Subject:* Re: [SciPy-user] Solve > > > These examples for solving equations are in *scipy_tutorial.pdf* > > from scipy import mat, linalg > > """ > Solves three simultaneous equations: > x + 3y + 5z = 10 > 2x + 5y + z = 8 > 2x + 3y + 8z =3 > """ > > XYparameters = mat('[1 3 5; 2 5 1; 2 3 8]') > constants = mat('[10; 8; 3]') > print linalg.solve( XYparameters, constants ) > # array([[-9.28], > # [ 5.16], > # [ 0.76]]) > > """ > Solves two simultaneous equations: > 5x + 2y - 9 = 0 > 3x + 4y - 4 = 0 > """ > > XYparameters = mat('[5 2; 3 4]') > constants = mat('[9; 4]') > print linalg.solve( XYparameters, constants ) # should print (2,-0.5) > > *Eduardo Rodrigues * wrote: > > Hi, I am starting in Python. I usually use Maple and I am with a problem > to > do the same project in Python. > I would like to solve an equation like x+y+1=0 for x. In Maple I use > "solve(x+y+1=0,x);". > Do exist a similar command in Python/NumPy/SciPy? > Regards, > Eduardo. > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > ------------------------------ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gael.varoquaux at normalesup.org Tue Nov 6 11:01:56 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Tue, 6 Nov 2007 17:01:56 +0100 Subject: [SciPy-user] Solve In-Reply-To: <005501c82092$84736c10$a300a8c0@rodrigues> References: <965265.30328.qm@web86504.mail.ird.yahoo.com> <005501c82092$84736c10$a300a8c0@rodrigues> Message-ID: <20071106160156.GV21399@clipper.ens.fr> On Tue, Nov 06, 2007 at 01:31:39PM -0300, Eduardo Rodrigues wrote: > If I want to solve an equation with linalg.solve I must know the equation, > right? > From help: > "solve(a, b) > Return the solution of a*x = b" > But if I don't know - or if I don't want to look - the equation for x, how > can I solve it? > I will import equations from a file. How are these equations defined ? Can you give us an example. Cheers, Ga?l From pgmdevlist at gmail.com Tue Nov 6 11:56:03 2007 From: pgmdevlist at gmail.com (Pierre GM) Date: Tue, 6 Nov 2007 11:56:03 -0500 Subject: [SciPy-user] Assessing the use of packages Message-ID: <200711061156.03652.pgmdevlist@gmail.com> All, It's evaluation time in my department (Bio. & Ag. Engng, UGA), and I'd need to document the impact of my Python contributions on the scientific community at large, or more realistically on the numpy/scipy user community... * Is there a way to estimate how many people installed one particular package from the SVN ? * if it's too tricky, would anybody using maskedarray and/or timeseries and/or pyloess (the SVN packages I helped implementing) mind dropping me a line off-list, with a very short description of the main field of research/use ? I'd basically need some kind of numbers to give to the People-In-Charge. Thanks a lot in advance for your time. P. PS: Sorry for the cross-posting.. 
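For the first question, one rough, indirect estimate — assuming whoever administers the scipy.org server can share its HTTP access logs — is to count distinct client addresses that fetched paths under a package's SVN directory. A sketch with invented log lines (the IPs and paths below are illustrative only, not real traffic):

```python
import re

# Apache-style access log lines (invented examples); in practice these
# would come from the server hosting the Subversion repository.
SAMPLE_LOG = """\
203.0.113.7 - - [05/Nov/2007:10:02:11 -0500] "GET /svn/scipy/trunk/Lib/sandbox/timeseries/ HTTP/1.1" 200 1523
198.51.100.23 - - [05/Nov/2007:11:40:03 -0500] "GET /svn/scipy/trunk/Lib/sandbox/maskedarray/core.py HTTP/1.1" 200 88210
203.0.113.7 - - [06/Nov/2007:09:15:44 -0500] "GET /svn/scipy/trunk/Lib/sandbox/timeseries/tseries.py HTTP/1.1" 200 40112
192.0.2.50 - - [06/Nov/2007:12:00:00 -0500] "GET /svn/scipy/trunk/Lib/sandbox/timeseries/tdates.py HTTP/1.1" 200 22314
"""

def unique_clients(log_text, package):
    """Count distinct client IPs that requested paths mentioning `package`."""
    ips = set()
    for line in log_text.splitlines():
        match = re.match(r'(\S+) .* "GET (\S+) ', line)
        if match and package in match.group(2):
            ips.add(match.group(1))
    return len(ips)

print(unique_clients(SAMPLE_LOG, 'timeseries'))   # -> 2
print(unique_clients(SAMPLE_LOG, 'maskedarray'))  # -> 1
```

This only counts checkouts, not actual use, so it is at best an upper bound on the number of users.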
From marcbeck1982 at yahoo.com Tue Nov 6 12:29:20 2007 From: marcbeck1982 at yahoo.com (Marc Beck) Date: Tue, 6 Nov 2007 09:29:20 -0800 (PST) Subject: [SciPy-user] gaussian smoothing Message-ID: <919551.97062.qm@web52309.mail.re2.yahoo.com> I am a graduate assistant at the Space Science Center at Morehead State University. I am currently writing a program to convert raw data from our 21m antenna into a *.FITS image using numpy and scipy. I need to convolve the pixels of my image. I have three arrays: one for the x- coordinate, one for the y- coordinate and one for the brightness of the pixel. I need to feed those three arrays into a function to convolve the pixels and I need to get the same three arrays out with the new values, or the rest of the program does not work. Someone from the python forum referred me to some code at http://www.scipy.org/Cookbook/SignalSmooth but I don't understand which variable is what and how to integrate this into my program. Please explain to me how I can make my program work. I hope that I have described clearly enough what I am planning to do. I e-mailed the author of that code and asked for help, but this person is very busy and does not have the time to help me. Marc Beck __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant at enthought.com Tue Nov 6 12:37:25 2007 From: oliphant at enthought.com (Travis E. Oliphant) Date: Tue, 06 Nov 2007 11:37:25 -0600 Subject: [SciPy-user] gaussian smoothing In-Reply-To: <919551.97062.qm@web52309.mail.re2.yahoo.com> References: <919551.97062.qm@web52309.mail.re2.yahoo.com> Message-ID: <4730A655.6070508@enthought.com> Marc Beck wrote: > I am a graduate assistant at the Space Science Center at Morehead > State University. 
I am currently writing a program to convert raw data > from our 21m antenna into a *.FITS image using numpy and scipy. I need > to convolve the pixels of my image. I have three arrays: one for the > x- coordinate, one for the y- coordinate and one for the brightness of > the pixel. I need to feed those three arrays into a function to > convolve the pixels and I need to get the same three arrays out with > the new values, or the rest of the program does not work. Someone from > the python forum referred me to some code at > http://www.scipy.org/Cookbook/SignalSmooth > > > but I don't understand which variable is what and how to integrate > this into my program. Please explain to me how I can make my program > work. I hope that I have described clearly enough what I am planning > to do. > > I e-mailed the author of that code and asked for help, but this person > is very busy and does not have the time to help me. Hi Marc, We are going to need more information in order to help you. Convolution usually requires two inputs (the images or signals to be convolved) and it returns one output (the convolved signal). What are the purposes of the x- and y- coordinates? Is the data not evenly-sampled? What do you expect to be done with these coordinates values? You are not mentioning what you are convolving the input with? To help you explore. You can find general convolution as scipy.signal.convolve Best regards, --Travis O. From as8ca at virginia.edu Tue Nov 6 13:06:44 2007 From: as8ca at virginia.edu (Alok Singhal) Date: Tue, 6 Nov 2007 13:06:44 -0500 Subject: [SciPy-user] gaussian smoothing In-Reply-To: <919551.97062.qm@web52309.mail.re2.yahoo.com> References: <919551.97062.qm@web52309.mail.re2.yahoo.com> Message-ID: <20071106180643.GF21039@virginia.edu> Hi, On 06/11/07: 09:29, Marc Beck wrote: > > I am a graduate assistant at the Space Science Center at Morehead > State University. 
I am currently writing a program to convert raw data from our 21m antenna into a *.FITS image using numpy and scipy. I need to convolve the pixels of my image. I have three arrays: one for the x-coordinate, one for the y-coordinate and one for the brightness of the pixel. I need to feed those three arrays into a function to convolve the pixels and I need to get the same three arrays out with the new values, or the rest of the program does not work. Here's one way to do it: Assuming the brightness array is a two-dimensional array with shape (nx, ny), where nx = len(x), ny = len(y); you can do something like: # <brightness> is the two-dimensional array of brightness values, # <sigma_x>, <sigma_y> are the sigma values for the two Gaussians # in x and y respectively. NOTE: the sigma values are in terms of array # indices, and not in the units of x and y values. # # So, if x and y are uniformly spaced: # # sigma_x = sigma_x / (x[1]-x[0]) # sigma_y = sigma_y / (y[1]-y[0]) # # The output image is in the <out> array in this example from scipy.ndimage import gaussian_filter gaussian_filter(brightness, sigma=(sigma_x, sigma_y), output=out, mode='constant', cval=0.0) -Alok -- Alok Singhal * * Graduate Student, dept. of Astronomy * * * University of Virginia http://www.astro.virginia.edu/~as8ca/ * * From timmichelsen at gmx-topmail.de Tue Nov 6 14:36:25 2007 From: timmichelsen at gmx-topmail.de (Timmie) Date: Tue, 6 Nov 2007 19:36:25 +0000 (UTC) Subject: [SciPy-user] time series analysis References: <200711051103.07410.pgmdevlist@gmail.com> <200711051437.34362.pgmdevlist@gmail.com> Message-ID: Matt Knox <mattknox_ca at hotmail.com> writes: > > > Some things I can imagine are the following: > > create a tools directory under the timeseries tree. > > There is a "lib" sub-directory for stuff that falls outside the core > Date/TimeSeries classes. It currently includes a sub-module for "moving > functions" (moving average, etc...), and interpolation.
> > > * common frequency conversions: reduce to hourly values > > Frequency conversions are simple to do using the "convert" method of the > TimeSeries class. Here is an example converting an hourly frequency series to > daily... > > >>> import numpy as np > >>> import maskedarray as ma > >>> import timeseries as ts > >>> h = ts.time_series(np.arange(50, dtype=np.float32), start_date=ts.today > ('hourly')) > >>> h > timeseries([ 0. 1. 2. 3. 4. 5. 6. 7. 8. 9. 10. 11. 12. > 13. 14. > 15. 16. 17. 18. 19. 20. 21. 22. 23. 24. 25. 26. 27. 28. 29. > 30. 31. 32. 33. 34. 35. 36. 37. 38. 39. 40. 41. 42. 43. 44. > 45. 46. 47. 48. 49.], > dates = [06-Nov-2007 06:00 ... 08-Nov-2007 07:00], > freq = H) > > >>> d = h.convert('daily') > >>> d > timeseries( > [[-- -- -- -- -- -- 0.0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 > 13.0 14.0 15.0 16.0 17.0] > [18.0 19.0 20.0 21.0 22.0 23.0 24.0 25.0 26.0 27.0 28.0 29.0 30.0 31.0 > 32.0 33.0 34.0 35.0 36.0 37.0 38.0 39.0 40.0 41.0] > [42.0 43.0 44.0 45.0 46.0 47.0 48.0 49.0 -- -- -- -- -- -- -- -- -- -- -- > -- -- -- -- --]], > dates = > [06-Nov-2007 ... 08-Nov-2007], > freq = D) > > >>> d_avg = h.convert('daily', ma.average) > >>> d_avg > timeseries([ 8.5 29.5 45.5], > dates = [06-Nov-2007 ... 08-Nov-2007], > freq = D) > > =============================================== > If any of the above seems mysterious, let me know and I can offer a more > detailed explanation. > > > * error checking of measurement data: statistically and logically > > Some data error checking algorithms could be useful, yes. I won't likely be > working on them in the near future though. I think that I will have a closer look at the package and then see how I can use it. Maybe I can contribute something or at least give you feedback. Well, I am very happy that I found this package. I really uses some Google search but nothing helpful had turned out. 
What I also found (I think via the moin wiki at www.python.org): * It is a Python package designed to accomplish some usual tasks during the analysis of climate variability using Python: http://www.pyclimate.org/ * CDAT makes use of an open-source, object-oriented, easy-to-learn scripting language (Python) to link together separate software subsystems and packages to form an integrated environment for data analysis. http://www-pcmdi.llnl.gov/software-portal/cdat These two packages seem to do quite similar tasks, or at least head in the same direction. But they depend mainly on the use of netCDF file storage. Maybe there could be some linkage? Kind regards and thanks for your help, Timmie P.S.: You may also send me a PM on this, too. From pgmdevlist at gmail.com Tue Nov 6 15:17:07 2007 From: pgmdevlist at gmail.com (Pierre GM) Date: Tue, 6 Nov 2007 15:17:07 -0500 Subject: [SciPy-user] time series analysis In-Reply-To: References: Message-ID: <200711061517.07602.pgmdevlist@gmail.com> Timmie, Matt and I started TimeSeries each on our own, before merging our efforts last Christmas. On my side, I was trying to find something equivalent to pyclimate for my own purpose, but supporting numpy. I poked around this package a bit, wasn't completely happy with it, then figured it would be as easy to redo on top of numpy. It was while I was struggling with subclassing the masked arrays of numpy.core.ma that I decided to reimplement maskedarray, which eventually led to the current version of timeseries. If I'm not completely mistaken, numpy.core.ma is a translation via Numeric of Paul Dubois's implementation for CDAT. So yeah, I could have followed that path, but I was learning Python at the same time and it was a very good exercise to reinvent the wheel, thus adding more noise and confusion. I do agree that there could be some linkage between timeseries, pyclimate and CDAT, at least in terms of converting objects from one package to another.
However, it's unlikely to happen any time soon. Nevertheless, I may try to implement some of the functions of pyclimate and CDAT in numpy/scipy, when I'll have the need for them. What I would suggest you is to start with timeseries, as it's pretty easy to use. Then, depending on what your needs are, you can start developing your own functions. In any case, don't hesitate to contact Matt or myself if you need some specific help. The TimeSeriesPackage page of the scipy wiki is a good start, Matt did a terrific job. Sincerely P. From brian.clowers at pnl.gov Tue Nov 6 17:53:20 2007 From: brian.clowers at pnl.gov (Clowers, Brian H) Date: Tue, 6 Nov 2007 14:53:20 -0800 Subject: [SciPy-user] HDF5 Viewer for Python In-Reply-To: References: Message-ID: I have data sets in a HDF5 format that contain a number of arrays which I'd like to plot using matplotlib. I can do this pretty easily using just IDLE and pyTables, however, it would be nice to put together a GUI and automate some of the key strokes. Does anyone know of or have an alternative to viTables that they'd be willing to share? viTables looks great but it is also 250 Euros for single license--then again if it had the capability to directly plot to matplotlib I'd consider it. Anyway, when it comes to a GUI I imagine the challenge would be how to link the arrays or tables in the HDF5 file to a Treeview in an interactive manner. Any thoughts? Thanks, Brian From R.Springuel at umit.maine.edu Tue Nov 6 18:49:56 2007 From: R.Springuel at umit.maine.edu (R. Padraic Springuel) Date: Tue, 06 Nov 2007 18:49:56 -0500 Subject: [SciPy-user] Element ranks for an array Message-ID: <4730FDA4.1070107@umit.maine.edu> Does scipy have a function that can provide the element ranks by size for a 1-d array? I.e. a function that would return [3,4,2,1] or [2,1,3,4] (depending on ascending or descending ranking) for array([50, 10, 62, 1000]). -- R. 
Padraic Springuel Teaching Assistant Department of Physics and Astronomy University of Maine Bennett 309 Office Hours: By appointment only From pgmdevlist at gmail.com Tue Nov 6 19:10:41 2007 From: pgmdevlist at gmail.com (Pierre GM) Date: Tue, 6 Nov 2007 19:10:41 -0500 Subject: [SciPy-user] Element ranks for an array In-Reply-To: <4730FDA4.1070107@umit.maine.edu> References: <4730FDA4.1070107@umit.maine.edu> Message-ID: <200711061910.42609.pgmdevlist@gmail.com> On Tuesday 06 November 2007 18:49:56 R. Padraic Springuel wrote: > Does scipy have a function that can provide the element ranks by size > for a 1-d array? > > I.e. a function that would return [3,4,2,1] or [2,1,3,4] (depending on > ascending or descending ranking) for array([50, 10, 62, 1000]). For a 1D array numpy.arange(1,len(x)+1)[x.argsort()] or simply scipy.stats.rankdata(x) That'll give you ranking by descending order. Use the [::-1] to get the reverse. From as8ca at virginia.edu Tue Nov 6 19:20:37 2007 From: as8ca at virginia.edu (Alok Singhal) Date: Tue, 6 Nov 2007 19:20:37 -0500 Subject: [SciPy-user] Element ranks for an array In-Reply-To: <4730FDA4.1070107@umit.maine.edu> References: <4730FDA4.1070107@umit.maine.edu> Message-ID: <20071107002036.GA23267@virginia.edu> Hi, On 06/11/07: 18:49, R. Padraic Springuel wrote: > Does scipy have a function that can provide the element ranks by size > for a 1-d array? > > I.e. a function that would return [3,4,2,1] or [2,1,3,4] (depending on > ascending or descending ranking) for array([50, 10, 62, 1000]). I assume you meant [4, 3, 1, 2] in the first case. Look at argsort() in numpy. This should do what you want: def rank(x, ascending=True): i = numpy.argsort(x) + 1 if ascending: return i else: return i[::-1] -Alok -- Alok Singhal * * Graduate Student, dept. 
of Astronomy * * * University of Virginia http://www.astro.virginia.edu/~as8ca/ * * From robert.kern at gmail.com Tue Nov 6 19:30:24 2007 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 06 Nov 2007 18:30:24 -0600 Subject: [SciPy-user] Element ranks for an array In-Reply-To: <20071107002036.GA23267@virginia.edu> References: <4730FDA4.1070107@umit.maine.edu> <20071107002036.GA23267@virginia.edu> Message-ID: <47310720.3020305@gmail.com> Alok Singhal wrote: > Hi, > > On 06/11/07: 18:49, R. Padraic Springuel wrote: >> Does scipy have a function that can provide the element ranks by size >> for a 1-d array? >> >> I.e. a function that would return [3,4,2,1] or [2,1,3,4] (depending on >> ascending or descending ranking) for array([50, 10, 62, 1000]). > > I assume you meant [4, 3, 1, 2] in the first case. > > Look at argsort() in numpy. > > This should do what you want: > > def rank(x, ascending=True): > i = numpy.argsort(x) + 1 > if ascending: > return i > else: > return i[::-1] No, this is incorrect. He correctly wanted [3,4,2,1] for the descending ranking. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From as8ca at virginia.edu Tue Nov 6 20:12:05 2007 From: as8ca at virginia.edu (Alok Singhal) Date: Tue, 6 Nov 2007 20:12:05 -0500 Subject: [SciPy-user] Element ranks for an array In-Reply-To: <47310720.3020305@gmail.com> References: <4730FDA4.1070107@umit.maine.edu> <20071107002036.GA23267@virginia.edu> <47310720.3020305@gmail.com> Message-ID: <20071107011205.GA23486@virginia.edu> On 06/11/07: 18:30, Robert Kern wrote: > No, this is incorrect. He correctly wanted [3,4,2,1] for the > descending ranking. Oops. I misunderstood the question and gave the wrong answer. Sorry, Alok -- Alok Singhal * * Graduate Student, dept. 
of Astronomy * * * University of Virginia http://www.astro.virginia.edu/~as8ca/ * * From gael.varoquaux at normalesup.org Wed Nov 7 02:37:50 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Wed, 7 Nov 2007 08:37:50 +0100 Subject: [SciPy-user] HDF5 Viewer for Python In-Reply-To: References: Message-ID: <20071107073750.GD27897@clipper.ens.fr> On Tue, Nov 06, 2007 at 02:53:20PM -0800, Clowers, Brian H wrote: > I have data sets in a HDF5 format that contain a number of arrays which > I'd like to plot using matplotlib. I can do this pretty easily using > just IDLE and pyTables, however, it would be nice to put together a GUI > and automate some of the key strokes. Does anyone know of or have an > alternative to viTables that they'd be willing to share? viTables looks > great but it is also 250 Euros for single license--then again if it had > the capability to directly plot to matplotlib I'd consider it. Anyway, > when it comes to a GUI I imagine the challenge would be how to link the > arrays or tables in the HDF5 file to a Treeview in an interactive > manner. Any thoughts? I whacked something specific to the HDF5 structure of the data of my experiment together with pytables, matplotlib, and traits in two days' work. It's ugly (really) and terribly specific to my data structure, but it shows that doing something quick and dirty isn't hard. I am not too sure I am willing to share the code (:->, I will, if somebody _really_ asks for it) because I think I could just puke by looking at it. It came from an immediate need, not thought out at all in advance, with deadlines right around the corner, and no intention of reuse, so it is horrendous. Something pretty would be possible with a bit more work, but not that much, and by putting some actual thinking into it, rather than just coding something straight up. I would use the same tools: pytables, matplotlib and traits.
Ga?l From emin.shopper at gmail.com Wed Nov 7 10:10:04 2007 From: emin.shopper at gmail.com (Emin.shopper Martinian.shopper) Date: Wed, 7 Nov 2007 10:10:04 -0500 Subject: [SciPy-user] Is this a bug: scipy.cov([0,1]) = 0 Message-ID: <32e43bb70711070710x65c42568rbdc4942743af534b@mail.gmail.com> Dear Experts, I recently noticed some strange behavior with scipy (and numpy as well). Doing scipy.cov([0,1]) returns 0 which I believe is incorrect. Doing scipy.cov([0.0,1.0]) returns a more reasonable answer so I suspect that the problem is that scipy is using integer math. Is this considered a bug? I can see no good reason to do integer math when computing a covariance. Thanks, -Emin From elcorto at gmx.net Wed Nov 7 10:36:33 2007 From: elcorto at gmx.net (Steve Schmerler) Date: Wed, 07 Nov 2007 16:36:33 +0100 Subject: [SciPy-user] Is this a bug: scipy.cov([0,1]) = 0 In-Reply-To: <32e43bb70711070710x65c42568rbdc4942743af534b@mail.gmail.com> References: <32e43bb70711070710x65c42568rbdc4942743af534b@mail.gmail.com> Message-ID: <4731DB81.9050800@gmx.net> Emin.shopper Martinian.shopper wrote: > Dear Experts, > > I recently noticed some strange behavior with scipy (and numpy as > well). Doing scipy.cov([0,1]) returns 0 which I believe is incorrect. > Doing scipy.cov([0.0,1.0]) returns a more reasonable answer so I > suspect that the problem is that scipy is using integer math. > > Is this considered a bug? > > I can see no good reason to do integer math when computing a covariance. > Works here. In [14]: scipy.stats.cov([0,1]) Out[14]: 0.5 In [15]: scipy.cov([0,1]) Out[15]: array(0.5) In [16]: numpy.cov([0,1]) Out[16]: array(0.5) In [17]: numpy.cov([0.,1.]) Out[17]: array(0.5) In [18]: scipy.cov([0.,1.]) Out[18]: array(0.5) In [19]: scipy.stats.cov([0.,1.]) Out[19]: 0.5 In [20]: scipy.__version__ Out[20]: '0.7.0.dev3498' In [21]: numpy.__version__ Out[21]: '1.0.5.dev4407' -- cheers, steve Random number generation is the art of producing pure gibberish as quickly as possible. 
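As the interactive session above shows, recent development versions already return 0.5 here. On releases where an integer code path does bite, the usual workaround is to cast the input to a float array before calling cov; a small sketch (the behavior of specific older releases is version-dependent, so this only relies on the float-input result):

```python
import numpy as np

# Sample covariance of [0, 1]: mean is 0.5, squared deviations sum to
# 0.25 + 0.25, divided by (n - 1) = 1, giving 0.5.
data = [0, 1]

# Casting explicitly to float sidesteps any integer-math code path
# that an older numpy/scipy release might take for integer input.
result = float(np.cov(np.asarray(data, dtype=float)))
print(result)  # 0.5
```

The same cast is a cheap safety net for any statistics routine fed integer-typed measurement data.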
From j.reid at mail.cryst.bbk.ac.uk Wed Nov 7 11:03:06 2007 From: j.reid at mail.cryst.bbk.ac.uk (John Reid) Date: Wed, 07 Nov 2007 16:03:06 +0000 Subject: [SciPy-user] Undefined variable bug in scipy.stats.gaussian_kde.integrate_1d Message-ID: I get the following error when calling scipy.stats.gaussian_kde.integrate_1d c:\apps\python25\lib\site-packages\scipy\stats\kde.py in integrate_box_1d(self, low, high) 204 raise ValueError("integrate_box_1d() only handles 1D pdfs") 205 --> 206 stdev = np.ravel(sqrt(self.covariance))[0] 207 208 normalized_low = ravel((low - self.dataset)/stdev) : global name 'np' is not defined > c:\apps\python25\lib\site-packages\scipy\stats\kde.py(206)integrate_box_1d() 205 --> 206 stdev = np.ravel(sqrt(self.covariance))[0] 207 I had a quick look at the source and I think perhaps np should be numpy. Also ravel is explicitly imported from numpy in the file. John. From brian.clowers at pnl.gov Wed Nov 7 12:49:34 2007 From: brian.clowers at pnl.gov (Clowers, Brian H) Date: Wed, 7 Nov 2007 09:49:34 -0800 Subject: [SciPy-user] HDF5 Viewer for Python Message-ID: Gael, Thanks for the input. A while back I dabbled with Traits and look at your tutorial--thanks by the way. However, I was unsure how to use traits to develop some sort of Treeview for the HDF5 format. Which leads me to beg for your help...I'm sure your 'ugly' code is better than mine on just about any given day. If you'd be willing to share--even if it is to get me started I'd appreciate your help. The enthought packed looks very promising so now might be a good time for me to take it up. I upgraded my system to use the latest version of traits (I believe it is 3) but am unsure how to begin populating a tree. Any examples would be great. Thanks Brian -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From robert.kern at gmail.com Wed Nov 7 13:04:05 2007 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 07 Nov 2007 12:04:05 -0600 Subject: [SciPy-user] Undefined variable bug in scipy.stats.gaussian_kde.integrate_1d In-Reply-To: References: Message-ID: <4731FE15.7050503@gmail.com> John Reid wrote: > I get the following error when calling scipy.stats.gaussian_kde.integrate_1d > > c:\apps\python25\lib\site-packages\scipy\stats\kde.py in > integrate_box_1d(self, low, high) > 204 raise ValueError("integrate_box_1d() only handles > 1D pdfs") > 205 > --> 206 stdev = np.ravel(sqrt(self.covariance))[0] > 207 > 208 normalized_low = ravel((low - self.dataset)/stdev) > > : global name 'np' is not defined > > > c:\apps\python25\lib\site-packages\scipy\stats\kde.py(206)integrate_box_1d() > 205 > --> 206 stdev = np.ravel(sqrt(self.covariance))[0] > 207 > > I had a quick look at the source and I think perhaps np should be numpy. > Also ravel is explicitly imported from numpy in the file. When reporting errors, please state the version which you are using. In this case, the errors have already been fixed in SVN. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From s.mientki at ru.nl Wed Nov 7 13:14:54 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Wed, 07 Nov 2007 19:14:54 +0100 Subject: [SciPy-user] Is Elefant alive ? Message-ID: <4732009E.1010605@ru.nl> hello, Does anyone knows how I can reach one of the Elefant designers: http://elefant.developer.nicta.com.au/ I've spitted through all their mailinglists ;-) but couldn't find an address. 
I also subscribed to their mailing lists, but didn't hear anything for 2 days ;-) thanks, Stef Mientki From matthieu.brucher at gmail.com Wed Nov 7 13:20:24 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Wed, 7 Nov 2007 19:20:24 +0100 Subject: [SciPy-user] Is Elefant alive ? In-Reply-To: <4732009E.1010605@ru.nl> References: <4732009E.1010605@ru.nl> Message-ID: Did you try : admin at lineal.developer.nicta.com.au ? Matthieu 2007/11/7, Stef Mientki : > > hello, > > Does anyone know how I can reach one of the Elefant designers: > http://elefant.developer.nicta.com.au/ > > I've sifted through all their mailing lists ;-) > but couldn't find an address. > > I also subscribed to their mailing lists, > but didn't hear anything for 2 days ;-) > > thanks, > Stef Mientki > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From stef.mientki at gmail.com Wed Nov 7 13:32:29 2007 From: stef.mientki at gmail.com (stef mientki) Date: Wed, 07 Nov 2007 19:32:29 +0100 Subject: [SciPy-user] Is Elefant alive ? In-Reply-To: References: <4732009E.1010605@ru.nl> Message-ID: <473204BD.2000203@gmail.com> Matthieu Brucher wrote: > Did you try : admin at lineal.developer.nicta.com.au > ? thanks Matthieu, their email addresses seem to be scarce, but I found one, now wait again ;-) cheers, Stef From markbak at gmail.com Wed Nov 7 15:37:14 2007 From: markbak at gmail.com (Mark Bakker) Date: Wed, 7 Nov 2007 21:37:14 +0100 Subject: [SciPy-user] Geostatistics with Scipy?
Message-ID: <6946b9500711071237k17d76122ta2dc9e7fe505c0d8@mail.gmail.com> Hello - I have been looking for some package that can do Geostatistics with Scipy. Mostly Kriging of 2D data. I expect somebody has done something like this before. Does anybody know of some code somewhere? Thanks, Mark -------------- next part -------------- An HTML attachment was scrubbed... URL: From emin.shopper at gmail.com Wed Nov 7 15:37:28 2007 From: emin.shopper at gmail.com (Emin.shopper Martinian.shopper) Date: Wed, 7 Nov 2007 15:37:28 -0500 Subject: [SciPy-user] Is this a bug: scipy.cov([0,1]) = 0 In-Reply-To: <4731DB81.9050800@gmx.net> References: <32e43bb70711070710x65c42568rbdc4942743af534b@mail.gmail.com> <4731DB81.9050800@gmx.net> Message-ID: <32e43bb70711071237u78c7656bie3aef20d774c99cb@mail.gmail.com> Ahh, good point. I was using numpy version 1.0 and scipy version 0.5.2. I guess it's time to upgrade. Thanks, -Emin On Nov 7, 2007 10:36 AM, Steve Schmerler wrote: > > Emin.shopper Martinian.shopper wrote: > > Dear Experts, > > > > I recently noticed some strange behavior with scipy (and numpy as > > well). Doing scipy.cov([0,1]) returns 0 which I believe is incorrect. > > Doing scipy.cov([0.0,1.0]) returns a more reasonable answer so I > > suspect that the problem is that scipy is using integer math. > > > > Is this considered a bug? > > > > I can see no good reason to do integer math when computing a covariance. > > > > Works here. > > In [14]: scipy.stats.cov([0,1]) > Out[14]: 0.5 > > In [15]: scipy.cov([0,1]) > Out[15]: array(0.5) > > In [16]: numpy.cov([0,1]) > Out[16]: array(0.5) > > In [17]: numpy.cov([0.,1.]) > Out[17]: array(0.5) > > In [18]: scipy.cov([0.,1.]) > Out[18]: array(0.5) > > In [19]: scipy.stats.cov([0.,1.]) > Out[19]: 0.5 > > In [20]: scipy.__version__ > Out[20]: '0.7.0.dev3498' > > In [21]: numpy.__version__ > Out[21]: '1.0.5.dev4407' > > > -- > cheers, > steve > > Random number generation is the art of producing pure gibberish as quickly as > possible.
> _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From william.ratcliff at gmail.com Wed Nov 7 17:55:32 2007 From: william.ratcliff at gmail.com (william ratcliff) Date: Wed, 7 Nov 2007 17:55:32 -0500 Subject: [SciPy-user] openmp and weave? Message-ID: <827183970711071455m29f4869bo8c8312e0eca79a6e@mail.gmail.com> Has anyone had any luck using weave with openMP? If so, what did you have to do? I've started by updating my compiler to MinGW: gcc-4.2.1-dw-2-2 (and similarly for g++), but am running into problems with code written in weave that doesn't use any of openmp: Here is the code: import numpy as N import weave from weave import converters def blitz_interpolate(x,y): code = """ int pts = Ny[0]; //#pragma omp parallel for for (int i=0; i < pts-1; i++){ y(i) = sin( exp( cos( - exp( sin(x(i)) ) ) ) ); } return_val = 3; """ extra=["-fopenmp -Lc:/python25/ -lPthreadGC2"] extra=[] z=weave.inline(code,['x','y'],type_converters=converters.blitz,compiler='gcc') print z return if __name__=="__main__": x=N.arange(1000) y=N.zeros(x.shape,'d') blitz_interpolate(x,y) print x[35], y[35],N.sin(N.exp(N.cos(-N.exp(N.sin(x[35]))))) This works fine with version 3.4.2 of gcc, g++ Thanks, William From lev at vpac.org Wed Nov 7 18:08:08 2007 From: lev at vpac.org (Lev Lafayette) Date: Thu, 08 Nov 2007 10:08:08 +1100 Subject: [SciPy-user] SciPy test fails.. Message-ID: <1194476888.22909.1.camel@sys09.in.vpac.org> Hey people, I've just installed NUMPY and SciPy on our cluster and the standard test fails. Can someone give me some pointers on what's wrong? The relevant sections of the test follows. Thanks in advance, Lev Found 128 tests for scipy.lib.blas.fblas **************************************************************** WARNING: clapack module is empty ----------- See scipy/INSTALL.txt for troubleshooting. 
Notes: * If atlas library is not found by numpy/distutils/system_info.py, then scipy uses flapack instead of clapack. **************************************************************** Found 7 tests for scipy.linalg.matfuncs Warning: FAILURE importing tests for /usr/local/python/2.4.4-gcc/lib/python2.4/site-packages/scipy/linsolve/umfpack/tests/test_umfpack.py:17: AttributeError: 'module' object has no attribute 'umfpack' (in ?) Warning: FAILURE importing tests for /usr/local/python/2.4.4-gcc/lib/python2.4/site-packages/scipy/linsolve/umfpack/tests/test_umfpack.py:17: AttributeError: 'module' object has no attribute 'umfpack' (in ?) Found 0 tests for __main__ .../usr/local/python/2.4.4-gcc/lib/python2.4/site-packages/scipy/cluster/vq.py:477: UserWarning: One of the clusters is empty. Re-run kmean with a different initialization. warnings.warn("One of the clusters is empty. " exception raised as expected: One of the clusters is empty. Re-run kmean with a different initialization. ................................................Residual: 1.05006987327e-07 ..................../usr/local/python/2.4.4-gcc/lib/python2.4/site-packages/scipy/interpolate/fitpack2.py:458: UserWarning: The coefficients of the spline returned have been computed as the minimal norm least-squares solution of a (numerically) rank deficient system (deficiency=7). If deficiency is large, the results may be inaccurate. Deficiency may strongly depend on the value of eps. warnings.warn(message) ...... Don't worry about a warning regarding the number of bytes read. ..zswap:n=3 ..................................................................................... **************************************************************** WARNING: cblas module is empty ----------- See scipy/INSTALL.txt for troubleshooting. Notes: * If atlas library is not found by numpy/distutils/system_info.py, then scipy uses fblas instead of cblas. 
**************************************************************** ...........................................................................................caxpy:n=4 ..zswap:n=3 .......... **************************************************************** WARNING: clapack module is empty ----------- See scipy/INSTALL.txt for troubleshooting. Notes: * If atlas library is not found by numpy/distutils/system_info.py, then scipy uses flapack instead of clapack. **************************************************************** ...Result may be inaccurate, approximate err = 1.73156859734e-08 ...Result may be inaccurate, approximate err = 7.73070496507e-12 ............................................................................................................/usr/local/python/2.4.4-gcc/lib/python2.4/site-packages/scipy/ndimage/interpolation.py:41: UserWarning: Mode "reflect" may yield incorrect results on boundaries. Please use "mirror" instead. warnings.warn('Mode "reflect" may yield incorrect results on ' ........................................................................................................................................................................................................................................................................................................F..F.........................................................Use minimum degree ordering on A'+A. .....................................Use minimum degree ordering on A'+A. .....................................Use minimum degree ordering on A'+A. ................................Use minimum degree ordering on A'+A. 
....................................................................................................................................................................................................................................................................................................................................................0.2 0.2 0.2 ......0.2 ..0.2 0.2 0.2 0.2 0.2 .........................................................................................................................................................................................................Ties preclude use of exact statistic. ..Ties preclude use of exact statistic. ...... ====================================================================== FAIL: test_explicit (scipy.tests.test_odr.test_odr) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/python/2.4.4-gcc/lib/python2.4/site-packages/scipy/odr/tests/test_odr.py", line 49, in test_explicit np.array([ 1.2646548050648876e+03, -5.4018409956678255e+01, File "/usr/local/python/2.4.4-gcc/lib/python2.4/site-packages/numpy/testing/utils.py", line 230, in assert_array_almost_equal header='Arrays are not almost equal') File "/usr/local/python/2.4.4-gcc/lib/python2.4/site-packages/numpy/testing/utils.py", line 215, in assert_array_compare assert cond, msg AssertionError: Arrays are not almost equal (mismatch 100.0%) x: array([ 1.26462971e+03, -5.42545890e+01, -8.64250389e-02]) y: array([ 1.26465481e+03, -5.40184100e+01, -8.78497122e-02]) ====================================================================== FAIL: test_multi (scipy.tests.test_odr.test_odr) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/python/2.4.4-gcc/lib/python2.4/site-packages/scipy/odr/tests/test_odr.py", line 190, in test_multi np.array([ 4.3799880305938963, 2.4333057577497703, 8.0028845899503978, File 
"/usr/local/python/2.4.4-gcc/lib/python2.4/site-packages/numpy/testing/utils.py", line 230, in assert_array_almost_equal header='Arrays are not almost equal') File "/usr/local/python/2.4.4-gcc/lib/python2.4/site-packages/numpy/testing/utils.py", line 215, in assert_array_compare assert cond, msg AssertionError: Arrays are not almost equal (mismatch 100.0%) x: array([ 4.31272063, 2.44289312, 7.76215871, 0.55995622, 0.46423343]) y: array([ 4.37998803, 2.43330576, 8.00288459, 0.51011472, 0.51739023]) ---------------------------------------------------------------------- Ran 1716 tests in 3.670s FAILED (failures=2) From robert.kern at gmail.com Wed Nov 7 18:09:45 2007 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 07 Nov 2007 17:09:45 -0600 Subject: [SciPy-user] SciPy test fails.. In-Reply-To: <1194476888.22909.1.camel@sys09.in.vpac.org> References: <1194476888.22909.1.camel@sys09.in.vpac.org> Message-ID: <473245B9.2040202@gmail.com> Lev Lafayette wrote: > Hey people, > > I've just installed NUMPY and SciPy on our cluster and the standard test > fails. Can someone give me some pointers on what's wrong? These are fixed in SVN. http://scipy.org/scipy/scipy/ticket/357 -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From lev at vpac.org Wed Nov 7 18:31:56 2007 From: lev at vpac.org (Lev Lafayette) Date: Thu, 08 Nov 2007 10:31:56 +1100 Subject: [SciPy-user] SciPy test fails.. In-Reply-To: <473245B9.2040202@gmail.com> References: <1194476888.22909.1.camel@sys09.in.vpac.org> <473245B9.2040202@gmail.com> Message-ID: <1194478316.22909.3.camel@sys09.in.vpac.org> On Wed, 2007-11-07 at 17:09 -0600, Robert Kern wrote: > Lev Lafayette wrote: > > Hey people, > > > > I've just installed NUMPY and SciPy on our cluster and the standard test > > fails. Can someone give me some pointers on what's wrong? 
> > These are fixed in SVN. > > http://scipy.org/scipy/scipy/ticket/357 > Ahh, brilliant! Thank you! From samrobertsmith at gmail.com Wed Nov 7 19:49:13 2007 From: samrobertsmith at gmail.com (linda.s) Date: Wed, 7 Nov 2007 17:49:13 -0700 Subject: [SciPy-user] about MLab In-Reply-To: <472DF7C9.8020305@enthought.com> References: <1d987df30711040208l53e36652q909252d34ed6fcd1@mail.gmail.com> <472DF7C9.8020305@enthought.com> Message-ID: <1d987df30711071649j4fd428j4aa999a04c6aea2a@mail.gmail.com> On 11/4/07, Travis E. Oliphant wrote: > linda.s wrote: > > I have quite a few codes written in Numeric. > > MLab comes with Numeric too. > > I got confused when I began to use numpy because it does not have MLab > > and does not have some same functions in Numeric. > > is there any smooth way to change the code developed in Numeric to numpy? > > > > You should also read the first chapters of the book "Guide to NumPy" > which are available on-line for immediate download. There is a > comprehensive description of the differences between Numeric and NumPy > as well as tips for compatibility. > > You must also import numpy.oldnumeric before it is available. > > import numpy.oldnumeric > > numpy.oldnumeric.mlab > > You will notice that the module is simple a re-naming module. All of > the functionality is in numpy but under possibly different names. > > All functions in Numeric are available in NumPy (as far as I know). If > you encounter a situation where that is not true, please let me know. > > -Travis I still have the problem such as: >>> import numpy >>> numpy.oldnumeric.mlab Traceback (most recent call last): File "", line 1, in AttributeError: 'module' object has no attribute 'oldnumeric' I use Mac machine and did download and install Scipy... 
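For code being ported outright rather than run through the numpy.oldnumeric compatibility layer, the old MLab names generally map onto plain NumPy functions. A small illustrative sketch (only a sample of the mapping, not the full table from "Guide to NumPy"):

```python
import numpy as np

# A couple of old Numeric/MLab names and their plain-NumPy equivalents:
i3 = np.eye(3)                  # was MLab.eye
m = np.mean([1.0, 2.0, 3.0])    # was MLab.mean

print(i3.shape)  # (3, 3)
print(m)         # 2.0
```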
Linda From aisaac at american.edu Wed Nov 7 19:57:35 2007 From: aisaac at american.edu (Alan G Isaac) Date: Wed, 7 Nov 2007 19:57:35 -0500 Subject: [SciPy-user] about MLab In-Reply-To: <1d987df30711071649j4fd428j4aa999a04c6aea2a@mail.gmail.com> References: <1d987df30711040208l53e36652q909252d34ed6fcd1@mail.gmail.com><472DF7C9.8020305@enthought.com><1d987df30711071649j4fd428j4aa999a04c6aea2a@mail.gmail.com> Message-ID: On Wed, 7 Nov 2007, "linda.s" apparently wrote: > I still have the problem such as: > >>> import numpy > >>> numpy.oldnumeric.mlab That is incorrect. Apparently you missed my post: >>> from numpy.oldnumeric import mlab hth, Alan Isaac From garyrob at mac.com Wed Nov 7 20:43:29 2007 From: garyrob at mac.com (Gary Robinson) Date: Wed, 7 Nov 2007 20:43:29 -0500 Subject: [SciPy-user] genetic algorithms Message-ID: <20071107204329385576.30c69279@mac.com> I've seen a few references to a GA module in scipy, as either scipy.ga or sandbox.ga. But I can't find it in scipy in either of those locations, or elsewhere. (I have version 0.6.) Is there still such a module? If so, where can I find it? BTW, while I'm at it, I'll mention that the INSTALL.TXT file for scipy specifies that you need to have 2.3.x or 2.4.x. I'm using it with 2.5 and it seems OK -- does it really need an earlier version or does that file need to be updated? Thanks, Gary -- Gary Robinson CTO Emergent Music, LLC garyrob at mac.com 207-942-3463 Company: http://www.goombah.com Blog: http://www.garyrobinson.net From robert.kern at gmail.com Wed Nov 7 20:46:13 2007 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 07 Nov 2007 19:46:13 -0600 Subject: [SciPy-user] genetic algorithms In-Reply-To: <20071107204329385576.30c69279@mac.com> References: <20071107204329385576.30c69279@mac.com> Message-ID: <47326A65.6070705@gmail.com> Gary Robinson wrote: > I've seen a few references to a GA module in scipy, as either scipy.ga or sandbox.ga. 
> > But I can't find it in scipy in either of those locations, or elsewhere. (I have version 0.6.) > > Is there still such a module? If so, where can I find it? scipy/sandbox/ga/ Nothing in the sandbox is built by default. See scipy/sandbox/setup.py for instructions on enabling them. I've done some work on updating ga for numpy, but I have not tested any of it. If you get it to work, please let us know. > BTW, while I'm at it, I'll mention that the INSTALL.TXT file for scipy specifies that you need to have 2.3.x or 2.4.x. I'm using it with 2.5 and it seems OK -- does it really need an earlier version or does that file need to be updated? It should say "at least 2.3". -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From samrobertsmith at gmail.com Wed Nov 7 21:32:48 2007 From: samrobertsmith at gmail.com (linda.s) Date: Wed, 7 Nov 2007 19:32:48 -0700 Subject: [SciPy-user] about MLab In-Reply-To: References: <1d987df30711040208l53e36652q909252d34ed6fcd1@mail.gmail.com> <472DF7C9.8020305@enthought.com> <1d987df30711071649j4fd428j4aa999a04c6aea2a@mail.gmail.com> Message-ID: <1d987df30711071832n5c877952n71cb9404f91fa489@mail.gmail.com> On 11/7/07, Alan G Isaac wrote: > On Wed, 7 Nov 2007, "linda.s" apparently wrote: > > I still have the problem such as: > > >>> import numpy > > >>> numpy.oldnumeric.mlab > > That is incorrect. > Apparently you missed my post: > >>> from numpy.oldnumeric import mlab > > hth, > Alan Isaac > Thanks, Linda From sinclaird at ukzn.ac.za Thu Nov 8 01:29:30 2007 From: sinclaird at ukzn.ac.za (Scott Sinclair) Date: Thu, 08 Nov 2007 08:29:30 +0200 Subject: [SciPy-user] Geostatistics with Scipy? 
In-Reply-To: <6946b9500711071237k17d76122ta2dc9e7fe505c0d8@mail.gmail.com> References: <6946b9500711071237k17d76122ta2dc9e7fe505c0d8@mail.gmail.com> Message-ID: <4732C867.F934.009F.0@ukzn.ac.za> Hi, I'd be keen to hear about what you uncover... Options I'm aware of: 1. Use the gstat R package via RPy (http://www.gstat.org/). 2. Inelegant but effective: install the gstat executable on your system and call it using os.system(). This means that you need to write the data points to a file, write an appropriate gstat command file and then read the gstat output files from your Python code. It makes you feel dirty, but means that you can solve your problem easily within a short time frame. 3. Wrap gstat for Python. I have vague plans of trying to do this one day, but don't hold your breath. 4. Have a go at wrapping something like the Geostatistics Template Library (http://gstl.sourceforge.net/) 5. Or try hacking the Python interface to SGeMS (http://sgems.sourceforge.net/), which is based on the gstl code. From what I remember, SGeMS embeds the Python interpreter and has some kind of scripting interface that may (or may not) be easy to call from your Python code. 6. Implement your own Kriging algorithm. 7. There's also gslib (http://www.gslib.com/) lurking around. I don't recall why I didn't find it particularly interesting. Hope that helps to direct your investigation. Cheers, Scott >>> "Mark Bakker" 11/7/2007 22:37 >>> Hello - I have been looking for some package that can do Geostatistics with Scipy. Mostly Kriging of 2D data. I expect somebody has done something like this before. Does anybody know of some code somewhere? Thanks, Mark Please find our Email Disclaimer here: http://www.ukzn.ac.za/disclaimer/ -------------- next part -------------- An HTML attachment was scrubbed...
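On option 6 (implementing your own): for small 2D problems, ordinary kriging reduces to one dense linear solve per set of targets. A minimal NumPy sketch (all names are hypothetical and the variogram model is left to the caller; this is an illustration, not production geostatistics code):

```python
import numpy as np

def ordinary_kriging(xy, values, targets, variogram):
    """Ordinary kriging sketch: for the semivariogram formulation, solve
    the bordered system [Gamma 1; 1' 0][w; mu] = [gamma0; 1], then
    estimate each target as the weighted sum of the sample values.

    xy        : (n, 2) sample locations
    values    : (n,) sample values
    targets   : (m, 2) prediction locations
    variogram : callable gamma(h) mapping distance to semivariance
    """
    n = len(xy)
    d_ss = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)       # (n, n)
    d_st = np.linalg.norm(xy[:, None, :] - targets[None, :, :], axis=-1)  # (n, m)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d_ss)
    A[n, n] = 0.0
    b = np.ones((n + 1, len(targets)))
    b[:n, :] = variogram(d_st)
    w = np.linalg.solve(A, b)   # kriging weights plus Lagrange multiplier
    return values @ w[:n, :]

# Three samples and a linear variogram; kriging is an exact interpolator,
# so predicting at a sample location reproduces the sampled value (~1.0).
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
z = np.array([1.0, 2.0, 3.0])
print(ordinary_kriging(xy, z, np.array([[0.0, 0.0]]), lambda h: h))
```

Because the weights are constrained to sum to one, the sketch also reproduces a constant field exactly, which is a cheap correctness check.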
URL: From w.richert at gmx.net Thu Nov 8 03:20:45 2007 From: w.richert at gmx.net (Willi Richert) Date: Thu, 8 Nov 2007 09:20:45 +0100 Subject: [SciPy-user] Where is the border between SciPy and ML algorithms? Message-ID: <200711080920.45261.w.richert@gmx.net> Hi, the scipy.cluster module up to now contains already algorithms for Vector Quantization and Kmeans. In the docs it's said that SOMs are following. Now, after having followed the mails mentioning Orange and Elefant: Where do we draw the line? I'm asking, because a student of mine has implemented SWIG-Python-wrappers to the C++ implementations of diverse decision trees (continuous VFDT, C4.5 and some others), which work quite well. We would of course be glad if those would be included in standard SciPy. However, I'm not quite sure, what the current policy of the core SciPy developers is regarding non-core algorithms. Thanks in advance for any light on this topic, wr From matthieu.brucher at gmail.com Thu Nov 8 03:26:59 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Thu, 8 Nov 2007 09:26:59 +0100 Subject: [SciPy-user] Where is the border between SciPy and ML algorithms? In-Reply-To: <200711080920.45261.w.richert@gmx.net> References: <200711080920.45261.w.richert@gmx.net> Message-ID: Hi, As there are SWIG files and as Scipy does not depend ATM on SWIG, you could create a scikit with or see with David Cournapeau how it could fit in its machine learning scikit. Matthieu 2007/11/8, Willi Richert : > > Hi, > > the scipy.cluster module up to now contains already algorithms for Vector > Quantization and Kmeans. In the docs it's said that SOMs are following. > Now, > after having followed the mails mentioning Orange and Elefant: Where do we > draw the line? > > I'm asking, because a student of mine has implemented SWIG-Python-wrappers > to > the C++ implementations of diverse decision trees (continuous VFDT, C4.5and > some others), which work quite well. 
We would of course be glad if those > would be included in standard SciPy. However, I'm not quite sure, what the > current policy of the core SciPy developers is regarding non-core > algorithms. > > Thanks in advance for any light on this topic, > wr > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From anand.prabhakar.patil at gmail.com Thu Nov 8 03:46:02 2007 From: anand.prabhakar.patil at gmail.com (Anand Patil) Date: Thu, 8 Nov 2007 08:46:02 +0000 Subject: [SciPy-user] Geostatistics with Scipy? Message-ID: <2bc7a5a50711080046u1d3aa1carc3dbbd9365c41516@mail.gmail.com> Mark Bakker wrote: > > Hello - > > I have been looking for some package that can doe Geostatistics with Scipy. > Mostly Kriging of 2D data. > > I expect somebody has done something like this before. > > Does anybody know of some code somewhere? > > Thanks, > > Mark Hi Mark, You could try the observe() function in my Gaussian process package at code.google.com/p/random-realizations. Don't tell anyone, I haven't released it yet. ;) Email me if you try it and need help. Anand From david at ar.media.kyoto-u.ac.jp Thu Nov 8 04:12:19 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Thu, 08 Nov 2007 18:12:19 +0900 Subject: [SciPy-user] Where is the border between SciPy and ML algorithms? In-Reply-To: <200711080920.45261.w.richert@gmx.net> References: <200711080920.45261.w.richert@gmx.net> Message-ID: <4732D2F3.3060506@ar.media.kyoto-u.ac.jp> Willi Richert wrote: > Hi, > > the scipy.cluster module up to now contains already algorithms for Vector > Quantization and Kmeans. 
In the docs it's said that SOMs are following. Now, > after having followed the mails mentioning Orange and Elefant: Where do we > draw the line? > My understanding is that we do not throw everything we can think of in scipy; scipy.cluster was kept in scipy for backward compatibility? That being said, I don't remember having seen any mention of a line between core and non-core algorithms. Maybe the main difference between scikits and scipy is the license: whereas scipy avoids any non-BSD-like license, scikits is freer in this respect (you can depend on GPL code, for example). > I'm asking, because a student of mine has implemented SWIG-Python-wrappers to > the C++ implementations of diverse decision trees (continuous VFDT, C4.5 and > some others), which work quite well. We would of course be glad if those > would be included in standard SciPy. However, I'm not quite sure what the > current policy of the core SciPy developers is regarding non-core algorithms. > If this is under an open source license, as Matthieu said, we (I) can import it into the learn scikits, which already implements several machine learning algorithms (EM for Gaussian mixtures, datasets, SVM). cheers, David From fredmfp at gmail.com Thu Nov 8 04:28:46 2007 From: fredmfp at gmail.com (fred) Date: Thu, 08 Nov 2007 10:28:46 +0100 Subject: [SciPy-user] Geostatistics with Scipy? In-Reply-To: <6946b9500711071237k17d76122ta2dc9e7fe505c0d8@mail.gmail.com> References: <6946b9500711071237k17d76122ta2dc9e7fe505c0d8@mail.gmail.com> Message-ID: <4732D6CE.7000002@gmail.com> Mark Bakker a écrit : > Does anybody know of some code somewhere? In my own company (http://www.estimages.com), I develop our kriging codes with scipy+f2py+traits (chaco2 & mayavi2 for the 2D & 3D data viz') for the UI. But my answer is quite useless, as we don't share our python code for now. Anyway, did you look at this: http://www.ai-geostats.org Don't know if there is python code.
Cheers, -- http://scipy.org/FredericPetit From j.reid at mail.cryst.bbk.ac.uk Thu Nov 8 04:53:15 2007 From: j.reid at mail.cryst.bbk.ac.uk (John Reid) Date: Thu, 08 Nov 2007 09:53:15 +0000 Subject: [SciPy-user] Undefined variable bug in scipy.stats.gaussian_kde.integrate_1d In-Reply-To: <4731FE15.7050503@gmail.com> References: <4731FE15.7050503@gmail.com> Message-ID: Robert Kern wrote: > When reporting errors, please state the version which you are using. In this > case, the errors have already been fixed in SVN. > How has it been fixed in SVN? by removing the 'np.' ? From oliphant at enthought.com Thu Nov 8 10:15:23 2007 From: oliphant at enthought.com (Travis E. Oliphant) Date: Thu, 08 Nov 2007 09:15:23 -0600 Subject: [SciPy-user] about MLab In-Reply-To: References: <1d987df30711040208l53e36652q909252d34ed6fcd1@mail.gmail.com><472DF7C9.8020305@enthought.com><1d987df30711071649j4fd428j4aa999a04c6aea2a@mail.gmail.com> Message-ID: <4733280B.6040804@enthought.com> Alan G Isaac wrote: > On Wed, 7 Nov 2007, "linda.s" apparently wrote: > >> I still have the problem such as: >> >>>>> import numpy >>>>> numpy.oldnumeric.mlab >>>>> > > That is incorrect. > Apparently you missed my post: > >>>> from numpy.oldnumeric import mlab >>>> > > You need to import oldnumeric specifically (it is not loaded by default). import numpy.oldnumeric Then, you will have numpy.oldnumeric.mlab -Travis From garyrob at mac.com Thu Nov 8 11:42:41 2007 From: garyrob at mac.com (Gary Robinson) Date: Thu, 8 Nov 2007 11:42:41 -0500 Subject: [SciPy-user] genetic algorithms Message-ID: <20071108114241470129.c7dd7bf8@mac.com> > scipy/sandbox/ga/ > > Nothing in the sandbox is built by default. See scipy/sandbox/setup.py for > instructions on enabling them. It's not in scipy/sandbox/setup.py. setup.py has the appropriate lines for 'montecarlo', 'spline', and others, but no 'ga'. Again, I have version = '0.6.0', according to scipy/version.py. 
-- Gary Robinson CTO Emergent Music, LLC garyrob at mac.com 207-942-3463 Company: http://www.goombah.com Blog: http://www.garyrobinson.net From robert.kern at gmail.com Thu Nov 8 13:14:03 2007 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 08 Nov 2007 12:14:03 -0600 Subject: [SciPy-user] genetic algorithms In-Reply-To: <20071108114241470129.c7dd7bf8@mac.com> References: <20071108114241470129.c7dd7bf8@mac.com> Message-ID: <473351EB.3070609@gmail.com> Gary Robinson wrote: >> scipy/sandbox/ga/ >> >> Nothing in the sandbox is built by default. See scipy/sandbox/setup.py for >> instructions on enabling them. > > It's not in scipy/sandbox/setup.py. setup.py has the appropriate lines for 'montecarlo', 'spline', and others, but no 'ga'. I said to read the instructions in there, namely this: # All subpackages should be commented out in the version # committed to the repository. This prevents build problems # for people who are not actively working with these # potentially unstable packages. # You can put a list of modules you want to always enable in the # file 'enabled_packages.txt' in this directory (you'll have to create it). # Since this isn't under version control, it's less likely you'll # check it in and screw other people up :-) -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From robert.kern at gmail.com Thu Nov 8 13:28:06 2007 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 08 Nov 2007 12:28:06 -0600 Subject: [SciPy-user] Undefined variable bug in scipy.stats.gaussian_kde.integrate_1d In-Reply-To: References: <4731FE15.7050503@gmail.com> Message-ID: <47335536.9050000@gmail.com> John Reid wrote: > > Robert Kern wrote: >> When reporting errors, please state the version which you are using. In this >> case, the errors have already been fixed in SVN. > > How has it been fixed in SVN? 
by removing the 'np.' ? Yes. You can find this information yourself, too: http://scipy.org/scipy/scipy/browser/trunk/scipy/stats/kde.py http://scipy.org/scipy/scipy/log/trunk/scipy/stats/kde.py?action=stop_on_copy&rev=3473&stop_rev=&mode=follow_copy -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From garyrob at mac.com Thu Nov 8 13:32:05 2007 From: garyrob at mac.com (Gary Robinson) Date: Thu, 8 Nov 2007 13:32:05 -0500 Subject: [SciPy-user] genetic algorithms In-Reply-To: <473351EB.3070609@gmail.com> References: <20071108114241470129.c7dd7bf8@mac.com> <473351EB.3070609@gmail.com> Message-ID: <20071108133205031383.c5f07871@mac.com> Ah. For no good reason I was assuming that if it wasn't one of the packages listed in the commented code in setup.py, it wasn't available. Thanks. -- Gary Robinson CTO Emergent Music, LLC garyrob at mac.com 207-942-3463 Company: http://www.goombah.com Blog: http://www.garyrobinson.net On Thu, 08 Nov 2007 12:14:03 -0600, Robert Kern wrote: > Gary Robinson wrote: >>> scipy/sandbox/ga/ >>> >>> Nothing in the sandbox is built by default. See scipy/sandbox/setup.py for >>> instructions on enabling them. >> >> It's not in scipy/sandbox/setup.py. setup.py has the appropriate >> lines for 'montecarlo', 'spline', and others, but no 'ga'. > > I said to read the instructions in there, namely this: > > # All subpackages should be commented out in the version > # committed to the repository. This prevents build problems > # for people who are not actively working with these > # potentially unstable packages. > > # You can put a list of modules you want to always enable in the > # file 'enabled_packages.txt' in this directory (you'll have to > create it). 
> # Since this isn't under version control, it's less likely you'll > # check it in and screw other people up :-) From david.huard at gmail.com Thu Nov 8 16:33:21 2007 From: david.huard at gmail.com (David Huard) Date: Thu, 8 Nov 2007 16:33:21 -0500 Subject: [SciPy-user] Argos data Message-ID: <91cf711d0711081333v8be1cd1y9677243eaf296f75@mail.gmail.com> Hi, Is there someone working with Argos satellite data ? I'd need some utility to process the data files and wanted to know what are the existing tools. Thanks, David From faltet at carabos.com Fri Nov 9 04:37:17 2007 From: faltet at carabos.com (Francesc Altet) Date: Fri, 9 Nov 2007 10:37:17 +0100 Subject: [SciPy-user] Argos data In-Reply-To: <91cf711d0711081333v8be1cd1y9677243eaf296f75@mail.gmail.com> References: <91cf711d0711081333v8be1cd1y9677243eaf296f75@mail.gmail.com> Message-ID: <200711091037.19270.faltet@carabos.com> On Thursday, 08 November 2007, David Huard wrote: > Hi, > > Is there someone working with Argos satellite data ? I'd need some > utility to process the data files and wanted to know what are the > existing tools. It would help if you knew the kind of format. My guess is that it should be netCDF, but if you can tell us which extension these files have, it would be better. Also, if you are on a Unix system, trying the next command out: $ file your_data_file should tell you the format of the file (if it is known enough). In case it is a netCDF file (either v3 or v4), you can use the netCDF4 python module: http://netcdf4-python.googlecode.com/svn/trunk/docs/netCDF4-module.html HTH, -- >0,0< Francesc Altet http://www.carabos.com/ V V Cárabos Coop. V.
Enjoy Data "-" From lorenzo.isella at gmail.com Fri Nov 9 05:08:35 2007 From: lorenzo.isella at gmail.com (Lorenzo Isella) Date: Fri, 9 Nov 2007 11:08:35 +0100 Subject: [SciPy-user] Introduction to Image Analysis Message-ID: Dear All, For my own personal interest, I would like to learn something about analyzing images with Python. The number of modules, scripts etc. "out there" can be a bit confusing, especially if you are starting from scratch. My purpose is to learn the ropes with some easy-going tutorial (if it relied on scipy it would be a huge plus for me). The interest comes from the fact that several job positions I would like to apply for involve dealing with satellite images (to detect common structures, take statistics etc.). Any suggestions? Kind Regards Lorenzo From schut at sarvision.nl Fri Nov 9 05:27:44 2007 From: schut at sarvision.nl (Vincent Schut) Date: Fri, 09 Nov 2007 11:27:44 +0100 Subject: [SciPy-user] Argos data In-Reply-To: <91cf711d0711081333v8be1cd1y9677243eaf296f75@mail.gmail.com> References: <91cf711d0711081333v8be1cd1y9677243eaf296f75@mail.gmail.com> Message-ID: David Huard wrote: > Hi, > > Is there someone working with Argos satellite data ? I'd need some > utility to process the data files and wanted to know what are the > existing tools. From a quick investigation, is it netCDF? I'd recommend checking out gdal: an open source C library for reading/writing a multitude of geospatial raster formats, including stuff like projection info etc. It has a very active community with great support (have a look at the mailing list archives!), even for implementing new features/data formats if necessary. And best of all, it has pretty good python bindings which will allow you to very easily read (part of) your data into a numpy array. OK, enough propaganda. Have a look yourself: www.gdal.org. Cheers, Vincent.
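Once the data is in a numpy array, the quick statistics Lorenzo asks about above take only a few lines. A minimal sketch on a synthetic image (the array, threshold, and "feature" are all made up for the demonstration; no real satellite data is involved):

```python
import numpy as np

# Synthetic stand-in for a satellite image: a bright rectangular feature
# on a dim noisy background.
rng = np.random.RandomState(42)
img = rng.rand(64, 64) * 0.1
img[20:40, 25:45] += 1.0

# Whole-image statistics, as one would compute on real imagery.
print("mean %.3f, std %.3f" % (img.mean(), img.std()))

# Detect the bright structure by simple thresholding and report its extent.
mask = img > 0.5
rows, cols = np.nonzero(mask)
print("feature spans rows %d-%d, cols %d-%d"
      % (rows.min(), rows.max(), cols.min(), cols.max()))
```

For anything beyond thresholding (labelling, morphology, filtering), scipy.ndimage is the natural next stop.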
> > > Thanks, > > David > > > > ------------------------------------------------------------------------ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From chanley at stsci.edu Fri Nov 9 08:29:17 2007 From: chanley at stsci.edu (Christopher Hanley) Date: Fri, 09 Nov 2007 08:29:17 -0500 Subject: [SciPy-user] Introduction to Image Analysis In-Reply-To: References: Message-ID: <473460AD.5020804@stsci.edu> Lorenzo Isella wrote: > Dear All, > For my own personal interest, I would like to learn something about > analyzing images with Python. > The number of modules, scripts etc..."out there" can be a bit > confusing, especially if you are starting from scratch. > My purpose is to learn the ropes with some easy-going tutorial (if it > relied on scipy it would be a huge plus for me). > The interest comes from the fact that several job positions I would > like to apply for imply dealing with satellite images (to detect > common structures, take statistics etc...). > Any suggestions? > Kind Regards > > Lorenzo > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user Lorenzo, You could check out this tutorial: http://www.scipy.org/wikis/topical_software/Tutorial The examples use astronomical data since it was written by a couple of people here at Space Telescope. However, it is pretty generic. Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From vel.accel at gmail.com Fri Nov 9 11:50:22 2007 From: vel.accel at gmail.com (D Hering) Date: Fri, 9 Nov 2007 11:50:22 -0500 Subject: [SciPy-user] sandbox.timeseries missing cseries Message-ID: <1e52e0880711090850q4439d944o26b7f3a8f210b214@mail.gmail.com> Hi. 
Just to let you know that sandbox.timeseries __init__.py is missing "import cseries" and in "__all__ = [...,'cseries']". dieter From pgmdevlist at gmail.com Fri Nov 9 12:12:43 2007 From: pgmdevlist at gmail.com (Pierre GM) Date: Fri, 9 Nov 2007 12:12:43 -0500 Subject: [SciPy-user] sandbox.timeseries missing cseries In-Reply-To: <1e52e0880711090850q4439d944o26b7f3a8f210b214@mail.gmail.com> References: <1e52e0880711090850q4439d944o26b7f3a8f210b214@mail.gmail.com> Message-ID: <200711091212.45103.pgmdevlist@gmail.com> On Friday 09 November 2007 11:50:22 D Hering wrote: > Hi. > > Just to let you know that sandbox.timeseries __init__.py is missing > "import cseries" and in "__all__ = [...,'cseries']". Not needed. The functions in cseries are already available through dates.py, which is loaded in timeseries.__init__ From david.huard at gmail.com Fri Nov 9 12:49:08 2007 From: david.huard at gmail.com (David Huard) Date: Fri, 9 Nov 2007 12:49:08 -0500 Subject: [SciPy-user] Argos data In-Reply-To: References: <91cf711d0711081333v8be1cd1y9677243eaf296f75@mail.gmail.com> Message-ID: <91cf711d0711090949w1af50e2cv19e6e4ff07fe28e8@mail.gmail.com> Thanks guys, I'm pretty new at this, so I realize I should have mentioned it's data from an Ice Mass Balance buoy. The files I got are just text files with lines describing the satellite pass, then the buoy position and a hexadecimal number for each sensor (temperature profile in the ice, water and air, pressure). They are downloaded from argos-system using a telnet script in a format called PRV,DS (poetic, isn't it?). Writing a script to parse all this is not necessarily complicated, but I'm guessing it has already been done, though maybe not in python. Something for the dataset scikit, maybe? Cheers, David 2007/11/9, Vincent Schut : David Huard wrote: > > Hi, > > > > Is there someone working with Argos satellite data ? I'd need some > > utility to process the data files and wanted to know what are the > > existing tools.
> > From quick investigation it is netCDF? > I'd recommend checking out gdal: an open source C library for > reading/writing a multitude of geospatial raster formats, including > stuff like projection info etc. It has a very active community with > great support (have a look at the mailing list archives!), even for > implementing new features/data formats if necessary. And best of all, it > has pretty good python bindings which will allow you to very easy read > (part of) your data into a numpy array. > > OK, enough propaganda. Have a look yourself: www.gdal.org. > > Cheers, > Vincent. > > > > > > Thanks, > > > > David From vel.accel at gmail.com Fri Nov 9 12:54:00 2007 From: vel.accel at gmail.com (dieter h.) Date: Fri, 9 Nov 2007 12:54:00 -0500 Subject: [SciPy-user] sandbox.timeseries missing cseries In-Reply-To: <200711091212.45103.pgmdevlist@gmail.com> References: <1e52e0880711090850q4439d944o26b7f3a8f210b214@mail.gmail.com> <200711091212.45103.pgmdevlist@gmail.com> Message-ID: <1e52e0880711090954k10c694a6w5a9d08a2d6ffbc7c@mail.gmail.com> On Nov 9, 2007 12:12 PM, Pierre GM wrote: > > On Friday 09 November 2007 11:50:22 D Hering wrote: > > Hi. > > > > Just to let you know that sandbox.timeseries __init__.py is missing > > "import cseries" and in "__all__ = [...,'cseries']". > > Not needed.
The functions in cseries are already available through dates.py, > which is loaded in timeseries.__init__ Hmm, OK, but I got a traceback when trying to import timeseries into the interpreter. timeseries imports const.py and const.py wanted cseries. After including cseries (as mentioned above) in __init__.py for the package, I was able to import timeseries. -dieter From zpincus at stanford.edu Fri Nov 9 14:34:11 2007 From: zpincus at stanford.edu (Zachary Pincus) Date: Fri, 9 Nov 2007 14:34:11 -0500 Subject: [SciPy-user] KDE question Message-ID: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> Hello all, I'm using scipy.stats.gaussian_kde to estimate a probability density based on some scattered 2D samples.
What I would like to do > eventually is plot a line that surrounds (say) 90% of the probability > density. > > Clearly this line corresponds to some contour of the density function > (which I can extract out easily enough from KDE density estimates on > a grid), but I'm not sure how to figure out what exact contour value > it is that I want. > > Given the KDE density estimates on a grid, I could empirically figure > out which value is closest, but I was wondering if there's a simpler > closed-form solution... (And apologies if it's somehow obvious; I'm > feeling a bit foggy-headed about this.) I don't think there is a closed-form solution for n>1-D. It shouldn't be too hard to evaluate the KDE on a grid, then search for the value between 0 and the peak where the integral is around 90%. from scipy import stats from scipy.optimize import brentq k = stats.gaussian_kde(data) xy = ... # the grid dxdy = ... # the area represented by each grid point z = k(xy) peak = z.max() # You can't quite do this since the function is discontinuous, and never # actually equals 0, but you can do a search along these lines. # Exercise left for the reader. def f(s): mask = z > s return 0.9 - (z[mask].sum() * dxdy) brentq(f, 0, peak) -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From rex at nosyntax.net Fri Nov 9 15:39:21 2007 From: rex at nosyntax.net (rex) Date: Fri, 9 Nov 2007 12:39:21 -0800 Subject: [SciPy-user] KDE question In-Reply-To: <4734BF4E.1090102@gmail.com> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> Message-ID: <20071109203921.GB17405@nosyntax.net> Robert Kern [2007-11-09 12:11]: >from scipy import stats >from scipy.optimize import brentq $ python Python 2.5 (release25-maint, Dec 9 2006, 14:35:53) [GCC 4.1.2 20061115 (prerelease) (Debian 4.1.1-20)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import numpy >>> import scipy >>> from scipy import stats Traceback (most recent call last): File "", line 1, in File "/usr/lib/python2.5/site-packages/scipy/stats/__init__.py", line 7, in from stats import * File "/usr/lib/python2.5/site-packages/scipy/stats/stats.py", line 191, in import scipy.special as special File "/usr/lib/python2.5/site-packages/scipy/special/__init__.py", line 8, in from basic import * File "/usr/lib/python2.5/site-packages/scipy/special/basic.py", line 8, in from _cephes import * ImportError: /usr/lib/python2.5/site-packages/scipy/special/_cephes.so: undefined symbol: mtherr This is with numpy-1.04 & scipy 0.60 compiled with icc & ifort 10.1 & using MKL 10.0. Any hints appreciated (I took your suggestion to drop SUSE ;). Thanks, -rex From zpincus at stanford.edu Fri Nov 9 16:26:15 2007 From: zpincus at stanford.edu (Zachary Pincus) Date: Fri, 9 Nov 2007 16:26:15 -0500 Subject: [SciPy-user] KDE question In-Reply-To: <4734BF4E.1090102@gmail.com> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> Message-ID: Thanks a lot, Robert -- I appreciate your time and suggestions! 
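Robert's outline above can be completed. A self-contained sketch of the same search, hedged in two ways so it can stand alone: an analytic standard bivariate normal replaces the gaussian_kde object (so the exact 90% level, 0.1/(2*pi), is known in advance), and a plain bisection replaces brentq:

```python
import numpy as np

# Stand-in for the KDE: a standard bivariate normal evaluated on a grid.
n = 600
xs = np.linspace(-5.0, 5.0, n)
dxdy = (xs[1] - xs[0]) ** 2       # area represented by each grid point
X, Y = np.meshgrid(xs, xs)
z = np.exp(-(X ** 2 + Y ** 2) / 2.0) / (2.0 * np.pi)

def mass_above(s):
    """Probability mass enclosed by the density contour at level s."""
    return z[z > s].sum() * dxdy

# Bisect for the level whose contour encloses 90% of the mass
# (mass_above decreases monotonically as s rises).
lo, hi = 0.0, z.max()
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if mass_above(mid) > 0.9:
        lo = mid
    else:
        hi = mid
level = 0.5 * (lo + hi)

print("estimated level %.5f, exact %.5f" % (level, 0.1 / (2.0 * np.pi)))
```

With a real KDE the same loop applies unchanged; only the construction of `z` differs, and the accuracy of `level` is limited by the grid spacing.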
Zach On Nov 9, 2007, at 3:13 PM, Robert Kern wrote: > Zachary Pincus wrote: >> Hello all, >> >> I'm using scipy.stats.gaussian_kde to estimate the a probability >> density based on some scattered 2D samples. What I would like to do >> eventually is plot a line that surrounds (say) 90% of the probability >> density. >> >> Clearly this line corresponds to some contour of the density function >> (which I can extract out easily enough from KDE density estimates on >> a grid), but I'm not sure how to figure out what exact contour value >> it is that I want. >> >> Given the KDE density estimates on a grid, I could empirically figure >> out which value is closest, but I was wondering if there's a simpler >> closed-form solution... (And apologies if it's somehow obvious; I'm >> feeling a bit foggy-headed about this.) > > I don't think there is a closed-form solution for n>1-D. It > shouldn't be too > hard to evaluate the KDE on a grid, then search for the value > between 0 and the > peak where the integral is around 90%. > > > from scipy import stats > from scipy.optimize import brentq > > k = stats.gaussian_kde(data) > xy = ... # the grid > dxdy = ... # the area represented by each grid point > z = k(xy) > peak = z.max() > > # You can't quite do this since the function is discontinuous, and > never > # actually equals 0, but you can do a search along these lines. > # Exercise left for the reader. > def f(s): > mask = z > s > return 0.9 - (z[mask].sum() * dxdy) > > brentq(f, 0, peak) > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a > harmless enigma > that is made terrible by our own mad attempt to interpret it as > though it had > an underlying truth." 
> -- Umberto Eco From yarlett at psych.stanford.edu Fri Nov 9 16:33:23 2007 From: yarlett at psych.stanford.edu (Daniel Yarlett) Date: Fri, 9 Nov 2007 13:33:23 -0800 Subject: [SciPy-user] Problem installing scipy 0.6.0 on PPC OS X 10.5 Leopard. Message-ID: <16DF925E-0844-48C5-806D-E9F00488D862@psych.stanford.edu> hi, apologies if this is not the appropriate forum for posting this, but i am having some trouble building and installing scipy 0.6.0 on a PPC Mac that i have just upgraded to OS X 10.5 (this version of scipy previously worked fine on the same machine under OS X 10.4). numpy 1.0.4 seems to install fine with "python setup.py install". however, when i try to install scipy 0.6.0 i get the output (tail only) included at the end of this email. here are my compiler versions: gcc --version gcc (GCC) 4.3.0 20071026 (experimental) gfortran --version GNU Fortran (GCC) 4.3.0 20071026 (experimental) i remember that on previous versions of OS X you could use gcc_select to specify v3.4 of gcc to make the build go through. unfortunately gcc_select seems to have been deprecated in Leopard so i haven't been able to try that. don't know if that's a red herring or not. also, fwiw, i just upgraded an intel MacBook Pro from OS X 10.4 to 10.5 and had no problems at all with the pre-existing versions of numpy and scipy. anyway, any help would be greatly appreciated! best, dan.
-- END OF OUTPUT FROM "python setup.py install" for scipy 0.6.0: Could not locate executable f95 customize AbsoftFCompiler Could not locate executable f90 Could not locate executable f77 customize IBMFCompiler Could not locate executable xlf90 Could not locate executable xlf customize IntelFCompiler Could not locate executable ifort Could not locate executable ifc customize GnuFCompiler Found executable /usr/local/bin/g77 gnu: no Fortran 90 compiler found gnu: no Fortran 90 compiler found customize GnuFCompiler gnu: no Fortran 90 compiler found gnu: no Fortran 90 compiler found customize GnuFCompiler using build_clib building 'superlu_src' library compiling C sources C compiler: gcc -arch ppc -arch i386 -isysroot /Developer/SDKs/ MacOSX10.4u.sdk -fno-strict-aliasing -Wno-long-double -no-cpp-precomp - mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 compile options: '-DUSE_VENDOR_BLAS=1 -c' gcc: scipy/linsolve/SuperLU/SRC/colamd.c gcc: unrecognized option '-no-cpp-precomp' cc1: error: unrecognized command line option "-arch" cc1: error: unrecognized command line option "-arch" cc1: error: unrecognized command line option "-Wno-long-double" gcc: unrecognized option '-no-cpp-precomp' cc1: error: unrecognized command line option "-arch" cc1: error: unrecognized command line option "-arch" cc1: error: unrecognized command line option "-Wno-long-double" error: Command "gcc -arch ppc -arch i386 -isysroot /Developer/SDKs/ MacOSX10.4u.sdk -fno-strict-aliasing -Wno-long-double -no-cpp-precomp - mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 - DUSE_VENDOR_BLAS=1 -c scipy/linsolve/SuperLU/SRC/colamd.c -o build/ temp.macosx-10.3-fat-2.5/scipy/linsolve/SuperLU/SRC/colamd.o" failed with exit status 1 From ellisonbg.net at gmail.com Fri Nov 9 16:47:29 2007 From: ellisonbg.net at gmail.com (Brian Granger) Date: Fri, 9 Nov 2007 14:47:29 -0700 Subject: [SciPy-user] Problem installing scipy 0.6.0 on PPC OS X 10.5 Leopard. 
In-Reply-To: <16DF925E-0844-48C5-806D-E9F00488D862@psych.stanford.edu> References: <16DF925E-0844-48C5-806D-E9F00488D862@psych.stanford.edu> Message-ID: <6ce0ac130711091347q4dbab44akce022d6b9bf2c72c@mail.gmail.com> Are you using the builtin python? If so, you will need to set your PYTHONPATH to point to the directory where numpy 1.0.4 is installed. The reason is that the builtin python comes with python 1.0.1, which won't build scipy with the gfortran you have. There is a somewhat odd issue on Leopard. If you look at sys.path: python -c "import sys; print sys.path" You will see that the directory where the builtin numpy is (I think it is something like */Extras/*) is before the site-packages directory where your numpy 1.0.4 would be installed (unless you used to --home or --prefix options when running python setup.py install). This means that unless you set PYTHONPATH you will always get Leopards older numpy, which will cause scipy to fail on building. Note, this is very different than on Tiger where the builtin python did not include many third party packages. Leopard includes quite a few (for better or worse). Brian On Nov 9, 2007 2:33 PM, Daniel Yarlett wrote: > hi, > > apologies if this is not the appropriate forum for posting this, but i > am having some trouble building and installing scipy 0.6.0 on a PPC > Mac that i have just upgraded to OS X 10.5 (this version of scipy > previously worked fine on the same machine under OS X 10.4). > > numpy 1.0.4 seems to install fine with "python setup.py install". > > however, when i try to install scipy 0.6.0 i get the output (tail > only) included at the end of this email. > > here are my compiler versions: > > gcc -- version > gcc (GCC) 4.3.0 20071026 (experimental) > > gfortran -- version > GNU Fortran (GCC) 4.3.0 20071026 (experimental) > > i remember that on previous versions of OS X you could use gcc_select > to specify v3.4 of gcc to make the build go through. 
unfortunately > gcc_select seems to have been deprecated in Leopard so i haven't been > able to try that. don't know if that's a red herring or not. > > also, fwiw, i just upgraded an intel macbook prop from OS X 10.4 to > 10.5 and had no problems at all with the pre-existing versions of > numpy and scipy. > > anyway, any help would be greatly appreciated! > > best, > > dan. > > -- > > END OF OUTPUT FROM "python setup.py install" for scipy 0.6.0: > > Could not locate executable f95 > customize AbsoftFCompiler > Could not locate executable f90 > Could not locate executable f77 > customize IBMFCompiler > Could not locate executable xlf90 > Could not locate executable xlf > customize IntelFCompiler > Could not locate executable ifort > Could not locate executable ifc > customize GnuFCompiler > Found executable /usr/local/bin/g77 > gnu: no Fortran 90 compiler found > gnu: no Fortran 90 compiler found > customize GnuFCompiler > gnu: no Fortran 90 compiler found > gnu: no Fortran 90 compiler found > customize GnuFCompiler using build_clib > building 'superlu_src' library > compiling C sources > C compiler: gcc -arch ppc -arch i386 -isysroot /Developer/SDKs/ > MacOSX10.4u.sdk -fno-strict-aliasing -Wno-long-double -no-cpp-precomp - > mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 > > compile options: '-DUSE_VENDOR_BLAS=1 -c' > gcc: scipy/linsolve/SuperLU/SRC/colamd.c > gcc: unrecognized option '-no-cpp-precomp' > cc1: error: unrecognized command line option "-arch" > cc1: error: unrecognized command line option "-arch" > cc1: error: unrecognized command line option "-Wno-long-double" > gcc: unrecognized option '-no-cpp-precomp' > cc1: error: unrecognized command line option "-arch" > cc1: error: unrecognized command line option "-arch" > cc1: error: unrecognized command line option "-Wno-long-double" > error: Command "gcc -arch ppc -arch i386 -isysroot /Developer/SDKs/ > MacOSX10.4u.sdk -fno-strict-aliasing -Wno-long-double -no-cpp-precomp - > mno-fused-madd 
-fno-common -dynamic -DNDEBUG -g -O3 - > DUSE_VENDOR_BLAS=1 -c scipy/linsolve/SuperLU/SRC/colamd.c -o build/ > temp.macosx-10.3-fat-2.5/scipy/linsolve/SuperLU/SRC/colamd.o" failed > with exit status 1 > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From yarlett at psych.stanford.edu Fri Nov 9 17:20:02 2007 From: yarlett at psych.stanford.edu (Daniel Yarlett) Date: Fri, 9 Nov 2007 14:20:02 -0800 Subject: [SciPy-user] Problem installing scipy 0.6.0 on PPC OS X 10.5 Leopard. Message-ID: <9EFF3A63-E69C-4C3F-B9AB-9DB36BBD4E21@psych.stanford.edu> here's what i get when i type "python -c "import sys; print sys.path": ['', '/Library/Frameworks/Python.framework/Versions/2.5/lib/ python25.zip', '/Library/Frameworks/Python.framework/Versions/2.5/lib/ python2.5', '/Library/Frameworks/Python.framework/Versions/2.5/lib/ python2.5/plat-darwin', '/Library/Frameworks/Python.framework/Versions/ 2.5/lib/python2.5/plat-mac', '/Library/Frameworks/Python.framework/ Versions/2.5/lib/python2.5/plat-mac/lib-scriptpackages', '/Library/ Frameworks/Python.framework/Versions/2.5/lib/python2.5/lib-tk', '/ Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/lib- dynload', '/Library/Frameworks/Python.framework/Versions/2.5/lib/ python2.5/site-packages'] i don't see anything of the form */Extras/* in there, although site- packages is the last in the path. but when i run my default version of python and import numpy i get version 1.0.4, which is the latest one i installed, and probably isn't the default version that comes with leopard: ramscar-lab5:~ yarlett$ python -c "import numpy; print numpy.__version__" 1.0.4 i can try setting $PYTHONPATH if you still think that might help. if so i'd appreciate it if you could let me know what i should set it to. thanks for the speedy response! dan. > Are you using the builtin python? 
If so, you will need to set your > PYTHONPATH to point to the directory where numpy 1.0.4 is installed. > The reason is that the builtin python comes with python 1.0.1, which > won't build scipy with the gfortran you have. > > There is a somewhat odd issue on Leopard. If you look at sys.path: > > python -c "import sys; print sys.path" > > You will see that the directory where the builtin numpy is (I think it > is something like */Extras/*) is before the site-packages directory > where your numpy 1.0.4 would be installed (unless you used to --home > or --prefix options when running python setup.py install). This means > that unless you set PYTHONPATH you will always get Leopards older > numpy, which will cause scipy to fail on building. > > Note, this is very different than on Tiger where the builtin python > did not include many third party packages. Leopard includes quite a > few (for better or worse). > > Brian > > On Nov 9, 2007 2:33 PM, Daniel Yarlett > psych.stanford.edu> wrote: > > hi, > > > > apologies if this is not the appropriate forum for posting this, > but i > > am having some trouble building and installing scipy 0.6.0 on a PPC > > Mac that i have just upgraded to OS X 10.5 (this version of scipy > > previously worked fine on the same machine under OS X 10.4). > > > > numpy 1.0.4 seems to install fine with "python setup.py install". > > > > however, when i try to install scipy 0.6.0 i get the output (tail > > only) included at the end of this email. > > > > here are my compiler versions: > > > > gcc -- version > > gcc (GCC) 4.3.0 20071026 (experimental) > > > > gfortran -- version > > GNU Fortran (GCC) 4.3.0 20071026 (experimental) > > > > i remember that on previous versions of OS X you could use > gcc_select > > to specify v3.4 of gcc to make the build go through. unfortunately > > gcc_select seems to have been deprecated in Leopard so i haven't > been > > able to try that. don't know if that's a red herring or not. 
> > > > also, fwiw, i just upgraded an intel macbook prop from OS X 10.4 to > > 10.5 and had no problems at all with the pre-existing versions of > > numpy and scipy. > > > > anyway, any help would be greatly appreciated! > > > > best, > > > > dan. > > > > -- > > > > END OF OUTPUT FROM "python setup.py install" for scipy 0.6.0: > > > > Could not locate executable f95 > > customize AbsoftFCompiler > > Could not locate executable f90 > > Could not locate executable f77 > > customize IBMFCompiler > > Could not locate executable xlf90 > > Could not locate executable xlf > > customize IntelFCompiler > > Could not locate executable ifort > > Could not locate executable ifc > > customize GnuFCompiler > > Found executable /usr/local/bin/g77 > > gnu: no Fortran 90 compiler found > > gnu: no Fortran 90 compiler found > > customize GnuFCompiler > > gnu: no Fortran 90 compiler found > > gnu: no Fortran 90 compiler found > > customize GnuFCompiler using build_clib > > building 'superlu_src' library > > compiling C sources > > C compiler: gcc -arch ppc -arch i386 -isysroot /Developer/SDKs/ > > MacOSX10.4u.sdk -fno-strict-aliasing -Wno-long-double -no-cpp- > precomp - > > mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 > > > > compile options: '-DUSE_VENDOR_BLAS=1 -c' > > gcc: scipy/linsolve/SuperLU/SRC/colamd.c > > gcc: unrecognized option '-no-cpp-precomp' > > cc1: error: unrecognized command line option "-arch" > > cc1: error: unrecognized command line option "-arch" > > cc1: error: unrecognized command line option "-Wno-long-double" > > gcc: unrecognized option '-no-cpp-precomp' > > cc1: error: unrecognized command line option "-arch" > > cc1: error: unrecognized command line option "-arch" > > cc1: error: unrecognized command line option "-Wno-long-double" > > error: Command "gcc -arch ppc -arch i386 -isysroot /Developer/SDKs/ > > MacOSX10.4u.sdk -fno-strict-aliasing -Wno-long-double -no-cpp- > precomp - > > mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 - > > 
DUSE_VENDOR_BLAS=1 -c scipy/linsolve/SuperLU/SRC/colamd.c -o build/ > > temp.macosx-10.3-fat-2.5/scipy/linsolve/SuperLU/SRC/colamd.o" failed > > with exit status 1 > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From cckidder at gmail.com Fri Nov 9 18:49:37 2007 From: cckidder at gmail.com (Chad Kidder) Date: Fri, 9 Nov 2007 17:49:37 -0600 Subject: [SciPy-user] Submitting new code Message-ID: I've got some file import classes that I am working on and would eventually like to contribute back to scipy, probably the io package. I've done a little bit of searching and didn't find anything really descriptive. Would anyone like to give me / point to a bit of information on how to go about this? --Chad Kidder -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant at enthought.com Fri Nov 9 19:01:13 2007 From: oliphant at enthought.com (Travis E. Oliphant) Date: Fri, 09 Nov 2007 18:01:13 -0600 Subject: [SciPy-user] Submitting new code In-Reply-To: References: Message-ID: <4734F4C9.10507@enthought.com> Chad Kidder wrote: > I've got some file import classes that I am working on and would > eventually like to contribute back to scipy, probably the io package. > I've done a little bit of searching and didn't find anything really > descriptive. Would anyone like to give me / point to a bit of > information on how to go about this? > > Create a ticket at the Trac page for SciPy and upload a patch with your new code in it. This is the easiest way. Ask if you need more help. Good luck, -Travis O. 
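The patch Travis mentions is just a unified diff against the SciPy source tree, typically made with `svn diff` from a checkout and attached to the Trac ticket. A toy sketch of the workflow and the diff format, with made-up file names (plain `diff -u` produces the same format as `svn diff`):

```shell
# Against a real checkout the commands would be roughly (paths hypothetical):
#   svn co http://svn.scipy.org/svn/scipy/trunk scipy
#   ...edit or add files under scipy/io/...
#   svn diff > file-importers.diff
# then attach file-importers.diff to a new Trac ticket.

# A self-contained illustration of the unified-diff format itself:
mkdir -p /tmp/patchdemo && cd /tmp/patchdemo
printf 'line one\nline two\n' > orig.py
printf 'line one\nline two changed\n' > new.py
diff -u orig.py new.py > my-change.diff || true  # diff exits 1 when files differ
cat my-change.diff
```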
> > --Chad Kidder > > > ------------------------------------------------------------------------ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From roger.herikstad at gmail.com Sat Nov 10 01:16:06 2007 From: roger.herikstad at gmail.com (Roger Herikstad) Date: Sat, 10 Nov 2007 14:16:06 +0800 Subject: [SciPy-user] Inverse of complex matrix Message-ID: Dear scipy users, I am trying to invert a 12 by 12 complex matrix R using scipy.linalg.inv(), but the round-off errors seem a bit too big. If I do scipy.dot(Ri,R), the first row of the resulting matrix looks like this: array([ 0.953125 +0.j , 0.06542969+0.01171875j, 0.08154297+0.15234375j, -0.078125 -0.09375j , 0.19140625-0.0859375j , 0.08203125+0.140625j , -0.08398438+0.0078125j , -0.02539062-0.03125j , -0.06738281+0.015625j , -0.04736328+0.15820312j, -0.09057617+0.02832031j, 0.109375 -0.0625j ]) Is this normal? Inverting a 2x2 matrix produces the expected result. I've attached a pickle of my matrix. Thanks! ~ Roger -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: matrix.pck Type: application/octet-stream Size: 7072 bytes Desc: not available URL: From nwagner at iam.uni-stuttgart.de Sat Nov 10 02:54:17 2007 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Sat, 10 Nov 2007 08:54:17 +0100 Subject: [SciPy-user] Inverse of complex matrix In-Reply-To: References: Message-ID: On Sat, 10 Nov 2007 14:16:06 +0800 "Roger Herikstad" wrote: > Dear scipy users, I am trying to invert a 12 by 12 >complex matrix R using > scipy.linalg.inv(), but the round-off errors seem a bit >too big. 
If I do > scipy.dot(Ri,R), the first row of the resulting matrix >looks like this: > > array([ 0.953125 +0.j , 0.06542969+0.01171875j, > 0.08154297+0.15234375j, -0.078125 -0.09375j , > 0.19140625-0.0859375j , 0.08203125+0.140625j , > -0.08398438+0.0078125j , -0.02539062-0.03125j , > -0.06738281+0.015625j , -0.04736328+0.15820312j, > -0.09057617+0.02832031j, 0.109375 -0.0625j ]) > > > Is this normal? Inverting a 2x2 matrix produces the >expected result. I've > attached a pickle of my matrix. Thanks! > > ~ Roger It depends on the condition number of your matrix. Did you compute the singular values of the matrix? Please can you send me your test matrix in Matrix Market format off-list. io.mmwrite Nils From mforbes at physics.ubc.ca Sat Nov 10 02:59:43 2007 From: mforbes at physics.ubc.ca (Michael McNeil Forbes) Date: Fri, 9 Nov 2007 23:59:43 -0800 Subject: [SciPy-user] Inverse of complex matrix In-Reply-To: References: Message-ID: <7141ED62-C305-41CB-AB7B-703C01C74F72@physics.ubc.ca> Hi Roger, Your matrix is extremely singular. The singular values are: >>> U,d,V = linalg.svd(R) >>> d array([1.41944451e+01, 1.23913727e+01, 9.46227279e+00, 2.81247753e-04, 1.11035378e-14, 1.02548448e-14, 8.22627497e-15, 6.74668324e-15, 5.51882714e-15, 4.78886237e-15, 2.82020737e-15, 7.14721524e-16]) Thus, the matrix is really ill-conditioned (the ratio of the smallest to largest singular value is of the order of the machine precision. It is essentially singular, and only has 4 linearly independent columns). Thus, it is not surprising that the round-off errors are significant. You can't reasonably expect to invert a matrix when the ratio of its singular values is of order the machine precision. Michael. On 9 Nov 2007, at 10:16 PM, Roger Herikstad wrote: > Dear scipy users, > I am trying to invert a 12 by 12 complex matrix R using > scipy.linalg.inv(), but the round-off errors seem a bit too big.
If > I do scipy.dot(Ri,R), the first row of the resulting matrix looks > like this: > > array([ 0.953125 +0.j , 0.06542969+0.01171875j, > 0.08154297+0.15234375j, -0.078125 -0.09375j , > 0.19140625-0.0859375j , 0.08203125+0.140625j , > -0.08398438+0.0078125j , -0.02539062-0.03125j , > -0.06738281+0.015625j , -0.04736328+0.15820312j, > -0.09057617+0.02832031j, 0.109375 -0.0625j ]) > > > Is this normal? Inverting a 2x2 matrix produces the expected > result. I've attached a pickle of my matrix. Thanks! > > ~ Roger > > From david at ar.media.kyoto-u.ac.jp Sat Nov 10 02:53:06 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Sat, 10 Nov 2007 16:53:06 +0900 Subject: [SciPy-user] KDE question In-Reply-To: <4734BF4E.1090102@gmail.com> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> Message-ID: <47356362.7010508@ar.media.kyoto-u.ac.jp> Robert Kern wrote: > Zachary Pincus wrote: >> Hello all, >> >> I'm using scipy.stats.gaussian_kde to estimate the a probability >> density based on some scattered 2D samples. What I would like to do >> eventually is plot a line that surrounds (say) 90% of the probability >> density. >> >> Clearly this line corresponds to some contour of the density function >> (which I can extract out easily enough from KDE density estimates on >> a grid), but I'm not sure how to figure out what exact contour value >> it is that I want. >> >> Given the KDE density estimates on a grid, I could empirically figure >> out which value is closest, but I was wondering if there's a simpler >> closed-form solution... (And apologies if it's somehow obvious; I'm >> feeling a bit foggy-headed about this.) > > I don't think there is a closed-form solution for n>1-D. It shouldn't be too > hard to evaluate the KDE on a grid, then search for the value between 0 and the > peak where the integral is around 90%. 
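[Editor's note] Robert's grid-search suggestion can be sketched in a few lines. This is an illustrative example, not code from the thread: the 2-D samples, grid bounds, and the 90% target are all made up, and `brentq` is used to locate the density level whose super-level set holds the requested mass.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.optimize import brentq

# Made-up 2-D scattered samples standing in for the poster's data.
rng = np.random.RandomState(0)
samples = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], 500).T

kde = gaussian_kde(samples)  # gaussian_kde expects shape (ndim, nsamples)

# Evaluate the density on a regular grid covering the data.
xs = np.linspace(-4.0, 4.0, 200)
ys = np.linspace(-4.0, 4.0, 200)
X, Y = np.meshgrid(xs, ys)
density = kde(np.vstack([X.ravel(), Y.ravel()])).reshape(X.shape)
cell = (xs[1] - xs[0]) * (ys[1] - ys[0])  # area of one grid cell

def mass_above(level):
    # Approximate probability mass in the region where density > level.
    return density[density > level].sum() * cell

# Search between 0 and the peak for the level enclosing ~90% of the mass.
level = brentq(lambda c: mass_above(c) - 0.90, 0.0, density.max())
```

The resulting `level` is the value to hand to a contouring routine.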
I am not sure I understand exactly the problem, but if the problem is to find a contour level of a Gaussian density, it has a closed form for any dimension. For the actual code, you can look at scikits.learn.machine.em, function gauss_ell, which is used by my Em toolbox for gaussian mixtures http://projects.scipy.org/scipy/scikits/browser/trunk/learn/scikits/learn/machine/em/densities.py The function can be used without any other code, I think (at least, I remember I wanted it that way:) ). This is based on the fact that if X ~ N(mu, sigma), then (X-mu)' * sigma^-1 * (X-mu) is a one dimensional random variable following a chi square. cheers, David From robert.kern at gmail.com Sat Nov 10 03:08:04 2007 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 10 Nov 2007 02:08:04 -0600 Subject: [SciPy-user] KDE question In-Reply-To: <47356362.7010508@ar.media.kyoto-u.ac.jp> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> <47356362.7010508@ar.media.kyoto-u.ac.jp> Message-ID: <473566E4.2060101@gmail.com> David Cournapeau wrote: > I am not sure I understand exactly the problem, but if the problem is to > find a contour level of a Gaussian density, it has a closed form for any > dimension. No, the problem is to find the appropriate contour level of a kernel density estimate (with Gaussian kernels in this case). Essentially, a mixture of many Gaussians, not a single one. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From david at ar.media.kyoto-u.ac.jp Sat Nov 10 03:11:05 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Sat, 10 Nov 2007 17:11:05 +0900 Subject: [SciPy-user] Inverse of complex matrix In-Reply-To: References: Message-ID: <47356799.8000109@ar.media.kyoto-u.ac.jp> Roger Herikstad wrote: > Dear scipy users, > I am trying to invert a 12 by 12 complex matrix R using > scipy.linalg.inv(), but the round-off errors seem a bit too big. If I > do scipy.dot(Ri,R), the first row of the resulting matrix looks like > this: > > array([ 0.953125 +0.j , 0.06542969+0.01171875j, > 0.08154297+0.15234375j, -0.078125 -0.09375j , > 0.19140625-0.0859375j , 0.08203125+0.140625j , > -0.08398438+0.0078125j , -0.02539062-0.03125j , > -0.06738281+0.015625j , -0.04736328+0.15820312j, > -0.09057617+0.02832031j, 0.109375 -0.0625j ]) > > > Is this normal? Inverting a 2x2 matrix produces the expected result. > I've attached a pickle of my matrix. Thanks! As a rule, you should not invert a matrix unless you *really* need it. For example, if you want X in A X = B, with A a matrix, B and X vectors, inverting A should be avoided. Some other methods are better suited (numerically speaking), depending on A and what you are trying to do. 
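[Editor's note] David's advice can be illustrated with a small sketch. `A` and `b` here are arbitrary made-up data: rather than forming `inv(A)` and multiplying, let a solver factorize `A` and solve the system directly.

```python
import numpy as np
from scipy import linalg

rng = np.random.RandomState(0)
A = rng.rand(5, 5) + 5.0 * np.eye(5)  # an arbitrary, reasonably well-conditioned matrix
b = rng.rand(5)

x_via_inv = np.dot(linalg.inv(A), b)  # forms the explicit inverse -- avoid when possible
x_solved = linalg.solve(A, b)         # factorizes A and solves A x = b directly

# On a well-conditioned A the two agree; on near-singular matrices like
# Roger's, solve() (or lstsq/pinv) degrades more gracefully numerically.
residual = np.abs(np.dot(A, x_solved) - b).max()
```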
The mathworks have good introduction on this topic, if you look for online help on inv, pinv and so on: http://www.mathworks.com/access/helpdesk/help/techdoc/ref/inv.html cheers, David From david at ar.media.kyoto-u.ac.jp Sat Nov 10 03:24:18 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Sat, 10 Nov 2007 17:24:18 +0900 Subject: [SciPy-user] KDE question In-Reply-To: <473566E4.2060101@gmail.com> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> <47356362.7010508@ar.media.kyoto-u.ac.jp> <473566E4.2060101@gmail.com> Message-ID: <47356AB2.2030009@ar.media.kyoto-u.ac.jp> Robert Kern wrote: > David Cournapeau wrote: > >> I am not sure I understand exactly the problem, but if the problem is to >> find a contour level of a Gaussian density, it has a closed form for any >> dimension. > > No, the problem is to find the appropriate contour level of a kernel density > estimate (with Gaussian kernels in this case). Essentially, a mixture of many > Gaussians, not a single one. > Ah ok, sorry. David From roger.herikstad at gmail.com Sat Nov 10 05:09:20 2007 From: roger.herikstad at gmail.com (Roger Herikstad) Date: Sat, 10 Nov 2007 18:09:20 +0800 Subject: [SciPy-user] Inverse of complex matrix In-Reply-To: <47356799.8000109@ar.media.kyoto-u.ac.jp> References: <47356799.8000109@ar.media.kyoto-u.ac.jp> Message-ID: Dear all, Thanks. I can get away with not inverting the matrix, actually, I just wanted to confirm a calculation via a different route, and I was momentarily surprised by my result. I realise my mistake though. Thanks for pointing it out for me! ~ Roger On Nov 10, 2007 4:11 PM, David Cournapeau wrote: > > Roger Herikstad wrote: > > Dear scipy users, > > I am trying to invert a 12 by 12 complex matrix R using > > scipy.linalg.inv(), but the round-off errors seem a bit too big. 
If I > > do scipy.dot(Ri,R), the first row of the resulting matrix looks like > > this: > > > > array([ 0.953125 +0.j , 0.06542969+0.01171875j, > > 0.08154297+0.15234375j, -0.078125 -0.09375j , > > 0.19140625-0.0859375j , 0.08203125+0.140625j , > > -0.08398438+0.0078125j , -0.02539062-0.03125j , > > -0.06738281+0.015625j , -0.04736328+0.15820312j, > > -0.09057617+0.02832031j, 0.109375 -0.0625j ]) > > > > > > Is this normal? Inverting a 2x2 matrix produces the expected result. > > I've attached a pickle of my matrix. Thanks! > As a rule, you should not invert a matrix unless you *really* need it. > For example, if you want X in > A X = B, with A a matrix, B and X vectors, inverting A should be > avoided. Some other methods are better suited (numerically speaking), > depending on A and what you are trying to do. > > The mathworks have good introduction on this topic, if you look for > online help on inv, pinv and so on: > > http://www.mathworks.com/access/helpdesk/help/techdoc/ref/inv.html > > cheers, > > David > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From gael.varoquaux at normalesup.org Sat Nov 10 05:24:45 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sat, 10 Nov 2007 11:24:45 +0100 Subject: [SciPy-user] KDE question In-Reply-To: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> Message-ID: <20071110102445.GB22735@clipper.ens.fr> On Fri, Nov 09, 2007 at 02:34:11PM -0500, Zachary Pincus wrote: > I'm using scipy.stats.gaussian_kde to estimate the a probability > density based on some scattered 2D samples. What I would like to do > eventually is plot a line that surrounds (say) 90% of the probability > density. 
If it's only plotting you are interested in, matplotlib can do all the dirty work for you, using the "contour" command: http://matplotlib.sourceforge.net/screenshots/pcolor_demo.py which yields: http://matplotlib.sourceforge.net/screenshots/pcolor_demo_large.png HTH, Gaël From david at ar.media.kyoto-u.ac.jp Sat Nov 10 06:13:31 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Sat, 10 Nov 2007 20:13:31 +0900 Subject: [SciPy-user] KDE question In-Reply-To: <20071110102445.GB22735@clipper.ens.fr> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <20071110102445.GB22735@clipper.ens.fr> Message-ID: <4735925B.6080409@ar.media.kyoto-u.ac.jp> Gael Varoquaux wrote: > On Fri, Nov 09, 2007 at 02:34:11PM -0500, Zachary Pincus wrote: >> I'm using scipy.stats.gaussian_kde to estimate a probability >> density based on some scattered 2D samples. What I would like to do >> eventually is plot a line that surrounds (say) 90% of the probability >> density. > > If it's only plotting you are interested in, matplotlib can do all the > dirty work for you, using the "contour" command: > > http://matplotlib.sourceforge.net/screenshots/pcolor_demo.py > which yields: > http://matplotlib.sourceforge.net/screenshots/pcolor_demo_large.png I missed this was for plotting. Then the code from the EM toolbox may actually be useful, since I used the function mentioned before to plot density contours for mixtures of Gaussians: http://www.ar.media.kyoto-u.ac.jp/members/david/softwares/em/pdfestimation.png Code there: http://projects.scipy.org/scipy/scikits/browser/trunk/learn/scikits/learn/machine/em/gauss_mix.py (function density_on_grid: this can be used from a Gaussian mixture class, that you may be able to use easily from gauss_kde). This may not be adapted for your case, since I only coded it with a few Gaussians in mind (e.g. I have never tried to do something smart, since I don't need mixtures of, say, 500 Gaussians).
cheers, David From zpincus at stanford.edu Sat Nov 10 07:14:39 2007 From: zpincus at stanford.edu (Zachary Pincus) Date: Sat, 10 Nov 2007 07:14:39 -0500 Subject: [SciPy-user] KDE question In-Reply-To: <20071110102445.GB22735@clipper.ens.fr> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <20071110102445.GB22735@clipper.ens.fr> Message-ID: <9F4EB7F4-D2E5-4363-B93F-105A76501D4F@stanford.edu> >> I'm using scipy.stats.gaussian_kde to estimate the a probability >> density based on some scattered 2D samples. What I would like to do >> eventually is plot a line that surrounds (say) 90% of the probability >> density. > > If its only plotting you are interested in, matplotlib can do all the > dirty work for you, using the "contour" command: > > http://matplotlib.sourceforge.net/screenshots/pcolor_demo.py > which yields: > http://matplotlib.sourceforge.net/screenshots/pcolor_demo_large.png In this case, the trick is to figure out *which* contour level to plot -- that is, at what density level and above is 90% (say) of the total density contained. Robert's example shows how to figure that out, and pretty cleverly. I wouldn't have thought of wrapping it all up with brentq. But gosh, do those matplotlib contours look nice. I've been using Gnuplot/Gnuplot.py for a long time, but it might be time to make a switch. Thanks, Zach From zpincus at stanford.edu Sat Nov 10 07:24:31 2007 From: zpincus at stanford.edu (Zachary Pincus) Date: Sat, 10 Nov 2007 07:24:31 -0500 Subject: [SciPy-user] Problem installing scipy 0.6.0 on PPC OS X 10.5 Leopard. In-Reply-To: <6ce0ac130711091347q4dbab44akce022d6b9bf2c72c@mail.gmail.com> References: <16DF925E-0844-48C5-806D-E9F00488D862@psych.stanford.edu> <6ce0ac130711091347q4dbab44akce022d6b9bf2c72c@mail.gmail.com> Message-ID: <87CE1249-D53F-4B49-8D8A-B6CA182B2D83@stanford.edu> Hello all, > Are you using the builtin python? If so, you will need to set your > PYTHONPATH to point to the directory where numpy 1.0.4 is installed. 
> The reason is that the builtin python comes with numpy 1.0.1, which > won't build scipy with the gfortran you have. I'm not using Leopard, but if I recall from the discussion on python-mac, there's the added difficulty that the environment isn't preserved through 'sudo' -- so just setting the PYTHONPATH environment variable won't work if you need to 'sudo python setup.py install' to install a given library to a system directory. I don't recall what the recommended solution, if any, was. I think it involved some .pth file trickery that looked relatively, um, ugly. Alternately, one could select a non-system directory as your local site-packages, via a .pth or PYTHONPATH, so sudo is not needed to install. This doesn't work for system admins supporting multiple users though. Finally, on the horizon were some options for setting up "virtual" python installs that would act like separate installs from the system one for the purpose of installing packages, but would use the system python and its libs under the covers. Is there yet any set of simple, newbie-friendly instructions that cover how best to install things like numpy/scipy on leopard? Zach On Nov 9, 2007, at 4:47 PM, Brian Granger wrote: > Are you using the builtin python? If so, you will need to set your > PYTHONPATH to point to the directory where numpy 1.0.4 is installed. > The reason is that the builtin python comes with numpy 1.0.1, which > won't build scipy with the gfortran you have. > > There is a somewhat odd issue on Leopard. If you look at sys.path: > > python -c "import sys; print sys.path" > > You will see that the directory where the builtin numpy is (I think it > is something like */Extras/*) is before the site-packages directory > where your numpy 1.0.4 would be installed (unless you used the --home > or --prefix options when running python setup.py install). This means > that unless you set PYTHONPATH you will always get Leopard's older > numpy, which will cause scipy to fail on building.
> > Note, this is very different than on Tiger where the builtin python > did not include many third party packages. Leopard includes quite a > few (for better or worse). > > Brian > > > > On Nov 9, 2007 2:33 PM, Daniel Yarlett > wrote: >> hi, >> >> apologies if this is not the appropriate forum for posting this, >> but i >> am having some trouble building and installing scipy 0.6.0 on a PPC >> Mac that i have just upgraded to OS X 10.5 (this version of scipy >> previously worked fine on the same machine under OS X 10.4). >> >> numpy 1.0.4 seems to install fine with "python setup.py install". >> >> however, when i try to install scipy 0.6.0 i get the output (tail >> only) included at the end of this email. >> >> here are my compiler versions: >> >> gcc -- version >> gcc (GCC) 4.3.0 20071026 (experimental) >> >> gfortran -- version >> GNU Fortran (GCC) 4.3.0 20071026 (experimental) >> >> i remember that on previous versions of OS X you could use gcc_select >> to specify v3.4 of gcc to make the build go through. unfortunately >> gcc_select seems to have been deprecated in Leopard so i haven't been >> able to try that. don't know if that's a red herring or not. >> >> also, fwiw, i just upgraded an intel macbook prop from OS X 10.4 to >> 10.5 and had no problems at all with the pre-existing versions of >> numpy and scipy. >> >> anyway, any help would be greatly appreciated! >> >> best, >> >> dan. 
>> >> -- >> >> END OF OUTPUT FROM "python setup.py install" for scipy 0.6.0: >> >> Could not locate executable f95 >> customize AbsoftFCompiler >> Could not locate executable f90 >> Could not locate executable f77 >> customize IBMFCompiler >> Could not locate executable xlf90 >> Could not locate executable xlf >> customize IntelFCompiler >> Could not locate executable ifort >> Could not locate executable ifc >> customize GnuFCompiler >> Found executable /usr/local/bin/g77 >> gnu: no Fortran 90 compiler found >> gnu: no Fortran 90 compiler found >> customize GnuFCompiler >> gnu: no Fortran 90 compiler found >> gnu: no Fortran 90 compiler found >> customize GnuFCompiler using build_clib >> building 'superlu_src' library >> compiling C sources >> C compiler: gcc -arch ppc -arch i386 -isysroot /Developer/SDKs/ >> MacOSX10.4u.sdk -fno-strict-aliasing -Wno-long-double -no-cpp- >> precomp - >> mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 >> >> compile options: '-DUSE_VENDOR_BLAS=1 -c' >> gcc: scipy/linsolve/SuperLU/SRC/colamd.c >> gcc: unrecognized option '-no-cpp-precomp' >> cc1: error: unrecognized command line option "-arch" >> cc1: error: unrecognized command line option "-arch" >> cc1: error: unrecognized command line option "-Wno-long-double" >> gcc: unrecognized option '-no-cpp-precomp' >> cc1: error: unrecognized command line option "-arch" >> cc1: error: unrecognized command line option "-arch" >> cc1: error: unrecognized command line option "-Wno-long-double" >> error: Command "gcc -arch ppc -arch i386 -isysroot /Developer/SDKs/ >> MacOSX10.4u.sdk -fno-strict-aliasing -Wno-long-double -no-cpp- >> precomp - >> mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 - >> DUSE_VENDOR_BLAS=1 -c scipy/linsolve/SuperLU/SRC/colamd.c -o build/ >> temp.macosx-10.3-fat-2.5/scipy/linsolve/SuperLU/SRC/colamd.o" failed >> with exit status 1 >> _______________________________________________ >> SciPy-user mailing list >> SciPy-user at scipy.org >> 
http://projects.scipy.org/mailman/listinfo/scipy-user >> > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From j.reid at mail.cryst.bbk.ac.uk Sat Nov 10 12:26:39 2007 From: j.reid at mail.cryst.bbk.ac.uk (John Reid) Date: Sat, 10 Nov 2007 17:26:39 +0000 Subject: [SciPy-user] pdf of multivariate normal Message-ID: Am I right that this doesn't exist in scipy/numpy? Obviously it isn't too hard to code myself but I was surprised it wasn't there. If it is there, how could I have found it easily? John. From berthe.loic at gmail.com Sat Nov 10 12:52:36 2007 From: berthe.loic at gmail.com (LB) Date: Sat, 10 Nov 2007 09:52:36 -0800 Subject: [SciPy-user] pdf of multivariate normal In-Reply-To: References: Message-ID: <1194717156.583578.152820@o3g2000hsb.googlegroups.com> Have a look at the numpy.random module : >>> info(numpy.random.multivariate_normal) Return an array containing multivariate normally distributed random numbers with specified mean and covariance. multivariate_normal(mean, cov) -> random values multivariate_normal(mean, cov, [m, n, ...]) -> random values mean must be a 1 dimensional array. cov must be a square two dimensional array with the same number of rows and columns as mean has elements. The first form returns a single 1-D array containing a multivariate normal. The second form returns an array of shape (m, n, ..., cov.shape[0]). In this case, output[i,j,...,:] is a 1-D array containing a multivariate normal. 
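[Editor's note] Since the question asks for the density itself rather than random variates, here is a minimal closed-form sketch. The function name `mvn_pdf` is made up (no such routine existed in scipy.stats at the time); it solves against the covariance instead of explicitly inverting it.

```python
import numpy as np

def mvn_pdf(x, mean, cov):
    # Multivariate normal density, evaluated at one or more points x.
    x = np.atleast_2d(np.asarray(x, dtype=float))   # shape (npoints, ndim)
    mean = np.asarray(mean, dtype=float)
    cov = np.asarray(cov, dtype=float)
    diff = x - mean
    # Mahalanobis term via solve() rather than an explicit inverse.
    maha = np.sum(diff * np.linalg.solve(cov, diff.T).T, axis=1)
    norm = np.sqrt((2.0 * np.pi) ** mean.size * np.linalg.det(cov))
    return np.exp(-0.5 * maha) / norm
```

For example, `mvn_pdf([0.0, 0.0], [0.0, 0.0], np.eye(2))` gives 1/(2*pi).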
-- LB From j.reid at mail.cryst.bbk.ac.uk Sat Nov 10 13:44:50 2007 From: j.reid at mail.cryst.bbk.ac.uk (John Reid) Date: Sat, 10 Nov 2007 18:44:50 +0000 Subject: [SciPy-user] pdf of multivariate normal In-Reply-To: <1194717156.583578.152820@o3g2000hsb.googlegroups.com> References: <1194717156.583578.152820@o3g2000hsb.googlegroups.com> Message-ID: LB wrote: >>>> info(numpy.random.multivariate_normal) > Return an array containing multivariate normally distributed random > numbers Thanks but I don't want random variates, I would like to calculate the probability density function. From j.reid at mail.cryst.bbk.ac.uk Sat Nov 10 13:55:47 2007 From: j.reid at mail.cryst.bbk.ac.uk (John Reid) Date: Sat, 10 Nov 2007 18:55:47 +0000 Subject: [SciPy-user] Bias in numpy.random.multivariate_normal? Message-ID: Could someone confirm I've got this right? In a multivariate normal (MVN) the covariance is E(X.X') - E(X).E(X'). I sample many times from a MVN with given mean and covariance. I then calculate the covariance of the sample. This covariance matrix should show no bias to be larger or smaller than the distribution's covariance but I see that some entries are always larger and others are smaller. See this code: from numpy.random import multivariate_normal from numpy.linalg import inv from numpy import outer mu = [10.,10.] precision = [[.6, .3], [.4, .3]] sigma = inv(precision) sample_size = 10000 num_tests = 100 for test in xrange(num_tests): samples = multivariate_normal(mu, sigma, [sample_size]) sample_mean = samples.sum(axis=0)/sample_size #print (sample_mean - mu) > 0. sample_covariance = sum(outer(sample-sample_mean, sample-sample_mean) for sample in samples) / sample_size print (sample_covariance - sigma) > 0 'True' is printed when the entry is larger and False otherwise Did I get this wrong or is this acceptable behaviour for a random variate generator or is it a bug? Thanks, John. 
From matthieu.brucher at gmail.com Sat Nov 10 14:00:54 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Sat, 10 Nov 2007 20:00:54 +0100 Subject: [SciPy-user] Bias in numpy.random.multivariate_normal? In-Reply-To: References: Message-ID: IIRC, you computed the biased variance, so it is normal to see bias. Try dividing by (sample_size-1), it should be the unbiased estimator. Matthieu 2007/11/10, John Reid : > > Could someone confirm I've got this right? > > In a multivariate normal (MVN) the covariance is E(X.X') - E(X).E(X'). > I sample many times from a MVN with given mean and covariance. I then > calculate the covariance of the sample. This covariance matrix should > show no bias to be larger or smaller than the distribution's covariance > but I see that some entries are always larger and others are smaller. > > See this code: > > from numpy.random import multivariate_normal > from numpy.linalg import inv > from numpy import outer > > mu = [10.,10.] > precision = [[.6, .3], [.4, .3]] > sigma = inv(precision) > sample_size = 10000 > num_tests = 100 > > for test in xrange(num_tests): > samples = multivariate_normal(mu, sigma, [sample_size]) > sample_mean = samples.sum(axis=0)/sample_size > #print (sample_mean - mu) > 0. > sample_covariance = sum(outer(sample-sample_mean, sample-sample_mean) > for sample in samples) / sample_size > print (sample_covariance - sigma) > 0 > > > 'True' is printed when the entry is larger and False otherwise > > Did I get this wrong or is this acceptable behaviour for a random > variate generator or is it a bug? > > Thanks, > John.
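[Editor's note] On the divide-by-(n-1) point: `numpy.cov` already implements the unbiased estimator. A sketch, using a made-up *symmetric* covariance rather than the thread's `inv(precision)`:

```python
import numpy as np

rng = np.random.RandomState(0)
sigma = np.array([[2.0, 0.6], [0.6, 1.0]])  # a valid (symmetric) covariance
samples = rng.multivariate_normal([10.0, 10.0], sigma, 10000)

# np.cov expects variables in rows and divides by (n - 1) by default,
# i.e. the unbiased estimator suggested above.
sample_cov = np.cov(samples.T)
max_err = np.abs(sample_cov - sigma).max()  # small sampling error, not a systematic bias
```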
> > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From j.reid at mail.cryst.bbk.ac.uk Sat Nov 10 19:18:45 2007 From: j.reid at mail.cryst.bbk.ac.uk (John Reid) Date: Sun, 11 Nov 2007 00:18:45 +0000 Subject: [SciPy-user] Bias in numpy.random.multivariate_normal? In-Reply-To: References: Message-ID: Matthieu Brucher wrote: > IIRC, you computed the biased variance, so it is normal to see bias. > Try dividing by (sample_size-1), it should be the unbiased estimator. > That wouldn't change the sign of the answer. In any case I get the same results when dividing by sample_size-1. From bryan.nollett at gmail.com Sat Nov 10 21:26:48 2007 From: bryan.nollett at gmail.com (Bryan Nollett) Date: Sat, 10 Nov 2007 20:26:48 -0600 Subject: [SciPy-user] Bias in numpy.random.multivariate_normal? In-Reply-To: References: Message-ID: <28fb541d0711101826y6b443a13o5c122d5263e9b643@mail.gmail.com> The covariance matrix provided in sigma is not symmetric. Forcing symmetry by changing the definition of sigma to sigma = inv(precision) sigma[1,0] = sigma[0,1] yields behavior that I suspect is in line with your expectations. Perhaps a check for symmetry in the cov argument would be a desirable addition to multivariate_normal? Bryan On Nov 10, 2007 12:55 PM, John Reid wrote: > In a multivariate normal (MVN) the covariance is E(X.X') - E(X).E(X'). > I sample many times from a MVN with given mean and covariance. I then > calculate the covariance of the sample.
This covariance matrix should > show no bias to be larger or smaller than the distribution's covariance > but I see that some entries are always larger and others are smaller. From j.reid at mail.cryst.bbk.ac.uk Sun Nov 11 05:32:44 2007 From: j.reid at mail.cryst.bbk.ac.uk (John Reid) Date: Sun, 11 Nov 2007 10:32:44 +0000 Subject: [SciPy-user] Bias in numpy.random.multivariate_normal? In-Reply-To: <28fb541d0711101826y6b443a13o5c122d5263e9b643@mail.gmail.com> References: <28fb541d0711101826y6b443a13o5c122d5263e9b643@mail.gmail.com> Message-ID: Bryan Nollett wrote: > The covariance matrix provided in sigma is not symmetric. Good point. I needed some test covariance matrices and I completely forgot the symmetric requirement. There is no way to draw from the Wishart or Inv-Wishart distributions is there? Their pdfs would be handy too. Thanks, John. From matthieu.brucher at gmail.com Sun Nov 11 06:21:11 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Sun, 11 Nov 2007 12:21:11 +0100 Subject: [SciPy-user] Bias in numpy.random.multivariate_normal? In-Reply-To: References: <28fb541d0711101826y6b443a13o5c122d5263e9b643@mail.gmail.com> Message-ID: Did you try the rvs method of those classes ? Matthieu 2007/11/11, John Reid : > > Bryan Nollett wrote: > > The covariance matrix provided in sigma is not symmetric. > > Good point. I needed some test covariance matrices and I completely > forgot the symmetric requirement. There is no way to draw from the > Wishart or Inv-Wishart distributions is there? Their pdfs would be handy > too. > > Thanks, > John. 
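[Editor's note] Bryan's diagnosis is easy to check directly. This sketch reproduces the `precision` matrix from the original post; the symmetry fix is the one Bryan describes:

```python
import numpy as np

precision = np.array([[0.6, 0.3], [0.4, 0.3]])
sigma = np.linalg.inv(precision)

# The inverse of a non-symmetric matrix is itself non-symmetric, so this
# "sigma" is not a valid covariance -- the source of the apparent bias.
is_symmetric = bool(np.allclose(sigma, sigma.T))

# Bryan's fix: copy one off-diagonal entry over the other before sampling.
sigma_fixed = sigma.copy()
sigma_fixed[1, 0] = sigma_fixed[0, 1]
```

A symmetry (and positive-definiteness) check inside `multivariate_normal` itself, as Bryan suggests, would catch this earlier.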
> > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... URL: From j.reid at mail.cryst.bbk.ac.uk Sun Nov 11 10:19:40 2007 From: j.reid at mail.cryst.bbk.ac.uk (John Reid) Date: Sun, 11 Nov 2007 15:19:40 +0000 Subject: [SciPy-user] Bias in numpy.random.multivariate_normal? In-Reply-To: References: <28fb541d0711101826y6b443a13o5c122d5263e9b643@mail.gmail.com> Message-ID: Matthieu Brucher wrote: > Did you try the rvs method of those classes ? Of which classes? I couldn't see anything in scipy.stats or scipy.stats.distributions to do with the wishart distribution. From vel.accel at gmail.com Sun Nov 11 17:10:48 2007 From: vel.accel at gmail.com (dieter h.) Date: Sun, 11 Nov 2007 17:10:48 -0500 Subject: [SciPy-user] Sandbox TimeSeries: request Message-ID: <1e52e0880711111410g150abf7dhe30b5cfc3adb0fbc@mail.gmail.com> Hi Pierre and Matt If I personally had the time to dig in to the module (and extension writing itself), I'd be incorporating this request myself. Functionality that would convert an array of epochs to datetime.datetime|mx.datetime|matplotlib.Dates objects would be a big plus. I suggest it in case you might also find it worthy. In the mean time I'll get by. 
:-) Thank you -Dieter From pgmdevlist at gmail.com Sun Nov 11 17:40:08 2007 From: pgmdevlist at gmail.com (Pierre GM) Date: Sun, 11 Nov 2007 17:40:08 -0500 Subject: [SciPy-user] Sandbox TimeSeries: request In-Reply-To: <1e52e0880711111410g150abf7dhe30b5cfc3adb0fbc@mail.gmail.com> References: <1e52e0880711111410g150abf7dhe30b5cfc3adb0fbc@mail.gmail.com> Message-ID: <200711111740.08260.pgmdevlist@gmail.com> > Functionality that would convert an array of epochs to > datetime.datetime|mx.datetime|matplotlib.Dates objects would be a big > plus. You can already transform a single Date object to datetime with datetime. It would be pretty straightforward to add a todatetime() method to the DateArray class, which would solve the first request. For matplotlib, I'd suggest you give the plotlib module a try: we have implemented a dynamic location/formatting of ticks that can be pretty useful... From rshepard at appl-ecosys.com Sun Nov 11 18:52:14 2007 From: rshepard at appl-ecosys.com (Rich Shepard) Date: Sun, 11 Nov 2007 15:52:14 -0800 (PST) Subject: [SciPy-user] Problem building SciPy-0.6.0 Message-ID: I upgraded my notebook from Slackware-11.0 to -12.0 so libc, gcc, etc. are different. I've built ATLAS (and copied the *.a and *.so libraries to /usr/local/lib), then successfully built and installed NumPy-1.0.4.
But, when I try 'python setup.py install' from the scipy-0.6.0 directory the attempt immediately fails: Traceback (most recent call last): File "setup.py", line 53, in setup_package() File "setup.py", line 28, in setup_package from numpy.distutils.core import setup File "/usr/lib/python2.5/site-packages/numpy/__init__.py", line 43, in import linalg File "/usr/lib/python2.5/site-packages/numpy/linalg/__init__.py", line 4, in from linalg import * File "/usr/lib/python2.5/site-packages/numpy/linalg/linalg.py", line 25, in from numpy.linalg import lapack_lite ImportError: liblapack.so: cannot open shared object file: No such file or directory liblapack.so is in both /usr/local/lib and ~/python/ATLAS/ATLAS-linux/lib/. What am I missing here? Rich -- Richard B. Shepard, Ph.D. | Integrity Credibility Applied Ecosystem Services, Inc. | Innovation Voice: 503-667-4517 Fax: 503-667-8863 From rshepard at appl-ecosys.com Sun Nov 11 19:29:58 2007 From: rshepard at appl-ecosys.com (Rich Shepard) Date: Sun, 11 Nov 2007 16:29:58 -0800 (PST) Subject: [SciPy-user] Problem building SciPy-0.6.0 -- FIXED In-Reply-To: References: Message-ID: On Sun, 11 Nov 2007, Rich Shepard wrote: > What am I missing here? Found it: python was looking in /usr/lib, and not in /usr/local/lib. Softlinks fixed this problem. Rich -- Richard B. Shepard, Ph.D. | Integrity Credibility Applied Ecosystem Services, Inc. | Innovation Voice: 503-667-4517 Fax: 503-667-8863 From david at ar.media.kyoto-u.ac.jp Sun Nov 11 22:03:02 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Mon, 12 Nov 2007 12:03:02 +0900 Subject: [SciPy-user] Bias in numpy.random.multivariate_normal? In-Reply-To: References: <28fb541d0711101826y6b443a13o5c122d5263e9b643@mail.gmail.com> Message-ID: <4737C266.1080905@ar.media.kyoto-u.ac.jp> John Reid wrote: > Matthieu Brucher wrote: > >> Did you try the rvs method of those classes ? >> > > Of which classes? 
I couldn't see anything in scipy.stats or > scipy.stats.distributions to do with the wishart distribution. > I don't think there is. This would be handy, indeed. So handy that I have just created a ticket for it :) http://scipy.org/scipy/scipy/ticket/536 cheers, David From c.j.lee at tnw.utwente.nl Mon Nov 12 06:14:50 2007 From: c.j.lee at tnw.utwente.nl (Chris Lee) Date: Mon, 12 Nov 2007 12:14:50 +0100 Subject: [SciPy-user] instrumentation control question Message-ID: <473835AA.20801@tnw.utwente.nl> Hi, I know this is a bit off-topic but I wondered if anyone here had any experience with the following problem. Part of our work involves controlling a bunch of lab instrumentation, which so far has worked beautifully using python + scipy/numpy + pyVISA. Then we got a stepper motor which does not comes with an activeX control and (as far as I can tell) nothing else. This means using the com extensions for win32 and trying to communicate with the activeX control. All is not progressing well (we can find the control but send no data to it, and it certainly won't talk back to us). Does anyone have any experience with this? If so, would you mind sending me some code snippets for gaining control of the activeX element? Thanks very much and my appologies for the not-so-on-topic post (I actually think instrumentation control would be a great addition to scipy and if I become competent may look at contributing code in that direction). Cheers Chris From sandricionut at yahoo.com Mon Nov 12 09:07:57 2007 From: sandricionut at yahoo.com (sandric ionut) Date: Mon, 12 Nov 2007 06:07:57 -0800 (PST) Subject: [SciPy-user] random gaussian array Message-ID: <834845.12895.qm@web51304.mail.re2.yahoo.com> Hi: I want to create an random Gaussian 2D array with an exact size. 
For example I want my array to look like: 2 3 4 0 1 1 2 3 3 4 1 1 3 2 5 6 7 8 all the values should be randomly generated from a Gaussian distribution with mean and standard deviation Is it possible to do it with scipy or numpy? Thank you Ionut __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthieu.brucher at gmail.com Mon Nov 12 09:11:08 2007 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Mon, 12 Nov 2007 15:11:08 +0100 Subject: [SciPy-user] random gaussian array In-Reply-To: <834845.12895.qm@web51304.mail.re2.yahoo.com> References: <834845.12895.qm@web51304.mail.re2.yahoo.com> Message-ID: Yes, look into numpy.random -> norm Matthieu 2007/11/12, sandric ionut : > > Hi: > > I want to create an random Gaussian 2D array with an exact size. For > example I want my array to look like: > > 2 3 4 0 1 1 > 2 3 3 4 1 1 > 3 2 5 6 7 8 > > all the values should be randomly generated from a Gaussian distribution > with mean and standard deviation > > Is it possible to do it with scipy or numpy? > > Thank you > > Ionut > > __________________________________________________ > Do You Yahoo!? > Tired of spam? Yahoo! Mail has the best spam protection around > http://mail.yahoo.com > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher -------------- next part -------------- An HTML attachment was scrubbed... 
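The function Matthieu is pointing at is numpy.random.normal; its `size` argument fixes the output shape exactly, so a 3-by-6 array like the one Ionut sketches is one call (the mean and standard deviation below are placeholders):

```python
import numpy as np

# Draw a 3x6 array of Gaussian variates with the requested mean/std;
# loc=3.0 and scale=2.0 are illustrative values, not from the question.
arr = np.random.normal(loc=3.0, scale=2.0, size=(3, 6))
```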
URL: From berthe.loic at gmail.com Mon Nov 12 11:52:35 2007 From: berthe.loic at gmail.com (LB) Date: Mon, 12 Nov 2007 16:52:35 -0000 Subject: [SciPy-user] nullclines for nonlinear ODEs In-Reply-To: <1193769594.019129.155320@d55g2000hsg.googlegroups.com> References: <88e473830710300926l66c350c3k72b7fda9e6fd5c31@mail.gmail.com> <88e473830710300928r68f376b8k2a6a5c76b04c11c1@mail.gmail.com> <1193769594.019129.155320@d55g2000hsg.googlegroups.com> Message-ID: <1194886355.028155.281910@v3g2000hsg.googlegroups.com> Just for information, I've put a first version of this tutorial on the Wiki : http://www.scipy.org/LoktaVolterraTutorial. I don't talk about nullclines for the moment but feel to complete it - or to correct it if you find some mistakes. -- LB From jeremit0 at gmail.com Mon Nov 12 12:06:31 2007 From: jeremit0 at gmail.com (Jeremy Conlin) Date: Mon, 12 Nov 2007 12:06:31 -0500 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 Message-ID: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> Hello all. I recently upgraded scipy from the svn distribution and tried installing it. I followed the instructions on http://scipy.org/Installing_SciPy/Mac_OS_X and downloaded the gfortran binaries from the att.com site. When I run python setup.py build it compiles for awhile but fails. The last few messages are copied at the end of this message. Is another compiler (i.e. f95) needed? Do I just need to rename the gfortran executable? I tried making a symlink g95->gfortran but it didn't work. Does anyone have any suggestions? Thanks, Jeremy $ gcc --version i686-apple-darwin9-gcc-4.0.1 (GCC) 4.0.1 (Apple Inc. build 5465) Copyright (C) 2005 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. $ gfortran --version GNU Fortran (GCC) 4.2.1 Copyright (C) 2007 Free Software Foundation, Inc. 
GNU Fortran comes with NO WARRANTY, to the extent permitted by law. You may redistribute copies of GNU Fortran under the terms of the GNU General Public License. For more information about these matters, see the file named COPYING $ python setup.py build ? compiling Fortran sources Fortran f77 compiler: f95 -fixed -O4 -target=native Fortran f90 compiler: f95 -O4 -target=native Fortran fix compiler: f95 -fixed -O4 -target=native creating build/temp.macosx-10.5-i386-2.5 creating build/temp.macosx-10.5-i386-2.5/scipy creating build/temp.macosx-10.5-i386-2.5/scipy/fftpack creating build/temp.macosx-10.5-i386-2.5/scipy/fftpack/dfftpack compile options: '-c' f95:f77: scipy/fftpack/dfftpack/dcosqb.f sh: f95: command not found sh: f95: command not found error: Command "f95 -fixed -O4 -target=native -c -c scipy/fftpack/dfftpack/dcosqb.f -o build/temp.macosx-10.5-i386-2.5/scipy/fftpack/dfftpack/dcosqb.o" failed with exit status 127 From hasslerjc at comcast.net Mon Nov 12 13:00:32 2007 From: hasslerjc at comcast.net (John Hassler) Date: Mon, 12 Nov 2007 13:00:32 -0500 Subject: [SciPy-user] instrumentation control question In-Reply-To: <473835AA.20801@tnw.utwente.nl> References: <473835AA.20801@tnw.utwente.nl> Message-ID: <473894C0.7000005@comcast.net> I've used ctypes to access .dll libraries, but I've never used activex. Andrew Straw has written a very nice library for Measurement Computing hardware. Perhaps something there would be useful: http://code.astraw.com/ Do you have a hardware stepper driver, or are you trying to do it all in software? john Chris Lee wrote: > Hi, I know this is a bit off-topic but I wondered if anyone here had any > experience with the following problem. > > Part of our work involves controlling a bunch of lab instrumentation, > which so far has worked beautifully using python + scipy/numpy + pyVISA. > Then we got a stepper motor which does not comes with an activeX control > and (as far as I can tell) nothing else. 
> > This means using the com extensions for win32 and trying to communicate > with the activeX control. All is not progressing well (we can find the > control but send no data to it, and it certainly won't talk back to us). > Does anyone have any experience with this? If so, would you mind sending > me some code snippets for gaining control of the activeX element? > > Thanks very much and my appologies for the not-so-on-topic post (I > actually think instrumentation control would be a great addition to > scipy and if I become competent may look at contributing code in that > direction). > > Cheers > Chris > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > From dominique.orban at gmail.com Mon Nov 12 13:19:31 2007 From: dominique.orban at gmail.com (Dominique Orban) Date: Mon, 12 Nov 2007 13:19:31 -0500 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> References: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> Message-ID: <8793ae6e0711121019k35291310q664910303c6cdac7@mail.gmail.com> On 11/12/07, Jeremy Conlin wrote: > Hello all. I recently upgraded scipy from the svn distribution and > tried installing it. I followed the instructions on > http://scipy.org/Installing_SciPy/Mac_OS_X > > and downloaded the gfortran binaries from the att.com site. When I run > > python setup.py build > > it compiles for awhile but fails. The last few messages are copied at > the end of this message. Is another compiler (i.e. f95) needed? Do I > just need to rename the gfortran executable? I tried making a symlink > g95->gfortran but it didn't work. Does anyone have any suggestions? > > Thanks, > Jeremy > > $ gcc --version > i686-apple-darwin9-gcc-4.0.1 (GCC) 4.0.1 (Apple Inc. build 5465) > Copyright (C) 2005 Free Software Foundation, Inc. 
> This is free software; see the source for copying conditions. There is NO > warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. > > $ gfortran --version > GNU Fortran (GCC) 4.2.1 > Copyright (C) 2007 Free Software Foundation, Inc. > > GNU Fortran comes with NO WARRANTY, to the extent permitted by law. > You may redistribute copies of GNU Fortran > under the terms of the GNU General Public License. > For more information about these matters, see the file named COPYING > > $ python setup.py build > ? > compiling Fortran sources > Fortran f77 compiler: f95 -fixed -O4 -target=native > Fortran f90 compiler: f95 -O4 -target=native > Fortran fix compiler: f95 -fixed -O4 -target=native > creating build/temp.macosx-10.5-i386-2.5 > creating build/temp.macosx-10.5-i386-2.5/scipy > creating build/temp.macosx-10.5-i386-2.5/scipy/fftpack > creating build/temp.macosx-10.5-i386-2.5/scipy/fftpack/dfftpack > compile options: '-c' > f95:f77: scipy/fftpack/dfftpack/dcosqb.f > sh: f95: command not found > sh: f95: command not found > error: Command "f95 -fixed -O4 -target=native -c -c > scipy/fftpack/dfftpack/dcosqb.f -o > build/temp.macosx-10.5-i386-2.5/scipy/fftpack/dfftpack/dcosqb.o" > failed with exit status 127 > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > Hi Jeremy, did you perhaps forget to specify that you want to use gfortran to build SciPy? You can do it as so: python setup.py config_fc --fcompiler=gfortran build See INSTALL.txt for more information. 
Good luck, Dominique From C.J.Lee at tnw.utwente.nl Mon Nov 12 13:26:06 2007 From: C.J.Lee at tnw.utwente.nl (C.J.Lee at tnw.utwente.nl) Date: Mon, 12 Nov 2007 19:26:06 +0100 Subject: [SciPy-user] instrumentation control question References: <473835AA.20801@tnw.utwente.nl> <473894C0.7000005@comcast.net> Message-ID: <9E7F7554631C0047AB598527A4E9569B6F8121@tnx4.dynamic.tnw.utwente.nl> Well actually it is a DC motor that is controlled via a USB "block" that can be used to drive either stepper motors, DC servos, or piezos, or a combination (depending on the number of channels) I have discovered where the dlls are hidden but have not managed to drive it using ctypes (I have had success with this using a different motor control/motor combination) I will take a look at andrew's web site. Cheers Chris PS John, sorry for the double email but I didn't reply to the list on the first occasion -----Original Message----- From: John Hassler [mailto:hasslerjc at comcast.net] Sent: Mon 11/12/2007 7:00 PM To: Lee, Chris (TNW); SciPy Users List Subject: Re: [SciPy-user] instrumentation control question I've used ctypes to access .dll libraries, but I've never used activex. Andrew Straw has written a very nice library for Measurement Computing hardware. Perhaps something there would be useful: http://code.astraw.com/ Do you have a hardware stepper driver, or are you trying to do it all in software? john Chris Lee wrote: > Hi, I know this is a bit off-topic but I wondered if anyone here had any > experience with the following problem. > > Part of our work involves controlling a bunch of lab instrumentation, > which so far has worked beautifully using python + scipy/numpy + pyVISA. > Then we got a stepper motor which does not comes with an activeX control > and (as far as I can tell) nothing else. > > This means using the com extensions for win32 and trying to communicate > with the activeX control. 
All is not progressing well (we can find the > control but send no data to it, and it certainly won't talk back to us). > Does anyone have any experience with this? If so, would you mind sending > me some code snippets for gaining control of the activeX element? > > Thanks very much and my appologies for the not-so-on-topic post (I > actually think instrumentation control would be a great addition to > scipy and if I become competent may look at contributing code in that > direction). > > Cheers > Chris > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > From ellisonbg.net at gmail.com Mon Nov 12 13:39:04 2007 From: ellisonbg.net at gmail.com (Brian Granger) Date: Mon, 12 Nov 2007 11:39:04 -0700 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> References: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> Message-ID: <6ce0ac130711121039y47445aabu22acaa50ff402f83@mail.gmail.com> Are you using the builtin python? Are you using the numpy that comes with the builtin python? If so, you will need to upgrade numpy first. But, there are some pitfalls along this path as well. See recent threads on the numpy list and the pythonmac-sig list for details. Or, just use the python binary from python.org and install numpy/scipy from scratch. Brian On Nov 12, 2007 10:06 AM, Jeremy Conlin wrote: > Hello all. I recently upgraded scipy from the svn distribution and > tried installing it. I followed the instructions on > http://scipy.org/Installing_SciPy/Mac_OS_X > > and downloaded the gfortran binaries from the att.com site. When I run > > python setup.py build > > it compiles for awhile but fails. The last few messages are copied at > the end of this message. Is another compiler (i.e. f95) needed? Do I > just need to rename the gfortran executable? 
I tried making a symlink > g95->gfortran but it didn't work. Does anyone have any suggestions? > > Thanks, > Jeremy > > $ gcc --version > i686-apple-darwin9-gcc-4.0.1 (GCC) 4.0.1 (Apple Inc. build 5465) > Copyright (C) 2005 Free Software Foundation, Inc. > This is free software; see the source for copying conditions. There is NO > warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. > > $ gfortran --version > GNU Fortran (GCC) 4.2.1 > Copyright (C) 2007 Free Software Foundation, Inc. > > GNU Fortran comes with NO WARRANTY, to the extent permitted by law. > You may redistribute copies of GNU Fortran > under the terms of the GNU General Public License. > For more information about these matters, see the file named COPYING > > $ python setup.py build > ? > compiling Fortran sources > Fortran f77 compiler: f95 -fixed -O4 -target=native > Fortran f90 compiler: f95 -O4 -target=native > Fortran fix compiler: f95 -fixed -O4 -target=native > creating build/temp.macosx-10.5-i386-2.5 > creating build/temp.macosx-10.5-i386-2.5/scipy > creating build/temp.macosx-10.5-i386-2.5/scipy/fftpack > creating build/temp.macosx-10.5-i386-2.5/scipy/fftpack/dfftpack > compile options: '-c' > f95:f77: scipy/fftpack/dfftpack/dcosqb.f > sh: f95: command not found > sh: f95: command not found > error: Command "f95 -fixed -O4 -target=native -c -c > scipy/fftpack/dfftpack/dcosqb.f -o > build/temp.macosx-10.5-i386-2.5/scipy/fftpack/dfftpack/dcosqb.o" > failed with exit status 127 > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From jeremit0 at gmail.com Mon Nov 12 14:23:03 2007 From: jeremit0 at gmail.com (Jeremy Conlin) Date: Mon, 12 Nov 2007 14:23:03 -0500 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <8793ae6e0711121019k35291310q664910303c6cdac7@mail.gmail.com> References: 
<3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> <8793ae6e0711121019k35291310q664910303c6cdac7@mail.gmail.com> Message-ID: <3db594f70711121123l1919a490q1ad159e0ea8a4f0c@mail.gmail.com> On Nov 12, 2007 1:19 PM, Dominique Orban wrote: > > On 11/12/07, Jeremy Conlin wrote: > > Hello all. I recently upgraded scipy from the svn distribution and > > tried installing it. I followed the instructions on > > http://scipy.org/Installing_SciPy/Mac_OS_X > > > > and downloaded the gfortran binaries from the att.com site. When I run > > > > python setup.py build > > > > it compiles for awhile but fails. The last few messages are copied at > > the end of this message. Is another compiler (i.e. f95) needed? Do I > > just need to rename the gfortran executable? I tried making a symlink > > g95->gfortran but it didn't work. Does anyone have any suggestions? > > > > Thanks, > > Jeremy > > > > $ gcc --version > > i686-apple-darwin9-gcc-4.0.1 (GCC) 4.0.1 (Apple Inc. build 5465) > > Copyright (C) 2005 Free Software Foundation, Inc. > > This is free software; see the source for copying conditions. There is NO > > warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. > > > > $ gfortran --version > > GNU Fortran (GCC) 4.2.1 > > Copyright (C) 2007 Free Software Foundation, Inc. > > > > GNU Fortran comes with NO WARRANTY, to the extent permitted by law. > > You may redistribute copies of GNU Fortran > > under the terms of the GNU General Public License. > > For more information about these matters, see the file named COPYING > > > > $ python setup.py build > > ? 
> > compiling Fortran sources > > Fortran f77 compiler: f95 -fixed -O4 -target=native > > Fortran f90 compiler: f95 -O4 -target=native > > Fortran fix compiler: f95 -fixed -O4 -target=native > > creating build/temp.macosx-10.5-i386-2.5 > > creating build/temp.macosx-10.5-i386-2.5/scipy > > creating build/temp.macosx-10.5-i386-2.5/scipy/fftpack > > creating build/temp.macosx-10.5-i386-2.5/scipy/fftpack/dfftpack > > compile options: '-c' > > f95:f77: scipy/fftpack/dfftpack/dcosqb.f > > sh: f95: command not found > > sh: f95: command not found > > error: Command "f95 -fixed -O4 -target=native -c -c > > scipy/fftpack/dfftpack/dcosqb.f -o > > build/temp.macosx-10.5-i386-2.5/scipy/fftpack/dfftpack/dcosqb.o" > > failed with exit status 127 > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > Hi Jeremy, > > did you perhaps forget to specify that you want to use gfortran to > build SciPy? You can do it as so: > > python setup.py config_fc --fcompiler=gfortran build > > See INSTALL.txt for more information. Duh! I did forget that important detail. I'll look more closely at the instructions on the wiki next time :P. I'll post again if there are troubles. Thanks, Jeremy From jeremit0 at gmail.com Mon Nov 12 14:25:24 2007 From: jeremit0 at gmail.com (Jeremy Conlin) Date: Mon, 12 Nov 2007 14:25:24 -0500 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <6ce0ac130711121039y47445aabu22acaa50ff402f83@mail.gmail.com> References: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> <6ce0ac130711121039y47445aabu22acaa50ff402f83@mail.gmail.com> Message-ID: <3db594f70711121125p1cea3dfaqd11133fd5f24e01a@mail.gmail.com> On Nov 12, 2007 1:39 PM, Brian Granger wrote: > Are you using the builtin python? > Are you using the numpy that comes with the builtin python? > > If so, you will need to upgrade numpy first. 
But, there are some > pitfalls along this path as well. See recent threads on the numpy > list and the pythonmac-sig list for details. Or, just use the python > binary from python.org and install numpy/scipy from scratch. > > Brian I am using the builtin python. I did build numpy myself from svn, but it looks like python is using builtin version. How can I make it use my compiled version instead? I will take a look at the pythonmac-sig email list and see what is known about scipy on Leopard. Thanks again, Jeremy From mhearne at usgs.gov Mon Nov 12 15:26:02 2007 From: mhearne at usgs.gov (Michael Hearne) Date: Mon, 12 Nov 2007 13:26:02 -0700 Subject: [SciPy-user] Bundling numpy/scipy with applications Message-ID: <0CDABAC6-8EA6-409E-B389-578864A6407E@usgs.gov> I'm creating a Python application that is internal to my organization, but will be installed on both Linux and Mac OS X machines. The application depends heavily on a number of "non-pure" modules (those that have C/C++/FORTRAN components), like numpy, scipy, gdal, etc. What is the most "pythonic" way to bundle these types of modules inside my application? I've investigated dist_utils and setuptools, and I don't see an easy way with those tools to include build instructions for packages built with autconf tools. Is my only recourse to write a custom install script that calls the "configure;make;make install" from a shell? --Mike ------------------------------------------------------ Michael Hearne mhearne at usgs.gov (303) 273-8620 USGS National Earthquake Information Center 1711 Illinois St. Golden CO 80401 Senior Software Engineer Synergetics, Inc. ------------------------------------------------------ -------------- next part -------------- An HTML attachment was scrubbed... 
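The "custom install script" Michael asks about need not be a shell script; a minimal sketch of driving an autoconf-style build from Python looks like the following (the directory arguments are hypothetical, and this is not a distutils/setuptools facility):

```python
import subprocess

def build_autotools_package(srcdir, prefix):
    """Run configure; make; make install for one source tree.

    A hedged illustration of wrapping an autoconf build from Python;
    srcdir and prefix are caller-supplied paths, nothing more.
    """
    for cmd in (["./configure", "--prefix=" + prefix],
                ["make"],
                ["make", "install"]):
        subprocess.check_call(cmd, cwd=srcdir)
```

check_call raises CalledProcessError on a non-zero exit, so a failed configure stops the chain instead of silently continuing to make.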
URL: From lists.steve at arachnedesign.net Mon Nov 12 16:02:48 2007 From: lists.steve at arachnedesign.net (Steve Lianoglou) Date: Mon, 12 Nov 2007 16:02:48 -0500 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <3db594f70711121125p1cea3dfaqd11133fd5f24e01a@mail.gmail.com> References: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> <6ce0ac130711121039y47445aabu22acaa50ff402f83@mail.gmail.com> <3db594f70711121125p1cea3dfaqd11133fd5f24e01a@mail.gmail.com> Message-ID: <4D4B58BA-70E1-46E1-A955-7C516BB4BB79@arachnedesign.net> > I am using the builtin python. I did build numpy myself from svn, but > it looks like python is using builtin version. How can I make it use > my compiled version instead? > > I will take a look at the pythonmac-sig email list and see what is > known about scipy on Leopard. If I remember correctly, you have to setup your PYTHONPATH envi var to include the libraries in /Library/Python/2.5/site-packages before the ones that are installed into the system path (which I think are buried somewhere in /System/Library/Frameworks/Python.framework/...). (The normal `python setup.py install` command will drop then into / Library/Python/2.5/site-packages) Also, FWIW, I've had better luck using the gfortran binary from: http://r.research.att.com/tools Over the one that I think is suggested at the tutorial (from hpc.sf.net) even though they haven't been upgraded for Leopard yet (which I don't think they even need to be). 
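Steve's point boils down to sys.path order: the first matching entry wins. A small helper like the one below (an illustration, not anything in numpy or scipy) reorders sys.path the same way the PYTHONPATH trick does:

```python
import sys

def promote(prefix):
    """Move sys.path entries under `prefix` to the front of the path.

    Packages installed under `prefix` then shadow same-named packages
    installed elsewhere (e.g. the system numpy).
    """
    ours = [p for p in sys.path if p.startswith(prefix)]
    rest = [p for p in sys.path if not p.startswith(prefix)]
    sys.path[:] = ours + rest
```

A quick sanity check for which copy actually gets imported is `import numpy; print numpy.__file__`.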
HTH, -steve From aisaac at american.edu Mon Nov 12 16:46:42 2007 From: aisaac at american.edu (Alan G Isaac) Date: Mon, 12 Nov 2007 16:46:42 -0500 Subject: [SciPy-user] nullclines for nonlinear ODEs In-Reply-To: <1194886355.028155.281910@v3g2000hsg.googlegroups.com> References: <88e473830710300926l66c350c3k72b7fda9e6fd5c31@mail.gmail.com><88e473830710300928r68f376b8k2a6a5c76b04c11c1@mail.gmail.com><1193769594.019129.155320@d55g2000hsg.googlegroups.com><1194886355.028155.281910@v3g2000hsg.googlegroups.com> Message-ID: On Mon, 12 Nov 2007, LB apparently wrote: > http://www.scipy.org/LoktaVolterraTutorial After a quick look, I believe this will be very useful for my students. Thank you! Alan Isaac From jeremit0 at gmail.com Mon Nov 12 16:51:38 2007 From: jeremit0 at gmail.com (Jeremy Conlin) Date: Mon, 12 Nov 2007 16:51:38 -0500 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <4D4B58BA-70E1-46E1-A955-7C516BB4BB79@arachnedesign.net> References: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> <6ce0ac130711121039y47445aabu22acaa50ff402f83@mail.gmail.com> <3db594f70711121125p1cea3dfaqd11133fd5f24e01a@mail.gmail.com> <4D4B58BA-70E1-46E1-A955-7C516BB4BB79@arachnedesign.net> Message-ID: <3db594f70711121351w4a803df1u58fadcb2488efdbd@mail.gmail.com> On Nov 12, 2007 4:02 PM, Steve Lianoglou wrote: > > I am using the builtin python. I did build numpy myself from svn, but > > it looks like python is using builtin version. How can I make it use > > my compiled version instead? > > > > I will take a look at the pythonmac-sig email list and see what is > > known about scipy on Leopard. > > If I remember correctly, you have to setup your PYTHONPATH envi var to > include the libraries in /Library/Python/2.5/site-packages before the > ones that are installed into the system path (which I think are buried > somewhere in /System/Library/Frameworks/Python.framework/...). 
> > (The normal `python setup.py install` command will drop then into / > Library/Python/2.5/site-packages) I got numpy and scipy to install and they both passed all the tests. This is how I did it. I followed the instructions on scipy.org for installing on Mac OS X. I used the fortran compiler from att.com site. I set my PYTHONPATH as: export PYTHONPATH="/Library/Python/2.5/site-packages:$PYTHONPATH" Everything compiled without any problems. Thanks for everyone's suggestions. I hope my results will help others. Jeremy From cohen at slac.stanford.edu Mon Nov 12 17:02:24 2007 From: cohen at slac.stanford.edu (Johann Cohen-Tanugi) Date: Mon, 12 Nov 2007 14:02:24 -0800 Subject: [SciPy-user] problem building scipy : liblapack.so not found.... Message-ID: <4738CD70.7080401@slac.stanford.edu> I have the current SVN trunk, and I built lapack and ATLAS following the doc in the scipy web site. I also built numpy from SVN. Now when trying toinstall scipy I get : [cohen at localhost scipy-svn]$ su -c 'python setup.py install' Password: Traceback (most recent call last): File "setup.py", line 92, in setup_package() File "setup.py", line 63, in setup_package from numpy.distutils.core import setup File "/usr/lib/python2.5/site-packages/numpy/__init__.py", line 43, in import linalg File "/usr/lib/python2.5/site-packages/numpy/linalg/__init__.py", line 4, in from linalg import * File "/usr/lib/python2.5/site-packages/numpy/linalg/linalg.py", line 25, in from numpy.linalg import lapack_lite ImportError: liblapack.so: cannot open shared object file: No such file or directory but I do have this library: [cohen at localhost scipy-svn]$ ls -l /usr/local/lib/liblapack.so lrwxrwxrwx 1 root root 33 2007-11-12 09:29 /usr/local/lib/liblapack.so -> /usr/local/atlas/lib/liblapack.so and it should be in my paths. 
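For a shared library, "in my paths" means the dynamic loader's search path, not $PATH or sys.path. Two common Linux fixes, sketched here with the /usr/local/lib location from the message above:

```shell
# Per-shell fix: tell the runtime loader where liblapack.so lives.
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
# Persistent alternative (as root): append /usr/local/lib to
# /etc/ld.so.conf and then run ldconfig to refresh the loader cache.
```

Note that `su -c '...'` starts with a fresh environment, so an LD_LIBRARY_PATH exported in the user shell is not seen by the root install command, which would explain the import succeeding interactively while the same line fails during the install.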
Moreover, I can issue the offending line without problem: [cohen at localhost scipy-svn]$ ipython Python 2.5 (r25:51908, Oct 19 2007, 09:47:40) Type "copyright", "credits" or "license" for more information. IPython 0.8.2.svn.r2848 -- An enhanced Interactive Python. ? -> Introduction and overview of IPython's features. %quickref -> Quick reference. help -> Python's own help system. object? -> Details about 'object'. ?object also works, ?? prints more. In [1]: from numpy.linalg import lapack_lite In [2]: dir(lapack_lite) Out[2]: ['LapackError', '__doc__', '__file__', '__name__', 'dgeev', 'dgelsd', 'dgeqrf', 'dgesdd', 'dgesv', 'dgetrf', 'dorgqr', 'dpotrf', 'dsyevd', 'zgeev', 'zgelsd', 'zgeqrf', 'zgesdd', 'zgesv', 'zgetrf', 'zheevd', 'zpotrf', 'zungqr'] \What is going on? thanks, Johann From david at ar.media.kyoto-u.ac.jp Mon Nov 12 23:48:18 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Tue, 13 Nov 2007 13:48:18 +0900 Subject: [SciPy-user] Bundling numpy/scipy with applications In-Reply-To: <0CDABAC6-8EA6-409E-B389-578864A6407E@usgs.gov> References: <0CDABAC6-8EA6-409E-B389-578864A6407E@usgs.gov> Message-ID: <47392C92.8020208@ar.media.kyoto-u.ac.jp> Michael Hearne wrote: > I'm creating a Python application that is internal to my organization, > but will be installed on both Linux and Mac OS X machines. The > application depends heavily on a number of "non-pure" modules (those > that have C/C++/FORTRAN components), like numpy, scipy, gdal, etc. > > What is the most "pythonic" way to bundle these types of modules > inside my application? > > I've investigated dist_utils and setuptools, and I don't see an easy > way with those tools to include build instructions for packages built > with autconf tools. > > Is my only recourse to write a custom install script that calls the > "configure;make;make install" from a shell? > You have two problems: building and packaging. 
For packaging, autoconf will not help you much outside providing a coherence between packages, I think; distutils/setuptools have packaging tools, but I don't know how good they are. On Linux, the packaging system depends on the distribution: since it is only internal, the problem is severely simplified, since you are likely to target only one format (rpm, deb, etc...). On Mac OS X, you have also tools to package .pkg and .mpkg (which are a set of .pkg), available freely on apple dev website; I have not used them much, but they look simple and easy to use for simple command line packages. You could hack something in distutils to call configure and so on, but I don't think it will be pleasant. Building packages using distutils/setuptools is easy, and do not need much configuration. So I think it is more natural to use a top level tool calling distutils/setuptools than to use distutils as the top level tool. As a pythonic way, you may consider build tools, such as scons or waf, both written in python. But frankly, I would not bother too much about pythonic way: since some tools will be shell based anyway (autotools, packaging tools), and you don't have to care about windows, using make and shell may actually be easier. For the build part, you may take a look at my garnumpy package: it is essentially a set of rules for Gnu Makefiles, and it can build numpy + scipy, with a configurable set of dependencies (ATLAS, NETLIB BLAS/LAPACK, fftw, etc....). It can build both distutils and autotools-based packages. http://www.ar.media.kyoto-u.ac.jp/members/david/archives/garnumpy/garnumpy-0.4.tbz2 I used it sucessfully on linux and cygwin, so I would say it should work on Mac OS X without too much trouble. The only thing which will be likely to be a pain is fat binaries (Universal). I use it to build a totally self contained numpy/scipy installation, which is a first step toward packaging. 
If you think it can be useful to you, don't hesitate to ask questions; there is also a bzr archive if you want to have access to the dev history of the tool. cheers, David From ellisonbg.net at gmail.com Mon Nov 12 23:59:08 2007 From: ellisonbg.net at gmail.com (Brian Granger) Date: Mon, 12 Nov 2007 21:59:08 -0700 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <3db594f70711121351w4a803df1u58fadcb2488efdbd@mail.gmail.com> References: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> <6ce0ac130711121039y47445aabu22acaa50ff402f83@mail.gmail.com> <3db594f70711121125p1cea3dfaqd11133fd5f24e01a@mail.gmail.com> <4D4B58BA-70E1-46E1-A955-7C516BB4BB79@arachnedesign.net> <3db594f70711121351w4a803df1u58fadcb2488efdbd@mail.gmail.com> Message-ID: <6ce0ac130711122059x610214bey5698dde00ee9cd8c@mail.gmail.com> Great, now you are ready for the other problem on Leopard: sudo does not preserve the PYTHONPATH environment variable. Thus anytime you do:: sudo python setup.py install to install something else, your version of numpy won't be used! This has been reported to Apple, and the problem can be fixed by adding PYTHONPATH to the appropriate section in /etc/sudoers. Another option (see the recent thread on pythonmac-sig) is to create a .pth file that dynamically reorders the sys.path. This approach is nice as it doesn't require setting PYTHONPATH. You will run into this when installing matplotlib and pytables...if you go that way. Brian On Nov 12, 2007 2:51 PM, Jeremy Conlin wrote: > On Nov 12, 2007 4:02 PM, Steve Lianoglou wrote: > > > I am using the builtin python. I did build numpy myself from svn, but > > > it looks like python is using builtin version. How can I make it use > > > my compiled version instead? > > > > > > I will take a look at the pythonmac-sig email list and see what is > > > known about scipy on Leopard.
> > > > If I remember correctly, you have to set up your PYTHONPATH env var to > > include the libraries in /Library/Python/2.5/site-packages before the > > ones that are installed into the system path (which I think are buried > > somewhere in /System/Library/Frameworks/Python.framework/...). > > > > (The normal `python setup.py install` command will drop them into / > > Library/Python/2.5/site-packages) > > I got numpy and scipy to install and they both passed all the tests. > This is how I did it. > > I followed the instructions on scipy.org for installing on Mac OS X. > I used the fortran compiler from the att.com site. I set my PYTHONPATH > as: > > export PYTHONPATH="/Library/Python/2.5/site-packages:$PYTHONPATH" > > Everything compiled without any problems. Thanks for everyone's > suggestions. I hope my results will help others. > > Jeremy > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From cohen at slac.stanford.edu Tue Nov 13 00:58:47 2007 From: cohen at slac.stanford.edu (Johann Cohen-Tanugi) Date: Mon, 12 Nov 2007 21:58:47 -0800 Subject: [SciPy-user] Bundling numpy/scipy with applications In-Reply-To: <47392C92.8020208@ar.media.kyoto-u.ac.jp> References: <0CDABAC6-8EA6-409E-B389-578864A6407E@usgs.gov> <47392C92.8020208@ar.media.kyoto-u.ac.jp> Message-ID: <47393D17.9090000@slac.stanford.edu> David Cournapeau wrote: > Michael Hearne wrote: > >> I'm creating a Python application that is internal to my organization, >> but will be installed on both Linux and Mac OS X machines. The >> application depends heavily on a number of "non-pure" modules (those >> that have C/C++/FORTRAN components), like numpy, scipy, gdal, etc. >> >> What is the most "pythonic" way to bundle these types of modules >> inside my application?
>> >> I've investigated dist_utils and setuptools, and I don't see an easy >> way with those tools to include build instructions for packages built >> with autconf tools. >> >> Is my only recourse to write a custom install script that calls the >> "configure;make;make install" from a shell? >> >> > You have two problems: building and packaging. For packaging, autoconf > will not help you much outside providing a coherence between packages, I > think; distutils/setuptools have packaging tools, but I don't know how > good they are. > > On Linux, the packaging system depends on the distribution: since it is > only internal, the problem is severely simplified, since you are likely > to target only one format (rpm, deb, etc...). On Mac OS X, you have also > tools to package .pkg and .mpkg (which are a set of .pkg), available > freely on apple dev website; I have not used them much, but they look > simple and easy to use for simple command line packages. > > You could hack something in distutils to call configure and so on, but I > don't think it will be pleasant. Building packages using > distutils/setuptools is easy, and do not need much configuration. So I > think it is more natural to use a top level tool calling > distutils/setuptools than to use distutils as the top level tool. As a > pythonic way, you may consider build tools, such as scons or waf, both > written in python. > > But frankly, I would not bother too much about pythonic way: since some > tools will be shell based anyway (autotools, packaging tools), and you > don't have to care about windows, using make and shell may actually be > easier. > > For the build part, you may take a look at my garnumpy package: it is > essentially a set of rules for Gnu Makefiles, and it can build numpy + > scipy, with a configurable set of dependencies (ATLAS, NETLIB > BLAS/LAPACK, fftw, etc....). It can build both distutils and > autotools-based packages. 
> > http://www.ar.media.kyoto-u.ac.jp/members/david/archives/garnumpy/garnumpy-0.4.tbz2 > > I used it sucessfully on linux and cygwin, so I would say it should work > on Mac OS X without too much trouble. The only thing which will be > likely to be a pain is fat binaries (Universal). I use it to build a > totally self contained numpy/scipy installation, which is a first step > toward packaging. If you think it can be useful to you, don't hesitate > to ask questions; there is also a bzr archive if you want to have access > to the dev history of the tool > > cheers, > > David > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > Hi David, I am having a hell of a time with lapack/atlas. I already posted the issue I currently have with scipy after following the building doc in scipy web site (see http://projects.scipy.org/pipermail/scipy-user/2007-November/014506.html). Besides, octave build seems to also have serious issues with my lapack/atlas build. 
So I immediately downloaded your package, but I seem to have a problem with the numpy download : make[3]: Leaving directory `/data1/sources/python/garnumpy-0.4/platform/numpy' # Change default path when looking for libs to fake dir, # so we can set everything by env variables cd work/main.d/numpy-1.0.3.1 && PYTHONPATH=/home/cohen/garnumpyinstall/lib/python2.5/site-packages:/home/cohen/garnumpyinstall/lib/python2.5/site-packages/gtk-2.0 /usr/bin/python \ setup.py config_fc --fcompiler=gnu config Traceback (most recent call last): File "setup.py", line 90, in <module> setup_package() File "setup.py", line 60, in setup_package from numpy.distutils.core import setup File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/__init__.py", line 39, in <module> import core File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/__init__.py", line 8, in <module> import numerictypes as nt File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/numerictypes.py", line 83, in <module> from numpy.core.multiarray import typeinfo, ndarray, array, empty, dtype ImportError: No module named multiarray make[2]: *** [configure-custom] Error 1 make[2]: Leaving directory `/data1/sources/python/garnumpy-0.4/platform/numpy' make[1]: *** [../../platform/numpy/cookies/main.d/install] Error 2 make[1]: Leaving directory `/data1/sources/python/garnumpy-0.4/platform/scipy' make: *** [imgdep-main] Error 2 Any idea? Your package is anyway a great idea. I would actually love to see it work, of course, but also allow for the possibility to point to already existing source directories for numpy and scipy (for instance from a previous svn checkout). I did not read the doc of your package so it might actually be there.... anyway, thanks in advance for your help.
Johann From david at ar.media.kyoto-u.ac.jp Tue Nov 13 01:10:32 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Tue, 13 Nov 2007 15:10:32 +0900 Subject: [SciPy-user] Bundling numpy/scipy with applications In-Reply-To: <47393D17.9090000@slac.stanford.edu> References: <0CDABAC6-8EA6-409E-B389-578864A6407E@usgs.gov> <47392C92.8020208@ar.media.kyoto-u.ac.jp> <47393D17.9090000@slac.stanford.edu> Message-ID: <47393FD8.7060400@ar.media.kyoto-u.ac.jp> Johann Cohen-Tanugi wrote: > David Cournapeau wrote: >> Michael Hearne wrote: >> >>> I'm creating a Python application that is internal to my organization, >>> but will be installed on both Linux and Mac OS X machines. The >>> application depends heavily on a number of "non-pure" modules (those >>> that have C/C++/FORTRAN components), like numpy, scipy, gdal, etc. >>> >>> What is the most "pythonic" way to bundle these types of modules >>> inside my application? >>> >>> I've investigated dist_utils and setuptools, and I don't see an easy >>> way with those tools to include build instructions for packages built >>> with autconf tools. >>> >>> Is my only recourse to write a custom install script that calls the >>> "configure;make;make install" from a shell? >>> >>> >> You have two problems: building and packaging. For packaging, autoconf >> will not help you much outside providing a coherence between packages, I >> think; distutils/setuptools have packaging tools, but I don't know how >> good they are. >> >> On Linux, the packaging system depends on the distribution: since it is >> only internal, the problem is severely simplified, since you are likely >> to target only one format (rpm, deb, etc...). On Mac OS X, you have also >> tools to package .pkg and .mpkg (which are a set of .pkg), available >> freely on apple dev website; I have not used them much, but they look >> simple and easy to use for simple command line packages. 
>> >> You could hack something in distutils to call configure and so on, but I >> don't think it will be pleasant. Building packages using >> distutils/setuptools is easy, and do not need much configuration. So I >> think it is more natural to use a top level tool calling >> distutils/setuptools than to use distutils as the top level tool. As a >> pythonic way, you may consider build tools, such as scons or waf, both >> written in python. >> >> But frankly, I would not bother too much about pythonic way: since some >> tools will be shell based anyway (autotools, packaging tools), and you >> don't have to care about windows, using make and shell may actually be >> easier. >> >> For the build part, you may take a look at my garnumpy package: it is >> essentially a set of rules for Gnu Makefiles, and it can build numpy + >> scipy, with a configurable set of dependencies (ATLAS, NETLIB >> BLAS/LAPACK, fftw, etc....). It can build both distutils and >> autotools-based packages. >> >> http://www.ar.media.kyoto-u.ac.jp/members/david/archives/garnumpy/garnumpy-0.4.tbz2 >> >> I used it sucessfully on linux and cygwin, so I would say it should work >> on Mac OS X without too much trouble. The only thing which will be >> likely to be a pain is fat binaries (Universal). I use it to build a >> totally self contained numpy/scipy installation, which is a first step >> toward packaging. If you think it can be useful to you, don't hesitate >> to ask questions; there is also a bzr archive if you want to have access >> to the dev history of the tool >> >> cheers, >> >> David >> _______________________________________________ >> SciPy-user mailing list >> SciPy-user at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-user >> > Hi David, > I am having a hell of a time with lapack/atlas. That's because they are far from trivial to build correctly. As always with build problems, it is never complicated, but problems are often difficult to track down. 
To make things worse, some linux distribution (including fedora and suse) did include bogus packages for blas and lapack. That's why I did this garnumpy thing in the first place: quickly build blas/lapack correctly, so that I can test things more easily. > I already posted the > issue I currently have with scipy after following the building doc in > scipy web site (see > http://projects.scipy.org/pipermail/scipy-user/2007-November/014506.html). > Besides, octave build seems to also have serious issues with my > lapack/atlas build. Which distribution are you using ? > > So I immediately downloaded youor package, but I seem to have a problem > with the numpy download : > make[3]: Leaving directory > `/data1/sources/python/garnumpy-0.4/platform/numpy' > # Change default path when looking for libs to fake dir, > # so we can set everything by env variables > cd work/main.d/numpy-1.0.3.1 && > PYTHONPATH=/home/cohen/garnumpyinstall/lib/python2.5/site-packages:/home/cohen/garnumpyinstall/lib/python2.5/site-packages/gtk-2.0 > /usr/bin/python \ > setup.py config_fc --fcompiler=gnu config > Traceback (most recent call last): > File "setup.py", line 90, in > setup_package() > File "setup.py", line 60, in setup_package > from numpy.distutils.core import setup > File > "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/__init__.py", > line 39, in > import core > File > "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/__init__.py", > line 8, in > import numerictypes as nt > File > "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/numerictypes.py", > line 83, in > from numpy.core.multiarray import typeinfo, ndarray, array, empty, dtype > ImportError: No module named multiarray > make[2]: *** [configure-custom] Error 1 > make[2]: Leaving directory > `/data1/sources/python/garnumpy-0.4/platform/numpy' > make[1]: *** [../../platform/numpy/cookies/main.d/install] Error 2 > 
make[1]: Leaving directory > `/data1/sources/python/garnumpy-0.4/platform/scipy' > make: *** [imgdep-main] Error 2 > This is really strange, I have never seen this error. This is what I would do: - remove garnumpyinstall directory - go into bootstrap/lapack, and do make install - go into platform/numpy, do make install At this point, you should have an installed numpy in garnumpyinstall: test it (import numpy; numpy.test(level = 9999, verbosity = 9999)). If this works, then go into platform/scipy. The errors should be clearer. > Any idea? Your package is anyway a great idea. I would actually love to > see it work, of course, but also allow for the possibility to point to > already exisiting source directories for numpy and scipy (for instance > from a previous svn checkout). I did not read the doc of your package so > it might actually be there.... > You cannot use svn, because this prevents patching from working, and > also there is a checksum when downloading: this is a major limitation > (but not so major if you realize that the only thing you really want to > use from svn are numpy and scipy; you can reuse blas/lapack/fftw/atlas, > and that's my main use when testing on different platforms). You can reuse downloaded archives, though: make garchive it will download all the tarballs (you cannot choose a subset, unfortunately) and put them into a directory which will be reused automatically afterwards. This means that even if you do make clean anywhere in the source tree, you won't need to download over and over the same things.
cheers, David From cohen at slac.stanford.edu Tue Nov 13 02:12:04 2007 From: cohen at slac.stanford.edu (Johann Cohen-Tanugi) Date: Mon, 12 Nov 2007 23:12:04 -0800 Subject: [SciPy-user] Bundling numpy/scipy with applications In-Reply-To: <47393FD8.7060400@ar.media.kyoto-u.ac.jp> References: <0CDABAC6-8EA6-409E-B389-578864A6407E@usgs.gov> <47392C92.8020208@ar.media.kyoto-u.ac.jp> <47393D17.9090000@slac.stanford.edu> <47393FD8.7060400@ar.media.kyoto-u.ac.jp> Message-ID: <47394E44.6010608@slac.stanford.edu> David Cournapeau wrote: > Johann Cohen-Tanugi wrote: > >> David Cournapeau wrote: >> >>> Michael Hearne wrote: >>> >>> >>>> I'm creating a Python application that is internal to my organization, >>>> but will be installed on both Linux and Mac OS X machines. The >>>> application depends heavily on a number of "non-pure" modules (those >>>> that have C/C++/FORTRAN components), like numpy, scipy, gdal, etc. >>>> >>>> What is the most "pythonic" way to bundle these types of modules >>>> inside my application? >>>> >>>> I've investigated dist_utils and setuptools, and I don't see an easy >>>> way with those tools to include build instructions for packages built >>>> with autconf tools. >>>> >>>> Is my only recourse to write a custom install script that calls the >>>> "configure;make;make install" from a shell? >>>> >>>> >>>> >>> You have two problems: building and packaging. For packaging, autoconf >>> will not help you much outside providing a coherence between packages, I >>> think; distutils/setuptools have packaging tools, but I don't know how >>> good they are. >>> >>> On Linux, the packaging system depends on the distribution: since it is >>> only internal, the problem is severely simplified, since you are likely >>> to target only one format (rpm, deb, etc...). 
On Mac OS X, you have also >>> tools to package .pkg and .mpkg (which are a set of .pkg), available >>> freely on apple dev website; I have not used them much, but they look >>> simple and easy to use for simple command line packages. >>> >>> You could hack something in distutils to call configure and so on, but I >>> don't think it will be pleasant. Building packages using >>> distutils/setuptools is easy, and do not need much configuration. So I >>> think it is more natural to use a top level tool calling >>> distutils/setuptools than to use distutils as the top level tool. As a >>> pythonic way, you may consider build tools, such as scons or waf, both >>> written in python. >>> >>> But frankly, I would not bother too much about pythonic way: since some >>> tools will be shell based anyway (autotools, packaging tools), and you >>> don't have to care about windows, using make and shell may actually be >>> easier. >>> >>> For the build part, you may take a look at my garnumpy package: it is >>> essentially a set of rules for Gnu Makefiles, and it can build numpy + >>> scipy, with a configurable set of dependencies (ATLAS, NETLIB >>> BLAS/LAPACK, fftw, etc....). It can build both distutils and >>> autotools-based packages. >>> >>> http://www.ar.media.kyoto-u.ac.jp/members/david/archives/garnumpy/garnumpy-0.4.tbz2 >>> >>> I used it sucessfully on linux and cygwin, so I would say it should work >>> on Mac OS X without too much trouble. The only thing which will be >>> likely to be a pain is fat binaries (Universal). I use it to build a >>> totally self contained numpy/scipy installation, which is a first step >>> toward packaging. 
If you think it can be useful to you, don't hesitate >>> to ask questions; there is also a bzr archive if you want to have access >>> to the dev history of the tool >>> >>> cheers, >>> >>> David >>> _______________________________________________ >>> SciPy-user mailing list >>> SciPy-user at scipy.org >>> http://projects.scipy.org/mailman/listinfo/scipy-user >>> >>> >> Hi David, >> I am having a hell of a time with lapack/atlas. >> > That's because they are far from trivial to build correctly. As always > with build problems, it is never complicated, but problems are often > difficult to track down. To make things worse, some linux distribution > (including fedora and suse) did include bogus packages for blas and > lapack. That's why I did this garnumpy thing in the first place: quickly > build blas/lapack correctly, so that I can test things more easily. > >> I already posted the >> issue I currently have with scipy after following the building doc in >> scipy web site (see >> http://projects.scipy.org/pipermail/scipy-user/2007-November/014506.html). >> Besides, octave build seems to also have serious issues with my >> lapack/atlas build. >> > Which distribution are you using ? 
> >> So I immediately downloaded youor package, but I seem to have a problem >> with the numpy download : >> make[3]: Leaving directory >> `/data1/sources/python/garnumpy-0.4/platform/numpy' >> # Change default path when looking for libs to fake dir, >> # so we can set everything by env variables >> cd work/main.d/numpy-1.0.3.1 && >> PYTHONPATH=/home/cohen/garnumpyinstall/lib/python2.5/site-packages:/home/cohen/garnumpyinstall/lib/python2.5/site-packages/gtk-2.0 >> /usr/bin/python \ >> setup.py config_fc --fcompiler=gnu config >> Traceback (most recent call last): >> File "setup.py", line 90, in >> setup_package() >> File "setup.py", line 60, in setup_package >> from numpy.distutils.core import setup >> File >> "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/__init__.py", >> line 39, in >> import core >> File >> "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/__init__.py", >> line 8, in >> import numerictypes as nt >> File >> "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/numerictypes.py", >> line 83, in >> from numpy.core.multiarray import typeinfo, ndarray, array, empty, dtype >> ImportError: No module named multiarray >> make[2]: *** [configure-custom] Error 1 >> make[2]: Leaving directory >> `/data1/sources/python/garnumpy-0.4/platform/numpy' >> make[1]: *** [../../platform/numpy/cookies/main.d/install] Error 2 >> make[1]: Leaving directory >> `/data1/sources/python/garnumpy-0.4/platform/scipy' >> make: *** [imgdep-main] Error 2 >> >> > This is really strange, I have never seen this error. This is what I > would do: > - remove garnumpyinstall directory > - go into bootstrap/lapack, and do make install > - go into platform/numpy, do make install > At this point, you should have an installed numpy in garnumpyinstall: > test it (import numpy; numpy.test(level = 9999, verbosity = 9999). If > this works, then go into platform/scipy. 
The errors should be clearer. > do make install in platform/numpy fails exactly in the same way : [cohen at localhost numpy]$ make install [===== NOW BUILDING: numpy-1.0.3.1 =====] [fetch] complete for numpy. [checksum] complete for numpy. [extract] complete for numpy. [patch] complete for numpy. # Change default path when looking for libs to fake dir, # so we can set everything by env variables cd work/main.d/numpy-1.0.3.1 && PYTHONPATH=/home/cohen/garnumpyinstall/lib/python2.5/site-packages:/home/cohen/garnumpyinstall/lib/python2.5/site-packages/gtk-2.0 /usr/bin/python \ setup.py config_fc --fcompiler=gnu config Traceback (most recent call last): File "setup.py", line 90, in setup_package() File "setup.py", line 60, in setup_package from numpy.distutils.core import setup File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/__init__.py", line 39, in import core File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/__init__.py", line 8, in import numerictypes as nt File "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/numerictypes.py", line 83, in from numpy.core.multiarray import typeinfo, ndarray, array, empty, dtype ImportError: No module named multiarray >> Any idea? Your package is anyway a great idea. I would actually love to >> see it work, of course, but also allow for the possibility to point to >> already exisiting source directories for numpy and scipy (for instance >> from a previous svn checkout). I did not read the doc of your package so >> it might actually be there.... >> > You cannot use svn, because this prevents patching from working, and > also there is a checksum when downloading: this is a major limitation > (but not so major if you realize that the only thing you really want to > use from svn are numpy and scipy; you can reuse blas/lapack/fftw/atlas, > and that's my main use when testing on different platforms). 
> > You can reuse downloaded archives, though: > > make garchive > > it will download all the tarballs (you cannot choose a subset, > unfortunately) and put them into a directory which will be reused > automatically afterwards. This means that even if you do make clean > anywhere in the source tree, you won't need to download over and over > the same things. > > cheers, > > David > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From david at ar.media.kyoto-u.ac.jp Tue Nov 13 02:35:40 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Tue, 13 Nov 2007 16:35:40 +0900 Subject: [SciPy-user] Bundling numpy/scipy with applications In-Reply-To: <47394E44.6010608@slac.stanford.edu> References: <0CDABAC6-8EA6-409E-B389-578864A6407E@usgs.gov> <47392C92.8020208@ar.media.kyoto-u.ac.jp> <47393D17.9090000@slac.stanford.edu> <47393FD8.7060400@ar.media.kyoto-u.ac.jp> <47394E44.6010608@slac.stanford.edu> Message-ID: <473953CC.7010402@ar.media.kyoto-u.ac.jp> Johann Cohen-Tanugi wrote: > > do make install in platform/numpy fails exactly in the same way : I forgot to tell you to do make clean before, otherwise, it indeed has no way to fail in a different way :) > [cohen at localhost numpy]$ make install > [===== NOW BUILDING: numpy-1.0.3.1 =====] > [fetch] complete for numpy. > [checksum] complete for numpy. > [extract] complete for numpy. > [patch] complete for numpy. 
> # Change default path when looking for libs to fake dir, > # so we can set everything by env variables > cd work/main.d/numpy-1.0.3.1 && > PYTHONPATH=/home/cohen/garnumpyinstall/lib/python2.5/site-packages:/home/cohen/garnumpyinstall/lib/python2.5/site-packages/gtk-2.0 > /usr/bin/python \ > setup.py config_fc --fcompiler=gnu config > Traceback (most recent call last): > File "setup.py", line 90, in <module> > setup_package() > File "setup.py", line 60, in setup_package > from numpy.distutils.core import setup > File > "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/__init__.py", > line 39, in <module> > import core > File > "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/__init__.py", > line 8, in <module> > import numerictypes as nt > File > "/data1/sources/python/garnumpy-0.4/platform/numpy/work/main.d/numpy-1.0.3.1/numpy/core/numerictypes.py", > line 83, in <module> > from numpy.core.multiarray import typeinfo, ndarray, array, empty, dtype > ImportError: No module named multiarray I really don't understand this error: you are configuring numpy for compilation, so it should not try to import numpy.core (which has no way to succeed since it is not built yet). IOW, it is a bootstrap error (numpy's build system tries to import numpy). If, after make clean, it still does not work, you may try to do:
make clean
make patch
and then go into work/main.d/numpy-1.0.3.1/, and just execute python setup.py config there, and give me the errors (you can also give them to me in private).
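The "bootstrap error" David names can be reproduced in miniature: any package sitting in the current working directory whose __init__ imports a not-yet-built extension will shadow an installed copy and fail exactly like the build log above. This sketch uses a throwaway package name (fakepkg) and a temp directory, not garnumpy or numpy themselves:

```python
# Sketch of the "bootstrap import" failure mode: running Python from
# inside a source tree makes the unbuilt package shadow any installed
# one. "fakepkg" and the temp layout are illustrative assumptions.
import os
import subprocess
import sys
import tempfile

tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "fakepkg")
os.mkdir(pkg)
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    # Mimics numpy/core/__init__.py importing its compiled extension.
    f.write("import multiarray\n")

# With the source tree as the working directory, the import finds the
# unbuilt package and dies with "No module named multiarray"-style error.
proc = subprocess.run(
    [sys.executable, "-c", "import fakepkg"],
    cwd=tmp, capture_output=True, text=True,
)
print(proc.returncode != 0)              # True
print("No module named" in proc.stderr)  # True
```

This is why configuring from inside work/main.d/numpy-1.0.3.1 can pick up the source-tree numpy rather than an installed one, and why a make clean of that tree is the first thing to try.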
David From haase at msg.ucsf.edu Tue Nov 13 07:44:59 2007 From: haase at msg.ucsf.edu (Sebastian Haase) Date: Tue, 13 Nov 2007 13:44:59 +0100 Subject: [SciPy-user] Bundling numpy/scipy with applications In-Reply-To: <0CDABAC6-8EA6-409E-B389-578864A6407E@usgs.gov> References: <0CDABAC6-8EA6-409E-B389-578864A6407E@usgs.gov> Message-ID: Hi, This might be a heretical idea (in this context): But how "bad" is it really, to just build all components, put them in one place, and distribute them as a zip/tar-archive ? PYTHONPATH would need to be adjusted as needed - either in a place like .bashrc or in a 3-line script that would then substitute the "python"-call. This is of course least fancy, but might be most effective - especially if it is only for a relatively small group of people. Cheers, Sebastian Haase On Nov 12, 2007 9:26 PM, Michael Hearne wrote: > I'm creating a Python application that is internal to my organization, but > will be installed on both Linux and Mac OS X machines. The application > depends heavily on a number of "non-pure" modules (those that have > C/C++/FORTRAN components), like numpy, scipy, gdal, etc. > > What is the most "pythonic" way to bundle these types of modules inside my > application? > > I've investigated dist_utils and setuptools, and I don't see an easy way > with those tools to include build instructions for packages built with > autoconf tools. > > Is my only recourse to write a custom install script that calls the > "configure;make;make install" from a shell? > > --Mike > > > > ------------------------------------------------------ > Michael Hearne > mhearne at usgs.gov > (303) 273-8620 > USGS National Earthquake Information Center > 1711 Illinois St. Golden CO 80401 > Senior Software Engineer > Synergetics, Inc.
> ------------------------------------------------------ > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > From sandricionut at yahoo.com Tue Nov 13 08:17:27 2007 From: sandricionut at yahoo.com (sandric ionut) Date: Tue, 13 Nov 2007 05:17:27 -0800 (PST) Subject: [SciPy-user] random gaussian array Message-ID: <201537.55063.qm@web51307.mail.re2.yahoo.com> Thank you Matthieu: I looked at numpy.random, but all I have is normalvariate. Can you give me an example? Ionut ----- Original Message ---- From: Matthieu Brucher To: SciPy Users List Sent: Monday, November 12, 2007 4:11:08 PM Subject: Re: [SciPy-user] random gaussian array Yes, look into numpy.random -> norm Matthieu 2007/11/12, sandric ionut : Hi: I want to create an random Gaussian 2D array with an exact size. For example I want my array to look like: 2 3 4 0 1 1 2 3 3 4 1 1 3 2 5 6 7 8 all the values should be randomly generated from a Gaussian distribution with mean and standard deviation Is it possible to do it with scipy or numpy? Thank you Ionut __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com _______________________________________________ SciPy-user mailing list SciPy-user at scipy.org http://projects.scipy.org/mailman/listinfo/scipy-user -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher ____________________________________________________________________________________ Get easy, one-click access to your favorites. Make Yahoo! your homepage. http://www.yahoo.com/r/hs -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From sandricionut at yahoo.com Tue Nov 13 08:23:29 2007 From: sandricionut at yahoo.com (sandric ionut) Date: Tue, 13 Nov 2007 05:23:29 -0800 (PST) Subject: [SciPy-user] random gaussian array SOLVED Message-ID: <757507.90289.qm@web51301.mail.re2.yahoo.com> I have found it. randomArray = numpy.random.normal(mean, stdev, (shape)) It works OK Thank's again ----- Original Message ---- From: Matthieu Brucher To: SciPy Users List Sent: Monday, November 12, 2007 4:11:08 PM Subject: Re: [SciPy-user] random gaussian array Yes, look into numpy.random -> norm Matthieu 2007/11/12, sandric ionut : Hi: I want to create an random Gaussian 2D array with an exact size. For example I want my array to look like: 2 3 4 0 1 1 2 3 3 4 1 1 3 2 5 6 7 8 all the values should be randomly generated from a Gaussian distribution with mean and standard deviation Is it possible to do it with scipy or numpy? Thank you Ionut __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com _______________________________________________ SciPy-user mailing list SciPy-user at scipy.org http://projects.scipy.org/mailman/listinfo/scipy-user -- French PhD student Website : http://miles.developpez.com/ Blogs : http://matt.eifelle.com and http://blog.developpez.com/?blog=92 LinkedIn : http://www.linkedin.com/in/matthieubrucher ____________________________________________________________________________________ Be a better pen pal. Text or chat with friends inside Yahoo! Mail. See how. http://overview.mail.yahoo.com/ -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From david at ar.media.kyoto-u.ac.jp Tue Nov 13 08:13:24 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Tue, 13 Nov 2007 22:13:24 +0900 Subject: [SciPy-user] random gaussian array In-Reply-To: <201537.55063.qm@web51307.mail.re2.yahoo.com> References: <201537.55063.qm@web51307.mail.re2.yahoo.com> Message-ID: <4739A2F4.4060709@ar.media.kyoto-u.ac.jp> sandric ionut wrote: > Thank you Matthieu: > > I looked at numpy.random, but all I have is normalvariate. Can you > give me an example? This is strange. Does: >> from numpy.random import normal fail? If you use ipython, you can use: >> help numpy.random to get a list of all the functions exported by numpy.random. Otherwise, with a standard python shell, you can use >> print dir(numpy.random) In both cases, you should see normal as a function. cheers, David From stefan at sun.ac.za Tue Nov 13 05:04:08 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Tue, 13 Nov 2007 12:04:08 +0200 Subject: [SciPy-user] KDE question In-Reply-To: <473566E4.2060101@gmail.com> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> <47356362.7010508@ar.media.kyoto-u.ac.jp> <473566E4.2060101@gmail.com> Message-ID: <20071113100408.GA6802@mentat.za.net> On Sat, Nov 10, 2007 at 02:08:04AM -0600, Robert Kern wrote: > David Cournapeau wrote: > > > I am not sure I understand exactly the problem, but if the problem is to > > find a contour level of a Gaussian density, it has a closed form for any > > dimension. > > No, the problem is to find the appropriate contour level of a kernel density > estimate (with Gaussian kernels in this case). Essentially, a mixture of many > Gaussians, not a single one.
Sounds like the kind of problem that can be solved using marching squares: http://www.polytech.unice.fr/~lingrand/MarchingCubes/algo.html Regards Stéfan From aisaac at american.edu Tue Nov 13 08:35:26 2007 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 13 Nov 2007 08:35:26 -0500 Subject: [SciPy-user] random gaussian array In-Reply-To: <201537.55063.qm@web51307.mail.re2.yahoo.com> References: <201537.55063.qm@web51307.mail.re2.yahoo.com> Message-ID: On Tue, 13 Nov 2007, sandric ionut apparently wrote: > I looked at numpy.random, but all I have is normalvariate But that matches your question? If you use ``normal``, just set the size (i.e., shape). Cheers, Alan Isaac >>> help(N.random) Help on package numpy.random in numpy: ... multivariate_normal(...) Return an array containing multivariate normally distributed random numbers with specified mean and covariance. multivariate_normal(mean, cov) -> random values multivariate_normal(mean, cov, [m, n, ...]) -> random values mean must be a 1 dimensional array. cov must be a square two dimensional array with the same number of rows and columns as mean has elements. The first form returns a single 1-D array containing a multivariate normal. The second form returns an array of shape (m, n, ..., cov.shape[0]). In this case, output[i,j,...,:] is a 1-D array containing a multivariate normal. normal(...) Normal distribution (mean=loc, stdev=scale).
normal(loc=0.0, scale=1.0, size=None) -> random values From jeremit0 at gmail.com Tue Nov 13 09:20:46 2007 From: jeremit0 at gmail.com (Jeremy Conlin) Date: Tue, 13 Nov 2007 09:20:46 -0500 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <6ce0ac130711122059x610214bey5698dde00ee9cd8c@mail.gmail.com> References: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> <6ce0ac130711121039y47445aabu22acaa50ff402f83@mail.gmail.com> <3db594f70711121125p1cea3dfaqd11133fd5f24e01a@mail.gmail.com> <4D4B58BA-70E1-46E1-A955-7C516BB4BB79@arachnedesign.net> <3db594f70711121351w4a803df1u58fadcb2488efdbd@mail.gmail.com> <6ce0ac130711122059x610214bey5698dde00ee9cd8c@mail.gmail.com> Message-ID: <3db594f70711130620h5c6914dcq4750ef9111d49a6b@mail.gmail.com> On Nov 12, 2007 11:59 PM, Brian Granger wrote: > Great, now you are ready for the other problem on Leopard: > > sudo does not preserve the PYTHONPATH environment variable. > > Thus anytime you do:: > > sudo python setup.py install > > to install something else, your version of numpy won't be used! This > has been reported to apple and the problem can be fixed by adding > PYTHONPATH to the appropriate section in /etc/sudoers. > > Another option (see the recent thread on pythonmac-sig) is to create a > .pth file that dynamically reorders the sys.path. This approach is > nice as it doesn't require setting PYTHONPATH. > > You will run into this when installing matplotlib and pytables...if > you go that way. > I am heading that way, at least I am installing matplotlib. Now I run into other issues not yet related to the PYTHONPATH problem when issuing the command $ python setup.py build Everything seems to proceed normally until I get the following error: ...
src/_image.cpp:5:17: src/_image.cpp:5:17: error: png.h: No such file or directory error: png.h: No such file or directory src/_image.cpp: In member function 'Py::Object Image::write_png(const Py::Tuple&)': src/_image.cpp:646: error: 'png_structp' was not declared in this scope src/_image.cpp:646: error: expected `;' before 'png_ptr' ... After that there are many compile errors. At the beginning of the building there was a message libpng: found, but unknown version (no pkg-config) * Could not find 'libpng' headers in any of * '/usr/local/include', '/usr/include', '.' So it seems like it doesn't know where to find the header files for libpng. How can I tell setup where to go looking for the header files? (I also got a similar message for freetype2.) Thanks, Jeremy From ellisonbg.net at gmail.com Tue Nov 13 09:56:12 2007 From: ellisonbg.net at gmail.com (Brian Granger) Date: Tue, 13 Nov 2007 07:56:12 -0700 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <3db594f70711130620h5c6914dcq4750ef9111d49a6b@mail.gmail.com> References: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> <6ce0ac130711121039y47445aabu22acaa50ff402f83@mail.gmail.com> <3db594f70711121125p1cea3dfaqd11133fd5f24e01a@mail.gmail.com> <4D4B58BA-70E1-46E1-A955-7C516BB4BB79@arachnedesign.net> <3db594f70711121351w4a803df1u58fadcb2488efdbd@mail.gmail.com> <6ce0ac130711122059x610214bey5698dde00ee9cd8c@mail.gmail.com> <3db594f70711130620h5c6914dcq4750ef9111d49a6b@mail.gmail.com> Message-ID: <6ce0ac130711130656u5ce11908mec651a17b9c53ed9@mail.gmail.com> How did you install libpng and freetype? What --prefix did you use? Brian On Nov 13, 2007 7:20 AM, Jeremy Conlin wrote: > On Nov 12, 2007 11:59 PM, Brian Granger wrote: > > Great, now you are ready for the other problem on Leopard: > > > > sudo does not preserve the PYTHONPATH environment variable.
> > > > Thus anytime you do:: > > > > sudo python setup.py install > > > > to install something else, your version of numpy won't be used! This > > has been reported to apple and the problem can be fixed by adding > > PYTHONPATH to the appropriate section in /etc/sudoers. > > > > Another option (see the recent thread on pythonmax-sig) is to create a > > .pth file that dynamically reorders the sys.path. This approach is > > nice as it doesn't require setting PYTHONPATH. > > > > You will run into this when installing matplotlib and pytables...if > > you go that way. > > > > I am heading that way, at least I am installing matplotlib. Now I run > into other issues not yet related to the PYTHONPATH problem. when > issuing the command > > > $ python setup.py build > > Everything seems to proceed normally until I get the following error: > ? > src/_image.cpp:5:17: src/_image.cpp:5:17: error: png.h: No such file > or directory > error: png.h: No such file or directory > src/_image.cpp: In member function 'Py::Object Image::write_png(const > Py::Tuple&)': > src/_image.cpp:646: error: 'png_structp' was not declared in this scope > src/_image.cpp:646: error: expected `;' before 'png_ptr' > ? > > After that there are many compile errors. At the beginning of the > building there was a message > > libpng: found, but unknown version (no pkg-config) > * Could not find 'libpng' headers in any of > * '/usr/local/include', '/usr/include', '.' > > So it seems like it doesn't know where to find the header files for > libpng. How can I tell setup where to go looking for the header > files? (I also got a similar message for freetype2.) 
> > Thanks, > > Jeremy > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From jeremit0 at gmail.com Tue Nov 13 11:33:27 2007 From: jeremit0 at gmail.com (Jeremy Conlin) Date: Tue, 13 Nov 2007 11:33:27 -0500 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <6ce0ac130711130656u5ce11908mec651a17b9c53ed9@mail.gmail.com> References: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> <6ce0ac130711121039y47445aabu22acaa50ff402f83@mail.gmail.com> <3db594f70711121125p1cea3dfaqd11133fd5f24e01a@mail.gmail.com> <4D4B58BA-70E1-46E1-A955-7C516BB4BB79@arachnedesign.net> <3db594f70711121351w4a803df1u58fadcb2488efdbd@mail.gmail.com> <6ce0ac130711122059x610214bey5698dde00ee9cd8c@mail.gmail.com> <3db594f70711130620h5c6914dcq4750ef9111d49a6b@mail.gmail.com> <6ce0ac130711130656u5ce11908mec651a17b9c53ed9@mail.gmail.com> Message-ID: <3db594f70711130833j7e687137g35099eeec5b701b@mail.gmail.com> I didn't install them. I was guessing maybe they came with Leopard. Using Spotlight I have found header files for libpng and freetype here: /Developer/SDKs/MacOSX10.5.sdk/usr/X11/include Can I use these header files? Thanks, Jeremy On Nov 13, 2007 9:56 AM, Brian Granger wrote: > How did you install libpng and freetype? What ---prefix did you use? > > Brian > > > On Nov 13, 2007 7:20 AM, Jeremy Conlin wrote: > > On Nov 12, 2007 11:59 PM, Brian Granger wrote: > > > Great, now you are ready for the other problem on Leopard: > > > > > > sudo does not preserve the PYTHONPATH environment variable. > > > > > > Thus anytime you do:: > > > > > > sudo python setup.py install > > > > > > to install something else, your version of numpy won't be used! This > > > has been reported to apple and the problem can be fixed by adding > > > PYTHONPATH to the appropriate section in /etc/sudoers. 
> > > > > > Another option (see the recent thread on pythonmax-sig) is to create a > > > .pth file that dynamically reorders the sys.path. This approach is > > > nice as it doesn't require setting PYTHONPATH. > > > > > > You will run into this when installing matplotlib and pytables...if > > > you go that way. > > > > > > > I am heading that way, at least I am installing matplotlib. Now I run > > into other issues not yet related to the PYTHONPATH problem. when > > issuing the command > > > > > > $ python setup.py build > > > > Everything seems to proceed normally until I get the following error: > > ? > > src/_image.cpp:5:17: src/_image.cpp:5:17: error: png.h: No such file > > or directory > > error: png.h: No such file or directory > > src/_image.cpp: In member function 'Py::Object Image::write_png(const > > Py::Tuple&)': > > src/_image.cpp:646: error: 'png_structp' was not declared in this scope > > src/_image.cpp:646: error: expected `;' before 'png_ptr' > > ? > > > > After that there are many compile errors. At the beginning of the > > building there was a message > > > > libpng: found, but unknown version (no pkg-config) > > * Could not find 'libpng' headers in any of > > * '/usr/local/include', '/usr/include', '.' > > > > So it seems like it doesn't know where to find the header files for > > libpng. How can I tell setup where to go looking for the header > > files? (I also got a similar message for freetype2.) 
> > > > Thanks, > > > > Jeremy > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From ellisonbg.net at gmail.com Tue Nov 13 11:37:41 2007 From: ellisonbg.net at gmail.com (Brian Granger) Date: Tue, 13 Nov 2007 09:37:41 -0700 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <3db594f70711130833j7e687137g35099eeec5b701b@mail.gmail.com> References: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> <6ce0ac130711121039y47445aabu22acaa50ff402f83@mail.gmail.com> <3db594f70711121125p1cea3dfaqd11133fd5f24e01a@mail.gmail.com> <4D4B58BA-70E1-46E1-A955-7C516BB4BB79@arachnedesign.net> <3db594f70711121351w4a803df1u58fadcb2488efdbd@mail.gmail.com> <6ce0ac130711122059x610214bey5698dde00ee9cd8c@mail.gmail.com> <3db594f70711130620h5c6914dcq4750ef9111d49a6b@mail.gmail.com> <6ce0ac130711130656u5ce11908mec651a17b9c53ed9@mail.gmail.com> <3db594f70711130833j7e687137g35099eeec5b701b@mail.gmail.com> Message-ID: <6ce0ac130711130837p4fdca00cq6fce50c163289d9d@mail.gmail.com> I don't think so - at least, I have never had luck using these versions. I would ask on the matplotlib list about this issue. If they are really there, it would be really nice if the matplotlib build system could find and use the Apple versions. In the meantime, if you simply download libpng and freetype2 and then do: ./configure --prefix=/usr/local make sudo make install In the source code directories of both, they will be installed in a way that matplotlib can find. From there, the TKAgg and WxAgg matplotlib backends should run fine. Brian On Nov 13, 2007 9:33 AM, Jeremy Conlin wrote: > I didn't install them. I was guessing maybe they came with Leopard. 
> Using Spotlight I have found header files for libpng and freetype > here: > > /Developer/SDKs/MacOSX10.5.sdk/usr/X11/include > > Can I use these header files? > > Thanks, > Jeremy > > > On Nov 13, 2007 9:56 AM, Brian Granger wrote: > > How did you install libpng and freetype? What ---prefix did you use? > > > > Brian > > > > > > On Nov 13, 2007 7:20 AM, Jeremy Conlin wrote: > > > On Nov 12, 2007 11:59 PM, Brian Granger wrote: > > > > Great, now you are ready for the other problem on Leopard: > > > > > > > > sudo does not preserve the PYTHONPATH environment variable. > > > > > > > > Thus anytime you do:: > > > > > > > > sudo python setup.py install > > > > > > > > to install something else, your version of numpy won't be used! This > > > > has been reported to apple and the problem can be fixed by adding > > > > PYTHONPATH to the appropriate section in /etc/sudoers. > > > > > > > > Another option (see the recent thread on pythonmax-sig) is to create a > > > > .pth file that dynamically reorders the sys.path. This approach is > > > > nice as it doesn't require setting PYTHONPATH. > > > > > > > > You will run into this when installing matplotlib and pytables...if > > > > you go that way. > > > > > > > > > > I am heading that way, at least I am installing matplotlib. Now I run > > > into other issues not yet related to the PYTHONPATH problem. when > > > issuing the command > > > > > > > > > $ python setup.py build > > > > > > Everything seems to proceed normally until I get the following error: > > > ? > > > src/_image.cpp:5:17: src/_image.cpp:5:17: error: png.h: No such file > > > or directory > > > error: png.h: No such file or directory > > > src/_image.cpp: In member function 'Py::Object Image::write_png(const > > > Py::Tuple&)': > > > src/_image.cpp:646: error: 'png_structp' was not declared in this scope > > > src/_image.cpp:646: error: expected `;' before 'png_ptr' > > > ? > > > > > > After that there are many compile errors. 
At the beginning of the > > > building there was a message > > > > > > libpng: found, but unknown version (no pkg-config) > > > * Could not find 'libpng' headers in any of > > > * '/usr/local/include', '/usr/include', '.' > > > > > > So it seems like it doesn't know where to find the header files for > > > libpng. How can I tell setup where to go looking for the header > > > files? (I also got a similar message for freetype2.) > > > > > > Thanks, > > > > > > Jeremy > > > > > _______________________________________________ > > > SciPy-user mailing list > > > SciPy-user at scipy.org > > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From robert.kern at gmail.com Tue Nov 13 12:57:20 2007 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 13 Nov 2007 11:57:20 -0600 Subject: [SciPy-user] KDE question In-Reply-To: <20071113100408.GA6802@mentat.za.net> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> <47356362.7010508@ar.media.kyoto-u.ac.jp> <473566E4.2060101@gmail.com> <20071113100408.GA6802@mentat.za.net> Message-ID: <4739E580.9050208@gmail.com> Stefan van der Walt wrote: > On Sat, Nov 10, 2007 at 02:08:04AM -0600, Robert Kern wrote: >> David Cournapeau wrote: >> >>> I am not sure I understand exactly the problem, but if the problem is to >>> find a contour level of a Gaussian density, it has a closed form for any >>> dimension. >> No, the problem is to find the appropriate contour level of a kernel density >> estimate (with Gaussian kernels in this case). Essentially, a mixture of many >> Gaussians, not a single one. 
> > Sounds like the kind of problem that can be solved using marching > squares: > > http://www.polytech.unice.fr/~lingrand/MarchingCubes/algo.html This solves the already-matplotlib-solved problem of drawing the contours given a level. That still leaves finding the correct level. Or am I underestimating the potential to reformulate marching squares to solve the integration problem, too? -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From gnchen at gmail.com Tue Nov 13 13:52:09 2007 From: gnchen at gmail.com (Gen-Nan Chen) Date: Tue, 13 Nov 2007 10:52:09 -0800 Subject: [SciPy-user] fortran compiler not found Mac Leopard 10.5 In-Reply-To: <6ce0ac130711130837p4fdca00cq6fce50c163289d9d@mail.gmail.com> References: <3db594f70711120906n11ddd08fkd1fca36b12baa611@mail.gmail.com> <6ce0ac130711121039y47445aabu22acaa50ff402f83@mail.gmail.com> <3db594f70711121125p1cea3dfaqd11133fd5f24e01a@mail.gmail.com> <4D4B58BA-70E1-46E1-A955-7C516BB4BB79@arachnedesign.net> <3db594f70711121351w4a803df1u58fadcb2488efdbd@mail.gmail.com> <6ce0ac130711122059x610214bey5698dde00ee9cd8c@mail.gmail.com> <3db594f70711130620h5c6914dcq4750ef9111d49a6b@mail.gmail.com> <6ce0ac130711130656u5ce11908mec651a17b9c53ed9@mail.gmail.com> <3db594f70711130833j7e687137g35099eeec5b701b@mail.gmail.com> <6ce0ac130711130837p4fdca00cq6fce50c163289d9d@mail.gmail.com> Message-ID: <588944e60711131052l68db1a40i90a0aceb39d3559b@mail.gmail.com> I just used i-installer to install everything in /usr/local/. Gen On Nov 13, 2007 8:37 AM, Brian Granger wrote: > I don't think so - at least, I have never had luck using these > versions. I would ask on the matplotlib list about this issue. If > they are really there, it would be really nice if the matplotlib build > system could find and use the Apple versions. 
> > In the meantime, if you simply download libpng and freetype2 and then do: > > ./configure --prefix=/usr/local > make > sudo make install > > In the source code directories of both, they will be installed in a > way that matplotlib can find. From there, the TKAgg and WxAgg > matplotlib backends should run fine. > > Brian > > On Nov 13, 2007 9:33 AM, Jeremy Conlin wrote: > > I didn't install them. I was guessing maybe they came with Leopard. > > Using Spotlight I have found header files for libpng and freetype > > here: > > > > /Developer/SDKs/MacOSX10.5.sdk/usr/X11/include > > > > Can I use these header files? > > > > Thanks, > > Jeremy > > > > > > On Nov 13, 2007 9:56 AM, Brian Granger wrote: > > > How did you install libpng and freetype? What ---prefix did you use? > > > > > > Brian > > > > > > > > > On Nov 13, 2007 7:20 AM, Jeremy Conlin wrote: > > > > On Nov 12, 2007 11:59 PM, Brian Granger > wrote: > > > > > Great, now you are ready for the other problem on Leopard: > > > > > > > > > > sudo does not preserve the PYTHONPATH environment variable. > > > > > > > > > > Thus anytime you do:: > > > > > > > > > > sudo python setup.py install > > > > > > > > > > to install something else, your version of numpy won't be used! > This > > > > > has been reported to apple and the problem can be fixed by adding > > > > > PYTHONPATH to the appropriate section in /etc/sudoers. > > > > > > > > > > Another option (see the recent thread on pythonmax-sig) is to > create a > > > > > .pth file that dynamically reorders the sys.path. This approach > is > > > > > nice as it doesn't require setting PYTHONPATH. > > > > > > > > > > You will run into this when installing matplotlib and > pytables...if > > > > > you go that way. > > > > > > > > > > > > > I am heading that way, at least I am installing matplotlib. Now I > run > > > > into other issues not yet related to the PYTHONPATH problem. 
when > > > > issuing the command > > > > > > > > > > > > $ python setup.py build > > > > > > > > Everything seems to proceed normally until I get the following > error: > > > > ? > > > > src/_image.cpp:5:17: src/_image.cpp:5:17: error: png.h: No such file > > > > or directory > > > > error: png.h: No such file or directory > > > > src/_image.cpp: In member function 'Py::Object > Image::write_png(const > > > > Py::Tuple&)': > > > > src/_image.cpp:646: error: 'png_structp' was not declared in this > scope > > > > src/_image.cpp:646: error: expected `;' before 'png_ptr' > > > > ? > > > > > > > > After that there are many compile errors. At the beginning of the > > > > building there was a message > > > > > > > > libpng: found, but unknown version (no pkg-config) > > > > * Could not find 'libpng' headers in any of > > > > * '/usr/local/include', '/usr/include', '.' > > > > > > > > So it seems like it doesn't know where to find the header files for > > > > libpng. How can I tell setup where to go looking for the header > > > > files? (I also got a similar message for freetype2.) > > > > > > > > Thanks, > > > > > > > > Jeremy > > > > > > > _______________________________________________ > > > > SciPy-user mailing list > > > > SciPy-user at scipy.org > > > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > > > > _______________________________________________ > > > SciPy-user mailing list > > > SciPy-user at scipy.org > > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -- Gen-Nan Chen, Ph. D. email: gnchen at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ivilata at carabos.com Wed Nov 14 12:08:00 2007 From: ivilata at carabos.com (Ivan Vilata i Balaguer) Date: Wed, 14 Nov 2007 18:08:00 +0100 Subject: [SciPy-user] [ANN] Release of the first PyTables video Message-ID: <20071114170800.GD30891@tardis.terramar.selidor.net> ===================================== Release of the first PyTables video ===================================== `Carabos `_ is very proud to announce the first of a series of videos dedicated to introducing the main features of PyTables to the public in a visual and easy to grasp manner. http://www.carabos.com/videos/pytables-1-intro `PyTables `_ is a Free/Open Source package designed to handle massive amounts of data in a simple, but highly efficient way, using the HDF5 file format and NumPy data containers. This first video is an introductory overview of PyTables, covering the following topics: * HDF5 file creation * the object tree * homogeneous array storage * natural naming * working with attributes With a running length of little more than 10 minutes, you may sit back and watch it during any short break. More videos about PyTables will be published in the near future. Stay tuned on www.pytables.org for the announcement of the new videos. We would like to hear your opinion on the video so we can do it better the next time. We are also open to suggestions for the topics of future videos. Best regards, :: Ivan Vilata i Balaguer >qo< http://www.carabos.com/ Cárabos Coop. V. V V Enjoy Data "" -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 307 bytes Desc: Digital signature URL: From hugues.salamin at gmail.com Wed Nov 14 12:13:32 2007 From: hugues.salamin at gmail.com (Hugues Salamin) Date: Wed, 14 Nov 2007 18:13:32 +0100 Subject: [SciPy-user] coercion error when using stats.distributions.entropy Message-ID: Hi, I came across a strange error when computing the entropy of a discrete random variable.
Here is a small script showing this problem (python 2.4.4, scipy 0.5.2 and numpy 1.0.1): [code] #!/usr/bin/env python2.4 from scipy import stats val = ( (1,2,3),(0.5,0.25,0.25)) dist = stats.distributions.rv_discrete(name="temp", values = val) dist.pmf(2) print dist.entropy() [/code] I get the following exception: Traceback (most recent call last): File "./error.py", line 8, in ? print dist.entropy() File "/usr/lib/python2.4/site-packages/scipy/stats/distributions.py", line 3787, in entropy place(output,cond0,self.vecentropy(*goodargs)) File "/usr/lib/python2.4/site-packages/numpy/lib/function_base.py", line 856, in __call__ _res = array(self.ufunc(*args),copy=False).astype(self.otypes[0]) TypeError: function not supported for these types, and can't coerce safely to supported types If I remove the call to pmf, the error disappears. I tried to look under the hood at what was happening but I'm totally at a loss. Any help appreciated Hugues From cohen at slac.stanford.edu Wed Nov 14 15:00:35 2007 From: cohen at slac.stanford.edu (Johann Cohen-Tanugi) Date: Wed, 14 Nov 2007 12:00:35 -0800 Subject: [SciPy-user] coercion error when using stats.distributions.entropy In-Reply-To: References: Message-ID: <473B53E3.2020202@slac.stanford.edu> Hugues Salamin wrote: > Hi, > > I came across a strange error when computing the entropy of a discrete > random variable. Here is a small script > showing this problem (python 2.4.4, scipy 0.5.2 and numpy 1.0.1): > > [code] > #!/usr/bin/env python2.4 > > from scipy import stats > > val = ( (1,2,3),(0.5,0.25,0.25)) > dist = stats.distributions.rv_discrete(name="temp", values = val) > dist.pmf(2) > print dist.entropy() > [/code] > > I get the following exception: > > Traceback (most recent call last): > File "./error.py", line 8, in ?
> print dist.entropy() > File "/usr/lib/python2.4/site-packages/scipy/stats/distributions.py", > line 3787, in entropy > place(output,cond0,self.vecentropy(*goodargs)) > File "/usr/lib/python2.4/site-packages/numpy/lib/function_base.py", > line 856, in __call__ > _res = array(self.ufunc(*args),copy=False).astype(self.otypes[0]) > TypeError: function not supported for these types, and can't coerce > safely to supported types > > > If I remove the call to pmf, the error disappears. I tried to look > under the hood at what was happening but I'm totally at a loss. > > Any help appreciated > > Hugues > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > Your code snippet works for me with python 2.5, and numpy/scipy 1.0.3.1/0.6.0 Johann From sebastien.maret at gmail.com Wed Nov 14 14:59:34 2007 From: sebastien.maret at gmail.com (Sebastien Maret) Date: Wed, 14 Nov 2007 14:59:34 -0500 Subject: [SciPy-user] Scipy-0.6.0 fails to build on MacOS 10.4.10 with gfortran (gcc4.2.2) Message-ID: Hello, The error log is below:
_PyExc_ImportError _PyExc_RuntimeError _PyImport_ImportModule _PyInt_Type _PyModule_GetDict _PyNumber_Int _PyObject_GetAttrString _PySequence_Check _PySequence_GetItem _PyString_FromString _PyString_Type _PyType_IsSubtype _PyType_Type _Py_BuildValue _Py_InitModule4 __Py_NoneStruct _PyCObject_FromVoidPtr _PyDict_DelItemString _PyDict_GetItemString _PyDict_New _PyExc_AttributeError _PyExc_TypeError _PyExc_ValueError _PyMem_Free _PyObject_Str _PyObject_Type _PyString_AsString _PyString_ConcatAndDel _Py_FindMethod __PyObject_New _MAIN__ collect2: ld returned 1 exit status /usr/bin/ld: Undefined symbols: _PyArg_ParseTupleAndKeywords _PyCObject_AsVoidPtr _PyCObject_Type _PyComplex_Type _PyDict_SetItemString _PyErr_Clear _PyErr_Format _PyErr_NewException _PyErr_Occurred _PyErr_Print _PyErr_SetString _PyExc_ImportError _PyExc_RuntimeError _PyImport_ImportModule _PyInt_Type _PyModule_GetDict _PyNumber_Int _PyObject_GetAttrString _PySequence_Check _PySequence_GetItem _PyString_FromString _PyString_Type _PyType_IsSubtype _PyType_Type _Py_BuildValue _Py_InitModule4 __Py_NoneStruct _PyCObject_FromVoidPtr _PyDict_DelItemString _PyDict_GetItemString _PyDict_New _PyExc_AttributeError _PyExc_TypeError _PyExc_ValueError _PyMem_Free _PyObject_Str _PyObject_Type _PyString_AsString _PyString_ConcatAndDel _Py_FindMethod __PyObject_New _MAIN__ collect2: ld returned 1 exit status error: Command "/sw/bin/gfortran -Wall -L/sw/lib build/temp.macosx-10.4-i386-2.5/build/src.macosx-10.4-i386-2.5/scipy/fftpack/_fftpackmodule.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/drfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zrfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zfftnd.o build/temp.macosx-10.4-i386-2.5/build/src.macosx-10.4-i386-2.5/fortranobject.o /sw/lib/djbfft.a -L/sw/lib -L/sw/lib/gcc4.2/lib/gcc/i686-apple-darwin8/4.2.2 -Lbuild/temp.macosx-10.4-i386-2.5 -ldfftpack -lfftw3 -lgfortran -o 
build/lib.macosx-10.4-i386-2.5/scipy/fftpack/_fftpack.so" failed with exit status 1 Thanks in advance for the help. Sébastien From robert.kern at gmail.com Wed Nov 14 16:29:21 2007 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 14 Nov 2007 15:29:21 -0600 Subject: [SciPy-user] Scipy-0.6.0 fails to build on MacOS 10.4.10 with gfortran (gcc4.2.2) In-Reply-To: References: Message-ID: <473B68B1.4060007@gmail.com> Sebastien Maret wrote: > Hello, > > The error log is below: > > /sw/bin/gfortran -Wall -L/sw/lib build/temp.macosx-10.4-i386-2.5/build/src.macosx-10.4-i386-2.5/scipy/fftpack/_fftpackmodule.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/drfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zrfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zfftnd.o build/temp.macosx-10.4-i386-2.5/build/src.macosx-10.4-i386-2.5/fortranobject.o /sw/lib/djbfft.a -L/sw/lib -L/sw/lib/gcc4.2/lib/gcc/i686-apple-darwin8/4.2.2 -Lbuild/temp.macosx-10.4-i386-2.5 -ldfftpack -lfftw3 -lgfortran -o build/lib.macosx-10.4-i386-2.5/scipy/fftpack/_fftpack.so Don't use the LDFLAGS environment variable. This overwrites the link options, including the ones which provide the libpython2.5 library. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From barrywark at gmail.com Wed Nov 14 20:26:33 2007 From: barrywark at gmail.com (Barry Wark) Date: Wed, 14 Nov 2007 17:26:33 -0800 Subject: [SciPy-user] addition to scikits Message-ID: I'm sorry to spam everyone's inbox with such a simple question: how do I add a package to the scikits SVN?
I have a possible addition to the scikits collection (a wrapper for the Approximate Nearest Neighbor library, a kdtree implementation that provides nearest neighbor and approximate nearest neighbor searching), which I've previously announced on the list. I've re-worked the installation and tests for the wrapper to conform to the scikits format. Hopefully it will be useful to others and the scikits collection seems an appropriate place to put it. barry From robert.kern at gmail.com Thu Nov 15 02:33:38 2007 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 15 Nov 2007 01:33:38 -0600 Subject: [SciPy-user] addition to scikits In-Reply-To: References: Message-ID: <473BF652.3080103@gmail.com> Barry Wark wrote: > I'm sorry to spam everyone's inbox with such a simple question: how do > I add a package to the scikits SVN? I have asked our sysadmin to give you the appropriate privileges. He should email you about passwords and such shortly. > I have a possible addition to the scikits collection (a wrapper for > the Approximate Nearest Neighbor library, a kdtree implementation that > provides nearest neighbor and approximate nearest neighbor searching), > which I've previously announced on the list. I've re-worked the > installation and tests for the wrapper to conform to the scikits > format. Hopefully it will be useful to others and the scikits > collection seems an appropriate place to put it. Excellent. I look forward to it. Thanks. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From david at ar.media.kyoto-u.ac.jp Thu Nov 15 02:36:50 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Thu, 15 Nov 2007 16:36:50 +0900 Subject: [SciPy-user] Scipy-0.6.0 fails to build on MacOS 10.4.10 with gfortran (gcc4.2.2) In-Reply-To: <473B68B1.4060007@gmail.com> References: <473B68B1.4060007@gmail.com> Message-ID: <473BF712.4070704@ar.media.kyoto-u.ac.jp> Robert Kern wrote: > Sebastien Maret wrote: >> Hello, >> >> The error log is below: >> >> /sw/bin/gfortran -Wall -L/sw/lib build/temp.macosx-10.4-i386-2.5/build/src.macosx-10.4-i386-2.5/scipy/fftpack/_fftpackmodule.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/drfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zrfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zfftnd.o build/temp.macosx-10.4-i386-2.5/build/src.macosx-10.4-i386-2.5/fortranobject.o /sw/lib/djbfft.a -L/sw/lib -L/sw/lib/gcc4.2/lib/gcc/i686-apple-darwin8/4.2.2 -Lbuild/temp.macosx-10.4-i386-2.5 -ldfftpack -lfftw3 -lgfortran -o build/lib.macosx-10.4-i386-2.5/scipy/fftpack/_fftpack.so > > Don't use the LDFLAGS environment variable. This overwrites the link options, > including the ones which provide the libpython2.5 library. I was wondering, since this is causing so much trouble, wouldn't it be easier to just emit a warning when LDFLAGS or CFLAGS is used, and disable it ? 
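As context for Robert's advice, the workaround can be sketched in the shell like this (a hedged illustration of mine, not a command from the thread; the rebuild step is shown commented out because it needs a source tree):

```shell
# clear the variables so numpy.distutils keeps its own link line,
# including the options that pull in libpython2.5
unset LDFLAGS CFLAGS
# then rebuild from a clean tree (stale object files remember old flags):
# rm -rf build && python setup.py build
echo "LDFLAGS=${LDFLAGS:-<unset>}"
```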
David From robert.kern at gmail.com Thu Nov 15 03:09:27 2007 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 15 Nov 2007 02:09:27 -0600 Subject: [SciPy-user] Scipy-0.6.0 fails to build on MacOS 10.4.10 with gfortran (gcc4.2.2) In-Reply-To: <473BF712.4070704@ar.media.kyoto-u.ac.jp> References: <473B68B1.4060007@gmail.com> <473BF712.4070704@ar.media.kyoto-u.ac.jp> Message-ID: <473BFEB7.5070607@gmail.com> David Cournapeau wrote: > Robert Kern wrote: >> Sebastien Maret wrote: >>> Hello, >>> >>> The error log is below: >>> >>> /sw/bin/gfortran -Wall -L/sw/lib build/temp.macosx-10.4-i386-2.5/build/src.macosx-10.4-i386-2.5/scipy/fftpack/_fftpackmodule.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/drfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zrfft.o build/temp.macosx-10.4-i386-2.5/scipy/fftpack/src/zfftnd.o build/temp.macosx-10.4-i386-2.5/build/src.macosx-10.4-i386-2.5/fortranobject.o /sw/lib/djbfft.a -L/sw/lib -L/sw/lib/gcc4.2/lib/gcc/i686-apple-darwin8/4.2.2 -Lbuild/temp.macosx-10.4-i386-2.5 -ldfftpack -lfftw3 -lgfortran -o build/lib.macosx-10.4-i386-2.5/scipy/fftpack/_fftpack.so >> Don't use the LDFLAGS environment variable. This overwrites the link options, >> including the ones which provide the libpython2.5 library. > I was wondering, since this is causing so much trouble, wouldn't it be > easier to just emit a warning when LDFLAGS or CFLAGS is used, and > disable it ? Not alone, no. It's the only way to override those settings. Particularly in the case of Fortran compilers, which will often be quite different from the C compiler used to build Python itself, this is an important feature. If you make another way to fully override those settings, then you may change the LDFLAGS behavior. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From Laurence.Viry at imag.fr Thu Nov 15 03:22:11 2007 From: Laurence.Viry at imag.fr (Laurence Viry) Date: Thu, 15 Nov 2007 09:22:11 +0100 Subject: [SciPy-user] Compiling scipy with Intel ifort and SCSL on an Atix350(IA64) with linux Message-ID: <473C01B3.8020608@imag.fr> Hi! I use Python2.4.4 and numpy-1.0.3.1. *I would like to use SCSL SGI Library (which is compiled with Intel Compiler) and intel compilers to install numpy and scipy* the install script didn't recognize the fc_config (--fcompiler=intele) options " warning: build_ext: f77_compiler=intele is not available. warning: build_ext: extension 'numpy.linalg.lapack_lite' has Fortran libraries but no Fortran linker found, using default linker ..." customize IntelItaniumFCompiler option --fcompiler not recognized " #Platform information: python -c 'import os,sys;print os.name,sys.platform' *posix linux2* uname -a *Linux zephir 2.6.5-7.282-sn2 #1 SMP Tue Aug 29 10:40:40 UTC 2006 ia64 ia64 ia64 GNU/Linux* # Information about C,C++,Fortran compilers/linkers ifort -V I*ntel(R) Fortran Itanium(R) Compiler for Itanium(R)-based applications Version 9.0 Build 20050430 Package ID: l_fc_p_9.0.021* Copyright (C) 1985-2005 Intel Corporation. All rights reserved. FOR NON-COMMERCIAL USE ONLY *Intel(R) C Itanium(R) Compiler for Itanium(R)-based applications Version 9.0 Build 20050722 Package ID: l_cc_c_9.0.024* Copyright (C) 1985-2005 Intel Corporation. All rights reserved. 
FOR NON-COMMERCIAL USE ONLY # Python version: python -c 'import sys;print sys.version' *2.4.4 (#1, Nov 13 2007, 12:19:56) [GCC 3.3.3 (SuSE Linux)]* # Python Numpy version: *1.0.3.1* # f2py version: *2_3979* # I try to use SCSL SGI Library which is optimized on our cluster, compiled with intel compilers Configuration file for numpy and scipy: site.cfg

[DEFAULT]
library_dirs = /usr/lib:/usr/local/lib/
include_dirs = /usr/include:/usr/local/include

[blas]
library_dirs = /usr/lib
blas_libs = scs

[lapack]
library_dirs = /usr/lib
lapack_libs = scs

[fftw]
library_dirs = /usr/local/stow/fftw-3.1.2/lib
include_dirs = /usr/local/stow/fftw-3.1.2/include
libraries = fftw3

# python command to install
*python setup.py config --compiler=intele --fcompiler=intele build_clib --compiler=intele --fcompiler=intele build_ext --compiler=intele --fcompiler=intele install*

*The script didn't recognize the fc_config (--fcompiler=intele) options* " ... warning: build_ext: f77_compiler=intele is not available. warning: build_ext: extension 'numpy.linalg.lapack_lite' has Fortran libraries but no Fortran linker found, using default linker ..." customize IntelItaniumFCompiler option --fcompiler not recognized Couldn't match compiler version for 'Intel(R) Fortran Itanium(R) Compiler for Itanium(R)-based applications\nVersion 9.0 Build 20050430 Package ID: l_fc_p_9.0.021\nCopyright (C) 1985-2005 Intel Corporation. All rights reserved.\nFOR NON-COMMERCIAL USE ONLY\n\n Intel Fortran 9.0-5238' I changed numpy/distutils/ccompiler.py to: compiler_class['intele'] = ('intelccompiler','IntelItaniumCCompiler', "Intel(R) C Itanium(R) Compiler for Itanium(R)-based applications") and put version_match = intel_version_match('Itanium') in fcompiler/intel.py It found the C Intel Compiler but doesn't find the Fortran Intel compiler.
It's Ok for numpy but when I installed scipy, I had the error with fortran " building 'scipy.interpolate.dfitpack' extension error: extension 'scipy.interpolate.dfitpack' has Fortran sources but no Fortran compiler found " Can you help me to understand thanks -- Laurence Viry CRIP UJF - Projet MIRAGE ( http://mirage.imag.fr ) Laboratoire de Modélisation et Calcul - IMAG tel: 04 76 63 56 34 fax: 04 76 63 12 63 e-mail: Laurence.Viry at imag.fr From darkwind.87 at gmail.com Thu Nov 15 03:56:06 2007 From: darkwind.87 at gmail.com (Dark Wind) Date: Thu, 15 Nov 2007 00:56:06 -0800 Subject: [SciPy-user] Problem with NLP in OpenOpt Message-ID: <8a6035a00711150056r62c30b6cyd5fb4052eaa3f8b8@mail.gmail.com> Hi, I am using NLP of OpenOpt to solve a NonLinear programming problem. I am using scipy_cobyla solver.
Following is my code:
--------------------------------------------------code-------------------------------------------------
from scikits.openopt import NLP

f = lambda x: 1*x[0] + 7*x[5] + 3*x[10] + 8*x[15]
x0 = [1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1]

p = NLP(f, x0, maxIter=1e7, maxFunEvals=1e7)

h1 = lambda x: x[0]**2 + x[1]**2 + x[2]**2 + x[3]**2 - 1
h2 = lambda x: x[0]*x[4] + x[1]*x[5] + x[2]*x[6] + x[3]*x[7]
h3 = lambda x: x[0]*x[8] + x[1]*x[9] + x[2]*x[10] + x[3]*x[11]
h4 = lambda x: x[0]*x[12] + x[1]*x[13] + x[2]*x[14] + x[3]*x[15]
h5 = lambda x: x[4]**2 + x[5]**2 + x[6]**2 + x[7]**2 - 1
h6 = lambda x: x[4]*x[8] + x[5]*x[9] + x[6]*x[10] + x[7]*x[11]
h7 = lambda x: x[4]*x[12] + x[5]*x[13] + x[6]*x[14] + x[7]*x[15]
h8 = lambda x: x[8]**2 + x[9]**2 + x[10]**2 + x[11]**2 - 1
h9 = lambda x: x[8]*x[12] + x[9]*x[13] + x[10]*x[14] + x[11]*x[15]
h10 = lambda x: x[12]**2 + x[13]**2 + x[14]**2 + x[15]**2 - 1

p.h = [h1, h2, h3, h4, h5, h6, h7, h8, h9, h10]

r = p.solve('scipy_cobyla')
-----------------------------------------------------------------------------------------------------------
The initial value x0 satisfies the constraints hi's but I am still getting that there is no feasible solution. Can anyone help me with this?
Thank you From stefan at sun.ac.za Thu Nov 15 04:46:53 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 15 Nov 2007 11:46:53 +0200 Subject: [SciPy-user] KDE question In-Reply-To: <4739E580.9050208@gmail.com> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> <47356362.7010508@ar.media.kyoto-u.ac.jp> <473566E4.2060101@gmail.com> <20071113100408.GA6802@mentat.za.net> <4739E580.9050208@gmail.com> Message-ID: <20071115094653.GD9382@mentat.za.net> On Tue, Nov 13, 2007 at 11:57:20AM -0600, Robert Kern wrote: > Stefan van der Walt wrote: > > On Sat, Nov 10, 2007 at 02:08:04AM -0600, Robert Kern wrote: > >> David Cournapeau wrote: > >> > >>> I am not sure I understand exactly the problem, but if the problem is to > >>> find a contour level of a Gaussian density, it has a closed form for any > >>> dimension. > >> No, the problem is to find the appropriate contour level of a kernel density > >> estimate (with Gaussian kernels in this case). Essentially, a mixture of many > >> Gaussians, not a single one. > > > > Sounds like the kind of problem that can be solved using marching > > squares: > > > > http://www.polytech.unice.fr/~lingrand/MarchingCubes/algo.html > > This solves the already-matplotlib-solved problem of drawing the contours given > a level. That still leaves finding the correct level. Or am I underestimating > the potential to reformulate marching squares to solve the > integration problem, too? No, I don't think you are. As for the line-search, since the different components of the mixture are available, can't we evaluate the integral (over each component) directly, rather than working with a grid? I come from a GMM background, so a question about KDE: is there a component placed around each datapoint? It looks that way, since there are no means calculated anywhere, as with gaussian mixtures. 
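To Stefan's last question: yes, scipy's gaussian_kde centres one kernel on every data point, with a single shared bandwidth, which can be checked directly against the mixture formula (a small sketch of mine; the three sample points and the evaluation point are arbitrary):

```python
import numpy as np
from scipy.stats import gaussian_kde

data = np.array([[-2.0, 0.0, 2.0]])     # three 1-D samples, shape (d, n)
kde = gaussian_kde(data)

# the estimate is the average of n Gaussians centred on the samples,
# all sharing the bandwidth matrix kde.covariance
x = 0.7
cov = kde.covariance[0, 0]
manual = np.mean(np.exp(-0.5 * (x - data[0])**2 / cov)) / np.sqrt(2 * np.pi * cov)
print(np.allclose(kde(x)[0], manual))   # True
```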
Cheers Stéfan From stefan at sun.ac.za Thu Nov 15 08:05:15 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Thu, 15 Nov 2007 15:05:15 +0200 Subject: [SciPy-user] KDE question In-Reply-To: <20071115094653.GD9382@mentat.za.net> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> <47356362.7010508@ar.media.kyoto-u.ac.jp> <473566E4.2060101@gmail.com> <20071113100408.GA6802@mentat.za.net> <4739E580.9050208@gmail.com> <20071115094653.GD9382@mentat.za.net> Message-ID: <20071115130515.GC26022@mentat.za.net> On Thu, Nov 15, 2007 at 11:46:53AM +0200, Stefan van der Walt wrote: > > > Sounds like the kind of problem that can be solved using marching > > > squares: > > > > > > http://www.polytech.unice.fr/~lingrand/MarchingCubes/algo.html > > > > This solves the already-matplotlib-solved problem of drawing the contours given > > a level. That still leaves finding the correct level. Or am I underestimating > > the potential to reformulate marching squares to solve the > > integration problem, too? > > No, I don't think you are. As for the line-search, since the > different components of the mixture are available, can't we evaluate > the integral (over each component) directly, rather than working with > a grid? No, since the relevant area depends on the *sum* of components, not the value of the component itself.
Cheers Stéfan From dmitrey.kroshko at scipy.org Thu Nov 15 09:49:19 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Thu, 15 Nov 2007 16:49:19 +0200 Subject: [SciPy-user] Problem with NLP in OpenOpt In-Reply-To: <8a6035a00711150056r62c30b6cyd5fb4052eaa3f8b8@mail.gmail.com> References: <8a6035a00711150056r62c30b6cyd5fb4052eaa3f8b8@mail.gmail.com> Message-ID: <473C5C6F.9080208@scipy.org> hi, scipy_cobyla yields solution r.ff = -19, r.xf= array([ -1.00000000e+00, -9.80461510e-08, 1.35489449e-05, 5.37394643e-06, 9.80515487e-08, -1.00000000e+00, 4.43639580e-07, -2.00656699e-07, -5.92177803e-07, -2.01266552e-07, -1.00000000e+00, 1.02019241e-07, -1.81972968e-07, 2.91935590e-07, -1.02019165e-07, -1.00000000e+00]) and max constraint = 1.29567664921e-05, that exceeds default p.contol=1e-6 ALGENCAN yields r.ff=-19.0000051782, r.xf = array([ -1.00000000e+00, 3.77969060e-08, 2.50917684e-08, 3.90370801e-08, -3.78516336e-08, -1.00000020e+00, -1.99934148e-08, 3.28135157e-09, -2.50878312e-08, 1.99952165e-08, -9.99999998e-01, 2.31707460e-08, -3.90372694e-08, -3.28049018e-09, -2.27805959e-08, -1.00000048e+00]) and max constraint 9.53914e-07 that is less than default 1e-6 and hence is feasible, no error message (btw time elapsed is 30% less than cobyla). You could just set p.contol=1e-4 and scipy_cobyla will solve your problem well. Regards, D. Dark Wind wrote: > Hi, > > I am using NLP of OpenOpt to solve a NonLinear programming problem. I > am using scipy_cobyla solver.
> The initial point that I have provided is feasible, but I am getting a message: > ------------------------------------------message--------------------------------------------------------- > starting solver scipy_cobyla (license: BSD ) with problem unnamed > solver scipy_cobyla has finished solving the problem unnamed > istop: -100 > Solver: Time Elapsed = 0.75 CPU Time Elapsed = 0.750110139697 > NO FEASIBLE SOLUTION is obtained (max residual = 1.29567664921e-005) > ----------------------------------------------------------------------------------------------------------------- > > > All other solvers cannot take h constraint and some others require > extra additions. > > Following is my code: > --------------------------------------------------code------------------------------------------------- > from scikits.openopt import NLP > > f = lambda x: 1*x[0] + 7*x[5] + 3*x[10] + 8*x[15] > x0 = [1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1] > > p=NLP(f,x0,maxIter=1e7,maxFunEvals=1e7) > > h1 = lambda x: x[0]**2 + x[1]**2 + x[2]**2 + x[3]**2 -1 > h2 = lambda x: x[0]*x[4]+x[1]*x[5]+x[2]*x[6]+x[3]*x[7] > h3 = lambda x: x[0]*x[8]+x[1]*x[9]+x[2]*x[10]+x[3]*x[11] > h4 = lambda x: x[0]*x[12]+x[1]*x[13]+x[2]*x[14]+x[3]*x[15] > h5 = lambda x: x[4]**2+x[5]**2+x[6]**2+x[7]**2-1 > h6 = lambda x: x[4]*x[8]+x[5]*x[9]+x[6]*x[10]+x[7]*x[11] > h7 = lambda x: x[4]*x[12]+x[5]*x[13]+x[6]*x[14]+x[7]*x[15] > h8 = lambda x: x[8]**2+x[9]**2+x[10]**2+x[11]**2-1 > h9 = lambda x: x[8]*x[12]+x[9]*x[13]+x[10]*x[14]+x[11]*x[15] > h10 = lambda x: x[12]**2+x[13]**2+x[14]**2+x[15]**2-1 > > p.h=[h1,h2,h3,h4,h5,h6,h7,h8,h9,h10] > > r=p.solve('scipy_cobyla') > ----------------------------------------------------------------------------------------------------------- > > The initial value x0 satisfies the constraints hi's but i am still > getting that there is no feasible solution. > > Can anyone help me with this? 
> Thank you > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > From dominique.orban at gmail.com Thu Nov 15 11:54:44 2007 From: dominique.orban at gmail.com (Dominique Orban) Date: Thu, 15 Nov 2007 11:54:44 -0500 Subject: [SciPy-user] Problem with NLP in OpenOpt In-Reply-To: <8a6035a00711150056r62c30b6cyd5fb4052eaa3f8b8@mail.gmail.com> References: <8a6035a00711150056r62c30b6cyd5fb4052eaa3f8b8@mail.gmail.com> Message-ID: <8793ae6e0711150854g7b78db7ex4738dd0dd00cbe13@mail.gmail.com> On 11/15/07, Dark Wind wrote: > The initial value x0 satisfies the constraints hi's but i am still > getting that there is no feasible solution. > > Can anyone help me with this? Cobyla is an infeasible method, meaning that in order to identify a solution to your problem, it will venture outside of the feasible set. In your example, it wasn't able to reach a feasible solution within the given limits (max number of iterations, tolerances, ...). The fact that there is a feasible point does not mean the code will be able to find a solution; in this case, it got stuck before reaching the feasible set again. You can either re-solve your problem with looser tolerances, which is not very satisfactory, re-solve from a different starting point (which doesn't have to be feasible), or try another solver (as Dmitrey showed in his response).
Dominique From adam.ginsburg at colorado.edu Thu Nov 15 14:07:51 2007 From: adam.ginsburg at colorado.edu (Adam Ginsburg) Date: Thu, 15 Nov 2007 12:07:51 -0700 Subject: [SciPy-user] Pylab plotting: math range errors unrecoverable Message-ID: I've been encountering these errors quite often: : math range error > /usr/lib/python2.5/site-packages/matplotlib/ticker.py(791)scale_range() 790 else: --> 791 ex = divmod(math.log10(-meanv), 1)[0] 792 offset = -10**ex They happen when I try to plot an array containing "inf" values - no surprise there. Two questions, though: 1. is there any way to make pylab ignore inf values, i.e. plot zeroes or quit cleanly instead of giving a messy error? I've looked at masked arrays, but they don't seem to do it, and I'd like something that does not require modifying my data 2. when I get the above error, then try to plot something WITHOUT infinities, it fails with the exact same error, and when I ran the debugger it claimed that meanv was still inf. Is there any way to fix that short of quitting and reentering the python command line? Thanks, Adam From mforbes at physics.ubc.ca Thu Nov 15 14:19:30 2007 From: mforbes at physics.ubc.ca (Michael McNeil Forbes) Date: Thu, 15 Nov 2007 11:19:30 -0800 Subject: [SciPy-user] Pylab plotting: math range errors unrecoverable In-Reply-To: References: Message-ID: On 15 Nov 2007, at 11:07 AM, Adam Ginsburg wrote: > 2. when I get the above error, then try to plot something WITHOUT > infinities, it fails with the exact same error, and when I ran the > debugger it claimed that meanv was still inf. Is there any way to fix > that short of quitting and reentering the python command line? You can usually just clear the plot with clf(). I think the plot keeps the old data and then tries to redisplay it each time, giving errors over and over until you clear the figure. Michael. 
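One way to sidestep both of Adam's problems is to mask the non-finite values out of the array handed to pylab, leaving the stored data untouched (a sketch of mine; only the numpy part is executed here, the plotting call is indicated in comments):

```python
import numpy as np

y = np.array([1.0, 2.0, np.inf, 4.0, np.nan, 5.0])
x = np.arange(len(y))

# keep only finite points; the original array is not modified
m = np.isfinite(y)
x_ok, y_ok = x[m], y[m]

# pylab.plot(x_ok, y_ok) then never sees an inf, so the tick locator
# has a finite mean to work with; and after an error has already
# happened, pylab.clf() clears the stale inf data as Michael notes
print(y_ok)   # [1. 2. 4. 5.]
```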
From david at ar.media.kyoto-u.ac.jp Thu Nov 15 22:44:17 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 16 Nov 2007 12:44:17 +0900 Subject: [SciPy-user] KDE question In-Reply-To: <20071115130515.GC26022@mentat.za.net> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> <47356362.7010508@ar.media.kyoto-u.ac.jp> <473566E4.2060101@gmail.com> <20071113100408.GA6802@mentat.za.net> <4739E580.9050208@gmail.com> <20071115094653.GD9382@mentat.za.net> <20071115130515.GC26022@mentat.za.net> Message-ID: <473D1211.70704@ar.media.kyoto-u.ac.jp> Stefan van der Walt wrote: > On Thu, Nov 15, 2007 at 11:46:53AM +0200, Stefan van der Walt wrote: >>>> Sounds like the kind of problem that can be solved using marching >>>> squares: >>>> >>>> http://www.polytech.unice.fr/~lingrand/MarchingCubes/algo.html >>> This solves the already-matplotlib-solved problem of drawing the contours given >>> a level. That still leaves finding the correct level. Or am I underestimating >>> the potential to reformulate marching squares to solve the >>> integration problem, too? >> No, I don't think you are. As for the line-search, since the >> different components of the mixture are available, can't we evaluate >> the integral (over each component) directly, rather than working with >> a grid? > > No, since the relevant area depends on the *sum* of components, not > the value of the component itself. Yes, that's what I would have thought, too. As Robert said, I don't think you can find an analytical version for the inverse cumulative "function" of a mixture of Gaussian on an area of interest, and that's what we need (matlab does not implement it, and I guess that if it was possible, they would have put it with the pdf and cdf abilities of their mixture object). For a component, the contour shape is easy to see (ellipsoids), for mixtures, not so easy. 
For kernel estimators, you assume each component is the same 'shape', (I mean same covariance matrix) right ? Maybe this makes the computation feasible (find all the points such that sum_i{a_i f(x - \mu_i)} = cst) ? David From yennifersantiago at gmail.com Thu Nov 15 22:59:32 2007 From: yennifersantiago at gmail.com (Yennifer Santiago) Date: Thu, 15 Nov 2007 23:59:32 -0400 Subject: [SciPy-user] Scipy-AlgoritmosGeneticos Message-ID: <41bc705b0711151959u49c6a6d2r82a8e70414849e8@mail.gmail.com> Hello, I need information about the use of SciPy, specifically for applications with Genetic Algorithms. If you have or know of any place where I can find a tutorial on this, please let me know. Thanks Yennifer Santiago From robert.kern at gmail.com Thu Nov 15 23:03:59 2007 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 15 Nov 2007 22:03:59 -0600 Subject: [SciPy-user] KDE question In-Reply-To: <473D1211.70704@ar.media.kyoto-u.ac.jp> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> <47356362.7010508@ar.media.kyoto-u.ac.jp> <473566E4.2060101@gmail.com> <20071113100408.GA6802@mentat.za.net> <4739E580.9050208@gmail.com> <20071115094653.GD9382@mentat.za.net> <20071115130515.GC26022@mentat.za.net> <473D1211.70704@ar.media.kyoto-u.ac.jp> Message-ID: <473D16AF.9010108@gmail.com> David Cournapeau wrote: > For kernel estimators, you assume each component is the same 'shape', (I > mean same covariance matrix) right ? This is the only form that has been implemented, yes. Fancier implementations can adapt the covariance matrix (in KDE jargon, "bandwidth") of the kernels to take into account the properties of the local neighborhood. However, this is usually only done in 1-D because N-D sorting is hard. > Maybe this makes the computation > feasible (find all the points such that sum_i{a_i f(x - \mu_i)} = cst) ? That boils down to evaluating on the grid again, I think.
-- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From cohen at slac.stanford.edu Thu Nov 15 23:04:45 2007 From: cohen at slac.stanford.edu (Johann Cohen-Tanugi) Date: Thu, 15 Nov 2007 20:04:45 -0800 Subject: [SciPy-user] Scipy-AlgoritmosGeneticos In-Reply-To: <41bc705b0711151959u49c6a6d2r82a8e70414849e8@mail.gmail.com> References: <41bc705b0711151959u49c6a6d2r82a8e70414849e8@mail.gmail.com> Message-ID: <473D16DD.4050303@slac.stanford.edu> Yennifer Santiago wrote: > Hello, > > I need information about the use of SciPy, specifically > for applications with Genetic Algorithms. If you have or know of any > place where I can find a tutorial on this, please let me know. > > Thanks > > Yennifer Santiago > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > I think that in the sandbox area of scipy there is a genetic algorithm package in development. No idea about the stage of development..... Sorry to answer in English!
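Since the sandbox package's status is unclear, here is a self-contained toy of what a genetic algorithm does (my own generic sketch with tournament selection, one-point crossover and bit-flip mutation, maximising the number of 1-bits; it is not the scipy.sandbox.ga API):

```python
import random

random.seed(0)

def onemax(bits):
    # fitness: count of 1-bits; the optimum is the all-ones string
    return sum(bits)

def evolve(n_bits=20, pop_size=30, generations=60, p_mut=0.05, elite=2):
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=onemax, reverse=True)
        nxt = [p[:] for p in pop[:elite]]           # elitism
        while len(nxt) < pop_size:
            # tournament selection of two parents
            a = max(random.sample(pop, 3), key=onemax)
            b = max(random.sample(pop, 3), key=onemax)
            cut = random.randrange(1, n_bits)       # one-point crossover
            child = a[:cut] + b[cut:]
            # bit-flip mutation
            child = [bit ^ (random.random() < p_mut) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=onemax)

best = evolve()
```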
Johann From david at ar.media.kyoto-u.ac.jp Thu Nov 15 23:13:58 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 16 Nov 2007 13:13:58 +0900 Subject: [SciPy-user] KDE question In-Reply-To: <473D16AF.9010108@gmail.com> References: <9F9D8106-7EEE-4623-98C9-B8B65C428419@stanford.edu> <4734BF4E.1090102@gmail.com> <47356362.7010508@ar.media.kyoto-u.ac.jp> <473566E4.2060101@gmail.com> <20071113100408.GA6802@mentat.za.net> <4739E580.9050208@gmail.com> <20071115094653.GD9382@mentat.za.net> <20071115130515.GC26022@mentat.za.net> <473D1211.70704@ar.media.kyoto-u.ac.jp> <473D16AF.9010108@gmail.com> Message-ID: <473D1906.5090907@ar.media.kyoto-u.ac.jp> Robert Kern wrote: > David Cournapeau wrote: > > >> For kernel estimators, you assume each component is the same 'shape', (I >> mean same covariance matrix) right ? >> > > This is the only form that has been implemented, yes. Fancier implementations > can adapt the covariance matrix (in KDE jargon, "bandwidth") of the kernels to > take into account the properties of the local neighborhood. However, this is > usually only done in 1-D because N-D sorting is hard. > > >> Maybe this makes the computation >> feasible (find all the points such that sum_i{a_i f(x - \mu_i)} = cst) ? >> > > That boils down to evaluating on the grid again, I think. > Yes, sorry, I was stupid. It's the different locations that matter; having the same shape for each Gaussian does not help that much. For the one-component case, the thing which makes it easy is that pdf = cst boils down to a "quadratic form involving the samples = constant". For multiple components, this does not seem feasible anymore.
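The grid evaluation the thread keeps coming back to can be turned into the wanted level search in a few lines (a sketch of mine using scipy.stats.gaussian_kde; the 90% mass target, grid extent and toy data are arbitrary choices):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.RandomState(0)
data = rng.multivariate_normal([0, 0], [[1.0, 0.5], [0.5, 1.0]], size=500).T
kde = gaussian_kde(data)   # one kernel per data point, shared bandwidth

# evaluate the mixture on a grid
xs, ys = np.mgrid[-4:4:100j, -4:4:100j]
dens = kde(np.vstack([xs.ravel(), ys.ravel()]))

# line-search replacement: sort cell densities descending and accumulate
# cell mass until the requested fraction is enclosed; the density at that
# cell is the contour level to hand to matplotlib's contour()
cell_area = (xs[1, 0] - xs[0, 0]) * (ys[0, 1] - ys[0, 0])
order = np.sort(dens)[::-1]
mass = np.cumsum(order) * cell_area
level = order[np.searchsorted(mass, 0.9)]
```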
cheers, David From skraelings001 at gmail.com Thu Nov 15 23:46:08 2007 From: skraelings001 at gmail.com (Reynaldo Baquerizo) Date: Thu, 15 Nov 2007 23:46:08 -0500 Subject: [SciPy-user] Scipy-AlgoritmosGeneticos In-Reply-To: <473D16DD.4050303@slac.stanford.edu> References: <41bc705b0711151959u49c6a6d2r82a8e70414849e8@mail.gmail.com> <473D16DD.4050303@slac.stanford.edu> Message-ID: <473D2090.6090306@gmail.com> Johann Cohen-Tanugi wrote: > Yennifer Santiago wrote: > >> Hello, >> >> I need information about the use of SciPy, specifically >> for applications with Genetic Algorithms. If you have or know of any >> place where I can find a tutorial on this, please let me know. >> >> Thanks >> >> Yennifer Santiago >> _______________________________________________ >> SciPy-user mailing list >> SciPy-user at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-user >> >> > I think that in the sandbox area of scipy there is a genetic algorithm > maybe you could look at example.py in: http://svn.scipy.org/svn/scipy/trunk/scipy/sandbox/ga/ some resources: http://www.iitk.ac.in/kangal/links.shtml http://www.genetic-programming.org/ > package in development. No idea about the stage of development..... > Sorry to answer in English! > Johann > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > From jelleferinga at gmail.com Fri Nov 16 04:22:32 2007 From: jelleferinga at gmail.com (jelle) Date: Fri, 16 Nov 2007 09:22:32 +0000 (UTC) Subject: [SciPy-user] Scipy-AlgoritmosGeneticos References: <41bc705b0711151959u49c6a6d2r82a8e70414849e8@mail.gmail.com> <473D16DD.4050303@slac.stanford.edu> <473D2090.6090306@gmail.com> Message-ID: if you're interested in evolutionary computing, have a look at Evolving Objects. Also, Python wrappers exist.
think of EO as a framework for evolutionary computing From grh at mur.at Fri Nov 16 05:44:43 2007 From: grh at mur.at (Georg Holzmann) Date: Fri, 16 Nov 2007 11:44:43 +0100 Subject: [SciPy-user] adaptive simulated annealing (ASA) Message-ID: <473D749B.6070308@mur.at> Hallo! Is there already a python implementation of the adaptive simulated annealing optimization algorithm (maybe a wrapper to the following C library: http://www.ingber.com/ASA-README.html) ? I did quite some google search and did not find something - so any hint is very much appreciated ! Thanks, LG Georg From dmitrey.kroshko at scipy.org Fri Nov 16 05:56:43 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Fri, 16 Nov 2007 12:56:43 +0200 Subject: [SciPy-user] adaptive simulated annealing (ASA) In-Reply-To: <473D749B.6070308@mur.at> References: <473D749B.6070308@mur.at> Message-ID: <473D776B.30009@scipy.org> Georg Holzmann wrote: > Hallo! > > Is there already a python implementation of the adaptive simulated > annealing optimization algorithm (maybe a wrapper to the following C > library: http://www.ingber.com/ASA-README.html) ? > > I did quite some google search and did not find something - so any hint > is very much appreciated ! > I had done such a search as well (for global minimum software with a Python interface available). All I found were some obsolete packages, plus one more with very complicated documentation (IIRC particle swarm simulation based). As for scipy anneal, I had reported that it doesn't work properly but no one answered http://article.gmane.org/gmane.comp.python.scientific.user/13553/match=anneal and I have no time for now to search and fix the bug(s) myself Regards, D From lbolla at gmail.com Fri Nov 16 09:53:00 2007 From: lbolla at gmail.com (lorenzo bolla) Date: Fri, 16 Nov 2007 15:53:00 +0100 Subject: [SciPy-user] ode on complex systems Message-ID: <80c99e790711160653v58f2b65o68a52c422e4623fc@mail.gmail.com> Hi all, can I use any of scipy.integrate.ode/odeint etc.
for complex systems? How? Here is what I'm doing, but I only get r.y real, instead of complex (and the result is different from Matlab).
----------------------------------------
import numpy
import scipy.integrate

def f(t, y, betaA, betaB, CAB):
    M = numpy.array([[1j * betaA, CAB], [-CAB, 1j * betaB]])
    return numpy.dot(M, y)

betaA = 1.
betaB = 1.
CAB = 1.
t0 = 0.
t1 = 1.
dt = .01
y0 = numpy.array([1, 0]).astype(complex)
r = scipy.integrate.ode(f).set_integrator('vode').set_initial_value(y0, t0).set_f_params(betaA, betaB, CAB)
while r.successful() and r.t < t1:
    r.integrate(r.t + dt)
    print r.t, r.y
----------------------------------------
Thank you in advance, Lorenzo From Fabrice.Silva at crans.org Fri Nov 16 10:19:08 2007 From: Fabrice.Silva at crans.org (Fabrice Silva) Date: Fri, 16 Nov 2007 16:19:08 +0100 Subject: [SciPy-user] ode on complex systems In-Reply-To: <80c99e790711160653v58f2b65o68a52c422e4623fc@mail.gmail.com> References: <80c99e790711160653v58f2b65o68a52c422e4623fc@mail.gmail.com> Message-ID: <1195226348.24691.2.camel@localhost> Le vendredi 16 novembre 2007 à 15:53 +0100, lorenzo bolla a écrit : > Hi all, > can I use any of scipy.integrate.ode/odeint etc. for complex systems? How? You may need to separate real and imaginary parts of your complex unknowns: you need to transform y(t)=[y0(t), y1(t)], with y0 and y1 complex signals, into y(t)=[real(y0(t)), imag(y0(t)), real(y1(t)), imag(y1(t))] and modify your matrix M as well. Can anyone confirm?
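Fabrice's real/imaginary splitting can be sketched with odeint (my own illustration built on the system from Lorenzo's post; for this particular M the evolution is unitary, so |y| = 1 makes a handy correctness check):

```python
import numpy as np
from scipy.integrate import odeint

betaA = betaB = CAB = 1.0
M = np.array([[1j * betaA, CAB], [-CAB, 1j * betaB]])

def f_real(z, t):
    # z stacks the unknowns as [Re(y0), Re(y1), Im(y0), Im(y1)]
    y = z[:2] + 1j * z[2:]
    dy = M.dot(y)
    return np.concatenate([dy.real, dy.imag])

z0 = np.array([1.0, 0.0, 0.0, 0.0])       # y(0) = [1, 0]
t = np.linspace(0.0, 1.0, 101)
z = odeint(f_real, z0, t)
y = z[:, :2] + 1j * z[:, 2:]              # reassemble the complex solution

# M = i*I + J with J skew-symmetric, so |y0|^2 + |y1|^2 stays at 1
norm = np.abs(y[-1, 0])**2 + np.abs(y[-1, 1])**2
```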
-- Fabricio From nwagner at iam.uni-stuttgart.de Fri Nov 16 10:25:53 2007 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 16 Nov 2007 16:25:53 +0100 Subject: [SciPy-user] ode on complex systems In-Reply-To: <1195226348.24691.2.camel@localhost> References: <80c99e790711160653v58f2b65o68a52c422e4623fc@mail.gmail.com> <1195226348.24691.2.camel@localhost> Message-ID: On Fri, 16 Nov 2007 16:19:08 +0100 Fabrice Silva wrote: > > Le vendredi 16 novembre 2007 ? 15:53 +0100, lorenzo >bolla a ?crit : >> Hi all, >> can I use any of scipy.integrate.ode/odeint etc. for >>complex systems? How? > > You may need to separate real and imaginary parts of >your complex > unknowns : > you need to transform > y(t)=[y0(t), y1(t)] with y0 and y1 complex signals > into: > y(t)=[real(y0(t)), imag(y0(t)), real (y1(t)), >imag(y1(t))] > > and modify your matrix M as well. > > Can anyone confirm? > -- >Fabricio > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user AFAIK, you can also apply the patch which is available at http://projects.scipy.org/scipy/scipy/ticket/334 So you can integrate complex ODE's directly without doubling the size. Please let me know how it works for you. Nils From bryanv at enthought.com Fri Nov 16 11:05:20 2007 From: bryanv at enthought.com (Bryan Van de Ven) Date: Fri, 16 Nov 2007 10:05:20 -0600 Subject: [SciPy-user] adaptive simulated annealing (ASA) In-Reply-To: <473D749B.6070308@mur.at> References: <473D749B.6070308@mur.at> Message-ID: <473DBFC0.4010105@enthought.com> It's been a while, but I remember ASA having a fairly straightforward interface (set up one big config structure, then call one or two functions). The 'asamin' matlab wrapper isn't much code at all, and the guy that wrote that didn't have SWIG. 
:) I doubt you will find many simulated annealing packages better or more featured than ASA, so I'd suggest taking a stab at wrapping it yourself. http://www.igi.tugraz.at/lehre/MLA/WS01/asamin.html (for reference) http://www.swig.org/ Georg Holzmann wrote: > Hallo! > > Is there already a python implementation of the adaptive simulated > annealing optimization algorithm (maybe a wrapper to the following C > library: http://www.ingber.com/ASA-README.html) ? > > I did quite some google search and did not find something - so any hint > is very much appreciated ! > > Thanks, > LG > Georg > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From odonnems at yahoo.com Fri Nov 16 11:24:29 2007 From: odonnems at yahoo.com (Michael ODonnell) Date: Fri, 16 Nov 2007 08:24:29 -0800 (PST) Subject: [SciPy-user] inline.weave compile error Message-ID: <413269.69593.qm@web58011.mail.re3.yahoo.com> I am trying to compile some inline C++ code inside Python using weave. I always get a similar problem where the compiled file cannot be found (see the output below). I am not sure if the problem is with the compiler or something else. I am a new user of scipy and a novice with Python, so I would appreciate any direction someone can give me because I have not been able to figure out a workaround.
Thank you, Michael +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ I have the following related applications installed: +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Python: 2.4.1 (I am using an older version because this is what ESRI GIS application supports) Scipy: scipy-0.6.0.win32-py2.4.exe MinGW: MinGW-5.1.3.exe OS: Windows XP SP2 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ When I test the installation of Weave I get the following output: +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ >>> weave.test() Found 1 tests for scipy.weave.ast_tools Found 2 tests for scipy.weave.blitz_tools Found 9 tests for scipy.weave.build_tools Found 0 tests for scipy.weave.c_spec Found 26 tests for scipy.weave.catalog building extensions here: c:\docume~1\michael\locals~1\temp\Michael\python24_compiled\m3 Found 1 tests for scipy.weave.ext_tools Found 0 tests for scipy.weave.inline_tools Found 74 tests for scipy.weave.size_check Found 16 tests for scipy.weave.slice_handler Found 3 tests for scipy.weave.standard_array_spec Found 0 tests for __main__ ...warning: specified build_dir '_bad_path_' does not exist or is not writable. Trying default locations .....warning: specified build_dir '_bad_path_' does not exist or is not writable. Trying default locations ............................removing 'c:\docume~1\michael\locals~1\temp\tmpdqudhmcat_test' (and everything under it) error removing c:\docume~1\michael\locals~1\temp\tmpdqudhmcat_test: c:\docume~1\michael\locals~1\temp\tmpdqudhmcat_test\win3224compiled_catalog: Permission denied error removing c:\docume~1\michael\locals~1\temp\tmpdqudhmcat_test: c:\docume~1\michael\locals~1\temp\tmpdqudhmcat_test: Directory not empty .removing 'c:\docume~1\michael\locals~1\temp\tmpw144aycat_test' (and everything under it) ............................................................................................... 
---------------------------------------------------------------------- Ran 132 tests in 2.625s OK +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ When I try to test the following script or any other script I get the following message: +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ def prod(m, v): #C++ version nrows, ncolumns = m.shape res = numpy.zeros((nrows, ncolumns), float) code = r""" for (int i=0; i running build_ext running build_src building extension "sc_045cfaf40ef1a0738b1066aaa886a55d7" sources customize Mingw32CCompiler customize Mingw32CCompiler using build_ext customize Mingw32CCompiler customize Mingw32CCompiler using build_ext building 'sc_045cfaf40ef1a0738b1066aaa886a55d7' extension compiling C++ sources C compiler: g++ -mno-cygwin -O2 -Wall compile options: '-IC:\Python24\lib\site-packages\scipy\weave -IC:\Python24\lib\site-packages\scipy\weave\scxx -IC:\Python24\lib\site-packages\numpy\core\include -IC:\Python24\include -IC:\Python24\PC -c' g++ -mno-cygwin -O2 -Wall -IC:\Python24\lib\site-packages\scipy\weave -IC:\Python24\lib\site-packages\scipy\weave\scxx -IC:\Python24\lib\site-packages\numpy\core\include -IC:\Python24\include -IC:\Python24\PC -c c:\docume~1\michael\locals~1\temp\Michael\python24_compiled\sc_045cfaf40ef1a0738b1066aaa886a55d7.cpp -o c:\docume~1\michael\locals~1\temp\Michael\python24_intermediate\compiler_c8350d870e6c54e8f29dd7094c2bfb45\Release\docume~1\michael\locals~1\temp\michael\python24_compiled\sc_045cfaf40ef1a0738b1066aaa886a55d7.o Traceback (most recent call last): File "C:\Python24\Lib\site-packages\pythonwin\pywin\framework\scriptutils.py", line 310, in RunScript exec codeObject in __main__.__dict__ File "C:\Documents and Settings\Michael\Application Data\ESRI\ArcToolbox\scripts\test_weave.py", line 179, in ? 
main() File "C:\Documents and Settings\Michael\Application Data\ESRI\ArcToolbox\scripts\test_weave.py", line 170, in main prod(m, v) File "C:\Documents and Settings\Michael\Application Data\ESRI\ArcToolbox\scripts\test_weave.py", line 44, in prod err = weave.inline(code,['nrows', 'ncolumns', 'res', 'm', 'v'], verbose=2) File "C:\Python24\Lib\site-packages\scipy\weave\inline_tools.py", line 338, in inline auto_downcast = auto_downcast, File "C:\Python24\Lib\site-packages\scipy\weave\inline_tools.py", line 447, in compile_function verbose=verbose, **kw) File "C:\Python24\Lib\site-packages\scipy\weave\ext_tools.py", line 365, in compile verbose = verbose, **kw) File "C:\Python24\Lib\site-packages\scipy\weave\build_tools.py", line 269, in build_extension setup(name = module_name, ext_modules = [ext],verbose=verb) File "C:\Python24\lib\site-packages\numpy\distutils\core.py", line 173, in setup return old_setup(**new_attr) File "C:\Python24\lib\distutils\core.py", line 159, in setup raise SystemExit, error CompileError: error: c:\docume~1\michael\locals~1\temp\tmp2dwbkp: No such file or directory ____________________________________________________________________________________ Be a better sports nut! Let your teams follow you with Yahoo Mobile. Try it now. http://mobile.yahoo.com/sports;_ylt=At9_qDKvtAbMuh1G1SQtBI7ntAcJ -------------- next part -------------- An HTML attachment was scrubbed... URL: From David.L.Goldsmith at noaa.gov Fri Nov 16 12:43:08 2007 From: David.L.Goldsmith at noaa.gov (David.Goldsmith) Date: Fri, 16 Nov 2007 09:43:08 -0800 Subject: [SciPy-user] ode on complex systems In-Reply-To: References: <80c99e790711160653v58f2b65o68a52c422e4623fc@mail.gmail.com> <1195226348.24691.2.camel@localhost> Message-ID: <473DD6AC.3070207@noaa.gov> Nils Wagner wrote: > AFAIK, you can also apply the patch which is available > at > > http://projects.scipy.org/scipy/scipy/ticket/334 > > > So you can integrate complex ODE's directly without > doubling the size. 
Please let me know how it works > for you. > Please let us all know. :-) DG > Nils > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From lbolla at gmail.com Fri Nov 16 14:08:22 2007 From: lbolla at gmail.com (lorenzo bolla) Date: Fri, 16 Nov 2007 20:08:22 +0100 Subject: [SciPy-user] ode on complex systems In-Reply-To: <473DD6AC.3070207@noaa.gov> References: <80c99e790711160653v58f2b65o68a52c422e4623fc@mail.gmail.com> <1195226348.24691.2.camel@localhost> <473DD6AC.3070207@noaa.gov> Message-ID: <80c99e790711161108k6fbdd4b4sd1db90932d1771d6@mail.gmail.com> I quickly tried to compile scipy with the patch, but it failed (on cygwin). I had no time to investigate the problem further, but I'll let you know as soon as possible more details. L. PS: for now, I solved the problem by doubling the size of the system as Fabrice pointed out. On Nov 16, 2007 6:43 PM, David.Goldsmith wrote: > Nils Wagner wrote: > > AFAIK, you can also apply the patch which is available > > at > > > > http://projects.scipy.org/scipy/scipy/ticket/334 > > > > > > So you can integrate complex ODE's directly without > > doubling the size. Please let me know how it works > > for you. > > > Please let us all know. :-) > > DG > > > Nils > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From pav at iki.fi Fri Nov 16 15:41:14 2007 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 16 Nov 2007 20:41:14 +0000 (UTC) Subject: [SciPy-user] ode on complex systems References: <80c99e790711160653v58f2b65o68a52c422e4623fc@mail.gmail.com> <1195226348.24691.2.camel@localhost> <473DD6AC.3070207@noaa.gov> <80c99e790711161108k6fbdd4b4sd1db90932d1771d6@mail.gmail.com> Message-ID: Fri, 16 Nov 2007 20:08:22 +0100, lorenzo bolla wrote: > I quickly tried to compile scipy with the patch, but it failed (on > cygwin). I had no time to investigate the problem further, but I'll let > you know as soon as possible more details. > L. This may be a cygwin-related issue, I'd appreciate more information. The patches compile & test fine for me on Ubuntu atop Scipy r3546. (Except that the zvode-test.patch needs a rediff.) -- Pauli Virtanen From yennifersantiago at gmail.com Fri Nov 16 17:51:22 2007 From: yennifersantiago at gmail.com (Yennifer Santiago) Date: Fri, 16 Nov 2007 18:51:22 -0400 Subject: [SciPy-user] Biblioteca_algortimos _geneticos Message-ID: <41bc705b0711161451x15b5a94fvc3f8646bba72cff2@mail.gmail.com> Hello.. I would like to know the name of the genetic algorithms class library inside the scipy-0.6.0 package downloaded from the internet. I need to see which methods this library provides for working with genetic algorithms; I see several folders in there, but none related to genetic algorithms. Thanks Yennifer Santiago From cohen at slac.stanford.edu Fri Nov 16 19:40:40 2007 From: cohen at slac.stanford.edu (Johann Cohen-Tanugi) Date: Fri, 16 Nov 2007 16:40:40 -0800 Subject: [SciPy-user] Biblioteca_algortimos _geneticos In-Reply-To: <41bc705b0711161451x15b5a94fvc3f8646bba72cff2@mail.gmail.com> References: <41bc705b0711161451x15b5a94fvc3f8646bba72cff2@mail.gmail.com> Message-ID: <473E3888.7010505@slac.stanford.edu> Yennifer Santiago wrote: > Hello..
> > I would like to know the name of the genetic algorithms class library > inside the scipy-0.6.0 package downloaded from the internet. I need to > see which methods this library provides for working with genetic > algorithms; I see several folders in there, but none related to > genetic algorithms. > > Thanks > > Yennifer Santiago > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > it's called ga, inside sandbox.... From skraelings001 at gmail.com Fri Nov 16 19:47:38 2007 From: skraelings001 at gmail.com (Reynaldo Baquerizo) Date: Fri, 16 Nov 2007 19:47:38 -0500 Subject: [SciPy-user] Biblioteca_algortimos _geneticos In-Reply-To: <41bc705b0711161451x15b5a94fvc3f8646bba72cff2@mail.gmail.com> References: <41bc705b0711161451x15b5a94fvc3f8646bba72cff2@mail.gmail.com> Message-ID: <473E3A2A.2020601@gmail.com> Yennifer Santiago wrote: > Hello.. > > I would like to know the name of the genetic algorithms class library > inside the scipy-0.6.0 package downloaded from the internet. I need to > see which methods this library provides for working with genetic > algorithms; I see several folders in there, but none related to > genetic algorithms. > > I don't have the exact path, I'm still on scipy-0.5, but you should see a 'sandbox' directory; inside 'sandbox' there is a 'ga' (genetic algorithm) directory. Everything related to genetic algorithms is there, including an example program, example.py. From the Python interpreter (the standard one or ipython) you can import the module with ..
'from scipy import ga' > Thanks > > Yennifer Santiago > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > From aisaac at american.edu Fri Nov 16 22:54:56 2007 From: aisaac at american.edu (Alan G Isaac) Date: Fri, 16 Nov 2007 22:54:56 -0500 Subject: [SciPy-user] adaptive simulated annealing (ASA) In-Reply-To: <473DBFC0.4010105@enthought.com> References: <473D749B.6070308@mur.at><473DBFC0.4010105@enthought.com> Message-ID: On Fri, 16 Nov 2007, Bryan Van de Ven apparently wrote: > I doubt you will find many simulated annealing packages better or more > featured than ASA, so I'd suggest taking a stab at wrapping it yourself. > http://www.igi.tugraz.at/lehre/MLA/WS01/asamin.html (for reference) > http://www.swig.org/ Please keep the list apprised of your efforts. Thank you, Alan Isaac From dwarnold45 at suddenlink.net Sat Nov 17 02:38:49 2007 From: dwarnold45 at suddenlink.net (David Arnold) Date: Fri, 16 Nov 2007 23:38:49 -0800 Subject: [SciPy-user] Instruction in math and science Message-ID: All, We are considering using Python and some of its libraries to aid instruction in mathematics and science at our college. I've read much of the documentation and installation directions for Python and some of its libraries, such as matplotlib, numpy, and scipy. I am aware that there are binary installations for the Mac at: http://www.pythonmac.org/packages/ But there is no binary for scipy at this site for the 2.5.x version of Python. I am also aware of the Enthought distribution for Windows and the installer at: http://code.enthought.com/downloads/enthon-python2.4-1.0.0.exe I am looking for advice from experienced educators who have advised their students on installation on Mac and Windows. These students at our school are unlikely to have the expertise to unzip, configure, compile, etc., and need a "one-click" solution to install the software needed.
Can anyone share their experience with their students and/or give advice? Thanks. David Arnold College of the Redwoods http://online.redwoods.edu/instruct/darnold/ From wdj at usna.edu Sat Nov 17 07:22:41 2007 From: wdj at usna.edu (David Joyner) Date: Sat, 17 Nov 2007 07:22:41 -0500 (EST) Subject: [SciPy-user] Instruction in math and science Message-ID: <20071117072241.AJK54693@mp2.nettest.usna.edu> David: I'd like to suggest Sage, http://www.sagemath.org/ which comes with python 2.5 and scipy http://www.sagemath.org/packages/standard/ and is available in binary for macs and windows http://www.sagemath.org/download.html I've been using it for teaching DEs http://sage.math.washington.edu/home/wdj/teaching/index.html http://cadigweb.ew.usna.edu/%7Ewdj/teach/sm212/ - David Joyner ---- Original message ---- >Date: Fri, 16 Nov 2007 23:38:49 -0800 >From: David Arnold >Subject: [SciPy-user] Instruction in math and science >To: scipy-user at scipy.org > >All, > >We are considering using Python and some of its libraries to aid >instruction in mathematics and science at our college. I've read much >of the documentation and installation directions for python and some >of its libraries, such as matplotlib, numpy, and scipy. > >I am aware that there are binary installations for the Mac at: > >http://www.pythonmac.org/packages/ > >But no binary for scipy at this site for the 2.5x version of python. > >I am also aware of the enthought distribution for windows and the >installer at: > >http://code.enthought.com/downloads/enthon-python2.4-1.0.0.exe > >I am looking for advice from experienced educators who have advised >their students on installation on Mac and Windows. These students at >our school are unlikely to have the expertise to unzip, configure, >compile, etc., and need a "one-click" solution to install the >software needed. > >Can anyone share their experience with their students and/or give >advice? > >Thanks. 
> >David Arnold >College of the Redwoods >http://online.redwoods.edu/instruct/darnold/ >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.org >http://projects.scipy.org/mailman/listinfo/scipy-user From boris at vonloesch.de Sat Nov 17 12:05:05 2007 From: boris at vonloesch.de (Boris von Loesch) Date: Sat, 17 Nov 2007 18:05:05 +0100 Subject: [SciPy-user] Problems compiling Scipy with Cygwin Message-ID: <41a2e2dc0711170905j635a61ofc6b8e6727e3299e@mail.gmail.com> Hello, I tried to compile scipy with cygwin, but the building ended in an error. I use the latest Atlas 3.8.0, and numpy 1.0.4 compiled without problems using python.exe setup.py config --compiler=mingw32 build --compiler=mingw32 bdist_wininst If I build scipy the following error occur: ... building 'scipy.integrate.vode' extension compiling C sources C compiler: gcc -mno-cygwin -O2 -Wall -Wstrict-prototypes compile options: '-DATLAS_INFO="\"?.?.?\"" -Ic:\cygwin\usr\include -Ibuild\src.w in32-2.5 -Ic:\programme\python2.5\lib\site-packages\numpy\core\include -Ic:\prog ramme\python2.5\include -Ic:\programme\python2.5\PC -c' C:\cygwin\bin\g77.exe -g -Wall -mno-cygwin -g -Wall -mno-cygwin -shared build\te mp.win32-2.5\Release\build\src.win32-2.5\scipy\integrate\vodemodule.o build\temp .win32-2.5\Release\build\src.win32-2.5\fortranobject.o -Lc:\cygwin\usr\lib -Lc:\ programme\python2.5\libs -Lc:\programme\python2.5\PCBuild -Lbuild\temp.win32-2.5 -lodepack -llinpack_lite -lmach -lf77blas -lcblas -latlas -lpython25 -lg2c -o b uild\lib.win32-2.5\scipy\integrate\vode.pyd build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\integrate\vodemodule.o:vo demodule.c:(.text+0x367): undefined reference to `__impure_ptr' build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\integrate\vodemodule.o:vo demodule.c:(.text+0x462): undefined reference to `__impure_ptr' build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\integrate\vodemodule.o:vo demodule.c:(.text+0x585): undefined 
reference to `__impure_ptr' build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\integrate\vodemodule.o:vo demodule.c:(.text+0x9b1): undefined reference to `__impure_ptr' build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\integrate\vodemodule.o:vo demodule.c:(.text+0xb23): undefined reference to `__impure_ptr' build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\integrate\vodemodule.o:vo demodule.c:(.text+0xb79): more undefined references to `__impure_ptr' follow build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\integrate\vodemodule.o:vo demodule.c:(.text+0x18f4): undefined reference to `_setjmp' build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\integrate\vodemodule.o:vo demodule.c:(.text+0x1ac5): undefined reference to `_setjmp' collect2: ld returned 1 exit status error: Command "C:\cygwin\bin\g77.exe -g -Wall -mno-cygwin -g -Wall -mno-cygwin -shared build\temp.win32-2.5\Release\build\src.win32-2.5\scipy\integrate\vodemod ule.o build\temp.win32-2.5\Release\build\src.win32-2.5\fortranobject.o -Lc:\cygw in\usr\lib -Lc:\programme\python2.5\libs -Lc:\programme\python2.5\PCBuild -Lbuil d\temp.win32-2.5 -lodepack -llinpack_lite -lmach -lf77blas -lcblas -latlas -lpyt hon25 -lg2c -o build\lib.win32-2.5\scipy\integrate\vode.pyd" failed with exit st atus 1 I also tried to build scipy with "-fno-second-underscore" enabled via F77FLAGS, but that doesn't change anything. Does anybody know how to fix this? Best Regards, Boris From alan at ajackson.org Sat Nov 17 16:30:10 2007 From: alan at ajackson.org (Alan Jackson) Date: Sat, 17 Nov 2007 15:30:10 -0600 Subject: [SciPy-user] 2D arrays in weave.inline Message-ID: <20071117153010.6019d1df@nova.oplnk.net> This is probably an astonishingly simple question, but I've been struggling for some time with it. I am trying to work with weave for the first time, and it is becoming clear that I don't understand how 2D arrays get passed. The example below illustrates the problem I'm having... 
the 2D indexing doesn't seem to be working right at all. BTW the link to the weave documentation is a dead link. Is there any documentation beyond a couple of short examples? Every promising google link turns up dead. #!/usr/bin/env python from numpy import array, zeros, abs, ones, arange, ravel, nan, empty, isnan, histogram, hstack, multiply, log, shape, triu import scipy.weave as weave from scipy.weave import converters def Matrix(width, minval, maxval, exponent, type="HorseSaddle"): halfSqrt2 = 0.707106781185 sin45 = halfSqrt2 cos45 = halfSqrt2 Matrix = zeros((width, width), dtype=float) for j in range(0,width): jScaled = j / (halfSqrt2 * (width-1)) jTranslated = jScaled - halfSqrt2 jsin45 = jTranslated * sin45 jcos45 = jsin45 for i in range(j,width): iScaled = i / (halfSqrt2 * (width-1)); iTranslated = iScaled - halfSqrt2; iRotated = iTranslated * cos45 + jsin45; jRotated = - iTranslated * sin45 + jcos45; iRotated = abs(iRotated); jRotated = abs(jRotated); Matrix[i, j] = round(maxval * iRotated**exponent + \ minval * jRotated**exponent, 2) Matrix[j,i] = Matrix[i,j] return Matrix ################################################################################ # algorithm - WEAVE version ################################################################################ def Fast(s1, s2, wt1, wt2, M): seq1 = s1[0] seq2 = s2[0] t1 = s1[1] t2 = s2[1] code = ''' for (unsigned int idx2 = 0; idx2 < 10; ++idx2) { printf("\\n"); for (unsigned int idx1 = 0; idx1 < 10; ++idx1) { printf("%6f ", M[idx1, idx2]); }// for whole s1 }// for whole s2 printf("\\n"); ''' weave.inline(code, ["seq1", "seq2", "t1", "t2", "M" ]) ################################################################################ # test section ################################################################################ if __name__ == '__main__' : attr1 = array([1,2,2,3,2,4,3,5,4,3,3, 3,4,6,1,0,2,4,2,5,2,3]) attr2 = array([1,2,2,3,2,5,3,5,4,3,2,6, 3,4,6,1,0,1,4,2,5,2,3]) t = arange(1, len(attr1)+1, 
dtype=float) wt1 = ones(len(attr1), dtype=float) t2 = arange(1, len(attr2)+1, dtype=float) wt2 = ones(len(attr2), dtype=float) seq1 = [attr1, t] seq2 = [attr2, t2] M = Matrix(7,-10,10,2.) print M s1s2, s2s1 = Fast(seq1, seq2, wt1, wt2, M ) -- ----------------------------------------------------------------------- | Alan K. Jackson | To see a World in a Grain of Sand | | alan at ajackson.org | And a Heaven in a Wild Flower, | | www.ajackson.org | Hold Infinity in the palm of your hand | | Houston, Texas | And Eternity in an hour. - Blake | ----------------------------------------------------------------------- From gael.varoquaux at normalesup.org Sat Nov 17 16:46:59 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sat, 17 Nov 2007 22:46:59 +0100 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <20071117153010.6019d1df@nova.oplnk.net> References: <20071117153010.6019d1df@nova.oplnk.net> Message-ID: <20071117214659.GE12125@clipper.ens.fr> On Sat, Nov 17, 2007 at 03:30:10PM -0600, Alan Jackson wrote: > This is probably an astonishingly simple question, but I've been > struggling for some time with it. > I am trying to work with weave for the first time, and it is becoming > clear that I don't understand how 2D arrays get passed. Me neither. This is why I use the blitz converter. I modified your example to use the blitz converters, it is just so much nicer: ++++++++++++++++++++++++++++++++++++++++++++ def Fast(s1, s2, wt1, wt2, M): seq1 = s1[0] seq2 = s2[0] t1 = s1[1] t2 = s2[1] code = ''' for (int idx2 = 0; idx2 < 10; ++idx2) { printf("\\n"); for (int idx1 = 0; idx1 < 10; ++idx1) { printf("%6f ", M(idx1, idx2)); }// for whole s1 }// for whole s2 printf("\\n"); ''' weave.inline(code, ["seq1", "seq2", "t1", "t2", "M" ], type_converters=converters.blitz, ) ++++++++++++++++++++++++++++++++++++++++++++ > BTW the link to the weave documentation is a dead link. Is there any > documentation beyond a couple of short examples?
Every promising google link turns up dead. There have been a few threads on this mailing list. I also think http://scipy.org/PerformancePython#head-a3f4dd816378d3ba4cbdd3d23dc98529e8ad7087 can be useful. Gaël From william.ratcliff at gmail.com Sat Nov 17 17:16:55 2007 From: william.ratcliff at gmail.com (william ratcliff) Date: Sat, 17 Nov 2007 17:16:55 -0500 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <20071117214659.GE12125@clipper.ens.fr> References: <20071117153010.6019d1df@nova.oplnk.net> <20071117214659.GE12125@clipper.ens.fr> Message-ID: <827183970711171416v35880e81h1d377b155fca1bde@mail.gmail.com> Gael, have you tried weave with openmp? Cheers, William On Nov 17, 2007 4:46 PM, Gael Varoquaux wrote: > On Sat, Nov 17, 2007 at 03:30:10PM -0600, Alan Jackson wrote: > > This is probably an astonishingly simple question, but I've been > > struggling for some time with it. > > > > I am trying to work with weave for the first time, and it is becoming > > clear that I don't understand how 2D arrays get passed. > > > > Me neither. This is why I use the blitz converter. I modified your > example to use the blitz converters, it is just so much nicer: > > ++++++++++++++++++++++++++++++++++++++++++++ > def Fast(s1, s2, wt1, wt2, M): > > seq1 = s1[0] > seq2 = s2[0] > t1 = s1[1] > t2 = s2[1] > > code = ''' > > > for (int idx2 = 0; idx2 < 10; ++idx2) { > printf("\\n"); > for (int idx1 = 0; idx1 < 10; ++idx1) { > printf("%6f ", M(idx1, idx2)); > > }// for whole s1 > }// for whole s2 > printf("\\n"); > ''' > > weave.inline(code, ["seq1", "seq2", "t1", "t2", "M" ], > type_converters=converters.blitz, > ) > ++++++++++++++++++++++++++++++++++++++++++++ > > > > BTW the link to the weave documentation is a dead link. Is there any > > documentation beyond a couple of short examples? Every promising google > > link turns up dead. > > There have been a few threads on this mailing list.
I also think > http://scipy.org/PerformancePython#head-a3f4dd816378d3ba4cbdd3d23dc98529e8ad7087 > can be useful. > > Gaël > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From gael.varoquaux at normalesup.org Sat Nov 17 17:20:26 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sat, 17 Nov 2007 23:20:26 +0100 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <827183970711171416v35880e81h1d377b155fca1bde@mail.gmail.com> References: <20071117153010.6019d1df@nova.oplnk.net> <20071117214659.GE12125@clipper.ens.fr> <827183970711171416v35880e81h1d377b155fca1bde@mail.gmail.com> Message-ID: <20071117222026.GG12125@clipper.ens.fr> On Sat, Nov 17, 2007 at 05:16:55PM -0500, william ratcliff wrote: > Gael, have you tried weave with openmp? No, and I would be interested to find out if it works or not. But if it doesn't, I don't see why ctypes would not (maybe I am missing something, though, my knowledge of dynamic loading is very thin). Gaël From alan at ajackson.org Sat Nov 17 17:44:52 2007 From: alan at ajackson.org (Alan Jackson) Date: Sat, 17 Nov 2007 16:44:52 -0600 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <20071117214659.GE12125@clipper.ens.fr> References: <20071117153010.6019d1df@nova.oplnk.net> <20071117214659.GE12125@clipper.ens.fr> Message-ID: <20071117164452.4d0ca4c7@nova.oplnk.net> On Sat, 17 Nov 2007 22:46:59 +0100 Gael Varoquaux wrote: > On Sat, Nov 17, 2007 at 03:30:10PM -0600, Alan Jackson wrote: > > This is probably an astonishingly simple question, but I've been > > struggling for some time with it. > > > I am trying to work with weave for the first time, and it is becoming > > clear that I don't understand how 2D arrays get passed. > > Me neither. This is why I use the blitz converter.
I modified your > example to use the blitz converters, it is just so much nicer: > Well, with my *real* code, blitz gave me several pages of errors - I tried treating the 2D arrays as a 1D array, doing all the index arithmetic myself, and that seems to work. Which strikes me as odd... -- ----------------------------------------------------------------------- | Alan K. Jackson | To see a World in a Grain of Sand | | alan at ajackson.org | And a Heaven in a Wild Flower, | | www.ajackson.org | Hold Infinity in the palm of your hand | | Houston, Texas | And Eternity in an hour. - Blake | ----------------------------------------------------------------------- From stefan at sun.ac.za Sat Nov 17 18:13:44 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Sun, 18 Nov 2007 01:13:44 +0200 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <20071117164452.4d0ca4c7@nova.oplnk.net> References: <20071117153010.6019d1df@nova.oplnk.net> <20071117214659.GE12125@clipper.ens.fr> <20071117164452.4d0ca4c7@nova.oplnk.net> Message-ID: <20071117231344.GC26080@mentat.za.net> On Sat, Nov 17, 2007 at 04:44:52PM -0600, Alan Jackson wrote: > On Sat, 17 Nov 2007 22:46:59 +0100 > Well, with my *real* code, blitz gave me several pages of errors - I tried > treating the 2D arrays as a 1D array, doing all the index arithmetic myself, > and that seems to work. Which strikes me as odd... With Blitz, you are allowed M(i,j) indexing. Otherwise, only the memory address of the data is passed and, as you noticed, you must calculate the offset yourself, i.e. M[i*nr_cols + j]. I modified your code to work with ctypes, which you may find interesting. Use python setup.py build_ext -i to build the C file. Then run alan.py. Cheers Stéfan -------------- next part -------------- A non-text attachment was scrubbed...
Name: alan.c Type: text/x-csrc Size: 243 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: alan.py Type: text/x-python Size: 1973 bytes Desc: not available URL: From william.ratcliff at gmail.com Sat Nov 17 18:16:26 2007 From: william.ratcliff at gmail.com (william ratcliff) Date: Sat, 17 Nov 2007 18:16:26 -0500 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <20071117164452.4d0ca4c7@nova.oplnk.net> References: <20071117153010.6019d1df@nova.oplnk.net> <20071117214659.GE12125@clipper.ens.fr> <20071117164452.4d0ca4c7@nova.oplnk.net> Message-ID: <827183970711171516n5d439908w55bb68c70abcb4f7@mail.gmail.com> I had good luck with it using blitz type converters (though if there's a bug in the C code, it's painful to find ;>). Did you look at the array3d example that came with the code? So far, I've been able to pass 1,2, and 3d arrays successfully. But, I've been cheating and creating the arrays that I want returned in numpy before calling the routine in weave and passing them in as arguments. I modify them within weave, and then check the results when weave terminates to make sure they have the values I think they should have in numpy. Then I didn't have to work on figuring out return types, allocating memory,etc. I also use blitz type converters. What kind of errors are you getting? When I was writing my first program using weave, I got tons of errors--even if it was something as minor as forgetting a ";" in one of the lines of code--one helpful thing to do is to find the temporary directory where weave stores the c++ versions of your code and deleting it--there should be two directories. However, if it's not a simple question of stale code lying around, then wade through the pages of output and try to find out whether there is a simple bug in your C code. 
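The flat-offset arithmetic mentioned earlier in the thread (M[i*nr_cols + j]) is just row-major indexing, and it can be sanity-checked from numpy alone; a small sketch, independent of weave:

```python
import numpy as np

M = np.arange(12.0).reshape(3, 4)   # 3 rows, 4 columns, C (row-major) order
flat = M.ravel()                    # 1-D view over the same buffer

nr_cols = M.shape[1]
for i in range(M.shape[0]):
    for j in range(nr_cols):
        # without blitz, the C code sees only the flat buffer, so the
        # 2-D element (i, j) lives at offset i*nr_cols + j
        assert flat[i * nr_cols + j] == M[i, j]
```

This is also why `M[idx1, idx2]` inside plain inline C misbehaves: the C comma operator evaluates to `idx2` alone, so only one index is used, whereas `M(idx1, idx2)` with the blitz converters does genuine 2-D indexing.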
The next possible source of error could be a compiler issue--for example, I normally use an older version of gcc (I forget, something like 3.4) and upgraded to version 4.x (one of the most recent) and found that it no longer worked. Is the real code involved something simple enough that you can post it? Cheers, William On Nov 17, 2007 5:44 PM, Alan Jackson wrote: > On Sat, 17 Nov 2007 22:46:59 +0100 > Gael Varoquaux wrote: > > > On Sat, Nov 17, 2007 at 03:30:10PM -0600, Alan Jackson wrote: > > > This is probably an astonishingly simple question, but I've been > > > struggling for some time with it. > > > > > I am trying to work with weave for the first time, and it is becoming > > > clear that I don't understand how 2D arrays get passed. > > > > Me neither. This is why I use the blitz converter. I modified your > > example to use the vlitz converters, it is just so much nicer: > > > > Well, with my *real* code, blitz gave me several pages of errors - I tried > treating the 2D arrays as a 1D array, doing all the index arithmetic myself, > and that seems to work. Which strikes me as odd... > > -- > ----------------------------------------------------------------------- > | Alan K. Jackson | To see a World in a Grain of Sand | > | alan at ajackson.org | And a Heaven in a Wild Flower, | > | www.ajackson.org | Hold Infinity in the palm of your hand | > | Houston, Texas | And Eternity in an hour. 
- Blake | > ----------------------------------------------------------------------- > _______________________________________________ > > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From alan at ajackson.org Sat Nov 17 18:30:18 2007 From: alan at ajackson.org (Alan Jackson) Date: Sat, 17 Nov 2007 17:30:18 -0600 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <827183970711171516n5d439908w55bb68c70abcb4f7@mail.gmail.com> References: <20071117153010.6019d1df@nova.oplnk.net> <20071117214659.GE12125@clipper.ens.fr> <20071117164452.4d0ca4c7@nova.oplnk.net> <827183970711171516n5d439908w55bb68c70abcb4f7@mail.gmail.com> Message-ID: <20071117173018.582e94a7@nova.oplnk.net> Thanks all for your help. The path I chose was to redo my indexing - that was the path of least resistance. I'm not sure where the blitz errors came from - the same code compiled clean with inline, so that is a mystery (I did modify the arrays for blitz before compiling). I also created all the arrays I wanted returned in numpy first - which makes for a really long function call, but it works and it is not too awkward. I got a speed up of x37 on my real test. On Sat, 17 Nov 2007 18:16:26 -0500 "william ratcliff" wrote: > I had good luck with it using blitz type converters (though if there's > a bug in the C code, it's painful to find ;>). Did you look at the > array3d example that came with the code? So far, I've been able to > pass 1,2, and 3d arrays successfully. But, I've been cheating and > creating the arrays that I want returned in numpy before calling the > routine in weave and passing them in as arguments. I modify them > within weave, and then check the results when weave terminates to make > sure they have the values I think they should have in numpy. Then I > didn't have to work on figuring out return types, allocating > memory,etc. I also use blitz type converters. > > What kind of errors are you getting? 
When I was writing my first > program using weave, I got tons of errors--even if it was something as > minor as forgetting a ";" in one of the lines of code--one helpful > thing to do is to find the temporary directory where weave stores the > c++ versions of your code and deleting it--there should be two > directories. However, if it's not a simple question of stale code > lying around, then wade through the pages of output and try to find > out whether there is a simple bug in your C code. The next possible > source of error could be a compiler issue--for example, I normally use > an older version of gcc (I forget, something like 3.4) and upgraded to > version 4.x (one of the most recent) and found that it no longer > worked. Is the real code involved something simple enough that you > can post it? > > Cheers, > William > > On Nov 17, 2007 5:44 PM, Alan Jackson wrote: > > On Sat, 17 Nov 2007 22:46:59 +0100 > > Gael Varoquaux wrote: > > > > > On Sat, Nov 17, 2007 at 03:30:10PM -0600, Alan Jackson wrote: > > > > This is probably an astonishingly simple question, but I've been > > > > struggling for some time with it. > > > > > > > I am trying to work with weave for the first time, and it is becoming > > > > clear that I don't understand how 2D arrays get passed. > > > > > > Me neither. This is why I use the blitz converter. I modified your > > > example to use the vlitz converters, it is just so much nicer: > > > > > > > Well, with my *real* code, blitz gave me several pages of errors - I tried > > treating the 2D arrays as a 1D array, doing all the index arithmetic myself, > > and that seems to work. Which strikes me as odd... > > > > -- > > ----------------------------------------------------------------------- > > | Alan K. Jackson | To see a World in a Grain of Sand | > > | alan at ajackson.org | And a Heaven in a Wild Flower, | > > | www.ajackson.org | Hold Infinity in the palm of your hand | > > | Houston, Texas | And Eternity in an hour. 
- Blake | > > ----------------------------------------------------------------------- > > _______________________________________________ > > > > SciPy-user mailing list > > SciPy-user at scipy.org > > http://projects.scipy.org/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user -- ----------------------------------------------------------------------- | Alan K. Jackson | To see a World in a Grain of Sand | | alan at ajackson.org | And a Heaven in a Wild Flower, | | www.ajackson.org | Hold Infinity in the palm of your hand | | Houston, Texas | And Eternity in an hour. - Blake | ----------------------------------------------------------------------- From gael.varoquaux at normalesup.org Sat Nov 17 18:33:06 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sun, 18 Nov 2007 00:33:06 +0100 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <20071117164452.4d0ca4c7@nova.oplnk.net> References: <20071117153010.6019d1df@nova.oplnk.net> <20071117214659.GE12125@clipper.ens.fr> <20071117164452.4d0ca4c7@nova.oplnk.net> Message-ID: <20071117233306.GH12125@clipper.ens.fr> On Sat, Nov 17, 2007 at 04:44:52PM -0600, Alan Jackson wrote: > Well, with my *real* code, blitz gave me several pages of errors If you are doing complicated things, you should really use another technique to call C from Python. Ctypes is pretty easy to use (http://www.scipy.org/Cookbook/Ctypes). Debug is also much easier, because it is easier to understand the compiler's error. 
Gaël From gael.varoquaux at normalesup.org Sat Nov 17 18:36:06 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sun, 18 Nov 2007 00:36:06 +0100 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <20071117173018.582e94a7@nova.oplnk.net> References: <20071117153010.6019d1df@nova.oplnk.net> <20071117214659.GE12125@clipper.ens.fr> <20071117164452.4d0ca4c7@nova.oplnk.net> <827183970711171516n5d439908w55bb68c70abcb4f7@mail.gmail.com> <20071117173018.582e94a7@nova.oplnk.net> Message-ID: <20071117233606.GI12125@clipper.ens.fr> On Sat, Nov 17, 2007 at 05:30:18PM -0600, Alan Jackson wrote: > I'm not sure where the blitz errors came from - the same code compiled > clean with inline, so that is a mystery (I did modify the arrays for > blitz before compiling). Just checking: you did replace the "M[x, y]" by "M(x, y)" in all your code? > I also created all the arrays I wanted returned in numpy first - which > makes for a really long function call, but it works and it is not too > awkward. I always do that. It removes all memory management from the C level. That's one less hurdle. I usually wrap my function in another one, to hide this ugly function call, though. > I got a speed up of x37 on my real test. Nice !
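The pattern described above, allocating every output array in numpy, passing it in, and hiding the long call behind a wrapper, can be sketched without weave; a plain Python function (`_kernel`, a hypothetical name) stands in for the compiled routine:

```python
import numpy as np

def _kernel(inp, out_sum, out_sq):
    # Stand-in for the weave/C routine: it only writes into arrays the
    # caller allocated, so no memory management happens at the C level.
    np.add(inp, inp, out=out_sum)
    np.multiply(inp, inp, out=out_sq)

def doubled_and_squared(inp):
    # The wrapper that hides the "really long function call": it
    # allocates the outputs in numpy and forwards them to the kernel.
    out_sum = np.empty_like(inp)
    out_sq = np.empty_like(inp)
    _kernel(inp, out_sum, out_sq)
    return out_sum, out_sq

a = np.arange(4.0)
doubled, squared = doubled_and_squared(a)
```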
Ga?l From prabhu at aero.iitb.ac.in Sat Nov 17 20:13:43 2007 From: prabhu at aero.iitb.ac.in (Prabhu Ramachandran) Date: Sun, 18 Nov 2007 06:43:43 +0530 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <20071117231344.GC26080@mentat.za.net> References: <20071117153010.6019d1df@nova.oplnk.net> <20071117214659.GE12125@clipper.ens.fr> <20071117164452.4d0ca4c7@nova.oplnk.net> <20071117231344.GC26080@mentat.za.net> Message-ID: <473F91C7.3030202@aero.iitb.ac.in> Stefan van der Walt wrote: > On Sat, Nov 17, 2007 at 04:44:52PM -0600, Alan Jackson wrote: >> On Sat, 17 Nov 2007 22:46:59 +0100 >> Well, with my *real* code, blitz gave me several pages of errors - I tried >> treating the 2D arrays as a 1D array, doing all the index arithmetic myself, >> and that seems to work. Which strikes me as odd... > > With Blitz, you are allowed M(i,j) indexing. Otherwise, only the > memory address of the data is passed and, as you noticed, you must > calculate the offset yourself, i.e. M[i*nr_cols + j]. Travis added a few macros to automate this calculation (for non-blitz cases). If you passed in 'a' then A1(i), A2(i, j) A3(i, j, k) and A4(i, j, k, l) are all defined. 
Here are the macros:

#define A1(i) (*((long*)(a_array->data + (i)*Sa[0])))
#define A2(i,j) (*((long*)(a_array->data + (i)*Sa[0] + (j)*Sa[1])))
#define A3(i,j,k) (*((long*)(a_array->data + (i)*Sa[0] + (j)*Sa[1] + (k)*Sa[2])))
#define A4(i,j,k,l) (*((long*)(a_array->data + (i)*Sa[0] + (j)*Sa[1] + (k)*Sa[2] + (l)*Sa[3])))

cheers, prabhu From alan at ajackson.org Sat Nov 17 22:16:34 2007 From: alan at ajackson.org (Alan Jackson) Date: Sat, 17 Nov 2007 21:16:34 -0600 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <20071117233606.GI12125@clipper.ens.fr> References: <20071117153010.6019d1df@nova.oplnk.net> <20071117214659.GE12125@clipper.ens.fr> <20071117164452.4d0ca4c7@nova.oplnk.net> <827183970711171516n5d439908w55bb68c70abcb4f7@mail.gmail.com> <20071117173018.582e94a7@nova.oplnk.net> <20071117233606.GI12125@clipper.ens.fr> Message-ID: <20071117211634.7cd27b9a@nova.oplnk.net> On Sun, 18 Nov 2007 00:36:06 +0100 Gael Varoquaux wrote: > On Sat, Nov 17, 2007 at 05:30:18PM -0600, Alan Jackson wrote: > > I'm not sure where the blitz errors came from - the same code compiled > > clean with inline, so that is a mystery (I did modify the arrays for > > blitz before compiling). > > Just checking: you did replace the "M[x, y]" by "M(x, y)" in all your > code? Yep... > > > I also created all the arrays I wanted returned in numpy first - which > > makes for a really long function call, but it works and it is not too > > awkward. > > I always do that. It removes all memory management from the C level. > Tht's one less hurdle. I usual wrap my function in a another one, to hide > this ugly function call, though. > > > I got a speed up of x37 on my real test. > > Nice ! > > Ga?l > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user -- ----------------------------------------------------------------------- | Alan K.
Jackson | To see a World in a Grain of Sand | | alan at ajackson.org | And a Heaven in a Wild Flower, | | www.ajackson.org | Hold Infinity in the palm of your hand | | Houston, Texas | And Eternity in an hour. - Blake | ----------------------------------------------------------------------- From alan at ajackson.org Sat Nov 17 22:17:39 2007 From: alan at ajackson.org (Alan Jackson) Date: Sat, 17 Nov 2007 21:17:39 -0600 Subject: [SciPy-user] 2D arrays in weave.inline In-Reply-To: <20071117233306.GH12125@clipper.ens.fr> References: <20071117153010.6019d1df@nova.oplnk.net> <20071117214659.GE12125@clipper.ens.fr> <20071117164452.4d0ca4c7@nova.oplnk.net> <20071117233306.GH12125@clipper.ens.fr> Message-ID: <20071117211739.00075255@nova.oplnk.net> I'll have to look into it... On Sun, 18 Nov 2007 00:33:06 +0100 Gael Varoquaux wrote: > On Sat, Nov 17, 2007 at 04:44:52PM -0600, Alan Jackson wrote: > > Well, with my *real* code, blitz gave me several pages of errors > > If you are doing complicated things, you should really use another > technique to call C from Python. Ctypes is pretty easy to use > (http://www.scipy.org/Cookbook/Ctypes). Debug is also much easier, > because it is easier to understand the compiler's error. > > Ga?l > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user -- ----------------------------------------------------------------------- | Alan K. Jackson | To see a World in a Grain of Sand | | alan at ajackson.org | And a Heaven in a Wild Flower, | | www.ajackson.org | Hold Infinity in the palm of your hand | | Houston, Texas | And Eternity in an hour. 
- Blake | ----------------------------------------------------------------------- From dwarnold45 at suddenlink.net Sun Nov 18 00:06:10 2007 From: dwarnold45 at suddenlink.net (David Arnold) Date: Sat, 17 Nov 2007 21:06:10 -0800 Subject: [SciPy-user] Installing numpy Message-ID: All, Mac OS X 10.4.11 PPC. Installed python from this: http://www.python.org/ftp/python/2.5.1/python-2.5.1-macosx.dmg Installation finishes with "There were errors installing the software. Please try installing again." Installing again makes no difference, but Python appears to be running: darnold $ python Python 2.5.1 (r251:54869, Apr 18 2007, 22:08:04) [GCC 4.0.1 (Apple Computer, Inc. build 5367)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> Downloaded numpy from: http://downloads.sourceforge.net/numpy/ numpy-1.0.4.tar.gz?use_mirror=internap After unpacking and changing directories, simply did this: numpy-1.0.4 $ python setup.py install Seem to finish OK. But testing according to README.txt resulted in: numpy-1.0.4 $ python -c 'import numpy; numpy.test()' Running from numpy source directory. Traceback (most recent call last): File "", line 1, in AttributeError: 'module' object has no attribute 'test' Any ideas or advice? Thanks David. From dwarnold45 at suddenlink.net Sun Nov 18 00:27:14 2007 From: dwarnold45 at suddenlink.net (David Arnold) Date: Sat, 17 Nov 2007 21:27:14 -0800 Subject: [SciPy-user] Installing Scipy Message-ID: <4F98C7F5-CA90-4568-839D-0CD8D9295A4A@suddenlink.net> All, Mac OS X 10.4.11 PPC. Installed python from this: http://www.python.org/ftp/python/2.5.1/python-2.5.1-macosx.dmg Installation finishes with "There were errors installing the software. Please try installing again." Installing again makes no difference, but Python appears to be running: scipy-0.6.0 $ python Python 2.5.1 (r251:54869, Apr 18 2007, 22:08:04) [GCC 4.0.1 (Apple Computer, Inc. 
build 5367)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> Downloaded scipy from: http://prdownloads.sourceforge.net/scipy/ scipy-0.6.0.tar.gz?download After unpacking and changing directories, simply did this: scipy-0.6.0 $ sudo python setup.py install Seem to finish OK. Lots of "Warnings.' But testing according to INSTALL.txt resulted in: scipy-0.6.0 $ python Python 2.5.1 (r251:54869, Apr 18 2007, 22:08:04) [GCC 4.0.1 (Apple Computer, Inc. build 5367)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> import scipy Traceback (most recent call last): File "", line 1, in File "scipy/__init__.py", line 54, in from __config__ import show as show_config ImportError: No module named __config__ Any ideas or advice? Thanks David. From robert.kern at gmail.com Sun Nov 18 00:41:59 2007 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 17 Nov 2007 23:41:59 -0600 Subject: [SciPy-user] Installing numpy In-Reply-To: References: Message-ID: <473FD0A7.3030404@gmail.com> David Arnold wrote: > All, > > Mac OS X 10.4.11 PPC. > > Installed python from this: > > http://www.python.org/ftp/python/2.5.1/python-2.5.1-macosx.dmg > > Installation finishes with "There were errors installing the > software. Please try installing again." Installing again makes no > difference, but Python appears to be running: > > darnold $ python > Python 2.5.1 (r251:54869, Apr 18 2007, 22:08:04) > [GCC 4.0.1 (Apple Computer, Inc. build 5367)] on darwin > Type "help", "copyright", "credits" or "license" for more information. > >>> > > Downloaded numpy from: http://downloads.sourceforge.net/numpy/ > numpy-1.0.4.tar.gz?use_mirror=internap > > After unpacking and changing directories, simply did this: > > numpy-1.0.4 $ python setup.py install > > Seem to finish OK. But testing according to README.txt resulted in: > > numpy-1.0.4 $ python -c 'import numpy; numpy.test()' > Running from numpy source directory. 
> Traceback (most recent call last): > File "", line 1, in > AttributeError: 'module' object has no attribute 'test' > > Any ideas or advice? Change to another directory. The numpy/ directory in the source tree is getting picked up instead of the installed one. Same thing with scipy. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From dwarnold45 at suddenlink.net Sun Nov 18 01:02:39 2007 From: dwarnold45 at suddenlink.net (David Arnold) Date: Sat, 17 Nov 2007 22:02:39 -0800 Subject: [SciPy-user] Installing numpy In-Reply-To: <473FD0A7.3030404@gmail.com> References: <473FD0A7.3030404@gmail.com> Message-ID: <7F5B3638-3AF9-4181-9A52-7908DAB7A425@suddenlink.net> Wow. Good catch. Switching directories: Ran 692 tests in 1.211s >>>import numpy >>>numpy.test() ... OK >>> Not so lucky with scipy: darnold $ python Python 2.5.1 (r251:54869, Apr 18 2007, 22:08:04) [GCC 4.0.1 (Apple Computer, Inc. build 5367)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> import scipy s>>> scipy.test(level=1) .... Ran 1716 tests in 6.949s FAILED (failures=2) >>> Along the way, I got: **************************************************************** WARNING: clapack module is empty ----------- See scipy/INSTALL.txt for troubleshooting. Notes: * If atlas library is not found by numpy/distutils/system_info.py, then scipy uses flapack instead of clapack. 
**************************************************************** And: Warning: FAILURE importing tests for /Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site- packages/scipy/linsolve/umfpack/tests/test_umfpack.py:17: AttributeError: 'module' object has no attribute 'umfpack' (in ) Warning: FAILURE importing tests for And: **************************************************************** WARNING: cblas module is empty ----------- See scipy/INSTALL.txt for troubleshooting. Notes: * If atlas library is not found by numpy/distutils/system_info.py, then scipy uses fblas instead of cblas. **************************************************************** And failures: ====================================================================== FAIL: check loadmat case sparse ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.5/lib/ python2.5/site-packages/scipy/io/tests/test_mio.py", line 85, in cc self._check_case(name, files, expected) File "/Library/Frameworks/Python.framework/Versions/2.5/lib/ python2.5/site-packages/scipy/io/tests/test_mio.py", line 80, in _check_case self._check_level(k_label, expected, matdict[k]) File "/Library/Frameworks/Python.framework/Versions/2.5/lib/ python2.5/site-packages/scipy/io/tests/test_mio.py", line 63, in _check_level decimal = 5) File "/Library/Frameworks/Python.framework/Versions/2.5/lib/ python2.5/site-packages/numpy/testing/utils.py", line 232, in assert_array_almost_equal header='Arrays are not almost equal') File "/Library/Frameworks/Python.framework/Versions/2.5/lib/ python2.5/site-packages/numpy/testing/utils.py", line 217, in assert_array_compare assert cond, msg AssertionError: Arrays are not almost equal test sparse; file /Library/Frameworks/Python.framework/Versions/2.5/ lib/python2.5/site-packages/scipy/io/tests/./data/ testsparse_6.5.1_GLNX86.mat, variable testsparse (mismatch 46.6666666667%) x: 
array([[ 3.03865194e-319, 3.16202013e-322, 1.04346664e-320, 2.05531309e-320, 2.56123631e-320], [ 3.16202013e-322, 0.00000000e+000, 0.00000000e+000,... y: array([[ 1., 2., 3., 4., 5.], [ 2., 0., 0., 0., 0.], [ 3., 0., 0., 0., 0.]]) ====================================================================== FAIL: check_dot (scipy.lib.tests.test_blas.test_fblas1_simple) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.5/lib/ python2.5/site-packages/scipy/lib/blas/tests/test_blas.py", line 76, in check_dot assert_almost_equal(f([3j,-4,3-4j],[2,3,1]),-9+2j) File "/Library/Frameworks/Python.framework/Versions/2.5/lib/ python2.5/site-packages/numpy/testing/utils.py", line 158, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: ACTUAL: (-1.9988369941711426+4.5782649454569436e-37j) DESIRED: (-9+2j) I am too much of a neophyte to know what to do at this point. Any help will be appreciated. D. On Nov 17, 2007, at 9:41 PM, Robert Kern wrote: > David Arnold wrote: >> All, >> >> Mac OS X 10.4.11 PPC. >> >> Installed python from this: >> >> http://www.python.org/ftp/python/2.5.1/python-2.5.1-macosx.dmg >> >> Installation finishes with "There were errors installing the >> software. Please try installing again." Installing again makes no >> difference, but Python appears to be running: >> >> darnold $ python >> Python 2.5.1 (r251:54869, Apr 18 2007, 22:08:04) >> [GCC 4.0.1 (Apple Computer, Inc. build 5367)] on darwin >> Type "help", "copyright", "credits" or "license" for more >> information. >>>>> >> >> Downloaded numpy from: http://downloads.sourceforge.net/numpy/ >> numpy-1.0.4.tar.gz?use_mirror=internap >> >> After unpacking and changing directories, simply did this: >> >> numpy-1.0.4 $ python setup.py install >> >> Seem to finish OK. 
But testing according to README.txt resulted in: >> >> numpy-1.0.4 $ python -c 'import numpy; numpy.test()' >> Running from numpy source directory. >> Traceback (most recent call last): >> File "", line 1, in >> AttributeError: 'module' object has no attribute 'test' >> >> Any ideas or advice? > > Change to another directory. The numpy/ directory in the source > tree is getting > picked up instead of the installed one. Same thing with scipy. > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a > harmless enigma > that is made terrible by our own mad attempt to interpret it as > though it had > an underlying truth." > -- Umberto Eco > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From youngsu999 at gmail.com Sun Nov 18 05:50:21 2007 From: youngsu999 at gmail.com (youngsu park) Date: Sun, 18 Nov 2007 19:50:21 +0900 Subject: [SciPy-user] scipy 0.6.0 ImportError: cannot import name cscmux Message-ID: <338357900711180250u1fd08b86ide9a504d664e0cc1@mail.gmail.com> Hi. Im Youngsu-park ,in POSTECH in south korea. I had same problem on scipy importing like: from scipy import * . So I looked at the scipy/sparse folder, "C:\Python25\Lib\site-packages\scipy\sparse". To find the reason of error, I open the "C:\Python25\Lib\site-packages\scipy\sparse\sparse.py", file. 
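Both failure modes in these threads, a source tree shadowing the installed package and a stale compiled extension (.pyd/.so) shadowing a .py module, can be diagnosed the same way: check which file Python actually imported. A generic sketch, using a stdlib module as a stand-in:

```python
import importlib

# If the reported path points into the source tree (or at a leftover
# .pyd/.so) rather than into site-packages, you have found the
# shadowing module.  json here is only a stand-in for numpy or
# scipy.sparse.sparsetools.
mod = importlib.import_module("json")
print(mod.__file__)         # the file that was actually loaded
print("dumps" in dir(mod))  # inspect the members, as with dir(sparsetools)
```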
>From line 21 to 26, there was import statement for "sparsetools" from scipy.sparse.sparsetools import cscmux, csrmux, \ cootocsr, csrtocoo, cootocsc, csctocoo, csctocsr, csrtocsc, \ densetocsr, csrtodense, \ csrmucsr, cscmucsc, \ csr_plus_csr, csc_plus_csc, csr_minus_csr, csc_minus_csc, \ csr_elmul_csr, csc_elmul_csc, csr_eldiv_csr, csc_eldiv_csc I change the working directory to "C:\Python25\Lib\site-packages\scipy\sparse", import sparestools and use dir function to see the members of sparsetools In [16]: import sparsetoo In [17]: dir(sparsetools) Out[17]: ['__doc__', '__file__', '__name__', '__version__', 'ccootocsc', 'ccscadd', 'ccscextract', 'ccscgetel', 'ccscmucsc', 'ccscmucsr', 'ccscmul', 'ccscmux', 'ccscsetel', 'ccsctocoo', 'ccsctofull', 'ccsrmucsc', 'ccsrmux', 'cdiatocsc', 'cfulltocsc', 'ctransp', 'dcootocsc', 'dcscadd', 'dcscextract', 'dcscgetel', 'dcscmucsc', 'dcscmucsr', 'dcscmul', 'dcscmux', 'dcscsetel', 'dcsctocoo', 'dcsctofull', 'dcsrmucsc', 'dcsrmux', 'ddiatocsc', 'dfulltocsc', 'dtransp', 'scootocsc', 'scscadd', 'scscextract', 'scscgetel', 'scscmucsc', 'scscmucsr', 'scscmul', 'scscmux', 'scscsetel', 'scsctocoo', 'scsctofull', 'scsrmucsc', 'scsrmux', 'sdiatocsc', 'sfulltocsc', 'stransp', 'zcootocsc', 'zcscadd', 'zcscextract', 'zcscgetel', 'zcscmucsc', 'zcscmucsr', 'zcscmul', 'zcscmux', 'zcscsetel', 'zcsctocoo', 'zcsctofull', 'zcsrmucsc', 'zcsrmux', 'zdiatocsc', 'zfulltocsc', 'ztransp'] But it is not the members of sparsetools.py I think it is the members of sparsetools.pyd. Move sparsetools.pyd to some backup directory. Then it will works well. In [21]: from scipy import * In [22]: -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From youngsu999 at gmail.com Sun Nov 18 06:06:35 2007 From: youngsu999 at gmail.com (youngsu park) Date: Sun, 18 Nov 2007 20:06:35 +0900 Subject: [SciPy-user] scipy 0.6.0 ImportError: cannot import name cscmux In-Reply-To: <338357900711180250u1fd08b86ide9a504d664e0cc1@mail.gmail.com> References: <338357900711180250u1fd08b86ide9a504d664e0cc1@mail.gmail.com> Message-ID: <338357900711180306i5eb9e54eif0a34032b93c53ef@mail.gmail.com> Hi. Im Youngsu-park ,in POSTECH in south korea. I had same problem on scipy importing like: from scipy import * . So I looked at the scipy/sparse folder, "C:\Python25\Lib\site-packages\scipy\sparse ". To find the reason of error, I open the "C:\Python25\Lib\site-packages\scipy\sparse\sparse.py", file. >From line 21 to 26, there was import statement for "sparsetools" from scipy.sparse.sparsetools import cscmux, csrmux, \ cootocsr, csrtocoo, cootocsc, csctocoo, csctocsr, csrtocsc, \ densetocsr, csrtodense, \ csrmucsr, cscmucsc, \ csr_plus_csr, csc_plus_csc, csr_minus_csr, csc_minus_csc, \ csr_elmul_csr, csc_elmul_csc, csr_eldiv_csr, csc_eldiv_csc I change the working directory to "C:\Python25\Lib\site-packages\scipy\sparse ", import sparestools and use dir function to see the members of sparsetools In [16]: import sparsetoo In [17]: dir(sparsetools) Out[17]: ['__doc__', '__file__', '__name__', '__version__', 'ccootocsc', 'ccscadd', 'ccscextract', 'ccscgetel', 'ccscmucsc', 'ccscmucsr', 'ccscmul', 'ccscmux', 'ccscsetel', 'ccsctocoo', 'ccsctofull', 'ccsrmucsc', 'ccsrmux', 'cdiatocsc', 'cfulltocsc', 'ctransp', 'dcootocsc', 'dcscadd', 'dcscextract', 'dcscgetel', 'dcscmucsc', 'dcscmucsr', 'dcscmul', 'dcscmux', 'dcscsetel', 'dcsctocoo', 'dcsctofull', 'dcsrmucsc', 'dcsrmux', 'ddiatocsc', 'dfulltocsc', 'dtransp', 'scootocsc', 'scscadd', 'scscextract', 'scscgetel', 'scscmucsc', 'scscmucsr', 'scscmul', 'scscmux', 'scscsetel', 'scsctocoo', 'scsctofull', 'scsrmucsc', 'scsrmux', 'sdiatocsc', 'sfulltocsc', 'stransp', 'zcootocsc', 'zcscadd', 
'zcscextract', 'zcscgetel', 'zcscmucsc', 'zcscmucsr', 'zcscmul', 'zcscmux', 'zcscsetel', 'zcsctocoo', 'zcsctofull', 'zcsrmucsc', 'zcsrmux', 'zdiatocsc', 'zfulltocsc', 'ztransp'] But these are not the members of sparsetools.py; I think they are the members of sparsetools.pyd. Move sparsetools.pyd to some backup directory. Then it will work well. In [21]: from scipy import * In [22]: -------------- next part -------------- An HTML attachment was scrubbed... URL: From ramercer at gmail.com Sun Nov 18 12:08:06 2007 From: ramercer at gmail.com (Adam Mercer) Date: Sun, 18 Nov 2007 12:08:06 -0500 Subject: [SciPy-user] Specifying fortran compiler In-Reply-To: <54D6FEEC-043D-4B1B-A0E4-B7AB7DC98CA4@physics.mcmaster.ca> References: <799406d60710281113m7865640ds11d9e66665f09c2b@mail.gmail.com> <799406d60710281918j616481a1te3fdcf88c45481a@mail.gmail.com> <47254429.4030508@gmail.com> <799406d60710282030t77121657s894b6859113700a@mail.gmail.com> <54D6FEEC-043D-4B1B-A0E4-B7AB7DC98CA4@physics.mcmaster.ca> Message-ID: <799406d60711180908j4fbeb53ay4cc5723082b70e9@mail.gmail.com> On 31/10/2007, David M. Cooke wrote: > If it's fixed (as I think it is), best to wait for 1.0.4. I suppose > you could make a symlink to gfortran-mp-42 in the work/ directory, and > add it to the PATH that'd be used in the Portfile. This problem goes away with numpy-1.0.4 Cheers Adam From berthe.loic at gmail.com Sun Nov 18 14:08:52 2007 From: berthe.loic at gmail.com (LB) Date: Sun, 18 Nov 2007 11:08:52 -0800 (PST) Subject: [SciPy-user] Pb with scipy.stats Message-ID: Hi, I've got a problem with some of the continuous distributions defined in scipy.stats: >>> from scipy import stats, __version__ >>> __version__ '0.7.0.dev3440' >>> beta = stats.betaprime(5., 5.)
>>> beta.pdf( 0.5 ) array(0.68282274043590918) >>> beta.cdf( 0.5 ) array(0.14484580602550423) So, PDF and CDF methods work, but not PPF : >>> beta.ppf( 0.5 ) Traceback (most recent call last): File "", line 1, in File "/home/loic/tmp/pycvs/lib/python2.6/site-packages/scipy/stats/ distributions.py", line 112, in ppf return self.dist.ppf(q,*self.args,**self.kwds) File "/home/loic/tmp/pycvs/lib/python2.6/site-packages/scipy/stats/ distributions.py", line 563, in ppf place(output,cond,self._ppf(*goodargs)*scale + loc) File "/home/loic/tmp/pycvs/lib/python2.6/site-packages/scipy/stats/ distributions.py", line 382, in _ppf return self.vecfunc(q,*args) File "/home/loic/tmp/pycvs/lib/python2.6/site-packages/numpy/lib/ function_base.py", line 949, in __call__ raise ValueError, "mismatch between python function inputs"\ ValueError: mismatch between python function inputs and received arguments It seems the same problem appears with at least foldcauchy, foldnorm and genexpon Is there any way to avoid this ? -- LB From lev at vpac.org Sun Nov 18 17:36:25 2007 From: lev at vpac.org (Lev Lafayette) Date: Mon, 19 Nov 2007 09:36:25 +1100 Subject: [SciPy-user] Installing numpy In-Reply-To: <7F5B3638-3AF9-4181-9A52-7908DAB7A425@suddenlink.net> References: <473FD0A7.3030404@gmail.com> <7F5B3638-3AF9-4181-9A52-7908DAB7A425@suddenlink.net> Message-ID: <1195425385.26337.2.camel@sys09.in.vpac.org> On Sat, 2007-11-17 at 22:02 -0800, David Arnold wrote: > Wow. Good catch. Switching directories: > > Ran 692 tests in 1.211s > > >>>import numpy > >>>numpy.test() > > ... > > OK > > >>> > > > Not so lucky with scipy: > > darnold $ python > Python 2.5.1 (r251:54869, Apr 18 2007, 22:08:04) > [GCC 4.0.1 (Apple Computer, Inc. build 5367)] on darwin > Type "help", "copyright", "credits" or "license" for more information. > >>> import scipy > s>>> scipy.test(level=1) > Is this a 64 bit machine? If so you'll have the same problem with SciPy as I did... 
until a very nice member of this list pointed me to the magic solution. http://scipy.org/scipy/scipy/changeset/3498 HTH HAND, -- Lev Lafayette From david at ar.media.kyoto-u.ac.jp Sun Nov 18 22:12:05 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Mon, 19 Nov 2007 12:12:05 +0900 Subject: [SciPy-user] Adding pysamplerate to scikits ? Message-ID: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> Hi, Just wanted to know if it would be ok to add pysamplerate to scikits ? pysamplerate is a (really rough for now) wrapper around SRC for high quality audio resampling (using sinc interpolation). It would be LGPL, as SRC itself: http://www.ar.media.kyoto-u.ac.jp/members/david/softwares/pysamplerate/index.html cheers, David From millman at berkeley.edu Sun Nov 18 23:04:33 2007 From: millman at berkeley.edu (Jarrod Millman) Date: Sun, 18 Nov 2007 20:04:33 -0800 Subject: [SciPy-user] Adding pysamplerate to scikits ? In-Reply-To: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> Message-ID: On Nov 18, 2007 7:12 PM, David Cournapeau wrote: > Just wanted to know if it would be ok to add pysamplerate to scikits > ? pysamplerate is a (really rough for now) wrapper around SRC for high > quality audio resampling (using sinc interpolation). It would be LGPL, > as SRC itself: +1 I would be happy to see you adding a audio resampling scikit. As you I am sure you will guess, my only concern is that you rename the package to at least remove the 'py'. I would also prefer to see it named resample over samplerate. Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/
From david at ar.media.kyoto-u.ac.jp Sun Nov 18 23:06:03 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Mon, 19 Nov 2007 13:06:03 +0900 Subject: [SciPy-user] Adding pysamplerate to scikits ? In-Reply-To: References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> Message-ID: <47410BAB.9010902@ar.media.kyoto-u.ac.jp> Jarrod Millman wrote: > On Nov 18, 2007 7:12 PM, David Cournapeau wrote: > >> Just wanted to know if it would be ok to add pysamplerate to scikits >> ? pysamplerate is a (really rough for now) wrapper around SRC for high >> quality audio resampling (using sinc interpolation). It would be LGPL, >> as SRC itself: >> > > +1 > > I would be happy to see you adding a audio resampling scikit. As you > I am sure you will guess, my only concern is that you rename the > package to at least remove the 'py'. I would also prefer to see it > named resample over samplerate. > I indeed thought it would be a 'complain'. No problem, I don't care about the name (and it was a bad name anyway).
cheers, David

From youngsu999 at gmail.com Mon Nov 19 00:10:34 2007
From: youngsu999 at gmail.com (Youngsu Park)
Date: Mon, 19 Nov 2007 14:10:34 +0900
Subject: [SciPy-user] Pb with scipy.stats
Message-ID: <338357900711182110i6a06756fx1822e0c7dcb606d6@mail.gmail.com>

Hi,

In the scipy/stats directory there is a distributions.py file. Several distributions are defined in it. betaprime is an instance of betaprime_gen, which inherits from rv_continuous. The _ppf method is not defined in the betaprime_gen class:

## Beta Prime
class betaprime_gen(rv_continuous):
    def _rvs(self, a, b):
        u1 = gamma.rvs(a,size=self._size)
        u2 = gamma.rvs(b,size=self._size)
        return (u1 / u2)
    def _pdf(self, x, a, b):
        return 1.0/special.beta(a,b)*x**(a-1.0)/(1+x)**(a+b)
    def _cdf(self, x, a, b):
        x = where(x==1.0, 1.0-1e-6,x)
        return pow(x,a)*special.hyp2f1(a+b,a,1+a,-x)/a/special.beta(a,b)
    def _munp(self, n, a, b):
        if (n == 1.0):
            return where(b > 1, a/(b-1.0), inf)
        elif (n == 2.0):
            return where(b > 2, a*(a+1.0)/((b-2.0)*(b-1.0)), inf)
        elif (n == 3.0):
            return where(b > 3, a*(a+1.0)*(a+2.0)/((b-3.0)*(b-2.0)*(b-1.0)), inf)
        elif (n == 4.0):
            return where(b > 4, a*(a+1.0)*(a+2.0)*(a+3.0)/((b-4.0)*(b-3.0) \
                         *(b-2.0)*(b-1.0)), inf)
        else:
            raise NotImplementedError

betaprime = betaprime_gen(a=0.0, b=500.0, name='betaprime', shapes='a,b',
                          extradoc="""~~~""")

so it uses the rv_continuous class's method:

    def ppf(self,q,*args,**kwds):
        """Percent point function (inverse of cdf) at q of the given RV.

        *args
        =====
        The shape parameter(s) for the distribution (see docstring of the
        instance object for more information)

        **kwds
        ======
        loc - location parameter (default=0)
        scale - scale parameter (default=1)
        """
        loc,scale=map(kwds.get,['loc','scale'])
        args, loc, scale = self.__fix_loc_scale(args, loc, scale)
        q,loc,scale = map(arr,(q,loc,scale))
        args = tuple(map(arr,args))
        cond0 = self._argcheck(*args) & (scale > 0) & (loc==loc)
        cond1 = (q > 0) & (q < 1)
        cond2 = (q==1) & cond0
        cond = cond0 & cond1
        output = valarray(shape(cond),value=self.a*scale + loc)
        place(output,(1-cond0)+(1-cond1)*(q!=0.0), self.badvalue)
        place(output,cond2,self.b*scale + loc)
        goodargs = argsreduce(cond, *((q,)+args+(scale,loc)))
        scale, loc, goodargs = goodargs[-2], goodargs[-1], goodargs[:-2]
        place(output,cond,self._ppf(*goodargs)*scale + loc)
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        return output

    ....
    sgf = vectorize
    self.vecfunc = sgf(self._ppf_single_call,otypes='d')
    ...
    def _ppf(self, q, *args):
        return self.vecfunc(q,*args)
        ~~~~~~~~~~~~~~~~~~~~
    ...
    def _ppf_single_call(self, q, *args):
        return scipy.optimize.brentq(self._ppf_to_solve, self.xa, self.xb,
                                     args=(q,)+args, xtol=self.xtol)

I'm not sure, but I think the problem comes from the vectorization of the default _ppf_single_call function. But I don't know how to fix it. Is there someone who can fix it?

From pav at iki.fi Mon Nov 19 03:14:02 2007
From: pav at iki.fi (Pauli Virtanen)
Date: Mon, 19 Nov 2007 08:14:02 +0000 (UTC)
Subject: [SciPy-user] ode on complex systems
References: <80c99e790711160653v58f2b65o68a52c422e4623fc@mail.gmail.com> <1195226348.24691.2.camel@localhost> <473DD6AC.3070207@noaa.gov> <80c99e790711161108k6fbdd4b4sd1db90932d1771d6@mail.gmail.com>
Message-ID:

Fri, 16 Nov 2007 20:41:14 +0000, Pauli Virtanen wrote:
> Fri, 16 Nov 2007 20:08:22 +0100, lorenzo bolla wrote:
>> I quickly tried to compile scipy with the patch, but it failed (on
>> cygwin).
>> I had no time to investigate the problem further, but I'll let
>> you know as soon as possible more details.
>> L.
>
> This may be a cygwin-related issue,

Ok, it seems that someone else is also having problems on Cygwin (see the post titled "Problems compiling Scipy with Cygwin"): the VODE module even without patches fails to compile. Did you check whether you were compiling an unpatched Scipy?

--
Pauli Virtanen

From lev at columbia.edu Mon Nov 19 09:20:43 2007
From: lev at columbia.edu (Lev Givon)
Date: Mon, 19 Nov 2007 09:20:43 -0500
Subject: [SciPy-user] Adding pysamplerate to scikits ?
In-Reply-To:
References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp>
Message-ID: <20071119142042.GA21903@localhost.cc.columbia.edu>

Received from Jarrod Millman on Sun, Nov 18, 2007 at 11:04:33PM EST:
> On Nov 18, 2007 7:12 PM, David Cournapeau wrote:
> > Just wanted to know if it would be ok to add pysamplerate to scikits
> > ? pysamplerate is a (really rough for now) wrapper around SRC for high
> > quality audio resampling (using sinc interpolation). It would be LGPL,
> > as SRC itself:

SRC is licensed under the GPL, not the LGPL:

http://www.mega-nerd.com/SRC/license.html

Offhand, I am not sure how this impacts the distribution of David's wrapper under the BSD license.

> > +1
> > I would be happy to see you adding a audio resampling scikit. As you

Me too; I was actually looking for something like this recently.

> I am sure you will guess, my only concern is that you rename the
> package to at least remove the 'py'. I would also prefer to see it
> named resample over samplerate.
> > Thanks, > > -- > Jarrod Millman Although I agree that there are better names than "samplerate", it might be preferable to retain the name of the underlying library (or perhaps choose something else entirely) because there is another free resampling library called (lib)resample: http://www-ccrma.stanford.edu/~jos/resample/ Given my previous observation regarding SRC's license, it is worth noting that this library is licensed under the LGPL (supposedly because the author found SRC's license problematic for his purposes). L.G. From lbolla at gmail.com Mon Nov 19 10:35:36 2007 From: lbolla at gmail.com (lorenzo bolla) Date: Mon, 19 Nov 2007 16:35:36 +0100 Subject: [SciPy-user] ode on complex systems In-Reply-To: References: <80c99e790711160653v58f2b65o68a52c422e4623fc@mail.gmail.com> <1195226348.24691.2.camel@localhost> <473DD6AC.3070207@noaa.gov> <80c99e790711161108k6fbdd4b4sd1db90932d1771d6@mail.gmail.com> Message-ID: <80c99e790711190735w1a9f2719o2b1843506f05729b@mail.gmail.com> I managed to compile scipy with the zvode.patch. Just one observation: doing $> patch < zvode.patch in the scipy/scipy/integrate directory is not enough: you also need to move zvode.f in odepack/ directory and the others *.f in linpack_lite/ directory. That is what I missed last time. Thank you again, Lorenzo. On 11/19/07, Pauli Virtanen wrote: > > Fri, 16 Nov 2007 20:41:14 +0000, Pauli Virtanen wrote: > > > Fri, 16 Nov 2007 20:08:22 +0100, lorenzo bolla wrote: > >> I quickly tried to compile scipy with the patch, but it failed (on > >> cygwin). I had no time to investigate the problem further, but I'll let > >> you know as soon as possible more details. L. > > > > This may be a cygwin-related issue, > > Ok, it seems that someone else is also having problems on Cygwin (see the > post titled "Problems compiling Scipy with Cygwin"): the VODE module even > without patches fails to compile. Did you check whether you are compile > an unpatched Scipy? 
> > -- > Pauli Virtanen > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From odonnems at yahoo.com Mon Nov 19 11:06:03 2007 From: odonnems at yahoo.com (Michael ODonnell) Date: Mon, 19 Nov 2007 08:06:03 -0800 (PST) Subject: [SciPy-user] weave and compiler install help Message-ID: <271728.69580.qm@web58012.mail.re3.yahoo.com> I am trying to compile some inline c++ code inside python using weave. I always get a similar problem where the compiled file cannot be found (see below). I am not sure if the problem is with the compiler or something else. I am a new user of scipy and a novice with python so I would appreciate any direction someone can give me because I have not been able to figure out a work around. Thank you, Michael +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ I have the following related applications installed: +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Python: 2.4.1 (I am using an older version because this is what ESRI GIS application supports) Scipy: scipy-0.6.0.win32-py2.4.exe MinGW: MinGW-5.1.3.exe gcc-4.1.2-mingw-setup.exe cygwin: 3.2.25 OS: Windows XP SP2 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ When I test the installation of Weave I get the following output: +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ >>> weave.test() Found 1 tests for scipy.weave.ast_tools Found 2 tests for scipy.weave.blitz_tools Found 9 tests for scipy.weave.build_tools Found 0 tests for scipy.weave.c_spec Found 26 tests for scipy.weave.catalog building extensions here: c:\docume~1\michael\locals~1\temp\Michael\python24_compiled\m3 Found 1 tests for scipy.weave.ext_tools Found 0 tests for scipy.weave.inline_tools Found 74 tests for scipy.weave.size_check Found 16 tests for 
scipy.weave.slice_handler Found 3 tests for scipy.weave.standard_array_spec Found 0 tests for __main__ ...warning: specified build_dir '_bad_path_' does not exist or is not writable. Trying default locations .....warning: specified build_dir '_bad_path_' does not exist or is not writable. Trying default locations ............................removing 'c:\docume~1\michael\locals~1\temp\tmpdqudhmcat_test' (and everything under it) error removing c:\docume~1\michael\locals~1\temp\tmpdqudhmcat_test: c:\docume~1\michael\locals~1\temp\tmpdqudhmcat_test\win3224compiled_catalog: Permission denied error removing c:\docume~1\michael\locals~1\temp\tmpdqudhmcat_test: c:\docume~1\michael\locals~1\temp\tmpdqudhmcat_test: Directory not empty .removing 'c:\docume~1\michael\locals~1\temp\tmpw144aycat_test' (and everything under it) ............................................................................................... ---------------------------------------------------------------------- Ran 132 tests in 2.625s OK +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ When I try to test the following script or any other script I get the following message: +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ def prod(m, v): #C++ version nrows, ncolumns = m.shape res = numpy.zeros((nrows, ncolumns), float) code = r""" for (int i=0; i running build_ext running build_src building extension "sc_045cfaf40ef1a0738b1066aaa886a55d7" sources customize Mingw32CCompiler customize Mingw32CCompiler using build_ext customize Mingw32CCompiler customize Mingw32CCompiler using build_ext building 'sc_045cfaf40ef1a0738b1066aaa886a55d7' extension compiling C++ sources C compiler: g++ -mno-cygwin -O2 -Wall compile options: '-IC:\Python24\lib\site-packages\scipy\weave -IC:\Python24\lib\site-packages\scipy\weave\scxx -IC:\Python24\lib\site-packages\numpy\core\include -IC:\Python24\include -IC:\Python24\PC -c' g++ -mno-cygwin -O2 -Wall 
-IC:\Python24\lib\site-packages\scipy\weave -IC:\Python24\lib\site-packages\scipy\weave\scxx -IC:\Python24\lib\site-packages\numpy\core\include -IC:\Python24\include -IC:\Python24\PC -c c:\docume~1\michael\locals~1\temp\Michael\python24_compiled\sc_045cfaf40ef1a0738b1066aaa886a55d7.cpp -o c:\docume~1\michael\locals~1\temp\Michael\python24_intermediate\compiler_c8350d870e6c54e8f29dd7094c2bfb45\Release\docume~1\michael\locals~1\temp\michael\python24_compiled\sc_045cfaf40ef1a0738b1066aaa886a55d7.o Traceback (most recent call last): File "C:\Python24\Lib\site-packages\pythonwin\pywin\framework\scriptutils.py", line 310, in RunScript exec codeObject in __main__.__dict__ File "C:\Documents and Settings\Michael\Application Data\ESRI\ArcToolbox\scripts\test_weave.py", line 179, in ? main() File "C:\Documents and Settings\Michael\Application Data\ESRI\ArcToolbox\scripts\test_weave.py", line 170, in main prod(m, v) File "C:\Documents and Settings\Michael\Application Data\ESRI\ArcToolbox\scripts\test_weave.py", line 44, in prod err = weave.inline(code,['nrows', 'ncolumns', 'res', 'm', 'v'], verbose=2) File "C:\Python24\Lib\site-packages\scipy\weave\inline_tools.py", line 338, in inline auto_downcast = auto_downcast, File "C:\Python24\Lib\site-packages\scipy\weave\inline_tools.py", line 447, in compile_function verbose=verbose, **kw) File "C:\Python24\Lib\site-packages\scipy\weave\ext_tools.py", line 365, in compile verbose = verbose, **kw) File "C:\Python24\Lib\site-packages\scipy\weave\build_tools.py", line 269, in build_extension setup(name = module_name, ext_modules = [ext],verbose=verb) File "C:\Python24\lib\site-packages\numpy\distutils\core.py", line 173, in setup return old_setup(**new_attr) File "C:\Python24\lib\distutils\core.py", line 159, in setup raise SystemExit, error CompileError: error: c:\docume~1\michael\locals~1\temp\tmp2dwbkp: No such file or directory This file is created and automatically opens in python: import os import sys 
sys.path.insert(0,'C:\\Python24\\lib\\site-packages\\numpy\\distutils') from exec_command import exec_command del sys.path[0] cmd = ['g++', '-mno-cygwin', '-O2', '-Wall', '-IC:\\Python24\\lib\\site-packages\\scipy\\weave', '-IC:\\Python24\\lib\\site-packages\\scipy\\weave\\scxx', '-IC:\\Python24\\lib\\site-packages\\scipy\\weave\\blitz', '-IC:\\Python24\\lib\\site-packages\\numpy\\core\\include', '-IC:\\Python24\\include', '-IC:\\Python24\\PC', '-c', 'c:\\docume~1\\michael\\locals~1\\temp\\Michael\\python24_compiled\\sc_e6928d48a5ee12fb55c3b29ed6bf49ef4.cpp', '-o', 'c:\\docume~1\\michael\\locals~1\\temp\\Michael\\python24_intermediate\\compiler_921933e1c4e3c013306c4ed6f4c15144\\Release\\docume~1\\michael\\locals~1\\temp\\michael\\python24_compiled\\sc_e6928d48a5ee12fb55c3b29ed6bf49ef4.o'] os.environ = {'TMP': 'C:\\WINDOWS\\TEMP', 'COMPUTERNAME': 'ALDER', 'USERDOMAIN': 'ALDER', 'ARCHOME': 'C:\\arcgis\\arcexe9x', 'COMMONPROGRAMFILES': 'C:\\Program Files\\Common Files', 'PROCESSOR_IDENTIFIER': 'x86 Family 6 Model 13 Stepping 8, GenuineIntel', 'PROGRAMFILES': 'C:\\Program Files', 'PROCESSOR_REVISION': '0d08', 'PATH': 'c:\\mingw\\bin;C:\\Program Files\\Common Files\\Roxio Shared\\DLLShared;C:\\arcgis\\arcexe9x\\bin;C:\\Program Files\\FWTools1.2.2\\pymod;C:\\Program Files\\R\\R-2.5.1\\bin;C:\\Documents and Settings\\Michael\\My Documents\\mod_web\\ems_web\\web-content\\mapserver-4.8.1bin;C:\\cygwin\\bin;C:\\program files\\imagemagick-6.3.6-q16;C:\\Python24;C:\\Program Files\\gs\\gs8.51\\lib;C:\\Program Files\\gnuplot\\bin;C:\\ModisTools\\MRT\\bin;C:\\ModisTools\\LDOPE\\LDOPE_Win_bin\\ANCILLARY;C:\\Documents and Settings\\Michael\\My Documents\\my_home\\py_scripting\\src\\tools;C:\\Program Files\\HEG\\HEG_Win\\bin;C:\\Perl\\site\\bin;C:\\Perl\\bin;C:\\WINDOWS\\system32;C:\\WINDOWS;C:\\WINDOWS\\System32\\Wbem;C:\\Program Files\\Microsoft SQL Server\\80\\Tools\\Binn\\;C:\\Program Files\\MySQL\\MySQL Server 5.0\\bin;C:\\Program 
Files\\QuickTime\\QTSystem;C:\\WINDOWS\\system32\\config\\systemprofile\\Local Settings\\Temp;C:\\ModisTools\\MRT\\bin; C:\\ModisTools\\LDOPE\\LDOPE_Win_bin\\ANCILLARY;C:\\Program Files\\R\\R-2.5.1\\bin;;c:\\Program Files\\HEG\\HEG_Win\\bin;c:\\Program Files\\HEG\\HEG_Win\\bin;c:\\Program Files\\HEG\\HEG_Win\\bin', 'SYSTEMROOT': 'C:\\WINDOWS', 'ATHOME': 'C:\\arcgis\\arcexe9x\\arctools', 'SCRIPTING': 'C:\\Documents and Settings\\Michael\\My Documents\\my_home\\py_scripting', 'ARCHOME_USER': 'C:\\arcgis\\arcexe9x', 'TEMP': 'C:\\DOCUME~1\\Michael\\LOCALS~1\\Temp', 'PROCESSOR_ARCHITECTURE': 'x86', 'ANCPATH': 'C:\\ModisTools\\LDOPE\\LDOPE_Win_bin\\ANCILLARY', 'ALLUSERSPROFILE': 'C:\\Documents and Settings\\All Users', 'SESSIONNAME': 'Console', 'ARCINFOFONTSIZE': '8', 'HOMEPATH': '\\Documents and Settings\\Michael', 'USERNAME': 'Michael', 'LOGONSERVER': '\\\\ALDER', 'COMSPEC': 'C:\\WINDOWS\\system32\\cmd.exe', 'QTJAVA': 'C:\\Program Files\\Java\\jre1.6.0_01\\lib\\ext\\QTJava.zip', 'PYTHONPATH': 'C:\\Program Files\\ArcGIS\\bin;C:\\Python24\\Lib\\site-packages\\gdal;C:\\Program Files\\R\\R-2.5.1\\bin;C:\\Documents and Settings\\Michael\\My Documents\\my_home\\py_scripting\\src\\tools;C:\\Documents and Settings\\Michael\\My Documents\\my_home\\py_scripting\\lib;C:\\WINDOWS\\system32\\config\\systemprofile\\Local Settings\\Temp;C:\\cygwin\\bin;C:\\mingw\\bin;c:\\mingw\\bin;C:\\Program Files\\Common Files\\Roxio Shared\\DLLShared;C:\\arcgis\\arcexe9x\\bin;C:\\Program Files\\FWTools1.2.2\\pymod;C:\\Program Files\\R\\R-2.5.1\\bin;C:\\Documents and Settings\\Michael\\My Documents\\mod_web\\ems_web\\web-content\\mapserver-4.8.1bin;C:\\cygwin\\bin;C:\\program files\\imagemagick-6.3.6-q16;C:\\Python24;C:\\Program Files\\gs\\gs8.51\\lib;C:\\Program Files\\gnuplot\\bin;C:\\ModisTools\\MRT\\bin;C:\\ModisTools\\LDOPE\\LDOPE_Win_bin\\ANCILLARY;C:\\Documents and Settings\\Michael\\My Documents\\my_home\\py_scripting\\src\\tools;C:\\Program 
Files\\HEG\\HEG_Win\\bin;C:\\Perl\\site\\bin;C:\\Perl\\bin;C:\\WINDOWS\\system32;C:\\WINDOWS;C:\\WINDOWS\\System32\\Wbem;C:\\Program Files\\Microsoft SQL Server\\80\\Tools\\Binn\\;C:\\Program Files\\MySQL\\MySQL Server 5.0\\bin;C:\\Program Files\\QuickTime\\QTSystem;%USERPROFILE%\\Local Settings\\Temp;', 'CLASSPATH': '.;C:\\Program Files\\Java\\jre1.6.0_01\\lib\\ext\\QTJava.zip', 'PLAT': 'win32', 'PYTHONSRC': 'C:\\Python24', 'PGSHOME': 'c:\\Program Files\\HEG\\HEG_Win\\TOOLKIT_MTD', 'PATHEXT': '.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.py;.pdf', 'CLIENTNAME': 'Console', 'FP_NO_HOST_CHECK': 'NO', 'WINDIR': 'C:\\WINDOWS', 'APPDATA': 'C:\\Documents and Settings\\Michael\\Application Data', 'HOMEDRIVE': 'C:', 'ARCINFOFONTNAME': 'Courier New', 'SYSTEMDRIVE': 'C:', 'NUMBER_OF_PROCESSORS': '1', 'ARCGISHOME': 'C:\\Program Files\\ArcGIS\\', 'MRTBINDIR': 'c:\\Program Files\\HEG\\HEG_Win\\bin', 'PROCESSOR_LEVEL': '6', 'MRTDATADIR': 'C:\\ModisTools\\MRT\\data', 'OS': 'Windows_NT', '__COMPAT_LAYER': 'EnableNXShowUI ', 'USERPROFILE': 'C:\\Documents and Settings\\Michael'} s,o = exec_command(cmd, _with_python=0, **{}) f=open('c:\\docume~1\\michael\\locals~1\\temp\\tmp0gk0g_',"w") f.write(str(s)) f.close() f=open('c:\\docume~1\\michael\\locals~1\\temp\\tmpadrrdg',"w") f.write(o) f.close() ____________________________________________________________________________________ Get easy, one-click access to your favorites. Make Yahoo! your homepage. http://www.yahoo.com/r/hs -------------- next part -------------- An HTML attachment was scrubbed... URL: From keflavich at gmail.com Mon Nov 19 14:05:53 2007 From: keflavich at gmail.com (Keflavich) Date: Mon, 19 Nov 2007 11:05:53 -0800 (PST) Subject: [SciPy-user] Pylab plotting: math range errors unrecoverable In-Reply-To: References: Message-ID: <25d9ffcb-c0d3-4268-bd47-612b5273a095@s6g2000prc.googlegroups.com> Thanks Michael. Any thoughts on automatically masking bad values? 
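[On the masking question just above: NumPy's masked arrays can hide non-finite values automatically. A small sketch, using np.ma.masked_invalid, which masks both nan and inf; matplotlib generally skips masked points when plotting, so this also works around the "math range error" on infinities.]

```python
import numpy as np

# masked_invalid hides every NaN and inf behind a mask, so later
# reductions (and plots) only see the finite entries.
x = np.array([1.0, np.inf, 3.0, np.nan])
xm = np.ma.masked_invalid(x)

valid = xm.count()   # number of unmasked (finite) values
total = xm.sum()     # sum over the finite values only
```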
Adam

On Nov 15, 12:19 pm, Michael McNeil Forbes wrote:
> On 15 Nov 2007, at 11:07 AM, Adam Ginsburg wrote:
> > 2. when I get the above error, then try to plot something WITHOUT
> > infinities, it fails with the exact same error, and when I ran the
> > debugger it claimed that meanv was still inf. Is there any way to fix
> > that short of quitting and reentering the python command line?
>
> You can usually just clear the plot with clf(). I think the plot
> keeps the old data and then tries to redisplay it each time, giving
> errors over and over until you clear the figure.
>
> Michael.
>
> _______________________________________________
> SciPy-user mailing list
> SciPy-user at scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-user

From darkwind.87 at gmail.com Mon Nov 19 19:31:01 2007
From: darkwind.87 at gmail.com (Dark Wind)
Date: Mon, 19 Nov 2007 16:31:01 -0800
Subject: [SciPy-user] Problem with NLP in OpenOpt
In-Reply-To: <8793ae6e0711150854g7b78db7ex4738dd0dd00cbe13@mail.gmail.com>
References: <8a6035a00711150056r62c30b6cyd5fb4052eaa3f8b8@mail.gmail.com> <8793ae6e0711150854g7b78db7ex4738dd0dd00cbe13@mail.gmail.com>
Message-ID: <8a6035a00711191631n7994c989g435b96d3e8aba6ab@mail.gmail.com>

Thanks a lot. Scipy_Cobyla worked, but I am not getting how exactly to install ALGENCAN on Windows. Do I need cygwin to install it on Windows?

thanks

On Nov 15, 2007 8:54 AM, Dominique Orban wrote:
> On 11/15/07, Dark Wind wrote:
> > The initial value x0 satisfies the constraints hi's but i am still
> > getting that there is no feasible solution.
> >
> > Can anyone help me with this?
>
> Cobyla is an infeasible method, meaning that in order to identify a
> solution to your problem, it will venture outside of the feasible set.
> In your example, it wasn't able to reach a feasible solution within
> the given limits (max number of iterations, tolerances, ...).
> It is not
> because there is a feasible point that the code will be able to find a
> solution; in this case, it got stuck before reaching the feasible set
> again.
>
> You can either re-solve your problem with looser tolerances, which is
> not very satisfactory, you can re-solve from a different starting
> point (which doesn't have to be feasible), or you can try another
> solver (as Dmitrey showed in his response).
>
> Dominique
>
> _______________________________________________
> SciPy-user mailing list
> SciPy-user at scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-user

From david at ar.media.kyoto-u.ac.jp Mon Nov 19 22:28:42 2007
From: david at ar.media.kyoto-u.ac.jp (David Cournapeau)
Date: Tue, 20 Nov 2007 12:28:42 +0900
Subject: [SciPy-user] Adding pysamplerate to scikits ?
In-Reply-To: <20071119142042.GA21903@localhost.cc.columbia.edu>
References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> <20071119142042.GA21903@localhost.cc.columbia.edu>
Message-ID: <4742546A.7090200@ar.media.kyoto-u.ac.jp>

Lev Givon wrote:
> Received from Jarrod Millman on Sun, Nov 18, 2007 at 11:04:33PM EST:
>> On Nov 18, 2007 7:12 PM, David Cournapeau wrote:
>>> Just wanted to know if it would be ok to add pysamplerate to scikits
>>> ? pysamplerate is a (really rough for now) wrapper around SRC for high
>>> quality audio resampling (using sinc interpolation). It would be LGPL,
>>> as SRC itself:
>
> SRC is licensed under the GPL, not the LGPL:
>
> http://www.mega-nerd.com/SRC/license.html
>
> Offhand, I am not sure how this impacts the distribution of David's
> wrapper under the BSD license.

Indeed, I was mixing SRC and sndfile licenses (both are by Erik Castro de Lopo). The plan is to do as Robert Kern suggested: release the wrapper with the same license as the wrapped library (GPL here).

>> +1
>>
>> I would be happy to see you adding a audio resampling scikit. As you
>
> Me too; I was actually looking for something like this recently.
If I had more time, I would implement one in pure python, using a filterbank (which is how matlab does it, AFAIK, and is worthwhile as a 'pedagogical' tool). But well, I don't have this time :)

cheers,

David

From xavier.barthelemy at cmla.ens-cachan.fr Tue Nov 20 06:21:29 2007
From: xavier.barthelemy at cmla.ens-cachan.fr (Xavier Barthelemy)
Date: Tue, 20 Nov 2007 12:21:29 +0100
Subject: [SciPy-user] organizing data
In-Reply-To: <4742546A.7090200@ar.media.kyoto-u.ac.jp>
References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> <20071119142042.GA21903@localhost.cc.columbia.edu> <4742546A.7090200@ar.media.kyoto-u.ac.jp>
Message-ID: <4742C339.8010308@cmla.ens-cachan.fr>

Hi everyone!

I'm quite new to python, and I am breaking my mind on this: I would like to organize data so I can plot them. My data are the variation of 11 parameters, with 18 values for each, so you can see them as a hypercube of 11 dimensions. My goal is to be able to plot with gnuplot in 2D (or 3D), so I have to sort my data into slices with 9 (or 8) parameters held constant.

That's my problem: it is not really a "hyper"matrix of data, because I did not discretize the parameter space equally for each parameter. So I think I have to sort and then recursively cut into the mass, so that I can have small arrays I can plot with gnuplot.py.

As I said, I am quite new in pythonizing, and I sure can use a little help. I have spent much time on this, and I can't figure it out. So:

from numpy import *
from scipy import *
from operator import itemgetter
import Gnuplot, Gnuplot.PlotItems, Gnuplot.funcutils

# I read the data
def LireFich(Data):
    donnees = io.read_array(Data, columns=(0,-1), lines=(0,-1), atype='d')
    return donnees

def Couper(liste, col):
    # Then I sort the whole array (sorted can sort by each and every column;
    # just add a comma and a new col)
    longueur = len(liste)
    listeSansDoublon = []
    liste2 = array(sorted(liste, key=itemgetter(col)))
    # Then I create a list of each value of the parameter, so I can cut by
    # each constant "piece"
    for i in liste2[:,col]:
        if listeSansDoublon.count(i) == 0:
            listeSansDoublon.append(i)
    longueur2 = len(listeSansDoublon)

and then... I don't know. I have guessed at the array_split function:

    OuCouper = []
    k = 0
    for i in listeSansDoublon:
        j = array(where(i == liste2[:,col]))
        jj = where(i == liste2[:,col])
        A = min(j)
        B = max(j)
        OuCouper[k,1] = A
        OuCouper[k,2] = B
        # essai = array_split(liste2, min(j)--max(j))

but actually it doesn't work, as I have read the F*M*. I have also thought that I could just manipulate masks and apply the final recursively calculated mask on the data, then reduce to get the correct form for gnuplot.

Thanks for your help if you can help me. I really spent much time on this, and I still can't figure it out.

Xavier

From ramercer at gmail.com Tue Nov 20 09:03:35 2007
From: ramercer at gmail.com (Adam Mercer)
Date: Tue, 20 Nov 2007 09:03:35 -0500
Subject: [SciPy-user] Building with a non-standard gfortran
Message-ID: <799406d60711200603t8e451cah2e13add2bd794a11@mail.gmail.com>

Hi

Using MacPorts, the gfortran compiler, from gcc-4.2.x, is installed as gfortran-mp-4.2.
I have been trying to build scipy with this compiler and found that numpy needs to be built and installed with

$ python setup.py config_fc --fcompiler gnu95 \
    --f77exec /opt/local/bin/gfortran-mp-4.2 \
    --f90exec /opt/local/bin/gfortran-mp-4.2 build
$ python setup.py install --prefix=${NUMPY_LOCATION}

however using the above for scipy fails on the install phase with it being unable to locate a fortran compiler. I therefore need to install scipy with

$ python setup.py config_fc --fcompiler gnu95 \
    --f77exec /opt/local/bin/gfortran-mp-4.2 \
    --f90exec /opt/local/bin/gfortran-mp-4.2 install --prefix=${SCIPY_LOCATION}

Is this expected? I would have expected install to inherit the options that were used during the build phase.

Cheers

Adam

PS: I'm using numpy-1.0.4 and scipy-0.6.0

From aisaac at american.edu Tue Nov 20 09:51:59 2007
From: aisaac at american.edu (Alan G Isaac)
Date: Tue, 20 Nov 2007 09:51:59 -0500
Subject: [SciPy-user] Adding pysamplerate to scikits ?
In-Reply-To: <4742546A.7090200@ar.media.kyoto-u.ac.jp>
References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> <20071119142042.GA21903@localhost.cc.columbia.edu> <4742546A.7090200@ar.media.kyoto-u.ac.jp>
Message-ID:

On Tue, 20 Nov 2007, David Cournapeau apparently wrote:
> I was mixing SRC and sndfile licenses (both are by Erik
> Castro de Lopo). The plan is to do as Robert Kern
> suggested: release the wrapper with the same license as
> the wrapped library (GPL here).

If you anticipate a reduction in usefulness from this, why not explain that and ask Erik for a release under the LGPL? Licenses are not a "given", especially when the number of developers is small enough to feasibly agree on an additional license.
Cheers,
Alan Isaac

From timmichelsen at gmx-topmail.de Tue Nov 20 11:47:21 2007
From: timmichelsen at gmx-topmail.de (Tim Michelsen)
Date: Tue, 20 Nov 2007 16:47:21 +0000 (UTC)
Subject: [SciPy-user] help on array manipulation
Message-ID:

Hello,

I have the following array:

In [240]: z
Out[240]:
array([[ 15., 3., 7., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
       [ 16., 3., 7., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
       [ 17., 3., 7., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]])

How do I get it to become like:

15 3
15 7
15 0
...
16 3
16 7
...
17 3

Is there a special function I can use for this, or do I need to build various for loops to achieve this? I mean, this example has little data, but what if the amount increases? I am stuck here.

I'd be very glad if anyone could point me to a good tutorial for array manipulation. The Scipy Cookbook is not enough. Well, I also found:

http://numpy.scipy.org/numpydoc
and
http://www.penzilla.net/tutorials/python/numeric

Thanks in advance,
Timmie

###
# Code for the array:
###
z = numpy.zeros((3,25))
for i in xrange(z.shape[1]):
    z[:,0] = 1
    z[:,1] = 3
    z[:,2] = 7
for i in xrange(z.shape[0]):
    z[0,0] = 15
    z[1,0] = 16
    z[2,0] = 17

From timmichelsen at gmx-topmail.de Tue Nov 20 17:58:07 2007
From: timmichelsen at gmx-topmail.de (Tim Michelsen)
Date: Tue, 20 Nov 2007 22:58:07 +0000 (UTC)
Subject: [SciPy-user] reading data with gaps and converting
Message-ID:

Hello,

I encountered another difficulty converting my files from one format to the other. Below you may find a simplified example. When I try to load, for instance, row 1 of the input file, pylab.load or scipy.io.read_array complains that the index is out of range or that a sequence cannot be converted into an array. I assume that the problem is the cells without any data. The file I use was exported from Excel.
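[Returning to the array-manipulation question above (pairing the first column of z with every remaining value in its row): a loop-free sketch using repeat and ravel. The array z is rebuilt here from the code given in that question.]

```python
import numpy as np

# Rebuild the 3x25 example array from the question.
z = np.zeros((3, 25))
z[:, 1] = 3
z[:, 2] = 7
z[:, 0] = [15, 16, 17]

# Repeat each row label (first column) once per remaining column, and
# pair it with the rest of that row, flattened row by row.
pairs = np.column_stack((np.repeat(z[:, 0], z.shape[1] - 1),
                         z[:, 1:].ravel()))
```

This scales to any number of rows and columns without explicit loops; the first 24 output rows carry the label 15, the next 24 the label 16, and so on.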
There may be some of you who have had to deal with similar format conversion issues. Therefore I would be very happy if someone could point me to a method for reading such structured data efficiently into an array.

Thanks in advance.

Kind regards,
Tim

reduced example data:

input file structure

parameter1 parameter1 parameter2 parameter2
year month day hour1 hour2 hour1 hour2
2007 11 20 8 2
2007 11 21 9 3 4 4
2007 11 22 2 8 4 8
2007 11 23 3 3 6 51
2007 11 24 1 72 34
2007 11 25 7 9 34 24
2007 11 26 6 65 12 1
2007 11 27 7 0 19 2
2007 11 28 44 21 2

output file structure

year month day hour parameter1 parameter2
2007 11 20 1 nodata 2
2007 11 20 2 8 nodata
2007 11 21 1 9 4
2007 11 21 2 3 4
2007 11 22 1 2 4
2007 11 22 2 8 8

From stuwilkins at mac.com Tue Nov 20 18:09:37 2007
From: stuwilkins at mac.com (Stuart Wilkins)
Date: Tue, 20 Nov 2007 15:09:37 -0800
Subject: [SciPy-user] Questions on leastsq
Message-ID: <7D7AE45E-0116-1000-8386-17482B0ECB9E-Webmail-10020@mac.com>

Hi,

I am having problems with the least squares fitting. This may or may not be a bug. I am trying to get an idea of the errors from fitting data. I am usually fitting to Gaussian, Lorentzian, Lor**2, etc. When I try to compute the errors from sigma = sqrt(diag(cov_x)), where cov_x is the covariance matrix, I get results which are wildly different from anything before, including some code I had a long time ago in Matlab. Does anyone have any idea if the form of cov_x is correct, or any success in using this?

Thanks,
Stuart

From anand.prabhakar.patil at gmail.com Tue Nov 20 18:30:01 2007
From: anand.prabhakar.patil at gmail.com (Anand Patil)
Date: Tue, 20 Nov 2007 15:30:01 -0800
Subject: [SciPy-user] Sparse matrix advice
Message-ID: <2bc7a5a50711201530l25b3b813j673be7620c1e8154@mail.gmail.com>

Hi all,

I'm looking for a 'row-ish' sparse matrix format that I can operate on (including making new nonzero elements) quickly in C++ or Pyrex. Could anyone recommend one?
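[The 'row-ish' layout asked about here can be prototyped in a few lines of pure Python: one dict of {column: value} per row. This is only an illustration of why row-wise insertion is cheap in such a format, with invented names; it is not scipy's actual LIL implementation.]

```python
class RowSparse:
    """Row-oriented sparse matrix: one {col: value} dict per row.

    Inserting a new nonzero is a single dict store in that row's dict,
    with no copying or shifting of the other rows -- the property that
    makes row-oriented formats attractive for incremental construction.
    """
    def __init__(self, nrows, ncols):
        self.shape = (nrows, ncols)
        self.rows = [dict() for _ in range(nrows)]

    def __setitem__(self, ij, value):
        i, j = ij
        self.rows[i][j] = value

    def __getitem__(self, ij):
        i, j = ij
        return self.rows[i].get(j, 0.0)   # absent entries read as zero

    def nnz(self):
        return sum(len(row) for row in self.rows)

A = RowSparse(3, 3)
A[0, 0] = 1.0
A[1, 2] = 5.0   # creating a new nonzero touches only row 1
```

A C++ version of the same idea would use a vector of per-row containers, which avoids the per-element Python object overhead worried about above.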
The LIL format looks nice because it can be expanded without any copying, but I'm worried that having to call PyList_GetItem (or whatever it is), inc/decreffing, and converting from Python to C integers will take up a lot of time. PyTrilinos' Epetra looks pretty cool, but I'm not sure how much distributed-memory stuff I'll be doing, and it would take me a long time to understand.

I'm thinking that a good solution might be Pyrex wrappers for matrices represented as C++ vectors of rows, which are themselves vectors; would this actually be a good solution, and is anything like this out there already?

I'm a total sparse linear algebra neophyte, so if I'm missing some boat entirely please help me get a clue.

Many thanks in advance, Anand Patil

From david at ar.media.kyoto-u.ac.jp Tue Nov 20 21:26:38 2007
From: david at ar.media.kyoto-u.ac.jp (David Cournapeau)
Date: Wed, 21 Nov 2007 11:26:38 +0900
Subject: [SciPy-user] Adding pysamplerate to scikits ?
In-Reply-To: References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> <20071119142042.GA21903@localhost.cc.columbia.edu> <4742546A.7090200@ar.media.kyoto-u.ac.jp>
Message-ID: <4743975E.10705@ar.media.kyoto-u.ac.jp>

Alan G Isaac wrote:
> On Tue, 20 Nov 2007, David Cournapeau apparently wrote:
>> I was mixing SRC and sndfile licenses (both are by Erik
>> Castro de Lopo). The plan is to do as Robert Kern
>> suggested: release the wrapper with the same license as
>> the wrapped library (GPL here).
>
> If you anticipate a reduction in usefulness from this,
> why not explain that and ask Erik for a release under the LGPL?
> Licenses are not a "given", especially when the number of
> developers is small enough to feasibly agree on an
> additional license.

I don't anticipate any reduction of usefulness. I was just wrong in my understanding that SRC was under the LGPL: Robert suggested licensing a wrapper under the same license as the wrapped library, whatever that license is, and I myself do not care about the licenses of my scikits.
Erik is a well-known hacker in the Linux audio programming scene, and he is certainly aware of the differences between the GPL and the LGPL. So although I agree that asking for relicensing of some parts of the code may sometimes be useful (and I did just that for my other audio-related scikit, audiolab), I think asking to go from GPL to LGPL is quite rude (because it is GPL, Erik can dual-license it to users who do not wish to open-source their product, like Trolltech does with Qt; with the LGPL, not really).

cheers, David

> Cheers,
> Alan Isaac
>
> _______________________________________________
> SciPy-user mailing list
> SciPy-user at scipy.org
> http://projects.scipy.org/mailman/listinfo/scipy-user

From aisaac at american.edu Tue Nov 20 23:18:30 2007
From: aisaac at american.edu (Alan G Isaac)
Date: Tue, 20 Nov 2007 23:18:30 -0500
Subject: [SciPy-user] Adding pysamplerate to scikits ?
In-Reply-To: <4743975E.10705@ar.media.kyoto-u.ac.jp>
References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> <20071119142042.GA21903@localhost.cc.columbia.edu> <4742546A.7090200@ar.media.kyoto-u.ac.jp> <4743975E.10705@ar.media.kyoto-u.ac.jp>
Message-ID:

On Wed, 21 Nov 2007, David Cournapeau apparently wrote:
> I think asking to go from GPL to LGPL is quite rude
> (because it is GPL, Erik can dual license it to users who
> do not wish to open source their product, like Trolltech
> does with QT; with LGPL, not really)

Yes: if the GPL was chosen for that reason, I agree. However, many projects have chosen GPL by default, without much thought about licensing. (This seems to be getting rarer.)
Cheers, Alan Isaac

From gael.varoquaux at normalesup.org Wed Nov 21 02:45:22 2007
From: gael.varoquaux at normalesup.org (Gael Varoquaux)
Date: Wed, 21 Nov 2007 08:45:22 +0100
Subject: [SciPy-user] organizing data
In-Reply-To: <4742C339.8010308@cmla.ens-cachan.fr>
References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> <20071119142042.GA21903@localhost.cc.columbia.edu> <4742546A.7090200@ar.media.kyoto-u.ac.jp> <4742C339.8010308@cmla.ens-cachan.fr>
Message-ID: <20071121074522.GA14595@clipper.ens.fr>

> My data are the variation of 11 parameters, with 18 values for each, so
> you can see them as a hypercube of 11 dimensions. My goal is the
> possibility to plot with gnuplot in 2D (or 3D). So I have to sort my
> data into slices with 9 (or 8) parameters held constant.
> That's my problem: it is not really a "hyper"matrix of data, because I
> did not discretize the parameter space equally for each parameter.
> So I think I have to sort and then recursively cut into the mass so I
> can have small arrays I can plot with gnuplot.py

Hi Xavier,

Can you provide an example of your data? If I understand properly, you have non-regularly gridded data that you want to interpolate on a regular grid (though only through a cut) to plot as images. If I get it right, that does not sound like an incredibly easy problem. There are algorithms to do this, but I don't think they are wrapped in scipy. VTK can do this, but AFAIK, only in 3D. But hopefully your problem is simpler, and if you show us the data we'll understand it better.

Cheers, Gaël

From dmitrey.kroshko at scipy.org Wed Nov 21 03:41:12 2007
From: dmitrey.kroshko at scipy.org (dmitrey)
Date: Wed, 21 Nov 2007 10:41:12 +0200
Subject: [SciPy-user] a small OpenOpt example howto benchmark solvers
Message-ID: <4743EF28.4060707@scipy.org>

Hi all, I have got a letter from Nils Wagner about interest in an NLP solver comparison, and maybe some scipy-user mailing list subscribers could be interested as well.
So if anyone is interested, here's an example of an ALGENCAN, cobyla and lincher comparison: http://openopt.blogspot.com/2007/11/example-of-oo-nlp-solvers-benchmark.html

Please let me note that usually ALGENCAN is much better, and that choosing correct stop criteria for your problem (gradtol, ftol, xtol, etc.) is essential.

Regards, D

From boris at vonloesch.de Wed Nov 21 03:54:38 2007
From: boris at vonloesch.de (Boris von Loesch)
Date: Wed, 21 Nov 2007 09:54:38 +0100
Subject: [SciPy-user] Sparse matrix advice
In-Reply-To: <2bc7a5a50711201530l25b3b813j673be7620c1e8154@mail.gmail.com>
References: <2bc7a5a50711201530l25b3b813j673be7620c1e8154@mail.gmail.com>
Message-ID: <41a2e2dc0711210054l476115d5l91054cfadb548f85@mail.gmail.com>

Hi Anand,

I use the COO format for the creation of the matrix, because it can be assembled quite fast. A second advantage is that you need to create the sparsity pattern only the first time; afterwards you just create a vector with the non-zero entries. For calculations I convert it to CSR format, which is also quite fast. With this method I can create a 255^2*255^2 sparse matrix with roughly 8 non-zero entries per row in about 0.25 sec. Creating the matrix in LIL or DOK format is a lot slower.

Best regards, Boris

On Nov 21, 2007 12:30 AM, Anand Patil wrote:
> Hi all,
>
> I'm looking for a 'row-ish' sparse matrix format that I can operate on
> (including making new nonzero elements) quickly in C++ or Pyrex. Could
> anyone recommend one?
>
> The LIL format looks nice because it can be expanded without any
> copying, but I'm worried that having to call PyList_GetItem (or
> whatever it is), inc/decreffing and converting from Python to C
> integers will take up a lot of time. PyTrilinos' Epetra looks pretty
> cool but I'm not sure how much distributed memory stuff I'll be doing
> and it would take me a long time to understand.
> > I'm thinking that a good solution might be Pyrex wrappers for matrices > represented as C++ vectors of rows, which are themselves vectors; > would this actually be a good solution, and is anything like this out > there already? > > I'm a total sparse linear algebra neophyte, so if I'm missing some > boat entirely please help me get a clue. > > Many thanks in advance, > Anand Patil > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From gael.varoquaux at normalesup.org Wed Nov 21 05:16:50 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Wed, 21 Nov 2007 11:16:50 +0100 Subject: [SciPy-user] organizing data In-Reply-To: <20071121074522.GA14595@clipper.ens.fr> References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> <20071119142042.GA21903@localhost.cc.columbia.edu> <4742546A.7090200@ar.media.kyoto-u.ac.jp> <4742C339.8010308@cmla.ens-cachan.fr> <20071121074522.GA14595@clipper.ens.fr> Message-ID: <20071121101650.GE14595@clipper.ens.fr> On Wed, Nov 21, 2007 at 08:45:22AM +0100, Gael Varoquaux wrote: > If I understand properly, you have non-regular gridded data that you want > to interpolate on a regular grid (though only through a cut) to plot as > images. If I get it right, that does not sound an incredibly easy > problem. There are algorithmes to do this, but I don't think there are > wrapped in scipy. VTK but do this, but AFAIK, only in 3D. I am stupid. Of course there are things in scipy ! First you can look at a Delaunay natural neighbor interpolation (http://www.scipy.org/Cookbook/Matplotlib/Gridding_irregularly_spaced_data). Unfortunately I think the algorithm in Scipy can only do this for 2D data. Robert Kern will chime in, when he wakes up. Second, you can use radial basis function for interpolation, see http://www.scipy.org/Cookbook/RadialBasisFunctions Unfortunately, you will have to rebuild scipy for this. 
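The radial-basis-function idea itself needs no special build to try out; here is a minimal NumPy-only sketch with a Gaussian kernel (an illustration of the method, not the cookbook or sandbox code, and the width parameter `eps` is an arbitrary choice):

```python
import numpy as np

# 1D scattered data to interpolate.
x = np.linspace(0.0, 1.0, 8)
y = np.sin(2.0 * np.pi * x)

eps = 5.0  # kernel width parameter (an arbitrary choice for this sketch)

def gauss_kernel(a, b):
    """Gaussian RBF kernel matrix between two sets of 1D points."""
    return np.exp(-(eps * (a[:, None] - b[None, :])) ** 2)

# Solve for weights so the interpolant passes through the data points.
w = np.linalg.solve(gauss_kernel(x, x), y)

# Evaluate the interpolant on a finer grid.
xi = np.linspace(0.0, 1.0, 50)
yi = gauss_kernel(xi, x).dot(w)
```

The same construction works in any dimension by replacing the 1D distance with a Euclidean distance, which is why RBFs are attractive for scattered, non-gridded data; the kernel matrix can become ill-conditioned for many points or small `eps`, though.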
To everybody: is it possible to change the sandbox so that when scipy is built, all the packages in the sandbox are built, but the build doesn't complain if it fails for these (or you need a special switch to have it complain)? There are some really nice packages in the sandbox, but most people cannot build them because they don't know how to. IMHO it is better to have the binary packages of scipy ship with the fraction of the sandbox that builds, even if it might not pass tests, than to have these packages simply not available for users who cannot build.

Cheers, Gaël

From j.reid at mail.cryst.bbk.ac.uk Wed Nov 21 06:11:17 2007
From: j.reid at mail.cryst.bbk.ac.uk (John Reid)
Date: Wed, 21 Nov 2007 11:11:17 +0000
Subject: [SciPy-user] Place data on grid to minimise cost function
Message-ID:

I have a problem for which I do not know any algorithm to solve it. I'm wondering if there is something in scipy that will help me. Apologies if there isn't and this is off-topic.

I have a set of N data points, which I want to place on a k x m grid. I have a cost function D(x1,x2) that measures how different 2 data points are. I want to minimise the sum of D over all neighbouring pairs of points on the grid (i.e. I want similar points to be neighbours on the grid).

A) Is there some established technique to do this?
B) Is it in scipy?

BTW, N = k.m

Thanks, John.

From aisaac at american.edu Wed Nov 21 09:47:35 2007
From: aisaac at american.edu (Alan G Isaac)
Date: Wed, 21 Nov 2007 09:47:35 -0500
Subject: [SciPy-user] a small OpenOpt example howto benchmark solvers
In-Reply-To: <4743EF28.4060707@scipy.org>
References: <4743EF28.4060707@scipy.org>
Message-ID:

On Wed, 21 Nov 2007, dmitrey apparently wrote:
> http://openopt.blogspot.com/2007/11/example-of-oo-nlp-solvers-benchmark.html

What problems did you run into with matplotlib? Did you bring them to the matplotlib list? (The developers are extremely responsive.)
Cheers, Alan Isaac

From yfan at emory.edu Wed Nov 21 09:50:29 2007
From: yfan at emory.edu (Ying Wai (Daniel) Fan)
Date: Wed, 21 Nov 2007 09:50:29 -0500
Subject: [SciPy-user] Any image deblurring and wavelets packages?
Message-ID: <474445B5.1070201@emory.edu>

Hi All,

I am a graduate student working on image deblurring. Is there any Python package doing that? I am also interested in wavelets. The PyWavelets package hasn't been updated for a year. Is it still being developed? Are there other Python wavelets packages?

I want to persuade my adviser to switch from Matlab to Scipy. Without an image deblurring package/toolbox for Scipy, it is hard for him to switch. I am planning to rewrite some of his old code in Scipy over the break and release it to the public. I just want to know if there is already some image deblurring Python code out there?

Regards, Daniel

From lev at columbia.edu Wed Nov 21 10:04:48 2007
From: lev at columbia.edu (Lev Givon)
Date: Wed, 21 Nov 2007 10:04:48 -0500
Subject: [SciPy-user] Any image deblurring and wavelets packages?
In-Reply-To: <474445B5.1070201@emory.edu>
References: <474445B5.1070201@emory.edu>
Message-ID: <20071121150448.GA6392@localhost.cc.columbia.edu>

Received from Ying Wai (Daniel) Fan on Wed, Nov 21, 2007 at 09:50:29AM EST:
> Hi All,
>
> I am a graduate student working on image deblurring. Is there any Python
> package doing that?
> I am also interested in wavelets. The PyWavelets package hasn't been
> updated for a year. Is it still being developed? Are there other Python
> wavelets packages?

The pywavelets code is still being actively developed: http://wavelets.scipy.org

L.G.
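On the wavelets side, the basic building block is easy to try directly in NumPy. Here is a single-level orthonormal Haar transform written by hand, purely as an illustration (PyWavelets provides this and far more; none of the helper names below are its API):

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal Haar wavelet transform."""
    x = np.asarray(x, dtype=float)
    s = np.sqrt(2.0)
    approx = (x[0::2] + x[1::2]) / s   # smooth (average) coefficients
    detail = (x[0::2] - x[1::2]) / s   # detail (difference) coefficients
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one Haar step, recovering the original signal exactly."""
    s = np.sqrt(2.0)
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / s
    x[1::2] = (approx - detail) / s
    return x

signal = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0])
a, d = haar_step(signal)
rec = haar_inverse(a, d)
# rec reproduces signal to floating-point accuracy
```

Applying `haar_step` recursively to the `approx` part gives the usual multilevel decomposition, which is the starting point for wavelet-based deblurring and denoising schemes.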
From xavier.barthelemy at cmla.ens-cachan.fr Wed Nov 21 10:50:21 2007
From: xavier.barthelemy at cmla.ens-cachan.fr (Xavier Barthelemy)
Date: Wed, 21 Nov 2007 16:50:21 +0100
Subject: [SciPy-user] Place data on grid to minimise cost function
In-Reply-To: References:
Message-ID: <474453BD.20706@cmla.ens-cachan.fr>

John Reid a écrit :
> I have a problem for which I do not know any algorithm to solve it. I'm
> wondering if there is something in scipy that will help me. Apologies if
> there isn't and this is off-topic.
>
> I have a set of N data points, which I want to place on a k x m grid. I
> have a cost function D(x1,x2) that measures how different 2 data points
> are. I want to minimise the sum of D over all neighbouring pairs of
> points on the grid (i.e. I want similar points to be neighbours on the
> grid).
>
> A) Is there some established technique to do this?

Yes. It is called (un)constrained optimization, or optimal control, or, in physics, inverse problems. You can see Lions et al. (he received a Fields Medal for that) for a bibliography. To make an optimal choice you will have to minimize or maximize the cost function with(out) constraints: you'll have to set the derivative of the cost function with respect to each parameter to zero. Basically, you have two choices:

The sensitivity method, where you manually (good!) or numerically (very bad!) calculate the Fréchet derivative of the cost function, i.e. the derivative for each parameter with respect to every other parameter, for every single evaluation of your cost function. It means that for a k*m grid, you will need a (k*m)**2 matrix which represents the derivative of the cost function evaluated at one point.

The adjoint method (very good, very modern, less computation, numerically stable), where you evaluate the cost-function derivative with the adjoint operator (mathematically derived (better) or numerically computed (less accurate)).
It is the best method and the most stable, because it replaces an ill-posed problem by a sequence of well-posed problems, in the Hadamard sense.

> B) Is it in scipy?

This one, I don't know :)

>
> BTW, N = k.m
>
> Thanks,
> John.

cheers Xavier

From gael.varoquaux at normalesup.org Wed Nov 21 10:54:57 2007
From: gael.varoquaux at normalesup.org (Gael Varoquaux)
Date: Wed, 21 Nov 2007 16:54:57 +0100
Subject: [SciPy-user] organizing data
In-Reply-To: <474447DB.6030204@cmla.ens-cachan.fr>
References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> <20071119142042.GA21903@localhost.cc.columbia.edu> <4742546A.7090200@ar.media.kyoto-u.ac.jp> <4742C339.8010308@cmla.ens-cachan.fr> <20071121074522.GA14595@clipper.ens.fr> <20071121101650.GE14595@clipper.ens.fr> <474447DB.6030204@cmla.ens-cachan.fr>
Message-ID: <20071121155457.GA2447@clipper.ens.fr>

Let's keep this on the mailing list; or else, we can switch to French (not that I mind English).

On Wed, Nov 21, 2007 at 03:59:39PM +0100, Xavier Barthelemy wrote:
> So I think you did not understand my problem. It's a problem of
> organizing and cutting my data by slices of rows.

OK, you're telling me you have already interpolated your data, and you have all the values you need (i.e. regular grids in the different directions you are interested in), but you just need to sort them out.

> My data comprises 18 values which depend on 12 parameters.
> So I have my data like that, by rows:
> first the 12 parameters, then the 18 values.
> Each row represents a numerical experiment. I am doing parameter
> exploration, so each varies independently. But the discretization of the
> parameter space is not "regular": I have refined it for some values
> of the other parameters.

If you want to do image plots, it is much easier to have regular data; that's where interpolation comes in. It looks to me like you want to have an interpolation function f({P}) -> {V}, where {P} is your set of parameters and {V} your set of values.
When you want to plot the cut along a hyperplane HP, you simply choose a regular grid on this hyperplane, and apply your interpolation function f on it. That's how I would do it, if I understand your problem correctly.

> So my problem is really what I have (badly, I guess) explained: I would
> like to plot 2D (and 3D) graphs.
> Let's suppose that I have corresponding X, Y and Z data. Knowing that,
> how do you plot Y(X) with Z constant when you have a bunch of
> three-column data?

In 1D that's really easy: if D is your [X, Y, Z] array, I would do

X = D[:, 0]
Y = D[:, 1]
Z = D[:, 2]

# Select all the data for Z = z0
x = X[Z==z0]
y = Y[Z==z0]

# Sort, just to make things prettier
y = y[argsort(x)]
x.sort()

plot(x, y)

> You'll have to first sort by Z, and then by X. So when you plot
> that, for each "X", sequentially sorted, you'll have the different "Y"
> for each "Z" value. Consequently you will have as many plots as
> different Z values.

OK, you want to do this for all values of Z.

> And my problem now is the generalization with 12 parameters, let's name
> them from A to J. How will I do it if I want to plot F(H)? The same: I will
> sort by each of the 10 parameters and finally by H, and I will have a
> family of plots.

Yes, you can do this in a nice way using an array with fields and the "order" argument to sort.

> But now I want to cut (slice) them to have each plot independently, and
> I can plot them with the interfaced gnuplot.

You can always generalize the cutting method used in my example. If U, V, W are parameters (similar to Z in the example above), you can define a mask array:

mask = (U == u0) & (V == v0) & (W == w0)

# You don't really need x and y arrays, you could go directly to xy
x = X[mask]
y = Y[mask]
xy = empty(x.shape, dtype=dtype([('x','float'), ('y','float')]))
xy['x'] = x
xy['y'] = y

xy.sort(order=('x', 'y'))

then you have in xy['x'] and xy['y'] what you are interested in, if I understand things right.
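Made concrete on toy data, the mask-and-sort recipe above runs as follows (a sketch; the variable names follow the snippets in this message):

```python
import numpy as np

# Toy three-column data: X, Y, and a parameter Z taking two values.
D = np.array([[3.0, 9.0, 1.0],
              [1.0, 1.0, 1.0],
              [2.0, 4.0, 0.0],
              [0.0, 0.0, 1.0]])
X, Y, Z = D[:, 0], D[:, 1], D[:, 2]

# Select all the data for Z == z0 ...
z0 = 1.0
x = X[Z == z0]
y = Y[Z == z0]

# ... and sort the (x, y) pairs together using a structured array.
xy = np.empty(x.shape, dtype=[('x', 'float'), ('y', 'float')])
xy['x'] = x
xy['y'] = y
xy.sort(order=('x', 'y'))
# xy['x'] is now [0., 1., 3.] and xy['y'] is [0., 1., 9.]
```

The structured-array sort keeps each x paired with its y, which is exactly what plotting a slice requires.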
But I still think you need a regular grid: either your data already has that structure, and I don't understand why it has been "shuffled", or it hasn't, and you'll need interpolation, so why bother sorting? (Selecting might be useful to reduce the number of points.)

HTH, Gaël

From josegomez at gmx.net Wed Nov 21 10:59:57 2007
From: josegomez at gmx.net (Jose Luis Gomez Dans)
Date: Wed, 21 Nov 2007 16:59:57 +0100
Subject: [SciPy-user] Is scipy.io.numpyio.fread working?
Message-ID: <20071121155957.309730@gmx.net>

Hi, I am trying to read binary data from a binary file. The file has a header, and several chunks of data in it. I want to seek to some position, and read X items of data from there into an array. So, I have done the following:

fp = open( file )
fp.seek( header_position )  # My header position
a = scipy.io.numpyio.fread(fp, 2400*2400, 'f', byteswap=1)

This bombs with a : fread() takes no keyword arguments

I don't think this has to do with the fact that I seeked in the file. I am running scipy.__version__="0.6.0" from Andrew Straw's Ubuntu repository on Ubuntu i386. Is there something wrong here?

Cheers! Jose
--
GMX FreeMail: 1 GB Postfach, 5 E-Mail-Adressen, 10 Free SMS.
Alle Infos und kostenlose Anmeldung: http://www.gmx.net/de/go/freemail

From xavier.barthelemy at cmla.ens-cachan.fr Wed Nov 21 11:24:10 2007
From: xavier.barthelemy at cmla.ens-cachan.fr (Xavier Barthelemy)
Date: Wed, 21 Nov 2007 17:24:10 +0100
Subject: [SciPy-user] organizing data
In-Reply-To: <20071121155457.GA2447@clipper.ens.fr>
References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> <20071119142042.GA21903@localhost.cc.columbia.edu> <4742546A.7090200@ar.media.kyoto-u.ac.jp> <4742C339.8010308@cmla.ens-cachan.fr> <20071121074522.GA14595@clipper.ens.fr> <20071121101650.GE14595@clipper.ens.fr> <474447DB.6030204@cmla.ens-cachan.fr> <20071121155457.GA2447@clipper.ens.fr>
Message-ID: <47445BAA.7070406@cmla.ens-cachan.fr>

Gael Varoquaux a écrit :
> Let's keep this on the mailing list, or else, we can switch to French
> (not that I mind English).

Sorry, I replied without looking at the email address.

> On Wed, Nov 21, 2007 at 03:59:39PM +0100, Xavier Barthelemy wrote:
>> So I think you did not understand my problem. It's a problem of
>> organizing and cutting my data by slices of rows.
>
> OK, you're telling me you already have interpolated your data, and you
> have all the values you need (ie regular grids in the different
> directions you are interested in), but you just need to sort them out.

No, I did not interpolate my data, because you cannot. I am looking for sub-critical instabilities in binary, unsteady, miscible, compressible, viscous fluids, so each numerical experiment is the result of (sometimes) days or hours of computation on a high-performance parallel computer.

>> My data comprises 18 values which depend on 12 parameters.
>
>> So I have my data like that, by rows:
>> first the 12 parameters, then the 18 values.
>
>> Each row represents a numerical experiment. I am doing parameter
>> exploration, so each varies independently. But the discretization of the
>> parameter space is not "regular": I have refined it for some values
>> of the other parameters.
> If you want to do image plots, it is much easier to have regular
> data; that's where interpolation comes in. It looks to me like you want
> to have an interpolation function f({P}) -> {V} where {P} is your set of
> parameters and {V} your set of values. When you want to plot the cut
> along a hyperplane HP, you simply choose a regular grid on this hyper-
> plane, and apply your interpolation function f on it. That's how I would
> do it, if I understand your problem correctly.

Yes, you're right, I would do it like that, but I can't: too much computation. I am refining some zones of the parameter space independently.

>> So my problem is really what I have (badly, I guess) explained: I would
>> like to plot 2D (and 3D) graphs.
>> Let's suppose that I have corresponding X, Y and Z data. Knowing that,
>> how do you plot Y(X) with Z constant when you have a bunch of
>> three-column data?
>
> In 1D that's really easy: if D is your [X, Y, Z] array, I would do
>
> X = D[:, 0]
> Y = D[:, 1]
> Z = D[:, 2]
>
> # Select all the data for Z = z0
> x = X[Z==z0]
> y = Y[Z==z0]
>
> # Sort, just to make things prettier
> y = y[argsort(x)]
> x.sort()
>
> plot(x, y)

I will try it. Maybe I have been thinking about it in too complicated a way.

>> You'll have to first sort by Z, and then by X. So when you plot
>> that, for each "X", sequentially sorted, you'll have the different "Y"
>> for each "Z" value. Consequently you will have as many plots as
>> different Z values.
>
> OK, you want to do this for all values of Z.
>
>> And my problem now is the generalization with 12 parameters, let's name
>> them from A to J. How will I do it if I want to plot F(H)? The same: I will
>> sort by each of the 10 parameters and finally by H, and I will have a
>> family of plots.
>
> Yes, you can do this in a nice way using an array with fields and the
> "order" argument to sort.
>
>> But now I want to cut (slice) them to have each plot independently, and
>> I can plot them with the interfaced gnuplot.
> You can always generalize the cutting method used in my example. If U, V,
> W are parameters (similar to Z in the example above), you can define a
> mask array:
>
> mask = (U == u0) & (V == v0) & (W == w0)
>
> # You don't really need x and y arrays, you could go directly to xy
> x = X[mask]
> y = Y[mask]
> xy = empty(x.shape, dtype=dtype([('x','float'), ('y','float')]))
> xy['x'] = x
> xy['y'] = y
>
> xy.sort(order=('x', 'y'))
>
> then you have in xy['x'] and xy['y'] what you are interested in, if I
> understand things right.

Sounds good.

> But I still think you need a regular grid, so either your data already
> has that structure, and I don't understand why it has been "shuffled", or
> it hasn't, and you'll need interpolating, so why bother sorting?
> (selecting might be useful to reduce the amount of points).

They are "shuffled" because I did not save a database with the logical links. So if I want the 7th column as a function of the 5th, they are "shuffled".

Thanks for your help, I'll try it just now.

cheers Xavier

From josegomez at gmx.net Wed Nov 21 11:25:43 2007
From: josegomez at gmx.net (Jose Luis Gomez Dans)
Date: Wed, 21 Nov 2007 17:25:43 +0100
Subject: [SciPy-user] Is scipy.io.numpyio.fread working?
In-Reply-To: <20071121155957.309730@gmx.net>
References: <20071121155957.309730@gmx.net>
Message-ID: <20071121162543.8250@gmx.net>

Hi,

-------- Original-Nachricht --------
> a = scipy.io.numpyio.fread(fp, 2400*2400,'f',byteswap=1)
>
> This bombs with a
> : fread() takes no keyword arguments

The problem might be in the docstring. If I just do

a = scipy.io.numpyio.fread(fp, 2400*2400, 'f').byteswap()

it works perfectly. However, the docstring appears to be from an older version of scipy where byteswapping was indeed part of fread. Is it a bug?

Thanks, Jose
--
Ist Ihr Browser Vista-kompatibel?
Jetzt die neuesten Browser-Versionen downloaden: http://www.gmx.net/de/go/browser

From gael.varoquaux at normalesup.org Wed Nov 21 11:47:27 2007
From: gael.varoquaux at normalesup.org (Gael Varoquaux)
Date: Wed, 21 Nov 2007 17:47:27 +0100
Subject: [SciPy-user] organizing data
In-Reply-To: <47445BAA.7070406@cmla.ens-cachan.fr>
References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> <20071119142042.GA21903@localhost.cc.columbia.edu> <4742546A.7090200@ar.media.kyoto-u.ac.jp> <4742C339.8010308@cmla.ens-cachan.fr> <20071121074522.GA14595@clipper.ens.fr> <20071121101650.GE14595@clipper.ens.fr> <474447DB.6030204@cmla.ens-cachan.fr> <20071121155457.GA2447@clipper.ens.fr> <47445BAA.7070406@cmla.ens-cachan.fr>
Message-ID: <20071121164727.GE2447@clipper.ens.fr>

On Wed, Nov 21, 2007 at 05:24:10PM +0100, Xavier Barthelemy wrote:
> no, I did not interpolate my data, because you cannot. I am looking for
> sub-critical instabilities in binary instationary miscible
> compressible viscous fluids
> so each numerical experiment is the result
> of (sometimes) days or hours of computation on a high-performance
> parallel computer.

I am talking about interpolating using the results of these experiments, not about redoing the calculations. Just think of your results as data points, and you need to do interpolation on these data points. If you don't have regularly gridded data, your plotting software will do interpolation (if it knows how) to plot it as an image.

HTH, Gaël

From robert.kern at gmail.com Wed Nov 21 13:37:01 2007
From: robert.kern at gmail.com (Robert Kern)
Date: Wed, 21 Nov 2007 12:37:01 -0600
Subject: [SciPy-user] organizing data
In-Reply-To: <20071121101650.GE14595@clipper.ens.fr>
References: <4740FF05.4070302@ar.media.kyoto-u.ac.jp> <20071119142042.GA21903@localhost.cc.columbia.edu> <4742546A.7090200@ar.media.kyoto-u.ac.jp> <4742C339.8010308@cmla.ens-cachan.fr> <20071121074522.GA14595@clipper.ens.fr> <20071121101650.GE14595@clipper.ens.fr>
Message-ID: <47447ACD.7000608@gmail.com>

Gael Varoquaux wrote:
> On Wed, Nov 21, 2007 at 08:45:22AM +0100, Gael Varoquaux wrote:
>> If I understand properly, you have non-regularly gridded data that you want
>> to interpolate on a regular grid (though only through a cut) to plot as
>> images. If I get it right, that does not sound like an incredibly easy
>> problem. There are algorithms to do this, but I don't think they are
>> wrapped in scipy. VTK can do this, but AFAIK, only in 3D.
>
> I am stupid. Of course there are things in scipy!
>
> First you can look at a Delaunay natural neighbor interpolation
> (http://www.scipy.org/Cookbook/Matplotlib/Gridding_irregularly_spaced_data).
> Unfortunately I think the algorithm in Scipy can only do this for 2D
> data. Robert Kern will chime in, when he wakes up.

You are correct.
> To everybody, is it possible to change the sandbox so that when scipy is > built, all the packages in the sandbox are built, but the build doesn't > complain if it fails for these (or you need a special switch to have it > complain). There are some really nice packages in the sandbox, but most > people cannot build them because they don't know how to build. IMHO it is > better to have the binary packages of scipy shipping with the fraction of > the sandbox that builds, and might not even pass tests, than to have > these packages simply not available for users who cannot build. Jarrod and I came to the conclusion that the sandbox packages will be moved out of the sandbox. Packages that just need a home could go to scikits instead. Packages intended for scipy itself can live under branches/ temporarily as a standalone package until they are approved for inclusion. Maintainers who want to build binaries of them can do so. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From nwagner at iam.uni-stuttgart.de Wed Nov 21 14:04:07 2007 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 21 Nov 2007 20:04:07 +0100 Subject: [SciPy-user] a small OpenOpt example howto benchmark solvers In-Reply-To: <4743EF28.4060707@scipy.org> References: <4743EF28.4060707@scipy.org> Message-ID: On Wed, 21 Nov 2007 10:41:12 +0200 dmitrey wrote: > Hi all, > I have got a letter from Nils Wagner about interest of >NLP solvers > comparison, and maybe some of scipy-user mailist >subscribers could be > interested as well. 
> > So if anyone is interested - here's an example of >ALGENCAN, cobyla and > lincher comparison: > http://openopt.blogspot.com/2007/11/example-of-oo-nlp-solvers-benchmark.html > > Please let me note that usually ALGENCAN is much more >better, and > choosing correct stop criteria for your problem >(gradtol, ftol, xtol > etc) is very essential. > > Regards, D > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user Hi Dmitrey, Do you have a reference for your benchmark example ? I'm curious about it. Nils From dmitrey.kroshko at scipy.org Wed Nov 21 14:11:32 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Wed, 21 Nov 2007 21:11:32 +0200 Subject: [SciPy-user] a small OpenOpt example howto benchmark solvers In-Reply-To: References: <4743EF28.4060707@scipy.org> Message-ID: <474482E4.8060105@scipy.org> Hi Nils, I suppose I have provided all required URLs. Let me mention the ones once again: blog report about results http://openopt.blogspot.com/2007/11/example-of-oo-nlp-solvers-benchmark.html and this one contains URLs to code (7th line from top, "the example"): http://projects.scipy.org/scipy/scikits/browser/trunk/openopt/scikits/openopt/examples/nlp_bench_1.py Alan G Isaac wrote: > What problems did you run into with matplotlib? > Did you bring them to the matplotlib list? > (The developers are extremely responsive.) Thank you, I'll take it into account. Regards, D. Nils Wagner wrote: > On Wed, 21 Nov 2007 10:41:12 +0200 > dmitrey wrote: > >> Hi all, >> I have got a letter from Nils Wagner about interest of >> NLP solvers >> comparison, and maybe some of scipy-user mailist >> subscribers could be >> interested as well. 
>> >> So if anyone is interested - here's an example of >> ALGENCAN, cobyla and >> lincher comparison: >> http://openopt.blogspot.com/2007/11/example-of-oo-nlp-solvers-benchmark.html >> >> Please let me note that usually ALGENCAN is much more >> better, and >> choosing correct stop criteria for your problem >> (gradtol, ftol, xtol >> etc) is very essential. >> >> Regards, D >> _______________________________________________ >> SciPy-user mailing list >> SciPy-user at scipy.org >> http://projects.scipy.org/mailman/listinfo/scipy-user >> > > > Hi Dmitrey, > > Do you have a reference for your benchmark example ? > I'm curious about it. > > Nils > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > > From nwagner at iam.uni-stuttgart.de Wed Nov 21 14:28:08 2007 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 21 Nov 2007 20:28:08 +0100 Subject: [SciPy-user] a small OpenOpt example howto benchmark solvers In-Reply-To: <474482E4.8060105@scipy.org> References: <4743EF28.4060707@scipy.org> <474482E4.8060105@scipy.org> Message-ID: On Wed, 21 Nov 2007 21:11:32 +0200 dmitrey wrote: > Hi Nils, > I suppose I have provided all required URLs. > Let me mention the ones once again: > > blog report about results > http://openopt.blogspot.com/2007/11/example-of-oo-nlp-solvers-benchmark.html > and this one contains URLs to code (7th line from top, >"the example"): > http://projects.scipy.org/scipy/scikits/browser/trunk/openopt/scikits/openopt/examples/nlp_bench_1.py > Sorry for the confusion. I was talking about a reference from the literature. 
Nils From dmitrey.kroshko at scipy.org Wed Nov 21 14:57:04 2007 From: dmitrey.kroshko at scipy.org (dmitrey) Date: Wed, 21 Nov 2007 21:57:04 +0200 Subject: [SciPy-user] a small OpenOpt example howto benchmark solvers In-Reply-To: References: <4743EF28.4060707@scipy.org> <474482E4.8060105@scipy.org> Message-ID: <47448D90.9090008@scipy.org> Nils Wagner wrote: > On Wed, 21 Nov 2007 21:11:32 +0200 > dmitrey wrote: > >> Hi Nils, >> I suppose I have provided all required URLs. >> Let me mention the ones once again: >> >> blog report about results >> http://openopt.blogspot.com/2007/11/example-of-oo-nlp-solvers-benchmark.html >> and this one contains URLs to code (7th line from top, >> "the example"): >> http://projects.scipy.org/scipy/scikits/browser/trunk/openopt/scikits/openopt/examples/nlp_bench_1.py >> >> > > Sorry for the confusion. I was talking about a reference > from the literature. > > Nils > That example is not from the literature; it is just about the first one I decided to code and benchmark (it looks very similar to nlp_2.py from /openopt/examples/, but the objective is (x-M)**1.5, not **2, to keep it from being a 100% pure quadratic function). Almost all the examples from the literature available to me are too small to be used in an OpenOpt time and cputime comparison, because (as I mentioned to you in a letter) OpenOpt needs some time to prepare the problem structure before and after the calculations; also, the stop criteria matter too much for such small problems. As you can see from the example, ALGENCAN takes many more f, c, h evaluations, yet (for *this* example) the time is almost the same. That is for the reason I mentioned (it is especially relevant for small problems), plus ALGENCAN is Fortran-written (as is cobyla), while lincher is 100% Python plus the cvxopt QP solver, which is also mostly Python-written. Regards, D.
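[Editor's note] For readers who want to reproduce the flavour of such a solver benchmark without OpenOpt: the sketch below uses scipy.optimize.minimize from modern SciPy, with COBYLA and SLSQP standing in for the ALGENCAN and lincher wrappers discussed above (which are not reproduced here), and a non-quadratic |x - M|**1.5 objective echoing dmitrey's example. The problem size and tolerances are made up for illustration, so the numbers are indicative only.

```python
# Rough solver-comparison sketch in the spirit of nlp_bench_1.py.
# Uses scipy.optimize.minimize (modern SciPy), not OpenOpt's wrappers.
import time
import numpy as np
from scipy.optimize import minimize

M = 5.0
x0 = np.zeros(4)

def f(x):
    # Non-quadratic objective, echoing the (x - M)**1.5 idea from the
    # thread; abs() keeps it real-valued below M.
    return np.sum(np.abs(x - M) ** 1.5)

results = {}
for method in ("COBYLA", "SLSQP"):
    t0 = time.time()
    res = minimize(f, x0, method=method, tol=1e-8)
    dt = time.time() - t0
    results[method] = (res.fun, res.nfev, dt)
    print("%-6s f=%.6g nfev=%d time=%.3fs" % (method, res.fun, res.nfev, dt))
```

As in the thread, the derivative-free method typically needs far more function evaluations than the gradient-based one, while wall-clock time on such a tiny problem is dominated by overhead.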
From robert.kern at gmail.com Wed Nov 21 15:17:26 2007 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 21 Nov 2007 14:17:26 -0600 Subject: [SciPy-user] help on array manipulation In-Reply-To: References: Message-ID: <47449256.4070407@gmail.com> Tim Michelsen wrote: > Hello, > > I have the following array: > > In [240]: z > Out[240]: > array([[ 15., 3., 7., 0., 0., 0., 0., 0., 0., 0., 0., > 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., > 0., 0., 0.], > [ 16., 3., 7., 0., 0., 0., 0., 0., 0., 0., 0., > 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., > 0., 0., 0.], > [ 17., 3., 7., 0., 0., 0., 0., 0., 0., 0., 0., > 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., > 0., 0., 0.]]) > > How do I get it be become like: > > 15 3 > 15 7 > 15 0 > ... > 16 3 > 16 7 > ... > 17 3 > > > Is there a special function I can use for this or do I need to build various for > loops to achive this? import numpy keys = z[:,0] values = z[:,1:] n = values.shape[-1] z2 = numpy.column_stack([numpy.repeat(keys, n), values.flat]) > I mean this example has few data. But what if the amount increases? > > I am stuck here. > > I'd be very glad if anyone could point be to a good tutorial for array > manipulation. Scipy Cookbook is not enough. > Well, I also found : > http://numpy.scipy.org/numpydoc > and > http://www.penzilla.net/tutorials/python/numeric This helps: http://www.scipy.org/Numpy_Example_List_With_Doc but honestly, it mostly comes with practice. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
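[Editor's note] Robert's one-liner above, flattened by the digest, can be reconstructed and run end-to-end. The 4-column z below is a truncated stand-in for the 3x25 array in the question:

```python
import numpy as np

# Stand-in for the array in the question: first column is the key
# (15, 16, 17), the remaining columns are the values to pair with it.
z = np.array([[15., 3., 7., 0.],
              [16., 3., 7., 0.],
              [17., 3., 7., 0.]])

keys = z[:, 0]        # [15. 16. 17.]
values = z[:, 1:]     # everything after the key column
n = values.shape[-1]  # number of values per key

# Repeat each key once per value, then pair keys with flattened values.
z2 = np.column_stack([np.repeat(keys, n), values.ravel()])
print(z2)
```

Each row of the result is a (key, value) pair, in the "15 3 / 15 7 / 15 0 / ... / 16 3" order Tim asked for, with no Python-level loops.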
-- Umberto Eco From anand.prabhakar.patil at gmail.com Wed Nov 21 21:58:38 2007 From: anand.prabhakar.patil at gmail.com (Anand Patil) Date: Wed, 21 Nov 2007 18:58:38 -0800 Subject: [SciPy-user] SciPy-user Digest, Vol 51, Issue 45 In-Reply-To: References: Message-ID: <2bc7a5a50711211858y76f766d1i4816cfe3b45f2e22@mail.gmail.com> >From: "Boris von Loesch" > Subject: Re: [SciPy-user] Sparse matrix advice > > Hi Anand, > > I use the COO-format for the creation of the matrix, because it could > be assembled quite fast. A second advantage is, that you need to > create the sparsity only the first time and then just create a vector > with the non-zero entries. > For calculations I convert it to CSR-Format, which is also quite fast. > With this method I can create a 255^2*255^2 sparse matrix with roughly > 8 non-zero entries per row in about 0.25 sec. Creating the matrix in > LIL-format or DIC-format is a lot slower. > > Best regards, > Boris Hi Boris, Thanks for the advice, I think I can get where I'm trying to go now. A couple of additional questions: Is there a triangular version (DTRMM) of csrmucsr out there somewhere? How about a CSR-CSR triangular solve? Thanks, Anand From nwagner at iam.uni-stuttgart.de Thu Nov 22 07:36:57 2007 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Thu, 22 Nov 2007 13:36:57 +0100 Subject: [SciPy-user] Read an array from a file using io.read_array Message-ID: Hi all, Is it possible to use io.read_array for the following task. I have an ascii file realeig.pmat. The essential information in this file is a table with 4 columns and several lines corresponding to the number of computed eigenvalues. The task is to read that information from a file. Empty lines and lines beginning with "!", "$" should be skipped. How can I handle the delimiter "&" which is used to continue lines ? Any pointer would be appreciated. Nils -------------- next part -------------- A non-text attachment was scrubbed... 
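[Editor's note] Boris's pattern above (assemble in COO, convert to CSR for computation) is a minimal sketch away. The tiny matrix below is invented for illustration; note also that the sparse triangular solve Anand asks about was not in scipy 0.6, but modern SciPy provides scipy.sparse.linalg.spsolve_triangular:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve_triangular

# COO assembly: three parallel arrays (row, col, value).  Duplicate
# (row, col) pairs are summed on conversion, which suits FEM-style
# assembly where element contributions overlap.
rows = np.array([0, 1, 2, 2, 2])
cols = np.array([0, 1, 0, 1, 2])
vals = np.array([4.0, 5.0, 1.0, 2.0, 6.0])
A = sparse.coo_matrix((vals, (rows, cols)), shape=(3, 3))

# Convert to CSR for fast matrix-vector products and solves.
A_csr = A.tocsr()
print(A_csr @ np.ones(3))  # row sums: [4. 5. 9.]

# This matrix happens to be lower triangular, so the triangular solve
# (modern SciPy only) applies directly:
y = spsolve_triangular(A_csr, np.array([4.0, 5.0, 9.0]), lower=True)
print(y)                   # [1. 1. 1.]
```

Re-assembling later with the same sparsity only requires rebuilding the vals array, as Boris notes.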
Name: realeig.pmat.gz Type: application/x-gzip Size: 1057 bytes Desc: not available URL: From ptenconi at eco.uninsubria.it Thu Nov 22 09:24:42 2007 From: ptenconi at eco.uninsubria.it (Paolo Tenconi) Date: Thu, 22 Nov 2007 15:24:42 +0100 Subject: [SciPy-user] Numpy matrix and blitz Matrix Message-ID: Hi Everyone, *) numpy arrays are automatically converted to blitz arrays. That's fine. *) I need to work with blitz Matrix objects, but I noticed that numpy matrix objects are not converted to them and I get a compilation error too 3) Does someone on the list succeeded doing that? Or is there a workaround to create quickly a blitz matrix object from a numpy 2D array or numpy-matrix without wasting time in casting operations? It would be nice having an automatic translation of numpy.matrix to blitz Matrix as matrices are used frequently in scientific computing. Any suggestion or help would be very glad. Paolo #------------------ EXAMPLE CODE ----------------------------------- import numpy import scipy #Dot product with arrays (it works) x=numpy.array([[1,2,],[3,4]]) y=numpy.zeros((2,2)) scipy.weave.inline("""y=x*x;""",['x','y'],type_converters=scipy.weave.converters.blitz,compiler='gcc',force=1) #Matrix multiplication with matrix (gives compilation error) X=numpy.matlib.mat('1 2; 3 4') Y=numpy.matlib.zeros((2,2)) scipy.weave.inline("""Y=X*X;""",['X','Y'],type_converters=scipy.weave.converters.blitz,compiler='gcc') #-------------------------------------------------------------------- From aisaac at american.edu Thu Nov 22 11:04:04 2007 From: aisaac at american.edu (Alan G Isaac) Date: Thu, 22 Nov 2007 11:04:04 -0500 Subject: [SciPy-user] Read an array from a file using io.read_array In-Reply-To: References: Message-ID: On Thu, 22 Nov 2007, Nils Wagner apparently wrote: > Is it possible to use io.read_array for the following > task. 
I think you want numpy.loadtxt Cheers, Alan Isaac From nwagner at iam.uni-stuttgart.de Thu Nov 22 12:00:42 2007 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Thu, 22 Nov 2007 18:00:42 +0100 Subject: [SciPy-user] Read an array from a file using io.read_array In-Reply-To: References: Message-ID: On Thu, 22 Nov 2007 11:04:04 -0500 Alan G Isaac wrote: > On Thu, 22 Nov 2007, Nils Wagner apparently wrote: >> Is it possible to use io.read_array for the following >> task. > > I think you want numpy.loadtxt > > Cheers, > Alan Isaac > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user loadtxt(fname, dtype=<type 'float'>, comments='#', delimiter=None, converters=None, skiprows=0, usecols=None, unpack=False) Can I use more than one character to filter out comment lines starting with "!" or "$" ? e.g. comments='!$' Nils From rshepard at appl-ecosys.com Thu Nov 22 13:04:43 2007 From: rshepard at appl-ecosys.com (Rich Shepard) Date: Thu, 22 Nov 2007 10:04:43 -0800 (PST) Subject: [SciPy-user] Web Site's Down Message-ID: Did the server for the scipy.org web site take off for the Thanksgiving holiday? Firefox keeps reporting that the server cannot be found. Rich -- Richard B. Shepard, Ph.D. | Integrity Credibility Applied Ecosystem Services, Inc. | Innovation Voice: 503-667-4517 Fax: 503-667-8863 From gael.varoquaux at normalesup.org Thu Nov 22 13:07:10 2007 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Thu, 22 Nov 2007 19:07:10 +0100 Subject: [SciPy-user] Web Site's Down In-Reply-To: References: Message-ID: <20071122180710.GI4775@clipper.ens.fr> On Thu, Nov 22, 2007 at 10:04:43AM -0800, Rich Shepard wrote: > Did the server for the scipy.org web site take off for the Thanksgiving > holiday? Firefox keeps reporting that the server cannot be found. It's fine here! No Thanksgiving in France.
Cheers, Gaël From rshepard at appl-ecosys.com Thu Nov 22 13:10:58 2007 From: rshepard at appl-ecosys.com (Rich Shepard) Date: Thu, 22 Nov 2007 10:10:58 -0800 (PST) Subject: [SciPy-user] Web Site's Down In-Reply-To: <20071122180710.GI4775@clipper.ens.fr> References: <20071122180710.GI4775@clipper.ens.fr> Message-ID: On Thu, 22 Nov 2007, Gael Varoquaux wrote: > It's fine here! No Thanksgiving in France. Gaël, It came back up here just a few minutes ago. And you might have a thanksgiving if the rail and energy workers stop striking and go back to work. :-) Rich -- Richard B. Shepard, Ph.D. | Integrity Credibility Applied Ecosystem Services, Inc. | Innovation Voice: 503-667-4517 Fax: 503-667-8863 From vagabondaero at gmail.com Thu Nov 22 13:23:36 2007 From: vagabondaero at gmail.com (Vagabond_Aero) Date: Thu, 22 Nov 2007 10:23:36 -0800 Subject: [SciPy-user] Web Site's Down In-Reply-To: References: <20071122180710.GI4775@clipper.ens.fr> Message-ID: <8b5ec91a0711221023hb18d50ck5a48bb4d62cd2347@mail.gmail.com> It popped right up for me On Nov 22, 2007 10:10 AM, Rich Shepard wrote: > On Thu, 22 Nov 2007, Gael Varoquaux wrote: > > > It's fine here! No Thanksgiving in France. > > Gaël, > > It came back up here just a few minutes ago. > > And you might have a thanksgiving if the rail and energy workers stop > striking and go back to work. :-) > > > Rich > > -- > Richard B. Shepard, Ph.D. | Integrity Credibility > Applied Ecosystem Services, Inc.
| Innovation > Voice: 503-667-4517 Fax: 503-667-8863 > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > From fperez.net at gmail.com Thu Nov 22 16:18:47 2007 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 22 Nov 2007 14:18:47 -0700 Subject: [SciPy-user] New example In-Reply-To: <20071029210726.GE30790@mentat.za.net> References: <20071025211521.GA15301@mentat.za.net> <20071026064019.GD9025@clipper.ens.fr> <20071029210726.GE30790@mentat.za.net> Message-ID: On Oct 29, 2007 2:07 PM, Stefan van der Walt wrote: > Hi all, > > Here is the code for doing an elementary finite difference heat-flow > simulation. This is really just a repeated convolution, with certain > values resetting after each time-step (sources or sinks). The code > was written for Numeric, but it seems to run under NumPy without > problems. > > Regards > Stéfan > > > On Fri, Oct 26, 2007 at 08:40:19AM +0200, Gael Varoquaux wrote: > > Anne and Stefan, I think these examples are very useful, and definitely > > should be on the scipy.org website. Would you all mind/want to have them contributed to the workbook of the 'python for science' workshop that John Hunter and I have taught (and will teach again soon)? http://matplotlib.svn.sourceforge.net/viewvc/matplotlib/trunk/py4science/workbook/ Obviously contributor credit would be given for all material, and this is available already to anyone wanting to use this for teaching (I need to update the credits page to list Stefan already for the FFT examples, let me know of anyone else I've missed). For a long time we've talked about sharing open teaching materials for scientific computing, and this is already a little start. The current workbook with solutions clocks in at ~57 pages, and over the next few days we'll try to add more.
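[Editor's note] Stefan's script did not survive the digest, but the idea he describes (explicit finite-difference heat flow as a repeated convolution, with source/sink cells re-imposed after each time-step) can be sketched as follows. The grid size, time-step factor, and source layout below are invented for illustration:

```python
import numpy as np
from scipy.ndimage import convolve

# 5-point Laplacian stencil: applying it by convolution and adding the
# result back gives one explicit finite-difference heat-flow step.
stencil = np.array([[0.0,  1.0, 0.0],
                    [1.0, -4.0, 1.0],
                    [0.0,  1.0, 0.0]])

def step(T, sources, alpha=0.2):
    # alpha <= 0.25 keeps the explicit 2-D scheme stable
    T = T + alpha * convolve(T, stencil, mode="constant")
    for (i, j), value in sources.items():
        T[i, j] = value  # re-impose fixed-temperature cells each step
    return T

T = np.zeros((20, 20))
sources = {(10, 10): 100.0}  # one cell held at 100 degrees
for _ in range(200):
    T = step(T, sources)
print(T[10, 11])  # a neighbour of the source has warmed above zero
```

Repeating the step spreads heat outward from the held cell; sinks are just sources with a low fixed value.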
Cheers, f From peridot.faceted at gmail.com Thu Nov 22 17:16:15 2007 From: peridot.faceted at gmail.com (Anne Archibald) Date: Thu, 22 Nov 2007 17:16:15 -0500 Subject: [SciPy-user] New example In-Reply-To: References: <20071025211521.GA15301@mentat.za.net> <20071026064019.GD9025@clipper.ens.fr> <20071029210726.GE30790@mentat.za.net> Message-ID: On 22/11/2007, Fernando Perez wrote: > Would you all mind/want to have them contributed to the workbook of > the 'python for science' workshop that John Hunter and I have taught > (and will teach again soon)? That would be great! Anne From aisaac at american.edu Thu Nov 22 23:14:08 2007 From: aisaac at american.edu (Alan G Isaac) Date: Thu, 22 Nov 2007 23:14:08 -0500 Subject: [SciPy-user] Read an array from a file using io.read_array In-Reply-To: References: Message-ID: > On Thu, 22 Nov 2007 11:04:04 -0500 Alan wrote: >> I think you want numpy.loadtxt On Thu, 22 Nov 2007, Nils Wagner apparently wrote: > Can I use more than one character to filter out comment > lines starting with "!" or "$" ? > e.g. comments='!$' As long as all comments start in the same way and these two characters always start comments sequentially, no problem. But I think you are saying there are two **different** comment characters? Then you will have to filter comments yourself. Cheers, Alan Isaac From ramercer at gmail.com Thu Nov 22 23:54:26 2007 From: ramercer at gmail.com (Adam Mercer) Date: Thu, 22 Nov 2007 23:54:26 -0500 Subject: [SciPy-user] Building with a non-standard gfortran In-Reply-To: <799406d60711200603t8e451cah2e13add2bd794a11@mail.gmail.com> References: <799406d60711200603t8e451cah2e13add2bd794a11@mail.gmail.com> Message-ID: <799406d60711222054x696c6908tfcc560d498e043b8@mail.gmail.com> On 20/11/2007, Adam Mercer wrote: > Hi > > Using MacPorts, the gfortran compiler, from gcc-4.2.x, is installed as > gfortran-mp-4.2. 
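[Editor's note] On the io.read_array / loadtxt thread above: numpy.loadtxt accepts any iterable of lines, so Nils's "!"/"$" comment lines and "&" continuation lines can be handled with a small pre-filter generator. (Recent NumPy's loadtxt also accepts a sequence of comment strings, but continuation lines still need manual handling either way.) A sketch, with the sample text invented to match the description of realeig.pmat:

```python
import io
import numpy as np

def cleaned(lines):
    """Skip blanks and '!'/'$' comment lines; join '&' continuations."""
    buf = ""
    for raw in lines:
        line = raw.strip()
        if not line or line[0] in "!$":
            continue
        if line.endswith("&"):
            buf += line[:-1] + " "  # accumulate the continued line
            continue
        yield buf + line
        buf = ""

text = """! comment
$ another comment

1.0  2.0 &
3.0  4.0
5.0  6.0  7.0  8.0
"""
data = np.loadtxt(cleaned(io.StringIO(text)))
print(data)
```

In real use, io.StringIO(text) would be replaced by open("realeig.pmat"); loadtxt consumes the generator directly, so the file is never materialised twice.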
I have been trying to build scipy with this compiler > and found that that numpy I need to build and install with > > $ python setup.py config_fc --fcompiler gnu95 \ > --f77exec /opt/local/bin/gfortran-mp-4.2 \ > --f90exec /opt/local/bin/gfortran-mp-4.2 build > $ python setup.py install --prefix=${NUMPY_LOCATION} > > however using the above for scipy fails on the install phase with it > being unable to locate a fortran compiler. I therefore need to > install scipy with > > $ python setup.py config_fc --fcompiler gnu95 \ > --f77exec /opt/local/bin/gfortran-mp-4.2 \ > --f90exec /opt/local/bin/gfortran-mp-4.2 install --prefix=${SCIPY_LOCATION} > > Is this expected, as I would have expected install to inherit the > options that where used during the build phase? > > Cheers > > Adam > > PS: I'm using numpy-1.0.4 and scipy-0.6.0 Anyone? From cookedm at physics.mcmaster.ca Fri Nov 23 02:31:10 2007 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Fri, 23 Nov 2007 02:31:10 -0500 Subject: [SciPy-user] Building with a non-standard gfortran In-Reply-To: <799406d60711200603t8e451cah2e13add2bd794a11@mail.gmail.com> References: <799406d60711200603t8e451cah2e13add2bd794a11@mail.gmail.com> Message-ID: <4BD1DAD9-E9B8-4AE2-93E5-FE46B146830D@physics.mcmaster.ca> On Nov 20, 2007, at 09:03 , Adam Mercer wrote: > Hi > > Using MacPorts, the gfortran compiler, from gcc-4.2.x, is installed as > gfortran-mp-4.2. I have been trying to build scipy with this compiler > and found that that numpy I need to build and install with > > $ python setup.py config_fc --fcompiler gnu95 \ > --f77exec /opt/local/bin/gfortran-mp-4.2 \ > --f90exec /opt/local/bin/gfortran-mp-4.2 build > $ python setup.py install --prefix=${NUMPY_LOCATION} > > however using the above for scipy fails on the install phase with it > being unable to locate a fortran compiler. 
I therefore need to > install scipy with > > $ python setup.py config_fc --fcompiler gnu95 \ > --f77exec /opt/local/bin/gfortran-mp-4.2 \ > --f90exec /opt/local/bin/gfortran-mp-4.2 install --prefix=$ > {SCIPY_LOCATION} > > Is this expected, as I would have expected install to inherit the > options that where used during the build phase? The options used in the build phase aren't stored, so, yes, they need to be respecified. For numpy it doesn't matter, as the Fortran compiler isn't used. Since we use distutils, you can set the config_fc options in a file (either globally for the user in a ~/.pydistutils.cfg, or per-package in a setup.cfg next to the setup.py you're running) by adding the following section to the appropiate file [config_fc] fcompiler=gfortran f77exec=gfortran-mp-4.2 f90exec=gfortran-mp-4.2 -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From dbm.sun at gmail.com Fri Nov 23 12:18:03 2007 From: dbm.sun at gmail.com (Yutec Sun) Date: Fri, 23 Nov 2007 12:18:03 -0500 Subject: [SciPy-user] scipy 0.6 installation problem Message-ID: <11001d3e0711230918q1ab5e650s53c98fe8613420dd@mail.gmail.com> Hello, I'd like to ask your favor for helping me install scipy 0.6. I have a problem importing scipy. In fact, importing doesn't show an error message, but scipy.test produces failures. My OS is Ubuntu 7.10 (Gutsy Gibbon) amd64. Following the installation instruction, I attach the messages for further information. Please understand that I'm new to linux system and scipy, and let me know how to successfully install scipy step by step. I thank you for your help. Best regards, Yutec -------------------------------------------------------------------------------------------------------------- 1. 
Error message -------------------------------------------------------------------------------------------------------------- yutec at yutec-laptop:/usr/lib/python2.5/site-packages/numpy/distutils$ python Python 2.5.1 (r251:54863, Oct 5 2007, 13:50:07) [GCC 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import scipy >>> scipy.test(level=1) Found 9/9 tests for scipy.cluster.vq Found 18/18 tests for scipy.fftpack.basic Found 4/4 tests for scipy.fftpack.helper Found 20/20 tests for scipy.fftpack.pseudo_diffs Found 1/1 tests for scipy.integrate Found 10/10 tests for scipy.integrate.quadpack Found 3/3 tests for scipy.integrate.quadrature Found 6/6 tests for scipy.interpolate Found 6/6 tests for scipy.interpolate.fitpack Found 4/4 tests for scipy.io.array_import Found 28/28 tests for scipy.io.mio Found 13/13 tests for scipy.io.mmio Found 5/5 tests for scipy.io.npfile Found 4/4 tests for scipy.io.recaster Found 16/16 tests for scipy.lib.blas Found 128/128 tests for scipy.lib.blas.fblas Found 42/42 tests for scipy.lib.lapack Found 41/41 tests for scipy.linalg.basic Found 16/16 tests for scipy.linalg.blas Found 72/72 tests for scipy.linalg.decomp Found 128/128 tests for scipy.linalg.fblas Found 6/6 tests for scipy.linalg.iterative Found 4/4 tests for scipy.linalg.lapack Found 7/7 tests for scipy.linalg.matfuncs Found 9/9 tests for scipy.linsolve.umfpack Found 2/2 tests for scipy.maxentropy Found 3/3 tests for scipy.misc.pilutil Found 399/399 tests for scipy.ndimage Found 5/5 tests for scipy.odr Found 8/8 tests for scipy.optimize Found 1/1 tests for scipy.optimize.cobyla Found 10/10 tests for scipy.optimize.nonlin Found 4/4 tests for scipy.optimize.zeros Found 5/5 tests for scipy.signal.signaltools Found 4/4 tests for scipy.signal.wavelets Found 152/152 tests for scipy.sparse Found 342/342 tests for scipy.special.basic Found 3/3 tests for scipy.special.spfun_stats Found 107/107 tests 
for scipy.stats Found 73/73 tests for scipy.stats.distributions Found 10/10 tests for scipy.stats.morestats Found 0/0 tests for __main__ /usr/lib/python2.5/site-packages/scipy/cluster/vq.py:477: UserWarning: One of the clusters is empty. Re-run kmean with a different initialization. warnings.warn("One of the clusters is empty. " ...exception raised as expected: One of the clusters is empty. Re-run kmean with a different initialization. ................................................Residual: 1.05006987327e-07 ..................../usr/lib/python2.5/site-packages/scipy/interpolate/fitpack2.py:458: UserWarning: The coefficients of the spline returned have been computed as the minimal norm least-squares solution of a (numerically) rank deficient system (deficiency=7). If deficiency is large, the results may be inaccurate. Deficiency may strongly depend on the value of eps. warnings.warn(message) ...... Don't worry about a warning regarding the number of bytes read. Warning: 1000000 bytes requested, 20 bytes read. 
.........................................................................caxpy:n=4 ..caxpy:n=3 ....ccopy:n=4 ..ccopy:n=3 .............cscal:n=4 ....cswap:n=4 ..cswap:n=3 .....daxpy:n=4 ..daxpy:n=3 ....dcopy:n=4 ..dcopy:n=3 .............dscal:n=4 ....dswap:n=4 ..dswap:n=3 .....saxpy:n=4 ..saxpy:n=3 ....scopy:n=4 ..scopy:n=3 .............sscal:n=4 ....sswap:n=4 ..sswap:n=3 .....zaxpy:n=4 ..zaxpy:n=3 ....zcopy:n=4 ..zcopy:n=3 .............zscal:n=4 ....zswap:n=4 ..zswap:n=3 ................................................................................................................................................................................caxpy:n=4 ..caxpy:n=3 ....ccopy:n=4 ..ccopy:n=3 .............cscal:n=4 ....cswap:n=4 ..cswap:n=3 .....daxpy:n=4 ..daxpy:n=3 ....dcopy:n=4 ..dcopy:n=3 .............dscal:n=4 ....dswap:n=4 ..dswap:n=3 .....saxpy:n=4 ..saxpy:n=3 ....scopy:n=4 ..scopy:n=3 .............sscal:n=4 ....sswap:n=4 ..sswap:n=3 .....zaxpy:n=4 ..zaxpy:n=3 ....zcopy:n=4 ..zcopy:n=3 .............zscal:n=4 ....zswap:n=4 ..zswap:n=3 .............Result may be inaccurate, approximate err = 1.56163357201e-08 ...Result may be inaccurate, approximate err = 1.25221600096e-10 ......Use minimum degree ordering on A'+A. ..Use minimum degree ordering on A'+A. ...Use minimum degree ordering on A'+A. ............................................................................................................./usr/lib/python2.5/site-packages/scipy/ndimage/interpolation.py:41: UserWarning: Mode "reflect" may yield incorrect results on boundaries. Please use "mirror" instead. 
warnings.warn('Mode "reflect" may yield incorrect results on ' ........................................................................................................................................................................................................................................................................................................F..F.........................................................Use minimum degree ordering on A'+A. .....................................Use minimum degree ordering on A'+A. .....................................Use minimum degree ordering on A'+A. ................................Use minimum degree ordering on A'+A. ....................................................................................................................................................................................................................................................................................................................................................0.2 0.2 0.2 ......0.2 ..0.2 0.2 0.2 0.2 0.2 .........................................................................................................................................................................................................Ties preclude use of exact statistic. ..Ties preclude use of exact statistic. ...... 
====================================================================== FAIL: test_explicit (scipy.tests.test_odr.test_odr) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.5/site-packages/scipy/odr/tests/test_odr.py", line 50, in test_explicit -8.7849712165253724e-02]), File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line 232, in assert_array_almost_equal header='Arrays are not almost equal') File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line 217, in assert_array_compare assert cond, msg AssertionError: Arrays are not almost equal (mismatch 100.0%) x: array([ 1.26462971e+03, -5.42545890e+01, -8.64250389e-02]) y: array([ 1.26465481e+03, -5.40184100e+01, -8.78497122e-02]) ====================================================================== FAIL: test_multi (scipy.tests.test_odr.test_odr) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.5/site-packages/scipy/odr/tests/test_odr.py", line 191, in test_multi 0.5101147161764654, 0.5173902330489161]), File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line 232, in assert_array_almost_equal header='Arrays are not almost equal') File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line 217, in assert_array_compare assert cond, msg AssertionError: Arrays are not almost equal (mismatch 100.0%) x: array([ 4.31272063, 2.44289312, 7.76215871, 0.55995622, 0.46423343]) y: array([ 4.37998803, 2.43330576, 8.00288459, 0.51011472, 0.51739023]) ---------------------------------------------------------------------- Ran 1728 tests in 3.660s FAILED (failures=2) -------------------------------------------------------------------------------------------------------------- 2. 
System check -------------------------------------------------------------------------------------------------------------- yutec at yutec-laptop:~$ python -c 'import os,sys;print os.name,sys.platform' posix linux2 yutec at yutec-laptop:~$ uname -a Linux yutec-laptop 2.6.22-14-generic #1 SMP Sun Oct 14 21:45:15 GMT 2007 x86_64 GNU/Linux yutec at yutec-laptop:~$ gcc -v Using built-in specs. Target: x86_64-linux-gnu Configured with: ../src/configure -v --enable-languages=c,c++,fortran,objc,obj-c++,treelang --prefix=/usr --enable-shared --with-system-zlib --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --enable-nls --with-gxx-include-dir=/usr/include/c++/4.1.3 --program-suffix=-4.1 --enable-__cxa_atexit --enable-clocale=gnu --enable-libstdcxx-debug --enable-mpfr --enable-checking=release x86_64-linux-gnu Thread model: posix gcc version 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2) yutec at yutec-laptop:~$ g77 --version GNU Fortran (GCC) 3.4.6 (Ubuntu 3.4.6-6ubuntu2) Copyright (C) 2006 Free Software Foundation, Inc. GNU Fortran comes with NO WARRANTY, to the extent permitted by law. You may redistribute copies of GNU Fortran under the terms of the GNU General Public License. For more information about these matters, see the file named COPYING or type the command `info -f g77 Copying'. 
yutec at yutec-laptop:~$ python -c 'import sys;print sys.version' 2.5.1 (r251:54863, Oct 5 2007, 13:50:07) [GCC 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2)] yutec at yutec-laptop:~$ python -c 'import numpy;print numpy.__version__' 1.0.4 yutec at yutec-laptop:~$ cd /usr/lib/python2.5/site-packages/scipy/ yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy$ cd linalg yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ python setup_atlas_version.py build_ext --inplace --force Traceback (most recent call last): File "setup_atlas_version.py", line 7, in from numpy.distutils.misc_util import get_path, default_config_dict ImportError: cannot import name get_path yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ python -c 'import atlas_version' ATLAS version 3.6.0 built by root on Thu Jun 3 21:13:07 UTC 2004: UNAME : Linux ravel 2.6.5 #3 SMP Tue Apr 13 13:41:54 UTC 2004 x86_64 GNU/Linux INSTFLG : MMDEF : /home/camm/atlas3-3.6.0/CONFIG/ARCHS/HAMMER64SSE2/gcc/gemm ARCHDEF : /home/camm/atlas3-3.6.0/CONFIG/ARCHS/HAMMER64SSE2/gcc/misc F2CDEFS : -DAdd__ -DStringSunStyle CACHEEDGE: 720896 F77 : /home/camm/usr/bin/g77, version GNU Fortran (GCC) 3.3.3 (Debian 20040422) F77FLAGS : -fomit-frame-pointer -O -m64 CC : /usr/bin/gcc, version gcc (GCC) 3.3.3 (Debian 20040422) CC FLAGS : -fomit-frame-pointer -O -mfpmath=387 -m64 MCC : /usr/bin/gcc, version gcc (GCC) 3.3.3 (Debian 20040422) MCCFLAGS : -fomit-frame-pointer -O -mfpmath=387 -m64 yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ cd .. yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy$ cd .. 
yutec at yutec-laptop:/usr/lib/python2.5/site-packages$ cd numpy
yutec at yutec-laptop:/usr/lib/python2.5/site-packages/numpy$ cd distutils/
yutec at yutec-laptop:/usr/lib/python2.5/site-packages/numpy/distutils$ python system_info.py
lapack_info:
  libraries lapack not found in /usr/local/lib
  FOUND:
    libraries = ['lapack']
    library_dirs = ['/usr/lib']
    language = f77

lapack_opt_info:
lapack_mkl_info:
mkl_info:
  libraries mkl,vml,guide not found in /usr/local/lib
  libraries mkl,vml,guide not found in /usr/lib
  NOT AVAILABLE

  NOT AVAILABLE

atlas_threads_info:
Setting PTATLAS=ATLAS
  libraries ptf77blas,ptcblas,atlas not found in /usr/local/lib
  libraries lapack_atlas not found in /usr/local/lib
  libraries ptf77blas,ptcblas,atlas not found in /usr/lib/atlas
  libraries lapack_atlas not found in /usr/lib/atlas
  libraries ptf77blas,ptcblas,atlas not found in /usr/lib
__main__.atlas_threads_info
  NOT AVAILABLE

atlas_info:
  libraries f77blas,cblas,atlas not found in /usr/local/lib
  libraries lapack_atlas not found in /usr/local/lib
  libraries f77blas,cblas,atlas not found in /usr/lib/atlas
  libraries lapack_atlas not found in /usr/lib/atlas
__main__.atlas_info
  FOUND:
    libraries = ['lapack', 'f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/lib']
    language = f77
    include_dirs = ['/usr/include']

customize GnuFCompiler
Found executable /usr/bin/g77
gnu: no Fortran 90 compiler found
gnu: no Fortran 90 compiler found
customize GnuFCompiler
gnu: no Fortran 90 compiler found
gnu: no Fortran 90 compiler found
customize GnuFCompiler using config
  FOUND:
    libraries = ['lapack', 'f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/lib']
    language = f77
    define_macros = [('ATLAS_INFO', '"\\"?.?.?\\""')]
    include_dirs = ['/usr/include']

lapack_atlas_info:
  libraries lapack_atlas,f77blas,cblas,atlas not found in /usr/local/lib
  libraries lapack_atlas not found in /usr/local/lib
  libraries lapack_atlas,f77blas,cblas,atlas not found in /usr/lib/atlas
  libraries lapack_atlas not found in /usr/lib/atlas
__main__.lapack_atlas_info
  FOUND:
    libraries = ['lapack', 'lapack_atlas', 'f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/lib']
    language = f77
    include_dirs = ['/usr/include']

umfpack_info:
  libraries umfpack not found in /usr/local/lib
  libraries umfpack not found in /usr/lib
  NOT AVAILABLE

_pkg_config_info:
Found executable /usr/bin/pkg-config
  NOT AVAILABLE

lapack_atlas_threads_info:
Setting PTATLAS=ATLAS
  libraries lapack_atlas,ptf77blas,ptcblas,atlas not found in /usr/local/lib
  libraries lapack_atlas not found in /usr/local/lib
  libraries lapack_atlas,ptf77blas,ptcblas,atlas not found in /usr/lib/atlas
  libraries lapack_atlas not found in /usr/lib/atlas
  libraries lapack_atlas,ptf77blas,ptcblas,atlas not found in /usr/lib
__main__.lapack_atlas_threads_info
  NOT AVAILABLE

x11_info:
  libraries X11 not found in /usr/lib64
  NOT AVAILABLE

blas_info:
  libraries blas not found in /usr/local/lib
  FOUND:
    libraries = ['blas']
    library_dirs = ['/usr/lib']
    language = f77

fftw_info:
  libraries fftw3 not found in /usr/local/lib
  FOUND:
    libraries = ['fftw3']
    library_dirs = ['/usr/lib']
    define_macros = [('SCIPY_FFTW3_H', None)]
    include_dirs = ['/usr/include']

f2py_info:
  FOUND:
    sources = ['/usr/lib/python2.5/site-packages/numpy/f2py/src/fortranobject.c']
    include_dirs = ['/usr/lib/python2.5/site-packages/numpy/f2py/src']

gdk_pixbuf_xlib_2_info:
  NOT AVAILABLE

dfftw_threads_info:
  libraries drfftw_threads,dfftw_threads not found in /usr/local/lib
  libraries drfftw_threads,dfftw_threads not found in /usr/lib
  dfftw threads not found
  NOT AVAILABLE

atlas_blas_info:
  libraries f77blas,cblas,atlas not found in /usr/local/lib
  libraries f77blas,cblas,atlas not found in /usr/lib/atlas
  FOUND:
    libraries = ['f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/lib']
    language = c
    include_dirs = ['/usr/include']

fftw3_info:
  libraries fftw3 not found in /usr/local/lib
  FOUND:
    libraries = ['fftw3']
    library_dirs = ['/usr/lib']
    define_macros = [('SCIPY_FFTW3_H', None)]
    include_dirs = ['/usr/include']

blas_opt_info:
blas_mkl_info:
  libraries mkl,vml,guide not found in /usr/local/lib
  libraries mkl,vml,guide not found in /usr/lib
  NOT AVAILABLE

atlas_blas_threads_info:
Setting PTATLAS=ATLAS
  libraries ptf77blas,ptcblas,atlas not found in /usr/local/lib
  libraries ptf77blas,ptcblas,atlas not found in /usr/lib/atlas
  libraries ptf77blas,ptcblas,atlas not found in /usr/lib
  NOT AVAILABLE

customize GnuFCompiler
gnu: no Fortran 90 compiler found
gnu: no Fortran 90 compiler found
customize GnuFCompiler
gnu: no Fortran 90 compiler found
gnu: no Fortran 90 compiler found
customize GnuFCompiler using config
  FOUND:
    libraries = ['f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/lib']
    language = c
    define_macros = [('ATLAS_INFO', '"\\"?.?.?\\""')]
    include_dirs = ['/usr/include']

sfftw_info:
  libraries srfftw,sfftw not found in /usr/local/lib
  libraries srfftw,sfftw not found in /usr/lib
  sfftw not found
  NOT AVAILABLE

xft_info:
  NOT AVAILABLE

fft_opt_info:
djbfft_info:
  NOT AVAILABLE

  FOUND:
    libraries = ['fftw3']
    library_dirs = ['/usr/lib']
    define_macros = [('SCIPY_FFTW3_H', None)]
    include_dirs = ['/usr/include']

gdk_x11_2_info:
  NOT AVAILABLE

agg2_info:
  NOT AVAILABLE

numarray_info:
  NOT AVAILABLE

blas_src_info:
  NOT AVAILABLE

fftw2_info:
  libraries rfftw,fftw not found in /usr/local/lib
  FOUND:
    libraries = ['rfftw', 'fftw']
    library_dirs = ['/usr/lib']
    define_macros = [('SCIPY_FFTW_H', None)]
    include_dirs = ['/usr/include']

fftw_threads_info:
  libraries rfftw_threads,fftw_threads not found in /usr/local/lib
  FOUND:
    libraries = ['rfftw_threads', 'fftw_threads']
    library_dirs = ['/usr/lib']
    define_macros = [('SCIPY_FFTW_THREADS_H', None)]
    include_dirs = ['/usr/include']

_numpy_info:
  FOUND:
    define_macros = [('NUMERIC_VERSION', '"\\"24.2\\""'), ('NUMERIC', None)]
    include_dirs = ['/usr/include/python2.5']

wx_info:
Could not locate executable wx-config
File not found: None. Cannot determine wx info.
  NOT AVAILABLE

gdk_info:
  NOT AVAILABLE

gtkp_x11_2_info:
  NOT AVAILABLE

sfftw_threads_info:
  libraries srfftw_threads,sfftw_threads not found in /usr/local/lib
  libraries srfftw_threads,sfftw_threads not found in /usr/lib
  sfftw threads not found
  NOT AVAILABLE

boost_python_info:
  NOT AVAILABLE

freetype2_info:
  NOT AVAILABLE

gdk_2_info:
  NOT AVAILABLE

dfftw_info:
  libraries drfftw,dfftw not found in /usr/local/lib
  libraries drfftw,dfftw not found in /usr/lib
  dfftw not found
  NOT AVAILABLE

lapack_src_info:
  NOT AVAILABLE

gtkp_2_info:
  NOT AVAILABLE

gdk_pixbuf_2_info:
  NOT AVAILABLE

amd_info:
  libraries amd not found in /usr/local/lib
  libraries amd not found in /usr/lib
  NOT AVAILABLE

Numeric_info:
  FOUND:
    define_macros = [('NUMERIC_VERSION', '"\\"24.2\\""'), ('NUMERIC', None)]
    include_dirs = ['/usr/include/python2.5']

numerix_info:
numpy_info:
  FOUND:
    define_macros = [('NUMPY_VERSION', '"\\"1.0.4\\""'), ('NUMPY', None)]

  FOUND:
    define_macros = [('NUMPY_VERSION', '"\\"1.0.4\\""'), ('NUMPY', None)]

From Laurence.Viry at imag.fr Fri Nov 23 13:16:50 2007
From: Laurence.Viry at imag.fr (Laurence Viry)
Date: Fri, 23 Nov 2007 19:16:50 +0100
Subject: [SciPy-user] install numpy/scipy with Intel Compilers
Message-ID: <47471912.3050300@imag.fr>

Hi!

I'm trying to install numpy 1.0.4 and scipy 0.6.0 with the Intel C/C++ and Fortran compilers and the SCSL scientific library, with python2.4.4, on an Itanium Linux system
(Altix SGI with SUSE linux). I didn't succeed in configuring build_ext for scipy with the Intel Fortran compiler (--fcompiler=intele).

* First, I set LD_LIBRARY_PATH and LD_RUN_PATH to the directories that contain the C/C++, Fortran and SCSL libraries:

export LD_LIBRARY_PATH=/opt/intel/intel_cc/9.0.024/lib:/opt/intel/intel_fc/9.0.021/lib:/usr/lib
export LD_RUN_PATH=/opt/intel/intel_cc/9.0.024/lib:/opt/intel/intel_fc/9.0.021/lib:/usr/lib

* I put the following in the file "site.cfg":

[DEFAULT]
library_dirs = /usr/lib:/usr/local/lib/:/usr/local/stow/fftw-3.1.2/lib
include_dirs = /usr/include:/usr/local/include:/usr/local/stow/fftw-3.1.2/include

[blas]
library_dirs = /usr/lib
include_dirs = /usr/include
blas_libs = scs

[lapack]
library_dirs = /usr/lib
include_dirs = /usr/include
lapack_libs = scs

[fftw]
libraries = fftw3

* I compiled and installed numpy with:

python setup.py config --compiler=intel build_clib --compiler=intel build_ext --compiler=intel install

That works for numpy, which doesn't need a Fortran compiler. But for scipy:

python setup.py -v config_cc --compiler=intele config_fc --fcompiler=intele build_clib --compiler=intele --fcompiler=intele build_ext --compiler=intele --fcompiler=intele install

build_clib is fine: the script finds both the C and the Fortran compiler. But build_ext does not find the Fortran compiler. First I get a warning:

warning: build_ext: f77_compiler=intele is not available,

even though it was available for build_clib. Then:

warning: build_ext: extension 'scipy.fftpack._fftpack' has Fortran libraries but no Fortran linker found, using default linker

So for build_ext the script falls back to the default Fortran compiler (g77), and the build stops with the message:

error: "extension 'scipy.interpolate.dfitpack' has Fortran sources but no Fortran compiler found"

Why doesn't the --fcompiler=intele option of build_ext take effect?
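[Editor's aside: one way to make a Fortran compiler choice visible to every distutils subcommand, echoing the [config_fc] tip given later in this thread for gfortran, is to persist the options in a setup.cfg next to scipy's setup.py instead of repeating them on the command line. This is only a sketch, not a tested configuration: "intele" is the compiler name used in the message above, and the ifort path shown is an assumption that must be replaced with the real location of the Intel Fortran executable.]

```ini
; setup.cfg -- hypothetical sketch, placed next to scipy's setup.py.
; distutils reads [config_fc] for every subcommand, including build_ext.
[config_fc]
fcompiler = intele
; assumption: adjust to the actual Intel Fortran executable on your system
f77exec = /opt/intel/intel_fc/9.0.021/bin/ifort
f90exec = /opt/intel/intel_fc/9.0.021/bin/ifort
```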
I checked this option of the build_ext command with:

python setup.py build_ext --help-fcompiler

I also set the variables F77, F90 and F77FLAGS to the Intel compiler, and got the same error. How can I configure build_ext for scipy to use the Intel Fortran compiler? The solutions I found in the scipy archives didn't work.

Thanks for your help
--
Laurence Viry
CRIP UJF - Projet MIRAGE ( http://mirage.imag.fr )
Laboratoire de Modélisation et Calcul - IMAG
tel: 04 76 63 56 34   fax: 04 76 63 12 63
e-mail: Laurence.Viry at imag.fr

From cohen at slac.stanford.edu Fri Nov 23 14:30:41 2007
From: cohen at slac.stanford.edu (Johann Cohen-Tanugi)
Date: Fri, 23 Nov 2007 11:30:41 -0800
Subject: [SciPy-user] scipy 0.6 installation problem
In-Reply-To: <11001d3e0711230918q1ab5e650s53c98fe8613420dd@mail.gmail.com>
References: <11001d3e0711230918q1ab5e650s53c98fe8613420dd@mail.gmail.com>
Message-ID: <47472A61.6070607@slac.stanford.edu>

Well, I am certainly no expert on scipy, but the results of the test don't look that bad! When something really bad has occurred at build time, you usually don't pass almost all the tests:

Ran 1728 tests in 3.660s
FAILED (failures=2)

As for the ones you don't pass, it seems that you end up with odr evaluations different from the ones the test expects... I think I have the same issue, but it does not point, IMHO, to a failure to build correctly. Experts will tell you about it better than I can, but from your post it seems that you are mostly OK to start enjoying scipy.

best,
Johann

Yutec Sun wrote:
> Hello,
> I'd like to ask your favor for helping me install scipy 0.6.
> I have a problem importing scipy. In fact, importing doesn't show an
> error message, but scipy.test produces failures.
> My OS is Ubuntu 7.10 (Gutsy Gibbon) amd64. Following the installation
> instruction, I attach the messages for further information.
> Please understand that I'm new to linux system and scipy, and let me > know how to successfully install scipy step by step. > I thank you for your help. > > Best regards, > > Yutec > > > -------------------------------------------------------------------------------------------------------------- > 1. Error message > -------------------------------------------------------------------------------------------------------------- > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/numpy/distutils$ python > Python 2.5.1 (r251:54863, Oct 5 2007, 13:50:07) > [GCC 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2)] on linux2 > Type "help", "copyright", "credits" or "license" for more information. > >>>> import scipy >>>> scipy.test(level=1) >>>> > Found 9/9 tests for scipy.cluster.vq > Found 18/18 tests for scipy.fftpack.basic > Found 4/4 tests for scipy.fftpack.helper > Found 20/20 tests for scipy.fftpack.pseudo_diffs > Found 1/1 tests for scipy.integrate > Found 10/10 tests for scipy.integrate.quadpack > Found 3/3 tests for scipy.integrate.quadrature > Found 6/6 tests for scipy.interpolate > Found 6/6 tests for scipy.interpolate.fitpack > Found 4/4 tests for scipy.io.array_import > Found 28/28 tests for scipy.io.mio > Found 13/13 tests for scipy.io.mmio > Found 5/5 tests for scipy.io.npfile > Found 4/4 tests for scipy.io.recaster > Found 16/16 tests for scipy.lib.blas > Found 128/128 tests for scipy.lib.blas.fblas > Found 42/42 tests for scipy.lib.lapack > Found 41/41 tests for scipy.linalg.basic > '/usr/lib/python2.5/site-packages/scipy/linalg/fblas.so'> > Found 16/16 tests for scipy.linalg.blas > Found 72/72 tests for scipy.linalg.decomp > Found 128/128 tests for scipy.linalg.fblas > Found 6/6 tests for scipy.linalg.iterative > Found 4/4 tests for scipy.linalg.lapack > Found 7/7 tests for scipy.linalg.matfuncs > Found 9/9 tests for scipy.linsolve.umfpack > Found 2/2 tests for scipy.maxentropy > Found 3/3 tests for scipy.misc.pilutil > Found 399/399 tests for 
scipy.ndimage > Found 5/5 tests for scipy.odr > Found 8/8 tests for scipy.optimize > Found 1/1 tests for scipy.optimize.cobyla > Found 10/10 tests for scipy.optimize.nonlin > Found 4/4 tests for scipy.optimize.zeros > Found 5/5 tests for scipy.signal.signaltools > Found 4/4 tests for scipy.signal.wavelets > Found 152/152 tests for scipy.sparse > Found 342/342 tests for scipy.special.basic > Found 3/3 tests for scipy.special.spfun_stats > Found 107/107 tests for scipy.stats > Found 73/73 tests for scipy.stats.distributions > Found 10/10 tests for scipy.stats.morestats > Found 0/0 tests for __main__ > /usr/lib/python2.5/site-packages/scipy/cluster/vq.py:477: UserWarning: > One of the clusters is empty. Re-run kmean with a different > initialization. > warnings.warn("One of the clusters is empty. " > ...exception raised as expected: One of the clusters is empty. Re-run > kmean with a different initialization. > ................................................Residual: 1.05006987327e-07 > ..................../usr/lib/python2.5/site-packages/scipy/interpolate/fitpack2.py:458: > UserWarning: > The coefficients of the spline returned have been computed as the > minimal norm least-squares solution of a (numerically) rank deficient > system (deficiency=7). If deficiency is large, the results may be > inaccurate. Deficiency may strongly depend on the value of eps. > warnings.warn(message) > ...... > Don't worry about a warning regarding the number of bytes read. > Warning: 1000000 bytes requested, 20 bytes read. 
> .........................................................................caxpy:n=4 > ..caxpy:n=3 > ....ccopy:n=4 > ..ccopy:n=3 > .............cscal:n=4 > ....cswap:n=4 > ..cswap:n=3 > .....daxpy:n=4 > ..daxpy:n=3 > ....dcopy:n=4 > ..dcopy:n=3 > .............dscal:n=4 > ....dswap:n=4 > ..dswap:n=3 > .....saxpy:n=4 > ..saxpy:n=3 > ....scopy:n=4 > ..scopy:n=3 > .............sscal:n=4 > ....sswap:n=4 > ..sswap:n=3 > .....zaxpy:n=4 > ..zaxpy:n=3 > ....zcopy:n=4 > ..zcopy:n=3 > .............zscal:n=4 > ....zswap:n=4 > ..zswap:n=3 > ................................................................................................................................................................................caxpy:n=4 > ..caxpy:n=3 > ....ccopy:n=4 > ..ccopy:n=3 > .............cscal:n=4 > ....cswap:n=4 > ..cswap:n=3 > .....daxpy:n=4 > ..daxpy:n=3 > ....dcopy:n=4 > ..dcopy:n=3 > .............dscal:n=4 > ....dswap:n=4 > ..dswap:n=3 > .....saxpy:n=4 > ..saxpy:n=3 > ....scopy:n=4 > ..scopy:n=3 > .............sscal:n=4 > ....sswap:n=4 > ..sswap:n=3 > .....zaxpy:n=4 > ..zaxpy:n=3 > ....zcopy:n=4 > ..zcopy:n=3 > .............zscal:n=4 > ....zswap:n=4 > ..zswap:n=3 > .............Result may be inaccurate, approximate err = 1.56163357201e-08 > ...Result may be inaccurate, approximate err = 1.25221600096e-10 > ......Use minimum degree ordering on A'+A. > ..Use minimum degree ordering on A'+A. > ...Use minimum degree ordering on A'+A. > ............................................................................................................./usr/lib/python2.5/site-packages/scipy/ndimage/interpolation.py:41: > UserWarning: Mode "reflect" may yield incorrect results on boundaries. > Please use "mirror" instead. 
> warnings.warn('Mode "reflect" may yield incorrect results on ' > ........................................................................................................................................................................................................................................................................................................F..F.........................................................Use > minimum degree ordering on A'+A. > .....................................Use minimum degree ordering on A'+A. > .....................................Use minimum degree ordering on A'+A. > ................................Use minimum degree ordering on A'+A. > ....................................................................................................................................................................................................................................................................................................................................................0.2 > 0.2 > 0.2 > ......0.2 > ..0.2 > 0.2 > 0.2 > 0.2 > 0.2 > .........................................................................................................................................................................................................Ties > preclude use of exact statistic. > ..Ties preclude use of exact statistic. > ...... 
> ====================================================================== > FAIL: test_explicit (scipy.tests.test_odr.test_odr) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/usr/lib/python2.5/site-packages/scipy/odr/tests/test_odr.py", > line 50, in test_explicit > -8.7849712165253724e-02]), > File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line > 232, in assert_array_almost_equal > header='Arrays are not almost equal') > File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line > 217, in assert_array_compare > assert cond, msg > AssertionError: > Arrays are not almost equal > > (mismatch 100.0%) > x: array([ 1.26462971e+03, -5.42545890e+01, -8.64250389e-02]) > y: array([ 1.26465481e+03, -5.40184100e+01, -8.78497122e-02]) > > ====================================================================== > FAIL: test_multi (scipy.tests.test_odr.test_odr) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/usr/lib/python2.5/site-packages/scipy/odr/tests/test_odr.py", > line 191, in test_multi > 0.5101147161764654, 0.5173902330489161]), > File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line > 232, in assert_array_almost_equal > header='Arrays are not almost equal') > File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line > 217, in assert_array_compare > assert cond, msg > AssertionError: > Arrays are not almost equal > > (mismatch 100.0%) > x: array([ 4.31272063, 2.44289312, 7.76215871, 0.55995622, 0.46423343]) > y: array([ 4.37998803, 2.43330576, 8.00288459, 0.51011472, 0.51739023]) > > ---------------------------------------------------------------------- > Ran 1728 tests in 3.660s > > FAILED (failures=2) > > > > > -------------------------------------------------------------------------------------------------------------- > 2. 
System check > -------------------------------------------------------------------------------------------------------------- > yutec at yutec-laptop:~$ python -c 'import os,sys;print os.name,sys.platform' > posix linux2 > > yutec at yutec-laptop:~$ uname -a > Linux yutec-laptop 2.6.22-14-generic #1 SMP Sun Oct 14 21:45:15 GMT > 2007 x86_64 GNU/Linux > > yutec at yutec-laptop:~$ gcc -v > Using built-in specs. > Target: x86_64-linux-gnu > Configured with: ../src/configure -v > --enable-languages=c,c++,fortran,objc,obj-c++,treelang --prefix=/usr > --enable-shared --with-system-zlib --libexecdir=/usr/lib > --without-included-gettext --enable-threads=posix --enable-nls > --with-gxx-include-dir=/usr/include/c++/4.1.3 --program-suffix=-4.1 > --enable-__cxa_atexit --enable-clocale=gnu --enable-libstdcxx-debug > --enable-mpfr --enable-checking=release x86_64-linux-gnu > Thread model: posix > gcc version 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2) > > yutec at yutec-laptop:~$ g77 --version > GNU Fortran (GCC) 3.4.6 (Ubuntu 3.4.6-6ubuntu2) > Copyright (C) 2006 Free Software Foundation, Inc. > > GNU Fortran comes with NO WARRANTY, to the extent permitted by law. > You may redistribute copies of GNU Fortran > under the terms of the GNU General Public License. > For more information about these matters, see the file named COPYING > or type the command `info -f g77 Copying'. 
> > yutec at yutec-laptop:~$ python -c 'import sys;print sys.version' > 2.5.1 (r251:54863, Oct 5 2007, 13:50:07) > [GCC 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2)] > > yutec at yutec-laptop:~$ python -c 'import numpy;print numpy.__version__' > 1.0.4 > > yutec at yutec-laptop:~$ cd /usr/lib/python2.5/site-packages/scipy/ > > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy$ cd linalg > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ > python setup_atlas_version.py build_ext --inplace --force > Traceback (most recent call last): > File "setup_atlas_version.py", line 7, in > from numpy.distutils.misc_util import get_path, default_config_dict > ImportError: cannot import name get_path > > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ > python -c 'import atlas_version' > ATLAS version 3.6.0 built by root on Thu Jun 3 21:13:07 UTC 2004: > UNAME : Linux ravel 2.6.5 #3 SMP Tue Apr 13 13:41:54 UTC 2004 > x86_64 GNU/Linux > INSTFLG : > MMDEF : /home/camm/atlas3-3.6.0/CONFIG/ARCHS/HAMMER64SSE2/gcc/gemm > ARCHDEF : /home/camm/atlas3-3.6.0/CONFIG/ARCHS/HAMMER64SSE2/gcc/misc > F2CDEFS : -DAdd__ -DStringSunStyle > CACHEEDGE: 720896 > F77 : /home/camm/usr/bin/g77, version GNU Fortran (GCC) 3.3.3 > (Debian 20040422) > F77FLAGS : -fomit-frame-pointer -O -m64 > CC : /usr/bin/gcc, version gcc (GCC) 3.3.3 (Debian 20040422) > CC FLAGS : -fomit-frame-pointer -O -mfpmath=387 -m64 > MCC : /usr/bin/gcc, version gcc (GCC) 3.3.3 (Debian 20040422) > MCCFLAGS : -fomit-frame-pointer -O -mfpmath=387 -m64 > > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ cd .. > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy$ cd .. 
> yutec at yutec-laptop:/usr/lib/python2.5/site-packages$ cd numpy > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/numpy$ cd distutils/ > > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/numpy/distutils$ > python system_info.py > lapack_info: > libraries lapack not found in /usr/local/lib > FOUND: > libraries = ['lapack'] > library_dirs = ['/usr/lib'] > language = f77 > > lapack_opt_info: > lapack_mkl_info: > mkl_info: > libraries mkl,vml,guide not found in /usr/local/lib > libraries mkl,vml,guide not found in /usr/lib > NOT AVAILABLE > > NOT AVAILABLE > > atlas_threads_info: > Setting PTATLAS=ATLAS > libraries ptf77blas,ptcblas,atlas not found in /usr/local/lib > libraries lapack_atlas not found in /usr/local/lib > libraries ptf77blas,ptcblas,atlas not found in /usr/lib/atlas > libraries lapack_atlas not found in /usr/lib/atlas > libraries ptf77blas,ptcblas,atlas not found in /usr/lib > __main__.atlas_threads_info > NOT AVAILABLE > > atlas_info: > libraries f77blas,cblas,atlas not found in /usr/local/lib > libraries lapack_atlas not found in /usr/local/lib > libraries f77blas,cblas,atlas not found in /usr/lib/atlas > libraries lapack_atlas not found in /usr/lib/atlas > __main__.atlas_info > FOUND: > libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/lib'] > language = f77 > include_dirs = ['/usr/include'] > > customize GnuFCompiler > Found executable /usr/bin/g77 > gnu: no Fortran 90 compiler found > gnu: no Fortran 90 compiler found > customize GnuFCompiler > gnu: no Fortran 90 compiler found > gnu: no Fortran 90 compiler found > customize GnuFCompiler using config > FOUND: > libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/lib'] > language = f77 > define_macros = [('ATLAS_INFO', '"\\"?.?.?\\""')] > include_dirs = ['/usr/include'] > > lapack_atlas_info: > libraries lapack_atlas,f77blas,cblas,atlas not found in /usr/local/lib > libraries lapack_atlas not found in /usr/local/lib > libraries 
lapack_atlas,f77blas,cblas,atlas not found in /usr/lib/atlas > libraries lapack_atlas not found in /usr/lib/atlas > __main__.lapack_atlas_info > FOUND: > libraries = ['lapack', 'lapack_atlas', 'f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/lib'] > language = f77 > include_dirs = ['/usr/include'] > > umfpack_info: > libraries umfpack not found in /usr/local/lib > libraries umfpack not found in /usr/lib > NOT AVAILABLE > > _pkg_config_info: > Found executable /usr/bin/pkg-config > NOT AVAILABLE > > lapack_atlas_threads_info: > Setting PTATLAS=ATLAS > libraries lapack_atlas,ptf77blas,ptcblas,atlas not found in /usr/local/lib > libraries lapack_atlas not found in /usr/local/lib > libraries lapack_atlas,ptf77blas,ptcblas,atlas not found in /usr/lib/atlas > libraries lapack_atlas not found in /usr/lib/atlas > libraries lapack_atlas,ptf77blas,ptcblas,atlas not found in /usr/lib > __main__.lapack_atlas_threads_info > NOT AVAILABLE > > x11_info: > libraries X11 not found in /usr/lib64 > NOT AVAILABLE > > blas_info: > libraries blas not found in /usr/local/lib > FOUND: > libraries = ['blas'] > library_dirs = ['/usr/lib'] > language = f77 > > fftw_info: > libraries fftw3 not found in /usr/local/lib > FOUND: > libraries = ['fftw3'] > library_dirs = ['/usr/lib'] > define_macros = [('SCIPY_FFTW3_H', None)] > include_dirs = ['/usr/include'] > > f2py_info: > FOUND: > sources = ['/usr/lib/python2.5/site-packages/numpy/f2py/src/fortranobject.c'] > include_dirs = ['/usr/lib/python2.5/site-packages/numpy/f2py/src'] > > gdk_pixbuf_xlib_2_info: > NOT AVAILABLE > > dfftw_threads_info: > libraries drfftw_threads,dfftw_threads not found in /usr/local/lib > libraries drfftw_threads,dfftw_threads not found in /usr/lib > dfftw threads not found > NOT AVAILABLE > > atlas_blas_info: > libraries f77blas,cblas,atlas not found in /usr/local/lib > libraries f77blas,cblas,atlas not found in /usr/lib/atlas > FOUND: > libraries = ['f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/lib'] > 
language = c > include_dirs = ['/usr/include'] > > fftw3_info: > libraries fftw3 not found in /usr/local/lib > FOUND: > libraries = ['fftw3'] > library_dirs = ['/usr/lib'] > define_macros = [('SCIPY_FFTW3_H', None)] > include_dirs = ['/usr/include'] > > blas_opt_info: > blas_mkl_info: > libraries mkl,vml,guide not found in /usr/local/lib > libraries mkl,vml,guide not found in /usr/lib > NOT AVAILABLE > > atlas_blas_threads_info: > Setting PTATLAS=ATLAS > libraries ptf77blas,ptcblas,atlas not found in /usr/local/lib > libraries ptf77blas,ptcblas,atlas not found in /usr/lib/atlas > libraries ptf77blas,ptcblas,atlas not found in /usr/lib > NOT AVAILABLE > > customize GnuFCompiler > gnu: no Fortran 90 compiler found > gnu: no Fortran 90 compiler found > customize GnuFCompiler > gnu: no Fortran 90 compiler found > gnu: no Fortran 90 compiler found > customize GnuFCompiler using config > FOUND: > libraries = ['f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/lib'] > language = c > define_macros = [('ATLAS_INFO', '"\\"?.?.?\\""')] > include_dirs = ['/usr/include'] > > sfftw_info: > libraries srfftw,sfftw not found in /usr/local/lib > libraries srfftw,sfftw not found in /usr/lib > sfftw not found > NOT AVAILABLE > > xft_info: > NOT AVAILABLE > > fft_opt_info: > djbfft_info: > NOT AVAILABLE > > FOUND: > libraries = ['fftw3'] > library_dirs = ['/usr/lib'] > define_macros = [('SCIPY_FFTW3_H', None)] > include_dirs = ['/usr/include'] > > gdk_x11_2_info: > NOT AVAILABLE > > agg2_info: > NOT AVAILABLE > > numarray_info: > NOT AVAILABLE > > blas_src_info: > NOT AVAILABLE > > fftw2_info: > libraries rfftw,fftw not found in /usr/local/lib > FOUND: > libraries = ['rfftw', 'fftw'] > library_dirs = ['/usr/lib'] > define_macros = [('SCIPY_FFTW_H', None)] > include_dirs = ['/usr/include'] > > fftw_threads_info: > libraries rfftw_threads,fftw_threads not found in /usr/local/lib > FOUND: > libraries = ['rfftw_threads', 'fftw_threads'] > library_dirs = ['/usr/lib'] > define_macros = 
[('SCIPY_FFTW_THREADS_H', None)] > include_dirs = ['/usr/include'] > > _numpy_info: > FOUND: > define_macros = [('NUMERIC_VERSION', '"\\"24.2\\""'), ('NUMERIC', None)] > include_dirs = ['/usr/include/python2.5'] > > wx_info: > Could not locate executable wx-config > File not found: None. Cannot determine wx info. > NOT AVAILABLE > > gdk_info: > NOT AVAILABLE > > gtkp_x11_2_info: > NOT AVAILABLE > > sfftw_threads_info: > libraries srfftw_threads,sfftw_threads not found in /usr/local/lib > libraries srfftw_threads,sfftw_threads not found in /usr/lib > sfftw threads not found > NOT AVAILABLE > > boost_python_info: > NOT AVAILABLE > > freetype2_info: > NOT AVAILABLE > > gdk_2_info: > NOT AVAILABLE > > dfftw_info: > libraries drfftw,dfftw not found in /usr/local/lib > libraries drfftw,dfftw not found in /usr/lib > dfftw not found > NOT AVAILABLE > > lapack_src_info: > NOT AVAILABLE > > gtkp_2_info: > NOT AVAILABLE > > gdk_pixbuf_2_info: > NOT AVAILABLE > > amd_info: > libraries amd not found in /usr/local/lib > libraries amd not found in /usr/lib > NOT AVAILABLE > > Numeric_info: > FOUND: > define_macros = [('NUMERIC_VERSION', '"\\"24.2\\""'), ('NUMERIC', None)] > include_dirs = ['/usr/include/python2.5'] > > numerix_info: > numpy_info: > FOUND: > define_macros = [('NUMPY_VERSION', '"\\"1.0.4\\""'), ('NUMPY', None)] > > FOUND: > define_macros = [('NUMPY_VERSION', '"\\"1.0.4\\""'), ('NUMPY', None)] > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > From s.mientki at ru.nl Fri Nov 23 15:16:48 2007 From: s.mientki at ru.nl (Stef Mientki) Date: Fri, 23 Nov 2007 21:16:48 +0100 Subject: [SciPy-user] this must sound stupid ... (how to delete and destroy an AUI-pane ?) 
Message-ID: <47473530.7060200@ru.nl>

hello,

I want to dynamically create, modify and delete AUI-panes in an
AUI_DockingWindowMgr.

Now I can create panes:

self._mgr.AddPane( self.CreateHTMLCtrl(), wx.aui.AuiPaneInfo().
                   Name("test3").Caption("Client Size Reporter").
                   Left().
                   CloseButton ( False ).
                   MaximizeButton ( False ) )

I can find the pane again:

pane = self._mgr.GetPane("test2")

but how do I delete and destroy a pane? The next statements all return errors :-(

pane.Close()
self._mgr.DetachPane(pane)

thanks,
Stef

From cohen at slac.stanford.edu Fri Nov 23 15:11:38 2007
From: cohen at slac.stanford.edu (Johann Cohen-Tanugi)
Date: Fri, 23 Nov 2007 12:11:38 -0800
Subject: [SciPy-user] Building with a non-standard gfortran
In-Reply-To: <4BD1DAD9-E9B8-4AE2-93E5-FE46B146830D@physics.mcmaster.ca>
References: <799406d60711200603t8e451cah2e13add2bd794a11@mail.gmail.com>
 <4BD1DAD9-E9B8-4AE2-93E5-FE46B146830D@physics.mcmaster.ca>
Message-ID: <474733FA.1000108@slac.stanford.edu>

David M. Cooke wrote:
> On Nov 20, 2007, at 09:03 , Adam Mercer wrote:
>
>> Hi
>>
>> Using MacPorts, the gfortran compiler, from gcc-4.2.x, is installed as
>> gfortran-mp-4.2. I have been trying to build scipy with this compiler
>> and found that numpy I need to build and install with
>>
>> $ python setup.py config_fc --fcompiler gnu95 \
>>     --f77exec /opt/local/bin/gfortran-mp-4.2 \
>>     --f90exec /opt/local/bin/gfortran-mp-4.2 build
>> $ python setup.py install --prefix=${NUMPY_LOCATION}
>>
>> however using the above for scipy fails on the install phase with it
>> being unable to locate a fortran compiler. I therefore need to
>> install scipy with
>>
>> $ python setup.py config_fc --fcompiler gnu95 \
>>     --f77exec /opt/local/bin/gfortran-mp-4.2 \
>>     --f90exec /opt/local/bin/gfortran-mp-4.2 install --prefix=${SCIPY_LOCATION}
>>
>> Is this expected, as I would have expected install to inherit the
>> options that were used during the build phase?
>>
>
> The options used in the build phase aren't stored, so, yes, they need
> to be respecified. For numpy it doesn't matter, as the Fortran
> compiler isn't used. Since we use distutils, you can set the config_fc
> options in a file (either globally for the user in a
> ~/.pydistutils.cfg, or per-package in a setup.cfg next to the setup.py
> you're running) by adding the following section to the appropriate file
>
> [config_fc]
> fcompiler=gfortran
> f77exec=gfortran-mp-4.2
> f90exec=gfortran-mp-4.2
>

Thanks for the tip, David. I tried it:

[cohen at localhost scipy-svn]$ which gfortran
/usr/bin/gfortran
[cohen at localhost scipy-svn]$ more ~/.pydistutils.cfg
[config_fc]
fcompiler=gfortran
f77exec=gfortran
f90exec=gfortran

and 'python setup.py build' does correctly use gfortran, but both numpy and scipy keep putting out

customize GnuFCompiler
gnu: no Fortran 90 compiler found
gnu: no Fortran 90 compiler found
customize GnuFCompiler
gnu: no Fortran 90 compiler found
gnu: no Fortran 90 compiler found
customize GnuFCompiler using build_ext

at install time.... Also, now that I am at it, there seems to be a doubling of the mach library, which should be fixed for cleanliness:

customize UnixCCompiler using build_ext
library 'mach' defined more than once, overwriting build_info
{'sources': ['scipy/integrate/mach/i1mach.f', 'scipy/integrate/mach/xerror.f', 'scipy/integrate/mach/r1mach.f', 'scipy/integrate/mach/d1mach.f'], 'config_fc': {'noopt': ('scipy/integrate/setup.pyc', 1)}, 'source_languages': ['f77']}...
with
{'sources': ['scipy/special/mach/i1mach.f', 'scipy/special/mach/xerror.f', 'scipy/special/mach/r1mach.f', 'scipy/special/mach/d1mach.f'], 'config_fc': {'noopt': ('scipy/special/setup.pyc', 1)}, 'source_languages': ['f77']}...

best,
Johann

From s.mientki at ru.nl Fri Nov 23 15:20:09 2007
From: s.mientki at ru.nl (Stef Mientki)
Date: Fri, 23 Nov 2007 21:20:09 +0100
Subject: [SciPy-user] sorry wrong list, was: this must sound stupid ...
(how to delete and destroy an AUI-pane ?) In-Reply-To: <47473530.7060200@ru.nl> References: <47473530.7060200@ru.nl> Message-ID: <474735F9.2050203@ru.nl> sorry wrong list ;-) Stef Mientki wrote: > hello, > > I want to dynamically create, modify and delete AUI-panes in an > AUI_DockingWindowMgr, > > Now I can create panes: > self._mgr.AddPane( self.CreateHTMLCtrl(), wx.aui.AuiPaneInfo(). > Name("test3").Caption("Client Size Reporter"). > Left(). > CloseButton ( False ). > MaximizeButton ( False ) ) > > I can find the pane again > pane = self._mgr.GetPane("test2") > > but how do I delete and destroy a pane ? > The next statements all return errors :-( > pane.Close() > self._mgr.DetachPane(pane) > > thanks, > Stef > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > > > From gnurser at googlemail.com Fri Nov 23 18:40:34 2007 From: gnurser at googlemail.com (George Nurser) Date: Fri, 23 Nov 2007 23:40:34 +0000 Subject: [SciPy-user] Building with a non-standard gfortran In-Reply-To: <4BD1DAD9-E9B8-4AE2-93E5-FE46B146830D@physics.mcmaster.ca> References: <799406d60711200603t8e451cah2e13add2bd794a11@mail.gmail.com> <4BD1DAD9-E9B8-4AE2-93E5-FE46B146830D@physics.mcmaster.ca> Message-ID: <1d1e6ea70711231540k219f0b17h864ca472a2e46805@mail.gmail.com> I've got an extended version of the same query. I have installed gcc43 into /opt/local/bin from macports, and would like to use this gcc (and associated c++ and gfortran) to compile numpy/scipy/pytables/matplotlib. /opt/local/bin is at the start of my $PATH I understand I create ~/.pydistutils.cfg something like [config_fc] fcompiler=gfortran f77exec=gfortran-mp-4.3 f90exec=gfortran-mp-4.3 opt='-mtune=core2 -march=core2' [config] compiler=gcc cc=gcc-mp-4.3 But ... 1. how do I specify the gcc4.3 C++ compiler for matplotlib etc.? 2. how do I specify the C & C++ compiler options (like -mtune=core2, -march=core2)? 3. do I need to add /opt/local/lib to my $LD_LIBRARY_PATH? Or is there a less drastic method? 4. Is there anything else I need? Apologies if I've missed the documentation. Regards, George Nurser. From markbak at gmail.com Sat Nov 24 08:05:07 2007 From: markbak at gmail.com (Mark Bakker) Date: Sat, 24 Nov 2007 14:05:07 +0100 Subject: [SciPy-user] Installation of Python/mpl/numpy/scipy/Ipython for students Message-ID: <6946b9500711240505n711f66c6m5e2b63db294a1957@mail.gmail.com> A question was posted a week or so ago on the scipy list about installing Python/mpl/numpy/scipy/Ipython by David Arnold (both Windows and Mac). He was interested in a simple install for students (and others that are less computer-savvy). We had a little off-list correspondence, but thought to bring it back to the list, also posting on mpl (because I read that list and it seems pretty responsive) and the numpy Google discussion group (they respond VERY quickly). On the scipy list it was suggested to use Sage, but that sounds like a little overkill to me. I have used the enthought package on Windows successfully in class in the past, but I understand that they have now switched to eggs, which is not convenient (yet). I also don't like all the other things I have to set myself: a switch on the IDLE GUI to get mpl to work interactively (which now seems impossible to do), and setting my PYTHONPATH with environment variables. It is all easy to do, but not for students, especially students without much knowledge of how computers work, let alone access to set these variables and options. At one point I was reading about making a superpack (or a name like that) installation for Python with our favorite packages as mentioned above. I am not sure where that is going. Are others dreaming about such a superpack? What needs to be done to get it going? Mark -------------- next part -------------- An HTML attachment was scrubbed...
URL: From anand.prabhakar.patil at gmail.com Sat Nov 24 11:39:49 2007 From: anand.prabhakar.patil at gmail.com (Anand Patil) Date: Sat, 24 Nov 2007 08:39:49 -0800 Subject: [SciPy-user] sparse BLAS in scipy? Message-ID: <2bc7a5a50711240839m19024b4aw9f73f4217f9a27ec@mail.gmail.com> Hi all, Is there a set of python-callable sparse BLAS out there yet? I haven't found one, and not for lack of Googling. I'm willing to work on swigging a library if the need exists, but would like some guidance from someone with more experience with these things. First, it looks like there are three implementations that would be good for this purpose: - NIST sparse BLAS - SparseLib++ (superseded by above?) - Boost's ublas with sparse template parameters. Questions: - Do I need to do this, or are wrappers already available? - If so, which library would be best? I lean toward Boost, just because it's so broadly templatized that scripting a wrapper for all the sparse-sparse and sparse-dense versions should be relatively easy. - What should the calling conventions from Python be like? - Any other pointers? (things I should know about numpy.i, for example). Anand From lev at vpac.org Sun Nov 25 17:11:57 2007 From: lev at vpac.org (Lev Lafayette) Date: Mon, 26 Nov 2007 09:11:57 +1100 Subject: [SciPy-user] scipy 0.6 installation problem In-Reply-To: <11001d3e0711230918q1ab5e650s53c98fe8613420dd@mail.gmail.com> References: <11001d3e0711230918q1ab5e650s53c98fe8613420dd@mail.gmail.com> Message-ID: <1196028717.4934.9.camel@sys09.in.vpac.org> On Fri, 2007-11-23 at 12:18 -0500, Yutec Sun wrote: > Hello, > I'd like to ask your favor for helping me install scipy 0.6. > I have a problem importing scipy. In fact, importing doesn't show an > error message, but scipy.test produces failures. > My OS is Ubuntu 7.10 (Gutsy Gibbon) amd64. Following the installation > instruction, I attach the messages for further information.
> Please understand that I'm new to linux system and scipy, and let me > know how to successfully install scipy step by step. > I thank you for your help. > 64bit, eh? Have you done these modifications? http://scipy.org/scipy/scipy/changeset/3498 All the best, Lev From strawman at astraw.com Mon Nov 26 00:35:55 2007 From: strawman at astraw.com (Andrew Straw) Date: Sun, 25 Nov 2007 21:35:55 -0800 Subject: [SciPy-user] scipy 0.6 installation problem In-Reply-To: <11001d3e0711230918q1ab5e650s53c98fe8613420dd@mail.gmail.com> References: <11001d3e0711230918q1ab5e650s53c98fe8613420dd@mail.gmail.com> Message-ID: <474A5B3B.40408@astraw.com> Hi Yutec, You can install my version: http://debs.astraw.com/gutsy/python-scipy_0.6.0-1ubuntu0~0ads2_amd64.deb -Andrew Yutec Sun wrote: > Hello, > I'd like to ask your favor for helping me install scipy 0.6. > I have a problem importing scipy. In fact, importing doesn't show an > error message, but scipy.test produces failures. > My OS is Ubuntu 7.10 (Gutsy Gibbon) amd64. Following the installation > instruction, I attach the messages for further information. > Please understand that I'm new to linux system and scipy, and let me > know how to successfully install scipy step by step. > I thank you for your help. > > Best regards, > > Yutec > > > -------------------------------------------------------------------------------------------------------------- > 1. Error message > -------------------------------------------------------------------------------------------------------------- > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/numpy/distutils$ python > Python 2.5.1 (r251:54863, Oct 5 2007, 13:50:07) > [GCC 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2)] on linux2 > Type "help", "copyright", "credits" or "license" for more information. 
>>>> import scipy >>>> scipy.test(level=1) > Found 9/9 tests for scipy.cluster.vq > Found 18/18 tests for scipy.fftpack.basic > Found 4/4 tests for scipy.fftpack.helper > Found 20/20 tests for scipy.fftpack.pseudo_diffs > Found 1/1 tests for scipy.integrate > Found 10/10 tests for scipy.integrate.quadpack > Found 3/3 tests for scipy.integrate.quadrature > Found 6/6 tests for scipy.interpolate > Found 6/6 tests for scipy.interpolate.fitpack > Found 4/4 tests for scipy.io.array_import > Found 28/28 tests for scipy.io.mio > Found 13/13 tests for scipy.io.mmio > Found 5/5 tests for scipy.io.npfile > Found 4/4 tests for scipy.io.recaster > Found 16/16 tests for scipy.lib.blas > Found 128/128 tests for scipy.lib.blas.fblas > Found 42/42 tests for scipy.lib.lapack > Found 41/41 tests for scipy.linalg.basic > '/usr/lib/python2.5/site-packages/scipy/linalg/fblas.so'> > Found 16/16 tests for scipy.linalg.blas > Found 72/72 tests for scipy.linalg.decomp > Found 128/128 tests for scipy.linalg.fblas > Found 6/6 tests for scipy.linalg.iterative > Found 4/4 tests for scipy.linalg.lapack > Found 7/7 tests for scipy.linalg.matfuncs > Found 9/9 tests for scipy.linsolve.umfpack > Found 2/2 tests for scipy.maxentropy > Found 3/3 tests for scipy.misc.pilutil > Found 399/399 tests for scipy.ndimage > Found 5/5 tests for scipy.odr > Found 8/8 tests for scipy.optimize > Found 1/1 tests for scipy.optimize.cobyla > Found 10/10 tests for scipy.optimize.nonlin > Found 4/4 tests for scipy.optimize.zeros > Found 5/5 tests for scipy.signal.signaltools > Found 4/4 tests for scipy.signal.wavelets > Found 152/152 tests for scipy.sparse > Found 342/342 tests for scipy.special.basic > Found 3/3 tests for scipy.special.spfun_stats > Found 107/107 tests for scipy.stats > Found 73/73 tests for scipy.stats.distributions > Found 10/10 tests for scipy.stats.morestats > Found 0/0 tests for __main__ > /usr/lib/python2.5/site-packages/scipy/cluster/vq.py:477: UserWarning: > One of the clusters is empty. 
Re-run kmean with a different > initialization. > warnings.warn("One of the clusters is empty. " > ...exception raised as expected: One of the clusters is empty. Re-run > kmean with a different initialization. > ................................................Residual: 1.05006987327e-07 > ..................../usr/lib/python2.5/site-packages/scipy/interpolate/fitpack2.py:458: > UserWarning: > The coefficients of the spline returned have been computed as the > minimal norm least-squares solution of a (numerically) rank deficient > system (deficiency=7). If deficiency is large, the results may be > inaccurate. Deficiency may strongly depend on the value of eps. > warnings.warn(message) > ...... > Don't worry about a warning regarding the number of bytes read. > Warning: 1000000 bytes requested, 20 bytes read. > .........................................................................caxpy:n=4 > ..caxpy:n=3 > ....ccopy:n=4 > ..ccopy:n=3 > .............cscal:n=4 > ....cswap:n=4 > ..cswap:n=3 > .....daxpy:n=4 > ..daxpy:n=3 > ....dcopy:n=4 > ..dcopy:n=3 > .............dscal:n=4 > ....dswap:n=4 > ..dswap:n=3 > .....saxpy:n=4 > ..saxpy:n=3 > ....scopy:n=4 > ..scopy:n=3 > .............sscal:n=4 > ....sswap:n=4 > ..sswap:n=3 > .....zaxpy:n=4 > ..zaxpy:n=3 > ....zcopy:n=4 > ..zcopy:n=3 > .............zscal:n=4 > ....zswap:n=4 > ..zswap:n=3 > ................................................................................................................................................................................caxpy:n=4 > ..caxpy:n=3 > ....ccopy:n=4 > ..ccopy:n=3 > .............cscal:n=4 > ....cswap:n=4 > ..cswap:n=3 > .....daxpy:n=4 > ..daxpy:n=3 > ....dcopy:n=4 > ..dcopy:n=3 > .............dscal:n=4 > ....dswap:n=4 > ..dswap:n=3 > .....saxpy:n=4 > ..saxpy:n=3 > ....scopy:n=4 > ..scopy:n=3 > .............sscal:n=4 > ....sswap:n=4 > ..sswap:n=3 > .....zaxpy:n=4 > ..zaxpy:n=3 > ....zcopy:n=4 > ..zcopy:n=3 > .............zscal:n=4 > ....zswap:n=4 > ..zswap:n=3 > 
.............Result may be inaccurate, approximate err = 1.56163357201e-08 > ...Result may be inaccurate, approximate err = 1.25221600096e-10 > ......Use minimum degree ordering on A'+A. > ..Use minimum degree ordering on A'+A. > ...Use minimum degree ordering on A'+A. > ............................................................................................................./usr/lib/python2.5/site-packages/scipy/ndimage/interpolation.py:41: > UserWarning: Mode "reflect" may yield incorrect results on boundaries. > Please use "mirror" instead. > warnings.warn('Mode "reflect" may yield incorrect results on ' > ........................................................................................................................................................................................................................................................................................................F..F.........................................................Use > minimum degree ordering on A'+A. > .....................................Use minimum degree ordering on A'+A. > .....................................Use minimum degree ordering on A'+A. > ................................Use minimum degree ordering on A'+A. > ....................................................................................................................................................................................................................................................................................................................................................0.2 > 0.2 > 0.2 > ......0.2 > ..0.2 > 0.2 > 0.2 > 0.2 > 0.2 > .........................................................................................................................................................................................................Ties > preclude use of exact statistic. > ..Ties preclude use of exact statistic. > ...... 
> ====================================================================== > FAIL: test_explicit (scipy.tests.test_odr.test_odr) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/usr/lib/python2.5/site-packages/scipy/odr/tests/test_odr.py", > line 50, in test_explicit > -8.7849712165253724e-02]), > File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line > 232, in assert_array_almost_equal > header='Arrays are not almost equal') > File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line > 217, in assert_array_compare > assert cond, msg > AssertionError: > Arrays are not almost equal > > (mismatch 100.0%) > x: array([ 1.26462971e+03, -5.42545890e+01, -8.64250389e-02]) > y: array([ 1.26465481e+03, -5.40184100e+01, -8.78497122e-02]) > > ====================================================================== > FAIL: test_multi (scipy.tests.test_odr.test_odr) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/usr/lib/python2.5/site-packages/scipy/odr/tests/test_odr.py", > line 191, in test_multi > 0.5101147161764654, 0.5173902330489161]), > File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line > 232, in assert_array_almost_equal > header='Arrays are not almost equal') > File "/usr/lib/python2.5/site-packages/numpy/testing/utils.py", line > 217, in assert_array_compare > assert cond, msg > AssertionError: > Arrays are not almost equal > > (mismatch 100.0%) > x: array([ 4.31272063, 2.44289312, 7.76215871, 0.55995622, 0.46423343]) > y: array([ 4.37998803, 2.43330576, 8.00288459, 0.51011472, 0.51739023]) > > ---------------------------------------------------------------------- > Ran 1728 tests in 3.660s > > FAILED (failures=2) > > > > > -------------------------------------------------------------------------------------------------------------- > 2. 
System check > -------------------------------------------------------------------------------------------------------------- > yutec at yutec-laptop:~$ python -c 'import os,sys;print os.name,sys.platform' > posix linux2 > > yutec at yutec-laptop:~$ uname -a > Linux yutec-laptop 2.6.22-14-generic #1 SMP Sun Oct 14 21:45:15 GMT > 2007 x86_64 GNU/Linux > > yutec at yutec-laptop:~$ gcc -v > Using built-in specs. > Target: x86_64-linux-gnu > Configured with: ../src/configure -v > --enable-languages=c,c++,fortran,objc,obj-c++,treelang --prefix=/usr > --enable-shared --with-system-zlib --libexecdir=/usr/lib > --without-included-gettext --enable-threads=posix --enable-nls > --with-gxx-include-dir=/usr/include/c++/4.1.3 --program-suffix=-4.1 > --enable-__cxa_atexit --enable-clocale=gnu --enable-libstdcxx-debug > --enable-mpfr --enable-checking=release x86_64-linux-gnu > Thread model: posix > gcc version 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2) > > yutec at yutec-laptop:~$ g77 --version > GNU Fortran (GCC) 3.4.6 (Ubuntu 3.4.6-6ubuntu2) > Copyright (C) 2006 Free Software Foundation, Inc. > > GNU Fortran comes with NO WARRANTY, to the extent permitted by law. > You may redistribute copies of GNU Fortran > under the terms of the GNU General Public License. > For more information about these matters, see the file named COPYING > or type the command `info -f g77 Copying'. 
> > yutec at yutec-laptop:~$ python -c 'import sys;print sys.version' > 2.5.1 (r251:54863, Oct 5 2007, 13:50:07) > [GCC 4.1.3 20070929 (prerelease) (Ubuntu 4.1.2-16ubuntu2)] > > yutec at yutec-laptop:~$ python -c 'import numpy;print numpy.__version__' > 1.0.4 > > yutec at yutec-laptop:~$ cd /usr/lib/python2.5/site-packages/scipy/ > > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy$ cd linalg > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ > python setup_atlas_version.py build_ext --inplace --force > Traceback (most recent call last): > File "setup_atlas_version.py", line 7, in > from numpy.distutils.misc_util import get_path, default_config_dict > ImportError: cannot import name get_path > > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ > python -c 'import atlas_version' > ATLAS version 3.6.0 built by root on Thu Jun 3 21:13:07 UTC 2004: > UNAME : Linux ravel 2.6.5 #3 SMP Tue Apr 13 13:41:54 UTC 2004 > x86_64 GNU/Linux > INSTFLG : > MMDEF : /home/camm/atlas3-3.6.0/CONFIG/ARCHS/HAMMER64SSE2/gcc/gemm > ARCHDEF : /home/camm/atlas3-3.6.0/CONFIG/ARCHS/HAMMER64SSE2/gcc/misc > F2CDEFS : -DAdd__ -DStringSunStyle > CACHEEDGE: 720896 > F77 : /home/camm/usr/bin/g77, version GNU Fortran (GCC) 3.3.3 > (Debian 20040422) > F77FLAGS : -fomit-frame-pointer -O -m64 > CC : /usr/bin/gcc, version gcc (GCC) 3.3.3 (Debian 20040422) > CC FLAGS : -fomit-frame-pointer -O -mfpmath=387 -m64 > MCC : /usr/bin/gcc, version gcc (GCC) 3.3.3 (Debian 20040422) > MCCFLAGS : -fomit-frame-pointer -O -mfpmath=387 -m64 > > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy/linalg$ cd .. > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/scipy$ cd .. 
> yutec at yutec-laptop:/usr/lib/python2.5/site-packages$ cd numpy > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/numpy$ cd distutils/ > > yutec at yutec-laptop:/usr/lib/python2.5/site-packages/numpy/distutils$ > python system_info.py > lapack_info: > libraries lapack not found in /usr/local/lib > FOUND: > libraries = ['lapack'] > library_dirs = ['/usr/lib'] > language = f77 > > lapack_opt_info: > lapack_mkl_info: > mkl_info: > libraries mkl,vml,guide not found in /usr/local/lib > libraries mkl,vml,guide not found in /usr/lib > NOT AVAILABLE > > NOT AVAILABLE > > atlas_threads_info: > Setting PTATLAS=ATLAS > libraries ptf77blas,ptcblas,atlas not found in /usr/local/lib > libraries lapack_atlas not found in /usr/local/lib > libraries ptf77blas,ptcblas,atlas not found in /usr/lib/atlas > libraries lapack_atlas not found in /usr/lib/atlas > libraries ptf77blas,ptcblas,atlas not found in /usr/lib > __main__.atlas_threads_info > NOT AVAILABLE > > atlas_info: > libraries f77blas,cblas,atlas not found in /usr/local/lib > libraries lapack_atlas not found in /usr/local/lib > libraries f77blas,cblas,atlas not found in /usr/lib/atlas > libraries lapack_atlas not found in /usr/lib/atlas > __main__.atlas_info > FOUND: > libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/lib'] > language = f77 > include_dirs = ['/usr/include'] > > customize GnuFCompiler > Found executable /usr/bin/g77 > gnu: no Fortran 90 compiler found > gnu: no Fortran 90 compiler found > customize GnuFCompiler > gnu: no Fortran 90 compiler found > gnu: no Fortran 90 compiler found > customize GnuFCompiler using config > FOUND: > libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/lib'] > language = f77 > define_macros = [('ATLAS_INFO', '"\\"?.?.?\\""')] > include_dirs = ['/usr/include'] > > lapack_atlas_info: > libraries lapack_atlas,f77blas,cblas,atlas not found in /usr/local/lib > libraries lapack_atlas not found in /usr/local/lib > libraries 
lapack_atlas,f77blas,cblas,atlas not found in /usr/lib/atlas > libraries lapack_atlas not found in /usr/lib/atlas > __main__.lapack_atlas_info > FOUND: > libraries = ['lapack', 'lapack_atlas', 'f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/lib'] > language = f77 > include_dirs = ['/usr/include'] > > umfpack_info: > libraries umfpack not found in /usr/local/lib > libraries umfpack not found in /usr/lib > NOT AVAILABLE > > _pkg_config_info: > Found executable /usr/bin/pkg-config > NOT AVAILABLE > > lapack_atlas_threads_info: > Setting PTATLAS=ATLAS > libraries lapack_atlas,ptf77blas,ptcblas,atlas not found in /usr/local/lib > libraries lapack_atlas not found in /usr/local/lib > libraries lapack_atlas,ptf77blas,ptcblas,atlas not found in /usr/lib/atlas > libraries lapack_atlas not found in /usr/lib/atlas > libraries lapack_atlas,ptf77blas,ptcblas,atlas not found in /usr/lib > __main__.lapack_atlas_threads_info > NOT AVAILABLE > > x11_info: > libraries X11 not found in /usr/lib64 > NOT AVAILABLE > > blas_info: > libraries blas not found in /usr/local/lib > FOUND: > libraries = ['blas'] > library_dirs = ['/usr/lib'] > language = f77 > > fftw_info: > libraries fftw3 not found in /usr/local/lib > FOUND: > libraries = ['fftw3'] > library_dirs = ['/usr/lib'] > define_macros = [('SCIPY_FFTW3_H', None)] > include_dirs = ['/usr/include'] > > f2py_info: > FOUND: > sources = ['/usr/lib/python2.5/site-packages/numpy/f2py/src/fortranobject.c'] > include_dirs = ['/usr/lib/python2.5/site-packages/numpy/f2py/src'] > > gdk_pixbuf_xlib_2_info: > NOT AVAILABLE > > dfftw_threads_info: > libraries drfftw_threads,dfftw_threads not found in /usr/local/lib > libraries drfftw_threads,dfftw_threads not found in /usr/lib > dfftw threads not found > NOT AVAILABLE > > atlas_blas_info: > libraries f77blas,cblas,atlas not found in /usr/local/lib > libraries f77blas,cblas,atlas not found in /usr/lib/atlas > FOUND: > libraries = ['f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/lib'] > 
language = c > include_dirs = ['/usr/include'] > > fftw3_info: > libraries fftw3 not found in /usr/local/lib > FOUND: > libraries = ['fftw3'] > library_dirs = ['/usr/lib'] > define_macros = [('SCIPY_FFTW3_H', None)] > include_dirs = ['/usr/include'] > > blas_opt_info: > blas_mkl_info: > libraries mkl,vml,guide not found in /usr/local/lib > libraries mkl,vml,guide not found in /usr/lib > NOT AVAILABLE > > atlas_blas_threads_info: > Setting PTATLAS=ATLAS > libraries ptf77blas,ptcblas,atlas not found in /usr/local/lib > libraries ptf77blas,ptcblas,atlas not found in /usr/lib/atlas > libraries ptf77blas,ptcblas,atlas not found in /usr/lib > NOT AVAILABLE > > customize GnuFCompiler > gnu: no Fortran 90 compiler found > gnu: no Fortran 90 compiler found > customize GnuFCompiler > gnu: no Fortran 90 compiler found > gnu: no Fortran 90 compiler found > customize GnuFCompiler using config > FOUND: > libraries = ['f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/lib'] > language = c > define_macros = [('ATLAS_INFO', '"\\"?.?.?\\""')] > include_dirs = ['/usr/include'] > > sfftw_info: > libraries srfftw,sfftw not found in /usr/local/lib > libraries srfftw,sfftw not found in /usr/lib > sfftw not found > NOT AVAILABLE > > xft_info: > NOT AVAILABLE > > fft_opt_info: > djbfft_info: > NOT AVAILABLE > > FOUND: > libraries = ['fftw3'] > library_dirs = ['/usr/lib'] > define_macros = [('SCIPY_FFTW3_H', None)] > include_dirs = ['/usr/include'] > > gdk_x11_2_info: > NOT AVAILABLE > > agg2_info: > NOT AVAILABLE > > numarray_info: > NOT AVAILABLE > > blas_src_info: > NOT AVAILABLE > > fftw2_info: > libraries rfftw,fftw not found in /usr/local/lib > FOUND: > libraries = ['rfftw', 'fftw'] > library_dirs = ['/usr/lib'] > define_macros = [('SCIPY_FFTW_H', None)] > include_dirs = ['/usr/include'] > > fftw_threads_info: > libraries rfftw_threads,fftw_threads not found in /usr/local/lib > FOUND: > libraries = ['rfftw_threads', 'fftw_threads'] > library_dirs = ['/usr/lib'] > define_macros = 
[('SCIPY_FFTW_THREADS_H', None)] > include_dirs = ['/usr/include'] > > _numpy_info: > FOUND: > define_macros = [('NUMERIC_VERSION', '"\\"24.2\\""'), ('NUMERIC', None)] > include_dirs = ['/usr/include/python2.5'] > > wx_info: > Could not locate executable wx-config > File not found: None. Cannot determine wx info. > NOT AVAILABLE > > gdk_info: > NOT AVAILABLE > > gtkp_x11_2_info: > NOT AVAILABLE > > sfftw_threads_info: > libraries srfftw_threads,sfftw_threads not found in /usr/local/lib > libraries srfftw_threads,sfftw_threads not found in /usr/lib > sfftw threads not found > NOT AVAILABLE > > boost_python_info: > NOT AVAILABLE > > freetype2_info: > NOT AVAILABLE > > gdk_2_info: > NOT AVAILABLE > > dfftw_info: > libraries drfftw,dfftw not found in /usr/local/lib > libraries drfftw,dfftw not found in /usr/lib > dfftw not found > NOT AVAILABLE > > lapack_src_info: > NOT AVAILABLE > > gtkp_2_info: > NOT AVAILABLE > > gdk_pixbuf_2_info: > NOT AVAILABLE > > amd_info: > libraries amd not found in /usr/local/lib > libraries amd not found in /usr/lib > NOT AVAILABLE > > Numeric_info: > FOUND: > define_macros = [('NUMERIC_VERSION', '"\\"24.2\\""'), ('NUMERIC', None)] > include_dirs = ['/usr/include/python2.5'] > > numerix_info: > numpy_info: > FOUND: > define_macros = [('NUMPY_VERSION', '"\\"1.0.4\\""'), ('NUMPY', None)] > > FOUND: > define_macros = [('NUMPY_VERSION', '"\\"1.0.4\\""'), ('NUMPY', None)] > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user From timmichelsen at gmx-topmail.de Mon Nov 26 09:29:45 2007 From: timmichelsen at gmx-topmail.de (Tim Michelsen) Date: Mon, 26 Nov 2007 14:29:45 +0000 (UTC) Subject: [SciPy-user] Installation of time series failed on windows Message-ID: Hello, I tried to install the time-series package from the sandbox on my windows. It failed while maskarray was installed nicely. May someone please help me here? 
Here's a log from IPython when running the setup.py install Ignoring attempt to set 'name' (from 'timeseries' to 'timeseries.lib') Warning: Assuming default configuration (.\tests/{setup_tests,setup}.py was not found) Appending timeseries.tests configuration to timeseries Ignoring attempt to set 'name' (from 'timeseries' to 'timeseries.tests') running install running build running config_cc unifing config_cc, config, build_clib, build_ext, build commands --compiler options running config_fc unifing config_fc, config, build_clib, build_ext, build commands --fcompiler options running build_src building extension "timeseries.cseries" sources running build_py copying .\ipython_log.py -> build\lib.win32-2.5\timeseries running build_ext No module named msvccompiler in numpy.distutils, trying from distutils.. : error: Python was built with Visual Studio 2003; extensions must be built with a compiler than can generate compatible binaries. Visual Studio 2003 was not found on this system. If you have Cygwin installed, you can try compiling with MingW32, by passing "-c mingw32" to setup.py. WARNING: Failure executing file: Thanks. Tim From Alexander.Dietz at astro.cf.ac.uk Mon Nov 26 12:05:42 2007 From: Alexander.Dietz at astro.cf.ac.uk (Alexander Dietz) Date: Mon, 26 Nov 2007 17:05:42 +0000 Subject: [SciPy-user] Problems installing scipy 0.6 on FC5 Message-ID: <9cf809a00711260905o3c0d14behed14925157c0be9b@mail.gmail.com> Hi, I am running FC5 (I think, how to figure out) and I followed the installation instructions for FC5. The last step, however, fails with the following error messages: > python setup.py install ... ... 
g77:f77: scipy/fftpack/dfftpack/zfftb.f g77:f77: scipy/fftpack/dfftpack/dsinti.f g77:f77: scipy/fftpack/dfftpack/dfftb.f g77:f77: scipy/fftpack/dfftpack/zfftb1.f /tmp/ccuB5p4W.s: Assembler messages: /tmp/ccuB5p4W.s:589: Error: suffix or operands invalid for `movd' /tmp/ccuB5p4W.s:2948: Error: suffix or operands invalid for `movd' /tmp/ccuB5p4W.s: Assembler messages: /tmp/ccuB5p4W.s:589: Error: suffix or operands invalid for `movd' /tmp/ccuB5p4W.s:2948: Error: suffix or operands invalid for `movd' error: Command "/usr/bin/g77 -g -Wall -fno-second-underscore -fPIC -O2 -funroll-loops -march=i686 -mmmx -msse2 -msse -fomit-frame-pointer -malign-double -c -c scipy/fftpack/dfftpack/zfftb1.f -o build/temp.linux- i686-2.4/scipy/fftpack/dfftpack/zfftb1.o" failed with exit status 1 Any ideas how to solve this issue? Cheers Alex -------------- next part -------------- An HTML attachment was scrubbed... URL: From mattknox_ca at hotmail.com Mon Nov 26 13:07:12 2007 From: mattknox_ca at hotmail.com (Matt Knox) Date: Mon, 26 Nov 2007 18:07:12 +0000 (UTC) Subject: [SciPy-user] Installation of time series failed on windows References: Message-ID: Hi Tim, I just added a section to the wiki about compiling extension modules on Windows. Please give the following a try: http://www.scipy.org/Cookbook/CompilingExtensionsOnWindowsWithMinGW If you are still having trouble after that, let me know. - Matt From david at ar.media.kyoto-u.ac.jp Mon Nov 26 22:19:35 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Tue, 27 Nov 2007 12:19:35 +0900 Subject: [SciPy-user] Problems installing scipy 0.6 on FC5 In-Reply-To: <9cf809a00711260905o3c0d14behed14925157c0be9b@mail.gmail.com> References: <9cf809a00711260905o3c0d14behed14925157c0be9b@mail.gmail.com> Message-ID: <474B8CC7.7010108@ar.media.kyoto-u.ac.jp> Alexander Dietz wrote: > Hi, > > I am running FC5 (I think, how to figure out) and I followed the > installation instructions for FC5. 
The last step, however, fails with > the following error messages: > > > python setup.py install > ... > ... > g77:f77: scipy/fftpack/dfftpack/zfftb.f > g77:f77: scipy/fftpack/dfftpack/dsinti.f > g77:f77: scipy/fftpack/dfftpack/dfftb.f > g77:f77: scipy/fftpack/dfftpack/zfftb1.f > /tmp/ccuB5p4W.s: Assembler messages: > /tmp/ccuB5p4W.s:589: Error: suffix or operands invalid for `movd' > /tmp/ccuB5p4W.s:2948: Error: suffix or operands invalid for `movd' > /tmp/ccuB5p4W.s: Assembler messages: > /tmp/ccuB5p4W.s:589: Error: suffix or operands invalid for `movd' > /tmp/ccuB5p4W.s:2948: Error: suffix or operands invalid for `movd' > error: Command "/usr/bin/g77 -g -Wall -fno-second-underscore -fPIC -O2 > -funroll-loops -march=i686 -mmmx -msse2 -msse -fomit-frame-pointer > -malign-double -c -c scipy/fftpack/dfftpack/zfftb1.f -o > build/temp.linux- i686-2.4/scipy/fftpack/dfftpack/zfftb1.o" failed > with exit status 1 > > Any ideas how to solve this issue? > > Cheers > Alex > > What's your machine ? More exactly, could you give us the result of the following command (to get the exact characteristics of your CPU): cat /proc/cpuinfo ? cheers, David From strawman at astraw.com Mon Nov 26 22:34:16 2007 From: strawman at astraw.com (Andrew Straw) Date: Mon, 26 Nov 2007 19:34:16 -0800 Subject: [SciPy-user] pdf of multivariate normal In-Reply-To: References: <1194717156.583578.152820@o3g2000hsb.googlegroups.com> Message-ID: <474B9038.9030708@astraw.com> John Reid wrote: > LB wrote: >>>>> info(numpy.random.multivariate_normal) >> Return an array containing multivariate normally distributed random >> numbers > > Thanks but I don't want random variates, I would like to calculate the > probability density function. I have the same question myself. I found one implementation: scikits.learn.machine.em.densities.gauss_den() Is there a version in scipy (or numpy)? After searching, I think not, but it does seem this belongs in there -- perhaps in scipy.stats.
Any reason beyond the usual (lack of time) why it's not there already? -Andrew From Alexander.Dietz at astro.cf.ac.uk Tue Nov 27 03:59:41 2007 From: Alexander.Dietz at astro.cf.ac.uk (Alexander Dietz) Date: Tue, 27 Nov 2007 08:59:41 +0000 Subject: [SciPy-user] Problems installing scipy 0.6 on FC5 In-Reply-To: <474B8CC7.7010108@ar.media.kyoto-u.ac.jp> References: <9cf809a00711260905o3c0d14behed14925157c0be9b@mail.gmail.com> <474B8CC7.7010108@ar.media.kyoto-u.ac.jp> Message-ID: <9cf809a00711270059w117804faj7a03710d8e5fc208@mail.gmail.com> Hi, see the content of /proc/cpuinfo below. Cheers Alex processor : 0 vendor_id : GenuineIntel cpu family : 6 model : 14 model name : Genuine Intel(R) CPU T2300 @ 1.66GHz stepping : 8 cpu MHz : 1000.000 cache size : 2048 KB physical id : 0 siblings : 2 core id : 0 cpu cores : 2 fdiv_bug : no hlt_bug : no f00f_bug : no coma_bug : no fpu : yes fpu_exception : yes cpuid level : 10 wp : yes flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe nx constant_tsc pni monitor est tm2 xtpr bogomips : 3335.11 processor : 1 vendor_id : GenuineIntel cpu family : 6 model : 14 model name : Genuine Intel(R) CPU T2300 @ 1.66GHz stepping : 8 cpu MHz : 1000.000 cache size : 2048 KB physical id : 0 siblings : 2 core id : 1 cpu cores : 2 fdiv_bug : no hlt_bug : no f00f_bug : no coma_bug : no fpu : yes fpu_exception : yes cpuid level : 10 wp : yes flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe nx constant_tsc pni monitor est tm2 xtpr bogomips : 3329.30 On Nov 27, 2007 3:19 AM, David Cournapeau wrote: > Alexander Dietz wrote: > > Hi, > > > > I am running FC5 (I think, how to figure out) and I followed the > > installation instructions for FC5. The last step, however, fails with > > the following error messages: > > > > > python setup.py install > > ... > > ... 
> > g77:f77: scipy/fftpack/dfftpack/zfftb.f > > g77:f77: scipy/fftpack/dfftpack/dsinti.f > > g77:f77: scipy/fftpack/dfftpack/dfftb.f > > g77:f77: scipy/fftpack/dfftpack/zfftb1.f > > /tmp/ccuB5p4W.s: Assembler messages: > > /tmp/ccuB5p4W.s:589: Error: suffix or operands invalid for `movd' > > /tmp/ccuB5p4W.s:2948: Error: suffix or operands invalid for `movd' > > /tmp/ccuB5p4W.s: Assembler messages: > > /tmp/ccuB5p4W.s:589: Error: suffix or operands invalid for `movd' > > /tmp/ccuB5p4W.s:2948: Error: suffix or operands invalid for `movd' > > error: Command "/usr/bin/g77 -g -Wall -fno-second-underscore -fPIC -O2 > > -funroll-loops -march=i686 -mmmx -msse2 -msse -fomit-frame-pointer > > -malign-double -c -c scipy/fftpack/dfftpack/zfftb1.f -o > > build/temp.linux- i686-2.4/scipy/fftpack/dfftpack/zfftb1.o" failed > > with exit status 1 > > > > Any ideas how to solve this issue? > > > > Cheers > > Alex > > > > > What's your machine ? More exactly, could you give us the result of the > following command (to get the exact characteristics of your CPU): cat > /proc/cpuinfo ? > > cheers, > > David > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefan at sun.ac.za Tue Nov 27 04:11:41 2007 From: stefan at sun.ac.za (Stefan van der Walt) Date: Tue, 27 Nov 2007 11:11:41 +0200 Subject: [SciPy-user] pdf of multivariate normal In-Reply-To: <474B9038.9030708@astraw.com> References: <1194717156.583578.152820@o3g2000hsb.googlegroups.com> <474B9038.9030708@astraw.com> Message-ID: <20071127091141.GD29033@mentat.za.net> On Mon, Nov 26, 2007 at 07:34:16PM -0800, Andrew Straw wrote: > I have the same question myself. I found one implementation: > scikits.learn.machine.em.densities.gauss_den() > > Is there a version in scipy (or numpy)? 
After searching, I think not, > but it does seem this belongs in there -- perhaps in scipy.stats. Any > reason beyond the usual (lack of time) why it's not there already? Isn't learn.machine.em.densities.gauss_den used to estimate parameters for Gaussian mixture models? For a multivariate Gaussian, the mean and covariance can be calculated directly. Regards Stéfan From david at ar.media.kyoto-u.ac.jp Tue Nov 27 04:06:50 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Tue, 27 Nov 2007 18:06:50 +0900 Subject: [SciPy-user] Problems installing scipy 0.6 on FC5 In-Reply-To: <9cf809a00711270059w117804faj7a03710d8e5fc208@mail.gmail.com> References: <9cf809a00711260905o3c0d14behed14925157c0be9b@mail.gmail.com> <474B8CC7.7010108@ar.media.kyoto-u.ac.jp> <9cf809a00711270059w117804faj7a03710d8e5fc208@mail.gmail.com> Message-ID: <474BDE2A.5000404@ar.media.kyoto-u.ac.jp> Alexander Dietz wrote: > Hi, > > see the content of /proc/cpuinfo below. > Ok, your CPU supports the sse2 instruction set, so this is not the problem. Since you use an old FC version, it may be an old g77 bug. Could you give us the version of g77 (g77 -v)? Also, you could try to see which flag makes it fail; for example, does the following work: /usr/bin/g77 -g -Wall -fno-second-underscore -fPIC -O2 -funroll-loops -march=i686 -mmmx -msse -fomit-frame-pointer -malign-double -c scipy/fftpack/dfftpack/zfftb1.f (I suspect -msse2 is the problem with old g77; several problems related to -msse2 and movd were fixed in gcc 3.3).
cheers, David From david at ar.media.kyoto-u.ac.jp Tue Nov 27 04:18:00 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Tue, 27 Nov 2007 18:18:00 +0900 Subject: [SciPy-user] pdf of multivariate normal In-Reply-To: <20071127091141.GD29033@mentat.za.net> References: <1194717156.583578.152820@o3g2000hsb.googlegroups.com> <474B9038.9030708@astraw.com> <20071127091141.GD29033@mentat.za.net> Message-ID: <474BE0C8.3080206@ar.media.kyoto-u.ac.jp> Stefan van der Walt wrote: > On Mon, Nov 26, 2007 at 07:34:16PM -0800, Andrew Straw wrote: >> I have the same question myself. I found one implementation: >> scikits.learn.machine.em.densities.gauss_den() >> >> Is there a version in scipy (or numpy)? After searching, I think not, >> but it does seem this belongs in there -- perhaps in scipy.stats. Any >> reason beyond the usual (lack of time) why it's not there already? > > Isn't learn.machine.em.densities.gauss_den used to estimate parameters > for Gaussian mixture models? For a multivariate Gaussian, the mean > and covariance can be calculated directly. No, it is used to compute the pdf of a multivariate Gaussian, which is then used to compute the pdf of mixtures. AFAIK, there are no multivariate facilities in scipy.stats, or did I miss them? Providing the same facilities as in scipy.stats.distribution for multivariate distributions would be quite a lot of coding, I guess. For example, the cdf alone is not so easy to compute in general (but the code is already there for kde if I am right, in the mvn extension).
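[Editor's note: the multivariate normal pdf David describes can be written in a few lines of plain NumPy while nothing exists in scipy.stats. A minimal sketch -- the helper name mvn_pdf is mine, not an existing scipy or scikits API:]

```python
import numpy as np

def mvn_pdf(x, mean, cov):
    """Density of a multivariate normal evaluated at the rows of x."""
    x = np.atleast_2d(x)
    k = x.shape[-1]
    diff = x - mean
    cov_inv = np.linalg.inv(cov)
    # Mahalanobis term (x - m)^T C^-1 (x - m), one value per row of x
    maha = np.sum(np.dot(diff, cov_inv) * diff, axis=-1)
    norm = np.sqrt((2 * np.pi) ** k * np.linalg.det(cov))
    return np.exp(-0.5 * maha) / norm

# At the mean of a 2D standard normal the density is 1/(2*pi), about 0.159
print(mvn_pdf(np.zeros(2), np.zeros(2), np.eye(2)))
```

This handles the pdf only; the cdf needs numerical integration, as noted above.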
David From Alexander.Dietz at astro.cf.ac.uk Tue Nov 27 04:39:01 2007 From: Alexander.Dietz at astro.cf.ac.uk (Alexander Dietz) Date: Tue, 27 Nov 2007 09:39:01 +0000 Subject: [SciPy-user] Problems installing scipy 0.6 on FC5 In-Reply-To: <474BDE2A.5000404@ar.media.kyoto-u.ac.jp> References: <9cf809a00711260905o3c0d14behed14925157c0be9b@mail.gmail.com> <474B8CC7.7010108@ar.media.kyoto-u.ac.jp> <9cf809a00711270059w117804faj7a03710d8e5fc208@mail.gmail.com> <474BDE2A.5000404@ar.media.kyoto-u.ac.jp> Message-ID: <9cf809a00711270139h3e5ee58bk62f8af9f6322f08c@mail.gmail.com> Hi, On Nov 27, 2007 9:06 AM, David Cournapeau wrote: > Alexander Dietz wrote: > > Hi, > > > > see the content of /proc/cpuinfo below. > > > Ok, your CPU supports sse2 instruction set, so this is not the problem. > Since you use an old FC version, it may be an old g77 bug. Could you > give use the version of g77 (g77 -v). Reading specs from /usr/lib/gcc-lib/i386-redhat-linux/3.2.3/specs Configured with: ../configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --enable-shared --enable-threads=posix --disable-checking --with-system-zlib --enable-__cxa_atexit --enable-languages=c,c++,f77 --disable-libgcj --host=i386-redhat-linux Thread model: posix gcc version 3.2.3 20030502 (Red Hat Linux 3.2.3-55.fc5) > Also, you could try to see which > flag make it fails, for example, does the following: > > /usr/bin/g77 -g -Wall -fno-second-underscore -fPIC -O2 -funroll-loops > -march=i686 -mmmx -msse -fomit-frame-pointer -malign-double -c > scipy/fftpack/dfftpack/zfftb1.f > That worked actually... Cheers Alex > (I suspect the -msse2 to be a problem with old g77; several problems > related to -msse2 and movd were fixed in gcc 3.3). > > cheers, > > David > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From david at ar.media.kyoto-u.ac.jp Tue Nov 27 04:43:15 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Tue, 27 Nov 2007 18:43:15 +0900 Subject: [SciPy-user] Problems installing scipy 0.6 on FC5 In-Reply-To: <9cf809a00711270139h3e5ee58bk62f8af9f6322f08c@mail.gmail.com> References: <9cf809a00711260905o3c0d14behed14925157c0be9b@mail.gmail.com> <474B8CC7.7010108@ar.media.kyoto-u.ac.jp> <9cf809a00711270059w117804faj7a03710d8e5fc208@mail.gmail.com> <474BDE2A.5000404@ar.media.kyoto-u.ac.jp> <9cf809a00711270139h3e5ee58bk62f8af9f6322f08c@mail.gmail.com> Message-ID: <474BE6B3.9030709@ar.media.kyoto-u.ac.jp> Alexander Dietz wrote: > > > Also, you could try to see which > flag make it fails, for example, does the following: > > /usr/bin/g77 -g -Wall -fno-second-underscore -fPIC -O2 -funroll-loops > -march=i686 -mmmx -msse -fomit-frame-pointer -malign-double -c > scipy/fftpack/dfftpack/zfftb1.f > > > That worked actually... Yes, so the problem is your gcc version. Maybe we could disable adding -msse2 altogether for gcc < 3.3, instead of enabling it for gcc > 3.2.2 (look at numpy/distutils/fcompiler/gnu.py, and look for -msse2; if Pearu or David are reading this, maybe they know a better solution ?). Note that FC5 is old. 
You should consider upgrading (FC 5 is not maintained anymore) if you can, cheers, David From david at ar.media.kyoto-u.ac.jp Tue Nov 27 04:46:43 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Tue, 27 Nov 2007 18:46:43 +0900 Subject: [SciPy-user] Problems installing scipy 0.6 on FC5 In-Reply-To: <474BE6B3.9030709@ar.media.kyoto-u.ac.jp> References: <9cf809a00711260905o3c0d14behed14925157c0be9b@mail.gmail.com> <474B8CC7.7010108@ar.media.kyoto-u.ac.jp> <9cf809a00711270059w117804faj7a03710d8e5fc208@mail.gmail.com> <474BDE2A.5000404@ar.media.kyoto-u.ac.jp> <9cf809a00711270139h3e5ee58bk62f8af9f6322f08c@mail.gmail.com> <474BE6B3.9030709@ar.media.kyoto-u.ac.jp> Message-ID: <474BE783.2020707@ar.media.kyoto-u.ac.jp> David Cournapeau wrote: > Alexander Dietz wrote: > >> >> >> Also, you could try to see which >> flag make it fails, for example, does the following: >> >> /usr/bin/g77 -g -Wall -fno-second-underscore -fPIC -O2 -funroll-loops >> -march=i686 -mmmx -msse -fomit-frame-pointer -malign-double -c >> scipy/fftpack/dfftpack/zfftb1.f >> >> >> That worked actually... >> > Yes, so the problem is your gcc version. Maybe we could disable adding > -msse2 altogether for gcc < 3.3, instead of enabling it for gcc > 3.2.2 > (look at numpy/distutils/fcompiler/gnu.py, and look for -msse2; if Pearu > or David are reading this, maybe they know a better solution ?). > > Note that FC5 is old. You should consider upgrading (FC 5 is not > maintained anymore) if you can, I forgot: another possibility is to use gfortran instead of g77, but you will have to recompile everything (you cannot really mix g77 and gfortran code together), including BLAS/LAPACK. For FC6, I remember that gfortran is the default ABI, so this is anyway the best choice, but I am not sure for FC5. David From cookedm at physics.mcmaster.ca Tue Nov 27 08:57:37 2007 From: cookedm at physics.mcmaster.ca (David M. 
Cooke) Date: Tue, 27 Nov 2007 08:57:37 -0500 Subject: [SciPy-user] install numpy/scipy with Intel Compilers In-Reply-To: <47471912.3050300@imag.fr> References: <47471912.3050300@imag.fr> Message-ID: On Nov 23, 2007, at 13:16 , Laurence Viry wrote: > Hi! > > I try to install numpy 1.0.4 and scipy 0.6.0 with INTEL C/C++ and > Fortran compilers and SCSL scientific library with python2.4.4 on > Itanium with linux OS. (Altix SGI with SUSE linux) > > I didn't success to configure the construct of build_ext for scipy > with > the fortran INTEL compiler (--fcompiler=intele) > Thank for your help Hi Laurence, I'm not familiar with Intel's compilers (even more, running them on Itaniums). I'm guessing that numpy.distutils isn't able to tell whether your compiler is intele or not. From numpy/distutils/fcompiler/ intel.py, the command run to find the version is $ ifort -FI -V -c dummy.f -o dummy.o where dummy.f is just a file with a dummy subroutine in it. Could you do this and give us the output? -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From rex at nosyntax.net Tue Nov 27 10:40:00 2007 From: rex at nosyntax.net (rex) Date: Tue, 27 Nov 2007 07:40:00 -0800 Subject: [SciPy-user] install numpy/scipy with Intel Compilers In-Reply-To: References: <47471912.3050300@imag.fr> Message-ID: <20071127154000.GE5720@nosyntax.net> David M. Cooke [2007-11-27 05:54]: >I'm guessing that numpy.distutils isn't able to tell >whether your compiler is intele or not. From numpy/distutils/fcompiler/ >intel.py, the command run to find the version is > >$ ifort -FI -V -c dummy.f -o dummy.o > >where dummy.f is just a file with a dummy subroutine in it. Could you >do this and give us the output? Hello David, No Itanium here, but this is the output of the 32-bit version. 
============================================================================== $ ifort -FI -V -c foo.f -o foo.o Intel(R) Fortran Compiler for applications running on IA-32, Version 10.1 Build 20070913 Package ID: l_fc_p_10.1.008 Copyright (C) 1985-2007 Intel Corporation. All rights reserved. FOR NON-COMMERCIAL USE ONLY Intel Fortran 10.0-2024 ================================================================================ There's a distutils problem using Intel's MKL in NumPy-1.04 and NumPy SVN 4510 (and below). In MKL 9.1 & 10.0, mkl_lapack32 & mkl_lapack64 no longer exist. They were replaced by /32/mkl_lapack and /64/mkl_lapack. As a result, numpy/distutils/system_info.py needs to be changed:

class lapack_mkl_info(mkl_info):
    def calc_info(self):
        mkl = get_info('mkl')
        if not mkl:
            return
        if sys.platform == 'win32':
            lapack_libs = self.get_libs('lapack_libs',['mkl_lapack'])
        else:
            #lapack_libs = self.get_libs('lapack_libs',['mkl_lapack32','mkl_lapack64'])
            lapack_libs = self.get_libs('lapack_libs',['mkl_lapack'])

If this change is not made, NumPy builds, but the MKL is not used. I reported this problem earlier on the NumPy list. -rex From Laurence.Viry at imag.fr Tue Nov 27 11:32:25 2007 From: Laurence.Viry at imag.fr (Laurence Viry) Date: Tue, 27 Nov 2007 17:32:25 +0100 Subject: [SciPy-user] install numpy/scipy with Intel Compilers In-Reply-To: References: <47471912.3050300@imag.fr> Message-ID: <474C4699.7020905@imag.fr> > > Hi Laurence, > > I'm not familiar with Intel's compilers (even more, running them on > Itaniums). I'm guessing that numpy.distutils isn't able to tell > whether your compiler is intele or not. From numpy/distutils/fcompiler/ > intel.py, the command run to find the version is > > $ ifort -FI -V -c dummy.f -o dummy.o > > where dummy.f is just a file with a dummy subroutine in it. Could you > do this and give us the output?
> > * the output is the following: --------------------------------- Intel(R) Fortran Itanium(R) Compiler for Itanium(R)-based applications Version 9.0 Build 20050430 Package ID: l_fc_p_9.0.021 Copyright (C) 1985-2005 Intel Corporation. All rights reserved. FOR NON-COMMERCIAL USE ONLY ---------------------------------- I noticed before that it was not the same as in intel.py: description = 'Intel Fortran Compiler for Itanium apps' I tried replacing it with: description = 'Intel(R) Fortran Itanium(R) Compiler for Itanium(R) apps' It's ok for numpy, but I still had the same error for scipy. * I don't use MKL but the SCSL library; it is the optimized scientific library on our computer. I give you the output of the config command: blas_info: FOUND: libraries = ['scs'] library_dirs = ['/usr/lib'] language = f77 FOUND: libraries = ['scs'] library_dirs = ['/usr/lib'] define_macros = [('NO_ATLAS_INFO', 1)] language = f77 lapack_info: FOUND: libraries = ['scs'] library_dirs = ['/usr/lib'] language = f77 FOUND: libraries = ['scs', 'scs'] library_dirs = ['/usr/lib'] define_macros = [('NO_ATLAS_INFO', 1)] language = f77 Thanks for your help -- Laurence Viry CRIP UJF - Projet MIRAGE ( http://mirage.imag.fr ) Laboratoire de Modélisation et Calcul - IMAG tel: 04 76 63 56 34 fax: 04 76 63 12 63 e-mail: Laurence.Viry at imag.fr From odonnems at yahoo.com Wed Nov 28 14:47:00 2007 From: odonnems at yahoo.com (Michael ODonnell) Date: Wed, 28 Nov 2007 11:47:00 -0800 (PST) Subject: [SciPy-user] weave inline compile error Message-ID: <943333.38163.qm@web58012.mail.re3.yahoo.com> I have been trying to use scipy.weave.inline on Windows XP via cygwin. Ultimately, I need to use weave.inline for compiling c++ code nested in a Python program. However, I have been extremely unsuccessful at getting this to work. I recently posted some errors on this site, under the subject "weave and compiler install help".
Basically, I have not been able to find out anything else, but let me summarize: First, I am using these versions, but note I have tried to use the most recent stable release of python enthought: I created a distutils.cfg file specifying [build_ext] compiler=mingw32 python-2.4.1.msi scipy-0.6.0.win32-py2.4.exe Numeric-24.2.win32-py2.4.exe numpy-1.04.win32-py2.4.exe wxPython2.8-win32-unicode-2.8.6.1-py24.exe and wxPython2.8-win32-docs-demos-2.8.6.1.exe As well as a couple of other packages cygwin: setup.exe (current) mingw-install-20060210.tar (MSYS) and MinGW-5.1.3.exe (I read online that these do not work with 2.4.1 so I am focusing on cygwin) An example of weave that I am trying to get to work is: a = 1 weave.inline('printf("%d\\n",a);',['a'], verbose=2, type_converters=converters.blitz, compiler='gcc') #I have tried numerous versions of this The error seems to be in the code written in numpy\distutils\exec_command.py: s,o = exec_command(cmd) As best I can tell, the code is trying to execute the above command, which is failing to write output to s,o. These variables are then written to files, which cannot be read. The program jumps around between exec_command and _exec_command. I don't know Python well enough to know what is causing the error. It appears to me that the code is being compiled, but this one line is not working. I also tried running the script written by exec_command.py directly (I commented out the os.remove() in exec_command.py), but this did not work. It did help me identify that the problem is with the s and o variables. Does anyone have any thoughts on what I should try to do? I appreciate any suggestions and ideas one can provide. Thanks!
Michael Here is an example of my errors: running build_ext running build_src building extension "sc_552cccf5dbf4f6eadd273cdcbd5860525" sources customize Mingw32CCompiler customize Mingw32CCompiler using build_ext customize Mingw32CCompiler customize Mingw32CCompiler using build_ext building 'sc_552cccf5dbf4f6eadd273cdcbd5860525' extension compiling C++ sources C compiler: g++ -mno-cygwin -O2 -Wall compile options: '-IC:\Python24\lib\site-packages\scipy\weave -IC:\Python24\lib\site-packages\scipy\weave\scxx -IC:\Python24\lib\site-packages\numpy\core\include -IC:\Python24\include -IC:\Python24\PC -c' g++ -mno-cygwin -O2 -Wall -IC:\Python24\lib\site-packages\scipy\weave -IC:\Python24\lib\site-packages\scipy\weave\scxx -IC:\Python24\lib\site-packages\numpy\core\include -IC:\Python24\include -IC:\Python24\PC -c c:\temp\Michael\python24_compiled\sc_552cccf5dbf4f6eadd273cdcbd5860525.cpp -o c:\temp\Michael\python24_intermediate\compiler_c8350d870e6c54e8f29dd7094c2bfb45\Release\temp\michael\python24_compiled\sc_552cccf5dbf4f6eadd273cdcbd5860525.o Traceback (most recent call last): File "C:\Python24\Lib\site-packages\pythonwin\pywin\framework\scriptutils.py", line 310, in RunScript exec codeObject in __main__.__dict__ File "C:\Documents and Settings\Michael\Application Data\ESRI\ArcToolbox\scripts\test_weave.py", line 251, in ? 
main() File "C:\Documents and Settings\Michael\Application Data\ESRI\ArcToolbox\scripts\test_weave.py", line 205, in main weave.inline('printf("%d\\n",a);',['a'], verbose=2, type_converters=converters.blitz) #, compiler = 'gcc', verbpse=2, type_converters=converters.blitz, auto_downcast=0) #'msvc' or 'gcc' or 'mingw32' File "C:\Python24\Lib\site-packages\scipy\weave\inline_tools.py", line 338, in inline auto_downcast = auto_downcast, File "C:\Python24\Lib\site-packages\scipy\weave\inline_tools.py", line 447, in compile_function verbose=verbose, **kw) File "C:\Python24\Lib\site-packages\scipy\weave\ext_tools.py", line 365, in compile verbose = verbose, **kw) File "C:\Python24\Lib\site-packages\scipy\weave\build_tools.py", line 269, in build_extension setup(name = module_name, ext_modules = [ext],verbose=verb) File "C:\Python24\Lib\site-packages\numpy\distutils\core.py", line 176, in setup return old_setup(**new_attr) File "C:\Python24\lib\distutils\core.py", line 159, in setup raise SystemExit, error CompileError: error: c:\temp\tmpgnzrbr\buunmb.txt: No such file or directory ____________________________________________________________________________________ Be a better pen pal. Text or chat with friends inside Yahoo! Mail. See how. http://overview.mail.yahoo.com/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From williamhpurcell at gmail.com Wed Nov 28 19:36:23 2007 From: williamhpurcell at gmail.com (William Purcell) Date: Wed, 28 Nov 2007 18:36:23 -0600 Subject: [SciPy-user] Scipy in XP on VMWare Workstation Message-ID: <474E0987.7050000@gmail.com> I am trying to run scipy in XP on VMWare Workstation. The host OS is Ubuntu. 
When I attempt to start scipy I get the error: ____________________________________________________________________________ C:\Documents and Settings\Administrator>C:\Python25\python.exe C:\Python25\scrip ts\ipython -pylab -p scipy Traceback (most recent call last): File "C:\Python25\scripts\ipython", line 27, in IPython.Shell.start().mainloop() File "C:\Python25\Lib\site-packages\IPython\Shell.py", line 1111, in start return shell(user_ns = user_ns) File "C:\Python25\Lib\site-packages\IPython\Shell.py", line 1008, in __init__ shell_class=MatplotlibShell) File "C:\Python25\Lib\site-packages\IPython\Shell.py", line 74, in __init__ debug=debug,shell_class=shell_class) File "C:\Python25\Lib\site-packages\IPython\ipmaker.py", line 95, in make_IPyt hon embedded=embedded,**kw) File "C:\Python25\Lib\site-packages\IPython\Shell.py", line 562, in __init__ user_ns,b2 = self._matplotlib_config(name,user_ns) File "C:\Python25\Lib\site-packages\IPython\Shell.py", line 463, in _matplotli b_config from matplotlib import backends File "C:\Python25\Lib\site-packages\matplotlib\backends\__init__.py", line 55, in new_figure_manager, draw_if_interactive, show = pylab_setup() File "C:\Python25\Lib\site-packages\matplotlib\backends\__init__.py", line 24, in pylab_setup globals(),locals(),[backend_name]) File "C:\Python25\Lib\site-packages\matplotlib\backends\backend_tkagg.py", lin e 8, in import tkagg # Paint image to Tk photo blitter extension File "C:\Python25\Lib\site-packages\matplotlib\backends\tkagg.py", line 1, in import _tkagg ImportError: DLL load failed: The specified module could not be found. _____________________________________________________________________________________ Has anyone been successful in doing this or have any suggested fixes. 
Thanks, Bill From hbabcock at mac.com Wed Nov 28 21:45:26 2007 From: hbabcock at mac.com (Hazen Babcock) Date: Wed, 28 Nov 2007 21:45:26 -0500 Subject: [SciPy-user] spe file reader (Roper Scientific / Winview) Message-ID: <3E7F13DD-576C-4390-B130-90AB4310DC37@mac.com> Hello, Does anyone know of a module for reading (and possibly writing) Roper Scientific / Winview .spe formatted files? thanks, -Hazen From vincefn at users.sourceforge.net Thu Nov 29 03:37:51 2007 From: vincefn at users.sourceforge.net (Favre-Nicolin Vincent) Date: Thu, 29 Nov 2007 09:37:51 +0100 Subject: [SciPy-user] spe file reader (Roper Scientific / Winview) In-Reply-To: <3E7F13DD-576C-4390-B130-90AB4310DC37@mac.com> References: <3E7F13DD-576C-4390-B130-90AB4310DC37@mac.com> Message-ID: <200711290937.51949.vincefn@users.sourceforge.net> Hi, > Does anyone know of a module for reading (and possibly writing) Roper > Scientific / Winview .spe formatted files? The following file should provide reading - I'm using it to read CCD images from Princeton cameras. No writing though. I'm not sure how many variants of the format this can read, so be careful! And it only reads the image data, not the associated info (date etc...). Vincent -- Vincent Favre-Nicolin Université Joseph Fourier http://v.favrenicolin.free.fr ObjCryst & Fox : http://objcryst.sourceforge.net -------------- next part -------------- A non-text attachment was scrubbed...
Name: winview.py Type: application/x-python Size: 4822 bytes Desc: not available URL: From discerptor at gmail.com Thu Nov 29 13:17:31 2007 From: discerptor at gmail.com (Joshua Lippai) Date: Thu, 29 Nov 2007 10:17:31 -0800 Subject: [SciPy-user] Error in scipy.test(10) even after compiling successfully- Mac OS X 10.4.11 Apple GCC 4.0.1 Message-ID: <9911419a0711291017r22d48580q2c1652855d253c86@mail.gmail.com> Hello all, After building scipy, I ran the scipy tests and got this error in check_integer: ====================================================================== ERROR: check_integer (scipy.io.tests.test_array_import.TestReadArray) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/io/tests/test_array_import.py", line 55, in check_integer from scipy import stats File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/stats/__init__.py", line 7, in from stats import * File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/stats/stats.py", line 191, in import scipy.special as special File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/special/__init__.py", line 8, in from basic import * File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/special/basic.py", line 8, in from _cephes import * ImportError: dlopen(/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/special/_cephes.so, 2): Symbol not found: __gfortran_pow_r8_i4 Referenced from: /Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/special/_cephes.so Expected in: dynamic lookup It would appear it's not finding a symbol for gfrotran in one of the files, and this in turn makes it impossible to import _cephes. How would I go about remedying this? 
From david at ar.media.kyoto-u.ac.jp Thu Nov 29 20:58:19 2007 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 30 Nov 2007 10:58:19 +0900 Subject: [SciPy-user] Error in scipy.test(10) even after compiling successfully- Mac OS X 10.4.11 Apple GCC 4.0.1 In-Reply-To: <9911419a0711291017r22d48580q2c1652855d253c86@mail.gmail.com> References: <9911419a0711291017r22d48580q2c1652855d253c86@mail.gmail.com> Message-ID: <474F6E3B.1020209@ar.media.kyoto-u.ac.jp> Joshua Lippai wrote: > Hello all, > > After building scipy, I ran the scipy tests and got this error in check_integer: > > ====================================================================== > ERROR: check_integer (scipy.io.tests.test_array_import.TestReadArray) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/io/tests/test_array_import.py", > line 55, in check_integer > from scipy import stats > File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/stats/__init__.py", > line 7, in > from stats import * > File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/stats/stats.py", > line 191, in > import scipy.special as special > File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/special/__init__.py", > line 8, in > from basic import * > File "/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/special/basic.py", > line 8, in > from _cephes import * > ImportError: dlopen(/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/special/_cephes.so, > 2): Symbol not found: __gfortran_pow_r8_i4 > Referenced from: > /Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/scipy/special/_cephes.so > Expected in: dynamic lookup > > It would appear it's not finding a 
symbol for gfortran in one of the > files, and this in turn makes it impossible to import _cephes. How > would I go about remedying this? > This symbol is in the gfortran runtime. We need the complete build log to be sure about the exact problem, but I expect some mismatch between fortran compilers (do you have several fortran compilers on your machine?). cheers, David From lbolla at gmail.com Fri Nov 30 05:12:39 2007 From: lbolla at gmail.com (lorenzo bolla) Date: Fri, 30 Nov 2007 11:12:39 +0100 Subject: [SciPy-user] problems installing scipy on windows with cygwin Message-ID: <80c99e790711300212q4974fb80j5aa5d6401dcb2430@mail.gmail.com> I'm having problems installing scipy on my windows machine (Quad 2.3GHz Intel Pentium processors) using cygwin. Here is the error message: -------------------------
------------------------- It seems like a LAPACK problem. I tried with both the cygwin-provided BLAS/LAPACK and with the complete BLAS/LAPACK from netlib. What do you suggest? Thank you! L. -------------- next part -------------- An HTML attachment was scrubbed... URL: From lbolla at gmail.com Fri Nov 30 08:33:59 2007 From: lbolla at gmail.com (lorenzo bolla) Date: Fri, 30 Nov 2007 14:33:59 +0100 Subject: [SciPy-user] Enthought Python: installing sandbox packages Message-ID: <80c99e790711300533s36c07903n20fc0d3e4c67bc98@mail.gmail.com> Hi all, is there a simple way to install sandbox packages (e.g. arpack) in an Enthought Python using tools like easy_install.exe or enstaller.exe? Thanks in advance, L. From Giovanni.Samaey at cs.kuleuven.be Fri Nov 30 12:26:50 2007 From: Giovanni.Samaey at cs.kuleuven.be (Giovanni Samaey) Date: Fri, 30 Nov 2007 18:26:50 +0100 Subject: [SciPy-user] lil_matrix Message-ID: Hello all, I have a basic question concerning lil_matrix slice assignments. The following code does not behave as intended:

import scipy.sparse as S
from numpy import ones

A = S.lil_matrix((5,5))
A.setdiag(ones(5))
B = S.lil_matrix((6,6))
B[:5,:5] = A
B.getnnz()

This returns 0 instead of the expected 5. Am I using the code incorrectly? Giovanni From jh at physics.ucf.edu Fri Nov 30 15:00:29 2007 From: jh at physics.ucf.edu (Joe Harrington) Date: Fri, 30 Nov 2007 15:00:29 -0500 Subject: [SciPy-user] surface plotting Message-ID: <1196452829.6047.387.camel@glup.physics.ucf.edu> Well, I was hoping for a lighter solution, but I gave it a one-hour try. It didn't help that, when I tried, the link to the user docs was broken (it's fixed now). I got so far as installing mayavi2 and trying to figure it out without the doc, but ran out of time and wasn't able to figure out surface plotting of my existing numpy array.
This was for use in a class, and I decided that if I couldn't figure out how to do it in an hour, I probably couldn't expect the students to do it at all.

It would be good to have a cookbook recipe about how to do surface plotting of a 2D numpy array in mayavi2.

When will mayavi2 be available as a deb file? This looks like software that everyone should be able to get in the easiest fashion possible.

Thanks,

--jh--

From jdh2358 at gmail.com Fri Nov 30 15:07:54 2007
From: jdh2358 at gmail.com (John Hunter)
Date: Fri, 30 Nov 2007 14:07:54 -0600
Subject: [SciPy-user] surface plotting
In-Reply-To: <1196452829.6047.387.camel@glup.physics.ucf.edu>
References: <1196452829.6047.387.camel@glup.physics.ucf.edu>
Message-ID: <88e473830711301207h62f83491x78c5a217179bce41@mail.gmail.com>

On Nov 30, 2007 2:00 PM, Joe Harrington wrote:
> It would be good to have a cookbook recipe about how to do surface
> plotting of a 2D numpy array in mayavi2.

For the purposes of your class, I find that a simple imshow or pcolor of 2D data with a color map is usually good enough to visualize a 2D array. Surf doesn't really add much unless you are interactively panning and zooming, when the movement generates extra 3D visual cues.

JDH

From fredmfp at gmail.com Fri Nov 30 15:12:10 2007
From: fredmfp at gmail.com (fred)
Date: Fri, 30 Nov 2007 21:12:10 +0100
Subject: [SciPy-user] surface plotting
In-Reply-To: <1196452829.6047.387.camel@glup.physics.ucf.edu>
References: <1196452829.6047.387.camel@glup.physics.ucf.edu>
Message-ID: <47506E9A.4090705@gmail.com>

Joe Harrington wrote:
> It would be good to have a cookbook recipe about how to do surface
> plotting of a 2D numpy array in mayavi2.

http://www.scipy.org/Cookbook/MayaVi/Surf
http://www.scipy.org/Cookbook/MayaVi/mlab

> When will mayavi2 be available as a deb file?

It's coming... :-)

HTH.
Cheers,

--
http://scipy.org/FredericPetit

From robert.kern at gmail.com Fri Nov 30 15:19:58 2007
From: robert.kern at gmail.com (Robert Kern)
Date: Fri, 30 Nov 2007 14:19:58 -0600
Subject: [SciPy-user] enthought python: installing sandbox packages
In-Reply-To: <80c99e790711300533s36c07903n20fc0d3e4c67bc98@mail.gmail.com>
References: <80c99e790711300533s36c07903n20fc0d3e4c67bc98@mail.gmail.com>
Message-ID: <4750706E.70308@gmail.com>

lorenzo bolla wrote:
> Hi all,
> is there a simple way to install sandbox packages (e.g. arpack) in an
> Enthought Python using tools like easy_install.exe or enstaller.exe?

Unfortunately, there isn't. The way the sandbox is set up, the sandbox packages are built as part of the monolithic scipy package.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth."
  -- Umberto Eco

From jh at physics.ucf.edu Fri Nov 30 15:28:25 2007
From: jh at physics.ucf.edu (Joe Harrington)
Date: Fri, 30 Nov 2007 15:28:25 -0500
Subject: [SciPy-user] surface plotting
Message-ID: <1196454505.6047.418.camel@glup.physics.ucf.edu>

> For the purposes of your class, I find that a simple imshow or pcolor
> of 2D data with a color map is usually good enough to visualize a 2D
> array. Surf doesn't really add much unless you are interactively
> panning and zooming, when the movement generates extra 3D visual cues.

Mostly we use a combination of imshow and ds9 in astronomy. There is a *lot* of zooming and panning, and real-time, interactive colormap and scaling manipulations, and zooming the cursor around and watching what values go under the cursor. If there's a way to do any of those in matplotlib, it would be good to know how, but I didn't see it in the manual.
The numdisplay interface to ds9 is pretty good, but not perfect, as it interpolates the image before sending it to ds9, which is a throwback to an interface defined in the days of yore.

But the particular application for which surface really stands out is looking at the relative brightnesses of stars. Yeah, you can get an idea of it with a colormap, but green is not three times red, and even shades of gray are hard to divide by eye. Differing heights are easy to judge by eye.

Within that category, the one thing I simply can't do well with an image display is look at bad pixels. To find these, and to find out whether you've killed them all, you really need an edge-on look at a surface plot. With images that are several k by several k, a single pixel just gets swallowed up in an image display. Its spike does not get swallowed up. Also, a good 2D display will examine the histogram and scale to, say, 98% of the histogram so that spurious spikes in the data with values of 10**12 don't make it impossible to see the stars. For this particular application, surface really is the only good way I've found.

As I said in an earlier post, the matplotlib surface looks fine, except that it is too slow. I'm not sure if that's because the other methods I've used are in compiled code or if there is some better algorithm. But I've been using surface plots for ~20 years, and the computers back then were slower than current ones by more than the difference between compiled and python code, I think. The IRAF package is open-source and has such an algorithm, though it's in a bowdlerized version of FORTRAN.

--jh--

From rmay at ou.edu Fri Nov 30 16:56:26 2007
From: rmay at ou.edu (Ryan May)
Date: Fri, 30 Nov 2007 15:56:26 -0600
Subject: [SciPy-user] 1D Correlation on 3D array
Message-ID: <4750870A.5080101@ou.edu>

Hi,

I was curious if anyone has a fast non-looping way to calculate 1D correlations on an N-dim array (my case is specifically 3).
What I have right now is rather slow:

import numpy as N
from scipy.signal import correlate

ts_data = N.ones((360, 1201, 17), dtype=N.complex64)

for index in N.ndindex(ts_data.shape[:-1]):
    x = ts_data[index]
    R = correlate(x, x.conj())

I've looked at just passing in the whole array, but that seems to want to do full N-dim correlation.

Thanks,

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
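[Editorial sketch: one standard way to eliminate the Python loop here, assuming the autocorrelation of each length-17 series is what is wanted, is the FFT correlation theorem applied along the last axis only, batched over all leading axes. The helper name is mine, not from the thread.]

```python
import numpy as np

def autocorr_last_axis(ts):
    """Autocorrelation along the last axis of an N-d array.

    Vectorized over all leading axes via the FFT correlation theorem.
    Returns lags 0..n-1; for an autocorrelation the negative lags are
    just the conjugates of the positive ones, so nothing is lost.
    """
    n = ts.shape[-1]
    m = 2 * n - 1  # zero-pad so circular correlation equals linear correlation
    F = np.fft.fft(ts, n=m, axis=-1)
    r = np.fft.ifft(F * F.conj(), axis=-1)
    # r[..., k] = sum_j ts[..., j + k] * conj(ts[..., j])
    return r[..., :n]
```

For the (360, 1201, 17) case this is two batched length-33 FFTs instead of roughly 432,000 Python-level calls to correlate, so it should be faster by orders of magnitude.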