From nwagner at mecha.uni-stuttgart.de Wed Oct 1 04:15:23 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 01 Oct 2003 10:15:23 +0200 Subject: [SciPy-user] Missing matrix function sqrtm Message-ID: <3F7A8D1B.3FEC9214@mecha.uni-stuttgart.de> Dear experts, I would appreciate it if the square root of a matrix could be provided as a separate function in scipy. Any suggestion ? Nils From nwagner at mecha.uni-stuttgart.de Wed Oct 1 04:27:44 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 01 Oct 2003 10:27:44 +0200 Subject: [SciPy-user] config.h:1: error: parse error before '<<' token error Message-ID: <3F7A9000.3650CF5F@mecha.uni-stuttgart.de> Hi all, python setup.py build from latest cvs failed In file included from /export/home/wagner/cvs/scipy/Lib/xplt/src/play/unix/dir.c:16: /export/home/wagner/cvs/scipy/Lib/xplt/src/play/unix/config.h:1: error: parse error before '<<' token error: command 'gcc' failed with exit status 1 Nils From pearu at scipy.org Wed Oct 1 03:39:01 2003 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 1 Oct 2003 02:39:01 -0500 (CDT) Subject: [SciPy-user] config.h:1: error: parse error before '<<' token error In-Reply-To: <3F7A9000.3650CF5F@mecha.uni-stuttgart.de> Message-ID: On Wed, 1 Oct 2003, Nils Wagner wrote: > python setup.py build from latest cvs failed > > In file included from > /export/home/wagner/cvs/scipy/Lib/xplt/src/play/unix/dir.c:16: > /export/home/wagner/cvs/scipy/Lib/xplt/src/play/unix/config.h:1: error: > parse error before '<<' token > error: command 'gcc' failed with exit status 1 This probably means that there were conflicts when merging the changes in CVS and your local config.h. This is a known issue that we should eventually fix (generate files only under the build/ tree and never commit generated files to CVS). As a workaround, remove scipy/Lib/xplt/src/play/unix/config.h and do `cvs update` again. Pearu From nwagner at mecha.uni-stuttgart.de Wed Oct 1 05:06:03 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 01 Oct 2003 11:06:03 +0200 Subject: [SciPy-user] config.h:1: error: parse error before '<<' token error References: Message-ID: <3F7A98FA.F3EE21BF@mecha.uni-stuttgart.de> Pearu Peterson wrote: > > On Wed, 1 Oct 2003, Nils Wagner wrote: > > > python setup.py build from latest cvs failed > > > > In file included from > > /export/home/wagner/cvs/scipy/Lib/xplt/src/play/unix/dir.c:16: > > /export/home/wagner/cvs/scipy/Lib/xplt/src/play/unix/config.h:1: error: > > parse error before '<<' token > > error: command 'gcc' failed with exit status 1 > > This probably means that there were conflicts when merging the changes in > CVS and your local config.h. This is a known issue that we should > eventually fix (generate files only under the build/ tree and never commit > generated files to CVS). > > As a workaround, remove scipy/Lib/xplt/src/play/unix/config.h and > do `cvs update` again. > > Pearu > Just another question. What is the meaning of ? scipy/_ft.cpp ? scipy/_kiva.cpp ? scipy/Lib/linalg/build ? scipy/Lib/linalg/cblas.pyf ? scipy/Lib/linalg/clapack.pyf ? scipy/Lib/linalg/fblas.pyf ? scipy/Lib/linalg/flapack.pyf ? scipy/Lib/xplt/pygist/Make.cfg ? scipy/scipy_core/scipy_base/mconf_lite.h ? scipy/scipy_core/scipy_base/mconf_lite.h Is it necessary to remove these files ? 
Nils > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From pearu at scipy.org Wed Oct 1 04:16:12 2003 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 1 Oct 2003 03:16:12 -0500 (CDT) Subject: [SciPy-user] config.h:1: error: parse error before '<<' token error In-Reply-To: <3F7A98FA.F3EE21BF@mecha.uni-stuttgart.de> Message-ID: On Wed, 1 Oct 2003, Nils Wagner wrote: > Just another question. What is the meaning of > > ? scipy/_ft.cpp > ? scipy/_kiva.cpp > ? scipy/Lib/linalg/build > ? scipy/Lib/linalg/cblas.pyf > ? scipy/Lib/linalg/clapack.pyf > ? scipy/Lib/linalg/fblas.pyf > ? scipy/Lib/linalg/flapack.pyf > ? scipy/Lib/xplt/pygist/Make.cfg > ? scipy/scipy_core/scipy_base/mconf_lite.h > ? scipy/scipy_core/scipy_base/mconf_lite.h > > Is it necessary to remove these files ? No. But they all illustrate the bad habit of generating files under the CVS tree. While *.pyf and *.cpp files are platform/compiler independent, the other files are not. These other files must be removed whenever you switch compilers or platforms while using the same CVS tree. Actually, the *.pyf files also depend on the ATLAS version and may cause problems when switching ATLAS libraries. `?` indicates that those files are unknown to the CVS repository and this is a good thing. Pearu From nwagner at mecha.uni-stuttgart.de Wed Oct 1 05:40:43 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 01 Oct 2003 11:40:43 +0200 Subject: [SciPy-user] Different failures in scipy.test() Message-ID: <3F7AA11B.89E6EDAF@mecha.uni-stuttgart.de> Dear experts, I have observed different failures in scipy.test() on different installations (python2.2, python2.3) with different versions of ATLAS. What is the reason for that ? Which versions of ATLAS and python are recommended ? Nils From pearu at scipy.org Wed Oct 1 04:34:42 2003 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 1 Oct 2003 03:34:42 -0500 (CDT) Subject: [SciPy-user] Different failures in scipy.test() In-Reply-To: <3F7AA11B.89E6EDAF@mecha.uni-stuttgart.de> Message-ID: On Wed, 1 Oct 2003, Nils Wagner wrote: > Dear experts, > > I have observed different failures in scipy.test() on different > installations (python2.2, python2.3) with different versions of ATLAS. > What is the reason for that ? Could you include also the nature of these failures? It is rather difficult to comment with the given (lack of) information. > Which versions of ATLAS and python are recommended ? See INSTALL.txt. In practice, less trouble are with versions that scipy developers use;-) I usually test scipy with the boundary versions of software as given in INSTALL.txt. Pearu From nwagner at mecha.uni-stuttgart.de Wed Oct 1 06:43:18 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 01 Oct 2003 12:43:18 +0200 Subject: [SciPy-user] Improvements for the iterative computation of signm Message-ID: <3F7AAFC6.EF6B6179@mecha.uni-stuttgart.de> Dear experts, The current iterative algorithm for signm can be improved. The theory is given in a recent paper by Bai et al., The spectral decomposition of nonsymmetric matrices on distributed memory parallel computers, SIAM J. Sci. Comput. Vol. 18 (5), pp. 1446-1461. Any comments ? 
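[Note for readers who have not met signm before: the matrix sign function is usually computed by a Newton-type iteration, and the Bai et al. paper cited above concerns scaling and stopping criteria for exactly this kind of scheme. The sketch below shows only the basic, unscaled Newton iteration as an illustration of the idea; it is not scipy's implementation, and the name sign_newton is invented for this example.]

from scipy import linalg
import Numeric

def sign_newton(A, tol=1e-12, maxiter=100):
    # Basic Newton iteration S <- (S + inv(S))/2 for the matrix sign function.
    # Assumes A is square with no eigenvalues on the imaginary axis;
    # otherwise sign(A) is not defined and the iteration need not converge.
    S = Numeric.array(A)
    for i in range(maxiter):
        S_next = 0.5 * (S + linalg.inv(S))
        # stop when the update is small relative to the current iterate
        if linalg.norm(S_next - S) <= tol * linalg.norm(S_next):
            return S_next
        S = S_next
    return S

The same family of iterations is also one route to the matrix square root requested at the top of this digest: for suitable A, the sign of the block matrix [[0, A], [I, 0]] contains a square root of A in an off-diagonal block.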
Nils From nwagner at mecha.uni-stuttgart.de Wed Oct 1 11:47:35 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 01 Oct 2003 17:47:35 +0200 Subject: [SciPy-user] Different failures in scipy.test() References: Message-ID: <3F7AF717.CFA9C672@mecha.uni-stuttgart.de> Pearu Peterson schrieb: > > On Wed, 1 Oct 2003, Nils Wagner wrote: > > > Dear experts, > > > > I have observed different failures in scipy.test() on different > > installations (python2.2, python2.3) with different versions of ATLAS. > > What is the reason for that ? > > Could you include also the nature of these failures? Yes. >> scipy.__version__ '0.2.0__220.4275' >>> Numeric.__version__ '23.0' Python 2.2.2 (#1, Mar 17 2003, 15:17:58) [GCC 3.3 20030226 (prerelease) (SuSE Linux)] on linux2 Type "help", "copyright", "credits" or "license" for more information. BTW, Is it possible to dtermine the version of ATLAS in a similar way ? ====================================================================== FAIL: check_pro_ang1_cv (test_basic.test_cephes) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.2/site-packages/scipy/special/tests/test_basic.py", line 357, in check_pro_ang1_cv assert_equal(cephes.pro_ang1_cv(1,1,1,1,0),(1.0,0.0)) File "/usr/lib/python2.2/site-packages/scipy_test/testing.py", line 332, in assert_equal assert desired == actual, msg AssertionError: Items are not equal: DESIRED: (1.0, 0.0) ACTUAL: (0.99999999999999989, 0.0) ---------------------------------------------------------------------- Ran 967 tests in 3.857s FAILED (failures=1) ################################################################################################################# The same cvs version of scipy but >>> Numeric.__version__ '23.1' ====================================================================== FAIL: check_pro_ang1_cv (test_basic.test_cephes) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.2/site-packages/scipy/special/tests/test_basic.py", line 357, in check_pro_ang1_cv assert_equal(cephes.pro_ang1_cv(1,1,1,1,0),(1.0,0.0)) File "/usr/lib/python2.2/site-packages/scipy_test/testing.py", line 332, in assert_equal assert desired == actual, msg AssertionError: Items are not equal: DESIRED: (1.0, 0.0) ACTUAL: (0.99999999999999989, 0.0) ====================================================================== FAIL: check_sh_chebyt (test_basic.test_sh_chebyt) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.2/site-packages/scipy/special/tests/test_basic.py", line 1818, in check_sh_chebyt assert_array_almost_equal(Ts5.c,tse5.c,12) File "/usr/lib/python2.2/site-packages/scipy_test/testing.py", line 405, in assert_array_almost_equal assert alltrue(ravel(reduced)),\ AssertionError: Arrays are not almost equal: ====================================================================== FAIL: check_random_complex (test_decomp.test_cholesky) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.2/site-packages/scipy/linalg/tests/test_decomp.py", line 280, in check_random_complex assert_array_almost_equal(cholesky(a),c) File "/usr/lib/python2.2/site-packages/scipy_test/testing.py", line 405, in assert_array_almost_equal assert alltrue(ravel(reduced)),\ AssertionError: Arrays are not almost equal: 
====================================================================== FAIL: check_simple_complex (test_decomp.test_cholesky) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.2/site-packages/scipy/linalg/tests/test_decomp.py", line 254, in check_simple_complex assert_array_almost_equal(cholesky(a),c) File "/usr/lib/python2.2/site-packages/scipy_test/testing.py", line 405, in assert_array_almost_equal assert alltrue(ravel(reduced)),\ AssertionError: Arrays are not almost equal: ---------------------------------------------------------------------- Ran 967 tests in 3.177s FAILED (failures=4) Nils > It is rather difficult to comment with the given (lack of) information. > > > Which versions of ATLAS and python are recommended ? > > See INSTALL.txt. > > In practice, less trouble are with versions that scipy developers > use;-) > > I usually test scipy with the boundary versions of software as given in > INSTALL.txt. > > Pearu > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From pearu at scipy.org Wed Oct 1 15:30:41 2003 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 1 Oct 2003 14:30:41 -0500 (CDT) Subject: [SciPy-user] Different failures in scipy.test() In-Reply-To: <3F7AF717.CFA9C672@mecha.uni-stuttgart.de> Message-ID: On Wed, 1 Oct 2003, Nils Wagner wrote: > >> scipy.__version__ > '0.2.0__220.4275' > >>> Numeric.__version__ > '23.0' > Python 2.2.2 (#1, Mar 17 2003, 15:17:58) > [GCC 3.3 20030226 (prerelease) (SuSE Linux)] on linux2 > Type "help", "copyright", "credits" or "license" for more information. > > > BTW, Is it possible to dtermine the version of ATLAS in a similar way ? Yes, see INSTALL.txt: section TROUBLESHOOTING, item (6) > ====================================================================== > FAIL: check_pro_ang1_cv (test_basic.test_cephes) > ---------------------------------------------------------------------- > Items are not equal: > DESIRED: (1.0, 0.0) > ACTUAL: (0.99999999999999989, 0.0) This could be related to Python 2.2.2, not sure. On Python 2.3.1 this test succeeds. > ################################################################################################################# > > The same cvs version of scipy > but > >>> Numeric.__version__ > '23.1' > ====================================================================== > FAIL: check_sh_chebyt (test_basic.test_sh_chebyt) > ---------------------------------------------------------------------- > ====================================================================== > FAIL: check_random_complex (test_decomp.test_cholesky) > ---------------------------------------------------------------------- > ====================================================================== > FAIL: check_simple_complex (test_decomp.test_cholesky) > ---------------------------------------------------------------------- I am not sure why these tests fail when switching Numeric 23.0->23.1. I am using Python 2.3.1, Numeric 23.1, Atlas 3.2.1 and Scipy 0.2.0__220.4275 and all tests pass OK. I'll later try earlier versions of Python... Anyway, I don't see how the above failures could be related to using different ATLAS versions as you originally suggested. Pearu From oliphant at ee.byu.edu Wed Oct 1 15:39:12 2003 From: oliphant at ee.byu.edu (Travis E. 
Oliphant) Date: Wed, 01 Oct 2003 13:39:12 -0600 Subject: [SciPy-user] Different failures in scipy.test() In-Reply-To: References: Message-ID: <3F7B2D60.7080809@ee.byu.edu> Pearu Peterson wrote: > > >>====================================================================== >>FAIL: check_sh_chebyt (test_basic.test_sh_chebyt) >>---------------------------------------------------------------------- >>====================================================================== >>FAIL: check_random_complex (test_decomp.test_cholesky) >>---------------------------------------------------------------------- >>====================================================================== >>FAIL: check_simple_complex (test_decomp.test_cholesky) >>---------------------------------------------------------------------- > > > I am not sure why these tests fail when switching Numeric 23.0->23.1. > I am using Python 2.3.1, Numeric 23.1, Atlas 3.2.1 and Scipy > 0.2.0__220.4275 and all tests pass OK. I'll later try earlier versions of > Python... > > Anyway, I don't see how the above failures could be related to > using different ATLAS versions as you originally suggested. > I could see the sh_chebyt failure as being due to ATLAS because that routine uses eigenvalue decomposition and the failure is clearly a "significant digits" failure --- it can be corrected by changing the required accuracy of the test. -Travis From nwagner at mecha.uni-stuttgart.de Thu Oct 2 04:09:26 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 02 Oct 2003 10:09:26 +0200 Subject: [SciPy-user] ATLAS version Message-ID: <3F7BDD36.324CD2D0@mecha.uni-stuttgart.de> Hi all, just a short note concerning the ATLAS version. The hint in INSTALL.txt is going out of date. It should be cd scipy/Lib/linalg instead of cd scipy/linalg Nils From nwagner at mecha.uni-stuttgart.de Thu Oct 2 04:54:34 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 02 Oct 2003 10:54:34 +0200 Subject: [SciPy-user] failure during build from latest cvs Message-ID: <3F7BE7CA.3BF17070@mecha.uni-stuttgart.de> ### Little Endian detected #### building agg swig: swig -c++ -python -shadow -I/export/home/wagner/cvs/scipy/Lib_chaco/agg2/include agg.i sh: line 1: swig: command not found Traceback (most recent call last): File "setup.py", line 111, in ? 
setup_package() File "setup.py", line 90, in setup_package config_list += map(get_separate_package_config,separate_packages) File "setup.py", line 74, in get_separate_package_config return get_package_config(name,'') File "setup.py", line 68, in get_package_config config = mod.configuration(parent) File "Lib_chaco/kiva/setup_kiva.py", line 60, in configuration config_list.append(get_package_config('agg','kiva')) File "Lib_chaco/kiva/setup_kiva.py", line 37, in get_package_config config = mod.configuration(parent) File "/export/home/wagner/cvs/scipy/Lib_chaco/kiva/agg/setup_agg.py", line 11, in configuration return setup_swig_agg.configuration(parent_package) File "/export/home/wagner/cvs/scipy/Lib_chaco/kiva/agg/swig_src/setup_swig_agg.py", line 38, in configuration assert (not res), "agg swig step failed" AssertionError: agg swig step failed From fperez at colorado.edu Thu Oct 2 12:32:13 2003 From: fperez at colorado.edu (Fernando Perez) Date: Thu, 02 Oct 2003 10:32:13 -0600 Subject: [SciPy-user] Another Python graphics package which may be of interest Message-ID: <3F7C530D.50108@colorado.edu> Hi all, today I stumbled across PyX: http://pyx.sourceforge.net I figured I'd mention it here, since I'd never seen it mentioned by anyone in the science/python discussions and it seems nicely done. With its tight latex integration (see the examples page), it may fit the bill for some. Disclaimer: I have NOT used it myself, I just looked at the examples and syntax. Regards, F. From wagner.nils at vdi.de Mon Oct 6 15:00:15 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Mon, 06 Oct 2003 21:00:15 +0200 Subject: [SciPy-user] Origin of limits Message-ID: <20031006190141.45FC23EB1C@www.scipy.com> Dear experts, I am interested in the origin of >>> limits.double_epsilon 2.22044604925e-16 Please can someone explain this value. Thanks in advance. Nils From bob at redivi.com Mon Oct 6 15:05:58 2003 From: bob at redivi.com (Bob Ippolito) Date: Mon, 6 Oct 2003 15:05:58 -0400 Subject: [SciPy-user] Origin of limits In-Reply-To: <20031006190141.45FC23EB1C@www.scipy.com> Message-ID: <1E5C361C-F830-11D7-AAA7-000A95686CD8@redivi.com> On Monday, Oct 6, 2003, at 15:00 America/New_York, Nils Wagner wrote: > I am interested in the origin of >>>> limits.double_epsilon > 2.22044604925e-16 > > Please can someone explain this value. IIRC, double_epsilon is the smallest representable number by a double (64bit IEEE 754 floating point). Python floats are of this type. -bob From wagner.nils at vdi.de Mon Oct 6 15:18:28 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Mon, 06 Oct 2003 21:18:28 +0200 Subject: [SciPy-user] Origin of limits In-Reply-To: <1E5C361C-F830-11D7-AAA7-000A95686CD8@redivi.com> Message-ID: <20031006191951.DC18D3EA06@www.scipy.com> ------------------- On Monday, Oct 6, 2003, at 15:00 America/New_York, Nils Wagner wrote: > I am interested in the origin of >>>> limits.double_epsilon > 2.22044604925e-16 > > Please can someone explain this value. IIRC, double_epsilon is the smallest representable number by a double (64bit IEEE 754 floating point). Python floats are of this type. -bob Are you aware of any reference (Journal, book, whatever) ? I need some more background information. 
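[Background on where the value quoted above comes from: on IEEE-754 double-precision hardware, 2.22044604925e-16 is 2**-52, the gap between 1.0 and the next larger representable double. A minimal, library-independent sketch of the usual textbook construction follows; it is illustrative only and not necessarily how scipy's limits module obtains the number.]

# Halve a trial value until adding it to 1.0 no longer changes the result;
# the last value that still made a difference is the machine epsilon.
eps = 1.0
while 1.0 + eps / 2.0 > 1.0:
    eps = eps / 2.0

print eps         # 2.22044604925e-16 on IEEE-754 doubles
print 2.0 ** -52  # the same value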
Nils _______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user From fperez at colorado.edu Mon Oct 6 15:22:40 2003 From: fperez at colorado.edu (Fernando Perez) Date: Mon, 06 Oct 2003 13:22:40 -0600 Subject: [SciPy-user] Origin of limits In-Reply-To: <20031006191951.DC18D3EA06@www.scipy.com> References: <20031006191951.DC18D3EA06@www.scipy.com> Message-ID: <3F81C100.6000606@colorado.edu> Nils Wagner wrote: > IIRC, double_epsilon is the smallest representable number by a double > (64bit IEEE 754 floating point). Python floats are of this type. > > -bob > > Are you aware of any reference (Journal, book, whatever) ? > I need some more background information. google('IEEE 754 floating point')["I'm feeling lucky"] Tons of references to books, articles, and the original IEEE standard itself. Please, let's keep the questions on this list to a level of things which are at least above google's first hit. Cheers, f From Norman.Shelley at motorola.com Mon Oct 6 15:15:46 2003 From: Norman.Shelley at motorola.com (Norman Shelley) Date: Mon, 06 Oct 2003 12:15:46 -0700 Subject: [SciPy-user] ?Loading double and cdouble in an PyArrayObject? Message-ID: <3F81BF62.517BB41@motorola.com> How do I take data from a double array or a complex array as defined below and initialize a Numeric array and load it appropriately? struct complex { double real; double imag; } complex; complex cdata[100]; double *data[100]; PyArrayObject *array; /* code where cdata and data are loaded up is not included *. I think this is how I do it for doubles (correct me if I'm wrong). array = PyArray_FromDims(1, numData, PyArray_DOUBLE) for (i=0; i< 100; i++) { (double *)(array->data)[i] = data[i]; } How do I load the data in cdata into an PyArrayObject? Thanks, Norman Shelley From wagner.nils at vdi.de Mon Oct 6 15:26:31 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Mon, 06 Oct 2003 21:26:31 +0200 Subject: [SciPy-user] suppress_small=0 In-Reply-To: <1E5C361C-F830-11D7-AAA7-000A95686CD8@redivi.com> Message-ID: <20031006192753.0D08E3EB37@www.scipy.com> Dear experts, The option suppress_small={0,1} has been removed from io.write_array. For what reason ? Nils From ransom at physics.mcgill.ca Mon Oct 6 15:28:25 2003 From: ransom at physics.mcgill.ca (Scott Ransom) Date: Mon, 6 Oct 2003 15:28:25 -0400 Subject: [SciPy-user] Origin of limits In-Reply-To: <20031006191951.DC18D3EA06@www.scipy.com> References: <20031006191951.DC18D3EA06@www.scipy.com> Message-ID: <200310061528.25762.ransom@physics.mcgill.ca> > On Monday, Oct 6, 2003, at 15:00 America/New_York, Nils Wagner wrote: > > I am interested in the origin of > > > >>>> limits.double_epsilon > > > > 2.22044604925e-16 > > > > Please can someone explain this value. > > IIRC, double_epsilon is the smallest representable number by a double > (64bit IEEE 754 floating point). Python floats are of this type. Actually, epsilon is not the smallest representable number, it is the smallest (fractional) difference you can have between 2 consecutive double precision numbers and still have them be distinct from one another. It is effectively the precision of the double precision floating point system used on the hardware in question. As Bob pointed out, the above value is typical for hardware that follows 64bit IEEE 754 standards. Scott -- Scott M. Ransom Address: McGill Univ. Physics Dept. 
Phone: (514) 398-6492 3600 University St., Rm 338 email: ransom at physics.mcgill.ca Montreal, QC Canada H3A 2T8 GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 From wagner.nils at vdi.de Mon Oct 6 17:14:29 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Mon, 06 Oct 2003 23:14:29 +0200 Subject: [SciPy-user] io.write_arrayfor complex arrays Message-ID: <20031006211551.AF7C73EB1C@www.scipy.com> Hi all, The output of io.write_array is somehow strange especially when the imaginary part is negative. -3.840747834305893e-01+j3.251446106553843e+00 -3.840747834305893e-01-j-3.251446106553843e+00 -1.023995414172234e+00+j4.358924376676102e+00 -1.023995414172234e+00-j-4.358924376676102e+00 Any suggestion ? Nils From nwagner at mecha.uni-stuttgart.de Tue Oct 7 03:18:38 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 07 Oct 2003 09:18:38 +0200 Subject: [SciPy-user] Latex output with io.write_array Message-ID: <3F8268CE.98AE2B7@mecha.uni-stuttgart.de> Dear experts, Is it somehow possible to extent the output options of io.write_array It would be nice to have an \LaTex output, e.g. from scipy import * from RandomArray import * file=open("latex.out",'w') A = rand(3,3) io.write_array(file,A,separator='&', linesep='\n', precision=5, keep_open=0) 9.70506e-01&7.53710e-01&3.78068e-01 1.16224e-01&4.41116e-01&9.08877e-01 5.99735e-01&4.20824e-01&8.19178e-01 I would like to have something like that \left[\begin{array}{ccc} 9.70506e-01&7.53710e-01&3.78068e-01 \\ 1.16224e-01&4.41116e-01&9.08877e-01 \\ 5.99735e-01&4.20824e-01&8.19178e-01 \end{array}\right] Any suggestion ? Nils From pearu at scipy.org Tue Oct 7 02:27:29 2003 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 7 Oct 2003 01:27:29 -0500 (CDT) Subject: [SciPy-user] Latex output with io.write_array In-Reply-To: <3F8268CE.98AE2B7@mecha.uni-stuttgart.de> Message-ID: On Tue, 7 Oct 2003, Nils Wagner wrote: > Dear experts, > > Is it somehow possible to extent the output options of io.write_array > It would be nice to have an \LaTex output, e.g. > > from scipy import * > from RandomArray import * > file=open("latex.out",'w') > A = rand(3,3) > io.write_array(file,A,separator='&', linesep='\n', precision=5, > keep_open=0) > > > 9.70506e-01&7.53710e-01&3.78068e-01 > 1.16224e-01&4.41116e-01&9.08877e-01 > 5.99735e-01&4.20824e-01&8.19178e-01 > > I would like to have something like that > > \left[\begin{array}{ccc} > 9.70506e-01&7.53710e-01&3.78068e-01 \\ > 1.16224e-01&4.41116e-01&9.08877e-01 \\ > 5.99735e-01&4.20824e-01&8.19178e-01 > \end{array}\right] > > > Any suggestion ? That's basic Python question: file=open("latex.out",'w') file.write('\left[\begin{array}{ccc}') io.write_array(file,A,separator='&', linesep=' \\\\\n', precision=5,keep_open=1) file.write('\end{array}\right]') Pearu From nwagner at mecha.uni-stuttgart.de Tue Oct 7 05:02:54 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 07 Oct 2003 11:02:54 +0200 Subject: [SciPy-user] Latex output Message-ID: <3F82813E.3F7F0715@mecha.uni-stuttgart.de> Pearu, Actually, I was thinking of an automized Latex output option for general matrices . It might be useful to have this option for illustrating results. The number of c's and \ are given by m,n=shape(A) How can I use this information with respect to your suggestion file.write('\left[\begin{array}{c...}') linesep='\\\\....' 
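[The follow-up question above, how to build the column specifier automatically from m, n = shape(A), is not answered later in this digest. One possible way to package Pearu's snippet is sketched below; write_latex_matrix is an invented helper name, not a scipy.io function, and the keyword arguments are exactly the ones used earlier in the thread.]

from Numeric import shape
from scipy import io

def write_latex_matrix(filename, A, precision=5):
    # Write the 2-D array A as a LaTeX array environment (illustrative sketch).
    m, n = shape(A)
    f = open(filename, 'w')
    f.write('\\left[\\begin{array}{%s}\n' % ('c' * n))  # one 'c' per column
    io.write_array(f, A, separator='&', linesep=' \\\\\n',
                   precision=precision, keep_open=1)
    f.write('\\end{array}\\right]\n')
    f.close()

Because the column specification is built from n, the same call works for any rectangular array; only the 'c' * n part depends on the shape.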
Nils From stephen.walton at csun.edu Tue Oct 7 14:07:25 2003 From: stephen.walton at csun.edu (Stephen Walton) Date: Tue, 07 Oct 2003 11:07:25 -0700 Subject: [SciPy-user] Origin of limits In-Reply-To: <200310061528.25762.ransom@physics.mcgill.ca> References: <20031006191951.DC18D3EA06@www.scipy.com> <200310061528.25762.ransom@physics.mcgill.ca> Message-ID: <1065550044.1878.37.camel@hector.sfo.csun.edu> On Mon, 2003-10-06 at 12:28, Scott Ransom wrote: > Actually, epsilon is not the smallest representable number, it is the > smallest (fractional) difference you can have between 2 consecutive > double precision numbers and still have them be distinct from one > another. To be even more precise, machine epsilon is defined as the smallest value which one can add to 1 in floating point and get an answer which is larger than 1. Kahaner, Moler and Nash, "Numerical Methods and Software", Prentice Hall Press. Probably also Numerical Recipes and many other places. -- Stephen Walton Dept. of Physics & Astronomy, Cal State Northridge From pearu at scipy.org Thu Oct 9 05:10:31 2003 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 9 Oct 2003 04:10:31 -0500 (CDT) Subject: [SciPy-user] Re: [SciPy-dev] gui_thread and chaco under Linux In-Reply-To: <16260.19985.719444.297190@monster.linux.in> Message-ID: Hi, ------------------------------------------------------------------ Those who have not followed this thread or are not interested in details, can skip until the end of this message where instructions for how to use chaco.wxplt from Python prompt under Linux (also Win32 should work but I haven't tested) are given. ------------------------------------------------------------------ On Wed, 8 Oct 2003, Prabhu Ramachandran wrote: > >>>>> "PP" == Pearu Peterson writes: > PP> I took this challenge and here are intermediate results: > > >>>> from parallel_exec import ParallelExec pexec=ParallelExec() # > [snip] > >>>> app.shutdown() > > PP> that is, I can already use chaco from my Linux prompt! And I > PP> have made no changes to chaco for that. The next step is to > PP> make > > Nice. This looks to be similar to the code written for Gtk by > someone. IIRC it was available on ASPN. You probably mean http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/65109 that indeed has very similar basic idea. The main difference with ParallelExec is that ParallelExec implementation is more generic and can be used also in non-GUI applications. > >>>> from chaco.wxplt import * plot([1,2]) > > PP> safe... > > I'd think you'd need to modify or wrap around wxPython itself to get > this to work that transparently or you'd need to do something similar > to gui_thread. I have tried both ways, well, more or less. And they have different pros and cons, but the main difficulty in both cases is that both are too complex to be absolutely robust. IMHO, if the gui_thread module is not robust then we'll struggle with Xlib: unexpected async reply type of crashes forever... > Actually it looks like this could be easy to hack into wxPython. Most > of wxPython's work is done inside of a call to apply. So if we > quietly replace apply inside wxPython's namespace with a massaged > apply that calls the real apply in the right thread, I think we should > be all set. This is a hack but if its this easy I guess Robin could > be convinced to add cleaner support for it. 
> > orig_apply = apply > def apply(o, *args, **kwargs): > pexec('orig_apply(o, *args, **kwargs)') > > > I don't know if this will work with pexec the way its implemented > currently but I think you get what I'm trying to say here. Yes, I think it is a very good idea to try out. If there are not calls to wxPython functions during wxPython import then this should work just fine. And using ParallelExec here should give no problems. I'll check it.. In the mean time, scipy Linux users can already use chaco.wxplt functions from Python prompt (just update scipy from CVS) as follows: >>> import gui_thread >>> execfile(gui_thread.__path__[0]+'/chaco_wxplt.py') >>> f1=figure() >>> p1=plot([1,2]) >>> f2=figure() >>> p2=plot([3,4]) Pearu From nwagner at mecha.uni-stuttgart.de Thu Oct 9 06:39:59 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 09 Oct 2003 12:39:59 +0200 Subject: [SciPy-user] Current status of SuperLU, UMFPACK, sparsekit in scipy ?? Message-ID: <3F853AFF.ED64E84@mecha.uni-stuttgart.de> Dear experts, Is it already possible to use the sparse solvers in scipy ? Afaik, it's no standard package In setup.py standard_packages = ['io','linalg', 'special','signal','stats', 'interpolate','integrate','optimize', 'cluster','cow','ga','fftpack' # ,'sparse' ] How do I use these packages ? A small example of the usage would be appreciated. Thanks in advance. Nils From hetal75 at hotmail.com Thu Oct 9 08:55:15 2003 From: hetal75 at hotmail.com (Hetal Patel) Date: Thu, 09 Oct 2003 13:55:15 +0100 Subject: [SciPy-user] How do I install SciPy with windows Python 2.3 Message-ID: Is there any way to install SciPy with Python v2.3 on windows? _________________________________________________________________ Get Hotmail on your mobile phone http://www.msn.co.uk/msnmobile From travis at enthought.com Thu Oct 9 09:25:17 2003 From: travis at enthought.com (Travis N. Vaught) Date: Thu, 9 Oct 2003 08:25:17 -0500 Subject: [SciPy-user] How do I install SciPy with windows Python 2.3 In-Reply-To: Message-ID: <0d5601c38e68$cb2ddf70$ac01a8c0@tvlaptop> Hetal, If you haven't customized your python installation and you'd like the 'kitchen sink included' approach, I'd recommend getting Python 2.3 (Enthought Edition) which comes with scipy and a ton of other stuff (including a nice integration of help from the included packages in one large .chm file). This is the easiest approach, as it comes with a nice click-install exe: http://www.enthought.com/python/ If this doesn't suit your needs, there may be a 0.2 binary for windows available in the medium term (which means "when we can get to it"). Otherwise, you can get the cvs version and do the usual: python setup.py install (after installing any dependencies, of course--read INSTALL.txt, it's the most authoritative document on building from source) Regards, Travis > -----Original Message----- > From: scipy-user-bounces at scipy.net [mailto:scipy-user-bounces at scipy.net] > On Behalf Of Hetal Patel > Sent: Thursday, October 09, 2003 6:55 AM > To: scipy-user at scipy.net > Subject: [SciPy-user] How do I install SciPy with windows Python 2.3 > > Is there any way to install SciPy with Python v2.3 on windows? 
> > _________________________________________________________________ > Get Hotmail on your mobile phone http://www.msn.co.uk/msnmobile > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From pearu at scipy.org Thu Oct 9 21:39:05 2003 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 9 Oct 2003 20:39:05 -0500 (CDT) Subject: [SciPy-user] New implementation for gui_thread In-Reply-To: <16261.27275.924686.578196@monster.linux.in> Message-ID: Hi, I have succesfully finished implementing new hooks for importing wxPython to its own thread. The working idea is that all wxPython extension modules (that contain only low-level functions) are transparently replaced with wrappers that direct wx execution to a different thread so that Python prompt will be available for interactive work even when a wxPython application is running. As an example, here follows how to use chaco from Python prompt: # update scipy from CVS >>> from gui_thread import wxPython_thread; wxPython_thread() >>> from chaco.wxplt import * >>> f=figure() >>> p=plot([1,2]) >>> close() etc. Also scipy.plt works: >>> from gui_thread import wxPython_thread; wxPython_thread() >>> from scipy.plt import * >>> f=figure() >>> p=plot([1,2]) etc. If there will be no serious problems then I'll replace wxPython_thread function with start so that the above examples become: >>> import gui_thread; gui_thread.start() >>> ... It would be great if someone could test the above also on Windows. Enjoy, Pearu From prabhu at aero.iitm.ernet.in Fri Oct 10 02:58:06 2003 From: prabhu at aero.iitm.ernet.in (Prabhu Ramachandran) Date: Fri, 10 Oct 2003 12:28:06 +0530 Subject: [SciPy-user] Re: [SciPy-dev] New implementation for gui_thread In-Reply-To: References: <16261.27275.924686.578196@monster.linux.in> Message-ID: <16262.22654.373840.444208@monster.linux.in> Hi, >>>>> "PP" == Pearu Peterson writes: PP> I have succesfully finished implementing new hooks for PP> importing wxPython to its own thread. The working idea is that PP> all wxPython extension modules (that contain only low-level PP> functions) are transparently replaced with wrappers that PP> direct wx execution to a different thread so that Python PP> prompt will be available for interactive work even when a PP> wxPython application is running. Nice! I briefly looked at the code and from what I can see you transparently wrap the builtin functions used inside all the wxPython functions. This means apply is wrapped and therefore everything works correctly? I've managed to work around the kiva_agg build problem by building the offending code with gcc-3.0. How does one specify to python setup.py to choose gcc-3.0 instead of vanilla gcc? Setting the environment variable CC does not appear to help. Anyway, the new gui_thread works really nicely. I like the fact that you don't have to wait before you run the next statement since you wait while importing wxPython. However, the new gui_thread appears to have trouble when quitting the interpreter: In [1]: from gui_thread import wxPython_thread; wxPython_thread() In [2]: from chaco.wxplt import * In [3]: p=plot([1,2]) In [4]: close() In [5]: Do you really want to exit ([y]/n)? 
Exception in thread Thread-221: Traceback (most recent call last): File "/usr/lib/python2.2/threading.py", line 408, in __bootstrap self.run() File "/usr/local/stow/scipy/lib/python2.2/site-packages/gui_thread/wxBackgroundApp.py", line 65, in run exec self.cmd in self.globs,self.locs File "", line 1, in ? AttributeError: wxBackgroundApp instance has no attribute 'event_catcher' cheers, prabhu From nwagner at mecha.uni-stuttgart.de Fri Oct 10 09:15:12 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 10 Oct 2003 15:15:12 +0200 Subject: [SciPy-user] rand, random and seed Message-ID: <3F86B0E0.B3E83C7C@mecha.uni-stuttgart.de> Dear experts, What is going on here from scipy import * from RandomArray import * seed() # Set seeds based on current time a = get_seed() # What seeds were used print a print random((2,2)) print rand(2,2) seed(a[0],a[1]) # resetting the same seeds print random((2,2)) # should yield the same results for both, rand and random print rand(2,2) # Am I missing something Any suggestion ? Nils From jmiller at stsci.edu Fri Oct 10 10:17:54 2003 From: jmiller at stsci.edu (Todd Miller) Date: 10 Oct 2003 10:17:54 -0400 Subject: [SciPy-user] doctest fortran I/O synchronization Message-ID: <1065795474.24316.105.camel@halloween.stsci.edu> I'm trying to make a doctest to verify that the different flow patterns of f2py interfaces work with different varieties of numarrays (normal, byte-swapped, misaligned, dis-contiguous, type-converted). I'm trying to test this out under Linux with g77, and (it seems like) I'm having trouble synchronizing the Fortran I/O with Python's C I/O. Given foo.f: subroutine in_c(a,m,n) real*8 a(n,m) Cf2py intent(in,c) a Cf2py depend(a) :: n=shape(a,0), m=shape(a,1) do j=1,m do i=1,n write (6,1) a(i,j) 1 format( $, 1F3.0, ', ') enddo print *,'' enddo end And given f2py_tests.py: """ >>> foo.in_f(a) 0., 5., 10., 1., 6., 11., 2., 7., 12., 3., 8., 13., 4., 9., 14., """ import foo, numarray def test(): import doctest global a t = doctest.Tester(globs=globals()) a = numarray.arange(15., shape=(3,5)) t.runstring(__doc__, "c_array") return t.summarize() I get this: [jmiller at halloween ~/f2py_tests]$ python f2py_tests.py 0., 5., 10., 1., 6., 11., 2., 7., 12., 3., 8., 13., 4., 9., 14., ***************************************************************** Failure in example: foo.in_f(a) from line #1 of c_array Expected: 0., 5., 10., 1., 6., 11., 2., 7., 12., 3., 8., 13., 4., 9., 14., Got: ***************************************************************** 1 items had failures: 1 of 1 in c_array ***Test Failed*** 1 failures. Where it appears that the output from the first example somehow escapes the C I/O system I presume doctest is using. The actual test I'm writing has multiple examples, and the fortran I/O *does* make it into the doctest after the first example but remains out of sync. Does anyone have an explanation and/or fix for this problem? -- Todd Miller jmiller at stsci.edu STSCI / ESS / SSB From pearu at scipy.org Fri Oct 10 10:54:15 2003 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 10 Oct 2003 09:54:15 -0500 (CDT) Subject: [SciPy-user] doctest fortran I/O synchronization In-Reply-To: <1065795474.24316.105.camel@halloween.stsci.edu> Message-ID: Hi Todd, On 10 Oct 2003, Todd Miller wrote: > I'm trying to make a doctest to verify that the different flow patterns > of f2py interfaces work with different varieties of numarrays (normal, > byte-swapped, misaligned, dis-contiguous, type-converted). 
I'm trying > to test this out under Linux with g77, and (it seems like) I'm having > trouble synchronizing the Fortran I/O with Python's C I/O. > > Given foo.f: > > subroutine in_c(a,m,n) > real*8 a(n,m) > Cf2py intent(in,c) a > Cf2py depend(a) :: n=shape(a,0), m=shape(a,1) > do j=1,m > do i=1,n > write (6,1) a(i,j) > 1 format( $, 1F3.0, ', ') > enddo > print *,'' > enddo > end > > And given f2py_tests.py: > """ > >>> foo.in_f(a) I could not run given tests as they were not complete and had typos. For instance, foo.f defines in_c but in test string you are using in_f. Also AFAIK, Python I/O will not catch I/O from Fortran. Any way, when I modify in_c to in_f then the following code a = numarray.arange(15., shape=(3,5)) print a foo.in_f(a) outputs: [[ 0. 1. 2. 3. 4.] [ 5. 6. 7. 8. 9.] [ 10. 11. 12. 13. 14.]] 0., 1., 2., 3., 4., 5., 6., 7., 8., 9., 10., 11., 12., 13., 14., You probaly would not expect this. This is related to different storage order in C and Fortran, of cource, and you have disabled f2py ability to take into account this by using intent(c). So, when you would not use intent(c), that is, in foo.f is a line Cf2py intent(in) a then the following output occurs: [[ 0. 1. 2. 3. 4.] [ 5. 6. 7. 8. 9.] [ 10. 11. 12. 13. 14.]] 0., 5., 10., 1., 6., 11., 2., 7., 12., 3., 8., 13., 4., 9., 14., The arrays look transposed because in Fortran your row index varies faster, that is, a transposed array is displayed. To sum up, don't use intent(c) and change the order of loops in in_f function to get matching results. HTH, Pearu From jmiller at stsci.edu Fri Oct 10 11:09:09 2003 From: jmiller at stsci.edu (Todd Miller) Date: 10 Oct 2003 11:09:09 -0400 Subject: [SciPy-user] doctest fortran I/O synchronization In-Reply-To: References: Message-ID: <1065798549.24315.164.camel@halloween.stsci.edu> Hi Pearu, Sorry about the example mismatch. Your explanation of what to do with the array loops will eventually be a big help. Thanks! Todd On Fri, 2003-10-10 at 10:54, Pearu Peterson wrote: > > Hi Todd, > > On 10 Oct 2003, Todd Miller wrote: > > > I'm trying to make a doctest to verify that the different flow patterns > > of f2py interfaces work with different varieties of numarrays (normal, > > byte-swapped, misaligned, dis-contiguous, type-converted). I'm trying > > to test this out under Linux with g77, and (it seems like) I'm having > > trouble synchronizing the Fortran I/O with Python's C I/O. > > > > Given foo.f: > > > > subroutine in_c(a,m,n) > > real*8 a(n,m) > > Cf2py intent(in,c) a > > Cf2py depend(a) :: n=shape(a,0), m=shape(a,1) > > do j=1,m > > do i=1,n > > write (6,1) a(i,j) > > 1 format( $, 1F3.0, ', ') > > enddo > > print *,'' > > enddo > > end > > > > And given f2py_tests.py: > > > """ > > >>> foo.in_f(a) > > > I could not run given tests as they were not complete and had typos. > For instance, foo.f defines in_c but in test string you are using in_f. > Also AFAIK, Python I/O will not catch I/O from Fortran. This is good to know. FWIW, it seems Python I/O eventually catches some of Fortran I/O, just not all of it. > > Any way, when I modify in_c to in_f then the following code > > a = numarray.arange(15., shape=(3,5)) > print a > foo.in_f(a) > > outputs: > > [[ 0. 1. 2. 3. 4.] > [ 5. 6. 7. 8. 9.] > [ 10. 11. 12. 13. 14.]] > 0., 1., 2., > 3., 4., 5., > 6., 7., 8., > 9., 10., 11., > 12., 13., 14., > > You probaly would not expect this. This is related to different storage > order in C and Fortran, of cource, and you have disabled f2py ability to > take into account this by using intent(c). 
> > So, when you would not use intent(c), that is, in foo.f is a line > > Cf2py intent(in) a > > then the following output occurs: > > [[ 0. 1. 2. 3. 4.] > [ 5. 6. 7. 8. 9.] > [ 10. 11. 12. 13. 14.]] > 0., 5., 10., > 1., 6., 11., > 2., 7., 12., > 3., 8., 13., > 4., 9., 14., > > The arrays look transposed because in Fortran your row index varies > faster, that is, a transposed array is displayed. > > To sum up, don't use intent(c) and change the order of loops in in_f > function to get matching results. > > HTH, > Pearu > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user -- Todd Miller jmiller at stsci.edu STSCI / ESS / SSB From anewgene at hotpop.com Fri Oct 10 17:09:30 2003 From: anewgene at hotpop.com (CL WU) Date: Fri, 10 Oct 2003 16:09:30 -0500 Subject: [SciPy-user] How do I install SciPy with windows Python 2.3 In-Reply-To: <0d5601c38e68$cb2ddf70$ac01a8c0@tvlaptop> References: <0d5601c38e68$cb2ddf70$ac01a8c0@tvlaptop> Message-ID: <3F87200A.8050105@hotpop.com> Hi, group, I am new to scipy. I installed python 2.3 enthought distribution on my win2k, but I have some problems to make it work: 1. scipy.test() failed on my machine with 2 errors. The error msg (truncated) is attached as scipy_testerr.txt. 2. I install scipy for trying its weave, but weave.test() failed with even more errors (29). They are all the same exception for different function. I attached a typical err msg in weave_testerr.txt. Some notes about my system: a. I installed enthought distribution over my original python 2.3 from python.org. b. I have visual studio .net v7.0 on my computer and I already put "C:\PROGRA~1\MICROS~1.NET\Vc7\bin" (where "vcvars32.bat" locates) into my "PATH". 3. scipy.plt doesn't work and it always crashes python or hangs the whole system up with an "memory could not be read" error , although I "import gui_thread" before "from scipy.plt import *" already. Would you point me out how to solve these problems? I hope those problems are obvious to you. Thanks. Chunlei I -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: scipy_testerr.txt URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: weave_testerr.txt URL: From pajer at iname.com Fri Oct 10 18:30:25 2003 From: pajer at iname.com (Gary Pajer) Date: Fri, 10 Oct 2003 18:30:25 -0400 Subject: [SciPy-user] How do I install SciPy with windows Python 2.3 References: <0d5601c38e68$cb2ddf70$ac01a8c0@tvlaptop> <3F87200A.8050105@hotpop.com> Message-ID: <003901c38f7e$19d7f320$01fd5644@playroom> > 3. scipy.plt doesn't work and it always crashes python or hangs the > whole system up with an "memory could not be read" error , although I > "import gui_thread" before "from scipy.plt import *" already. I *think* this is a well known problem that is being addressed. After import gui_thread try gui_thread.start() and see if that helps. There have been recent messages posted that seem to say that this will change shortly. You may find that scipy.plt gives you a bunch of "float deprecation warnings". I think that they are being addressed I'm really not the person to comment, but: I *think* a bunch of improvements are coming soon. (days, not months). 
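[Putting together the advice above with the session Pearu posted earlier in this thread, the import order that avoids the crash looks roughly like the sketch below. It only restates the usage pattern quoted in this digest; depending on the CVS snapshot, the entry point is gui_thread.start() or gui_thread.wxPython_thread().]

import gui_thread
gui_thread.start()        # start the wx event loop in its own thread first

from scipy.plt import *   # import the wx-based plotting module only afterwards
f = figure()
p = plot([1, 2, 3])       # the Python prompt stays responsive while the window runs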
From anewgene at hotpop.com Fri Oct 10 18:58:41 2003 From: anewgene at hotpop.com (CL WU) Date: Fri, 10 Oct 2003 17:58:41 -0500 Subject: [SciPy-user] How do I install SciPy with windows Python 2.3 In-Reply-To: <003901c38f7e$19d7f320$01fd5644@playroom> References: <0d5601c38e68$cb2ddf70$ac01a8c0@tvlaptop> <3F87200A.8050105@hotpop.com> <003901c38f7e$19d7f320$01fd5644@playroom> Message-ID: <3F8739A1.9080609@hotpop.com> Yes, this problem was solved as you said. Thanks. Chunlei Gary Pajer wrote: >>3. scipy.plt doesn't work and it always crashes python or hangs the >>whole system up with an "memory could not be read" error , although I >>"import gui_thread" before "from scipy.plt import *" already. >> >> > >I *think* this is a well known problem that is being addressed. >After >import gui_thread >try >gui_thread.start() >and see if that helps. There have been recent messages posted that seem to >say that this will change shortly. > >You may find that scipy.plt gives you a bunch of "float deprecation >warnings". I think that they are being addressed > >I'm really not the person to comment, but: I *think* a bunch of >improvements are coming soon. (days, not months). > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > > From pajer at iname.com Fri Oct 10 22:31:58 2003 From: pajer at iname.com (Gary Pajer) Date: Fri, 10 Oct 2003 22:31:58 -0400 Subject: [SciPy-user] Chaco / traits line color problem in current cvs ? Message-ID: <200310102231.58448.pajer@iname.com> The improvments in threading wxplt seem to work well... thanks. Now, if I create a simple plot >>> from gui_thread import wxPython_thread ; wxPython_thread() >>> from chaco import wxplt >>> wxplt.plot((1,2,3)) and try to change the line color by right-click / Edit / line, and choosing a color from the pallette, it doesn't. >>> Traceback (most recent call last): File "/usr/local/lib/python2.3/site-packages/traits/wxtrait_sheet.py", line 2186, in on_click self.from_wx_color( control.GetBackgroundColour(), parent ), TypeError: from_wx_color() takes exactly 2 arguments (3 given) Also, on my system the color icons on the color pallettes are all white. regards, gary From stephen.walton at csun.edu Fri Oct 10 23:39:28 2003 From: stephen.walton at csun.edu (Stephen Walton) Date: Fri, 10 Oct 2003 20:39:28 -0700 Subject: [SciPy-user] New implementation for gui_thread In-Reply-To: References: Message-ID: <1065843568.2048.2.camel@dhcppc1> On Thu, 2003-10-09 at 18:39, Pearu Peterson wrote: > # update scipy from CVS Well, I tried, and now scipy won't build on RedHat 8. The specific error is from SWIG, where I have version 1.1 installed from the RedHat provided RPM. The error message log is attached. (I may not have noticed before, but has SWIG always been required for a build from source?) -- Stephen Walton Dept. of Physics & Astronomy, Cal State Northridge -------------- next part -------------- Script started on Fri Oct 10 20:38:26 2003 sunspot:~/src/scipy> python setup.py build ### Little Endian detected #### building agg swig: swig -c++ -python -shadow -I/home/swalton/src/scipy/Lib_chaco/agg2/include agg.i Generating wrappers for Python /home/swalton/src/scipy/Lib_chaco/agg2/include/agg_basics.h : Line 27. Syntax error in input. /home/swalton/src/scipy/Lib_chaco/agg2/include/agg_basics.h : Line 29. Syntax error in input. /home/swalton/src/scipy/Lib_chaco/agg2/include/agg_basics.h : Line 139. Syntax error in input. constants.i : Line 35. 
Syntax error in input. constants.i : Line 36. Syntax error in input. constants.i : Line 37. Syntax error in input. constants.i : Line 38. Syntax error in input. constants.i : Line 39. Syntax error in input. constants.i : Line 41. Syntax error in input. constants.i : Line 51. Syntax error in input. constants.i : Line 71. Syntax error in input. constants.i : Line 84. Syntax error in input. constants.i : Line 94. Syntax error in input. constants.i : Line 103. Syntax error in input. constants.i : Line 113. Syntax error in input. constants.i : Line 133. Syntax error in input. constants.i : Line 145. Syntax error in input. /home/swalton/src/scipy/Lib_chaco/agg2/include/agg_basics.h : Line 62. Constructor new_rect multiply defined (2nd definition ignored). /home/swalton/src/scipy/Lib_chaco/agg2/include/agg_basics.h : EOF. Missing #endif detected. Traceback (most recent call last): File "setup.py", line 111, in ? setup_package() File "setup.py", line 90, in setup_package config_list += map(get_separate_package_config,separate_packages) File "setup.py", line 74, in get_separate_package_config return get_package_config(name,'') File "setup.py", line 68, in get_package_config config = mod.configuration(parent) File "Lib_chaco/kiva/setup_kiva.py", line 60, in configuration config_list.append(get_package_config('agg','kiva')) File "Lib_chaco/kiva/setup_kiva.py", line 37, in get_package_config config = mod.configuration(parent) File "/home/swalton/src/scipy/Lib_chaco/kiva/agg/setup_agg.py", line 11, in configuration return setup_swig_agg.configuration(parent_package) File "/home/swalton/src/scipy/Lib_chaco/kiva/agg/swig_src/setup_swig_agg.py", line 38, in configuration assert (not res), "agg swig step failed" AssertionError: agg swig step failed sunspot:~/src/scipy> exit Script done on Fri Oct 10 20:38:32 2003 From pearu at scipy.org Sat Oct 11 01:40:44 2003 From: pearu at scipy.org (Pearu Peterson) Date: Sat, 11 Oct 2003 00:40:44 -0500 (CDT) Subject: [SciPy-user] New implementation for gui_thread In-Reply-To: <1065843568.2048.2.camel@dhcppc1> Message-ID: On Fri, 10 Oct 2003, Stephen Walton wrote: > On Thu, 2003-10-09 at 18:39, Pearu Peterson wrote: > > > # update scipy from CVS > > Well, I tried, and now scipy won't build on RedHat 8. The specific > error is from SWIG, where I have version 1.1 installed from the RedHat > provided RPM. The error message log is attached. > > (I may not have noticed before, but has SWIG always been required for a > build from source?) Upgrading to swig1.3 fixes the errors. Swig is only required when building chaco packages. It would be great if some Swig expert could check if it is possible to backport .i files in chaco to Swig version 1.1. Btw, people who will make a fresh install from the latest scipy may notice that importing chaco.wxplt will fail. This is related to http://www.scipy.net/roundup/scipy/issue100 I have looked into this problem and discovered that Linux port of chaco is broken more seriously than I would expect. Though the missing files can be restored from Attic directories, it seems that chaco development under Linux is so much behind what has been done under Windows. And now I am much less enthusiastic in the possibility starting to use chaco under linux in short term than before, after the gui_thread issue was resolved :-( Regards, Pearu PS: I'll be mostly off-line for the coming week. 
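[Since the build failure quoted above gives no hint that the installed SWIG is simply too old, a quick check of the SWIG release before running setup.py can save a rebuild. This is only a convenience sketch (Unix-style, using the standard commands module); scipy's build does not perform such a check itself.]

import commands

# 'swig -version' prints something like 'SWIG Version 1.3.19'; the interface
# files under Lib_chaco need a 1.3.x release, while 1.1 fails with the
# 'Syntax error in input' messages shown earlier in this thread.
status, output = commands.getstatusoutput('swig -version')
print output
if output.find('Version 1.1') != -1:
    print 'Upgrade SWIG to 1.3.x before building the chaco/kiva packages.'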
From pajer at iname.com Sat Oct 11 11:07:04 2003 From: pajer at iname.com (Gary Pajer) Date: Sat, 11 Oct 2003 11:07:04 -0400 Subject: [SciPy-user] New implementation for gui_thread References: Message-ID: <001d01c39009$5551de80$01fd5644@playroom> ----- Original Message ----- From: "Pearu Peterson" [...] > I have looked into this problem and discovered that Linux port of > chaco is broken more seriously than I would expect. Though the missing > files can be restored from Attic directories, it seems that chaco > development under Linux is so much behind what has been done under > Windows. Generally, is scipy / chaco development happening in Windows, then ported to linux, or vice versa? The instructions on the web site for building in Windows sound very scary. Is building in Windows reasonable using the mingc compiler or cygwin? The instructions on the web site are definitely broken ( a broken link) and possibly old (?). I've avoided even trying to build under windows, but I'd love to, actually. I'd ask if it were possible to post instructions, but I don't know what I'm asking ... it may be an unreasonable request. -Gary From stephen.walton at csun.edu Sat Oct 11 13:23:12 2003 From: stephen.walton at csun.edu (Stephen Walton) Date: Sat, 11 Oct 2003 10:23:12 -0700 Subject: [SciPy-user] New implementation for gui_thread In-Reply-To: References: Message-ID: <1065892991.4055.18.camel@dhcppc1> On Fri, 2003-10-10 at 22:40, Pearu Peterson wrote: > Upgrading to swig1.3 fixes the errors [with Linux builds]. > Swig is only required when building > chaco packages.... Yeah, I did upgrade to swig1.3. I'm not a SWIG expert but I've played with it a bit, so maybe I'll see if config.i can be made compatible with 1.1, which is still considered the current stable release according to the SWIG web site. I have hit another problem: the config.h file in Lib/xplt/src/play/unix/ of the current CVS is badly malformatted. I deleted everything but the three #define lines. The CVS doesn't seem contain the config.sh script that generates this file. The comments seem to indicate that it is Travis's problem :-) > I have looked into this problem and discovered that Linux port of > chaco is broken more seriously than I would expect. Indeed, and this is very important to me. If I want my largely Linux-using folks to switch to SciPy from IRAF/MATLAB, I need a good graphics package. FWIW, here's what happens when I try Pearu's example: Python 2.2.1 (#1, Aug 30 2002, 12:15:30) [GCC 3.2 20020822 (Red Hat Linux Rawhide 3.2-4)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> from gui_thread import wxPython_thread >>> wxPython_thread() >>> from chaco.wxplt import * Traceback (most recent call last): File "", line 1, in ? File "/usr/lib/python2.2/site-packages/chaco/wxplt.py", line 19, in ? from wxplot import * File "/usr/lib/python2.2/site-packages/chaco/wxplot.py", line 17, in ? from wxplot_window import PlotWindow, PlotTraitSheet File "/usr/lib/python2.2/site-packages/chaco/wxplot_window.py", line 7, in ? from wxplot_window_native import * File "/usr/lib/python2.2/site-packages/chaco/wxplot_window_native.py", line 27, in ? from kiva.wxcore2d import GraphicsContext, CAP_BUTT File "/usr/lib/python2.2/site-packages/kiva/wxcore2d.py", line 7, in ? from wxcore2d_native import * File "/usr/lib/python2.2/site-packages/kiva/wxcore2d_native.py", line 17, in ? 
from image_transform import transform_image,bin_mask ImportError: No module named image_transform >>> ^D Exception exceptions.AttributeError: "'NoneType' object has no attribute 'get_ident'" in ignored [above repeated about 30 times] 10:20:29 AM: Debug: C++ assertion "wxThread::IsMain()" failed in /projects/wx/wxPython/_build_rpm/rpmtop/BUILD/wxPythonSrc-2.4.0.1/src/unix/threadpsx.cpp(1618): only main thread can be here Exception wxPython.wxc.wxPyAssertionError: 'C++ assertion "wxThread::IsMain()" failed in /projects/wx/wxPython/_build_rpm/rpmtop/BUILD/wxPythonSrc-2.4.0.1/src/unix/threadpsx.cpp(1618): only main thread can be here' in > ignored -- Stephen Walton Dept. of Physics & Astronomy, Cal State Northridge From prabhu at aero.iitm.ernet.in Sun Oct 12 13:39:51 2003 From: prabhu at aero.iitm.ernet.in (Prabhu Ramachandran) Date: Sun, 12 Oct 2003 23:09:51 +0530 Subject: [SciPy-user] New implementation for gui_thread In-Reply-To: <1065892991.4055.18.camel@dhcppc1> References: <1065892991.4055.18.camel@dhcppc1> Message-ID: <16265.37351.798495.737961@monster.linux.in> >>>>> "SW" == Stephen Walton writes: SW> On Fri, 2003-10-10 at 22:40, Pearu Peterson wrote: >> Upgrading to swig1.3 fixes the errors [with Linux builds]. >> Swig is only required when building chaco packages.... SW> Yeah, I did upgrade to swig1.3. I'm not a SWIG expert but SW> I've played with it a bit, so maybe I'll see if config.i can SW> be made compatible with 1.1, which is still considered the SW> current stable release according to the SWIG web site. Swig 1.3 is vastly superior to 1.1 as regards c++ support. I'm using it off CVS and it seems quite stable for me. Having said that I am not in a position to help with the actual problem. I guess if the swig'ed sources are included in CVS others will be able to build scipy even without swig 1.3 installed. [snip] >> I have looked into this problem and discovered that Linux port >> of chaco is broken more seriously than I would expect. SW> Indeed, and this is very important to me. If I want my SW> largely Linux-using folks to switch to SciPy from IRAF/MATLAB, SW> I need a good graphics package. Well, if its Linux only I recommend giving gracePlot.py a try. It is useable enough from the Python command line and has a very powerful GUI and pretty decent docs. It supports greek symbols, produces very nice EPS output (apart from PDF and SVG). Check out the features here: http://plasma-gate.weizmann.ac.il/Grace/ I use SciPy to process most data and then simply plot the results with Grace. Of course, Debian stable ships with grace. :) I personally have been using Nathan Gray's gracePlot.py and am quite happy with it. http://www.idyll.org/~n8gray/code/ You might also like this: http://graceplot.sourceforge.net/ but I haven't used it. cheers, prabhu From pearu at scipy.org Mon Oct 13 15:30:57 2003 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 13 Oct 2003 14:30:57 -0500 (CDT) Subject: [SciPy-user] New implementation for gui_thread In-Reply-To: <001d01c39009$5551de80$01fd5644@playroom> Message-ID: On Sat, 11 Oct 2003, Gary Pajer wrote: > Generally, is scipy / chaco development happening in Windows, then ported to > linux, or vice versa? I think it is true that Scipy is developed mainly under Linux and Chaco mainly (read: currently only) under Windows. > The instructions on the web site for building in Windows sound very scary. > Is building in Windows reasonable using the mingc compiler or cygwin? 
The > instructions on the web site are definitely broken ( a broken link) and > possibly old (?). I've avoided even trying to build under windows, but I'd > love to, actually. I'd ask if it were possible to post instructions, but I > don't know what I'm asking ... it may be an unreasonable request. I think what you are asking is a very reasonable request but unfortunately I cannot help much what comes to building scipy under Windows. I can only suggest to try it out and report any occuring problems to scipy-dev list. Regards, Pearu From oliphant at ee.byu.edu Mon Oct 13 19:31:18 2003 From: oliphant at ee.byu.edu (Travis E. Oliphant) Date: Mon, 13 Oct 2003 17:31:18 -0600 Subject: [SciPy-user] suppress_small=0 In-Reply-To: <20031006192753.0D08E3EB37@www.scipy.com> References: <20031006192753.0D08E3EB37@www.scipy.com> Message-ID: <3F8B35C6.7040600@ee.byu.edu> Nils Wagner wrote: > Dear experts, > > The option suppress_small={0,1} has been removed > from io.write_array. For what reason ? > Because I had to re-write the write_array function instead of borrowing Numeric's print statement so that io.read_array() would always be able to read the output of write_array. I did not invest the time to add the option. It could be added if there is a great need. -Travis From oliphant at ee.byu.edu Mon Oct 13 19:57:25 2003 From: oliphant at ee.byu.edu (Travis E. Oliphant) Date: Mon, 13 Oct 2003 17:57:25 -0600 Subject: [SciPy-user] Current status of SuperLU, UMFPACK, sparsekit in scipy ?? In-Reply-To: <3F853AFF.ED64E84@mecha.uni-stuttgart.de> References: <3F853AFF.ED64E84@mecha.uni-stuttgart.de> Message-ID: <3F8B3BE5.6020601@ee.byu.edu> Nils Wagner wrote: > Dear experts, > > Is it already possible to use the sparse solvers in scipy ? > Afaik, it's no standard package > > In setup.py > > standard_packages = ['io','linalg', > 'special','signal','stats', > 'interpolate','integrate','optimize', > 'cluster','cow','ga','fftpack' > # ,'sparse' > ] > > > How do I use these packages ? > > A small example of the usage would be appreciated. Look at the sparse subdirectory and the source code for the Sparse package (Sparse.py I think). There are tests at the bottom of the file that demonstrate use. -Travis From oliphant at ee.byu.edu Mon Oct 13 20:10:22 2003 From: oliphant at ee.byu.edu (Travis E. Oliphant) Date: Mon, 13 Oct 2003 18:10:22 -0600 Subject: [SciPy-user] New implementation for gui_thread In-Reply-To: <1065843568.2048.2.camel@dhcppc1> References: <1065843568.2048.2.camel@dhcppc1> Message-ID: <3F8B3EEE.4090402@ee.byu.edu> Stephen Walton wrote: > On Thu, 2003-10-09 at 18:39, Pearu Peterson wrote: > > >># update scipy from CVS > > > Well, I tried, and now scipy won't build on RedHat 8. The specific > error is from SWIG, where I have version 1.1 installed from the RedHat > provided RPM. The error message log is attached. > You need a newer version of swig. I've had similar problems which disappeared when I obtained the newest release of SWIG 1.3.19 (I think) The version that comes with RedHat 8 is outdated, I suppose. -Travis > (I may not have noticed before, but has SWIG always been required for a > build from source?) > It's been needed off and on. I think it's always been suggested. From nwagner at mecha.uni-stuttgart.de Tue Oct 14 04:17:03 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 14 Oct 2003 10:17:03 +0200 Subject: [SciPy-user] Current status of SuperLU, UMFPACK, sparsekit inscipy ?? 
References: <3F853AFF.ED64E84@mecha.uni-stuttgart.de> <3F8B3BE5.6020601@ee.byu.edu> Message-ID: <3F8BB0FF.488ACCF9@mecha.uni-stuttgart.de> "Travis E. Oliphant" schrieb: > > Nils Wagner wrote: > > Dear experts, > > > > Is it already possible to use the sparse solvers in scipy ? > > Afaik, it's no standard package > > > > In setup.py > > > > standard_packages = ['io','linalg', > > 'special','signal','stats', > > 'interpolate','integrate','optimize', > > 'cluster','cow','ga','fftpack' > > # ,'sparse' > > ] > > > > > > How do I use these packages ? > > > > A small example of the usage would be appreciated. > > Look at the sparse subdirectory and the source code for the Sparse > package (Sparse.py I think). There are tests at the bottom of the file > that demonstrate use. > > -Travis > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user Travis, I added sparse to standard_packages in setup.py. The examples work fine for me. Is it possible to convert a sparse matrix S to full storage organization in scipy ? Which format is used by Matlab for sparse matrices ? Is it compatible with respect to scipy ? Can I import sparse matrices which are generated by Matlab ? Nils From nwagner at mecha.uni-stuttgart.de Tue Oct 14 04:31:34 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 14 Oct 2003 10:31:34 +0200 Subject: [SciPy-user] Documentation of sparse etc Message-ID: <3F8BB466.C90A4523@mecha.uni-stuttgart.de> Dear experts, So far sparse is not documented in the tutorial. Is it possible to redirect the output of help(sparse) to a file ? Actually, I would like to have a complete overview with respect to the functionality of sparse etc. Any suggestion ? Nils From nwagner at mecha.uni-stuttgart.de Tue Oct 14 05:28:28 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 14 Oct 2003 11:28:28 +0200 Subject: [SciPy-user] io.loadmat Message-ID: <3F8BC1BC.85CED77F@mecha.uni-stuttgart.de> Dear experts, I need some advice concerning loadmat loadmat -- read a MATLAB (version <= 4) style mat file loadmat(name, dict=None, appendmat=1) Load the MATLAB mat file saved in level 1.0 format. If name is a full path name load it in. Otherwise search for the file on the sys.path list and load the first one found (the current directory is searched first). Only Level 1.0 MAT files are supported so far. Inputs: name -- name of the mat file (don't need .mat extension) dict -- the dictionary to insert into. If none the variables will be returned in a dictionary. appendmat -- non-zero to append the .mat extension to the end of the given filename. Outputs: If dict is None, then a dictionary of names and objects representing the stored arrays is returned. What is meaning of level 1.0 format ? Afaik, the current version of Matlab is 6.5. However loadmat seems to be restricted to version <=4. Is there any progress in upgrading this feature ? from scipy import * file=open("matrizen_red.mat",'r') io.loadmat(file,appendmat=0) raceback (most recent call last): File "matlab.py", line 3, in ? io.loadmat(file,appendmat=0) File "/usr/local/lib/python2.1/site-packages/scipy/io/mio.py", line 348, in loadmat if os.sep in name: TypeError: 'in' or 'not in' needs sequence right argument Any suggestion ? Nils From ransom at physics.mcgill.ca Tue Oct 14 09:36:23 2003 From: ransom at physics.mcgill.ca (Scott Ransom) Date: Tue, 14 Oct 2003 09:36:23 -0400 Subject: [SciPy-user] Parameter errors using leastsq? 
Message-ID: <20031014133623.GB19765@spock.physics.mcgill.ca> Hi All, Does anyone have an example of how to extract the statistical errors on the fitted parameters from the full_output of scipy.optimize.leastsq? Or for that matter, simply the covariance matrix? My linear algebra is beyond rusty, and I'm not sure which of the output arrays are needed for this. If it is easy, it might make a nice addition to the tutorial page showing the fitting of a sinusoid to noisy data. Thanks in advance, Scott -- Scott M. Ransom Address: McGill Univ. Physics Dept. Phone: (514) 398-6492 3600 University St., Rm 338 email: ransom at physics.mcgill.ca Montreal, QC Canada H3A 2T8 GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 From kern at ugcs.caltech.edu Tue Oct 14 09:03:07 2003 From: kern at ugcs.caltech.edu (Robert Kern) Date: Tue, 14 Oct 2003 06:03:07 -0700 (PDT) Subject: [SciPy-user] Re: io.loadmat In-Reply-To: <3F8BC1BC.85CED77F__30309.4456474811$1066119230@mecha.uni-stuttgart.de> Message-ID: <20031014130307.94F2B9C18F@barf.ugcs.caltech.edu> In article <3F8BC1BC.85CED77F__30309.4456474811$1066119230 at mecha.uni-stuttgart.de> you wrote: [snip] > from scipy import * > file=open("matrizen_red.mat",'r') > io.loadmat(file,appendmat=0) > > raceback (most recent call last): > File "matlab.py", line 3, in ? > io.loadmat(file,appendmat=0) > File "/usr/local/lib/python2.1/site-packages/scipy/io/mio.py", line > 348, in loadmat > if os.sep in name: > TypeError: 'in' or 'not in' needs sequence right argument Use the filename as the first argument, not an open file object. -- Robert Kern kern at ugcs.caltech.edu "In the fields of Hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From wagner.nils at vdi.de Tue Oct 14 14:11:45 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Tue, 14 Oct 2003 20:11:45 +0200 Subject: [SciPy-user] Re: io.loadmat In-Reply-To: <20031014130307.94F2B9C18F@barf.ugcs.caltech.edu> Message-ID: <20031014181346.0F49B3EB27@www.scipy.com> ------------------- In article <3F8BC1BC.85CED77F__30309.4456474811$1066119230 at mecha.uni-stuttgart.de> you wrote: [snip] > from scipy import * > file=open("matrizen_red.mat",'r') > io.loadmat(file,appendmat=0) > > raceback (most recent call last): > File "matlab.py", line 3, in ? > io.loadmat(file,appendmat=0) > File "/usr/local/lib/python2.1/site-packages/scipy/io/mio.py", line > 348, in loadmat > if os.sep in name: > TypeError: 'in' or 'not in' needs sequence right argument Use the filename as the first argument, not an open file object. -- Robert Kern kern at ugcs.caltech.edu The next problem is the version file format of my Matlab file. Traceback (most recent call last): File "iotest.py", line 2, in ? io.loadmat("matrizen_red.mat",appendmat=0) File "/usr/lib/python2.2/site-packages/scipy/io/mio.py", line 372, in loadmat raise ValueError, "Version 5.0 file format not supported." ValueError: Version 5.0 file format not supported. Any idea to circumvent this problem ? Nils "In the fields of Hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter _______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user From kern at ugcs.caltech.edu Tue Oct 14 14:39:25 2003 From: kern at ugcs.caltech.edu (Robert Kern) Date: Tue, 14 Oct 2003 11:39:25 -0700 Subject: [SciPy-user] Re: io.loadmat In-Reply-To: <20031014181346.0F49B3EB27@www.scipy.com> References: <20031014130307.94F2B9C18F@barf.ugcs.caltech.edu> <20031014181346.0F49B3EB27@www.scipy.com> Message-ID: <20031014183924.GA9432@schilling.ugcs.caltech.edu> On Tue, Oct 14, 2003 at 08:11:45PM +0200, Nils Wagner wrote: [snip] > The next problem is the version file format of my Matlab file. > > Traceback (most recent call last): > File "iotest.py", line 2, in ? > io.loadmat("matrizen_red.mat",appendmat=0) > File "/usr/lib/python2.2/site-packages/scipy/io/mio.py", line 372, in loadmat > raise ValueError, "Version 5.0 file format not supported." > ValueError: Version 5.0 file format not supported. > > Any idea to circumvent this problem ? Grab http://www.mathworks.com/access/helpdesk/help/pdf_doc/matlab/matfile_format.pdf and add support for Version 5.0 files. > Nils -- Robert Kern kern at ugcs.caltech.edu "In the fields of Hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From wagner.nils at vdi.de Tue Oct 14 14:50:05 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Tue, 14 Oct 2003 20:50:05 +0200 Subject: [SciPy-user] Re: io.loadmat In-Reply-To: <20031014183924.GA9432@schilling.ugcs.caltech.edu> Message-ID: <20031014185205.796AE3EB27@www.scipy.com> ------------------- On Tue, Oct 14, 2003 at 08:11:45PM +0200, Nils Wagner wrote: [snip] > The next problem is the version file format of my Matlab file. > > Traceback (most recent call last): > File "iotest.py", line 2, in ? > io.loadmat("matrizen_red.mat",appendmat=0) > File "/usr/lib/python2.2/site-packages/scipy/io/mio.py", line 372, in loadmat > raise ValueError, "Version 5.0 file format not supported." > ValueError: Version 5.0 file format not supported. > > Any idea to circumvent this problem ? Grab http://www.mathworks.com/access/helpdesk/help/pdf_doc/matlab/matfile_format.pdf and add support for Version 5.0 files. > Nils Robert, Thank you for your note. However, I am not the right person for doing that. It would be great if someone else can do that. Thanks in advance. Nils -- Robert Kern kern at ugcs.caltech.edu "In the fields of Hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter _______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user From jdhunter at ace.bsd.uchicago.edu Tue Oct 14 16:31:18 2003 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Tue, 14 Oct 2003 15:31:18 -0500 Subject: [SciPy-user] Re: io.loadmat In-Reply-To: <20031014185205.796AE3EB27@www.scipy.com> (Nils Wagner's message of "Tue, 14 Oct 2003 20:50:05 +0200") References: <20031014185205.796AE3EB27@www.scipy.com> Message-ID: >>>>> "Nils" == Nils Wagner writes: Nils> Thank you for your note. However, I am not the right person Nils> for doing that. It would be great if someone else can do Nils> that. Thanks in advance. Have you seen matfile? 
It will load matlab version 5 into Numeric arrays See http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&oe=UTF-8&safe=off&threadm=m3g09ck23n.fsf%40nmw-office.ion.le.ac.uk&rnum=2&prev=/groups%3Fnum%3D20%26hl%3Den%26lr%3D%26ie%3DUTF-8%26oe%3DUTF-8%26safe%3Doff%26q%3Dmatfile%2Bpython%26sa%3DN%26tab%3Dwg and the src at ftp://ion.le.ac.uk/matfile/matfile.tar.gz Hope this helps, John Hunter From oliphant at ee.byu.edu Tue Oct 14 17:24:36 2003 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 14 Oct 2003 15:24:36 -0600 Subject: [SciPy-user] io.loadmat In-Reply-To: <3F8BC1BC.85CED77F@mecha.uni-stuttgart.de> References: <3F8BC1BC.85CED77F@mecha.uni-stuttgart.de> Message-ID: <3F8C6994.9050700@ee.byu.edu> Loadmat will only read mat files saved using the MATLAB version 4 file format. You can save mat files in MATLAB in this format using the -v4 switch when you enter the save command (do a help save in MATLAB for information). There is no reason we couldn't read the MATLAB version 5 format --- it's documented on the web. But, it is significantly more complicated and nobody has spent the time to write it. Perhaps you could write it. -Travis From ferrell at diablotech.com Tue Oct 14 19:06:44 2003 From: ferrell at diablotech.com (Robert Ferrell) Date: Tue, 14 Oct 2003 23:06:44 -0000 (GMT) Subject: [SciPy-user] compute wave speed Message-ID: <1345.128.165.144.120.1066172804.squirrel@webmail.pair.com> Can anyone suggest an algorithm, or SciPy code, that can be used to calculate wave speed? I have a scalar field that is a function of space and time. In practice, the data is a list of Numeric arrays, one array for each time. The data may be 1, 2 or 3 dimensional. There is a perturbance moving through the scalar field. I want to compute how fast the perturbance is moving. I have this notion that this must be something that signal processing folks know how to do, although I have no justification for that notion. In any case, does anyone have a suggestion for how to compute the speed (and direction?) of the wave? Thanks, -robert From astraw at insightscientific.com Tue Oct 14 20:11:22 2003 From: astraw at insightscientific.com (Andrew Straw) Date: Wed, 15 Oct 2003 09:41:22 +0930 Subject: [SciPy-user] Re: io.loadmat In-Reply-To: References: <20031014185205.796AE3EB27@www.scipy.com> Message-ID: <3F8C90AA.8070609@insightscientific.com> John Hunter wrote: >>>>>>"Nils" == Nils Wagner writes: > > > Nils> Thank you for your note. However, I am not the right person > Nils> for doing that. It would be great if someone else can do > Nils> that. Thanks in advance. > > > Have you seen matfile? It will load matlab version 5 into Numeric > arrays > > See > http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&oe=UTF-8&safe=off&threadm=m3g09ck23n.fsf%40nmw-office.ion.le.ac.uk&rnum=2&prev=/groups%3Fnum%3D20%26hl%3Den%26lr%3D%26ie%3DUTF-8%26oe%3DUTF-8%26safe%3Doff%26q%3Dmatfile%2Bpython%26sa%3DN%26tab%3Dwg > > and the src at ftp://ion.le.ac.uk/matfile/matfile.tar.gz Cool! I hadn't seen that before, and it seems to work on the simple test case I tried. I made a setup script rather than fiddling with the Makefile. This seems like a useful enough utility that it should get wider exposure. Too bad it can't be incorporated into scipy as is (GPL vs. BSD license issues...). 
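For reference, once the module is built and installed with the script below (the usual python setup.py build followed by python setup.py install), using it is just import-and-load. A minimal sketch -- the file name and variable name here are placeholders, not from a real test:

import matfile

d = matfile.load('foo.mat')    # returns a dict mapping MATLAB variable names to Numeric arrays
M = d['M']                     # pick out one variable by name
print M.shape

The setup script itself: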
#!/usr/bin/env python """Setup script for matfile """ from distutils.core import setup, Extension import distutils.sysconfig import os numpy_inc_dir = os.path.join(distutils.sysconfig.get_python_inc(),'Numeric') matfile_dir = 'matfile' # off current directory matfile_srcs = ['mat_open.c', 'mat_read_array.c', 'mat_copy_data.c', 'mat_free_array.c', 'mat_read_array4.c', 'mat_write_array4.c', 'mat_read_array5.c', 'mat_subarray5.c', 'mat_utils5.c'] matfile_srcs = [ os.path.join(matfile_dir,src) for src in matfile_srcs ] setup(name="matfile", description = "MATLAB .mat file reader for Python", author = "Nigel Wade", author_email = "nmw at ion.le.ac.uk", license = "GNU GPL", ext_modules = [Extension(name='matfile', sources=['matfilemodule.c']+matfile_srcs, include_dirs=[matfile_dir, numpy_inc_dir])]) -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 155 bytes Desc: not available URL: From fperez at colorado.edu Tue Oct 14 20:16:45 2003 From: fperez at colorado.edu (Fernando Perez) Date: Tue, 14 Oct 2003 18:16:45 -0600 Subject: [SciPy-user] Re: io.loadmat In-Reply-To: <3F8C90AA.8070609@insightscientific.com> References: <20031014185205.796AE3EB27@www.scipy.com> <3F8C90AA.8070609@insightscientific.com> Message-ID: <3F8C91ED.5040301@colorado.edu> Andrew Straw wrote: > Cool! I hadn't seen that before, and it seems to work on the simple > test case I tried. I made a setup script rather than fiddling with the > Makefile. This seems like a useful enough utility that it should get > wider exposure. Too bad it can't be incorporated into scipy as is (GPL > vs. BSD license issues...). Mmh, not necessarily. Perhaps the author would be amenable to a license change for inclusion in scipy? It wouldn't hurt to ask nicely, the worst that happens is that he says no. Since there seem to be a lot of matlab users out there (I'm not one of them), this may be worth pursuing by some. Cheers, f. From nwagner at mecha.uni-stuttgart.de Wed Oct 15 04:46:28 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 15 Oct 2003 10:46:28 +0200 Subject: [SciPy-user] Re: io.loadmat References: <20031014185205.796AE3EB27@www.scipy.com> <3F8C90AA.8070609@insightscientific.com> Message-ID: <3F8D0964.D14EBF8E@mecha.uni-stuttgart.de> Andrew Straw schrieb: > > John Hunter wrote: > >>>>>>"Nils" == Nils Wagner writes: > > > > > > Nils> Thank you for your note. However, I am not the right person > > Nils> for doing that. It would be great if someone else can do > > Nils> that. Thanks in advance. > > > > > > Have you seen matfile? It will load matlab version 5 into Numeric > > arrays > > > > See > > http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&oe=UTF-8&safe=off&threadm=m3g09ck23n.fsf%40nmw-office.ion.le.ac.uk&rnum=2&prev=/groups%3Fnum%3D20%26hl%3Den%26lr%3D%26ie%3DUTF-8%26oe%3DUTF-8%26safe%3Doff%26q%3Dmatfile%2Bpython%26sa%3DN%26tab%3Dwg > > > > and the src at ftp://ion.le.ac.uk/matfile/matfile.tar.gz > > Cool! I hadn't seen that before, and it seems to work on the simple > test case I tried. I made a setup script rather than fiddling with the > Makefile. This seems like a useful enough utility that it should get > wider exposure. Too bad it can't be incorporated into scipy as is (GPL > vs. BSD license issues...). 
> > #!/usr/bin/env python > """Setup script for matfile > """ > > from distutils.core import setup, Extension > import distutils.sysconfig > import os > > numpy_inc_dir = os.path.join(distutils.sysconfig.get_python_inc(),'Numeric') > matfile_dir = 'matfile' # off current directory > > matfile_srcs = ['mat_open.c', 'mat_read_array.c', 'mat_copy_data.c', > 'mat_free_array.c', 'mat_read_array4.c', 'mat_write_array4.c', > 'mat_read_array5.c', 'mat_subarray5.c', 'mat_utils5.c'] > matfile_srcs = [ os.path.join(matfile_dir,src) for src in matfile_srcs ] > > setup(name="matfile", > description = "MATLAB .mat file reader for Python", > author = "Nigel Wade", > author_email = "nmw at ion.le.ac.uk", > license = "GNU GPL", > ext_modules = [Extension(name='matfile', > sources=['matfilemodule.c']+matfile_srcs, > include_dirs=[matfile_dir, > numpy_inc_dir])]) > I have modified the Makefiles for SuSE8.2 (see the attachment). Finally, I was able to build the module. How about the location of the modules ? Shall I copy this stuff to /usr/local/lib/python2.1/site-packages/matfile ? How do I use the module in scipy ? Sorry, I am a newbie to these things. In ~/mysoftware/matfile I have -rw-r--r-- 1 wagner users 1358 2003-10-15 10:20 Makefile -rw-r--r-- 1 wagner users 1358 2003-10-15 10:18 Makefile.ori lrwxrwxrwx 1 wagner users 15 2003-10-15 10:34 libmatfile.so -> libmatfile.so.0 -rwxr-xr-x 1 wagner users 87247 2003-10-15 10:34 libmatfile.so.0 drwxr-xr-x 2 wagner users 806 2003-10-15 10:34 matfile -rw-r--r-- 1 wagner users 16376 2001-09-24 17:34 matfilemodule.c -rw-r--r-- 1 wagner users 8692 2003-10-15 10:34 matfilemodule.o lrwxrwxrwx 1 wagner users 18 2003-10-15 10:34 matfilemodule.so -> matfilemodule.so.0 -rwxr-xr-x 1 wagner users 1620346 2003-10-15 10:34 matfilemodule.so.0 Any suggestion ? Nils > ------------------------------------------------------------------------ > Part 1.1.2Type: application/pgp-signature > > Part 1.2 Type: Plain Text (text/plain) > Encoding: 7bit -------------- next part -------------- TOP=. PYTHON_INCLUDE = /usr/local/include/python2.1 PYTHON_LIBDIR = /usr/local/lib/python2.1/config PYTHON_LIBS = -lpython2.1 -ldb -lutil -lpthread -lnsl -ldl -lm VERSION=0 SHARED_DSO=-shared DSO_NAME=-Wl,-h,$(@F) DSO_SYMBOLIC= LINK_TYPE=dynamic SRCS = matfilemodule.c OBJS = $(SRCS:%.c=%.o) CFLAGS += -I$(PYTHON_INCLUDE) -I$(PYTHON_INCLUDE)/Numeric -Imatfile LIBS += -L$(PYTHON_LIBDIR) $(PYTHON_LIBS) -L. -lmatfile LIBNAME=matfilemodule DIR = . LIB = $(DIR)/$(LIBNAME) LIB.so = $(LIB).so lib$(LIBNAME) : matfile $(LINK_TYPE) dynamic : $(LIB.so) matfile : FORCE (cd matfile; $(MAKE)) $(LIB.so) : $(LIB.so).$(VERSION) @(if [ -h $@ -o -f $@ ]; then /bin/rm $@; fi; \ if [ -f $? ]; then ln -s $(?F) $@; fi; \ exit 0) $(LIB.so).$(VERSION) : $(OBJS) $(LINK.c) $(SHARED_DSO) $(DSO_SYMBOLIC) $(DSO_NAME) -o $@ $(OBJS) $(LIBS) depend : FORCE @sed -n -e '/^# DO NOT DELETE THIS LINE -- make depend depends on it./q;p' Makefile > Makefile.new @echo "# DO NOT DELETE THIS LINE -- make depend depends on it." >> Makefile.new @$(COMPILE.c) $(DEPEND) $(OPTIONS) $(INCLUDE_FLAGS) $(SRCS) >> Makefile.new @cp Makefile Makefile.bak @cp Makefile.new Makefile clean : FORCE /bin/rm -f $(OBJS) veryclean veryveryclean: clean /bin/rm -f $(LIB.so) $(LIB.so).$(VERSION) $(LIB.a) FORCE: # DO NOT DELETE THIS LINE -- make depend depends on it. -------------- next part -------------- TOP=.. 
VERSION=0 SHARED_DSO=-shared DSO_NAME=-Wl,-h,$(@F) DSO_SYMBOLIC= LINK_TYPE=dynamic MAKEDEPEND=makedepend -- $(CFLAGS) -- $(INCLUDE_FLAGS) $(OPTIONS) LIBNAME=matfile DIR = .. LIB = $(DIR)/lib$(LIBNAME) LIB.so = $(LIB).so LIB.a = $(LIB).a SRCS = mat_open.c mat_read_array.c mat_copy_data.c mat_free_array.c \ mat_read_array4.c mat_write_array4.c \ mat_read_array5.c mat_subarray5.c mat_utils5.c OBJS = $(SRCS:%.c=%.o) lib$(LIBNAME) : $(LINK_TYPE) dynamic : $(LIB.so) static : $(LIB.a) $(LIB.a) : $(LIB.a)($(OBJS)) $(LIB.so) : $(LIB.so).$(VERSION) @(if [ -h $@ -o -f $@ ]; then /bin/rm $@; fi; \ if [ -f $? ]; then ln -s $(?F) $@; fi; \ exit 0) $(LIB.so).$(VERSION) : $(OBJS) $(LINK.c) $(SHARED_DSO) $(DSO_SYMBOLIC) $(DSO_NAME) -o $@ $(OBJS) $(RUNPATH) $(LIB.a)(%.o) : %.o ar -rs $@ $< @$(RANLIB) $@ %.o : %.c $(COMPILE.c) $< install : install_$(LINK_TYPE) install_static : FORCE @$cp $(LIB.a) $(LIBDIR) install_dynamic : FORCE @cp $(LIB.so).$(VERSION) $(LIBDIR) @(cd $(LIBDIR); \ if [ -h lib$(LIBNAME).so -o -f lib$(LIBNAME).so ]; then /bin/rm lib$(LIBNAME).so; fi; \ if [ -f lib$(LIBNAME).so.$(VERSION) ]; then ln -s lib$(LIBNAME).so.$(VERSION) lib$(LIBNAME).so; fi) depend : FORCE $(MAKEDEPEND) $(SRCS) clean : FORCE /bin/rm -f $(OBJS) veryclean veryveryclean: clean /bin/rm -f $(LIB.so) $(LIB.so).$(VERSION) $(LIB.a) FORCE: # DO NOT DELETE THIS LINE -- make depend depends on it. mat_open.o: /usr/include/stdlib.h /usr/include/features.h mat_open.o: /usr/include/sys/cdefs.h /usr/include/gnu/stubs.h #mat_open.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stddef.h mat_open.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stddef.h mat_open.o: /usr/include/sys/types.h /usr/include/bits/types.h mat_open.o: /usr/include/time.h /usr/include/endian.h mat_open.o: /usr/include/bits/endian.h /usr/include/sys/select.h mat_open.o: /usr/include/bits/select.h /usr/include/bits/sigset.h mat_open.o: /usr/include/sys/sysmacros.h /usr/include/alloca.h mat_open.o: /usr/include/stdio.h #mat_open.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stdarg.h mat_open.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stdarg.h mat_open.o: /usr/include/libio.h /usr/include/_G_config.h mat_open.o: /usr/include/bits/stdio_lim.h /usr/include/limits.h mat_open.o: /usr/include/bits/posix1_lim.h /usr/include/bits/local_lim.h mat_open.o: /usr/include/linux/limits.h /usr/include/bits/posix2_lim.h mat_open.o: /usr/include/ctype.h matfile.h mat_read_array.o: /usr/include/stdlib.h /usr/include/features.h mat_read_array.o: /usr/include/sys/cdefs.h /usr/include/gnu/stubs.h #mat_read_array.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stddef.h mat_read_array.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stddef.h mat_read_array.o: /usr/include/sys/types.h /usr/include/bits/types.h mat_read_array.o: /usr/include/time.h /usr/include/endian.h mat_read_array.o: /usr/include/bits/endian.h /usr/include/sys/select.h mat_read_array.o: /usr/include/bits/select.h /usr/include/bits/sigset.h mat_read_array.o: /usr/include/sys/sysmacros.h /usr/include/alloca.h mat_read_array.o: /usr/include/stdio.h #mat_read_array.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stdarg.h mat_read_array.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stdarg.h mat_read_array.o: /usr/include/libio.h /usr/include/_G_config.h mat_read_array.o: /usr/include/bits/stdio_lim.h /usr/include/assert.h mat_read_array.o: /usr/include/limits.h /usr/include/bits/posix1_lim.h mat_read_array.o: /usr/include/bits/local_lim.h /usr/include/linux/limits.h 
mat_read_array.o: /usr/include/bits/posix2_lim.h matfile.h mat_copy_data.o: /usr/include/stdlib.h /usr/include/features.h mat_copy_data.o: /usr/include/sys/cdefs.h /usr/include/gnu/stubs.h #mat_copy_data.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stddef.h mat_copy_data.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stddef.h mat_copy_data.o: /usr/include/sys/types.h /usr/include/bits/types.h mat_copy_data.o: /usr/include/time.h /usr/include/endian.h mat_copy_data.o: /usr/include/bits/endian.h /usr/include/sys/select.h mat_copy_data.o: /usr/include/bits/select.h /usr/include/bits/sigset.h mat_copy_data.o: /usr/include/sys/sysmacros.h /usr/include/alloca.h matfile.h mat_copy_data.o: /usr/include/stdio.h #mat_copy_data.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stdarg.h mat_copy_data.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stdarg.h mat_copy_data.o: /usr/include/libio.h /usr/include/_G_config.h mat_copy_data.o: /usr/include/bits/stdio_lim.h mat_free_array.o: matfile.h /usr/include/stdio.h /usr/include/features.h mat_free_array.o: /usr/include/sys/cdefs.h /usr/include/gnu/stubs.h #mat_free_array.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stddef.h mat_free_array.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stddef.h #mat_free_array.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stdarg.h mat_free_array.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stdarg.h mat_free_array.o: /usr/include/bits/types.h /usr/include/libio.h mat_free_array.o: /usr/include/_G_config.h /usr/include/bits/stdio_lim.h mat_read_array4.o: /usr/include/stdlib.h /usr/include/features.h mat_read_array4.o: /usr/include/sys/cdefs.h /usr/include/gnu/stubs.h #mat_read_array4.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stddef.h mat_read_array4.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stddef.h mat_read_array4.o: /usr/include/sys/types.h /usr/include/bits/types.h mat_read_array4.o: /usr/include/time.h /usr/include/endian.h mat_read_array4.o: /usr/include/bits/endian.h /usr/include/sys/select.h mat_read_array4.o: /usr/include/bits/select.h /usr/include/bits/sigset.h mat_read_array4.o: /usr/include/sys/sysmacros.h /usr/include/alloca.h mat_read_array4.o: /usr/include/stdio.h #mat_read_array4.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stdarg.h mat_read_array4.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stdarg.h mat_read_array4.o: /usr/include/libio.h /usr/include/_G_config.h mat_read_array4.o: /usr/include/bits/stdio_lim.h /usr/include/assert.h mat_read_array4.o: /usr/include/limits.h /usr/include/bits/posix1_lim.h mat_read_array4.o: /usr/include/bits/local_lim.h /usr/include/linux/limits.h mat_read_array4.o: /usr/include/bits/posix2_lim.h matfile.h mat_write_array4.o: /usr/include/stdio.h /usr/include/features.h mat_write_array4.o: /usr/include/sys/cdefs.h /usr/include/gnu/stubs.h #mat_write_array4.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stddef.h mat_write_array4.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stddef.h #mat_write_array4.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stdarg.h mat_write_array4.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stdarg.h mat_write_array4.o: /usr/include/bits/types.h /usr/include/libio.h mat_write_array4.o: /usr/include/_G_config.h /usr/include/bits/stdio_lim.h mat_write_array4.o: /usr/include/stdlib.h /usr/include/sys/types.h mat_write_array4.o: /usr/include/time.h /usr/include/endian.h mat_write_array4.o: /usr/include/bits/endian.h 
/usr/include/sys/select.h mat_write_array4.o: /usr/include/bits/select.h /usr/include/bits/sigset.h mat_write_array4.o: /usr/include/sys/sysmacros.h /usr/include/alloca.h mat_write_array4.o: matfile.h mat_read_array5.o: /usr/include/stdlib.h /usr/include/features.h mat_read_array5.o: /usr/include/sys/cdefs.h /usr/include/gnu/stubs.h #mat_read_array5.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stddef.h mat_read_array5.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stddef.h mat_read_array5.o: /usr/include/sys/types.h /usr/include/bits/types.h mat_read_array5.o: /usr/include/time.h /usr/include/endian.h mat_read_array5.o: /usr/include/bits/endian.h /usr/include/sys/select.h mat_read_array5.o: /usr/include/bits/select.h /usr/include/bits/sigset.h mat_read_array5.o: /usr/include/sys/sysmacros.h /usr/include/alloca.h mat_read_array5.o: /usr/include/stdio.h #mat_read_array5.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stdarg.h mat_read_array5.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stdarg.h mat_read_array5.o: /usr/include/libio.h /usr/include/_G_config.h mat_read_array5.o: /usr/include/bits/stdio_lim.h /usr/include/assert.h mat_read_array5.o: /usr/include/limits.h /usr/include/bits/posix1_lim.h mat_read_array5.o: /usr/include/bits/local_lim.h /usr/include/linux/limits.h mat_read_array5.o: /usr/include/bits/posix2_lim.h matfile.h mat_subarray5.o: /usr/include/stdlib.h /usr/include/features.h mat_subarray5.o: /usr/include/sys/cdefs.h /usr/include/gnu/stubs.h #mat_subarray5.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stddef.h mat_subarray5.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stddef.h mat_subarray5.o: /usr/include/sys/types.h /usr/include/bits/types.h mat_subarray5.o: /usr/include/time.h /usr/include/endian.h mat_subarray5.o: /usr/include/bits/endian.h /usr/include/sys/select.h mat_subarray5.o: /usr/include/bits/select.h /usr/include/bits/sigset.h mat_subarray5.o: /usr/include/sys/sysmacros.h /usr/include/alloca.h mat_subarray5.o: /usr/include/stdio.h #mat_subarray5.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stdarg.h mat_subarray5.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stdarg.h mat_subarray5.o: /usr/include/libio.h /usr/include/_G_config.h mat_subarray5.o: /usr/include/bits/stdio_lim.h /usr/include/assert.h mat_subarray5.o: /usr/include/limits.h /usr/include/bits/posix1_lim.h mat_subarray5.o: /usr/include/bits/local_lim.h /usr/include/linux/limits.h mat_subarray5.o: /usr/include/bits/posix2_lim.h matfile.h mat_utils5.o: /usr/include/limits.h /usr/include/features.h mat_utils5.o: /usr/include/sys/cdefs.h /usr/include/gnu/stubs.h mat_utils5.o: /usr/include/bits/posix1_lim.h /usr/include/bits/local_lim.h mat_utils5.o: /usr/include/linux/limits.h /usr/include/bits/posix2_lim.h mat_utils5.o: /usr/include/stdlib.h #mat_utils5.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stddef.h mat_utils5.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stddef.h mat_utils5.o: /usr/include/sys/types.h /usr/include/bits/types.h mat_utils5.o: /usr/include/time.h /usr/include/endian.h mat_utils5.o: /usr/include/bits/endian.h /usr/include/sys/select.h mat_utils5.o: /usr/include/bits/select.h /usr/include/bits/sigset.h mat_utils5.o: /usr/include/sys/sysmacros.h /usr/include/alloca.h mat_utils5.o: /usr/include/string.h matfile.h /usr/include/stdio.h #mat_utils5.o: /usr/lib/gcc-lib/i386-redhat-linux/egcs-2.91.66/include/stdarg.h mat_utils5.o: /usr/lib/gcc-lib/i486-suse-linux/3.3/include/stdarg.h mat_utils5.o: 
/usr/include/libio.h /usr/include/_G_config.h mat_utils5.o: /usr/include/bits/stdio_lim.h From nwagner at mecha.uni-stuttgart.de Wed Oct 15 04:55:08 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 15 Oct 2003 10:55:08 +0200 Subject: [SciPy-user] Re: io.loadmat References: <20031014185205.796AE3EB27@www.scipy.com> <3F8C90AA.8070609@insightscientific.com> <3F8C91ED.5040301@colorado.edu> Message-ID: <3F8D0B6C.F662B336@mecha.uni-stuttgart.de> Fernando Perez schrieb: > > Andrew Straw wrote: > > > Cool! I hadn't seen that before, and it seems to work on the simple > > test case I tried. I made a setup script rather than fiddling with the > > Makefile. This seems like a useful enough utility that it should get > > wider exposure. Too bad it can't be incorporated into scipy as is (GPL > > vs. BSD license issues...). > > Mmh, not necessarily. Perhaps the author would be amenable to a license > change for inclusion in scipy? It wouldn't hurt to ask nicely, the worst that > happens is that he says no. > > Since there seem to be a lot of matlab users out there (I'm not one of them), > this may be worth pursuing by some. > > Cheers, > > f. > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user Fernando, Did you ask the author --> http://ion.le.ac.uk/admin/nigel.html already ? It would be great to have this feature in scipy. I am also interested in an interface to NASTRAN files - there is some progress in this direction by Jose Fonseca http://jrfonseca.dyndns.org/work/phd/ Nils From nwagner at mecha.uni-stuttgart.de Wed Oct 15 05:20:03 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 15 Oct 2003 11:20:03 +0200 Subject: [SciPy-user] Re: io.loadmat References: <20031014185205.796AE3EB27@www.scipy.com> <3F8C90AA.8070609@insightscientific.com> Message-ID: <3F8D1143.6825A2CC@mecha.uni-stuttgart.de> Andrew Straw schrieb: > > John Hunter wrote: > >>>>>>"Nils" == Nils Wagner writes: > > > > > > Nils> Thank you for your note. However, I am not the right person > > Nils> for doing that. It would be great if someone else can do > > Nils> that. Thanks in advance. > > > > > > Have you seen matfile? It will load matlab version 5 into Numeric > > arrays > > > > See > > http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&oe=UTF-8&safe=off&threadm=m3g09ck23n.fsf%40nmw-office.ion.le.ac.uk&rnum=2&prev=/groups%3Fnum%3D20%26hl%3Den%26lr%3D%26ie%3DUTF-8%26oe%3DUTF-8%26safe%3Doff%26q%3Dmatfile%2Bpython%26sa%3DN%26tab%3Dwg > > > > and the src at ftp://ion.le.ac.uk/matfile/matfile.tar.gz > > Cool! I hadn't seen that before, and it seems to work on the simple > test case I tried. I made a setup script rather than fiddling with the > Makefile. This seems like a useful enough utility that it should get > wider exposure. Too bad it can't be incorporated into scipy as is (GPL > vs. BSD license issues...). 
> > #!/usr/bin/env python > """Setup script for matfile > """ > > from distutils.core import setup, Extension > import distutils.sysconfig > import os > > numpy_inc_dir = os.path.join(distutils.sysconfig.get_python_inc(),'Numeric') > matfile_dir = 'matfile' # off current directory > > matfile_srcs = ['mat_open.c', 'mat_read_array.c', 'mat_copy_data.c', > 'mat_free_array.c', 'mat_read_array4.c', 'mat_write_array4.c', > 'mat_read_array5.c', 'mat_subarray5.c', 'mat_utils5.c'] > matfile_srcs = [ os.path.join(matfile_dir,src) for src in matfile_srcs ] > > setup(name="matfile", > description = "MATLAB .mat file reader for Python", > author = "Nigel Wade", > author_email = "nmw at ion.le.ac.uk", > license = "GNU GPL", > ext_modules = [Extension(name='matfile', > sources=['matfilemodule.c']+matfile_srcs, > include_dirs=[matfile_dir, > numpy_inc_dir])]) > Andrew, I have used your setup script. However, I get into trouble from scipy import * import matfile load("matrizen_red.mat") #io.loadmat("matrizen_red.mat",appendmat=0) Traceback (most recent call last): File "matlab.py", line 3, in ? load("matrizen_red.mat") File "/usr/local/lib/python2.1/site-packages/Numeric/Numeric.py", line 506, in load return Unpickler(file).load() File "/usr/local/lib/python2.1/pickle.py", line 554, in __init__ self.readline = file.readline AttributeError: readline Any suggestion ? Nils > ------------------------------------------------------------------------ > Part 1.1.2Type: application/pgp-signature > > Part 1.2 Type: Plain Text (text/plain) > Encoding: 7bit From prabhu at aero.iitm.ernet.in Wed Oct 15 04:40:53 2003 From: prabhu at aero.iitm.ernet.in (Prabhu Ramachandran) Date: Wed, 15 Oct 2003 14:10:53 +0530 Subject: [SciPy-user] Re: io.loadmat In-Reply-To: <3F8D1143.6825A2CC@mecha.uni-stuttgart.de> References: <20031014185205.796AE3EB27@www.scipy.com> <3F8C90AA.8070609@insightscientific.com> <3F8D1143.6825A2CC@mecha.uni-stuttgart.de> Message-ID: <16269.2069.137680.912676@monster.linux.in> >>>>> "NW" == Nils Wagner writes: NW> I have used your setup script. However, I get into trouble NW> from scipy import * NW> import matfile NM> load("matrizen_red.mat") NW> #io.loadmat("matrizen_red.mat",appendmat=0) NW> Traceback (most recent call last): NW> File "matlab.py", line 3, in ? NW> load("matrizen_red.mat") NW> File NW> "/usr/local/lib/python2.1/site-packages/Numeric/Numeric.py", NW> line NW> 506, in load NW> return Unpickler(file).load() NW> File "/usr/local/lib/python2.1/pickle.py", line 554, in NW> __init__ NW> self.readline = file.readline NW> AttributeError: readline NW> Any suggestion ? Well, sorry to sound rude but I think it helps if you learn to read exceptions and interpret them if you are serious about using Python. The above exception says that file.readline does not exist since you called: load("matrizen_red.mat") If you look at the relevant lines quoted in the traceback, you will immediately see that you invoked Numeric's load function (and not matfile.load!). IIRC you ran into this problem a few days back and Robert replied with the answer. Now the exception goes on to say that Numeric's load is calling pickle.py's __init__ member function and that function is trying to find an attribute called readline in the file object you allegedly passed. Evidently, the object you passed has no readline method. A file object in Python is supposed to have a readline method. Now you did not pass a file object but a string which is not a file. So the thing to do is to pass load() a file object. 
load(open("matrizen_red.mat")) will do the job. My guess is that this is not what you want. Perhaps you wanted to do matfile.load(...)? In any case you might seriously want to consider reading the Python tutorial if you are totally new to Python. The basics should take less than an afternoon to read and you should be more comfortable with all of the above. cheers, prabhu From nwagner at mecha.uni-stuttgart.de Wed Oct 15 08:33:37 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 15 Oct 2003 14:33:37 +0200 Subject: [SciPy-user] io.loadmat References: <3F8BC1BC.85CED77F@mecha.uni-stuttgart.de> <3F8C6994.9050700@ee.byu.edu> Message-ID: <3F8D3EA1.F96BB79@mecha.uni-stuttgart.de> Travis Oliphant schrieb: > > Loadmat will only read mat files saved using the MATLAB version 4 file > format. You can save mat files in MATLAB in this format using the -v4 > switch when you enter the save command (do a help save in MATLAB for > information). There is no reason we couldn't read the MATLAB version 5 > format --- it's documented on the web. But, it is significantly more > complicated and nobody has spent the time to write it. Perhaps you > could write it. > > -Travis > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user Travis, Thank you for your workaround. However, the current version of loadmat cannot handle sparse matrices ? Is there any progress in importing sparse matrices ? Traceback (most recent call last): File "matlab.py", line 4, in ? io.loadmat("k00.mat",appendmat=0) File "/usr/local/lib/python2.1/site-packages/scipy/io/mio.py", line 423, in loadmat raise ValueError, "Cannot handle sparse matrices, yet." ValueError: Cannot handle sparse matrices, yet. So I guess, I have to use k00=full(k) in Matlab before I use save -mat -v4 k00 Nils From bryan.cole at teraview.com Wed Oct 15 07:35:19 2003 From: bryan.cole at teraview.com (bryan cole) Date: Wed, 15 Oct 2003 12:35:19 +0100 Subject: [SciPy-user] vector-product In-Reply-To: <3F8D1143.6825A2CC@mecha.uni-stuttgart.de> References: <20031014185205.796AE3EB27@www.scipy.com> <3F8C90AA.8070609@insightscientific.com> <3F8D1143.6825A2CC@mecha.uni-stuttgart.de> Message-ID: <1066217719.32344.1.camel@bryan.teraviewhq.local> Does SciPy have a function for calculating the cross-product of two vectors? (a 'convenience' function) cheers, Bryan -- Bryan Cole Teraview Ltd., 302-304 Cambridge Science Park, Milton Road, Cambridge CB4 0WG, United Kingdom. tel: +44 (1223) 435380 / 435386 (direct-dial) fax: +44 (1223) 435382 From nwagner at mecha.uni-stuttgart.de Wed Oct 15 09:16:36 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 15 Oct 2003 15:16:36 +0200 Subject: [SciPy-user] Cannot handle sparse matrices, yet. Message-ID: <3F8D48B4.16CBE6C5@mecha.uni-stuttgart.de> Dear experts, At the beginning I have used k0=full(k00) to convert the sparse matrix k00 to a full matrix k0. Then I have used the -v4 option in Matlab to save the full matrix save -mat -v4 k0 However, I cannot import the f u l l matrix k0 in scioy Traceback (most recent call last): File "matlab.py", line 4, in ? io.loadmat("k0",appendmat=0) File "/usr/local/lib/python2.1/site-packages/scipy/io/mio.py", line 423, in loadmat raise ValueError, "Cannot handle sparse matrices, yet." ValueError: Cannot handle sparse matrices, yet. For what reason ? 
Nils From arnd.baecker at web.de Wed Oct 15 08:12:06 2003 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 15 Oct 2003 14:12:06 +0200 (CEST) Subject: [SciPy-user] Cannot handle sparse matrices, yet. In-Reply-To: <3F8D48B4.16CBE6C5@mecha.uni-stuttgart.de> References: <3F8D48B4.16CBE6C5@mecha.uni-stuttgart.de> Message-ID: Hi Nils, though not feeling as an expert on this, may I still ask if you are 100% sure that k0 is really stored as a full matrix? Somehow it seems that the stored k0 still "knows" that it came from a sparse matrix. Does save -mat -v4 k0 lead to a resonably larger file than save -mat -v4 k00 ? Is the shape of k0 the right one? What happens if you assign a new matrix k1=IdentityMatrix * k0 and store that ? Presumably you this kind of tests already ... Arnd On Wed, 15 Oct 2003, Nils Wagner wrote: > Dear experts, > > At the beginning I have used > > k0=full(k00) > > to convert the sparse matrix k00 to a full matrix k0. > Then I have used the -v4 option in Matlab to save the full matrix > > save -mat -v4 k0 > > However, I cannot import the f u l l matrix k0 in scioy > > Traceback (most recent call last): > File "matlab.py", line 4, in ? > io.loadmat("k0",appendmat=0) > File "/usr/local/lib/python2.1/site-packages/scipy/io/mio.py", line > 423, in loadmat > raise ValueError, "Cannot handle sparse matrices, yet." > ValueError: Cannot handle sparse matrices, yet. > > For what reason ? > > Nils > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From jdhunter at ace.bsd.uchicago.edu Wed Oct 15 08:27:35 2003 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Wed, 15 Oct 2003 07:27:35 -0500 Subject: [SciPy-user] Re: io.loadmat In-Reply-To: <3F8D0B6C.F662B336@mecha.uni-stuttgart.de> (Nils Wagner's message of "Wed, 15 Oct 2003 10:55:08 +0200") References: <20031014185205.796AE3EB27@www.scipy.com> <3F8C90AA.8070609@insightscientific.com> <3F8C91ED.5040301@colorado.edu> <3F8D0B6C.F662B336@mecha.uni-stuttgart.de> Message-ID: >>>>> "Nils" == Nils Wagner writes: Nils> Did you ask the author --> Nils> http://ion.le.ac.uk/admin/nigel.html already ? I emailed him this morning. I hope he didn't get too many requests this AM :-). I'll post back if he responds. JDH From lcordier at dsp.sun.ac.za Wed Oct 15 08:33:23 2003 From: lcordier at dsp.sun.ac.za (Louis Cordier) Date: Wed, 15 Oct 2003 14:33:23 +0200 (SAST) Subject: [SciPy-user] Matlab, NASTRAN, CDF :) In-Reply-To: <20031015112109.8F2E53EB35@www.scipy.com> Message-ID: > Fernando Perez schrieb: > > > > Andrew Straw wrote: > > > > > Cool! I hadn't seen that before, and it seems to work on the simple > > > test case I tried. I made a setup script rather than fiddling with the > > > Makefile. This seems like a useful enough utility that it should get > > > wider exposure. Too bad it can't be incorporated into scipy as is (GPL > > > vs. BSD license issues...). > > > > Mmh, not necessarily. Perhaps the author would be amenable to a license > > change for inclusion in scipy? It wouldn't hurt to ask nicely, the worst that > > happens is that he says no. > > > > Since there seem to be a lot of matlab users out there (I'm not one of them), > > this may be worth pursuing by some. > > > > Cheers, > > > > f. > > > > Fernando, > > Did you ask the author --> http://ion.le.ac.uk/admin/nigel.html already > ? > > It would be great to have this feature in scipy. 
> > I am also interested in an interface to NASTRAN files - there is some > progress > in this direction by Jose Fonseca > > http://jrfonseca.dyndns.org/work/phd/ > > Nils Why don't the powers that be just implement "Common Data Format" in SciPy ? http://nssdc.gsfc.nasa.gov/cdf/cdf_home.html http://www.faqs.org/faqs/sci-data-formats/ >From the FAQ http://nssdc.gsfc.nasa.gov/cdf/html/FAQ.html It seems Matlab supports CDF ---8<----------------------------------- 26. Is there any commercial or public domain data analysis/visualization software that accepts/supports CDF files? Commercial software: Interactive Data Language (IDL) MathWorks MATLAB Language (MATLAB) Application Visualization System (AVS) IBM Visualization Data Explorer (DX) Weisang GmbH & Co. KG Data Analysis and Presentation (FlexPro) ... 27. Are there any conversion programs available that will convert non-CDF files into CDF files or vice versa? ... ---8<----------------------------------- Then one simply have to pull an effort on CDF2* convertion programs. Instead of SciPy supporting a lot of different file formats that is constantly evolving and growing... A possible starting point would be: http://www.mrao.cam.ac.uk/~jsy1001/mfit/ Regars, Louis. -------------------------------------------------------------- Louis Cordier, c:+27721472305, h:+27218865145, w:+27218084315 lcordier at dsp.sun.ac.za, http://www.dsp.sun.ac.za/~lcordier/ University of Stellenbosch - Electric & Electronic Engineering -------------------------------------------------------------- From jdhunter at ace.bsd.uchicago.edu Wed Oct 15 08:38:14 2003 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Wed, 15 Oct 2003 07:38:14 -0500 Subject: [SciPy-user] io.loadmat In-Reply-To: <3F8D3EA1.F96BB79@mecha.uni-stuttgart.de> (Nils Wagner's message of "Wed, 15 Oct 2003 14:33:37 +0200") References: <3F8BC1BC.85CED77F@mecha.uni-stuttgart.de> <3F8C6994.9050700@ee.byu.edu> <3F8D3EA1.F96BB79@mecha.uni-stuttgart.de> Message-ID: >>>>> "Nils" == Nils Wagner writes: Nils> Thank you for your workaround. However, the current version Nils> of loadmat cannot handle sparse matrices ? Is there any Nils> progress in importing sparse matrices ? In my tests of the version 5 matfile, sparse matrices also failed to load, but they failed silently. JDH From nwagner at mecha.uni-stuttgart.de Wed Oct 15 09:59:21 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 15 Oct 2003 15:59:21 +0200 Subject: [SciPy-user] Cannot handle sparse matrices, yet. References: <3F8D48B4.16CBE6C5@mecha.uni-stuttgart.de> Message-ID: <3F8D52B9.CC0A514C@mecha.uni-stuttgart.de> Arnd Baecker schrieb: > > Hi Nils, > > though not feeling as an expert on this, may I still > ask if you are 100% sure that k0 is really stored as a full matrix? > Somehow it seems that the stored k0 still "knows" that it > came from a sparse matrix. > Does save -mat -v4 k0 lead to a resonably larger > file than save -mat -v4 k00 ? Yes. > Is the shape of k0 the right one? Yes > What happens if you assign a new matrix k1=IdentityMatrix * k0 > and store that ? Nothing changes. > > Presumably you this kind of tests already ... > > Arnd The matrix is complex. Is this a problem for loadmat ? Nils > > On Wed, 15 Oct 2003, Nils Wagner wrote: > > > Dear experts, > > > > At the beginning I have used > > > > k0=full(k00) > > > > to convert the sparse matrix k00 to a full matrix k0. 
> > Then I have used the -v4 option in Matlab to save the full matrix > > > > save -mat -v4 k0 > > > > However, I cannot import the f u l l matrix k0 in scioy > > > > Traceback (most recent call last): > > File "matlab.py", line 4, in ? > > io.loadmat("k0",appendmat=0) > > File "/usr/local/lib/python2.1/site-packages/scipy/io/mio.py", line > > 423, in loadmat > > raise ValueError, "Cannot handle sparse matrices, yet." > > ValueError: Cannot handle sparse matrices, yet. > > > > For what reason ? > > > > Nils > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From arnd.baecker at web.de Wed Oct 15 08:55:48 2003 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 15 Oct 2003 14:55:48 +0200 (CEST) Subject: [SciPy-user] Cannot handle sparse matrices, yet. In-Reply-To: <3F8D52B9.CC0A514C@mecha.uni-stuttgart.de> References: <3F8D48B4.16CBE6C5@mecha.uni-stuttgart.de> <3F8D52B9.CC0A514C@mecha.uni-stuttgart.de> Message-ID: Hi Nils, > The matrix is complex. Is this a problem for loadmat ? Looking at the source of scipy.io.loadmat I'd _guess_ that it does not support complex matrices. Could you re-try with just the real part of the matrix? If that works you could (as work-around) store real and imaginary part separately and load them separately and combine them afterwards. Good luck, Arnd From nwagner at mecha.uni-stuttgart.de Wed Oct 15 10:20:23 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 15 Oct 2003 16:20:23 +0200 Subject: [SciPy-user] Cannot handle sparse matrices, yet. References: <3F8D48B4.16CBE6C5@mecha.uni-stuttgart.de> Message-ID: <3F8D57A7.FE37098E@mecha.uni-stuttgart.de> Arnd Baecker schrieb: > > Hi Nils, > > > The matrix is complex. Is this a problem for loadmat ? > > Looking at the source of scipy.io.loadmat I'd _guess_ > that it does not support complex matrices. Even, if I split the matrix into its real and imaginary part by k0r = real(full(k00)) k0i = imag(full(k00)) and save them as two real matrices save -v4 -mat k0r save -v4 -mat k0i I obtain the same message as before Traceback (most recent call last): File "matlab.py", line 4, in ? io.loadmat("k0r",appendmat=0) File "/usr/local/lib/python2.1/site-packages/scipy/io/mio.py", line 423, in loadmat raise ValueError, "Cannot handle sparse matrices, yet." ValueError: Cannot handle sparse matrices, yet. What is going on there ? Nils > Could you re-try with just the real part of the matrix? > If that works you could (as work-around) store real and imaginary part > separately and load them separately and combine them afterwards. > > Good luck, > > Arnd > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From arnd.baecker at web.de Wed Oct 15 09:18:34 2003 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 15 Oct 2003 15:18:34 +0200 (CEST) Subject: [SciPy-user] Cannot handle sparse matrices, yet. In-Reply-To: <3F8D57A7.FE37098E@mecha.uni-stuttgart.de> References: <3F8D48B4.16CBE6C5@mecha.uni-stuttgart.de> <3F8D57A7.FE37098E@mecha.uni-stuttgart.de> Message-ID: On Wed, 15 Oct 2003, Nils Wagner wrote: [...] > What is going on there ? Hmm, no idea ;-). Did writing and loading ever work for you ? Eg. 
what happens if you create a "normal" real diagonal matrix, store that and try to reload ? Maybe the -v4 output is not compatible with V4 ? Sorry, I think I am not of any further help here - it's time for the experts I'd guess. Arnd From jdhunter at ace.bsd.uchicago.edu Wed Oct 15 09:16:57 2003 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Wed, 15 Oct 2003 08:16:57 -0500 Subject: [SciPy-user] Cannot handle sparse matrices, yet. In-Reply-To: <3F8D57A7.FE37098E@mecha.uni-stuttgart.de> (Nils Wagner's message of "Wed, 15 Oct 2003 16:20:23 +0200") References: <3F8D48B4.16CBE6C5@mecha.uni-stuttgart.de> <3F8D57A7.FE37098E@mecha.uni-stuttgart.de> Message-ID: >>>>> "Nils" == Nils Wagner writes: Nils> k0r = real(full(k00)) k0i = imag(full(k00)) Nils> and save them as two real matrices Nils> save -v4 -mat k0r save -v4 -mat k0i Nils> I obtain the same message as before matfile handles the sparse->full case and complex matrices. >> S = sparse(zeros(100,100)); >> M = full(S); >> M(1,1) = 3+j; >> save full.mat M Then >>> import Numeric >>> import matfile >>> d = matfile.load('full.mat') >>> M = d['M'] >>> print M.shape, M[0,0] (100, 100) (3+1j) JDH From jdhunter at ace.bsd.uchicago.edu Wed Oct 15 09:25:36 2003 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Wed, 15 Oct 2003 08:25:36 -0500 Subject: [SciPy-user] Re: io.loadmat In-Reply-To: (John Hunter's message of "Wed, 15 Oct 2003 07:27:35 -0500") References: <20031014185205.796AE3EB27@www.scipy.com> <3F8C90AA.8070609@insightscientific.com> <3F8C91ED.5040301@colorado.edu> <3F8D0B6C.F662B336@mecha.uni-stuttgart.de> Message-ID: >>>>> "Nils" == Nils Wagner writes: Nils> Did you ask the author --> Nils> http://ion.le.ac.uk/admin/nigel.html already ? John> I emailed him this morning. I hope he didn't get too many John> requests this AM :-). I'll post back if he responds. Nigel got back to me. He said he had no objection to releasing it under another license. He released it originally under GPL for convenience and asked for pointers to appropriate licenses for scipy. I pointed him to PSF, BSD and MIT, and suggested he post here when he did a release. If I don't hear from him in a week or so, I'll send another gentle request. JDH From nwagner at mecha.uni-stuttgart.de Wed Oct 15 11:11:19 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 15 Oct 2003 17:11:19 +0200 Subject: [SciPy-user] loadmat etc Message-ID: <3F8D6397.A1E24DA2@mecha.uni-stuttgart.de> Hi all, I have made a mistake when I saved the full matrices in Matlab. Sorry for that. io.loadmat works fine. But, how can I take the array out of b ? >>> b {'k0i': array([[ ...... ]])} >>> Nils From jdhunter at ace.bsd.uchicago.edu Wed Oct 15 10:02:23 2003 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Wed, 15 Oct 2003 09:02:23 -0500 Subject: [SciPy-user] loadmat etc In-Reply-To: <3F8D6397.A1E24DA2@mecha.uni-stuttgart.de> (Nils Wagner's message of "Wed, 15 Oct 2003 17:11:19 +0200") References: <3F8D6397.A1E24DA2@mecha.uni-stuttgart.de> Message-ID: >>>>> "Nils" == Nils Wagner writes: Nils> But, how can I take the array out of b ? >>>> b Nils> {'k0i': array([[ ...... ]])} Following Prabhu's suggestion to your post earlier, you really need to read the python tutorial. You've gotten a lot of help here, but you should take some time to help yourself. The answer to this is elementary python, and you'll not make good progress until you learn this and other essential python features by reading the tutorial or other introductory books. Hint: read about dicts.
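If the hint isn't enough: loadmat hands you back an ordinary Python dict keyed by variable name, so getting at the array is a single indexing operation. Untested, but with the names from your session it would look something like this:

from scipy import io

b = io.loadmat("k0i", appendmat=0)   # or however you called loadmat when it worked
k0i = b["k0i"]                       # plain dict indexing pulls out the Numeric array
print b.keys()
print k0i.shape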
John Hunter From fperez at colorado.edu Wed Oct 15 18:31:00 2003 From: fperez at colorado.edu (Fernando Perez) Date: Wed, 15 Oct 2003 16:31:00 -0600 Subject: [SciPy-user] Re: [Scipy-chaco] TeX/LaTeX and Chaco In-Reply-To: <3F8E6CE5.A83F845B@noaa.gov> References: <3F58DFF5.8030203@uibk.ac.at> <002001c37b11$4bfedc00$01fd5644@playroom> <3F8E6CE5.A83F845B@noaa.gov> Message-ID: <3F8DCAA4.3070206@colorado.edu> Chris Barker wrote: > Anyway, PyX is a pretty neat project, that seems to have a lot in common > with Chaco/Kiva. I wanted to make sure you all knew about it. Just a few days ago I mentioned it on the scipy-* (not chaco) lists, after finding out about it myself. Something's in the air ;-) Anyway, even more coincidence, I spent a good chunk of time today getting it to work, which unfortunately isn't totally trivial on a RedHat box, mainly because of some path issues. I was this very minute about to send my notes to the developers so they perhaps can include the fixes in the future. But since you brought it up here, I figured I'd put my notes on these lists. Once you have the instructions, the fixes are trivial to implement. But knowing what to do may save people quite a bit of time. Hopefully these instructions will be unnecessary in the near future. In the meantime, I hope some find them useful. Best, f. -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: pyx URL: From gpajer at rider.edu Wed Oct 15 19:27:56 2003 From: gpajer at rider.edu (Gary Pajer) Date: Wed, 15 Oct 2003 19:27:56 -0400 Subject: [SciPy-user] Re: [Scipy-chaco] TeX/LaTeX and Chaco In-Reply-To: <3F8DCAA4.3070206@colorado.edu> Message-ID: <352C7FE4-FF67-11D7-BC42-000393A41AAE@rider.edu> On Wednesday, October 15, 2003, at 06:31 PM, Fernando Perez wrote: > Chris Barker wrote: > >> Anyway, PyX is a pretty neat project, that seems to have a lot in >> common >> with Chaco/Kiva. I wanted to make sure you all knew about it. > > Just a few days ago I mentioned it on the scipy-* (not chaco) lists, > after finding out about it myself. Something's in the air ;-) > > Anyway, even more coincidence, I spent a good chunk of time today > getting it to work, which unfortunately isn't totally trivial on a > RedHat box, Very nontrivial on Windows, too. If I ever get it compiled and running (I have doubts) I'll post instructions. Certainly the installation is not ready for primetime. (I'd love it if someone makes a fool out of me and tells me how easy it is ... ) How is the performance relative to the promises on the web site? From fperez at colorado.edu Wed Oct 15 19:32:39 2003 From: fperez at colorado.edu (Fernando Perez) Date: Wed, 15 Oct 2003 17:32:39 -0600 Subject: [SciPy-user] Re: [Scipy-chaco] TeX/LaTeX and Chaco In-Reply-To: <352C7FE4-FF67-11D7-BC42-000393A41AAE@rider.edu> References: <352C7FE4-FF67-11D7-BC42-000393A41AAE@rider.edu> Message-ID: <3F8DD917.9010604@colorado.edu> Gary Pajer wrote: > Very nontrivial on Windows, too. If I ever get it compiled and running > (I have doubts) I'll post instructions. Certainly the installation is > not ready for primetime. (I'd love it if someone makes a fool out of > me and tells me how easy it is ... ) > > How is the performance relative to the promises on the web site? Actually, quite good as far as I've seen. I ran all the examples, and only one (the PyX logo with a pattern of little copies of itself) didn't give me identical-looking eps. That's listed on the report I sent to the developers. 
What I find _extremely_ interesting about it is the ability to load pre-made EPS files to add latex symbols and arbitrary PostScript elements to them. Gnuplot is my main workhorse, but making symbols, arrows, etc in Gnuplot is a pain in the neck. A combination of Gnuplot for the first pass over data and pyx for 'post-processing' may be a killer. Regards, Fernando. PS. I'd love to see chaco incorporate these capabilities, but until that happens on Linux, we need viable alternatives. However, for the long run I truly hope Chaco becomes very successful. As Chris suggested, the pyx approach may even provide Chaco with Latex capabilities. That would indeed be fantastic. From ariciputi at pito.com Thu Oct 16 04:54:41 2003 From: ariciputi at pito.com (Andrea Riciputi) Date: Thu, 16 Oct 2003 10:54:41 +0200 Subject: [SciPy-user] Re: [Scipy-chaco] TeX/LaTeX and Chaco In-Reply-To: <3F8DCAA4.3070206@colorado.edu> Message-ID: <61792A41-FFB6-11D7-B3AE-000393933E4E@pito.com> Hi, I tried PyX too after your post last week. It looks very promising and I'm doing some experiments with it. Under MacOS X it compiles without problems and all the examples worked (except minimal.py and bar.py because corresponding .dat files are missing, but the author told me he had already fixed this in CVS), 'import pyx' gives the same problem here. I've looked at the output of pattern.py example (both mine and yours) and as far as I can see it looks like the one on the website. Are you sure your output is different from the one on the site? For further discussion about PyX I've subscribed PyX-Users mailing list. Hope to see your posts there soon. ;-) For those interested in PyX and running MacOS X I've prepared and submitted a Fink's package. I hope it will be available to all the Fink's users very soon. Cheers, Andrea. On Thursday, Oct 16, 2003, at 00:31 Europe/Rome, Fernando Perez wrote: > Chris Barker wrote: > >> Anyway, PyX is a pretty neat project, that seems to have a lot in >> common >> with Chaco/Kiva. I wanted to make sure you all knew about it. > > Just a few days ago I mentioned it on the scipy-* (not chaco) lists, > after finding out about it myself. Something's in the air ;-) > > Anyway, even more coincidence, I spent a good chunk of time today > getting it to work, which unfortunately isn't totally trivial on a > RedHat box, mainly because of some path issues. I was this very > minute about to send my notes to the developers so they perhaps can > include the fixes in the future. But since you brought it up here, I > figured I'd put my notes on these lists. Once you have the > instructions, the fixes are trivial to implement. But knowing what to > do may save people quite a bit of time. > > Hopefully these instructions will be unnecessary in the near future. > In the meantime, I hope some find them useful. > > Best, > > f. > PyX NOTES > ========= > > PyX http://pyx.sourceforge.net is an extremely interesting tool for > manipulating PostScript files in python, with full access to LaTeX > commands. > In particular, it is capable of reading in an existing eps file, and > 'overlay' > additional elements (graphical or text, including arbitrary mathematics > defined in LaTeX). Since it is a python library, all changes are done > programatically and can be automated. > > In conjunction with (I)Python's support for Gnuplot, it becomes very > easy to > generate the basic part of a plot using Gnuplot, and then use PyX to > add any > elements which may be difficult or impossible to control with Gnuplot. 
> > However, the current (0.4.1) version of PyX as distributed presents a > few > hurdles, at least on a RedHat system. > > Below is a list of the problems I encountered, along with their > solution. > They can all be fixed within a few minutes, and hopefully these fixes > will be > implemented in the mainline distribution for future versions. > > > PROBLEM > ------- > > 'import pyx' doesn't provide access to pyx's modules. The examples > provided > all use 'from pyx import *', which is bad form for larger scripts, > where this > kind of blanket import is bound to cause nasty name collisions. > > FIX: trivial. In __init__.py, add after __all__ is defined: > > # Load __all__ in pyx namespace so that a simple 'import pyx' gives > # access to them via pyx. > for name in __all__: > __import__(name,globals(),locals(),[]) > > > - PROBLEM: setup.py doesn't put things in the proper place if --home > is given > different from the default. I used --home=~/usr/local, and while pyx > itself > went into ~/usr/local/lib/python/pyx, its .lfs files went to > ~/usr/local/share/pyx. But the pyx program itself expects to find them > somewhere else, because trying to run hello.py produces: > > planck[examples]> python hello.py > Traceback (most recent call last): > File "hello.py", line 4, in ? > c.text(0, 0, "Hello, world!") > File "/home/fperez/usr/local/lib/python/pyx/canvas.py", line 767, in > text > return self.insert(self.texrunner.text(x, y, atext, *args)) > File "/home/fperez/usr/local/lib/python/pyx/text.py", line 2283, in > text > return self._text(unit.topt(x), unit.topt(y), expr, *args) > File "/home/fperez/usr/local/lib/python/pyx/text.py", line 2271, in > _text > self.execute(expr, *helper.getattrs(args, texmessage, > default=self.texmessagedefaultrun)) > File "/home/fperez/usr/local/lib/python/pyx/text.py", line 1994, in > execute > raise IOError("file '%s' not found. Available latex font sizes: > %s" % (lfsname, lfsnames)) > IOError: file '10pt.lfs' not found. Available latex font sizes: [] > > This may be a bug in setup.py, or in pyx itself (pyx may need to > adjust its > search paths according to where it got installed). I don't know for > sure. > > Putting some print statements in text.py gave the following: > > *** file 10pt.lfs not found! > *** file /usr/share/pyx/10pt.lfs not found! > *** file /home/fperez/usr/local/lib/python/pyx/lfs/10pt.lfs not found! > > So the problem is in how pyx builds its search path (lines 1980-93 in > text.py). It needs to pick up the correct installation path, or > setup.py > needs to make sure that the lfs files end up underneath the pyx > directory > (instead of above it, in ../../share). > > FIX: For now I can fix it manually, by moving the .lfs files to a > directory > (which must be created first) called > /home/fperez/usr/local/lib/python/pyx/lfs. But this needs to be done > permanently in the code. I'm not exactly sure what the correct fix is. > > > PROBLEM > ------- > > After fixing the previous one manually, I still can't run hello.py. > Note that I've also seen this error on a machine where pyx was > 'setup.py > install'ed to a default directory. > > Here's the traceback: > > planck[examples]> python hello.py > Traceback (most recent call last): > File "hello.py", line 6, in ? > c.writetofile("hello") > > ... 
[snip] > > RuntimeError: no information for font 'cmr10' found in font mapping > file, aborting > > This problem is discussed in: > > http://sourceforge.net/mailarchive/ > forum.php?thread_id=3225190&forum_id=23700 > > FIX: change around line 504 of text.py, from: > > fontmap = readfontmap(["psfonts.map"]) > > to: > > fontmap = readfontmap(["psfonts.map","psfonts.cmz","psfonts.amz"]) > > It might be worth doing this in the mainline sources. Even if it is > not the > ideal default for some latex-related reasons I don't understand, the > truth is > that distributions as popular as RedHat choke on the current > configuration. > If this simple change makes pyx work out of the box for many users, it > is that > much more likely to become popular. > > > PROBLEM > ------- > > Trying to run the latex.py example: > > planck[examples]> python latex.py > Traceback (most recent call last): > File "latex.py", line 9, in ? > c.text(0, 0, r"This is \LaTeX.") > > .... [snip] > > pyx.text.TexResultError: TeX didn't respond as expected within the > timeout > period (5 seconds). > The expression passed to TeX was: > \documentclass{article}% > \PyXInput{3}% > After parsing the return message from TeX, the following was left: > * > > > The problem is that on a system where ~ is NFS mounted, a 5 second > timeout may > be just too short. We serve /home from an old Solaris box, and > latex'ing a > file typically is a bit of a slow process with lots of network > read/write > activity. I wouldn't be surprised if this was a rather common > situation in > typical unix shops. > > FIX: I'd advocate for a much longer timeout, perhaps 30 seconds at > least. I > changed it to 30 (line 1848 of text.py), and it works fine even on our > sluggish network. While this timeout is a bit long, it makes pyx a > much more > robust system in the face of 'real world' network environments. > > > PROBLEM > ------- > > After fixing the above, I can run most of the included examples. The > only > remaining issues are: > > - The pattern.eps file looks very different from the one on the > website. The > global figure is indeed the PyX logo, but instead of being made out of > little > copies of PyX, it's made out of text blocks. This may or may not be > an issue > with my system, I don't know. > > I've put up what I get (temporarily) in > http://windom.colorado.edu/~fperez/tmp/pattern.eps > in case the developers find it useful. > > - bar.py and minimal.py don't run because the corresponding .dat files > are not > included in the distribution. > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > --- Andrea Riciputi "Science is like sex: sometimes something useful comes out, but that is not the reason we are doing it" -- (Richard Feynman) From fperez at colorado.edu Thu Oct 16 13:28:52 2003 From: fperez at colorado.edu (Fernando Perez) Date: Thu, 16 Oct 2003 11:28:52 -0600 Subject: [SciPy-user] Re: [Scipy-chaco] TeX/LaTeX and Chaco In-Reply-To: <61792A41-FFB6-11D7-B3AE-000393933E4E@pito.com> References: <61792A41-FFB6-11D7-B3AE-000393933E4E@pito.com> Message-ID: <3F8ED554.8030601@colorado.edu> Andrea Riciputi wrote: > I've looked at the output of pattern.py example (both mine and yours) > and as far as I can see it looks like the one on the website. Are you > sure your output is different from the one on the site? Issue solved. It was an artifact of using antialiasing with gv. If I turn AA off, it looks fine. 
Which means it's a ghostview bug, not a pyx one. > > For further discussion about PyX I've subscribed PyX-Users mailing > list. Hope to see your posts there soon. ;-) I just did too. Long reply coming now, addressing the various issues raised here. Best, f ps. I won't post on this further here, to keep the scipy lists focused again on scipy. However, I fully agree with Chris Barker that PyX may be exactly the way to integrate LaTeX into output, so I hope the scipy core team considers it as a possibility. From anewgene at hotpop.com Fri Oct 17 16:44:14 2003 From: anewgene at hotpop.com (CL WU) Date: Fri, 17 Oct 2003 15:44:14 -0500 Subject: [SciPy-user] changes of weave in current cvs version? In-Reply-To: <3F8ED554.8030601@colorado.edu> References: <61792A41-FFB6-11D7-B3AE-000393933E4E@pito.com> <3F8ED554.8030601@colorado.edu> Message-ID: <3F90549E.6030903@hotpop.com> Hi, Group, I installed cvs version of scipy, but I can not make dictionary sort code in weave document work. def c_sort(adict): |/================================================================== assert(type(adict) == type({})) code = """ #line 21 "dict_sort.py" Py::List keys = adict.keys(); Py::List items(keys.length()); keys.sort(); PyObject* item = NULL; for(int i = 0; i < keys.length();i++) { item = PyList_GET_ITEM(keys.ptr(),i); item = PyDict_GetItem(adict.ptr(),item); Py_XINCREF(item); PyList_SetItem(items.ptr(),i,item); } return_val = Py::new_reference_to(items); """ return inline_tools.inline(code,['adict'],verbose=1) ================================================================== I guess it is because weave is updated while document not. I did some modification of the code, such as, "Py" --> "py"; list reference using "[]". It solved some errors, but still I cannot figure out how to access items from a dictionary. Help needed. Thanks in advance. Chunlei /| From mbakker at engr.uga.edu Sun Oct 19 12:48:52 2003 From: mbakker at engr.uga.edu (Mark Bakker) Date: Sun, 19 Oct 2003 12:48:52 -0400 Subject: [SciPy-user] Windows binary install for 2.3 Message-ID: <3F92C074.6050506@engr.uga.edu> Sorry to bother you guys, but does anybody have time to make a scipy binary windows install for Python 2.3.X? I would really really appreciate it (I am still in 2.2 land and very eager to move forward). Thanks, Mark From turner at lanl.gov Sun Oct 19 21:43:56 2003 From: turner at lanl.gov (John A. Turner) Date: Sun, 19 Oct 2003 19:43:56 -0600 Subject: [SciPy-user] Windows binary install for 2.3 In-Reply-To: <3F92C074.6050506@engr.uga.edu> References: <3F92C074.6050506@engr.uga.edu> Message-ID: <3F933DDC.7080807@lanl.gov> Mark Bakker wrote: > Sorry to bother you guys, but does anybody have time to make a scipy > binary windows install for Python 2.3.X? > I would really really appreciate it (I am still in 2.2 land and very > eager to move forward). you mean something besides the mega-install from Enthought? http://www.enthought.com/python/ personally I've been loving that... -John Turner From imr at oersted.dtu.dk Mon Oct 20 04:06:34 2003 From: imr at oersted.dtu.dk (Ivan Martinez) Date: Mon, 20 Oct 2003 10:06:34 +0200 Subject: [SciPy-user] Any paper about control applications? Message-ID: <200310201006.34557.imr@oersted.dtu.dk> Hello all, I'm writing a paper where I list open-source languages for scientific computing. For every language, I include a reference to a example of the language being used in a control application. 
Could anybody give me references to papers/proceedings about SciPi/Python being used to control some process/experiment/robot/etc.?. Thank you. Ivan Martinez From astraw at insightscientific.com Mon Oct 20 05:39:44 2003 From: astraw at insightscientific.com (Andrew Straw) Date: Mon, 20 Oct 2003 19:09:44 +0930 Subject: [SciPy-user] Any paper about control applications? In-Reply-To: <200310201006.34557.imr@oersted.dtu.dk> References: <200310201006.34557.imr@oersted.dtu.dk> Message-ID: <3F93AD60.7000506@insightscientific.com> Ivan Martinez wrote: > Hello all, > I'm writing a paper where I list open-source languages for scientific > computing. For every language, I include a reference to a example of the > language being used in a control application. Could anybody give me > references to papers/proceedings about SciPi/Python being used to control > some process/experiment/robot/etc.?. Hi, I and others use the Vision Egg http://www.visionegg.org/ for performing visual neuroscience experiments. It's written in 99% Python, but itself doesn't use scipy. (Although I use scipy extensively for data analysis). Do you perhaps mean control systems applications? In which case, have you seen "the other pyro" -- http://emergent.brynmawr.edu/~dblank/pyro/ ? Cheers! Andrew -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: application/pgp-signature Size: 155 bytes Desc: not available URL: From wagner.nils at vdi.de Tue Oct 21 15:07:16 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Tue, 21 Oct 2003 21:07:16 +0200 Subject: [SciPy-user] Sparse eigenvalue problems Message-ID: <20031021190956.886753EB2E@www.scipy.com> Dear experts, Is it possible to solve generalized eigenvalue problems A x = \lambda B x with scipy, where A and B are large and s p a r s e matrices ? Any suggestion ? Nils http://www.caam.rice.edu/software/ARPACK/ From jdhunter at ace.bsd.uchicago.edu Tue Oct 21 15:42:57 2003 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Tue, 21 Oct 2003 14:42:57 -0500 Subject: [SciPy-user] ANN matplotlib-0.30 - matlab style python plotting Message-ID: matplotlib is a 2D plotting package for python with a matlab compatible syntax and output tested under linux and windows platforms. matplotlib-0.30 is available for download at http://matplotlib.sourceforge.net, and has many new features since the last major release. Multiple outputs matplotlib now supports postscript and GD output, as well as the traditional GTK backend. The postscript backend goes a long way towards the goal of acheiving publication quality output. The GD backend allows you to use matplotlib even in environments with no X server, such as for a web application server serving dynamic charts. Log scaling With the help of Andrew Straw, matplotlib now has log axis capabilities, with new commands semilogx, semilogy and loglog See http://matplotlib.sourceforge.net/screenshots.html#log_shot Legends With the help of Charles R. Twardy, matplotlib now has a matlab compatible legend command. 
See http://matplotlib.sourceforge.net/screenshots.html#legend_demo Numerous bug fixes and minor additions DPI parameter allows multiple output resolutions with correct scaling Several bug fixes in GTK interactive mode using examples/interactive2.py Multiple ways to specify colors, including matlab compatible format strings, RGB tuples, and html-style hex color strings Rewrite of line class for much greater compatibility with matlab handle graphics commands and flexibility in choosing line styles and markers See http://matplotlib.sourceforge.net/screenshots.html for screenshots and http://matplotlib.sourceforge.net/whats_new.html for more detailed information on what's new. John Hunter From ggerber at sun.ac.za Tue Oct 21 15:48:22 2003 From: ggerber at sun.ac.za (Gerber G ) Date: Tue, 21 Oct 2003 21:48:22 +0200 Subject: [SciPy-user] fsolve: Max iteration cut out Message-ID: <5CD3C38E1B0D774F9254A78617B148F0073B11@STBEVS02.stb.sun.ac.za> Hello, This might be a trivial question.. I am currently programming finite elements. I am using optimize.fsolve to solve my non-linear set of equations. Two approaches were taken: 1) Use fsolve, supplied only with a function that computes the residual. 2) Use fsolve, supplied with a function for the residual and a function for the Jacobian. Approach (1) gives satisfactory results after 25 function evaluations for each time-step. (This is a hyperbolic partial differential equation) Approach (2) succeeds with the first time-step by using 30 function calls and 3 Jacobian calls. However, the time-steps that follow, terminates at 11 function calls with the error message: "The iteration is not making good progress, as measured by the improvement from the last ten iterations." The default setting in fsolve uses 100(N+1) as the maximum number of iterations. Why does it stop at 11 function calls? Attached is the runs for the two approaches. Thanks in advance, George Gerber -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: Residual.txt URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: residualAndJacobian.txt URL: From serquiciamailing at fastimap.com Tue Oct 21 18:36:03 2003 From: serquiciamailing at fastimap.com (Santiago Erquicia) Date: Tue, 21 Oct 2003 17:36:03 -0500 Subject: [SciPy-user] Newbie question Message-ID: <3F95B4D3.70807@fastimap.com> Sorry to do this, but I don't find any documentation about how to do this. I want to use the pdf and cdf method of the triangular distribution. I was looking at the constructor of the distribution and its methods and I really have no clue what every parameter means. In my case, I have triangular distributions with min_value = 3, mode = 5 and max_value = 7. How do I need to construct the triangular distribution? triang_dist = scipy.stats.distribution.triang_gen(3,5,7)? The other simple problem (not for me) is that I want to get the cdf of the distribution at point 6. I supposed that with triang_dist.pdf(6) would do it, but it asks me for two parameters. How about the pdf? Does anyone has an example of how to use it? Thanks and sorry for the basic question. 
From kern at ugcs.caltech.edu Tue Oct 21 19:16:53 2003 From: kern at ugcs.caltech.edu (Robert Kern) Date: Tue, 21 Oct 2003 16:16:53 -0700 Subject: [SciPy-user] Newbie question In-Reply-To: <3F95B4D3.70807@fastimap.com> References: <3F95B4D3.70807@fastimap.com> Message-ID: <20031021231653.GA24092@lira.ugcs.caltech.edu> On Tue, Oct 21, 2003 at 05:36:03PM -0500, Santiago Erquicia wrote: > Sorry to do this, but I don't find any documentation about how to do this. > > I want to use the pdf and cdf method of the triangular distribution. I > was looking at the constructor of the distribution and its methods and I > really have no clue what every parameter means. Yeah, it's confusing. > In my case, I have triangular distributions with min_value = 3, mode = 5 > and max_value = 7. How do I need to construct the triangular > distribution? triang_dist = scipy.stats.distribution.triang_gen(3,5,7)? Currently, you can't "freeze" a distribution like that. Use triang, not triang_gen. Each time you call one of the methods, you must specify all three parameters in the method call. Also, check the docstring for the meaning of these parameters. loc=min_value, scale=max_value-min_value, c=(mode-min_value)/(max_value-min_value) in this case. The c parameter comes first, then loc, then scale. So, to get the pdf evaluated at some point x (possibly an array), >>> triang.pdf(x, 0.5, 3, 4) >>> triang.cdf(x, 0.5, 3, 4) etc. For Travis O., et al.: This is a confusing interface and cumbersome since it requires that we pass the same parameters to every method call. Would it be possible to have an API (either replacing the current one or auxiliary to it) such that one can provide the parameters in a constructor, then call the methods with only necessary arguments. E.g. my_triang = frozen_triang(0.5, 3, 4) my_triang.pdf(x) my_triang.rvs() or perhaps my_triang = freeze_dist(triang, (0.5, 3, 4)) -- Robert Kern kern at ugcs.caltech.edu "In the fields of Hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From nwagner at mecha.uni-stuttgart.de Wed Oct 22 06:04:12 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 22 Oct 2003 12:04:12 +0200 Subject: [SciPy-user] Finite element method using scipy Message-ID: <3F96561C.76445790@mecha.uni-stuttgart.de> Hi all, I wonder, if there is any project using scipy with regard to finite element analysis ? I found some projects related to python via google - however, the activity seems to be very low. Any reference would be appreciated. Nils From bryan.cole at teraview.com Wed Oct 22 06:24:44 2003 From: bryan.cole at teraview.com (bryan cole) Date: Wed, 22 Oct 2003 11:24:44 +0100 Subject: [SciPy-user] Finite element method using scipy In-Reply-To: <3F96561C.76445790@mecha.uni-stuttgart.de> References: <3F96561C.76445790@mecha.uni-stuttgart.de> Message-ID: <1066818284.1059.8.camel@bryan.teraviewhq.local> I recently came across the 'PySparse' and 'PyFemax' modules at http://people.web.psi.ch/geus/pyfemax/index.html I found PySparse easier to use than the sparsekit in SciPy, largely due to better (i.e. some) documentation. 'PyFemax' is a finite element maxwels-equation solver suite. I'm not sure about more general FEM-modules however. Bryan On Wed, 2003-10-22 at 11:04, Nils Wagner wrote: > Hi all, > > I wonder, if there is any project using scipy with regard to finite > element analysis ? > I found some projects related to python via google - however, the > activity seems to be very low. > Any reference would be appreciated. 
> > Nils > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user -- Bryan Cole Teraview Ltd., 302-304 Cambridge Science Park, Milton Road, Cambridge CB4 0WG, United Kingdom. tel: +44 (1223) 435380 / 435386 (direct-dial) fax: +44 (1223) 435382 From oliphant at ee.byu.edu Wed Oct 22 12:12:07 2003 From: oliphant at ee.byu.edu (Travis E. Oliphant) Date: Wed, 22 Oct 2003 10:12:07 -0600 Subject: [SciPy-user] Newbie question In-Reply-To: <3F95B4D3.70807@fastimap.com> References: <3F95B4D3.70807@fastimap.com> Message-ID: <3F96AC57.8060701@ee.byu.edu> Santiago Erquicia wrote: > Sorry to do this, but I don't find any documentation about how to do this. > > I want to use the pdf and cdf method of the triangular distribution. I > was looking at the constructor of the distribution and its methods and I > really have no clue what every parameter means. > > In my case, I have triangular distributions with min_value = 3, mode = 5 > and max_value = 7. How do I need to construct the triangular > distribution? triang_dist = scipy.stats.distribution.triang_gen(3,5,7)? > Which version of SciPy are you using? For the current version > 0.2 the following help is provided. There is a general pattern for all the distribution, but you have to read the documentation (continuous.pdf) in order to understand it at all. I am in the process of adding docstrings for all the stats functions (does anybody know of someway to alter the docstring dynamically?) Here is the basic idea: All distributions are defined from a simple standard random variable with any appropriate shape parameters. Then, the variance can be changed by adding the appropriate scale parameter (scale=) and the mean can be changed by adding a location parameter (loc=) If the following variables are defined: u --- mean of standard random variable o --- standard deviation of standard random variable S --- scale parameter (user supplied) L --- location parameter (user supplied) then the new mean and standard deviation are u1 = S u + L o1 = S o For the triangle distribution, the simple standard form is a triangle pdf between 0 and 1. The user supplies an additional shape parameter (c) that gives the location of the peak of the triangle (0 The other simple problem (not for me) is that I want to get the cdf of > the distribution at point 6. I supposed that with triang_dist.pdf(6) > would do it, but it asks me for two parameters. How about the pdf? stats.triang.cdf(6,0.5,loc=3,scale=4) stats.triang.pdf(6,0.5,loc=3,scale=4) I will be adding simpler versions of some common distributions, but this general approach makes remember how to use all the distributions easier once you get used to it. The way the distributions are implemented made it difficult to add docstrings in the simple way (without redefining pdf and cdf for all distributions). I was looking for a way to add docstrings dynamically, but I never found one. It would appear that the __doc__ attribute is read-only At some point we may add some simpler interfaces to common distributions --- the triangle distribution is one that has already been asked for. Thanks for your patience. 
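To make the general pattern concrete for this particular distribution, here is a small helper in the spirit of the two answers above. The function name is purely illustrative (it is not part of scipy); it simply restates the loc/scale/c mapping already given and then calls the same pdf/cdf methods shown in the replies:

from scipy import stats

def triang_params(min_value, mode, max_value):
    # map the familiar (min, mode, max) description onto the
    # (c, loc, scale) triple expected by stats.triang
    loc = min_value
    scale = max_value - min_value
    c = (mode - min_value) / float(scale)
    return c, loc, scale

# the distribution from the original question: min=3, mode=5, max=7
c, loc, scale = triang_params(3.0, 5.0, 7.0)
print stats.triang.pdf(6.0, c, loc=loc, scale=scale)
print stats.triang.cdf(6.0, c, loc=loc, scale=scale)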
-Travis Oliphant From serquiciamailing at fastimap.com Wed Oct 22 12:48:41 2003 From: serquiciamailing at fastimap.com (Santiago Erquicia) Date: Wed, 22 Oct 2003 11:48:41 -0500 Subject: [SciPy-user] Newbie question In-Reply-To: <3F96AC57.8060701@ee.byu.edu> References: <3F95B4D3.70807@fastimap.com> <3F96AC57.8060701@ee.byu.edu> Message-ID: <3F96B4E9.6040108@fastimap.com> Travis E. Oliphant wrote: > Santiago Erquicia wrote: > > Which version of SciPy are you using? > SciPy-0.2.0_alpha_210.4077.win32-py2.2 > For the current version > 0.2 the following help is provided. > > There is a general pattern for all the distribution, but you have to > read the documentation (continuous.pdf) in order to understand it at all. > Where can I find that document? It appears that it is not installed with the package I have. Thanks for your answer. That really helped. From ehaux at ucmerced.edu Wed Oct 22 15:21:30 2003 From: ehaux at ucmerced.edu (ehaux at ucmerced.edu) Date: Wed, 22 Oct 2003 12:21:30 -0700 Subject: [SciPy-user] can't get past special.j0 Message-ID: <939b7bac.7bac939b@ucmerced.edu> I'm working through the tutorial(s) (http://www.scipy.org/site_content/tutorials/plot_tutorial) but don't know what to do about the following error: >>> z = special.j0(r) Traceback (most recent call last): File "", line 1, in ? NameError: name 'special' is not defined >>> thanks in advance. Eric From travis at enthought.com Wed Oct 22 15:34:23 2003 From: travis at enthought.com (Travis N. Vaught) Date: Wed, 22 Oct 2003 14:34:23 -0500 Subject: [SciPy-user] can't get past special.j0 In-Reply-To: <939b7bac.7bac939b@ucmerced.edu> Message-ID: <047c01c398d3$833f48f0$0200a8c0@tvlaptop> Eric, My gut reaction is to tell you to make sure you've done a: from scipy import * to get scipy.special in the root of the namespace (as special). Travis > -----Original Message----- > From: scipy-user-bounces at scipy.net [mailto:scipy-user-bounces at scipy.net] > On Behalf Of ehaux at ucmerced.edu > Sent: Wednesday, October 22, 2003 1:22 PM > To: scipy-user at scipy.net > Subject: [SciPy-user] can't get past special.j0 > > I'm working through the tutorial(s) > (http://www.scipy.org/site_content/tutorials/plot_tutorial) > but don't know what to do about the following error: > > >>> z = special.j0(r) > Traceback (most recent call last): > File "", line 1, in ? > NameError: name 'special' is not defined > >>> > > thanks in advance. > > Eric > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From serquiciamailing at fastimap.com Wed Oct 22 17:39:06 2003 From: serquiciamailing at fastimap.com (Santiago Erquicia) Date: Wed, 22 Oct 2003 16:39:06 -0500 Subject: [SciPy-user] Another problem with triangular pdf Message-ID: <3F96F8FA.6070007@fastimap.com> I'm working with the triangular pdf and I find that this is not normalized. Given this values of x: [ 1. 1.2 1.4 1.6 1.8 2. 2.2 2.4 2.6 2.8 3. ] and getting the values of the triangular pdf as follows: scipy.stats.triang.pdf(x, distr_shape, loc = distr_location, scale = distr_scale) where distr_shape = 2.0 distr_location = 1.0 distr_scale = 0.5 that represents a triangular distribution with min = 1, mode = 2, and max = 3 the results are: [ 0. 0.2 0.4 0.6 0.8 1. 0.8 0.6 0.4 0.2 0. ] Shouldn't the results be always less than 1 and the integral of the pdf from min to x equal to the cdf? Is there any other method that normalize the pdf? 
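One quick way to check the normalization numerically is to integrate the pdf over its support and verify that the area comes out as 1, even though individual pdf values exceed 1. A sketch, anticipating the parameterization worked out in the reply below (c=0.5, loc=1, scale=2 for a min=1, mode=2, max=3 triangle) and assuming scipy.integrate.quad is available:

from scipy import stats, integrate

# integrate the triangular pdf over its support [1, 3]; the result should be 1
area, err = integrate.quad(lambda x: stats.triang.pdf(x, 0.5, loc=1.0, scale=2.0),
                           1.0, 3.0)
print area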
Thanks, Santiago From ehaux at ucmerced.edu Wed Oct 22 18:06:37 2003 From: ehaux at ucmerced.edu (ehaux at ucmerced.edu) Date: Wed, 22 Oct 2003 15:06:37 -0700 Subject: [SciPy-user] can't get past special.j0 Message-ID: <667f989d.989d667f@ucmerced.edu> Thanks Travis, that worked. now i just have to figure out the following: >>> z = special.j0(r) >>> gplt.surf(z) Traceback (most recent call last): File "", line 1, in ? File "/usr/lib/python2.3/site-packages/scipy/gplt/ interface.py", line 164, in surf apply(_active.surf,data) File "/usr/lib/python2.3/site-packages/scipy/gplt/ pyPlot.py", line 437, in surf self._init_plot() File "/usr/lib/python2.3/site-packages/scipy/gplt/ pyPlot.py", line 702, in _init_plot self._send('reset') File "/usr/lib/python2.3/site-packages/scipy/gplt/ pyPlot.py", line 820, in _send self.g.flush() IOError: [Errno 32] Broken pipe Eric From kern at ugcs.caltech.edu Wed Oct 22 18:20:15 2003 From: kern at ugcs.caltech.edu (Robert Kern) Date: Wed, 22 Oct 2003 15:20:15 -0700 Subject: [SciPy-user] Another problem with triangular pdf In-Reply-To: <3F96F8FA.6070007@fastimap.com> References: <3F96F8FA.6070007@fastimap.com> Message-ID: <20031022222015.GA27783@riyal.ugcs.caltech.edu> On Wed, Oct 22, 2003 at 04:39:06PM -0500, Santiago Erquicia wrote: > I'm working with the triangular pdf and I find that this is not normalized. > > Given this values of x: > [ 1. 1.2 1.4 1.6 1.8 2. 2.2 2.4 2.6 2.8 3. ] > > and getting the values of the triangular pdf as follows: > scipy.stats.triang.pdf(x, distr_shape, loc = distr_location, scale = > distr_scale) > > where > distr_shape = 2.0 > distr_location = 1.0 > distr_scale = 0.5 > > that represents a triangular distribution with min = 1, mode = 2, and > max = 3 Not quite. distr_shape = 0.5; distr_scale = 2.0 is I think what you meant. > the results are: > [ 0. 0.2 0.4 0.6 0.8 1. 0.8 0.6 0.4 0.2 0. ] > > Shouldn't the results be always less than 1 No. All that matters is that the integral between the limits ([loc, loc+scale] in this case) be 1 and the pdf is always non-negative. In general, the value of the pdf at any point x has units like 1/x (If x is a random variable in length, for example, the pdf(x) has units 1/length). Think about if you shrink the scale factor. To keep the integral equal to 1, the peak must increase. > and the integral of the pdf > from min to x equal to the cdf? Yes. The area under the triangle should equal 1. The area under a triangle is width*height/2 = 2*1/2 = 1 in this case. The pdf is correct. -- Robert Kern kern at ugcs.caltech.edu "In the fields of Hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From covino at mi.astro.it Thu Oct 23 04:06:40 2003 From: covino at mi.astro.it (Stefano Covino) Date: Thu, 23 Oct 2003 10:06:40 +0200 Subject: [SciPy-user] table matching Message-ID: <200310231006.40020.covino@mi.astro.it> Hi! Let's assume to have two sets of points on a plane identified by their coordinates x,y. If you like you can think to positions of objects as seen from an airplanes in two different pictures. The pictures have the same spatial orientation and scale, but can be shifted both in X and Y. My goal is to match tho two sets of points finding the common objects within a given tolerance and the relative shift DX, DY of the best match. Of course the problem is well known and there are a lot of algorithms to face with it. However, I wonder if in the present SciPy release there is a python tool already developed to apply in this case. 
Thank you in advance, Stefano From josegomez at gmx.net Thu Oct 23 05:58:55 2003 From: josegomez at gmx.net (=?iso-8859-1?q?Jos=E9=20Luis=20G=F3mez=20Dans?=) Date: Thu, 23 Oct 2003 10:58:55 +0100 Subject: [SciPy-user] table matching In-Reply-To: <200310231006.40020.covino@mi.astro.it> References: <200310231006.40020.covino@mi.astro.it> Message-ID: <200310231058.55217.josegomez@gmx.net> On Thursday 23 Oct 2003 9:06 am, Stefano Covino wrote: > However, I wonder if in the present SciPy release there is a python > tool already developed to apply in this case. I'm no scipy expert, but can't you just find out what the dx,dy displacement is by means of a 2D correlation (use FFTs to calculate that in frequency domain if you want), and then re-sample one image with respect to the other (using one of the interpolation routines in scipy)? I know I am not answering your question directly, and I am not really poiting anything novel with regards to scipy :-) Jos? -- Jos? L G?mez Dans PhD student Tel: +44 114 222 5582 Radar & Communications Group FAX; +44 870 132 2990 Department of Electronic Engineering University of Sheffield UK From covino at mi.astro.it Thu Oct 23 06:22:17 2003 From: covino at mi.astro.it (Stefano Covino) Date: Thu, 23 Oct 2003 12:22:17 +0200 Subject: [SciPy-user] table matching In-Reply-To: <200310231058.55217.josegomez@gmx.net> References: <200310231006.40020.covino@mi.astro.it> <200310231058.55217.josegomez@gmx.net> Message-ID: <200310231222.17681.covino@mi.astro.it> > > I'm no scipy expert, but can't you just find out what the dx,dy > displacement is by means of a 2D correlation (use FFTs to calculate > that in frequency domain if you want), and then re-sample one image > with respect to the other (using one of the interpolation routines in > scipy)? > Well, this is one of the possible algorithms to solve the problem. Actually I admit I am wondering if someone has already done the job for me! However the problem can in principle be a bit more complicated. It is possible that the two sets show holes, i.e. some object can be included in a list and not in the other due to some bias. In this case it would be interesting to evaluate the performances of FFTs. In any case thank you. It is anyway a possibility. Stefano From bldrake at adaptcs.com Thu Oct 23 10:21:23 2003 From: bldrake at adaptcs.com (Barry Drake) Date: Thu, 23 Oct 2003 07:21:23 -0700 (PDT) Subject: [SciPy-user] weave and compiler Message-ID: <20031023142123.13293.qmail@web104.mail.yahoo.com> This is my first time using weave. OS: Win XP pro Compilers: MS VS.Net, Enthought MingW32 distro with gcc 3.2 I haven't found a way to force weave to use mingw32 rather that msvc. Is there a configuration setting or command line arg I can use? I've checked: platform_info.py ext_tools.py build_tools.py It appears that on Windows if MSVS is installed weave and distutils always default to the MS compiler. In the fibonacci.py example code, I change the line: mod = ext_tools.ext_module('fibonacci_ext') to mod = ext_tools.ext_module('fibonacci_ext', 'mingw32') without success. Here is the output: Traceback (most recent call last): File "C:\Python23\Lib\site-packages\weave\examples\fibonacci1.py", line 71, in ? 
build_fibonacci() File "C:\Python23\Lib\site-packages\weave\examples\fibonacci1.py", line 66, in build_fibonacci mod.compile() File "..\ext_tools.py", line 340, in compile verbose = verbose, **kw) File "..\build_tools.py", line 240, in build_extension compiler_dir = platform_info.get_compiler_dir(compiler_name) File "..\platform_info.py", line 110, in get_compiler_dir compiler_obj = create_compiler_instance(dist) File "..\platform_info.py", line 50, in create_compiler_instance compiler = new_compiler(compiler=compiler_name) File ".\distutils\ccompiler.py", line 1173, in new_compiler File ".\distutils\msvccompiler.py", line 212, in __init__ distutils.errors.DistutilsPlatformError: Python was built with version 6 of Visual Studio, and extensions need to be built with the same version of the compiler, but it isn't installed. How can I force the use of a different compiler? Thanks. Barry From jdhunter at ace.bsd.uchicago.edu Thu Oct 23 12:20:40 2003 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Thu, 23 Oct 2003 11:20:40 -0500 Subject: [SciPy-user] matfile now under BSD In-Reply-To: <3F97FF09.2050809@ion.le.ac.uk> (Nigel Wade's message of "Thu, 23 Oct 2003 17:17:13 +0100") References: <3F8D47C1.4070700@ion.le.ac.uk> <3F97FF09.2050809@ion.le.ac.uk> Message-ID: >>>>> "Nigel" == Nigel Wade writes: John Hunter wrote: >> If you decide to release it under one the above licenses or a >> variant, that would be excellent. You can either email me or >> better yet, the scipy list directly. Nigel> Sorry about the delay, but I got sidetracked onto other Nigel> things... Nigel> I've changed the license in matfile so it's now the BSD Nigel> license. The tar file on Nigel> ftp://ion.le.ac.uk/matfile/matfile.tar.gz now contains this Nigel> new license. Excellent. Thanks a lot. Nigel> I'd notify the scipy mailing list but I don't know it's Nigel> location. CC'd on this email Nigel> I have no real idea of the portability of the C code, Nigel> either onto other OSs, or with differing versions of Nigel> Numeric. With that in mind, I originally set out to write a Nigel> verions entirely in python/Numeric. This has prompted me to Nigel> go back to that and I'm currently in the process of doing Nigel> it. What it's speed will be like is anyone's guess, but I Nigel> guess that's the price you pay for portability - better to Nigel> run slow than not at all. Sounds great. Let me know when you have a beta version and I'll help test it with some of my matfiles. Some of the scipy gurus here might be able to give you some pointers on how to make it fit naturally into scipy. Thanks again. John Hunter From lanceboyle at myrealbox.com Fri Oct 24 02:19:33 2003 From: lanceboyle at myrealbox.com (Lance Boyle) Date: Thu, 23 Oct 2003 23:19:33 -0700 Subject: [SciPy-user] Newbie Q on array indexing Message-ID: <08F4CE09-05EA-11D8-8453-003065F93FF0@myrealbox.com> I find the Python restriction that array indexing must start with 0 to be not only unnatural and awkward when such indexing doesn't match the problem at hand, but also an excellent way to generate errors in my code. With all due apologies, this seems like a bad (and unnecessary) language design, since I have a Pascal and Ada background. My question is--does SciPy or any other Python package provide a way around this limitation? Is there a work-around that doesn't cause a major slowdown in execution speed? Does using a dictionary make sense? 
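Nothing in scipy or Numeric changes the indexing base, but where an offset genuinely reads better, a thin wrapper that shifts the index before handing it to the underlying array is a common do-it-yourself workaround. A sketch (scalar integer indices only; the class name is made up for illustration):

import Numeric

class OffsetArray:
    # wrap a Numeric array so indexing starts at `base` (e.g. 1) instead of 0;
    # only plain integer indices are handled here -- slices and multi-dimensional
    # indexing would need extra work
    def __init__(self, data, base=1):
        self.data = Numeric.asarray(data)
        self.base = base
    def __getitem__(self, index):
        return self.data[index - self.base]
    def __setitem__(self, index, value):
        self.data[index - self.base] = value
    def __len__(self):
        return len(self.data)

v = OffsetArray([10.0, 20.0, 30.0])
print v[1], v[3]        # first and last elements, 1-based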
Jerry From pearu at scipy.org Fri Oct 24 03:13:29 2003 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 24 Oct 2003 02:13:29 -0500 (CDT) Subject: [SciPy-user] Newbie Q on array indexing In-Reply-To: <08F4CE09-05EA-11D8-8453-003065F93FF0@myrealbox.com> Message-ID: On Thu, 23 Oct 2003, Lance Boyle wrote: > I find the Python restriction that array indexing must start with 0 to > be not only unnatural and awkward when such indexing doesn't match the > problem at hand, but also an excellent way to generate errors in my > code. With all due apologies, this seems like a bad (and unnecessary) > language design, since I have a Pascal and Ada background. Since my mother tongue is Estonian and my English is not so good, then it does not mean that English is "bad and unnecessary" (and all English speaking persons should learn Estonian, which is not so bad idea from my point of view;-). > My question is--does SciPy or any other Python package provide a way > around this limitation? Is there a work-around that doesn't cause a > major slowdown in execution speed? Does using a dictionary make sense? Please, google comp.lang.python for "zero indexing". It gives plenty of reasons why indexing from 0 is good or bad; and also gives hints for workarounds on the subject. I don't think that it is reasonable for SciPy to start supporting indexing from any number than zero. It's waste of time and resources, IMHO. Regards, Pearu From wagner.nils at vdi.de Fri Oct 24 09:01:14 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Fri, 24 Oct 2003 15:01:14 +0200 Subject: [SciPy-user] Matlab to scipy Message-ID: <20031024130403.891363EB35@www.scipy.com> Dear experts, I tried to convert the following Matlab code to scipy. (LU-factorization with complete pivoting) http://www1.mate.polimi.it/~calnum/programs.html However it failed. 
Any help would be appreciated Nils function [L,U,P,Q] = LUpivtot(A,n) P=eye(n); Q=P; Minv=P; for k=1:n-1 [Pk,Qk]=pivot(A,k,n); A=Pk*A*Qk; [Mk,Mkinv]=MGauss(A,k,n); A=Mk*A; P=Pk*P; Q=Q*Qk; Minv=Minv*Pk*Mkinv; end U=triu(A); L=P*Minv; function [Mk,Mkinv]=MGauss(A,k,n) Mk=eye(n); for i=k+1:n Mk(i,k)=-A(i,k)/A(k,k); end Mkinv=2*eye(n)-Mk; return function [Pk,Qk]=pivot(A,k,n) [y,i]=max(abs(A(k:n,k:n))); [piv,jpiv]=max(y); ipiv=i(jpiv); jpiv=jpiv+k-1; ipiv=ipiv+k-1; Pk=eye(n); Pk(ipiv,ipiv)=0; Pk(k,k)=0; Pk(k,ipiv)=1; Pk(ipiv,k)=1; Qk=eye(n); Qk(jpiv,jpiv)=0; Qk(k,k)=0; Qk(k,jpiv)=1; Qk(jpiv,k)=1; return ------------------------------------------------------------- from scipy import * import MLab def LUpivot(A,n): P = MLab.eye(n) Q = P Minv = P for k in arange(0,n): Pk,Qk = pivot(A,k,n) A = dot(Pk,dot(A,Qk)) Mk,Mkinv = MGauss(A,k,n) A = dot(Mk,A) P = dot(Pk,P) Q = dot(Q,Qk) Minv = dot(Minv,dot(Pk,Mkinv)) U = MLab.triu(A) L = dot(P,Minv) return L,U,P,Q def MGauss(A,k,n): Mk = MLab.eye(n) for i in arange(k+1,n): Mk[i,k]=-A[i,k]/A[k,k] Mkinv = 2.0*MLab.eye(n)-Mk return Mk, Mkinv def pivot(A,k,n): y,i = MLab.max(abs(A[k:n,k:n])) piv,jpiv = MLab.max(y) ipiv=i[jpiv] jpiv=jpiv+k-1 ipiv=ipiv+k-1 Pk = MLab.eye(n) Pk[ipiv,ipiv]=0 Pk[k,k] =0 Pk[k,ipiv] =1 Pk[ipiv,k] =1 Qk=MLab.eye(n) Qk[jpiv,jpiv]=0 Qk[k,k] =0 Qk[k,jpiv] =1 Qk[jpiv,k] =1 # # Example # A = array(( [-4000.0, 2000.0, 2000.0], [ 2000.0, 0.78125, 0.0], [ 2000.0, 0.0, 0.0])) n,m=shape(A) L,U,P,Q = LUpivot(A,n) From pearu at scipy.org Fri Oct 24 09:16:52 2003 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 24 Oct 2003 08:16:52 -0500 (CDT) Subject: [SciPy-user] Matlab to scipy In-Reply-To: <20031024130403.891363EB35@www.scipy.com> Message-ID: On Fri, 24 Oct 2003, Nils Wagner wrote: > Dear experts, > > I tried to convert the following Matlab code to scipy. > (LU-factorization with complete pivoting) > http://www1.mate.polimi.it/~calnum/programs.html > > However it failed. Any help would be appreciated Notice that lines > [y,i]=max(abs(A(k:n,k:n))); in Matlab and > y,i = MLab.max(abs(A[k:n,k:n])) in Python are not equivalent. The latter just fails with exception ValueError: too many values to unpack Hopefully this hint is enough for you to fix your code, Pearu From wagner.nils at vdi.de Fri Oct 24 09:16:27 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Fri, 24 Oct 2003 15:16:27 +0200 Subject: [SciPy-user] LU factorization with c o m p l e t e pivoting Message-ID: <20031024131915.0E2323EB37@www.scipy.com> Dear all, I am wondering, if someone has written a function for LU factorization with c o m p l e t e pivoting in scipy. Nils Another Matlab implementation is given in the Matrix computation toolbox by Higham (gep.m). http://www.maths.man.ac.uk/~higham/mctoolbox/ However the conversion from Matlab to Scipy is not straightforward (at least in my judgement). From wagner.nils at vdi.de Fri Oct 24 09:39:13 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Fri, 24 Oct 2003 15:39:13 +0200 Subject: [SciPy-user] Matlab to scipy In-Reply-To: Message-ID: <20031024134201.BFDD03EB3B@www.scipy.com> ------------------- On Fri, 24 Oct 2003, Nils Wagner wrote: > Dear experts, > > I tried to convert the following Matlab code to scipy. > (LU-factorization with complete pivoting) > http://www1.mate.polimi.it/~calnum/programs.html > > However it failed. Any help would be appreciated Notice that lines > [y,i]=max(abs(A(k:n,k:n))); in Matlab and > y,i = MLab.max(abs(A[k:n,k:n])) in Python are not equivalent. 
The latter just fails with exception ValueError: too many values to unpack Hopefully this hint is enough for you to fix your code, Pearu Pearu, [y,i] = max(X) stores the indices of the maximum values in vector i. If there are several identical maximum values, the index of the first one found is returned. So y = MLab.max(abs(A[k:n,k:n])) But, how can I evaluate i in scipy ? Nils _______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user From pearu at scipy.org Fri Oct 24 10:35:38 2003 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 24 Oct 2003 09:35:38 -0500 (CDT) Subject: [SciPy-user] Matlab to scipy In-Reply-To: <20031024134201.BFDD03EB3B@www.scipy.com> Message-ID: On Fri, 24 Oct 2003, Nils Wagner wrote: > [y,i] = max(X) stores the indices of the maximum values in > vector i. If there are several identical maximum values, the > index of the first one found is returned. > So > y = MLab.max(abs(A[k:n,k:n])) > But, how can I evaluate i in scipy ? Try i?amax functions from linalg.fblas, For example, >>> A array([[ 3., 4., 1.], [ 1., 2., 3.], [-2., -1., 0.]]) >>> i=map(linalg.fblas.idamax,transpose(A)) >>> y=map(lambda a,j:a[j],transpose(A),i) >>> y,i ([3.0, 4.0, 3.0], [0, 0, 1]) while in Matlab: >> [y,i]=max(A) y = 3 4 3 i = 1 1 2 HTH, Pearu From chris at fonnesbeck.org Fri Oct 24 10:49:17 2003 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Fri, 24 Oct 2003 10:49:17 -0400 Subject: [SciPy-user] CVS build on OSX segfaults Message-ID: <3E25D148-0631-11D8-88DF-000A956FDAC0@fonnesbeck.org> I am trying to build the latest CVS of scipy on OSX, but I run into a segmentation fault when generating the flapack interface: Running generate build/temp.darwin-6.8-Power_Macintosh-2.3/Lib/linalg/flapack.pyf generating flapack interface Segmentation fault this has not occurred with past CVS builds, so I am wondering if anyone else has had a similar result, or ideas as to why this may be happening. Thanks, -- Christopher J. Fonnesbeck (c h r i s @ f o n n e s b e c k . o r g) GA Coop. Fish & Wildlife Research Unit, University of Georgia From pearu at scipy.org Fri Oct 24 10:59:52 2003 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 24 Oct 2003 09:59:52 -0500 (CDT) Subject: [SciPy-user] CVS build on OSX segfaults In-Reply-To: <3E25D148-0631-11D8-88DF-000A956FDAC0@fonnesbeck.org> Message-ID: On Fri, 24 Oct 2003, Christopher Fonnesbeck wrote: > I am trying to build the latest CVS of scipy on OSX, but I run into a > segmentation fault when generating the flapack interface: > > Running generate > build/temp.darwin-6.8-Power_Macintosh-2.3/Lib/linalg/flapack.pyf > generating flapack interface > Segmentation fault > > this has not occurred with past CVS builds, so I am wondering if anyone > else has had a similar result, or ideas as to why this may be happening. Could determine exactly what part of the building process is causing this segfault? What lapack/atlas libraries are you using? Pearu From wagner.nils at vdi.de Fri Oct 24 11:13:53 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Fri, 24 Oct 2003 17:13:53 +0200 Subject: [SciPy-user] Matlab to scipy In-Reply-To: Message-ID: <20031024151641.C15543EB36@www.scipy.com> ------------------- On Fri, 24 Oct 2003, Nils Wagner wrote: > [y,i] = max(X) stores the indices of the maximum values in > vector i. If there are several identical maximum values, the > index of the first one found is returned. 
> So > y = MLab.max(abs(A[k:n,k:n])) > But, how can I evaluate i in scipy ? Try i?amax functions from linalg.fblas, For example, >>> A array([[ 3., 4., 1.], [ 1., 2., 3.], [-2., -1., 0.]]) >>> i=map(linalg.fblas.idamax,transpose(A)) >>> y=map(lambda a,j:a[j],transpose(A),i) >>> y,i ([3.0, 4.0, 3.0], [0, 0, 1]) while in Matlab: >> [y,i]=max(A) y = 3 4 3 i = 1 1 2 HTH, Pearu Pearu, I have still some problems with the conversion. Maybe this is due to different indexing. I have enclosed the program. Hopefully someone will find the bug. Afaik, only partial pivoting is available in the current version of scipy. Might it be an idea to add complete pivoting ? Thank you again. Nils _______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user -------------- next part -------------- A non-text attachment was scrubbed... Name: complete.py Type: application/octet-stream Size: 1311 bytes Desc: not available URL: From chris at fonnesbeck.org Fri Oct 24 15:41:32 2003 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Fri, 24 Oct 2003 15:41:32 -0400 Subject: [SciPy-user] CVS build on OSX seg In-Reply-To: Message-ID: <1231785F-065A-11D8-BEB7-000A956FDAC0@fonnesbeck.org> It appears that it is macpython 2.3 that is the problem; building on python 2.2 on the same machine works fine. It is interesting that the same 2.3 installation successfully built scipy only a couple months ago. Since all my current development is based on 2.3, I guess I will have to wait until the problem is resolved. Thanks, Chris On Friday, October 24, 2003, at 02:34 PM, Pearu Peterson wrote: > > > On 24 Oct 2003, Chris Fonnesbeck wrote: > >> -> Ok, I got the linalg build to run, but it also gives a >> segmentation fault at the same place. > > This means that your Python is crashing and probably when executing > functions from interface_gen.py. Try: > > cd Lib/linalg > python interface_gen.py > > Note that when this gives segfault then there must be something wrong > with > Python, recent changes in scipy did not involve this file. > > Pearu > > > -- Christopher J. Fonnesbeck (c h r i s @ f o n n e s b e c k . o r g) GA Coop. 
Fish & Wildlife Research Unit, University of Georgia From zunbeltz at lcdx00.wm.lc.ehu.es Wed Oct 22 05:04:38 2003 From: zunbeltz at lcdx00.wm.lc.ehu.es (Zunbeltz Izaola) Date: Wed, 22 Oct 2003 11:04:38 +0200 (MET DST) Subject: [SciPy-user] error testing in linux Message-ID: Hi I've build scipy for linux and i got the folowing error when testing scipy.test(level=10) otuput : unction f6 cc.bisect : 4.560 cc.ridder : 5.460 cc.brenth : 5.240 cc.brentq : 4.990 ..F segment failure Information obtained from python -c 'from f2py2e.diagnose import run;run()' is: ----- os.name='posix' ------ sys.platform='linux2' ------ sys.version: 2.2.3 (#1, Oct 21 2003, 08:32:13) [GCC 2.95.3 20010315 (SuSE)] ------ sys.prefix: /usr/local ------ sys.path=':/usr/local/dislin/python:/usr/lib/python2.2/site-packages:/usr/local/lib/python2.2:/usr/local/lib/python2.2/plat-linux2:/usr/local/lib/python2.2/lib-tk:/usr/local/lib/python2.2/lib-dynload:/usr/local/lib/python2.2/site-packages:/usr/local/lib/python2.2/site-packages/Numeric' ------ Found Numeric version '22.0' in /usr/local/lib/python2.2/site-packages/Numeric/Numeric.pyc ------ Found f2py2e version '2.37.233-1545' in /usr/lib/python2.2/site-packages/f2py2e/f2py2e.pyc ------ Found scipy_distutils version '0.2.0_alpha_3.288' in '/usr/local/lib/python2.2/site-packages/scipy_distutils/__init__.pyc' Importing scipy_distutils.command.build_flib ... ok ------ Checking availability of supported Fortran compilers: detecting Absoft Fortran compiler... f77 -V -c /tmp/@1801.0__dummy.f -o /tmp/@1801.0__dummy.o 256: f77: invalid version number format detecting SGI Fortran compiler... f90 -version 32512: sh: f90: command not found detecting Forte Fortran compiler... f90 -V 32512: sh: f90: command not found detecting Sun Fortran compiler... f90 -V 32512: sh: f90: command not found detecting Intel Fortran compiler... ifc -FI -V -c /tmp/@1801.1__dummy.f -o /tmp/@1801.1__dummy.o 32512: sh: ifc: command not found detecting Itanium Fortran compiler... efc -FI -V -c /tmp/@1801.2__dummy.f -o /tmp/@1801.2__dummy.o 32512: sh: efc: command not found detecting NAG Fortran compiler... f95 -V 32512: sh: f95: command not found detecting Compaq Fortran compiler... fort -version 32512: sh: fort: command not found detecting VAST Fortran compiler... vf90 -v 32512: sh: vf90: command not found f90 +version 32512: sh: f90: command not found detecting F Fortran compiler... F -V 32512: sh: F: command not found detecting Lahey Fortran compiler... lf95 --version 32512: sh: lf95: command not found running gnu_fortran_compiler.find_lib_directories g77 -v detecting Gnu Fortran compiler... g77 --version found GNU Fortran 0.5.25 20010315 (release) ------ Importing scipy_distutils.command.cpuinfo ... ok CPU information: getNCPUs has_mmx is_Intel is_Pentium is_PentiumII is_i686 is_singleCPU ------ Can someone help me? Thanks in advance Zunbeltz Izaola From bldrake at adaptcs.com Fri Oct 24 21:27:53 2003 From: bldrake at adaptcs.com (Barry Drake) Date: Fri, 24 Oct 2003 18:27:53 -0700 (PDT) Subject: [SciPy-user] Weave problem Message-ID: <20031025012753.78262.qmail@web105.mail.yahoo.com> When I try to run the fibonacci.py that came installed with Enthought Python 2.3 dist., I get the following: Traceback (most recent call last): File "C:\Python23\Lib\site-packages\weave\examples\fibonacci1.py", line 71, in ? 
build_fibonacci() File "C:\Python23\Lib\site-packages\weave\examples\fibonacci1.py", line 66, in build_fibonacci mod.compile() File "..\ext_tools.py", line 340, in compile verbose = verbose, **kw) File "..\build_tools.py", line 240, in build_extension compiler_dir = platform_info.get_compiler_dir(compiler_name) File "..\platform_info.py", line 110, in get_compiler_dir compiler_obj = create_compiler_instance(dist) File "..\platform_info.py", line 50, in create_compiler_instance compiler = new_compiler(compiler=compiler_name) File ".\distutils\ccompiler.py", line 1173, in new_compiler File ".\distutils\msvccompiler.py", line 212, in __init__ distutils.errors.DistutilsPlatformError: Python was built with version 6 of Visual Studio, and extensions need to be built with the same version of the compiler, but it isn't installed. Does this mean I have to go out and buy MSVC 6.0 and uninstall MSVC 7.1, or rebuild Python 2.3 from the ground up using my MSVC 7.1? Is there some reason that Python 2.3 was not built with mingw32, which is supplied with the Enthought Python 2.3 dist.? It seems to me that the latter would make the use of weave and other tools seamless regardless of which MS*^(*&(* is installed. Does anyone know of a less painful way to get around the above problem? Please tell me I'm missing something real simple; so far all of my searching and digging has yielded nothing. Thanks. Barry From travis at enthought.com Fri Oct 24 22:17:39 2003 From: travis at enthought.com (Travis N. Vaught) Date: Fri, 24 Oct 2003 21:17:39 -0500 Subject: [SciPy-user] Weave problem In-Reply-To: <20031025012753.78262.qmail@web105.mail.yahoo.com> Message-ID: <0b4e01c39a9e$2d20a410$0200a8c0@tvlaptop> > -----Original Message----- > From: scipy-user-bounces at scipy.net [mailto:scipy-user-bounces at scipy.net] > On Behalf Of Barry Drake > Sent: Friday, October 24, 2003 7:28 PM > To: SciPy Users > Subject: [SciPy-user] Weave problem > > When I try to run the fibonacci.py that came installed > compiler = new_compiler(compiler=compiler_name) > File ".\distutils\ccompiler.py", line 1173, in > new_compiler > File ".\distutils\msvccompiler.py", line 212, in > __init__ > distutils.errors.DistutilsPlatformError: Python was > built with version 6 of Visual Studio, and extensions > need to be built with the same version of the > compiler, but it isn't installed. > > Does this mean I have to go out and buy MSVC 6.0 and > uninstall MSVC 7.1, or rebuild Python 2.3 from the > ground up using my MSVC 7.1? Is there some reason > that Python 2.3 was not built with mingw32, which is > supplied with the Enthought Python 2.3 dist.? [TNV] You can change line 65 to: mod.compile(compiler='mingw32') and it works for me (I don't have the msvc compiler either) There is also a --compiler=mingw32 setting that works on a distutils setup.py with the build_ext switch if you happen to be building an extension module with the ext_tools. > It > seems to me that the latter would make the use of > weave and other tools seamless regardless of which > MS*^(*&(* is installed. Does anyone know of a less > painful way to get around the above problem? Please > tell me I'm missing something real simple; so far all > of my searching and digging has yielded nothing. > Apologies for the 'broken' example (broken=)...the docs and examples are lagging a bit.
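For anyone adapting this fix to their own ext_tools module, a minimal sketch follows; the module name 'demo_ext' and the function 'square' below are illustrative only, and the compiler keyword in mod.compile() is the part taken from the advice above.

from weave import ext_tools

def build_demo():
    mod = ext_tools.ext_module('demo_ext')
    a = 1.0  # the sample value fixes the C type of the argument
    code = "return_val = a*a;"
    mod.add_function(ext_tools.ext_function('square', code, ['a']))
    # Name the compiler explicitly so weave does not go looking for the
    # MSVC 6 toolchain that built the Python binary.
    mod.compile(compiler='mingw32')

if __name__ == '__main__':
    build_demo()
    import demo_ext
    print demo_ext.square(3.0)

Running the script builds the extension with mingw32 and then imports it, much as build_fibonacci() does in the shipped example.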
Regards, Travis From bldrake at adaptcs.com Fri Oct 24 23:57:11 2003 From: bldrake at adaptcs.com (Barry Drake) Date: Fri, 24 Oct 2003 20:57:11 -0700 (PDT) Subject: [SciPy-user] Weave problem In-Reply-To: <0b4e01c39a9e$2d20a410$0200a8c0@tvlaptop> Message-ID: <20031025035711.44488.qmail@web103.bizmail.yahoo.com> It works now! Many, many thanks for your prompt response and easy solution (the error message looked much more ominous). Barry --- "Travis N. Vaught" wrote: > > -----Original Message----- > > From: scipy-user-bounces at scipy.net > [mailto:scipy-user-bounces at scipy.net] > > On Behalf Of Barry Drake > > Sent: Friday, October 24, 2003 7:28 PM > > To: SciPy Users > > Subject: [SciPy-user] Weave problem > > > > When I try to run the fibonacci.py that came > installed > > > > > compiler = > new_compiler(compiler=compiler_name) > > File ".\distutils\ccompiler.py", line 1173, in > > new_compiler > > File ".\distutils\msvccompiler.py", line 212, in > > __init__ > > distutils.errors.DistutilsPlatformError: Python > was > > built with version 6 of Visual Studio, and > extensions > > need to be built with the same version of the > > compiler, but it isn't installed. > > > > Does this mean I have to go out and buy MCVC 6.0 > and > > uninstall MSVC 7.1 or rebuild Python 2.3 from the > > ground up using my MSVC 7.1? Is there some reason > > that Python 2.3 was not built with mingw32, which > is > > supplied with the Enthought Python 2.3 dist.? > > [TNV] > > You can change line 65 to: > > mod.compile(compiler='mingw32') > > and it works for me (I don't have the msvc compiler > either) > > There is also --compiler=mingw32 setting that works > on a distutils > setup.py with the build_ext switch if you happen to > be building an > extension module with the ext_tools. > > > > It > > seems to me that the latter would make the use of > > weave and other tools seamless regardless of which > > MS*^(*&(* is installed. Does anyone know of a > less > > painless way to get around the above problem? > Please > > tell me I'm missing something real simple; so far > all > > of my searching and digging has yieled nothing. > > > > Apologies for the 'broken' example (broken= ms product>)...the > docs and examples are lagging a bit. > > > Regards, > > Travis > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From bldrake at adaptcs.com Sat Oct 25 00:11:17 2003 From: bldrake at adaptcs.com (Barry Drake) Date: Fri, 24 Oct 2003 21:11:17 -0700 (PDT) Subject: [SciPy-user] Weave problem In-Reply-To: <0b4e01c39a9e$2d20a410$0200a8c0@tvlaptop> Message-ID: <20031025041117.82947.qmail@web104.mail.yahoo.com> > There is also --compiler=mingw32 setting that works > on a distutils > setup.py with the build_ext switch if you happen to > be building an > extension module with the ext_tools. Travis, Since weave is providing the front end for distutils, do you know where in the weave code I could insert this option? Or is it always the case that in using weave the comple() method is always called at some point? In the latter case, you've already provided the solution. Barry --- "Travis N. 
Vaught" wrote: > > -----Original Message----- > > From: scipy-user-bounces at scipy.net > [mailto:scipy-user-bounces at scipy.net] > > On Behalf Of Barry Drake > > Sent: Friday, October 24, 2003 7:28 PM > > To: SciPy Users > > Subject: [SciPy-user] Weave problem > > > > When I try to run the fibonacci.py that came > installed > > > > > compiler = > new_compiler(compiler=compiler_name) > > File ".\distutils\ccompiler.py", line 1173, in > > new_compiler > > File ".\distutils\msvccompiler.py", line 212, in > > __init__ > > distutils.errors.DistutilsPlatformError: Python > was > > built with version 6 of Visual Studio, and > extensions > > need to be built with the same version of the > > compiler, but it isn't installed. > > > > Does this mean I have to go out and buy MCVC 6.0 > and > > uninstall MSVC 7.1 or rebuild Python 2.3 from the > > ground up using my MSVC 7.1? Is there some reason > > that Python 2.3 was not built with mingw32, which > is > > supplied with the Enthought Python 2.3 dist.? > > [TNV] > > You can change line 65 to: > > mod.compile(compiler='mingw32') > > and it works for me (I don't have the msvc compiler > either) > > There is also --compiler=mingw32 setting that works > on a distutils > setup.py with the build_ext switch if you happen to > be building an > extension module with the ext_tools. > > > > It > > seems to me that the latter would make the use of > > weave and other tools seamless regardless of which > > MS*^(*&(* is installed. Does anyone know of a > less > > painless way to get around the above problem? > Please > > tell me I'm missing something real simple; so far > all > > of my searching and digging has yieled nothing. > > > > Apologies for the 'broken' example (broken= ms product>)...the > docs and examples are lagging a bit. > > > Regards, > > Travis > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From travis at enthought.com Sat Oct 25 01:43:34 2003 From: travis at enthought.com (Travis N. Vaught) Date: Sat, 25 Oct 2003 00:43:34 -0500 Subject: [SciPy-user] Weave problem In-Reply-To: <20031025041117.82947.qmail@web104.mail.yahoo.com> Message-ID: <0b8801c39aba$f182edb0$0200a8c0@tvlaptop> > -----Original Message----- > From: scipy-user-bounces at scipy.net [mailto:scipy-user-bounces at scipy.net] > On Behalf Of Barry Drake > Sent: Friday, October 24, 2003 10:11 PM > To: SciPy Users List > Subject: RE: [SciPy-user] Weave problem > > > There is also --compiler=mingw32 setting that works > > on a distutils > > setup.py with the build_ext switch if you happen to > > be building an > > extension module with the ext_tools. > > Travis, > Since weave is providing the front end for distutils, > do you know where in the weave code I could insert > this option? Or is it always the case that in using > weave the comple() method is always called at some > point? In the latter case, you've already provided > the solution. > > Barry I think both are the case. You could have run fibonacci.py with: C:\Python23\Lib\site-packages\weave\examples>c:\python23\python fibonacci.py build_ext --compiler=mingw32 Without having the compiler specified in the module (I think). The --compiler switch is only available through with build_ext. Regards Travis > > --- "Travis N. 
Vaught" wrote: > > > -----Original Message----- > > > From: scipy-user-bounces at scipy.net > > [mailto:scipy-user-bounces at scipy.net] > > > On Behalf Of Barry Drake > > > Sent: Friday, October 24, 2003 7:28 PM > > > To: SciPy Users > > > Subject: [SciPy-user] Weave problem > > > > > > When I try to run the fibonacci.py that came > > installed > > > > > > > > > compiler = > > new_compiler(compiler=compiler_name) > > > File ".\distutils\ccompiler.py", line 1173, in > > > new_compiler > > > File ".\distutils\msvccompiler.py", line 212, in > > > __init__ > > > distutils.errors.DistutilsPlatformError: Python > > was > > > built with version 6 of Visual Studio, and > > extensions > > > need to be built with the same version of the > > > compiler, but it isn't installed. > > > > > > Does this mean I have to go out and buy MCVC 6.0 > > and > > > uninstall MSVC 7.1 or rebuild Python 2.3 from the > > > ground up using my MSVC 7.1? Is there some reason > > > that Python 2.3 was not built with mingw32, which > > is > > > supplied with the Enthought Python 2.3 dist.? > > > > [TNV] > > > > You can change line 65 to: > > > > mod.compile(compiler='mingw32') > > > > and it works for me (I don't have the msvc compiler > > either) > > > > There is also --compiler=mingw32 setting that works > > on a distutils > > setup.py with the build_ext switch if you happen to > > be building an > > extension module with the ext_tools. > > > > > > > It > > > seems to me that the latter would make the use of > > > weave and other tools seamless regardless of which > > > MS*^(*&(* is installed. Does anyone know of a > > less > > > painless way to get around the above problem? > > Please > > > tell me I'm missing something real simple; so far > > all > > > of my searching and digging has yieled nothing. > > > > > > > Apologies for the 'broken' example (broken= > ms product>)...the > > docs and examples are lagging a bit. > > > > > > Regards, > > > > Travis > > > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From pearu at scipy.org Sat Oct 25 04:58:14 2003 From: pearu at scipy.org (Pearu Peterson) Date: Sat, 25 Oct 2003 03:58:14 -0500 (CDT) Subject: [SciPy-user] error testing in linux In-Reply-To: Message-ID: On Wed, 22 Oct 2003, Zunbeltz Izaola wrote: > Hi > I've build scipy for linux and i got the folowing error when testing > scipy.test(level=10) > > otuput : > > unction f6 > > cc.bisect : 4.560 > cc.ridder : 5.460 > cc.brenth : 5.240 > cc.brentq : 4.990 > > > > ..F > > segment failure > Can someone help me? First check that scipy were build against correct Numeric header files (compare the header files in Numeric source distribution and the installed header files). If that is ok then try using scipy from its recent CVS (and don't forget `rm -rf build` before building scipy). Pearu From bldrake at adaptcs.com Sat Oct 25 13:14:43 2003 From: bldrake at adaptcs.com (Barry Drake) Date: Sat, 25 Oct 2003 10:14:43 -0700 (PDT) Subject: [SciPy-user] Weave problem In-Reply-To: <0b8801c39aba$f182edb0$0200a8c0@tvlaptop> Message-ID: <20031025171443.8225.qmail@web105.mail.yahoo.com> > fibonacci.py build_ext --compiler=mingw32 This worked well and is what I will use for my own code. 
I just ran all of the test code in the /weave/tests subdirectory by forcing the compiler usage in calls to compile() and inline(). Most tests passed. The scxx and wx tests didn't pass since I couldn't find a way to nonintrusively force the compiler to mingw32. Thank you for your help. Barry --- "Travis N. Vaught" wrote: > > > > -----Original Message----- > > From: scipy-user-bounces at scipy.net > [mailto:scipy-user-bounces at scipy.net] > > On Behalf Of Barry Drake > > Sent: Friday, October 24, 2003 10:11 PM > > To: SciPy Users List > > Subject: RE: [SciPy-user] Weave problem > > > > > There is also --compiler=mingw32 setting that > works > > > on a distutils > > > setup.py with the build_ext switch if you happen > to > > > be building an > > > extension module with the ext_tools. > > > > Travis, > > Since weave is providing the front end for > distutils, > > do you know where in the weave code I could insert > > this option? Or is it always the case that in > using > > weave the comple() method is always called at some > > point? In the latter case, you've already > provided > > the solution. > > > > Barry > > I think both are the case. You could have run > fibonacci.py with: > > C:\Python23\Lib\site-packages\weave\examples>c:\python23\python > fibonacci.py build_ext --compiler=mingw32 > > Without having the compiler specified in the module > (I think). > > The --compiler switch is only available through with > build_ext. > > Regards > > Travis > > > > > --- "Travis N. Vaught" > wrote: > > > > -----Original Message----- > > > > From: scipy-user-bounces at scipy.net > > > [mailto:scipy-user-bounces at scipy.net] > > > > On Behalf Of Barry Drake > > > > Sent: Friday, October 24, 2003 7:28 PM > > > > To: SciPy Users > > > > Subject: [SciPy-user] Weave problem > > > > > > > > When I try to run the fibonacci.py that came > > > installed > > > > > > > > > > > > > compiler = > > > new_compiler(compiler=compiler_name) > > > > File ".\distutils\ccompiler.py", line 1173, > in > > > > new_compiler > > > > File ".\distutils\msvccompiler.py", line > 212, in > > > > __init__ > > > > distutils.errors.DistutilsPlatformError: > Python > > > was > > > > built with version 6 of Visual Studio, and > > > extensions > > > > need to be built with the same version of the > > > > compiler, but it isn't installed. > > > > > > > > Does this mean I have to go out and buy MCVC > 6.0 > > > and > > > > uninstall MSVC 7.1 or rebuild Python 2.3 from > the > > > > ground up using my MSVC 7.1? Is there some > reason > > > > that Python 2.3 was not built with mingw32, > which > > > is > > > > supplied with the Enthought Python 2.3 dist.? > > > > > > [TNV] > > > > > > You can change line 65 to: > > > > > > mod.compile(compiler='mingw32') > > > > > > and it works for me (I don't have the msvc > compiler > > > either) > > > > > > There is also --compiler=mingw32 setting that > works > > > on a distutils > > > setup.py with the build_ext switch if you happen > to > > > be building an > > > extension module with the ext_tools. > > > > > > > > > > It > > > > seems to me that the latter would make the use > of > > > > weave and other tools seamless regardless of > which > > > > MS*^(*&(* is installed. Does anyone know of a > > > less > > > > painless way to get around the above problem? > > > Please > > > > tell me I'm missing something real simple; so > far > > > all > > > > of my searching and digging has yieled > nothing. 
> > > > > > > > > > Apologies for the 'broken' example > (broken= > > ms product>)...the > > > docs and examples are lagging a bit. > > > > > > > > > Regards, > > > > > > Travis > > > > > > > > > _______________________________________________ > > > SciPy-user mailing list > > > SciPy-user at scipy.net > > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From bldrake at adaptcs.com Sun Oct 26 17:23:24 2003 From: bldrake at adaptcs.com (Barry Drake) Date: Sun, 26 Oct 2003 14:23:24 -0800 (PST) Subject: [SciPy-user] cluster module Message-ID: <20031026222324.40700.qmail@web102.bizmail.yahoo.com> The vq.py example tries to import the scipy.cluster.vq module, which raises an import exception. Does anyone know where this module can be downloaded? It wasn't installed with the Enthought distro of Python 2.3 and I couldn't find it in a search of the SciPy site or mail list. Thanks. Barry Drake From bldrake at adaptcs.com Sun Oct 26 17:57:10 2003 From: bldrake at adaptcs.com (Barry Drake) Date: Sun, 26 Oct 2003 14:57:10 -0800 (PST) Subject: [SciPy-user] cluster module In-Reply-To: <20031026222324.40700.qmail@web102.bizmail.yahoo.com> Message-ID: <20031026225710.18239.qmail@web104.mail.yahoo.com> Found vq.py and related files in the SciPy CVS repository in the cluster subdirectory. bld --- Barry Drake wrote: > The vq.py example tries to import the > scipy.cluster.vq > module, which raises an import exception. Does > anyone > know where this module can be downloaded? It wasn't > installed with the Enthought distro of Python 2.3 > and > I couldn't find it in a search of the SciPy site or > mail list. > > Thanks. > Barry Drake > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From wagner.nils at vdi.de Mon Oct 27 07:27:30 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Mon, 27 Oct 2003 13:27:30 +0100 Subject: [SciPy-user] output format Message-ID: <20031027123033.DAB773EB1B@www.scipy.com> Dear experts, Is it possible to remove e+xx in the output generated by io.write_array(file,B4,...) Actually, the entries of B4 are integers. >>> B4 array([[ 4.25424070e+10, 3.81028483e+10, 2.12712035e+10], [ 3.36632897e+10, 3.01503208e+10, 1.68316448e+10], [ 8.87911734e+09, 7.95252750e+09, 4.43955867e+09]]) Nils From ransom at physics.mcgill.ca Mon Oct 27 10:02:59 2003 From: ransom at physics.mcgill.ca (Scott Ransom) Date: Mon, 27 Oct 2003 10:02:59 -0500 Subject: [SciPy-user] output format In-Reply-To: <20031027123033.DAB773EB1B@www.scipy.com> References: <20031027123033.DAB773EB1B@www.scipy.com> Message-ID: <20031027150259.GB7008@spock.physics.mcgill.ca> Hi Nils, On Mon, Oct 27, 2003 at 01:27:30PM +0100, Nils Wagner wrote: > Dear experts, > > Is it possible to remove e+xx in the output > generated by io.write_array(file,B4,...) > Actually, the entries of B4 are integers. > > >>> B4 > array([[ 4.25424070e+10, 3.81028483e+10, 2.12712035e+10], > [ 3.36632897e+10, 3.01503208e+10, 1.68316448e+10], > [ 8.87911734e+09, 7.95252750e+09, 4.43955867e+09]]) Actually, they are not integers. They are doubles. That is why they have the scientific notation. 
They may have started out as integers, but they were cast to doubles sometime along the way. You can prevent casts using the savespace() method of Numeric arrays. But note that Numeric uses 32 bit integers and so many of your values would cause a 32 bit int to overflow... Scott -- Scott M. Ransom Address: McGill Univ. Physics Dept. Phone: (514) 398-6492 3600 University St., Rm 338 email: ransom at physics.mcgill.ca Montreal, QC Canada H3A 2T8 GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 From nwagner at mecha.uni-stuttgart.de Mon Oct 27 11:25:55 2003 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 27 Oct 2003 17:25:55 +0100 Subject: [SciPy-user] output format 64 bit integers References: <20031027123033.DAB773EB1B@www.scipy.com> <20031027150259.GB7008@spock.physics.mcgill.ca> Message-ID: <3F9D4713.A35BACB6@mecha.uni-stuttgart.de> Scott Ransom schrieb: > > Hi Nils, > > On Mon, Oct 27, 2003 at 01:27:30PM +0100, Nils Wagner wrote: > > Dear experts, > > > > Is it possible to remove e+xx in the output > > generated by io.write_array(file,B4,...) > > Actually, the entries of B4 are integers. > > > > >>> B4 > > array([[ 4.25424070e+10, 3.81028483e+10, 2.12712035e+10], > > [ 3.36632897e+10, 3.01503208e+10, 1.68316448e+10], > > [ 8.87911734e+09, 7.95252750e+09, 4.43955867e+09]]) > > Actually, they are not integers. They are doubles. That is > why they have the scientific notation. They may have started > out as integers, but they were cast to doubles sometime along > the way. You can prevent casts using the savespace() method of > Numeric arrays. But note that Numeric uses 32 bit integers and > so many of your values would cause a 32 bit int to overflow... > > Scott > Will numarray or a later version of Numeric support 64 bit integers ? Nils > -- > Scott M. Ransom Address: McGill Univ. Physics Dept. > Phone: (514) 398-6492 3600 University St., Rm 338 > email: ransom at physics.mcgill.ca Montreal, QC Canada H3A 2T8 > GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From wagner.nils at vdi.de Mon Oct 27 15:13:20 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Mon, 27 Oct 2003 21:13:20 +0100 Subject: [SciPy-user] More matrix functions Message-ID: <20031027201624.EFC963EB33@www.scipy.com> Dear experts, Please find enclosed two algorithms for the computation of the matrix square root. It would be great to have this function in scipy. Any comments ? Nils Reference: Nick Higham, How and how not to compute the matrix square root, Numerical Algorithms 15(2), 227--242 (1997). -------------- next part -------------- A non-text attachment was scrubbed... Name: db.py Type: application/octet-stream Size: 1223 bytes Desc: not available URL:
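The scrubbed db.py attachment is not preserved in the archive. As a point of reference, a bare-bones Denman-Beavers iteration (one of the standard methods discussed in the Higham paper cited above) might look like the sketch below; it is illustrative only, not the original attachment, and it assumes the Numeric-era names identity, dot and linalg used elsewhere in this thread are already in the namespace.

def sqrtm_db(A, maxiter=50, tol=1e-12):
    # Coupled Denman-Beavers iteration: Y -> sqrt(A), Z -> inv(sqrt(A)).
    # Assumes A has no eigenvalues on the closed negative real axis.
    n = A.shape[0]
    Y, Z = A, identity(n)
    for k in range(maxiter):
        Yn = 0.5*(Y + linalg.inv(Z))
        Zn = 0.5*(Z + linalg.inv(Y))
        Y, Z = Yn, Zn
        if linalg.norm(dot(Y, Y) - A) <= tol*linalg.norm(A):
            break
    return Y, Z

A quick sanity check is linalg.norm(dot(Y, Y) - A) after the call, just as in the test at the end of the next message.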
From wagner.nils at vdi.de Mon Oct 27 15:40:34 2003 From: wagner.nils at vdi.de (Nils Wagner) Date: Mon, 27 Oct 2003 21:40:34 +0100 Subject: [SciPy-user] sqrtm Message-ID: <20031027204338.CB65C3EB3E@www.scipy.com> Hi all, Just another sqrtm implementation based on Pade iteration...

# (assumes a session where the Numeric/scipy names shape, arange, zeros,
#  cos, pi, dot, identity, rand, Complex and linalg are already available,
#  e.g. after something like "from scipy import *")
def sqrtm2(A,p):
    # Pade iteration: Y -> sqrt(A), Z -> inv(sqrt(A))
    m,n = shape(A)
    i = arange(1,p+1)
    xi = 0.5*(1.+cos(0.5*(2*i-1)*pi/p))
    ai2 = 1.0/xi-1.
    Y0 = A
    Z0 = identity(n)
    maxiter = 5
    for k in arange(0,maxiter):
        sum1 = zeros((n,n),Complex)
        sum2 = zeros((n,n),Complex)
        for j in arange(0,p):
            sum1 = sum1 + linalg.inv(dot(Z0,Y0)+ai2[j]*identity(n))/xi[j]
            sum2 = sum2 + linalg.inv(dot(Y0,Z0)+ai2[j]*identity(n))/xi[j]
        Y1 = dot(Y0,sum1)/p
        Z1 = dot(Z0,sum2)/p   # Z update uses sum2, the dot(Y0,Z0) version
        Y0 = Y1
        Z0 = Z1
    return Y1,Z1

# test matrix
A = rand(5,5)
# modify the spectrum so that the square root is well defined
A = dot(A,A)/25.0 + identity(5)

Y2,Z2 = sqrtm2(A,4)
print linalg.norm(dot(Y2,Y2)-A)
print linalg.norm(dot(Z2,Z2)-linalg.inv(A))

Any suggestions ? Nils From chris at fonnesbeck.org Mon Oct 27 21:05:58 2003 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Mon, 27 Oct 2003 21:05:58 -0500 Subject: [SciPy-user] weave build errors Message-ID: <4583627F-08EB-11D8-8C47-000A956FDAC0@fonnesbeck.org> I'm trying to build and install weave on OSX 10.3 using python 2.3, but get the following build error that I cannot interpret: Goldeneye:~/Development/python/weave chris$ python setup_weave.py build Traceback (most recent call last): File "setup_weave.py", line 52, in ? setup(**configuration()) File "setup_weave.py", line 9, in configuration local_path = get_path(__name__,parent_path2) TypeError: get_path() takes exactly 1 argument (2 given) I'm not sure why this is happening, so advice is welcome. Note that calling setup.py instead of setup_weave.py yields a similar result. Chris Fonnesbeck From pearu at scipy.org Tue Oct 28 00:37:35 2003 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 27 Oct 2003 23:37:35 -0600 (CST) Subject: [SciPy-user] weave build errors In-Reply-To: <4583627F-08EB-11D8-8C47-000A956FDAC0@fonnesbeck.org> Message-ID: On Mon, 27 Oct 2003, Christopher Fonnesbeck wrote: > I'm trying to build and install weave on OSX 10.3 using python 2.3, but > get the following build error that I cannot interpret: > > Goldeneye:~/Development/python/weave chris$ python setup_weave.py build > Traceback (most recent call last): > File "setup_weave.py", line 52, in ? > setup(**configuration()) > File "setup_weave.py", line 9, in configuration > local_path = get_path(__name__,parent_path2) > TypeError: get_path() takes exactly 1 argument (2 given) > > I'm not sure why this is happening, so advice is welcome. Note that > calling setup.py instead of setup_weave.py yields a similar result. You have to install recent scipy_core packages first and then proceed with the weave install. Pearu From chris at fonnesbeck.org Tue Oct 28 12:44:15 2003 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Tue, 28 Oct 2003 12:44:15 -0500 Subject: [SciPy-user] weave build errors Message-ID: <591937AC-096E-11D8-8154-000A956FDAC0@fonnesbeck.org> OK, done that. I do get the following error, however, which prevents weave from running its tests: package init file '/Users/chris/Development/python/scipy_core/scipy_base/tests/ __init__.py' not found (or not a regular file) Thanks again, cjf -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From chris at fonnesbeck.org Tue Oct 28 12:59:28 2003 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Tue, 28 Oct 2003 12:59:28 -0500 Subject: [SciPy-user] SWIG required for scipy setup?
Message-ID: <79680814-0970-11D8-8154-000A956FDAC0@fonnesbeck.org> OSX nightmares continue: It now appears that swig is required to build scipy on OSX 10.3 -- building scipy from cvs gives the following error: building agg swig: swig -c++ -python -shadow -I/Users/chris/Development/python/scipy/Lib_chaco/agg2/include agg.i sh: line 1: swig: command not found I thought scipy was independent of swig. Should I be avoiding the scipy cvs for this platform? Sorry for the hassles, but I either need to get scipy running to continue development of my own code, or move on without it. Thanks, Chris -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From pearu at scipy.org Tue Oct 28 16:36:11 2003 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 28 Oct 2003 15:36:11 -0600 (CST) Subject: [SciPy-user] SWIG required for scipy setup? In-Reply-To: <79680814-0970-11D8-8154-000A956FDAC0@fonnesbeck.org> Message-ID: On Tue, 28 Oct 2003, Christopher Fonnesbeck wrote: > OSX nightmares continue: It now appears that swig is required to build > scipy on OSX 10.3 -- building scipy from cvs gives the following error: > > building agg swig: > swig -c++ -python -shadow > -I/Users/chris/Development/python/scipy/Lib_chaco/agg2/include agg.i > sh: line 1: swig: command not found > > I thought scipy was independent of swig. Should I be avoiding the scipy > cvs for this platform? If you disable chaco packages in setup.py then no swig is required. Pearu From pearu at scipy.org Tue Oct 28 16:38:37 2003 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 28 Oct 2003 15:38:37 -0600 (CST) Subject: [SciPy-user] weave build errors In-Reply-To: <591937AC-096E-11D8-8154-000A956FDAC0@fonnesbeck.org> Message-ID: On Tue, 28 Oct 2003, Christopher Fonnesbeck wrote: > OK, done that. I do get the following error, however that prevents > weave from running its tests: > > package init file > '/Users/chris/Development/python/scipy_core/scipy_base/tests/ > __init__.py' not found (or not a regular file) This is not an error. What exactly happens when you run weave tests? In Python you should do that as follows: >>> import weave >>> weave.test() Pearu From chris at fonnesbeck.org Tue Oct 28 17:49:33 2003 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Tue, 28 Oct 2003 17:49:33 -0500 Subject: [SciPy-user] weave build errors Message-ID: Running weave.test() results in the output below, where it seems to have trouble finding most of the tests. This is why I thought the warning about not finding __init__.py in tests was important. I have installed scipy_distutils, scipy_test and scipy_base; scipy_core and the full scipy cause problems, seemingly related to gcc3.4, as unrecognized symbols pop up. >>> weave.test() Found 2 tests for weave.blitz_tools !! No test file 'test_vtk_spec.py' found for Found 0 tests for weave.c_spec !! No test file 'test_base_spec.py' found for Found 16 tests for weave.slice_handler Found 3 tests for weave.standard_array_spec Found 12 tests for weave.build_tools !! No test file 'test_common_info.py' found for !! No test file 'test_converters.py' found for !! No test file 'test_base_info.py' found for !! FAILURE importing tests for /System/Library/Frameworks/Python.framework/Versions/2.3/lib/python2.3/ site-packages/weave/tests/test_wx_spec.py:17: ImportError: No module named wxPython (in ?) !! 
No test file 'test_swigptr.py' found for building extensions here: /Users/chris/.python23_compiled/m3 Found 1 tests for weave.ext_tools Found 26 tests for weave.catalog !! No test file 'test_info_weave.py' found for !! No test file 'test_blitz_spec.py' found for Found 85 tests for weave.size_check !! No test file 'test_platform_info.py' found for Found 1 tests for weave.ast_tools !! No test file 'test_accelerate_tools.py' found for !! No test file 'test___cvs_version__.py' found for !! No test file 'test_weave_version.py' found for !! No test file 'test_dumbdbm_patched.py' found for Found 0 tests for weave.inline_tools !! No test file 'test_dumb_shelve.py' found for !! No test file 'test___init__.py' found for !! No test file 'test_bytecodecompiler.py' found for .....................warning: specified build_dir '_bad_path_' does not exist or is not writable. Trying default locations .....warning: specified build_dir '_bad_path_' does not exist or is not writable. Trying default locations ........................................................................ ................................................ ---------------------------------------------------------------------- Ran 146 tests in 2.674s OK -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From pearu at scipy.org Wed Oct 29 03:11:33 2003 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 29 Oct 2003 02:11:33 -0600 (CST) Subject: [SciPy-user] weave build errors In-Reply-To: Message-ID: On Tue, 28 Oct 2003, Christopher Fonnesbeck wrote: > Running weave.test() results in the output below, where it seems to > have trouble finding most of the tests. This is because they haven't been implemented yet and the messages are meant as a reminder to developers. > This is why I thought the > warning about not finding __init__.py in tests was important. I have > installed scipy_distutils, scipy_test and scipy_base; scipy_core and > the full scipy cause problems, seemingly related to gcc3.4, as > unrecognized symbols pop up. No, relation to gcc3.4 is irrelevant here. Additional comments follow below. > >>> weave.test() Note that weave.test has also level and verbosity arguments. You might try also weave.test(level=10,verbosity=2) that includes all available weave tests to be run. The same applies to scipy and scipy_base tests. (Btw, weave test test_scxx_object.test_object_print.check_failure causes a segfault, just you know that we know it ;) > Found 2 tests for weave.blitz_tools weave.test found 2 level=1 tests for weave.blitz_tools. > !! No test file 'test_vtk_spec.py' found for 'weave.vtk_spec' from '...te-packages/weave/vtk_spec.pyc'> The test file tests/test_vtk_spec.py is not implemented. > Found 0 tests for weave.c_spec No level=1 tests were found for weave.c_spec. However, there are higher level tests for this module. > !! FAILURE importing tests for '...ite-packages/weave/wx_spec.pyc'> > > /System/Library/Frameworks/Python.framework/Versions/2.3/lib/python2.3/ > site-packages/weave/tests/test_wx_spec.py:17: ImportError: No module > named wxPython (in ?) Module weave.wx_spec tests assume that wxPython is installed which is obviously not the case in your system. You can ignore this failure. Regards, Pearu From ariciputi at pito.com Wed Oct 29 11:34:59 2003 From: ariciputi at pito.com (Andrea Riciputi) Date: Wed, 29 Oct 2003 17:34:59 +0100 Subject: [SciPy-user] SciPy'03 proceeding. 
Message-ID: Hi there, any news about slides presented at SciPy'03? Are they available on-line at last? If not, when will they be put on-line? Thanks in advance, Andrea. --- Andrea Riciputi "Science is like sex: sometimes something useful comes out, but that is not the reason we are doing it" -- (Richard Feynman)