From baghdadi at sickkids.ca Tue Mar 1 17:30:34 2005
From: baghdadi at sickkids.ca (Leila baghdadi)
Date: 01 Mar 2005 17:30:34 -0500
Subject: [SciPy-user] help building scipy on Opteron 64 bit
Message-ID: <1109716233.1004.44.camel@mouse18.phenogenomics.ca>

Hello everyone,

I am new to this so bear with me! I have been trying to build scipy on our Opteron machine (linux enterprise 3, gcc 3.2.3) and had no success. I searched the mailing list and found the link about optimization and gcc version

http://www.scipy.net/pipermail/scipy-user/2004-November/003815.html

and decided to only build blas and lapack and see if I can get the correct answer for the example that is given in the above link. I tried with gcc 3.2.3 and no optimization and had no luck. I then built a local copy of gcc 3.4.3 and tried with/without optimization and still had no luck. My answers are

wr=9.32183,-9.59852e-16,-0.321825,wi=0,0,0

whereas the above link gives

wr=9.32183,-6.20979e-16,-0.321825,wi=0,0,0

At this point I am unable to proceed simply because I have no idea what to do; according to the above link gcc 3.4.3 should solve the problem, but I am not sure why it does not do so in my case. Any help/hints are greatly appreciated.

Leila

From andrew.swan at swiftdsl.com.au Wed Mar 2 05:20:57 2005
From: andrew.swan at swiftdsl.com.au (Andrew Swan)
Date: Wed, 02 Mar 2005 18:20:57 +0800
Subject: [SciPy-user] sparse matrix multiplication
Message-ID: <20050302102058.117023EB1A@www.scipy.com>

Hi - I am wondering if there is a problem with the following code:

from scipy_base import *
from scipy.sparse import csc_matrix, dok_matrix

x = dok_matrix()
x[0,0] = 2
x[1,1] = 1
x[2,1] = 1
y = csc_matrix(x)
yt = y.transp()
yy = yt * y
print yy.todense()

The print statement gives:

[[ 4.  0.]
 [ 2.  0.]]

when the answer I was expecting was:

array([[4, 0],
       [0, 2]])

Thanks in advance...

----
Msg sent via - People Telecom Webmail

From duncan at enthought.com Wed Mar 2 18:18:27 2005
From: duncan at enthought.com (Duncan Child)
Date: Wed, 02 Mar 2005 17:18:27 -0600
Subject: [SciPy-user] Performance question.
Message-ID: <422649C3.3040600@enthought.com>

I am working on developing algorithms that are usually called with parameters that are Numeric arrays. We have the usual challenge, though, of trying to craft code that will gracefully accept both floats and arrays. Because of the often discussed problem with the handling of zero-length arrays, we expend some effort to ensure that we don't make calls on floats that only work on arrays, and have ended up with a bunch of code like safe_len() that can be called with either.

Today we have a related problem, but now it is with performance (see code below). Numeric is faster than numarray operating on smaller arrays, but it is still relatively slow handling regular floats. We could add a fast_ series to the safe_ suite of functions, but this still entails a significant performance hit and is not exactly elegant. The problem is larger than just handling sqrt, so I would appreciate any feedback or suggestions on how best to proceed.

Thanks,

Duncan

=================================================
def safe_len(a):
    # Return the length of the input array or 1 if it is a scalar
    try:
        safelen = len(a)
    except:
        safelen = 1
    return safelen

=================================================
from scipy import arange, sqrt
from math import sqrt as csqrt
import time

# this is slower ...
start_time = time.clock()
for i in range(1000):
    a = sqrt(i)
t1 = time.clock() - start_time

# this is faster ...
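# (note: scipy's sqrt is a ufunc, so each call on a plain Python float
# pays array-wrapping and dispatch overhead, while math.sqrt is a direct
# C library call -- that overhead is what the ~70x ratio printed below
# reflects)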
start_time = time.clock()
for i in range(1000):
    a = csqrt(i)
t2 = time.clock() - start_time

print t1, t2, t1 / t2

C:\sqrt.py
0.0537227007132 0.00181048033684 68.6731754663

=================================================
from scipy import sqrt
from math import sqrt as csqrt
import types

def fast_sqrt(arg):
    if type(arg) == types.FloatType:
        return csqrt(arg)
    else:
        return sqrt(arg)

From rkern at ucsd.edu Wed Mar 2 19:33:57 2005
From: rkern at ucsd.edu (Robert Kern)
Date: Wed, 02 Mar 2005 16:33:57 -0800
Subject: [SciPy-user] Performance question.
In-Reply-To: <422649C3.3040600@enthought.com>
References: <422649C3.3040600@enthought.com>
Message-ID: <42265B75.6080805@ucsd.edu>

Duncan Child wrote:
> I am working on developing algorithms that are usually called with
> parameters that are Numeric arrays. We have the usual challenge though
> of trying to craft code that will gracefully accept both floats or
> arrays. Because of the often discussed problem with the handling of zero
> length arrays we expend some effort to ensure that we don't make calls
> on floats that only work on arrays and have ended up with a bunch of
> code like safe_len() that can be called with either.
>
> Today we have a related problem but now it is with performance (see code
> below). Numeric is faster than NumArray operating on smaller arrays but
> it is still relatively slow handling regular floats. We could add to the
> safe_ suite of functions the fast_ series but this still entails a
> significant performance hit and is not exactly elegant.

A potential solution to the performance problem, though not the elegance problem, might be Pyrex.

An untested sketch:

cdef extern from "Numeric/arrayobject.h":
    bool PyArray_Check(object x)

cdef extern from "math.h":
    double sqrt(double x)

import Numeric

def fast_sqrt(object x):
    if PyArray_Check(x):
        return Numeric.sqrt(x)
    else:
        return sqrt(x)

--
Robert Kern
rkern at ucsd.edu

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
-- Richard Harter

From pearu at scipy.org Thu Mar 3 03:48:46 2005
From: pearu at scipy.org (Pearu Peterson)
Date: Thu, 3 Mar 2005 02:48:46 -0600 (CST)
Subject: [SciPy-user] help building scipy on Opteron 64 bit
In-Reply-To: <1109716233.1004.44.camel@mouse18.phenogenomics.ca>
References: <1109716233.1004.44.camel@mouse18.phenogenomics.ca>
Message-ID: 

On Tue, 1 Mar 2005, Leila baghdadi wrote:

> Hello everyone,
>
> I am new to this so bear with me!
>
> I have been trying to build scipy on our Opteron machine (linux
> enterprise 3, gcc 3.2.3) and had no success,
>
> I searched the mailing list and found the link about optimization and
> gcc version
> http://www.scipy.net/pipermail/scipy-user/2004-November/003815.html
>
> and decided to only build blas and lapack and see if I can get the
> correct answer for the example that is given in the above link
>
> I tried with gcc 3.2.3 and no optimization and had no luck
>
> I then build a local copy of gcc 3.4.3 and tried with/without
> optimization and still have no luck
>
> my answers are
>
> wr=9.32183,-9.59852e-16,-0.321825,wi=0,0,0
>
> whereas the above link gives
> wr=9.32183,-6.20979e-16,-0.321825,wi=0,0,0

I don't see what the problem is here; the results are equal up to numerical noise. It is normal to get slightly different results when using Fortran-compiled blas/lapack or ATLAS blas/lapack.
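For instance (a minimal check with illustrative tolerances, assuming only that Numeric is importable), allclose confirms the two wr vectors agree to within floating-point noise:

import Numeric

wr_mine = Numeric.array([9.32183, -9.59852e-16, -0.321825])
wr_ref  = Numeric.array([9.32183, -6.20979e-16, -0.321825])

# the two tiny eigenvalues are both zero to machine precision, so a
# small absolute tolerance treats them as equal
print Numeric.allclose(wr_mine, wr_ref, 1e-5, 1e-12)   # prints 1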
Pearu From stephen.walton at csun.edu Thu Mar 3 19:58:20 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Thu, 03 Mar 2005 16:58:20 -0800 Subject: [SciPy-user] bdist-rpm problem Message-ID: <4227B2AC.1080200@csun.edu> Hi, All, A week or so ago, I posted to matplotlib-users about a problem with bdist_rpm. I'd asked about python 2.3 on Fedora Core 1. It turns out there are two problems. One is that even if one has python2.3 and python2.2 installed, bdist_rpm always calls the interpreter named 'python', which is 2.2 on FC1. The other problem is that in bdist_rpm.py there is a set of lines near line 307 which tests if the number of generated RPM files is 1. This fails because all of matplotlib, numeric, numarray and scipy generate a debuginfo RPM when one does 'python setup.py bdist_rpm'. (Why the RPM count doesn't fail with Python 2.3 on FC3 is beyond me, but nevermind.) The patch is at http://opensvn.csie.org/pyvault/rpms/trunk/python23/python-2.3.4-distutils-bdist-rpm.patch and I have verified that after applying this patch to /usr/lib/python2.2/distutils/command/bdist_rpm.py on FC1 that 'python setup.py bdist_rpm' works for numarray 1.2.2, scipy current CVS, and matplotlib 0.72 (after changing setup.py for python2.2 as documented in the latter). It still fails with Numeric 23.6 however for reasons I'm still checking into; the failed "setup.py bdist_rpm" claims that arraytypes.c doesn't exist. Steve Walton From stephen.walton at csun.edu Thu Mar 3 20:01:41 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Thu, 03 Mar 2005 17:01:41 -0800 Subject: [SciPy-user] bdist-rpm problem In-Reply-To: <4227B2AC.1080200@csun.edu> References: <4227B2AC.1080200@csun.edu> Message-ID: <4227B375.8000200@csun.edu> Stephen Walton wrote: > [bdist_rpm] still fails with Numeric 23.6 however for reasons I'm > still checking into; Posted too soon; this problem is fixed at Numeric 23.7. From pearu at scipy.org Thu Mar 3 20:04:06 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 3 Mar 2005 19:04:06 -0600 (CST) Subject: [SciPy-user] bdist-rpm problem In-Reply-To: <4227B2AC.1080200@csun.edu> References: <4227B2AC.1080200@csun.edu> Message-ID: On Thu, 3 Mar 2005, Stephen Walton wrote: > Hi, All, > > A week or so ago, I posted to matplotlib-users about a problem with > bdist_rpm. I'd asked about python 2.3 on Fedora Core 1. > > It turns out there are two problems. One is that even if one has python2.3 > and python2.2 installed, bdist_rpm always calls the interpreter named > 'python', which is 2.2 on FC1. Using `bdist_rpm --fix-python` should take care of this issue. Pearu From Fernando.Perez at colorado.edu Thu Mar 3 20:22:54 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 03 Mar 2005 18:22:54 -0700 Subject: [SciPy-user] bdist-rpm problem In-Reply-To: <4227B2AC.1080200@csun.edu> References: <4227B2AC.1080200@csun.edu> Message-ID: <4227B86E.6070108@colorado.edu> Stephen Walton wrote: > Hi, All, > > A week or so ago, I posted to matplotlib-users about a problem with > bdist_rpm. I'd asked about python 2.3 on Fedora Core 1. > > It turns out there are two problems. One is that even if one has > python2.3 and python2.2 installed, bdist_rpm always calls the > interpreter named 'python', which is 2.2 on FC1. The other problem is You need to 'fix' the python version to be called inside the actual rpm build. 
From the ipython release script: # A 2.4-specific RPM, where we must use the --fix-python option to ensure that # the resulting RPM is really built with 2.4 (so things go to # lib/python2.4/...) python2.4 ./setup.py bdist_rpm --release=py24 --fix-python > that in bdist_rpm.py there is a set of lines near line 307 which tests > if the number of generated RPM files is 1. This fails because all of > matplotlib, numeric, numarray and scipy generate a debuginfo RPM when > one does 'python setup.py bdist_rpm'. (Why the RPM count doesn't fail > with Python 2.3 on FC3 is beyond me, but nevermind.) The patch is at This problem has been fixed in recent 2.3 and 2.4. 2.2 still has it. Best, f From novak at ucolick.org Fri Mar 4 01:15:10 2005 From: novak at ucolick.org (Greg Novak) Date: Thu, 3 Mar 2005 22:15:10 -0800 Subject: [SciPy-user] Re: Numeric Documentation Message-ID: <20050304061510.GB2021@dionysus.ucolick.org> On a whim I searched http://www.archive.org for the PDF documentation of Numpy, and lo! they had it. I didn't think they archived binary files, but I was mistaken... I've posted it at http://www.ucolick.org/~novak/numpy.pdf. It'd be great if someone could snatch it from there and post it to the scipy web site so that it doesn't perish from the world... Thanks! Greg From Fernando.Perez at colorado.edu Fri Mar 4 01:45:36 2005 From: Fernando.Perez at colorado.edu (Fernando.Perez at colorado.edu) Date: Thu, 3 Mar 2005 23:45:36 -0700 Subject: [SciPy-user] Re: Numeric Documentation In-Reply-To: <20050304061510.GB2021@dionysus.ucolick.org> References: <20050304061510.GB2021@dionysus.ucolick.org> Message-ID: <1109918736.42280410c6ec1@webmail.colorado.edu> Quoting Greg Novak : > On a whim I searched http://www.archive.org for the PDF documentation > of Numpy, and lo! they had it. I didn't think they archived binary > files, but I was mistaken... > > I've posted it at http://www.ucolick.org/~novak/numpy.pdf. It'd be > great if someone could snatch it from there and post it to the scipy > web site so that it doesn't perish from the world... Mmh, you might want to note that: http://numeric.scipy.org/ already has the manual. I just updated the html version there to my local one, which has all the broken html problems manually corrected. Cheers, f From nwagner at mecha.uni-stuttgart.de Fri Mar 4 03:38:36 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 04 Mar 2005 09:38:36 +0100 Subject: [SciPy-user] PyTrilinos - Python interface to Trilinos libraries Message-ID: <42281E8C.60800@mecha.uni-stuttgart.de> Hi all, A new release of Trilinos is available. It includes a Python interface to Trilinos libraries. http://software.sandia.gov/trilinos/release_5.0_notes.html Regards, Nils From jeremydmorris at gmail.com Fri Mar 4 16:53:10 2005 From: jeremydmorris at gmail.com (Jeremy Morris) Date: Fri, 4 Mar 2005 16:53:10 -0500 Subject: [SciPy-user] Installation Issues Message-ID: <57483bc70503041353370268f9@mail.gmail.com> I have been trying to install Scipy for the last few days. 
I run Mandrake 10, and when I run the command: python setup.py build I get the following error: In file included from /usr/include/python2.3/Python.h:8, from Lib/cluster/src/vq_wrap.cpp:176: /usr/include/python2.3/pyconfig.h:847:1: warning: "_POSIX_C_SOURCE" redefined In file included from /usr/include/string.h:26, from Lib/cluster/src/vq_wrap.cpp:27: /usr/include/features.h:131:1: warning: this is the location of the previous definition In file included from Lib/cluster/src/vq_wrap.cpp:499: Lib/cluster/src/vq.h:57:7: warning: no newline at end of file Lib/cluster/src/vq_wrap.cpp:563:33: Numeric/arrayobject.h: No such file or directory Lib/cluster/src/vq_wrap.cpp:686: error: syntax error before `*' token Lib/cluster/src/vq_wrap.cpp:690: error: `basetype_string' was not declared in this scope This is just the first few lines. I think the line that is the most important is this one: Lib/cluster/src/vq_wrap.cpp:563:33: Numeric/arrayobject.h: No such file or directory Like I said, I have been trying to build/install for a few days now. I knew when I was altering the source that maybe something was wrong. Can anyone help? I need to get this running for an assignment, and still have to learn Python. The website said that I should include the following info: Platform : posix linux2 uname -a : Linux 2.6.3-7mdk #1 Wed Mar 17 15:56:42 CET 2004 i686 unknown unknown GNU/Linux gcc -v : Reading specs from /usr/lib/gcc-lib/i586-mandrake-linux-gnu/3.3.2/specs Configured with: ../configure --prefix=/usr --libdir=/usr/lib --with-slibdir=/lib --mandir=/usr/share/man --infodir=/usr/share/info --enable-shared --enable-threads=posix --disable-checking --enable-long-long --enable-__cxa_atexit --enable-clocale=gnu --enable-languages=c,c++,ada,f77,objc,java,pascal --host=i586-mandrake-linux-gnu --with-system-zlib Thread model: posix gcc version 3.3.2 (Mandrake Linux 10.0 3.3.2-6mdk) g77 --version : GNU Fortran (GCC) 3.3.2 (Mandrake Linux 10.0 3.3.2-6mdk) Copyright (C) 2002 Free Software Foundation, Inc. GNU Fortran comes with NO WARRANTY, to the extent permitted by law. You may redistribute copies of GNU Fortran under the terms of the GNU General Public License. For more information about these matters, see the file named COPYING or type the command `info -f g77 Copying'. Python Version : 2.3.3 (#2, Feb 9 2005, 13:57:58) [GCC 3.3.2 (Mandrakelinux 10.0 3.3.2-9mdk)] Numeric Version : 23.1 f2py Version : 2.45.241_1926 ATLAS Info : Traceback (most recent call last): File "setup_atlas_version.py", line 5, in ? from scipy_distutils.misc_util import get_path, default_config_dict ImportError: No module named scipy_distutils.misc_util And : Traceback (most recent call last): File "", line 1, in ? 
ImportError: No module named atlas_version Other output : _pkg_config_info: NOT AVAILABLE agg2_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE atlas_blas_info: ( library_dirs = /root/Linux_ATHLONSSE1/lib:/usr/local/lib:/usr/lib ) ( paths: /root/Linux_ATHLONSSE1/lib/libf77blas.a ) ( paths: /root/Linux_ATHLONSSE1/lib/libcblas.a ) ( paths: /root/Linux_ATHLONSSE1/lib/libatlas.a ) system_info.atlas_blas_info ( include_dirs = /root/Linux_ATHLONSSE1/lib:/usr/local/include:/usr/include ) FOUND: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/root/Linux_ATHLONSSE1/lib'] language = c atlas_blas_threads_info: ( library_dirs = /root/Linux_ATHLONSSE1/lib:/usr/local/lib:/usr/lib ) ( paths: /root/Linux_ATHLONSSE1/lib/libatlas.a ) system_info.atlas_blas_threads_info NOT AVAILABLE atlas_info: ( library_dirs = /root/Linux_ATHLONSSE1/lib:/usr/local/lib:/usr/lib ) ( paths: /root/Linux_ATHLONSSE1/lib/libf77blas.a ) ( paths: /root/Linux_ATHLONSSE1/lib/libcblas.a ) ( paths: /root/Linux_ATHLONSSE1/lib/libatlas.a ) ( paths: /root/Linux_ATHLONSSE1/lib/liblapack.a ) system_info.atlas_info ( include_dirs = /root/Linux_ATHLONSSE1/lib:/usr/local/include:/usr/include ) FOUND: libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] library_dirs = ['/root/Linux_ATHLONSSE1/lib'] language = f77 atlas_threads_info: ( library_dirs = /root/Linux_ATHLONSSE1/lib:/usr/local/lib:/usr/lib ) ( paths: /root/Linux_ATHLONSSE1/lib/libatlas.a ) system_info.atlas_threads_info NOT AVAILABLE blas_info: ( library_dirs = /usr/local/lib:/usr/lib ) NOT AVAILABLE blas_opt_info: Traceback (most recent call last): File "scipy_core/scipy_distutils/system_info.py", line 1430, in ? show_all() File "scipy_core/scipy_distutils/system_info.py", line 1427, in show_all r = c.get_info() File "scipy_core/scipy_distutils/system_info.py", line 314, in get_info self.calc_info() File "scipy_core/scipy_distutils/system_info.py", line 972, in calc_info atlas_version = get_atlas_version(**version_info) File "scipy_core/scipy_distutils/system_info.py", line 809, in get_atlas_version from core import Extension, setup File "scipy_core/scipy_distutils/core.py", line 5, in ? from scipy_distutils.dist import Distribution ImportError: No module named scipy_distutils.dist And : python: can't open file 'scipy_core/scipy_distutils/command/build_flib.py' Thanks again. Jeremy From stephen.walton at csun.edu Fri Mar 4 18:03:32 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Fri, 04 Mar 2005 15:03:32 -0800 Subject: [SciPy-user] leastsq and Jacobian Message-ID: <4228E944.8040506@csun.edu> I find that the following script consistently core dumps (scipy latest CVS, Numeric 23.7, ATLAS 3.6.0) on both FC1 and FC3 when I say "python script.py" at the command line. I was trying to test problems others have reported using the Jacobian argument to leastsq, including a message way back in April 2003 from John Hunter. I'm pretty familiar with MINPACK, having used it to fit some functions a while back. Where might I start looking for the problem? 
from Numeric import *
import scipy as S

def ros(x):
    return array([10*(x[0] - x[1]**2), 1-x[1]])

def dros(x):
    return array([[-20*x[0], 10],[-1,0]])

def dros_trans(x):
    return array([[-20*x[0], -1],[10,0]])

x0=[-1.2,1.0]
print "No Jacobian:"
print S.optimize.leastsq(ros,x0)
print "With Jacobian, try 1:"
print S.optimize.leastsq(ros,x0,Dfun=dros)
print "With Jacobian, try 2:"
print S.optimize.leastsq(ros,x0,Dfun=dros_trans)
print "With Jacobian, try 3:"
print S.optimize.leastsq(ros,x0,Dfun=dros,col_deriv=1)
print "With Jacobian, try 4:"
print S.optimize.leastsq(ros,x0,Dfun=dros_trans,col_deriv=1)

From oliphant at ee.byu.edu Fri Mar 4 18:16:30 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Fri, 04 Mar 2005 16:16:30 -0700
Subject: [SciPy-user] Installation Issues
In-Reply-To: <57483bc70503041353370268f9@mail.gmail.com>
References: <57483bc70503041353370268f9@mail.gmail.com>
Message-ID: <4228EC4E.9040105@ee.byu.edu>

Jeremy Morris wrote:
> I have been trying to install Scipy for the last few days. I run
> Mandrake 10, and when I run the command:

It looks like you are using the python-numeric rpm. But you need to install the development RPM as well (it has the Numeric headers that you are missing: Numeric/arrayobject.h).

The best thing for you to do is to download the latest Numeric from the sourceforge download site (see http://numeric.scipy.org), uncomment the relevant parts of the setup.py file to enable the use of ATLAS with Numeric, install Numeric with

python setup.py install

and then try to install scipy again. I use Mandrake all the time, and in general my installation difficulties for any software package are almost always missing development rpms causing missing header files.

Good luck,

-Travis O.

From baghdadi at sickkids.ca Sun Mar 6 12:36:12 2005
From: baghdadi at sickkids.ca (Leila baghdadi)
Date: 06 Mar 2005 12:36:12 -0500
Subject: [SciPy-user] scipy test file error
Message-ID: <1110130572.1004.112.camel@mouse18.phenogenomics.ca>

Hi everyone,

I have finally managed to build scipy on Opteron (red hat linux enterprise 3), so I can import it in python with no problems. But when I go to scipy/tests and try to run any of the tests

python test_common.py

the following error appears; can anyone give me a hint?

  File "/usr/lib64/python2.2/unittest.py", line 412, in loadTestsFromTestCase
    self.getTestCaseNames(testCaseClass)))
  File "test_common.py", line 20, in __init__
    self.z = rand(100,10,20)
  File "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy_base/ppimport.py", line 115, in __call__
    return self._ppimport_attr(*args,**kwds)
  File "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy/stats/distributions.py", line 82, in rand
    return random(args)
  File "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy/stats/distributions.py", line 75, in random
    return _build_random_array(randfile.sample, (), size)
  File "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy/stats/distributions.py", line 66, in _build_random_array
    s.shape = size
ValueError: total size of new array must be unchanged

Thanks very much

Leila

From pearu at scipy.org Sun Mar 6 16:24:23 2005
From: pearu at scipy.org (Pearu Peterson)
Date: Sun, 6 Mar 2005 15:24:23 -0600 (CST)
Subject: [SciPy-user] scipy test file error
In-Reply-To: <1110130572.1004.112.camel@mouse18.phenogenomics.ca>
References: <1110130572.1004.112.camel@mouse18.phenogenomics.ca>
Message-ID: 

On Sun, 6 Mar 2005, Leila baghdadi wrote:

> Hi everyone,
>
> I have finally managed to build scipy on Opteron
(red hat linux > enterprise 3), > > so I can import it in python with no problems > > but when I go to scipy/tests and > > try to run any of the tests > > python test_common.py A proper way to run scipy tests is the following: >>> import scipy >>> scipy.test() > the following error appears, > > can anyone give me a hint, > > File "/usr/lib64/python2.2/unittest.py", line 412, in > loadTestsFromTestCase > self.getTestCaseNames(testCaseClass))) > File "test_common.py", line 20, in __init__ > self.z = rand(100,10,20) > File > "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy_base/ppimport.py", line 115, in __call__ > return self._ppimport_attr(*args,**kwds) > File > "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy/stats/distributions.py", line 82, in rand > return random(args) > File > "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy/stats/distributions.py", line 75, in random > return _build_random_array(randfile.sample, (), size) > File > "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy/stats/distributions.py", line 66, in _build_random_array > s.shape = size > ValueError: total size of new array must be unchanged What Numeric version are you using? Try upgrading Numeric to the latest. Also, using Python 2.3 and up is recommended. Pearu From baghdadi at sickkids.ca Sun Mar 6 16:36:28 2005 From: baghdadi at sickkids.ca (Leila baghdadi) Date: 06 Mar 2005 16:36:28 -0500 Subject: [SciPy-user] scipy test file error (solved!!) In-Reply-To: References: <1110130572.1004.112.camel@mouse18.phenogenomics.ca> Message-ID: <1110144988.1004.142.camel@mouse18.phenogenomics.ca> Hello Pearu, Thanks very much for your consistent help! so I think what happened is I got Numeric 23.7 which was basically creating the problem (not sure why), I was not sure what to do so I just got a previous version of Numeric 23.5 and installed that and BAM the problem is solved, so I am sure not sure what the difference between 23.5 and 23.7 but the reshape and resize functions were giving me problems on Opteron. 
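As a side note (an illustrative snippet on my part -- nothing Opteron-specific is assumed), a quick way to confirm which Numeric the interpreter actually picks up, and to exercise reshape directly, is:

import Numeric
print Numeric.__version__          # which build is actually imported

from Numeric import arange, reshape
print reshape(arange(6), (2, 3))   # should print a 2x3 array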
I think at this point I have everything, a quick question is do you need wxPython for running scipy.test() because it is complaining /projects/mice/share/arch/linux64/lib/python2.2/site-packages/scipy/plt/interface.pyc No module named wxPython Thanks very much Leila On Sun, 2005-03-06 at 16:24, Pearu Peterson wrote: > On Sun, 6 Mar 2005, Leila baghdadi wrote: > > > Hi everyone, > > > > I have finally managed to build scipy on Opteron (red hat linux > > enterprise 3), > > > > so I can import it in python with no problems > > > > but when I go to scipy/tests and > > > > try to run any of the tests > > > > python test_common.py > > A proper way to run scipy tests is the following: > > >>> import scipy > >>> scipy.test() > > > the following error appears, > > > > can anyone give me a hint, > > > > File "/usr/lib64/python2.2/unittest.py", line 412, in > > loadTestsFromTestCase > > self.getTestCaseNames(testCaseClass))) > > File "test_common.py", line 20, in __init__ > > self.z = rand(100,10,20) > > File > > "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy_base/ppimport.py", line 115, in __call__ > > return self._ppimport_attr(*args,**kwds) > > File > > "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy/stats/distributions.py", line 82, in rand > > return random(args) > > File > > "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy/stats/distributions.py", line 75, in random > > return _build_random_array(randfile.sample, (), size) > > File > > "/projects/mice/share/arch/linux64/lib64/python2.2/site-packages/scipy/stats/distributions.py", line 66, in _build_random_array > > s.shape = size > > ValueError: total size of new array must be unchanged > > What Numeric version are you using? Try upgrading Numeric to the latest. > Also, using Python 2.3 and up is recommended. > > Pearu > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From pearu at scipy.org Sun Mar 6 16:43:16 2005 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 6 Mar 2005 15:43:16 -0600 (CST) Subject: [SciPy-user] scipy test file error (solved!!) In-Reply-To: <1110144988.1004.142.camel@mouse18.phenogenomics.ca> References: <1110130572.1004.112.camel@mouse18.phenogenomics.ca> <1110144988.1004.142.camel@mouse18.phenogenomics.ca> Message-ID: On Sun, 6 Mar 2005, Leila baghdadi wrote: > so I think what happened is I got Numeric 23.7 which was basically > creating the problem (not sure why), > > I was not sure what to do so I just got a previous version of Numeric > > 23.5 and installed that and BAM the problem is solved, so I am sure not > sure what the difference between 23.5 and 23.7 but the reshape and > resize functions were giving me problems on Opteron. In my Opteron box I have rather old Numeric, 23.1, and all tests succeed. When I get time to upgrade to 23.7, I'll look into this problem. > a quick question is do you need wxPython for running scipy.test() > because it is complaining > > /projects/mice/share/arch/linux64/lib/python2.2/site-packages/scipy/plt/interface.pyc No module named wxPython No, just without wxPython plt will not be functional. Pearu From oliphant at ee.byu.edu Sun Mar 6 16:49:12 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 06 Mar 2005 14:49:12 -0700 Subject: [SciPy-user] scipy test file error (solved!!) 
In-Reply-To: <1110144988.1004.142.camel@mouse18.phenogenomics.ca> References: <1110130572.1004.112.camel@mouse18.phenogenomics.ca> <1110144988.1004.142.camel@mouse18.phenogenomics.ca> Message-ID: <422B7AD8.3090600@ee.byu.edu> Leila baghdadi wrote: >Hello Pearu, > >Thanks very much for your consistent help! > >so I think what happened is I got Numeric 23.7 which was basically >creating the problem (not sure why), > >I was not sure what to do so I just got a previous version of Numeric > > Some fixes are in CVS of Numeric for 64-bit systems. I need to make a new release... -Travis From elcorto at gmx.net Mon Mar 7 09:34:26 2005 From: elcorto at gmx.net (Steve Schmerler) Date: Mon, 07 Mar 2005 15:34:26 +0100 Subject: [SciPy-user] robustfit In-Reply-To: <4228E944.8040506@csun.edu> References: <4228E944.8040506@csun.edu> Message-ID: <422C6672.6000501@gmx.net> Hi developers I wonder if there are plans to implement a routine in scipy.stats similar to MATLAB's robustfit() function which handels outliers in a data set by iteratively adjusting a residual weight array. Cheers, steve -- There are three types of people in this world: those who make things happen, those who watch things happen and those who wonder what happened. - Mary Kay Ash From duncan at enthought.com Mon Mar 7 12:18:41 2005 From: duncan at enthought.com (Duncan Child) Date: Mon, 07 Mar 2005 11:18:41 -0600 Subject: [SciPy-user] Performance question. In-Reply-To: <42265B75.6080805@ucsd.edu> References: <422649C3.3040600@enthought.com> <42265B75.6080805@ucsd.edu> Message-ID: <422C8CF1.6070604@enthought.com> Thanks for the suggestion Robert. If we ignore the elegance requirement then handling scalars as a special case in pure Python is an adequate approach. However, I am hoping that Numeric3 will allow us to write more elegant functions that perform well on both scalars and arrays. Regards, Duncan Robert Kern wrote: > Duncan Child wrote: > >> I am working on developing algorithms that are usually called with >> parameters that are Numeric arrays. We have the usual challenge >> though of trying to craft code that will gracefully accept both >> floats or arrays. Because of the often discussed problem with the >> handling of zero length arrays we expend some effort to ensure that >> we don't make calls on floats that only work on arrays and have ended >> up with a bunch of code like safe_len() that can be called with either. >> >> Today we have a related problem but now it is with performance (see >> code below). Numeric is faster than NumArray operating on smaller >> arrays but it is still relatively slow handling regular floats. We >> could add to the safe_ suite of functions the fast_ series but this >> still entails a significant performance hit and is not exactly elegant. > > > A potential solution to the performance problem, though not the > elegance problem, might be Pyrex. 
> > An untested sketch: > > cdef extern from "Numeric/arrayobject.h": > bool PyArray_Check(object x) > > cdef extern from "math.h": > double sqrt(double x) > > import Numeric > > def fast_sqrt(object x): > if PyArray_Check(x): > return Numeric.sqrt(x) > else: > return sqrt(x) > From baghdadi at sickkids.ca Mon Mar 7 14:09:31 2005 From: baghdadi at sickkids.ca (Leila baghdadi) Date: 07 Mar 2005 14:09:31 -0500 Subject: [SciPy-user] scipy test file error In-Reply-To: References: <1110130572.1004.112.camel@mouse18.phenogenomics.ca> <1110144988.1004.142.camel@mouse18.phenogenomics.ca> Message-ID: <1110222571.10566.112.camel@mouse18.phenogenomics.ca> Hi Pearu, I just thought I share this with the list in case anyone is also using Opteron, 1) when I installed Numeric 23.5 and tried to run the test as you told me import scipy scipy.test() it complained about wxpython and got a segfault (I saw it after I e-mailed you yesterday) 2) after you told me that you have Numeric 23.1 on your opteron I decided to get it just for a testing so now I can run scipy.test() and there are no segmentation faults but there is a few errors which I am hoping to fix, a) it complains about a few test no being there from plt, I guess that is because I have no wxpython b)few tests fail with this message Attribute Error : 'module' object has no attribute 'eig' I have also attached a text file of the error message, can you please give me a hint as in what to do and how to fix this Thanks very much Leila On Sun, 2005-03-06 at 16:43, Pearu Peterson wrote: > On Sun, 6 Mar 2005, Leila baghdadi wrote: > > > so I think what happened is I got Numeric 23.7 which was basically > > creating the problem (not sure why), > > > > I was not sure what to do so I just got a previous version of Numeric > > > > 23.5 and installed that and BAM the problem is solved, so I am sure not > > sure what the difference between 23.5 and 23.7 but the reshape and > > resize functions were giving me problems on Opteron. > > In my Opteron box I have rather old Numeric, 23.1, and all tests succeed. > When I get time to upgrade to 23.7, I'll look into this problem. > > > a quick question is do you need wxPython for running scipy.test() > > because it is complaining > > > > /projects/mice/share/arch/linux64/lib/python2.2/site-packages/scipy/plt/interface.pyc No module named wxPython > > No, just without wxPython plt will not be functional. > > Pearu > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user -------------- next part -------------- A non-text attachment was scrubbed... Name: my_stuff Type: application/octet-stream Size: 53698 bytes Desc: not available URL: From stephen.walton at csun.edu Mon Mar 7 15:04:58 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 07 Mar 2005 12:04:58 -0800 Subject: [SciPy-user] lognormal distribution In-Reply-To: <420D31A6.2040106@ucsd.edu> References: <420AB03E.5090605@csun.edu> <420D1220.9040102@iname.com> <420D31A6.2040106@ucsd.edu> Message-ID: <422CB3EA.2020601@csun.edu> Robert Kern wrote: > If it's univariate, and you can write out the pdf or cdf as a > function, then I believe you can subclass scipy.stats.rv_continuous, > and it's rvs() method will numerically invert the cdf to generate it's > random numbers. This is so cool! I had a desire to generate values on (0,1) where values near 0.5 were less probable than values at the endpoints. 
Here's the implementation:

#---------------begin------------------------------
import Numeric as Num
from scipy.stats.distributions import rv_continuous

#
# CDF for the sunspot generation function.  If we make it a subclass
# of rv_continuous we get sunspot.rvs for free :-)
#

class sunspot_gen(rv_continuous):
    pmin=0.1
    def _pdf(self,x):
        pmax=(Num.pi-2*self.pmin)/(Num.pi-2)
        return((pmax-self.pmin)*(1-Num.sin(Num.pi*x))+self.pmin)
    def _cdf(self,x):
        pmax=(Num.pi-2*self.pmin)/(Num.pi-2)
        return -(Num.cos(Num.pi*x)*self.pmin-Num.cos(Num.pi*x)*pmax
                 -pmax*x*Num.pi-self.pmin+pmax)/Num.pi

sunspot = sunspot_gen(a=0., b=1., name='sunspot',
                      longname='A sunspot subdivision', extradoc="""

Sunspot distribution

This distribution represents the probability of a subdivision of a sunspot
into two spots of size x and (1-x), where x is a value from sunspot.rvs(0).
The PDF has a high probability of x near 0 or 1 and a low probability of x
near 0.5.

""")
#-----------------end--------------------------------

My only question: what should I replace the 'import Numeric as Num' with if I want to be able to work within the framework of using either Numeric or numarray? 'import numerix as Num' doesn't seem to work.

From jmiller at stsci.edu Mon Mar 7 15:23:36 2005
From: jmiller at stsci.edu (Todd Miller)
Date: Mon, 07 Mar 2005 15:23:36 -0500
Subject: [SciPy-user] lognormal distribution
In-Reply-To: <422CB3EA.2020601@csun.edu>
References: <420AB03E.5090605@csun.edu> <420D1220.9040102@iname.com> <420D31A6.2040106@ucsd.edu> <422CB3EA.2020601@csun.edu>
Message-ID: <1110227016.5366.17.camel@jaytmiller.comcast.net>

On Mon, 2005-03-07 at 12:04 -0800, Stephen Walton wrote:
> Robert Kern wrote:
>
> > If it's univariate, and you can write out the pdf or cdf as a
> > function, then I believe you can subclass scipy.stats.rv_continuous,
> > and its rvs() method will numerically invert the cdf to generate its
> > random numbers.
>
> This is so cool!  I had a desire to generate values on (0,1) where
> values near 0.5 were less probable than values at the endpoints.
> Here's the implementation:
>
> #---------------begin------------------------------
> import Numeric as Num
> from scipy.stats.distributions import rv_continuous
>
> #
> # CDF for the sunspot generation function.  If we make it a subclass
> # of rv_continuous we get sunspot.rvs for free :-)
> #
>
> class sunspot_gen(rv_continuous):
>     pmin=0.1
>     def _pdf(self,x):
>         pmax=(Num.pi-2*self.pmin)/(Num.pi-2)
>         return((pmax-self.pmin)*(1-Num.sin(Num.pi*x))+self.pmin)
>     def _cdf(self,x):
>         pmax=(Num.pi-2*self.pmin)/(Num.pi-2)
>         return -(Num.cos(Num.pi*x)*self.pmin-Num.cos(Num.pi*x)*pmax
>                  -pmax*x*Num.pi-self.pmin+pmax)/Num.pi
>
> sunspot = sunspot_gen(a=0., b=1., name='sunspot',
>                       longname='A sunspot subdivision', extradoc="""
>
> Sunspot distribution
>
> This distribution represents the probability of a subdivision of a sunspot
> into two spots of size x and (1-x), where x is a value from sunspot.rvs(0).
> The PDF has a high probability of x near 0 or 1 and a low probability of x
> near 0.5.
>
> """)
> #-----------------end--------------------------------
>
> My only question: what should I replace the 'import Numeric as Num'
> with if I want to be able to work within the framework of using either
> Numeric or numarray?  'import numerix as Num' doesn't seem to work.

Try "import scipy_base.numerix as Num" but understand that it probably won't work with numarray until scipy as a whole is ported (right now all we've got is scipy_base). It should work with Numeric however.
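A defensive variant (just a sketch; it assumes only that at least one of the two imports succeeds) is to fall back to plain Numeric when scipy_base is not installed:

try:
    # follows whichever backend scipy_base's numerix layer selects
    import scipy_base.numerix as Num
except ImportError:
    # plain-Numeric fallback when scipy_base is absent
    import Numeric as Num

print Num.pi, Num.sin(0.5)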
Regards,
Todd

From rkern at ucsd.edu Tue Mar 8 02:16:29 2005
From: rkern at ucsd.edu (Robert Kern)
Date: Mon, 07 Mar 2005 23:16:29 -0800
Subject: [SciPy-user] robustfit
In-Reply-To: <422C6672.6000501@gmx.net>
References: <4228E944.8040506@csun.edu> <422C6672.6000501@gmx.net>
Message-ID: <422D514D.1090603@ucsd.edu>

Steve Schmerler wrote:
> Hi developers
>
> I wonder if there are plans to implement a routine in scipy.stats
> similar to MATLAB's robustfit() function which handels outliers in a
> data set by iteratively adjusting a residual weight array.

I don't think anyone has any concrete plans to do so. Can you provide any documentation about the specific algorithm you'd like to see? I've encountered any number of such beasts (and don't trust any of them!).

--
Robert Kern
rkern at ucsd.edu

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
-- Richard Harter

From elcorto at gmx.net Tue Mar 8 09:45:20 2005
From: elcorto at gmx.net (Steve Schmerler)
Date: Tue, 08 Mar 2005 15:45:20 +0100
Subject: [SciPy-user] robustfit
In-Reply-To: <422D514D.1090603@ucsd.edu>
References: <4228E944.8040506@csun.edu> <422C6672.6000501@gmx.net> <422D514D.1090603@ucsd.edu>
Message-ID: <422DBA80.3090800@gmx.net>

Robert Kern wrote:
> Steve Schmerler wrote:
>
>> Hi developers
>>
>> I wonder if there are plans to implement a routine in scipy.stats
>> similar to MATLAB's robustfit() function which handels outliers in a
>> data set by iteratively adjusting a residual weight array.
>
> I don't think anyone has any concrete plans to do so. Can you provide
> any documentation about the specific algorithm you'd like to see? I've
> encountered any number of such beasts (and don't trust any of them!).

Never mind. I ended up using Numerical Recipes' medfit() function, which works pretty well.

Cheers,
steve

--
There are three types of people in this world: those who make things
happen, those who watch things happen and those who wonder what happened.
- Mary Kay Ash

From flory at fdu.edu Tue Mar 8 11:33:33 2005
From: flory at fdu.edu (David Flory)
Date: Tue, 8 Mar 2005 11:33:33 -0500
Subject: [SciPy-user] Novice Question on plt.plot
Message-ID: 

I cannot run plt.plot from a python 2.3.5 shell. I have no problems from inside pycrust. My DOS window looks like this:

D:\Flory>python
ActivePython 2.3.5 Build 236 (ActiveState Corp.) based on
Python 2.3.5 (#62, Feb 9 2005, 16:17:08) [MSC v.1200 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import gui_thread
>>> gui_thread.start()
>>> from scipy import plt
>>> plt.plot((1,2,3))

>>>
D:\Flory>

After the plt.plot() call I pop up an empty window "Figure 0" and the console gives the line above. When I click on the window, it turns white, stays blank, and reports (Not responding). When I close it, WinXP reports that the program is not responding and forces a close.

I have SciPy 0.3.2 installed and wxPython 2.5.3.1. I have looked in the documentation and not found anything. Suggestions or pointers to help would be appreciated.
--------------------------------------------------
David Flory

From elcorto at gmx.net Tue Mar 8 12:55:50 2005
From: elcorto at gmx.net (Steve Schmerler)
Date: Tue, 08 Mar 2005 18:55:50 +0100
Subject: [SciPy-user] Novice Question on plt.plot
In-Reply-To: 
References: 
Message-ID: <422DE726.80105@gmx.net>

David Flory wrote:
> I cannot run plt.plot from a python 2.3.5 shell. I have no problems
> from inside pycrust.

Is there a special reason you have for calling the python interpreter from the DOS command line (rather than using e.g. PyCrust (as you mentioned), PythonWin or IPython)?

There are many other fine plotting packages if you can't get plt running:

scipy.xplt (fast, good for short interactive tasks)
matplotlib (matplotlib.sourceforge.net, like MATLAB)
gnuplot-py (gnuplot-py.sourceforge.net, very nice and fast gnuplot interface)
....

Cheers,
steve

> My DOS window looks like this:
>
> D:\Flory>python
> ActivePython 2.3.5 Build 236 (ActiveState Corp.) based on
> Python 2.3.5 (#62, Feb 9 2005, 16:17:08) [MSC v.1200 32 bit (Intel)]
> on win32
> Type "help", "copyright", "credits" or "license" for more
> information.
> >>> import gui_thread
> >>> gui_thread.start()
> >>> from scipy import plt
> >>> plt.plot((1,2,3))
> _a062ff03_p_wxFrame>
>
> D:\Flory>
>
> After the plt.plot() call I pop up an empty window "Figure 0" and the
> console gives the line above. When I
> click on the window, it turns white, stays blank, and reports (Not
> responding). When I close it, WinXP reports that the program is not
> responding and forces a close.
>
> I have SciPy 0.3.2 installed and wxPython 2.5.3.1. I have looked in
> the documentation and not found anything. Suggestions or pointers to
> help would be appreciated.
>
> --------------------------------------------------
> David Flory
>
> _______________________________________________
> SciPy-user mailing list
> SciPy-user at scipy.net
> http://www.scipy.net/mailman/listinfo/scipy-user

--
There are three types of people in this world: those who make things
happen, those who watch things happen and those who wonder what happened.
- Mary Kay Ash

From oliphant at ee.byu.edu Wed Mar 9 02:32:33 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Wed, 09 Mar 2005 00:32:33 -0700
Subject: [SciPy-user] Future directions for SciPy in light of meeting at Berkeley
Message-ID: <422EA691.9080404@ee.byu.edu>

I wanted to send an update to this list regarding the meeting at Berkeley that I attended. A lot of good discussions took place at the meeting that should stimulate larger feedback. Personally, I had far more to discuss before I had to leave, and so I hope that the discussions can continue.

I was looking to try and understand why, with an increasing number of scientific users of Python, relatively few people actually seem to want to contribute to scipy regularly, even becoming active developers. There are lots of people who seem to identify problems (though very often vague ones), but not many who seem able (either through time or interest constraints) to actually contribute to code, documentation, or infrastructure.
Scipy is an Open source project and relies on the self-selection process of open-source contributors. It would seem that while the scipy conference demonstrates a continuing and even increasing use of Python for scientific computing, not as many of these users are scipy devotees. Why?

I think the answers come down to a few issues, which I will attempt to answer with proposals.

1) Plotting -- scipy's plotting wasn't good enough (we knew that) and the promised solution (chaco) took too long to emerge as a simple replacement. While the elements were all there for chaco to work, very few people knew that, and nobody stepped up to take chaco to the level that matplotlib, for example, has reached in terms of cross-gui applicability and user-interface usability.

Proposal: Incorporate matplotlib as part of the scipy framework (replacing plt). Chaco is not there anymore, and the other two plotting solutions could stay as backward-compatible but not progressing solutions. I have not talked to John about this, though I would like to. I think if some other packaging issues are addressed we might be able to get John to agree.

2) Installation problems -- I'm not completely clear on what the "installation problems" really are. I hear people talk about them, but Pearu has made significant strides to improve installation, so I'm not sure what precise issues remain. Yes, installing ATLAS can be a pain, but scipy doesn't require it. Yes, fortran support can be a pain, but if you use g77 then it isn't a big deal. The reality, though, is that there is this perception of installation trouble, and it must be based on something. Let's find out what it is. Please speak up, users of the world!!!!

Proposal (just an idea to start discussion):

Subdivide scipy into several super packages that install cleanly but can also be installed separately. Implement a CPAN-or-yum-like repository and query system for installing scientific packages.

Base package: scipy_core -- this super package should be easy to install (no Fortran) and should essentially be old Numeric. It was discussed at Berkeley that very likely Numeric3 should just be included here. I think this package should also include plotting, weave, scipy_distutils, and even f2py. Some of these could live in dual namespaces (i.e. both weave and scipy.weave are available on install).

    scipy.traits
    scipy.weave (weave)
    scipy.plt (matplotlib)
    scipy.numeric (Numeric3 -- uses atlas when installed later)
    scipy.f2py
    scipy.distutils
    scipy.fft
    scipy.linalg? (include something like lapack-lite for basic but slow
        functionality; installation of an improved package replaces this
        with atlas usage)
    scipy.stats
    scipy.util (everything else currently in scipy_core)
    scipy.testing (testing facilities)

Each of these should be a separate package, installable and distributable separately (though there may be co-dependencies, so that scipy.plt would have to be distributed with scipy).

Libraries (each separately installable):

    scipy.lib -- there should be several sub-packages that could live under
        here. This is simply raw code with basic wrappers (kind of like a
        /usr/lib)
    scipy.lib.lapack -- installation also updates narray and linalg (hooks
        to do that)
    scipy.lib.blas -- installation updates narray and linalg
    scipy.lib.quadpack
    etc...

Extra sub-packages: named in a hierarchy to be determined and probably each dependent on a variety of scipy sub-packages.

I haven't fleshed this thing out yet, as you can tell. I'm mainly talking publicly to spur discussion.
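As a strawman of what the query side of such a repository could look like (everything here is hypothetical -- the index URL, the one-line-per-package format, and the function names are made up purely for illustration):

import urllib

INDEX_URL = "http://repo.scipy.example/index.txt"   # hypothetical index

def available_packages():
    # The imagined index lists one package per line: "name version url".
    pkgs = {}
    for line in urllib.urlopen(INDEX_URL).read().splitlines():
        name, version, url = line.split()
        pkgs[name] = (version, url)
    return pkgs

def query(name):
    # Report what the repository offers for a named sub-package.
    version, url = available_packages()[name]
    print "%s %s available at %s" % (name, version, url)

A real tool would add dependency metadata and install hooks on top of this, but the point is that the client-side interface can stay this small.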
The basic idea is that we should force ourselves to distribute scipy in separate packages. This would force us to implement a yum-or-CPAN-like package repository, so that we define the interface as to how an additional module could be developed by someone, even maintained separately (with a different license), and simply inserted into an intelligent point under the scipy infrastructure. It also allows installation/compilation issues to be handled on a more per-module basis, so that difficult ones could be noted. I think this would also help interested people get some of the enthought stuff put into the scipy hierarchy as well.

Thoughts and comments (and even half-working code) welcomed and encouraged...

-Travis O.

From pearu at cens.ioc.ee Wed Mar 9 04:50:09 2005
From: pearu at cens.ioc.ee (pearu at cens.ioc.ee)
Date: Wed, 9 Mar 2005 11:50:09 +0200 (EET)
Subject: [SciPy-user] Re: [Numpy-discussion] Future directions for SciPy in light of meeting at Berkeley
In-Reply-To: <422EBC9C.1060503@ims.u-tokyo.ac.jp>
Message-ID: 

On Wed, 9 Mar 2005, Michiel Jan Laurens de Hoon wrote:

> Travis Oliphant wrote:
> > Proposal (just an idea to start discussion):
> >
> > Subdivide scipy into several super packages that install cleanly but can
> > also be installed separately. Implement a CPAN-or-yum-like repository
> > and query system for installing scientific packages.
>
> Yes! If SciPy could become a kind of scientific CPAN for python from
> which users can download the packages they need, it would be a real
> improvement. In the end, the meaning of SciPy evolve into "the website
> where you can download scientific packages for python" rather than "a
> python package for scientific computing", and the SciPy developers might
> not feel OK with that.

Personally, I would be OK with that. SciPy as a "download site" does not exclude also providing a "scipy package" as it is now. I am all in favor of refactoring current scipy modules as much as possible.

Pearu

From prabhu_r at users.sf.net Wed Mar 9 06:24:42 2005
From: prabhu_r at users.sf.net (Prabhu Ramachandran)
Date: Wed, 9 Mar 2005 16:54:42 +0530
Subject: [SciPy-user] Re: [SciPy-dev] Future directions for SciPy in light of meeting at Berkeley
In-Reply-To: <422EA691.9080404@ee.byu.edu>
References: <422EA691.9080404@ee.byu.edu>
Message-ID: <16942.56570.375971.565270@monster.linux.in>

Hi Travis,

>>>>> "TO" == Travis Oliphant writes:

TO> I was looking to try and understand why, with an increasing
TO> number of scientific users of Python, relatively few people
TO> actually seem to want to contribute to scipy regularly, even
TO> becoming active developers. There are lots of people who seem
TO> to identify problems (though very often vague ones), but not
TO> many who seem able (either through time or interest
TO> constraints) to actually contribute to code, documentation, or
TO> infrastructure.

I think there are two issues here.

1. Finding developers.

Unfortunately, I'm as clueless as anyone else. It looks to me that most folks who are capable of contributing are already occupied with other projects. The rest use scipy and are quite happy with it (except for the occasional problem). Others are either heavily invested in other solutions, or don't have the skill or time to contribute. I also think that there are a fair number of users who use scipy at some level or another but are quiet about it and don't have a chance to contribute.
From what I can tell, the intersection of the set of people who possess good computing skills and also persue numerical work from Python is still a very small number compared to other fields. 2. Packaging issues. More on this later. [...] TO> I think the answers come down to a few issues which I will TO> attempt to answer with proposals. TO> 1) Plotting -- scipy's plotting wasn't good enough (we knew I am not sure what this has to do with scipy's utility? Do you mean to say that you'd like to have people starting to use scipy to plot things and then hope that they contribute back to scipy's numeric algorithms? If all they did was to use scipy for plotting, the only contributions would be towards plotting. If you only mean this as a convenience, then this seems like a packaging issue and not related to scipy. Plotting is one part of the puzzle. You don't seem to mention any deficiencies with respect to numerical algorithms. This seems to suggest that apart from things like packaging and docs, the numeric side is pretty solid. Let me take this to an extreme, if plotting be deemed a part of scipy's core then how about f2py? It is definitely core functionality. So why not make f2py part of scipy? How about g77, g95, and gcc. The only direction this looks to be headed is to make a SciPy OS (== Enthon?). I think we are mixing packaging along with other issues here. To make it clear, I am not against incorporating matplotlib in scipy. I just think that the argument for its inclusion does not seem clear to me. [...] TO> 2) Installation problems -- I'm not completely clear on what TO> the TO> "installation problems" really are. I hear people talk about [...] TO> Proposal (just an idea to start discussion): TO> Subdivide scipy into several super packages that install TO> cleanly but can also be installed separately. Implement a TO> CPAN-or-yum-like repository and query system for installing TO> scientific packages. What does this have to do with scipy per se? This is more like a user convenience issue. [scipy-sub-packages] TO> I haven't fleshed this thing out yet as you can tell. I'm TO> mainly talking publicly to spur discussion. The basic idea is TO> that we should force ourselves to distribute scipy in separate TO> packages. This would force us to implement a yum-or-CPAN-like TO> package repository, so that we define the interface as to how TO> an additional module could be developed by someone, even TO> maintained separately (with a different license), and simply TO> inserted into an intelligent point under the scipy TO> infrastructure. This is in general a good idea but one that goes far beyond scipy itself. Joe Cooper mentioned that he had ideas on how to really do this in a cross-platform way. Many of us eagerly await his solution. :) regards, prabhu From aisaac at american.edu Wed Mar 9 08:49:32 2005 From: aisaac at american.edu (Alan G Isaac) Date: Wed, 9 Mar 2005 08:49:32 -0500 (Eastern Standard Time) Subject: [SciPy-user] Re[2]: [SciPy-dev] Future directions for SciPy in light of meeting at Berkeley In-Reply-To: <16942.56570.375971.565270@monster.linux.in> References: <422EA691.9080404@ee.byu.edu><16942.56570.375971.565270@monster.linux.in> Message-ID: On Wed, 9 Mar 2005, Prabhu Ramachandran apparently wrote: > What does this have to do with scipy per se? This is more > like a user convenience issue. I think the proposal is: development effort is a function of community size, and community size is a function of convenience as well as functionality. This seems right to me. 
Cheers, Alan Isaac From cdavis at staffmail.ed.ac.uk Wed Mar 9 09:24:17 2005 From: cdavis at staffmail.ed.ac.uk (Cory Davis) Date: 09 Mar 2005 14:24:17 +0000 Subject: [SciPy-user] Re[2]: [SciPy-dev] Future directions for SciPy in light of meeting at Berkeley In-Reply-To: References: <422EA691.9080404@ee.byu.edu> <16942.56570.375971.565270@monster.linux.in> Message-ID: <1110378257.10146.28.camel@fog> Hi All > I think the proposal is: > development effort is a function of community size, Undeniably true! > and community size is a function of convenience as > well as functionality. > This is only partly true. I think the main barriers to more people using scipy are... 1. Not that many people actually know about it 2. People aren't easily convinced to change from what they were taught to use as an under-graduate (e.g. Matlab, IDL, Mathematica) As it stands, I don't think scipy is particularly inconvenient to install or use. On the two suggested improvements: I think incorporating matplotlib is an excellent idea. But I think the second suggestion of separating Scipy into independent packages will prove to be counter-productive. It might put people off even before they start, because instead of installing one package, they have a bewildering choice of many. And it could prove to be annoying to people using scipy who want to share or distribute code, with the requirement that both parties have scipy becoming a requirement that both parties have a specific combination of scipy packages. Also, another reason why there might be a lack of developers is that there a people like me who find that scipy and matplotlib already do everything that they need. Which is good right? Cheers, Cory. > This seems right to me. > > Cheers, > Alan Isaac > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user -- )))))))))))))))))))))))))))))))))))))))))))) Cory Davis Meteorology School of GeoSciences University of Edinburgh King's Buildings EDINBURGH EH9 3JZ ph: +44(0)131 6505092 fax +44(0)131 6505780 cdavis at staffmail.ed.ac.uk cory at met.ed.ac.uk http://www.geos.ed.ac.uk/contacts/homes/cdavis )))))))))))))))))))))))))))))))))))))))))))) From jh at oobleck.astro.cornell.edu Wed Mar 9 10:42:07 2005 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Wed, 9 Mar 2005 10:42:07 -0500 Subject: [SciPy-user] Re: Future directions for SciPy in light of meeting at Berkeley In-Reply-To: <20050309112636.24F99334FE@sc8-sf-spam1.sourceforge.net> (numpy-discussion-request@lists.sourceforge.net) References: <20050309112636.24F99334FE@sc8-sf-spam1.sourceforge.net> Message-ID: <200503091542.j29Fg7nX021779@oobleck.astro.cornell.edu> These were exactly the issues we addressed at SciPy04, and which led to the ASP project. All of the issues brought up in the current discussion have already been discussed there, and with largely the same conclusions. The basic gist is this: THERE ARE THOUSANDS OF PEOPLE WAITING FOR SCIPY TO REACH CRITICAL MASS! SciPy will reach the open-source jumping-off point when an outsider has the following experience: They google, find us, visit us, learn what they'll be getting, install it trivially, and read a tutorial that in less than 15 minutes has them plotting their own data. In that process, which will take less than 45 minutes total, they must also gain confidence in the solidity and longevity of the software and find a supportive community. We don't meet all the elements of this test now. 
Once we do, people will be ready to jump on and work the open-source magic. The goal of ASP (Accessible SciPy) is to meet that test. Some of what we need is being done already, but by a very small number of people. We need everyone's help to reach a meaningful rate of progress. The main points and their status:

1. Resolve the numeric/numarray split and get at least stubs for the basic routines in the Python core. Nothing scares new users more than instability and uncertainty. Travis O. is now attempting to incorporate numarray's added features (including much of the code that implements them) into numeric, and has made a lot of headway. Perry G. has said that he would switch back to numeric if it did the things numarray does. I think we can foresee a resolution to this split in the calendar year IF that effort stays the course.

2. Package it so that it's straightforward to install on all the popular architectures. Joe Cooper has done a lot here, as have others. The basic stuff installs trivially on Red Hat versions of Linux, Windows, and several others (including Debian, I think, and Mac, modulo the inherent problems people report with the Mac package managers, which we can do nothing about). Optimized installs are also available and not all that difficult, particularly if you're willing to issue a one-line command to rebuild a source package. For Linux, it was decided to stick with a core and add-on packages, and to offer umbrella packages that install common groups of packages through the dependency mechanism (e.g., for astronomy or biology). The main issue here is not the packaging, but the documentation, which is trivial to write at this point. I was able to do a "yum install scipy" at SciPy04, once I knew where the repository was. It's: http://www.enthought.com/python/fedora/$releasever We need someone to write installation notes for each package manager. We also need umbrella packages.

3. Document it thoroughly for both new and experienced users. Right now what we have doesn't scratch the surface. I mean no offense to those who have written what we do have. We need to update that and to write a lot more and a lot else. Janet Swisher and several others are ready to dig into this, but we're waiting for the numeric/numarray split to resolve. A list of needed documents is in the ASP proposal.

4. Focus new users on a single selection of packages. The variety of packages available to do a particular task is both a strength and a weakness. While experienced people will want choice, new users need simplicity. We will select a single package for each application (like plotting), and will mainly describe those in the tutorial-level docs. We will not be afraid to change the selection of packages. You're only a new user once, so it will not affect you if we switch the docs after you've become experienced. For example, Matplotlib was selected at the SciPy04 BoF, but if Chaco ever reaches that level of new-user friendliness, we might switch. Both packages will of course always be available. Neither needs to be in the core on Linux and other systems that have package management. New users will be steered to the "starter" umbrella package, which will pull in any components that are not in the core. Enthon will continue to include all the packages in the world, I'm sure!

5. Provide a web site that is easy to use and that communicates to each client audience.
We (me, Perry, Janet, Jon-Eric) were actually gearing up to solicit proposals for improving the site and making it the go-to place for all things numerical in python when Travis started his work on problem #1. This is the next step, but we're waiting for item 1 to finish so that we don't distract everyone's attention from its resolution. Many developers are interested in contributing here, too. If people feel it's time, we can begin this process. I just don't want to slow Travis and his helpers one tiny bit!

6. Catalog all the add-ons and external web sites so that scipy.org becomes the portal for all things numeric in python. This, at least, is done, thanks to Fernando Perez. See: http://www.scipy.org/wikis/topical_software/TopicalSoftware

I'll add one more issue:

7. Do something so people who use SciPy, numeric, and numarray remember that these issues are being worked on, and where, and how to contribute. To that end, all I can do is post periodically about ASP and encourage you to remember it whenever someone wonders why we haven't hit critical mass yet.

Please visit the ASP wiki. Read the ASP proposal if you haven't, sign up to do something, and do it! Right now, a paltry 6 people have signed up to help out. http://www.scipy.org/wikis/accessible_scipy/AccessibleSciPy The ASP proposal is linked in the first paragraph of the wiki.

After giving it some thought, we decided to use scipy-dev at scipy.net as our mailing list, to avoid cross-posted discussions on the 4 mailing lists. Please carry on any further discussion there.

Thanks,

--jh--

From prabhu_r at users.sf.net Wed Mar 9 11:52:42 2005 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Wed, 9 Mar 2005 22:22:42 +0530 Subject: [SciPy-user] Re[2]: [SciPy-dev] Future directions for SciPy in light of meeting at Berkeley In-Reply-To: References: <422EA691.9080404@ee.byu.edu> <16942.56570.375971.565270@monster.linux.in> Message-ID: <16943.10714.85815.666793@monster.linux.in>

>>>>> "AI" == Alan G Isaac writes:

AI> On Wed, 9 Mar 2005, Prabhu Ramachandran apparently wrote:
>> What does this have to do with scipy per se? This is more like
>> a user convenience issue.

AI> I think the proposal is: development effort is a function of
AI> community size, and community size is a function of
AI> convenience as well as functionality.

To put it bluntly, I don't believe that someone who can't install scipy today is really capable of contributing code to scipy. I seriously doubt claims that scipy is scary or hard to install today. Therefore, the real problem does not appear to be convenience, and IMHO neither is functionality the problem.

My only point is this. I think Travis and Pearu have been doing a great job! I'd rather see them working on things like Numeric3 and core scipy functionality than spend time worrying about packaging, including other new packages, and making things more comfortable for the user (especially when these things are already taken care of).

Anyway, Joe's post about ASP's role is spot on! Thanks Joe. More on that thread.
cheers, prabhu From stephen.walton at csun.edu Wed Mar 9 12:33:19 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 09 Mar 2005 09:33:19 -0800 Subject: [SciPy-user] Re: [SciPy-dev] Future directions for SciPy in light of meeting at Berkeley In-Reply-To: <422EA691.9080404@ee.byu.edu> References: <422EA691.9080404@ee.byu.edu> Message-ID: <422F335F.8060107@csun.edu> I only have a little to contribute at this point: > Proposal: > Incorporate matplotlib as part of the scipy framework (replacing plt). While this is an admirable goal, I personally find scipy and matplotlib easy to install separately. The only difficulty (of course!) is the numarray/numeric split, so I have to be sure that I select numerix as Numeric in my .matplotlibrc file before typing 'ipython -pylab -p scipy', which actually works really well. > 2) Installation problems -- I'm not completely clear on what the > "installation problems" really are. scipy and matplotlib are both very easy to install. Using ATLAS is the biggest pain, as Travis says, and one can do without it. Now that a simple 'scipy setup.py bdist_rpm' seems to work reliably, I for one am happy. I think splitting scipy up into multiple subpackages isn't such a good idea. Perhaps I'm in the minority, but I find CPAN counter-intuitive, hard to use, and hard to keep track of in an RPM-based environment. Any large package is going to include a lot of stuff most people don't need, but like a NY Times ad used to say, "You might not read it all, but isn't it nice to know it's all there?" I can tell you why I'm not contributing much code to the effort at least in one recent instance. Since I'm still getting core dumps when I try to use optimize.leastsq with a defined Jacobian function, I dove into _minpackmodule.c and its associated routines last night. I'm at sea. I know enough Python to be dangerous, used LMDER from Fortran extensively while doing my Ph.D., and am pretty good at C, but am completely unfamiliar with the Python-C API. So I don't even know how to begin tracking the problem down. Finally, as I mentioned at SciPy04, our particular physics department is at an undergraduate institution (no Ph.D. program), so we mainly produce majors who stop at the B.S. or M.S. degree. Their job market seems to want MATLAB skills, not Python, at the moment, so that's what the faculty are learning and teaching to their students. Many of them/us simply don't have the time to learn Python on top of that. Though, when I showed some colleagues how trivial it was to trim some unwanted bits out of data files they had using Python, I think I converted them. From bryan.cole at teraview.com Wed Mar 9 13:14:13 2005 From: bryan.cole at teraview.com (Bryan Cole) Date: Wed, 09 Mar 2005 18:14:13 +0000 Subject: [SciPy-user] Compiling python C-extensions to 'Enthon' on Win32 Message-ID: Hi, Can anyone tell me what compilers can be used to build python C-extensions for the current (2.3) Enthought python distribution (on WindowsXP)? I didn't have any luck with the free-to-download MSVC compiler. Do I need the full MS-VisualStudio? Which version and does it matter? Are there any alternatives (BCC or mingw perhaps)? 
cheers, Bryan

From jdhunter at ace.bsd.uchicago.edu Wed Mar 9 13:44:30 2005 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Wed, 09 Mar 2005 12:44:30 -0600 Subject: [SciPy-user] Future directions for SciPy in light of meeting at Berkeley In-Reply-To: <422EA691.9080404@ee.byu.edu> (Travis Oliphant's message of "Wed, 09 Mar 2005 00:32:33 -0700") References: <422EA691.9080404@ee.byu.edu> Message-ID:

>>>>> "Travis" == Travis Oliphant writes:

Travis> It would seem that while the scipy conference demonstrates
Travis> a continuing and even increasing use of Python for
Travis> scientific computing, not as many of these users are scipy
Travis> devotees. Why?

Hi Travis,

I like a lot of your proposal, and I want to throw a couple of additional ideas into the mix. There are two ideas about what scipy is: a collection of scientific algorithms and a general purpose scientific computing environment. On the first front, scipy has been a great success; on the second, less so. I think the following would be crucial to make such an effort a success (some of these are just restatements of your ideas with additional comments)

* Easy to install: it would probably be important to have a fault-tolerant install so that even if a component fails, the parts that don't depend on that can continue. Matthew Knepley's build system might be an important tool to make this work right for source installs, rather than trying to push distutils too hard.

* A package repository and a way of specifying dependencies between the packages and allow automated recursive downloads ala apt-get, yum, etc.... So basically we have to come up with a package manager, and probably one that supports src as well as binary installs. Everyone knows this is a significant problem in python, and we're in a good place to tackle it in that we have experience distributing complex packages across platforms which are a mixture of python/C/C++/FORTRAN, so if we can make it work, it will probably work for all of python. I think we would want contributions from people who do packaging on OSX and win32, eg Bob Ippolito, Joe Cooper, Robert Kern, and others.

* Transparent support for Numeric, numarray and Numeric3 built into a compatibility layer, eg something like matplotlib.numerix which enables the user to be shielded from past and future changes in the array package (a minimal sketch follows below). If you and the numarray developers can agree on that interface, that is an important start, because no matter how much success you have with Numeric3, Numeric 23.x and numarray will be in the wild for some time to come. Having all the major players come together and agree on a core interface layer would be a win. In practice, it works well in matplotlib.numerix.

* Buy-in from the developers of all the major packages that people want and need to have the CVS / SVN live on a single site which also has mailing lists etc. I think this is a possibility, actually; I'm open to it at least.

* Good tutorial, printable documentation, perhaps following a "dive into python" model with a "just-in-time" model of teaching the language; ie, task oriented.

A question I think should be addressed is whether scipy is the right vehicle for this aggregation. I know this has been a long-standing goal of yours and appreciate your efforts to continue to make it happen. But there is a lot of residual belief that scipy is hard to install, and this is founded partly in an old memory that refuses, sometimes irrationally, to die, and partly in people's continued difficulties.
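As a concrete illustration of the numerix point above, here is a minimal sketch of what such a compatibility layer can look like. The module name and the hard-coded switch are hypothetical stand-ins; matplotlib actually reads the choice from its rc file:

    # numerix.py -- a hypothetical single switch point, so that user
    # code never imports Numeric or numarray directly.
    which = 'numeric'   # would normally come from a config file

    if which == 'numeric':
        from Numeric import *
    elif which == 'numarray':
        from numarray import *
    else:
        raise RuntimeError('unrecognized array package: %s' % which)

Client code then does "from numerix import array, zeros" and is shielded from the choice of array backend.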
If we make a grand effort to unify into a coherent whole, we might be better off with a new name that doesn't carry the difficult-to-install connotation. And easy-to-install should be our #1 priority. Another reason to consider a neutral name is that it wouldn't scare off a lot of people who want to use these tools but don't consider themselves to be scientists. In matplotlib, there are people who just want to make bar and pie charts, and in talks I've given many people are very happy when I tell them that I am interested in providing plotting capabilities outside the realm of scientific plotting.

This is obviously a lot to bite off but it could be made viable with some dedicated effort; python is like that. Another concern I have, though, is that it seems to duplicate a lot of the enthought effort to build a scientific python bundle -- they do a great job already for win32 and I think an enthought edition for linux and OSX are in the works. The advantage of your approach is that it is modular rather than monolithic. To really make this work, I think enthought would need to be on board with it. Eg mayavi2 and traits2 are both natural candidates for inclusion into this beast, but both live in the enthought subversion tree. Much of what you describe seems to be parallel to the enthought python, which also provides scipy, numeric, ipython, mayavi, plotting, and so on.

I am hesitant to get too involved in the packaging game -- it's really hard and would take a lot of work. We might be better off each making little focused pieces, and let packagers (pythonmac, fink, yum, debian, enthought, ...) do what they do well. Not totally opposed, mind you, just hesitant....

JDH

From cjw at sympatico.ca Wed Mar 9 15:15:54 2005 From: cjw at sympatico.ca (Colin J. Williams) Date: Wed, 09 Mar 2005 15:15:54 -0500 Subject: [SciPy-user] Re: [SciPy-dev] Future directions for SciPy in light of meeting at Berkeley In-Reply-To: <422F335F.8060107@csun.edu> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> Message-ID: <422F597A.6070905@sympatico.ca>

Stephen Walton wrote:

> I think splitting scipy up into multiple subpackages isn't such a good
> idea. Perhaps I'm in the minority, but I find CPAN counter-intuitive,
> hard to use, and hard to keep track of in an RPM-based environment.
> Any large package is going to include a lot of stuff most people don't
> need, but like a NY Times ad used to say, "You might not read it all,
> but isn't it nice to know it's all there?"

Debian has a nice package management system with a GUI (aptitude) to go on top of the basic apt. This keeps track of and enforces package dependencies.

Colin W.

From eric at enthought.com Wed Mar 9 15:51:24 2005 From: eric at enthought.com (eric jones) Date: Wed, 09 Mar 2005 14:51:24 -0600 Subject: [SciPy-user] Compiling python C-extensions to 'Enthon' on Win32 In-Reply-To: References: Message-ID: <422F61CC.5080600@enthought.com>

Python Enthought Edition comes with mingw (C:\Python23\Enthought\MingW\bin) and should set up your path correctly to use it (try gcc from the command line to make sure it is there). When running setup.py for your module, you can do the following:

python setup.py build_ext --compiler=mingw32

and that should work.

eric

Bryan Cole wrote:

>Hi,
>
>Can anyone tell me what compilers can be used to build python
>C-extensions for the current (2.3) Enthought python distribution (on
>WindowsXP)?
>
>I didn't have any luck with the free-to-download MSVC compiler. Do I need
>the full MS-VisualStudio?
Which version and does it matter? Are there any
>alternatives (BCC or mingw perhaps)?
>
>cheers,
>Bryan

From stephen.walton at csun.edu Wed Mar 9 16:50:12 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 09 Mar 2005 13:50:12 -0800 Subject: [SciPy-user] Re: [SciPy-dev] Future directions for SciPy in light of meeting at Berkeley In-Reply-To: <422F597A.6070905@sympatico.ca> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> <422F597A.6070905@sympatico.ca> Message-ID: <422F6F94.1040105@csun.edu>

Colin J. Williams wrote:

> Stephen Walton wrote:
>
>> Perhaps I'm in the minority, but I find CPAN counter-intuitive, hard
>> to use, and hard to keep track of in an RPM-based environment.
>
> Debian has a nice package management system with a GUI (aptitude) to
> go on top of the basic apt. This keeps track of and enforces package
> dependencies.

So does yum, and CPAN itself for that matter, but that isn't really what I was raising. If I use cpan to install Perl::MyPackage::MyModule it doesn't show up when I do 'rpm -qa'. I can almost always say 'yum install perl-MyPackage-MyModule' though with my repository collection. Perhaps my main complaint is that the CPAN divisions are just too fine-grained. Disk space is way cheaper than my time, and I'd rather get everything I need in one go.

From bob at redivi.com Wed Mar 9 19:38:10 2005 From: bob at redivi.com (Bob Ippolito) Date: Wed, 9 Mar 2005 19:38:10 -0500 Subject: [SciPy-user] Re: [SciPy-dev] Future directions for SciPy in light of meeting at Berkeley In-Reply-To: <422F6F94.1040105@csun.edu> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> <422F597A.6070905@sympatico.ca> <422F6F94.1040105@csun.edu> Message-ID: <8a3a3fb49aaf411d1b01c829704b7c5c@redivi.com>

On Mar 9, 2005, at 4:50 PM, Stephen Walton wrote:

> Colin J. Williams wrote:
>
>> Stephen Walton wrote:
>>
>>> Perhaps I'm in the minority, but I find CPAN counter-intuitive, hard
>>> to use, and hard to keep track of in an RPM-based environment.
>>
>> Debian has a nice package management system with a GUI (aptitude) to
>> go on top of the basic apt. This keeps track of and enforces package
>> dependencies.
>
> So does yum, and CPAN itself for that matter, but that isn't really
> what I was raising. If I use cpan to install
> Perl::MyPackage::MyModule it doesn't show up when I do 'rpm -qa'. I
> can almost always say 'yum install perl-MyPackage-MyModule' though
> with my repository collection. Perhaps my main complaint is that the
> CPAN divisions are just too fine-grained. Disk space is way cheaper
> than my time, and I'd rather get everything I need in one go.

I also find that disk space is way cheaper than my time, however, building source packages (CPAN, DarwinPorts, etc.) can take *WAY* too long. Especially for CPAN, where it asks you questions all the goddamn time, so you have to babysit. That annoys me to no end.

Personally, I think the focus should be on repositories for win32 and Mac OS X binary packages. Linux/BSD/etc. don't need yet another repository for packages, because they have systems like yum, apt, pkgsrc, etc. with an inherent distribution model. If any distribution effort is put into the *nix OS'es, it should probably be to develop bdist_* targets so that it can integrate with the OS, not to develop One Package Manager To Rule Them All. Not yet, anyway.
The best policy I have found is to build everything statically, taking advantage of whatever is already on the OS (for example, the ATLAS capabilities in Apple's Accelerate framework), so you don't have to worry about non-Python dependencies. There are a couple of exceptions, like SDL and wxWidgets, where it is better to go with the dynamically linkable version because you're going to have a bunch of things touching it.

Python dependencies can be managed rather simply just by maintaining a list such as X *needs* Y, Z... However, it becomes more complicated when you have X *wants* Y. Normally I just punt on this problem, and say that all wants are needs, so if you install a package you'll have everything it can do at the expense of downloading more stuff.

-bob

From bob at redivi.com Wed Mar 9 19:51:03 2005 From: bob at redivi.com (Bob Ippolito) Date: Wed, 9 Mar 2005 19:51:03 -0500 Subject: [SciPy-user] Re: [SciPy-dev] Future directions for SciPy in light of meeting at Berkeley In-Reply-To: <422F335F.8060107@csun.edu> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> Message-ID: <6916ec732f2e70d1789cc0f480f82e7f@redivi.com>

On Mar 9, 2005, at 12:33 PM, Stephen Walton wrote:

>> 2) Installation problems -- I'm not completely clear on what the
>> "installation problems" really are.
>
> scipy and matplotlib are both very easy to install. Using ATLAS is
> the biggest pain, as Travis says, and one can do without it. Now that
> a simple 'scipy setup.py bdist_rpm' seems to work reliably, I for one
> am happy.

On Mac OS X, using ATLAS should be pretty trivial because the OS already ships with an optimized implementation! The patch I created for Numeric was very short, and I'm pretty sure it's on the trunk (though last I packaged it, I had to make a trivial fix or two, which I reported on sourceforge). I haven't delved into SciPy's source in a really long time, so I'm not sure where changes would need to be made, but I think someone else should be fine to look at Numeric's setup.py and do what needs to be done to SciPy.

FYI, matplotlib, the optimized Numeric, and several other Mac OS X packages are available in binary form here: http://pythonmac.org/packages/

> I think splitting scipy up into multiple subpackages isn't such a good
> idea. Perhaps I'm in the minority, but I find CPAN counter-intuitive,
> hard to use, and hard to keep track of in an RPM-based environment.
> Any large package is going to include a lot of stuff most people don't
> need, but like a NY Times ad used to say, "You might not read it all,
> but isn't it nice to know it's all there?"

I also think that a monolithic package is a pretty good idea until it begins to cause problems with the release cycle. Twisted had this problem at 1.3, and went through a major refactoring between then and 2.0 (which is almost out the door). Though Twisted 2.0 is technically many different packages, they still plan on maintaining a "sumo" package that includes all of the Twisted components, plus zope.interface (the only required dependency). There are still several optional dependencies not included, though (such as PyCrypto).

SciPy could go this route, and simply market the "sumo" package to anyone who doesn't already know what they're doing. An experienced SciPy user may want to upgrade one particular component of SciPy as early as possible, but leave the rest be, for example.
-bob

From rkern at ucsd.edu Wed Mar 9 20:30:20 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 09 Mar 2005 17:30:20 -0800 Subject: [SciPy-user] Re: [SciPy-dev] Future directions for SciPy in light of meeting at Berkeley In-Reply-To: <6916ec732f2e70d1789cc0f480f82e7f@redivi.com> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> <6916ec732f2e70d1789cc0f480f82e7f@redivi.com> Message-ID: <422FA32C.3040305@ucsd.edu>

Bob Ippolito wrote:

> On Mar 9, 2005, at 12:33 PM, Stephen Walton wrote:
>
>>> 2) Installation problems -- I'm not completely clear on what the
>>> "installation problems" really are.
>>
>> scipy and matplotlib are both very easy to install. Using ATLAS is
>> the biggest pain, as Travis says, and one can do without it. Now that
>> a simple 'scipy setup.py bdist_rpm' seems to work reliably, I for one
>> am happy.
>
> On Mac OS X, using ATLAS should be pretty trivial because the OS already
> ships with an optimized implementation! The patch I created for Numeric
> was very short, and I'm pretty sure it's on the trunk (though last I
> packaged it, I had to make a trivial fix or two, which I reported on
> sourceforge). I haven't delved into SciPy's source in a really long
> time, so I'm not sure where changes would need to be made, but I think
> someone else should be fine to look at Numeric's setup.py and do what
> needs to be done to SciPy.

Scipy already works just fine with Accelerate. No patch is necessary. However, the ATLAS provided in the Accelerate framework is incomplete. Namely, it's missing the row-major versions of BLAS and the optimized LAPACK routines. Scipy does make good use of these *if* they are available, and the availability of these routines can be quite important (if you are doing heavy linear algebra, of course; if you Just Need It To Build, then Accelerate is the way to go).

> FYI, matplotlib, the optimized Numeric, and several other Mac OS X
> packages are available in binary form here:
> http://pythonmac.org/packages/
>
>> I think splitting scipy up into multiple subpackages isn't such a good
>> idea. Perhaps I'm in the minority, but I find CPAN counter-intuitive,
>> hard to use, and hard to keep track of in an RPM-based environment.
>> Any large package is going to include a lot of stuff most people don't
>> need, but like a NY Times ad used to say, "You might not read it all,
>> but isn't it nice to know it's all there?"
>
> I also think that a monolithic package is a pretty good idea until it
> begins to cause problems with the release cycle. Twisted had this
> problem at 1.3, and went through a major refactoring between then and
> 2.0 (which is almost out the door). Though Twisted 2.0 is technically
> many different packages, they still plan on maintaining a "sumo" package
> that includes all of the Twisted components, plus zope.interface (the
> only required dependency). There are still several optional
> dependencies not included, though (such as PyCrypto).
>
> SciPy could go this route, and simply market the "sumo" package to
> anyone who doesn't already know what they're doing. An experienced
> SciPy user may want to upgrade one particular component of SciPy as
> early as possible, but leave the rest be, for example.

The Twisted re-org is something I think we should look at. There seem to be references to using zpkgtools to handle the split, but I can't find any of the data files that are referenced in the build scripts. Does anyone more familiar with the Twisted re-org care to chime in?
-- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter

From bob at redivi.com Wed Mar 9 20:42:28 2005 From: bob at redivi.com (Bob Ippolito) Date: Wed, 9 Mar 2005 20:42:28 -0500 Subject: [SciPy-user] Re: [SciPy-dev] Future directions for SciPy in light of meeting at Berkeley In-Reply-To: <422FA32C.3040305@ucsd.edu> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> <6916ec732f2e70d1789cc0f480f82e7f@redivi.com> <422FA32C.3040305@ucsd.edu> Message-ID: <6c08d7d57bcc3bc17f4654a833d43177@redivi.com>

On Mar 9, 2005, at 8:30 PM, Robert Kern wrote:

> Bob Ippolito wrote:
>> On Mar 9, 2005, at 12:33 PM, Stephen Walton wrote:
>>>> 2) Installation problems -- I'm not completely clear on what the
>>>> "installation problems" really are.
>>>
>>> scipy and matplotlib are both very easy to install. Using ATLAS is
>>> the biggest pain, as Travis says, and one can do without it. Now
>>> that a simple 'scipy setup.py bdist_rpm' seems to work reliably, I
>>> for one am happy.
>> On Mac OS X, using ATLAS should be pretty trivial because the OS
>> already ships with an optimized implementation! The patch I created
>> for Numeric was very short, and I'm pretty sure it's on the trunk
>> (though last I packaged it, I had to make a trivial fix or two, which
>> I reported on sourceforge). I haven't delved into SciPy's source in
>> a really long time, so I'm not sure where changes would need to be
>> made, but I think someone else should be fine to look at Numeric's
>> setup.py and do what needs to be done to SciPy.
>
> Scipy already works just fine with Accelerate. No patch is necessary.
> However, the ATLAS provided in the Accelerate framework is incomplete.
> Namely, it's missing the row-major versions of BLAS and the optimized
> LAPACK routines. Scipy does make good use of these *if* they are
> available, and the availability of these routines can be quite
> important (if you are doing heavy linear algebra, of course; if you
> Just Need It To Build, then Accelerate is the way to go).

You should file a bug at http://bugreport.apple.com/ if you have not already. There's still a chance for 10.4!

-bob

From oliphant at ee.byu.edu Wed Mar 9 22:21:46 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 09 Mar 2005 20:21:46 -0700 Subject: [SciPy-user] Current thoughts on future directions In-Reply-To: <6916ec732f2e70d1789cc0f480f82e7f@redivi.com> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> <6916ec732f2e70d1789cc0f480f82e7f@redivi.com> Message-ID: <422FBD4A.3030708@ee.byu.edu>

I had a lengthy discussion with Eric today and clarified some things in my mind about the future directions of scipy. The following is basically what we have decided. We are still interested in input so don't think the issues are closed, but I'm just giving people an idea of my (and Eric's, as far as I understand it) thinking on scipy.

1) There will be a scipy_core package which will be essentially what Numeric has always been (plus a few easy-to-install extras already in current scipy_core). It will likely contain the following functionality (the names and placements will be similar to current scipy_core):

Numeric3 (actually called ndarray or narray or numstar or numerix or something....)
fft (based on c-only code -- no fortran dependency)
linalg (a lite version -- no fortran or ATLAS dependency)
stats (a lite version --- no fortran dependency)
special (only c-code --- no fortran dependency)
weave
f2py?
(still need to ask Pearu about this)
scipy_distutils and testing
matrix and polynomial classes

...others...?

We will push to make this an easy-to-install, effective replacement for Numeric and hopefully for numarray users as well. Therefore community input and assistance will be particularly important.

2) The rest of scipy will be a package (or a series of packages) of algorithms. We will not try to do plotting as part of scipy. The current plotting in scipy will be supported for a time, but users will be weaned off to other packages: matplotlib, pygist (for xplt -- and I will work to get any improvements for xplt into pygist itself), gnuplot, etc.

3) Having everything under a scipy namespace is not necessary, nor worth worrying about at this point.

My scipy-related focus over the next 5-6 months will be to get scipy_core to the point that most can agree it effectively replaces the basic tools of Numeric and numarray.

-Travis

From timmck at cns.bu.edu Wed Mar 9 22:59:08 2005 From: timmck at cns.bu.edu (Tim McKenna) Date: Wed, 09 Mar 2005 22:59:08 -0500 Subject: [SciPy-user] NaN in matrices Message-ID: <422FC60C.2030807@cns.bu.edu>

Hi,

I've gotten used to using NaN in matrices (mostly for missing data values) in Matlab and Octave. there you can go: a = [ 1 2 ; NaN 4] no problem

i tried

>>> a = mat('[1 3 5; 2 5 3; NaN 5 7]')

among other things and scipy balks. Is there a way to use NaN in matrices/arrays?

Tim

-- Tim McKenna BU, CNS dept. (617)353-7677(office) (617)524-0938(home) (857)498-2574(mobile)

From oliphant at ee.byu.edu Wed Mar 9 23:19:46 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 09 Mar 2005 21:19:46 -0700 Subject: [SciPy-user] NaN in matrices In-Reply-To: <422FC60C.2030807@cns.bu.edu> References: <422FC60C.2030807@cns.bu.edu> Message-ID: <422FCAE2.3070907@ee.byu.edu>

Tim McKenna wrote:

> Hi,
> I've gotten used to using NaN in matrices (mostly for missing data
> values) in Matlab and Octave.
> there you can go:
> a = [ 1 2 ; NaN 4] no problem
>
> i tried
> >>> a = mat('[1 3 5; 2 5 3; NaN 5 7]')
> among other things and scipy balks
>
> Is there a way to use NaN in matrices/arrays

nans can be used in matrices; nan is the correct spelling in scipy. But, it looks like the string-handling of mat does not handle them well. This is a bug. You can create a matrix with nans using

a = mat([[1,3,5],[2,5,3],[nan,5,7]])

-Travis

From eric at enthought.com Wed Mar 9 23:41:45 2005 From: eric at enthought.com (eric jones) Date: Wed, 09 Mar 2005 22:41:45 -0600 Subject: [SciPy-user] Current thoughts on future directions In-Reply-To: <422FBD4A.3030708@ee.byu.edu> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> <6916ec732f2e70d1789cc0f480f82e7f@redivi.com> <422FBD4A.3030708@ee.byu.edu> Message-ID: <422FD009.4020706@enthought.com>

Hey Travis,

It sounds like the Berkeley meeting went well. I am glad that the Numeric3 project is going well and looks like it has a good chance to unify the Numeric/Numarray communities. I really appreciate you putting in so much effort into its implementation. I also appreciate all the work that Perry, Todd, and the others at StSci have done building Numarray. We've all learned a ton from it.

Most of the plans sound right to me (several questions/comments below). Much of SciPy has been structured in this way already, but we really have never worked to make the core useful as a stand-alone package. Supporting lite and full versions of fft, linalg, and stats sounds potentially painful, but also worthwhile given the circumstances. Now:
1. How much of stats do we lose from removing fortran dependencies?

2. I do question whether weave should really be in this core. I think it was in scipy_core before because it was needed to build some of scipy.

3. Now that I think about it, I also wonder if f2py should really be there -- especially since we are explicitly removing any fortran dependencies from the core.

4. I think keeping scipy an algorithms library and leaving plotting to other libraries is a good plan. At one point, the setup_xplt.py file was more than 1000 lines. It is much cleaner now, but dealing with X11, etc. does take maintenance work. Removing these libraries from scipy would decrease the maintenance effort and leave the plotting to matplotlib, chaco, and others.

5. I think having all the generic algorithm packages (signal, ga, stats, etc. -- basically all the packages that are there now) under the scipy namespace is a good idea. It prevents worry about colliding with other people's packages. However, I think domain specific libraries (such as astropy) should be in their own namespace and shouldn't be in scipy.

thanks, eric

Travis Oliphant wrote:

> I had a lengthy discussion with Eric today and clarified some things
> in my mind about the future directions of scipy. The following is
> basically what we have decided. We are still interested in input so
> don't think the issues are closed, but I'm just giving people an idea
> of my (and Eric's, as far as I understand it) thinking on scipy.
>
> 1) There will be a scipy_core package which will be essentially what
> Numeric has always been (plus a few easy-to-install extras already in
> current scipy_core). It will likely contain the following functionality
> (the names and placements will be similar to current scipy_core):
>
> Numeric3 (actually called ndarray or narray or numstar or numerix or
> something....)
> fft (based on c-only code -- no fortran dependency)
> linalg (a lite version -- no fortran or ATLAS dependency)
> stats (a lite version --- no fortran dependency)
> special (only c-code --- no fortran dependency)
> weave
> f2py? (still need to ask Pearu about this)
> scipy_distutils and testing
> matrix and polynomial classes
>
> ...others...?
>
> We will push to make this an easy-to-install, effective replacement for
> Numeric and hopefully for numarray users as well. Therefore
> community input and assistance will be particularly important.
>
> 2) The rest of scipy will be a package (or a series of packages) of
> algorithms. We will not try to do plotting as part of scipy. The
> current plotting in scipy will be supported for a time, but users will
> be weaned off to other packages: matplotlib, pygist (for xplt -- and
> I will work to get any improvements for xplt into pygist itself),
> gnuplot, etc.
>
> 3) Having everything under a scipy namespace is not necessary, nor
> worth worrying about at this point.
>
> My scipy-related focus over the next 5-6 months will be to get
> scipy_core to the point that most can agree it effectively replaces
> the basic tools of Numeric and numarray.
> -Travis

From evelien at ster.kuleuven.ac.be Thu Mar 10 03:01:19 2005 From: evelien at ster.kuleuven.ac.be (Evelien Vanhollebeke) Date: Thu, 10 Mar 2005 09:01:19 +0100 Subject: [SciPy-user] plotting with xplt Message-ID: <422FFECF.2030602@ster.kuleuven.ac.be>

Hi everybody,

I wondered if anyone has ever tried with xplt to plot multiple figures on one page. I know it is easy with pylab, but I would like to try it in xplt. One of the reasons is that in pylab I can't make a ps file consisting of different pages without doing a "psmerge" or something like that. If anyone has ever been successful on this, I hope I can get some ideas/examples on how to do it. I am still in the process of deciding which plotting package fulfills my needs.

Thanks!

Evelien

From perry at stsci.edu Thu Mar 10 10:01:55 2005 From: perry at stsci.edu (Perry Greenfield) Date: Thu, 10 Mar 2005 10:01:55 -0500 Subject: [SciPy-user] Current thoughts on future directions In-Reply-To: <422FD009.4020706@enthought.com> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> <6916ec732f2e70d1789cc0f480f82e7f@redivi.com> <422FBD4A.3030708@ee.byu.edu> <422FD009.4020706@enthought.com> Message-ID:

On Mar 9, 2005, at 11:41 PM, eric jones wrote:

> 2. I do question whether weave should really be in this core? I think it was
> in scipy_core before because it was needed to build some of scipy.
> 3. Now that I think about it, I also wonder if f2py should really be
> there -- especially since we are explicitly removing any fortran
> dependencies from the core.

It would seem to me that so long as: 1) both these tools have very general usefulness (and I think they do), and 2) they are not installation problems (I don't believe they are, since they themselves don't require any compilation of Fortran, C++ or whatever -- am I wrong on that?), they are perfectly fine to go into the core. In fact, if they are used by any of the extra packages, they should be in the core to eliminate the extra step in the installation of those packages.

Perry

From perry at stsci.edu Thu Mar 10 10:28:48 2005 From: perry at stsci.edu (Perry Greenfield) Date: Thu, 10 Mar 2005 10:28:48 -0500 Subject: [SciPy-user] Notes from meeting with Guido regarding inclusion of array package in Python core Message-ID:

On March 7th Travis Oliphant and Perry Greenfield met Guido and Paul Dubois to discuss some issues regarding the inclusion of an array package within core Python. The following represents thoughts and conclusions regarding our meeting with Guido. They in no way represent the order of discussion with Guido, and some of the points we raise weren't actually mentioned during the meeting but instead were spurred by subsequent discussion after the meeting.

1) Including an array package in the Python core.

To start, before the meeting we both agreed that we did not think that this was itself a high priority. Rather, we both felt that the most important issue was making arrays an acceptable and widely supported interchange format (it may not be apparent to some that this does not require arrays be in the core; more on that later). In discussing the desirability of including arrays in the core with Guido, we quickly came to the conclusion that not only was it not important, but that in the near term (the next couple of years and possibly much longer) it would be a bad thing to do.
This is primarily because it would mean that updates to the array package would wait on Python releases, potentially delaying important bug fixes, performance enhancements, or new capabilities greatly. Neither of us envisions any scenario regarding array packages, whether that be Numeric3 or numarray, where we would consider it to be something that would not *greatly* benefit from decoupling its release needs from those of Python (it's also true that it possibly introduces complications for Python releases if they need to synch with array schedules, but being inconsiderate louts, we don't care much about that). And when one considers that the move to multicore and 64-bit processors will introduce the need for significant changes in the internals to take advantage of these capabilities, it is unlikely we will see a quiescent, maintenance-level state for an array package for some time. In short, this issue is a distraction at the moment and will only sap energy from what needs to be done to unify the array packages.

So what about supporting arrays as an interchange format? There are a number of possibilities to consider, none of which require inclusion of arrays into the core. It is possible for 3rd party extensions to optionally support arrays as an interchange format through one of the following mechanisms:

a) So long as the extension package has access to the necessary array include files, it can build the extension to use arrays as a format without actually having the array package installed. The include files alone could be included into the core (Guido has previously been receptive to doing this, though at this meeting he didn't seem quite as receptive, instead suggesting the next option) or could be packaged with the extension (we would prefer the former to reduce the possibilities of many copies of include files). The extension could then be successfully compiled without actually having the array package present. The extension would, when requested to use arrays, see if it could import the array package; if not, then all use of arrays would result in exceptions. The advantage of this approach is that it does not require that arrays be installed before the extension is built for arrays to be supported. It could be built, and then later the array package could be installed and no rebuilding would be necessary.

b) One could modify the extension build process to see if the package is installed and the include files are available; if so, it is built with the support, otherwise not. The advantage of this approach is that it doesn't require the include files be included with the core or be bundled with the extension, thus avoiding any potential version mismatches. The disadvantage is that later adding the array package requires the extension to be rebuilt, and it results in a more complex build process (more things to go wrong).

c) One could provide the support at the Python level by instead relying on the use of buffer objects by the extension at the C level, thus avoiding any dependence on the array C api. So long as the extension has the ability to return buffer objects containing the putative array data to the Python level, along with the necessary meta information (in this case, the shape, type, and other info, e.g., byteswapping, necessary to properly interpret the array), the extension can provide its own functions or methods to convert these buffer objects into arrays without copying of the data in the buffer object.
The extension can try to import the array package, and if it is present, provide arrays as a data format using this scheme. In many respects this is the most attractive approach. It has no dependencies on include files, build order, etc. This approach led to the suggestion that Python develop a buffer object that could contain meta information, and a way of supporting community conventions (e.g., a name attribute indicating which convention was being used) to facilitate the interchange of any sort of binary data, not just arrays. We also concluded that it would be nice to be able to create buffer objects from Python with malloced memory (currently one can only create buffer objects from other objects that already have memory allocated; there is no way of creating newly allocated, writable memory from Python within a buffer object; one can create a buffer object from a string, but it is not writable). Nevertheless, if an extension is written in C, none of these changes are necessary to make use of this mechanism for interchange purposes now.

This is the approach we recommend trying. The obvious case to apply it to is PIL as a test case. We should do this ourselves and offer it as a patch to PIL. Other obvious cases are to support image interchange for GUIs (e.g., wxPython) and OpenGL.

2) Scalar support, rank-0 and related.

Travis and I agreed (we certainly seek comments on this conclusion; we may have forgotten about key arguments for one of the different approaches) that the desirability of using rank-0 arrays as return values from single element indexing depends on other factors, most importantly Python's support for scalars in various aspects. This is a multifaceted issue that will need to be determined by considering all the facets simultaneously. The following tries to list the pros and cons previously discussed for returning scalars (two cases previously discussed) or rank-0 arrays (input welcomed).

a) return only existing Python scalar types (cast upwards except for long long and long double based types)

Pros:
- What users probably expect (except matlab users!)
- No performance hit in subsequent scalar expressions
- faster indexing performance (?)

Cons:
- Doesn't support array attributes, numeric behaviors
- What do you return for long long and long double? No matter what is done, you will either lose precision or lose consistency. Or you create a few new Python scalar types for the unrepresentable types? But, with subclassing in C the effort to create a few scalar types is very close to the effort to create many.

b) create new Python scalar types and return those (one for each basic array type)

Pros:
- Exactly what numeric users expect in representation
- No performance hit in subsequent scalar expressions
- faster indexing performance
- Scalars have the same methods and attributes as arrays

Cons:
- Might require great political energy to eventually get the arraytype with all of its scalartype-children into the Python core. This is really an unknown, though, since if the arrayobject is in the standard module and not in the types module, then people may not care (a new type is essentially a new-style class and there are many, many classes in the Python standard library). A good scientific-packaging solution that decreases the desirability of putting the arrayobject into the core would help alleviate this problem as well.
- By itself it doesn't address different numeric behaviors for the "still-present" Python scalars throughout Python.
c) return rank-0 array

Pros:
- supports all array behaviors, particularly with regard to numerical processing and ieee exception handling (a matter of some controversy; some would like it also to be len()=1 and support [0] indexing, which strictly speaking rank-0 arrays should not support)

Cons:
- Performance hit on all scalar operations (e.g., if one then does many loops over what appears to be a pure scalar expression, use of rank-0 will be much slower than Python scalars, since use of arrays incurs significant overhead)
- Doesn't eliminate the fact that one can still run into different numerical behavior involving operations between Python scalars.
- Still necessary to write code that must deal with Python scalars "leaking" into code as inputs to functions.
- Can't currently be used to index sequences (so not completely usable in place of scalars)

Out of this came two potential needs (the first isn't strictly necessary if approach a is taken, but could help smooth use of all integer types as indexes if approach b is taken):

If rank-0 arrays are returned, then Guido was very receptive to supporting a special method, __index__, which would allow any Python object to be used as an index to a sequence or mapping object. Calling this would return a value suitable as an index if the object was not itself suitable directly. Thus rank-0 arrays would have this method called to convert their internal integer value into a Python integer. There are some details about how this would work at the C level that need to be worked out. This would allow rank-0 integer arrays to be used as indices. To be useful, it would be necessary to get this into the core as quickly as possible (if there are lingering C API issues that won't be solved right away, then a greatly delayed implementation in Python would make this less than useful).

We talked at some length about whether it was possible to change Python's numeric behavior for scalars, namely support for configurable handling of numeric exceptions in the way numarray does it (and Numeric3 as well). In short, not much was resolved. Guido didn't much like the stack approach to the exception handling mode. His argument (a reasonable one) was that even if the stack allowed pushing and popping modes, it was fragile for two reasons. If one called other functions in other modules that were previously written without knowledge that the mode could be changed, those functions presumed the previous behavior and thus could be broken by a mode change (though we suppose that just puts the burden on the caller to guard all external calls with restores to default behavior; even so, many won't do that, leading to spurious bug reports that may annoy maintainers to no end through no fault of their own). He also felt that some termination conditions may cause missed pops, leading to incorrect modes. He suggested studying decimal's use of context to see if it could be used as a model. Overall he seemed to think that setting the mode on a module basis was a better approach. Travis and I wondered how that could be implemented (it seems to imply that the exception handling needs to know what module or namespace is being executed in order to determine the mode). So some more thought is needed regarding this.

The difficulty of proposing such changes and getting them accepted is likely to be considerable. But Travis had a brilliant idea (some may see this as evil but I think it has great merit).
Nothing prevents a C extension from hijacking the existing Python scalar objects' behaviors. Once a reference is obtained to an integer, float or complex value, one can replace the table of operations on those objects with whatever code one wishes. In this way an array package could (optionally) change the behavior of Python scalars. We could test the behavior of proposed changes quite easily, distribute that behavior quite easily in the community, and ultimately see if there are really any problems without expending any political energy to get it accepted. Seeing that it really worked (without "forking" Python either) would place us in a much stronger position to have the new behaviors incorporated into the core. Even then, it may never prove necessary if scalar behavior can be so customized by the array package. This holds out the potential of making scalar/array behavior much more consistent. Doing this may allow option a) as the ultimate solution, i.e., no changes needed to Python at all (as such), no rank-0 arrays. This will be studied further. One possible issue is that adding the necessary machinery to make numeric scalar processing consistent with that of the array package may introduce significant performance penalties (what is negligible overhead for arrays may not be for scalars).

One last comment is that it is unlikely that any choice in this area prevents the need for added helper functions in the array package to assist in writing code that works well with scalars and arrays. There are likely a number of such issues. A common approach is to wrap all unknown objects with "asarray". This works reasonably well but doesn't handle the following case: if you wish to write a function that will accept arrays or scalars, in principle it would be nice to return scalars if all that was supplied were scalars. So functions to help determine what the output type should be based on the inputs would be helpful, for example to distinguish between a rank-0 array (or rank-1 len-1 array) provided as input and an actual scalar, if asarray happens to map these to the same thing, so that the function can properly return a scalar if that is what was originally input. Other such tools may help writing code that allows the main body to treat all objects as arrays without needing checks for scalars.

Other miscellaneous comments. The old use of where() may be deprecated and only the "nonzero" interpretation will be kept. A new function will be defined to replace the old usage of where (we deem that regular expression search and replaces should work pretty well to make changes in almost all old code). With the use of buffer objects, tostring methods are likely to be deprecated.

Python PEPs needed
===================

From the discussions it was clear that at least two Python PEPs need to be written and implemented, but that these needed to wait until the unification of the arrayobject takes place.

PEP 1: Insertion of an __index__ special method and an as_index slot (perhaps in the as_sequence methods) in the C-level typeobject into Python.

PEP 2: Improvements on the buffer object and buffer builtin method so that buffer objects can be Python-tracked wrappers around allocated memory that extension packages can use and share. Two extensions are considered so far: 1) the buffer objects have a meta attribute so that meta information can be passed around in a unified manner, and 2) the buffer builtin should take an integer giving the size of the writeable buffer object to create.
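To make PEP 1 concrete, here is a minimal sketch of how the proposed __index__ hook could look from the Python side once the interpreter honors it. The class below is a hypothetical stand-in for a rank-0 integer array; no released Python calls __index__ at the time of writing:

    # Hypothetical rank-0-style object. __index__ hands the interpreter
    # a true Python int whenever the object is used as a sequence index.
    class Rank0Int(object):
        def __init__(self, value):
            self.value = value
        def __index__(self):
            return int(self.value)

    data = ['a', 'b', 'c', 'd']
    print data[Rank0Int(2)]   # -> 'c', once the interpreter calls __index__

The same hook would let rank-0 integer arrays index lists, tuples, and strings without an explicit int() conversion at every call site.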
From jh at oobleck.astro.cornell.edu Thu Mar 10 11:45:50 2005 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Thu, 10 Mar 2005 11:45:50 -0500 Subject: [SciPy-user] Re: Notes from meeting with Guido regarding inclusion of array package in Python core In-Reply-To: <20050310153125.D7F6088827@sc8-sf-spam1.sourceforge.net> (numpy-discussion-request@lists.sourceforge.net) References: <20050310153125.D7F6088827@sc8-sf-spam1.sourceforge.net> Message-ID: <200503101645.j2AGjopf019350@oobleck.astro.cornell.edu> It never rains, but it pours! Thanks for talking with Guido and hammering out these issues and options. You are of course right that the release schedule issue is enough to keep us out of Python core for the time being (and matplotlib out of scipy, according to JDH at SciPy04, for the same reason). However, I think we should still strongly work to put it there eventually. For now, this means keeping it "acceptable", and communicating with Guido often to get his feedback and let him know what we are doing. There are three reasons I see for this. First, having it core-acceptable makes it clear to potential users that this is standard, stable, well-thought-out stuff. Second, it will mean that numerical behavior and plain python behavior will be as close as possible, so it will be easiest to switch between the two. Third, if we don't strive for acceptability, we will likely run into a problem in the future when something we depend on is deprecated or changed. No doubt this will happen anyway, but it will be worse if we aren't tight with Guido. Conversely, if we *are* tight with Guido, he is likely to be aware of our concerns and take them into account when making decisions about Python core. --jh-- From stephen.walton at csun.edu Thu Mar 10 12:33:42 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Thu, 10 Mar 2005 09:33:42 -0800 Subject: [SciPy-user] Re: [Numpy-discussion] Current thoughts on future directions In-Reply-To: <422FBD4A.3030708@ee.byu.edu> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> <6916ec732f2e70d1789cc0f480f82e7f@redivi.com> <422FBD4A.3030708@ee.byu.edu> Message-ID: <423084F6.7020804@csun.edu> Can I put in a good word for Fortran? Not the language itself, but the available packages for it. I've always thought that one of the really good things about Scipy was the effort put into getting all those powerful, well tested, robust Fortran routines from Netlib inside Scipy. Without them, it seems to me that folks who just install the new scipy_base are going to re-invent a lot of wheels. Is it really that hard to install g77 on non-Linux platforms? Steve Walton From oliphant at ee.byu.edu Thu Mar 10 12:39:47 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 10 Mar 2005 10:39:47 -0700 Subject: [SciPy-user] plotting with xplt In-Reply-To: <422FFECF.2030602@ster.kuleuven.ac.be> References: <422FFECF.2030602@ster.kuleuven.ac.be> Message-ID: <42308663.70303@ee.byu.edu> Evelien Vanhollebeke wrote: > Hi everybody, > > I wondered if anyone has ever tried with xplt to plot multiple figures > on one page. I know it is easy with pylab, but I would like to try it > in xplt. One of the reasons is that in pylab I can't make a ps file > consisting of different pages without doing a "psmerge" or something > like that. If anyone has ever been successful on this, I hope I can > get some ideas/examples on how to do it. I am still in the process of > deciding which plotting package fulfills my needs. > Yes, I've done it many times. 
Look at xplt.subplot. You use xplt.plsys to change which coordinate system you are plotting to. The nice thing about xplt is that it is very fast... -Travis From eric at enthought.com Thu Mar 10 12:54:17 2005 From: eric at enthought.com (eric jones) Date: Thu, 10 Mar 2005 11:54:17 -0600 Subject: [SciPy-user] Re: [Numpy-discussion] Current thoughts on future directions In-Reply-To: <423084F6.7020804@csun.edu> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> <6916ec732f2e70d1789cc0f480f82e7f@redivi.com> <422FBD4A.3030708@ee.byu.edu> <423084F6.7020804@csun.edu> Message-ID: <423089C9.7000205@enthought.com> There are no plans to abandon Fortran. SciPy will still rely on it heavily. We're only talking about making some core features here available as python + C to make them easily buildable by the widest group of people. Everything else will still use Fortran the same way it does now. eric Stephen Walton wrote: > Can I put in a good word for Fortran? Not the language itself, but > the available packages for it. I've always thought that one of the > really good things about Scipy was the effort put into getting all > those powerful, well tested, robust Fortran routines from Netlib > inside Scipy. Without them, it seems to me that folks who just > install the new scipy_base are going to re-invent a lot of wheels. > > Is it really that hard to install g77 on non-Linux platforms? > > Steve Walton > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From perry at stsci.edu Thu Mar 10 15:21:16 2005 From: perry at stsci.edu (Perry Greenfield) Date: Thu, 10 Mar 2005 15:21:16 -0500 Subject: [SciPy-user] Re: [Numpy-discussion] Notes from meeting with Guido regarding inclusion of array package in Python core In-Reply-To: <4230D60A.9050108@noaa.gov> References: <4230D60A.9050108@noaa.gov> Message-ID: <747d91fa83ebcbcfc71fb15dc54bce5b@stsci.edu> On Mar 10, 2005, at 6:19 PM, Chris Barker wrote: > >> a) So long as the extension package has access to the necessary array >> include files, it can build the extension to use the arrays as a >> format without actually having the array package installed. > > The >> extension would, when requested to use arrays would see if it could >> import the array package, if not, then all use of arrays would result >> in exceptions. > > I'm not sure this is even necessary. In fact, in the above example, > what would most likely happen is that the **Helper functions would > check to see if the input object was an array, and then fork the code > if it were. An array couldn't be passed in unless the package were > there, so there would be no need for checking imports or raising > exceptions. > So what would the helper function do if the argument was an array? You mean use the sequence protocol? Yes, I suppose that is always a fallback (but presumes that the original code to deal with such things is present; figuring out that a sequence satisfies array constraints can be a bit involved, especially at the C level) >> It could be built, and then later the array package could be >> installed and no rebuilding would be necessary. > > That is a great feature. > > I'm concerned about the inclusion of all the headers in either the > core or with the package, as that would lock you to a different > upgrade cycle than the main numerix upgrade cycle. It's my experience > that Numeric has not been binary compatible across versions. Hmmm, I thought it had been. 
It does make it much harder to change the api and structure layouts once in, but I thought that had been pretty stable. >> b) One could modify the extension build process to see if the package >> is installed and the include files are available, if so, it is built >> with the support, otherwise not. The disadvantage is that later adding >> the array package >> requires the extension to be rebuilt > > This is a very big deal as most users on Windows and OS-X (and maybe > even Linux) don't build packages themselves. > > A while back this was discussed on this very list, and it seemed like > there was some idea about including not the whole numerix header > package, but just the code for PyArray_Check or an equivalent. This > would allow code to check if an input object was an array, and do > something special if it was. That array-specific code would only get > run if an array was passed in, so you'd know numerix was installed at > run time. This would require Numerix to be installed at build time, > but it would be optional at run time. I like this, because anyone > capable of building wxPython (it can be tricky) is capable of > installing Numeric, but folks that are using binaries don't need to > know anything about it. > > This would only really work for extensions that use arrays, but don't > create them. We'd still have the version mismatch problem too. > Yes, at the binary level. Perry From zpincus at stanford.edu Fri Mar 11 20:10:54 2005 From: zpincus at stanford.edu (Zachary Pincus) Date: Fri, 11 Mar 2005 17:10:54 -0800 Subject: [SciPy-user] OS X build problems with non-Apple python -- diagnosis and solution Message-ID: <99a9c9d75bf474d1f5e8bb13e6b485f0@stanford.edu> Hello folks, I've run into some trouble building scipy for OS X. Fortunately, I've got it mostly licked, so I thought I'd explain how. The issues were two: (1) A problem with g77 and the -bundle switch. This switch must not be used at the beginning of the command line. (2) The linker not being pointed at the correct set of python libraries for non-Apple python installs. Here are the details: Note that I'm following Christopher Fonnesbeck's directions (http://www.scipy.org/documentation/Members/fonnesbeck/osx_build.txt ) exactly, except that I've used Fink to install python. All other packages were manually installed. I'm using a CVS checkout of SciPy from today. The first problem that I encountered was in building the various fortran libs with g77. (Full error text below as #1). The gist of the error is that g77 reports "couldn't run `/usr/local/bin/undle-gcc-3.4.2': No such file or directory". This problem turns out to be due to the (insane) fact that g77 has a "-b MACHINE" switch that is only honored if it is the first element on the command line. So 'g77 -bundle' makes the compiler look for the executable undle-gcc-3.4.2. The solution is to put something before the '-bundle' switch. I used -WALL because it's relatively harmless. (See: http://www.mail-archive.com/fink-devel at lists.sourceforge.net/ msg10604.html ) So it's important to make sure that -bundle is not the first switch. I suspect that this problem has not cropped up before because users without Fink-installed python will get a '-framework' switch before the -bundle switch. However, this might change in the future; as such it would be a good idea to explicitly make sure that -bundle is never first in the g77 command.
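(For illustration, a minimal untested sketch of such a guard; the helper name is made up and this is not actual scipy_distutils code:)

    def guard_bundle(link_args):
        # g77 reads a leading '-bundle' as its '-b MACHINE' switch
        # ('-b undle'), so never let '-bundle' be the first argument.
        if link_args and link_args[0] == '-bundle':
            return ['-Wall'] + link_args   # any harmless switch will do
        return link_args

    # e.g. guard_bundle(['-bundle', 'foo.o']) -> ['-Wall', '-bundle', 'foo.o']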
The second problem that I ran into was precisely the same issue that Christopher reported last year (http://www.scipy.net/pipermail/scipy-user/2004-February/002576.html ). That is, the linker was not finding the python libraries to link to the fortran libraries. The solution proffered at that time (http://www.scipy.net/pipermail/scipy-user/2004-February/002587.html ) was to add specific switches to the compiler options for darwin. Unfortunately, these switches are specific for the apple-installed python framework. If a user has manually installed a new python (or has had fink do so for them), this fails. (see Error #2 below) I believe the python distutils provide a unified method to find out the correct libraries to link to, in order to avoid these sort of problems. Anyhow, I don't know how to modify the distutils stuff to fix this automatically. I've succeeded in building SciPy only by manually fixing each breaking call to g77 -- a painful process indeed. Sorry I can't give a specific patch for the problem. Zach Pincus Department of Biochemistry and Program in Biomedical Informatics Stanford University School of Medicine Error #1 -------- gcc: build/src/Lib/fftpack/_fftpackmodule.c /usr/local/bin/g77 -bundle build/temp.darwin-7.8.0-PowerMacintosh-2.3/build/src/Lib/fftpack/ _fftpackmodule.o build/temp.darwin-7.8.0-PowerMacintosh-2.3/Lib/fftpack/src/zfft.o build/temp.darwin-7.8.0-PowerMacintosh-2.3/Lib/fftpack/src/drfft.o build/temp.darwin-7.8.0-PowerMacintosh-2.3/Lib/fftpack/src/zrfft.o build/temp.darwin-7.8.0-PowerMacintosh-2.3/Lib/fftpack/src/zfftnd.o build/temp.darwin-7.8.0-PowerMacintosh-2.3/build/src/fortranobject.o -L/sw/lib -L/usr/local/lib/gcc/powerpc-apple-darwin6.8/3.4.2 -Lbuild/temp.darwin-7.8.0-PowerMacintosh-2.3 -ldfftpack -ldrfftw -ldfftw -lg2c -lcc_dynamic -o build/lib.darwin-7.8.0-PowerMacintosh-2.3/scipy/fftpack/_fftpack.so g77: couldn't run `/usr/local/bin/undle-gcc-3.4.2': No such file or directory g77: couldn't run `/usr/local/bin/undle-gcc-3.4.2': No such file or directory Error #2 -------- /usr/local/bin/g77 -WALL -bundle build/temp.darwin-7.8.0-PowerMacintosh-2.3/build/src/Lib/fftpack/ _fftpackmodule.o build/temp.darwin-7.8.0-PowerMacintosh-2.3/Lib/fftpack/src/zfft.o build/temp.darwin-7.8.0-PowerMacintosh-2.3/Lib/fftpack/src/drfft.o build/temp.darwin-7.8.0-PowerMacintosh-2.3/Lib/fftpack/src/zrfft.o build/temp.darwin-7.8.0-PowerMacintosh-2.3/Lib/fftpack/src/zfftnd.o build/temp.darwin-7.8.0-PowerMacintosh-2.3/build/src/fortranobject.o -L/sw/lib -L/usr/local/lib/gcc/powerpc-apple-darwin6.8/3.4.2 -Lbuild/temp.darwin-7.8.0-PowerMacintosh-2.3 -ldfftpack -ldrfftw -ldfftw -lg2c -lcc_dynamic -o build/lib.darwin-7.8.0-PowerMacintosh-2.3/scipy/fftpack/_fftpack.so /usr/bin/ld: Undefined symbols: _PyArg_ParseTuple _PyArg_ParseTupleAndKeywords _PyCObject_AsVoidPtr _PyCObject_Type _PyComplex_Type _PyDict_GetItemString _PyDict_SetItemString _PyErr_Clear _PyErr_NewException _PyErr_Occurred _PyErr_Print _PyErr_SetString _PyImport_ImportModule _PyInt_Type _PyModule_GetDict _PyNumber_Int _PyObject_GetAttrString _PySequence_Check _PySequence_GetItem _PyString_FromString _PyString_Type _PyType_IsSubtype _PyType_Type _Py_BuildValue _Py_FatalError _Py_InitModule4 __Py_NoneStruct _PyCObject_FromVoidPtr _PyDict_DelItemString _PyDict_New _PyErr_Format _PyExc_AttributeError _PyExc_RuntimeError _PyExc_TypeError _PyObject_Free _PyString_ConcatAndDel _Py_FindMethod __PyObject_New collect2: ld returned 1 exit status From rkern at ucsd.edu Fri Mar 11 20:48:33 2005 From: rkern at ucsd.edu (Robert 
Kern) Date: Fri, 11 Mar 2005 17:48:33 -0800 Subject: [SciPy-user] OS X build problems with non-Apple python -- diagnosis and solution In-Reply-To: <99a9c9d75bf474d1f5e8bb13e6b485f0@stanford.edu> References: <99a9c9d75bf474d1f5e8bb13e6b485f0@stanford.edu> Message-ID: <42324A71.2010202@ucsd.edu> Zachary Pincus wrote: > Hello folks, > > I've run into some trouble building scipy for OS X. Fortunately, I've > got it mostly licked, so I thought I'd explain how. > > The issues were two: > (1) A problem with g77 and the -bundle switch. This switch must not be > used at the beginning of the command line. > (2) The linker not being pointed at the correct set of python libraries > for non-Apple python installs. > > Here are the details: > > Note that I'm following the Christopher Fonnesbeck's directions > (http://www.scipy.org/documentation/Members/fonnesbeck/osx_build.txt ) > exactly, except that I've used Fink to install python. All other > packages were manually installed. I'm using a CVS checkout of SciPy > from today. > > The first problem that I encountered was in building the various > fortran libs with g77. (Full error text below as #1). The gist of the > error is that g77 reports "couldn't run > `/usr/local/bin/undle-gcc-3.4.2': No such file or directory". This > problem turns out to be due to the (insane) fact that g77 has a "-b > MACHINE" switch that is only honored if it is the first element on the > command line. Oh, *fantastic*. > So 'g77 -bundle' makes the compiler look for the > executable undle-gcc-3.4.2. The solution is to put something before the > '-bundle' switch. I used -WALL because it's relatively harmless. (See: > http://www.mail-archive.com/fink-devel at lists.sourceforge.net/ > msg10604.html ) > > So it's important to make sure that -bundle is not the first switch. I > suspect that this problem has not cropped up before because users > without Fink-installed python will get a '-framework' switch before the > -bundle switch. However, this might change in the future; as such it > would be a good idea to explicitly make sure that -bundle is never > first in the g77 command. > > The second problem that I ran into was precisely the same issue that > Christopher reported last year > (http://www.scipy.net/pipermail/scipy-user/2004-February/002576.html ). > That is, the linker was not finding the python libraries to link to the > fortran libraries. The solution proffered at that time > (http://www.scipy.net/pipermail/scipy-user/2004-February/002587.html ) > was to add specific switches to the compiler options for darwin. > Unfortunately, these switches are specific for the apple-installed > python framework. If a user has manually installed a new python (or has > had fink do so for them), this fails. (see Error #2 below) I believe > the python distutils provide a unified method to find out the correct > libraries to link to, in order to avoid these sort of problems. I think the answer to both of your problems is to keep the "-undefined dynamic_lookup" first for both framework and non-framework builds. Google says it should work for Fink Python, too. Try current CVS, please, and report back. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From zpincus at stanford.edu Sat Mar 12 00:11:30 2005 From: zpincus at stanford.edu (Zachary Pincus) Date: Fri, 11 Mar 2005 21:11:30 -0800 Subject: [SciPy-user] OS X build problems with non-Apple python -- diagnosis and solution In-Reply-To: <42324A71.2010202@ucsd.edu> References: <99a9c9d75bf474d1f5e8bb13e6b485f0@stanford.edu> <42324A71.2010202@ucsd.edu> Message-ID: >> The gist of the error is that g77 reports "couldn't run >> `/usr/local/bin/undle-gcc-3.4.2': No such file or directory". This >> problem turns out to be due to the (insane) fact that g77 has a "-b >> MACHINE" switch that is only honored if it is the first element on >> the command line. > > Oh, *fantastic*. Yeah, isn't that great. > I think the answer to both of your problems is to keep the "-undefined > dynamic_lookup" first for both framework and non-framework builds. > Google says it should work for Fink Python, too. > Try current CVS, please, and report back. A current CVS checkout compiles perfectly and tests without errors. Thank you for your quick reply and fix! I really appreciate it. (Now I can get to that fitting of splines that I always wanted to do...) Zach From dd55 at cornell.edu Sat Mar 12 21:07:40 2005 From: dd55 at cornell.edu (Darren Dale) Date: Sat, 12 Mar 2005 21:07:40 -0500 Subject: [SciPy-user] efficient linear algebra Message-ID: <200503122107.40780.dd55@cornell.edu> Hi, I have a simulation that has to repeatedly find the dot product of a 250,000x3 array with a 3x1 array. I have a working atlas 3.6.0, and believe scipy to be properly installed, all the tests pass, etc. I read (http://www.scipy.org/mailinglists/mailman?fn=scipy-user/2004-June/002902.html) that scipy.dot may not be the fastest solution, so I tried dot, = scipy.linalg.blas.get_blas_funcs(['gemm'],(ones([250000,3],'d'),ones((3,1),'d'))) and dot, = scipy.linalg.blas.get_blas_funcs(['dot'],(ones([250000,3],'d'),ones((3,1),'d'))) When I tested how long each takes, scipy.dot took .1 seconds, the blas gemm took 0.13 seconds, and the blas dot gave me the following error: error: (len(y)-offy>(n-1)*abs(incy)) failed for 1st keyword n If I dot an Nx3 with a 3x1, I get the above error. If I dot a 1x3 with a 3xN, I do not get the error. Is this a bug? Is my approach correct? Should I expect much of a performance boost from using the function returned by get_blas_funcs, and if so, are there suggestions as to why I might be taking a performance hit? Where can I learn more about using the blas routines intelligently? Thank you, Darren From zpincus at stanford.edu Sun Mar 13 02:44:07 2005 From: zpincus at stanford.edu (Zachary Pincus) Date: Sat, 12 Mar 2005 23:44:07 -0800 Subject: [SciPy-user] Plot vectors at specified (x,y) locations? Message-ID: Hello, I'm wondering if anyone knows a good way to plot (preferably with xplt, but anything will do) a handful of vectors, given the vector magnitude (u, v) and position (x, y). Now, xplt can plot a vector field defined over a quadrilateral mesh, but I don't need to plot a whole field of vectors! Figuring out how to take a list of arbitrary (x, y) locations and convert that list into a quad mesh where the nodes of the mesh are on the original points is not my idea of fun, so if there's any way to do this otherwise I would be most interested. The basic idea here is that I want to plot the track of an object in (x, y) and then plot velocity or acceleration vectors at various points along the track. This should be simple, right? 
Any help is appreciated, Zach Pincus Department of Biochemistry and Program in Biomedical Informatics Stanford University School of Medicine From jeremy at jeremysanders.net Sun Mar 13 16:48:09 2005 From: jeremy at jeremysanders.net (Jeremy Sanders) Date: Sun, 13 Mar 2005 21:48:09 +0000 (GMT) Subject: [SciPy-user] ANN: Veusz-0.4 released (plotting package) Message-ID: Veusz 0.4 [first public release] --------- http://home.gna.org/veusz/ Veusz is Copyright (C) 2003-2005 Jeremy Sanders Licenced under the GPL (version 2 or greater) Veusz is a scientific plotting package written in Python (currently 100% Python). It uses PyQt for display and user-interfaces, and numarray for handling the numeric data. Veusz is designed to produce publication-ready Postscript output. Veusz provides a GUI, command line and scripting interface (based on Python) to its plotting facilities. The plots are built using an object-based system to provide a consistent interface. Currently done: * X-Y plots (with errorbars) * Stepped plots (for histograms) * Line plots * Function plots * Stacked plots and arrays of plots * Plot keys * Plot labels * LaTeX-like formatting for text * EPS output * Simple data importing * Scripting interface * Save/Load plots * Some work on a manual and introduction To be done: * Contour plots * Images * Filled regions * UI improvements * Import filters (for qdp and other plotting packages, fits, csv) * Data manipulation * Python embedding interface (for embedding Veusz in other programs). [some of the external interface is complete] * Proper installation program Requirements: Python (probably 2.3 or greater required) http://www.python.org/ Qt (free edition) http://www.trolltech.com/products/qt/ PyQt (SIP is required to be installed first) http://www.riverbankcomputing.co.uk/pyqt/ http://www.riverbankcomputing.co.uk/sip/ numarray http://www.stsci.edu/resources/software_hardware/numarray Microsoft Core Fonts (recommended) http://corefonts.sourceforge.net/ For documentation on using Veusz, see the "Documents" directory. The manual is in pdf, html and text format (generated from docbook). If you enjoy using Veusz, I would love to hear from you. Please join the mailing lists at https://gna.org/mail/?group=veusz to discuss new features or if you'd like to contribute code. The newest code can always be found in CVS. Cheers Jeremy -- Jeremy Sanders http://www.jeremysanders.net/ Cambridge, UK Public Key Server PGP Key ID: E1AAE053 From dd55 at cornell.edu Sun Mar 13 17:41:56 2005 From: dd55 at cornell.edu (Darren Dale) Date: Sun, 13 Mar 2005 17:41:56 -0500 Subject: [SciPy-user] advice on scipy installation Message-ID: <200503131741.56949.dd55@cornell.edu> Hello scipy users, I am having some trouble with installing scipy-0.3.2, specifically linking to atlas. I asked a question about this a month or two ago, and didnt receive a response. Please, I would really appreciate some help, and I think I located a bug. First, I have read all the documentation about installing scipy: I am using site.cfg to reflect my setup. The output of system_info.py is included at the end of this email, all the atlas, lapack, and blas info was found and reported (with the exception of blas_src_info and lapack_src_info, which are not available on my machine). So up to this point, things look good. The trouble starts when I try to run python setup.py build. At this point I am getting NOT AVAILABLE messages, but for packages that system_info.py had just successfully queried. 
Why would system_info.py report the existence of packages when I run it manually, but fail to report the same packages during installation? I think I have a lead. site.cfg is not being read during the build, because the installer is looking for it at ./site.cfg instead of at ./scipy_core/scipy_distutils/site.cfg. I think the offending line is around line 287 in system_info.py: cf = os.path.join(os.path.split(os.path.abspath(__file__))[0], 'site.cfg') __file__ is ./setup.py during the build, and ./scipy_core/scipy_distutils/system_info.py when I call system_info.py. Could somebody recommend a fix? Do I copy site.cfg to ./? I dont think this is the right thing to do, but I dont know how to point cf to the right location (I still have a lot to learn about python). Thank you for your time. Darren Here is the output of: python setup_atlas_version.py build_ext --inplace --force ______________________________________________________________ atlas_threads_info: scipy_distutils.system_info.atlas_threads_info NOT AVAILABLE atlas_info: scipy_distutils.system_info.atlas_info NOT AVAILABLE Traceback (most recent call last): File "setup_atlas_version.py", line 27, in ? setup(**configuration()) File "setup_atlas_version.py", line 17, in configuration raise AtlasNotFoundError,AtlasNotFoundError.__doc__ scipy_distutils.system_info.AtlasNotFoundError: Atlas (http://math-atlas.sourceforge.net/) libraries not found. Directories to search for the libraries can be specified in the scipy_distutils/site.cfg file (section [atlas]) or by setting the ATLAS environment variable. output from system_info.py listed below: ______________________________________________________________ _pkg_config_info: NOT AVAILABLE agg2_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE atlas_blas_info: ( library_dirs = /usr/lib:/usr/local/lib ) ( paths: /usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) system_info.atlas_blas_info ( include_dirs = /usr/include/atlas:/usr/local/include:/usr/include ) ( paths: /usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) FOUND: libraries = ['lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = c include_dirs = ['/usr/include/atlas'] atlas_blas_threads_info: ( library_dirs = /usr/lib:/usr/local/lib ) ( paths: 
/usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) system_info.atlas_blas_threads_info ( include_dirs = /usr/include/atlas:/usr/local/include:/usr/include ) ( paths: /usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) FOUND: libraries = ['lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = c include_dirs = ['/usr/include/atlas'] atlas_info: ( library_dirs = /usr/lib:/usr/local/lib ) ( paths: /usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) ( paths: /usr/lib/liblapack.so ) system_info.atlas_info ( include_dirs = /usr/include/atlas:/usr/local/include:/usr/include ) ( paths: /usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) FOUND: libraries = ['lapack', 'lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = f77 include_dirs = ['/usr/include/atlas'] atlas_threads_info: ( library_dirs = /usr/lib:/usr/local/lib ) ( paths: /usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) ( 
paths: /usr/lib/liblapack.so ) system_info.atlas_threads_info ( include_dirs = /usr/include/atlas:/usr/local/include:/usr/include ) ( paths: /usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) FOUND: libraries = ['lapack', 'lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = f77 include_dirs = ['/usr/include/atlas'] blas_info: ( library_dirs = /usr/lib:/usr/local/lib ) ( paths: /usr/lib/libblas.so ) FOUND: libraries = ['blas'] library_dirs = ['/usr/lib'] language = f77 blas_opt_info: running build_src building extension "atlas_version" sources creating build/src adding 'build/src/atlas_version_0x53b6c39a.c' to sources. 
running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext building 'atlas_version' extension compiling C sources i686-pc-linux-gnu-gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -fPIC' creating build/temp.linux-i686-2.3 creating build/temp.linux-i686-2.3/build creating build/temp.linux-i686-2.3/build/src compile options: '-I/usr/include/atlas -I/usr/include/python2.3 -c' i686-pc-linux-gnu-gcc: build/src/atlas_version_0x53b6c39a.c i686-pc-linux-gnu-gcc -pthread -shared build/temp.linux-i686-2.3/build/src/atlas_version_0x53b6c39a.o -L/usr/lib -latlas -o build/temp.linux-i686-2.3/atlas_version.so FOUND: libraries = ['lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = c define_macros = [('ATLAS_INFO', '"\\"3.6.0\\""')] include_dirs = ['/usr/include/atlas'] blas_src_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE boost_python_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE dfftw_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE dfftw_threads_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE djbfft_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE fftw_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE fftw_threads_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE freetype2_info: FOUND: libraries = ['freetype', 'z'] define_macros = [('FREETYPE2_INFO', '"\\"9.7.3\\""'), ('FREETYPE2_VERSION_9_7_3', None)] include_dirs = ['/usr/include/freetype2'] gdk_2_info: FOUND: libraries = ['gdk-x11-2.0', 'gdk_pixbuf-2.0', 'm', 'pangoxft-1.0', 'pangox-1.0', 'pango-1.0', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] extra_link_args = ['-Wl,--export-dynamic'] define_macros = [('GDK_2_INFO', '"\\"2.6.2\\""'), ('GDK_VERSION_2_6_2', None), ('XTHREADS', None), ('_REENTRANT', None), ('XUSE_MTSAFE_API', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/lib/gtk-2.0/include', '/usr/X11R6/include', '/usr/include/pango-1.0', '/usr/include/freetype2', '/usr/include/freetype2/config', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] gdk_info: FOUND: libraries = ['gdk', 'Xi', 'Xext', 'X11', 'm', 'glib'] library_dirs = ['/usr/X11R6/lib'] define_macros = [('GDK_INFO', '"\\"1.2.10\\""'), ('GDK_VERSION_1_2_10', None)] include_dirs = ['/usr/include/gtk-1.2', '/usr/X11R6/include', '/usr/include/glib-1.2', '/usr/lib/glib/include'] gdk_pixbuf_2_info: FOUND: libraries = ['gdk_pixbuf-2.0', 'm', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] define_macros = [('GDK_PIXBUF_2_INFO', '"\\"2.6.2\\""'), ('GDK_PIXBUF_VERSION_2_6_2', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] gdk_pixbuf_xlib_2_info: FOUND: libraries = ['gdk_pixbuf_xlib-2.0', 'gdk_pixbuf-2.0', 'm', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] extra_link_args = ['-Wl,--export-dynamic'] define_macros = [('GDK_PIXBUF_XLIB_2_INFO', '"\\"2.6.2\\""'), ('GDK_PIXBUF_XLIB_VERSION_2_6_2', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] gdk_x11_2_info: FOUND: libraries = ['gdk-x11-2.0', 'gdk_pixbuf-2.0', 'm', 'pangoxft-1.0', 'pangox-1.0', 'pango-1.0', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] extra_link_args = ['-Wl,--export-dynamic'] define_macros = [('GDK_X11_2_INFO', '"\\"2.6.2\\""'), 
('GDK_X11_VERSION_2_6_2', None), ('XTHREADS', None), ('_REENTRANT', None), ('XUSE_MTSAFE_API', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/lib/gtk-2.0/include', '/usr/X11R6/include', '/usr/include/pango-1.0', '/usr/include/freetype2', '/usr/include/freetype2/config', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] gtkp_2_info: FOUND: libraries = ['gtk-x11-2.0', 'gdk-x11-2.0', 'atk-1.0', 'gdk_pixbuf-2.0', 'm', 'pangoxft-1.0', 'pangox-1.0', 'pango-1.0', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] extra_link_args = ['-Wl,--export-dynamic'] define_macros = [('GTKP_2_INFO', '"\\"2.6.2\\""'), ('GTK_VERSION_2_6_2', None), ('XTHREADS', None), ('_REENTRANT', None), ('XUSE_MTSAFE_API', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/lib/gtk-2.0/include', '/usr/X11R6/include', '/usr/include/atk-1.0', '/usr/include/pango-1.0', '/usr/include/freetype2', '/usr/include/freetype2/config', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] gtkp_x11_2_info: FOUND: libraries = ['gtk-x11-2.0', 'gdk-x11-2.0', 'atk-1.0', 'gdk_pixbuf-2.0', 'm', 'pangoxft-1.0', 'pangox-1.0', 'pango-1.0', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] extra_link_args = ['-Wl,--export-dynamic'] define_macros = [('GTKP_X11_2_INFO', '"\\"2.6.2\\""'), ('GTK_X11_VERSION_2_6_2', None), ('XTHREADS', None), ('_REENTRANT', None), ('XUSE_MTSAFE_API', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/lib/gtk-2.0/include', '/usr/X11R6/include', '/usr/include/atk-1.0', '/usr/include/pango-1.0', '/usr/include/freetype2', '/usr/include/freetype2/config', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] lapack_atlas_info: ( library_dirs = /usr/lib:/usr/local/lib ) ( paths: /usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) ( paths: /usr/lib/liblapack.so ) system_info.lapack_atlas_info ( include_dirs = /usr/include/atlas:/usr/local/include:/usr/include ) ( paths: /usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) FOUND: libraries = ['lapack', 'lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = f77 include_dirs = ['/usr/include/atlas'] lapack_atlas_threads_info: ( library_dirs = /usr/lib:/usr/local/lib ) ( paths: /usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) ( 
paths: /usr/lib/liblapack.so ) system_info.lapack_atlas_threads_info ( include_dirs = /usr/include/atlas:/usr/local/include:/usr/include ) ( paths: /usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) FOUND: libraries = ['lapack', 'lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = f77 include_dirs = ['/usr/include/atlas'] lapack_info: ( library_dirs = /usr/lib:/usr/local/lib ) ( paths: /usr/lib/liblapack.so ) FOUND: libraries = ['lapack'] library_dirs = ['/usr/lib'] language = f77 lapack_opt_info: /scratch/python/SciPy_complete-0.3.2/scipy_core/scipy_distutils/system_info.py:812: FutureWarning: hex()/oct() of negative int will return a signed string inPython 2.4 and up magic = hex(hash(`config`)) running build_src building extension "atlas_version" sources adding 'build/src/atlas_version_0xe146d5ff.c' to sources. 
running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext Could not locate executable f77 customize GnuFCompiler customize GnuFCompiler customize GnuFCompiler using build_ext FOUND: libraries = ['lapack', 'lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = f77 define_macros = [('ATLAS_INFO', '"\\"3.6.0\\""')] include_dirs = ['/usr/include/atlas'] lapack_src_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE numarray_info: ( include_dirs = /usr/include/python2.3 ) ( paths: /usr/include/python2.3/numarray/arrayobject.h ) FOUND: define_macros = [('NUMARRAY_VERSION', '"\\"1.1.1\\""')] include_dirs = ['/usr/include/python2.3'] numpy_info: ( include_dirs = /usr/include/python2.3 ) ( paths: /usr/include/python2.3/Numeric/arrayobject.h ) FOUND: define_macros = [('NUMERIC_VERSION', '"\\"23.8\\""')] include_dirs = ['/usr/include/python2.3'] sfftw_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE sfftw_threads_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE wx_info: FOUND: libraries = ['wx_gtk2-2.4'] extra_link_args = ['-pthread'] define_macros = [('WX_INFO', '"\\"2.4.2\\""'), ('WX_VERSION_2_4_2', None), ('WX_RELEASE_2_4', None), ('GTK_NO_CHECK_CASTS', None), ('__WXGTK__', None), ('_FILE_OFFSET_BITS', '64'), ('_LARGE_FILES', None)] include_dirs = ['/usr/lib/wx/include/gtk2-2.4'] x11_info: ( library_dirs = /usr/X11R6/lib:/usr/lib ) ( include_dirs = /usr/X11R6/include:/usr/include ) ( paths: /usr/X11R6/lib/libX11.so ) ( paths: /usr/X11R6/include/X11/X.h ) FOUND: libraries = ['X11'] library_dirs = ['/usr/X11R6/lib'] include_dirs = ['/usr/X11R6/include'] xft_info: FOUND: libraries = ['Xft', 'X11', 'freetype', 'Xrender', 'fontconfig'] define_macros = [('XFT_INFO', '"\\"2.1.2.2\\""'), ('XFT_VERSION_2_1_2_2', None)] include_dirs = ['/usr/include/freetype2', '/usr/include/freetype2/config'] selected output from python setup.py build: ________________________________________________________ lapack_opt_info: atlas_threads_info: scipy_distutils.system_info.atlas_threads_info NOT AVAILABLE atlas_info: scipy_distutils.system_info.atlas_info NOT AVAILABLE scipy_core/scipy_distutils/system_info.py:909: UserWarning: Atlas (http://math-atlas.sourceforge.net/) libraries not found. Directories to search for the libraries can be specified in the scipy_distutils/site.cfg file (section [atlas]) or by setting the ATLAS environment variable. warnings.warn(AtlasNotFoundError.__doc__) lapack_info: FOUND: libraries = ['lapack'] library_dirs = ['/usr/lib'] language = f77 blas_info: FOUND: libraries = ['blas'] library_dirs = ['/usr/lib'] language = f77 FOUND: libraries = ['lapack', 'blas'] library_dirs = ['/usr/lib'] define_macros = [('NO_ATLAS_INFO', 1)] language = f77 blas_opt_info: atlas_blas_threads_info: scipy_distutils.system_info.atlas_blas_threads_info NOT AVAILABLE atlas_blas_info: scipy_distutils.system_info.atlas_blas_info NOT AVAILABLE scipy_core/scipy_distutils/system_info.py:982: UserWarning: Atlas (http://math-atlas.sourceforge.net/) libraries not found. Directories to search for the libraries can be specified in the scipy_distutils/site.cfg file (section [atlas]) or by setting the ATLAS environment variable. 
warnings.warn(AtlasNotFoundError.__doc__) FOUND: libraries = ['blas'] library_dirs = ['/usr/lib'] define_macros = [('NO_ATLAS_INFO', 1)] language = f77 From pajer at iname.com Sun Mar 13 17:49:36 2005 From: pajer at iname.com (Gary) Date: Sun, 13 Mar 2005 17:49:36 -0500 Subject: [SciPy-user] Plot vectors at specified (x,y) locations? In-Reply-To: References: Message-ID: <4234C380.3060608@iname.com> Zachary Pincus wrote: > Hello, > > I'm wondering if anyone knows a good way to plot (preferably with > xplt, but anything will do) a handful of vectors, given the vector > magnitude (u, v) and position (x, y). > > Now, xplt can plot a vector field defined over a quadrilateral mesh, > but I don't need to plot a whole field of vectors! Figuring out how to > take a list of arbitrary (x, y) locations and convert that list into a > quad mesh where the nodes of the mesh are on the original points is > not my idea of fun, so if there's any way to do this otherwise I would > be most interested. > > The basic idea here is that I want to plot the track of an object in > (x, y) and then plot velocity or acceleration vectors at various > points along the track. This should be simple, right? > An alternative that has this functionality (and is very easy to use) is Visual Python: www.vpython.org If you haven't already, you might consider it. -gary From pearu at scipy.org Sun Mar 13 18:20:36 2005 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 13 Mar 2005 17:20:36 -0600 (CST) Subject: [SciPy-user] advice on scipy installation In-Reply-To: <200503131741.56949.dd55@cornell.edu> References: <200503131741.56949.dd55@cornell.edu> Message-ID: On Sun, 13 Mar 2005, Darren Dale wrote: > Hello scipy users, > > I am having some trouble with installing scipy-0.3.2, specifically linking to > atlas. I asked a question about this a month or two ago, and didnt receive a > response. Please, I would really appreciate some help, and I think I located > a bug. > > First, I have read all the documentation about installing scipy: I am using > site.cfg to reflect my setup. The output of system_info.py is included at the > end of this email, all the atlas, lapack, and blas info was found and > reported (with the exception of blas_src_info and lapack_src_info, which are > not available on my machine). So up to this point, things look good. > > The trouble starts when I try to run python setup.py build. At this point I am > getting NOT AVAILABLE messages, but for packages that system_info.py had just > successfully queried. > > Why would system_info.py report the existence of packages when I run it > manually, but fail to report the same packages during installation? I think I > have a lead. site.cfg is not being read during the build, because the > installer is looking for it at ./site.cfg instead of > at ./scipy_core/scipy_distutils/site.cfg. > > I think the offending line is around line 287 in system_info.py: > > cf = os.path.join(os.path.split(os.path.abspath(__file__))[0], > 'site.cfg') > > __file__ is ./setup.py during the build, > and ./scipy_core/scipy_distutils/system_info.py when I call system_info.py. > Could somebody recommend a fix? Do I copy site.cfg to ./? I dont think this > is the right thing to do, but I dont know how to point cf to the right > location (I still have a lot to learn about python). This bug is now fixed in scipy CVS. 
The fix is to replace the six lines starting at line #285 in system_info.py by

    try:
        f = __file__
    except NameError,msg:
        f = sys.argv[0]
    cf = os.path.join(os.path.split(os.path.abspath(f))[0],
                      'site.cfg')

Thanks for reporting this bug! Pearu

From brendansimons at yahoo.ca Sun Mar 13 22:49:27 2005 From: brendansimons at yahoo.ca (Brendan Simons) Date: Sun, 13 Mar 2005 22:49:27 -0500 Subject: [SciPy-user] optimize.leastsq -> fjac and the covariance redux In-Reply-To: <20050211042304.C1A643EB9A@www.scipy.com> References: <20050211042304.C1A643EB9A@www.scipy.com> Message-ID: <3a3fb9faa2c520b871a7b249a85747b0@yahoo.ca> I am scratching my brain trying to find the quickest way to construct the covariance matrix corresponding to a least squares fit, and I've come across some documentation quirks that are confusing me. The method scipy.optimize.leastsq returns, in its optional infodict parameter, a value called fjac which according to the docstring is:

    the orthogonal matrix, q, produced by the QR facotrization of the
    final approximate Jacobi matrix, stored column wise.

But scipy.optimize.leastsq calls the trusty MINPACK fortran routine lmder, whose comments say:

    c       fjac is an output m by n array. the upper n by n submatrix
    c       of fjac contains an upper triangular matrix r with
    c       diagonal elements of nonincreasing magnitude such that
    c
    c              p^t *(jac^t *jac)*p = r^t *r,
    c
    c       where p is a permutation matrix and jac is the final
    c       calculated jacobian. column j of p is column ipvt(j)
    c       (see below) of the identity matrix. the lower trapezoidal
    c       part of fjac contains information generated during
    c       the computation of r.

And a recent comment to this list said: On 10-Feb-05, at 11:23 PM, scipy-user-request at scipy.net wrote: > R. Padraic Springuel wrote: > >> Is there a way to get the uncertainty in the resultant parameters out >> of the optimize.leastsq fitting function? > > optimize.leastsq? reveals that it can return a dictionary of optional > values, among which is the Jacobian matrix fjac at the final step. The > diagonal elements of fjac times its transpose can be taken as the > squares of the errors in the corresponding coefficients.

So fjac is either: 1) the jacobian at the final step, 2) the Q of the QR factorization of the jacobian, or 3) the R of the QR factorization of the jacobian. Why is this important? Because the jacobian at the final step is needed to compute the covariance matrix, which gives the uncertainty in the fitted parameters, which is VERY important. So, any MINPACK experts out there? Can you explain to me what fjac is in clearer terms? Is the scipy.optimize.leastsq docstring accurate? -Brendan ----------------- PS, Here's a bit of linear algebra background: lmder is an implementation of the Levenberg-Marquardt algorithm. The Levenberg-Marquardt method minimizes the sum of the squares of the residual functions by iteratively solving its derivative,

    ( J^t * J ) * x = J^t * f( x ),

for x:

    x = ( J^t * J )^-1 * J^t * f( x )

where J is the Jacobian matrix (the derivatives of all the functions wrt each parameter x), x is the parameters to solve for, and f(x) is a vector of the residuals. (There's another bit here, where the diagonals of ( J^t * J ) are multiplied by a damping factor, but that's not important for the present problem.) That first term on the right side of that last equation, ( J^t * J )^-1, is important, because *that's* the covariance matrix of the parameters x. Here's where I start to get confused.
I *think* lmder solves the above function with QR factorization, so that the above equation becomes:

    x = ( R^t * Q^t * Q * R )^-1 * R^t * Q^t * f( x )
      = ( R^t * R )^-1 * R^t * Q^t * f( x )
      = R^-1 * Q^t * f( x )

where Q, R are the results of the QR factorization of J. If lmder is indeed doing this (I've looked at the code, but my Fortran skills are poor), then I should be able to get the value of R at the last step, and calculate my covariance from ( R^t * R )^-1. But at this point my head is hurting too much to know where to look for this matrix. Am I way off base? From oliphant at ee.byu.edu Sun Mar 13 23:55:32 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 13 Mar 2005 21:55:32 -0700 Subject: [SciPy-user] optimize.leastsq -> fjac and the covariance redux In-Reply-To: <3a3fb9faa2c520b871a7b249a85747b0@yahoo.ca> References: <20050211042304.C1A643EB9A@www.scipy.com> <3a3fb9faa2c520b871a7b249a85747b0@yahoo.ca> Message-ID: <42351944.2020800@ee.byu.edu> Brendan Simons wrote: > I am scratching my brain trying to find the quickest way to construct > the covariance matrix corresponding to a least squares fit, and I've > come across some documentation quirks that are confusing me. > > The method scipy.optimize.leastsq returns, in its optional infodict > parameter, a value called fjac which according to the docstring is: Brendan, Thanks for your detailed comments. I would love to see the covariance matrix as a standard optional output. If you can help us get there from the code that would be great. As I look at the Fortran docstring, I think it's pretty clear that the leastsq Python docstring is wrong (sorry about that since I probably wrote it...) fjac is basically the R matrix (once you apply some permutations); thus, I'm pretty sure you can use the upper triangular part of the fjac matrix with the ipvt vector (also in infodict) to calculate the covariance matrix as you describe. So, I would construct a permutation matrix P (using ipvt) like this:

    n = len(ipvt)
    p = mat(take(eye(n),ipvt))

and then get R using something like

    r = mat(MLab.triu(fjac))
    R = r * p.T

Then the covariance matrix estimate is

    C = (R.T * R).I

This is totally untested, but I think you are on the right track to use fjac as the R matrix of the QR factorization. Thanks for your help, -Travis From gruben at bigpond.net.au Mon Mar 14 05:08:00 2005 From: gruben at bigpond.net.au (Gary Ruben) Date: Mon, 14 Mar 2005 21:08:00 +1100 Subject: [SciPy-user] Plot vectors at specified (x,y) locations? In-Reply-To: References: Message-ID: <42356280.7070309@bigpond.net.au> From gnuplot's help:

    vectors
        The vectors style draws a vector from (x,y) to (x+xdelta,y+ydelta).
        Thus it requires four columns of data. It also draws a small
        arrowhead at the end of the vector.

        Example:
            plot 'file.dat' using 1:2:3:4 with vectors head filled lt 2

Here's a gnuplot example of a vector field plot: If you don't find out how to do it with xplt, try gnuplot.py to drive gnuplot to do what you want. Gary R. Zachary Pincus wrote: > Hello, > > I'm wondering if anyone knows a good way to plot (preferably with xplt, > but anything will do) a handful of vectors, given the vector magnitude > (u, v) and position (x, y).
> Any help is appreciated, > > Zach Pincus > > Department of Biochemistry and Program in Biomedical Informatics > Stanford University School of Medicine From nikolai.hlubek at mailbox.tu-dresden.de Mon Mar 14 06:25:21 2005 From: nikolai.hlubek at mailbox.tu-dresden.de (Nikolai Hlubek) Date: Mon, 14 Mar 2005 11:25:21 +0000 Subject: [SciPy-user] CVS build + Livedocs Message-ID: <1110799521l.3349l.1l@ptpcp11> Hi everyone, I'm a bit lost when trying to use the livedocs with the latest cvs build. 1.) The building process seems to ignore my atlas packages and uses only lapack. This is the error I get: /scratch/hlubek/cvs/scipy/scipy_core/scipy_distutils/ system_info.py:653: UserWarning: ********************************************************************* Could not find lapack library within the ATLAS installation. ********************************************************************* warnings.warn(message) /scratch/hlubek/cvs/scipy/scipy_core/scipy_distutils/ system_info.py:865: FutureWarning: hex()/oct() of negative int will return a signed string in Python 2.4 and up magic = hex(hash(`config`)) I'm using debian sarge with the atlas2-sse2 packages but atlas3-sse2 is also installed due to some python-2.3 dependencies. The output of system_info.py can be found below. 2.) When using the livedocs I don't get all of the help files. I.e. scipy.xplt.histogram is displaying an empty help string in the livedocs, although running >> ??scipy.xplt.histogram from ipython clearly shows that it is available. scipy.xplt.legend isn't even available in the livedocs, as are many other methods and all of the source files. Could these two issues be related? Or has anyone another idea what I am doing wrong? Thanks for your help, Nikolai _pkg_config_info: NOT AVAILABLE agg2_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE atlas_blas_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: /usr/lib/atlas,/usr/lib/sse2 ) ( paths: /usr/lib/sse2/libf77blas.so ) ( paths: /usr/lib/sse2/libcblas.so ) ( paths: /usr/lib/sse2/libatlas.so ) ( include_dirs = /usr/local/include:/usr/include ) ( paths: /usr/include/atlas_misc.h,/usr/include/atlas_enum.h,/usr/ include/atlas_aux.h,/usr/include/atlas_type.h ) ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: /usr/lib/atlas,/usr/lib/sse2 ) FOUND: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/sse2'] language = c atlas_blas_threads_info: Setting PTATLAS=ATLAS ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: /usr/lib/atlas,/usr/lib/sse2 ) ( paths: /usr/lib/sse2/libatlas.so ) ( paths: /usr/lib/sse2/libatlas.a ) NOT AVAILABLE atlas_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: /usr/lib/atlas,/usr/lib/sse2 ) ( paths: /usr/lib/sse2/libf77blas.so ) ( paths: /usr/lib/sse2/libcblas.so ) ( paths: /usr/lib/sse2/libatlas.so ) ( paths: /usr/lib/sse2/liblapack_atlas.so ) system_info.atlas_info ( include_dirs = /usr/local/include:/usr/include ) ( paths: /usr/include/atlas_misc.h,/usr/include/atlas_enum.h,/usr/ include/atlas_aux.h,/usr/include/atlas_type.h ) ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: /usr/lib/atlas,/usr/lib/sse2 ) FOUND: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/sse2'] language = c define_macros = [('ATLAS_WITHOUT_LAPACK', None)] atlas_threads_info: Setting PTATLAS=ATLAS ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: /usr/lib/atlas,/usr/lib/sse2 ) ( paths: /usr/lib/sse2/libatlas.so ) ( paths: /usr/lib/sse2/libatlas.a ) ( paths: /usr/lib/sse2/liblapack_atlas.so ) system_info.atlas_threads_info NOT 
AVAILABLE blas_info: ( library_dirs = /usr/local/lib:/usr/lib ) NOT AVAILABLE blas_opt_info: running build_src building extension "atlas_version" sources creating build creating build/src adding 'build/src/atlas_version_0xaa8955e5.c' to sources. running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext building 'atlas_version' extension compiling C sources gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' creating build/temp.linux-i686-2.3 creating build/temp.linux-i686-2.3/build creating build/temp.linux-i686-2.3/build/src compile options: '-I/usr/include/python2.3 -c' gcc: build/src/atlas_version_0xaa8955e5.c gcc -pthread -shared build/temp.linux-i686-2.3/build/src/ atlas_version_0xaa8955e5.o -L/usr/lib/sse2 -lf77blas -lcblas -latlas -o build/temp.linux-i686-2.3/atlas_version.so Command: /usr/bin/python -c "import imp;imp.load_dynamic (\"atlas_version\",\"atlas_version.so\")" Status: 1 Output: Traceback (most recent call last): File "", line 1, in ? ImportError: /usr/lib/sse2/libf77blas.so.2: undefined symbol: e_wsfe ( library_dirs = /usr/local/lib:/usr/lib ) FOUND: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/sse2'] language = c define_macros = [('NO_ATLAS_INFO', 2)] blas_src_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE boost_python_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE dfftw_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE dfftw_threads_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE djbfft_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE fftw_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) ( paths: /usr/lib/librfftw.so ) ( paths: /usr/lib/libfftw.so ) ( paths: /usr/include/fftw.h,/usr/include/rfftw.h ) ( library_dirs = /usr/local/lib:/usr/lib ) FOUND: libraries = ['rfftw', 'fftw'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_FFTW_H', None)] include_dirs = ['/usr/include'] fftw_threads_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) ( paths: /usr/lib/librfftw_threads.so ) ( paths: /usr/lib/libfftw_threads.so ) ( paths: /usr/include/fftw_threads.h,/usr/include/rfftw_threads.h ) ( library_dirs = /usr/local/lib:/usr/lib ) FOUND: libraries = ['rfftw_threads', 'fftw_threads'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_FFTW_THREADS_H', None)] include_dirs = ['/usr/include'] freetype2_info: NOT AVAILABLE gdk_2_info: NOT AVAILABLE gdk_info: NOT AVAILABLE gdk_pixbuf_2_info: NOT AVAILABLE gdk_pixbuf_xlib_2_info: NOT AVAILABLE gdk_x11_2_info: NOT AVAILABLE gtkp_2_info: NOT AVAILABLE gtkp_x11_2_info: NOT AVAILABLE lapack_atlas_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: /usr/lib/atlas,/usr/lib/sse2 ) ( paths: /usr/lib/sse2/liblapack_atlas.so ) ( paths: /usr/lib/sse2/libf77blas.so ) ( paths: /usr/lib/sse2/libcblas.so ) ( paths: /usr/lib/sse2/libatlas.so ) ( paths: /usr/lib/sse2/liblapack_atlas.so ) system_info.lapack_atlas_info ( include_dirs = /usr/local/include:/usr/include ) ( paths: /usr/include/atlas_misc.h,/usr/include/atlas_enum.h,/usr/ include/atlas_aux.h,/usr/include/atlas_type.h ) ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: /usr/lib/atlas,/usr/lib/sse2 ) FOUND: libraries = ['lapack_atlas', 'f77blas', 'cblas', 'atlas'] library_dirs = 
['/usr/lib/sse2'] language = c define_macros = [('ATLAS_WITH_LAPACK_ATLAS', None)] lapack_atlas_threads_info: Setting PTATLAS=ATLAS ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: /usr/lib/atlas,/usr/lib/sse2 ) ( paths: /usr/lib/sse2/liblapack_atlas.so ) ( paths: /usr/lib/sse2/libatlas.so ) ( paths: /usr/lib/sse2/liblapack_atlas.a ) ( paths: /usr/lib/sse2/libatlas.a ) ( paths: /usr/lib/sse2/liblapack_atlas.so ) system_info.lapack_atlas_threads_info NOT AVAILABLE lapack_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( paths: /usr/lib/liblapack.a ) ( library_dirs = /usr/local/lib:/usr/lib ) FOUND: libraries = ['lapack'] library_dirs = ['/usr/lib'] language = f77 lapack_opt_info: running build_src building extension "atlas_version" sources adding 'build/src/atlas_version_0x31b82f3b.c' to sources. running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext Command: /usr/bin/python -c "import imp;imp.load_dynamic (\"atlas_version\",\"atlas_version.so\")" Status: 1 Output: Traceback (most recent call last): File "", line 1, in ? ImportError: /usr/lib/sse2/libf77blas.so.2: undefined symbol: e_wsfe ( library_dirs = /usr/local/lib:/usr/lib ) FOUND: libraries = ['f77blas', 'cblas', 'atlas', 'lapack'] library_dirs = ['/usr/lib/sse2', '/usr/lib'] define_macros = [('ATLAS_WITHOUT_LAPACK', None), ('NO_ATLAS_INFO', 2)] language = f77 lapack_src_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE numarray_info: ( include_dirs = /usr/include/python2.3 ) ( paths: /usr/include/python2.3/numarray/arrayobject.h ) ( library_dirs = ) FOUND: define_macros = [('NUMARRAY_VERSION', '"\\"1.1.1\\""')] include_dirs = ['/usr/include/python2.3'] numpy_info: ( include_dirs = /usr/include/python2.3 ) ( paths: /usr/include/python2.3/Numeric/arrayobject.h ) ( library_dirs = ) FOUND: define_macros = [('NUMERIC_VERSION', '"\\"23.7\\""')] include_dirs = ['/usr/include/python2.3'] sfftw_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE sfftw_threads_info: ( library_dirs = /usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include ) NOT AVAILABLE wx_info: Could not locate executable wx-config File not found: wx-config. Cannot determine wx info. NOT AVAILABLE x11_info: ( library_dirs = /usr/X11R6/lib:/usr/lib ) ( include_dirs = /usr/X11R6/include:/usr/include ) ( paths: /usr/X11R6/lib/libX11.so ) ( paths: /usr/X11R6/include/X11/X.h ) ( library_dirs = /usr/X11R6/lib:/usr/lib ) FOUND: libraries = ['X11'] library_dirs = ['/usr/X11R6/lib'] include_dirs = ['/usr/X11R6/include'] xft_info: NOT AVAILABLE From dd55 at cornell.edu Mon Mar 14 09:24:40 2005 From: dd55 at cornell.edu (Darren Dale) Date: Mon, 14 Mar 2005 09:24:40 -0500 Subject: [SciPy-user] advice on scipy installation In-Reply-To: References: <200503131741.56949.dd55@cornell.edu> Message-ID: <200503140924.41083.dd55@cornell.edu> On Sunday 13 March 2005 06:20 pm, Pearu Peterson wrote: > > This bug is now fixed in scipy CVS. The fix is to replace the six lines > starting at line #285 in system_info.py by > > try: > f = __file__ > except NameError,msg: > f = sys.argv[0] > cf = os.path.join(os.path.split(os.path.abspath(f))[0], > 'site.cfg') > Thank you Pearu, it looks like that did the trick. By the way, is anyone on this list working on, or interested in working on, the scipy ebuild for gentoo? This bugfix brings the ebuild one step closer to completion. Darren From andrew.fant at tufts.edu Mon Mar 14 09:28:04 2005 From: andrew.fant at tufts.edu (Andrew D. 
Fant) Date: Mon, 14 Mar 2005 09:28:04 -0500 Subject: [SciPy-user] advice on scipy installation In-Reply-To: <200503140924.41083.dd55@cornell.edu> References: <200503131741.56949.dd55@cornell.edu> <200503140924.41083.dd55@cornell.edu> Message-ID: <42359F74.3060001@tufts.edu> Darren Dale wrote: > By the way, is anyone on this list working on, or interested in working on, > the scipy ebuild for gentoo? This bugfix brings the ebuild one step closer to > completion. Darren, I've been following the ebuild saga and wouldn't mind lending a hand with testing and the like. Are you following the bugs related to it in bugzilla? Andy From dd55 at cornell.edu Mon Mar 14 10:02:30 2005 From: dd55 at cornell.edu (Darren Dale) Date: Mon, 14 Mar 2005 10:02:30 -0500 Subject: [SciPy-user] advice on scipy installation In-Reply-To: <42359F74.3060001@tufts.edu> References: <200503131741.56949.dd55@cornell.edu> <200503140924.41083.dd55@cornell.edu> <42359F74.3060001@tufts.edu> Message-ID: <200503141002.30629.dd55@cornell.edu> On Monday 14 March 2005 09:28 am, Andrew D. Fant wrote: > Darren Dale wrote: > > By the way, is anyone on this list working on, or interested in working > > on, the scipy ebuild for gentoo? This bugfix brings the ebuild one step > > closer to completion. > > Darren, > I've been following the ebuild saga and wouldn't mind lending a hand > with testing and the like. Are you following the bugs related to it in > bugzilla? I have been following the bugzilla page. In fact, I made the last five posts to that page. I got the ball rolling on changes to the numeric ebuild, so it would link to existing lapack, blas, and atlas libraries. The official ebuild just installs the libraries shipped with numeric. They havent ironed out all the wrinkles on all the platforms, but I expect the numeric changes will soon be part of the official gentoo tree. Once they get numeric figured out, we can copy numeric's atlas/lapack/blas dependencies into the scipy ebuild. Now that site.cfg is being read during the build, I think I can get the ebuild working. I'll check with the gentoo devs and see whats happening with numeric. -- Darren From brendansimons at yahoo.ca Mon Mar 14 14:07:21 2005 From: brendansimons at yahoo.ca (Brendan Simons) Date: Mon, 14 Mar 2005 14:07:21 -0500 (EST) Subject: [SciPy-user] Re: optimize.leastsq -> fjac and the covariance redux In-Reply-To: <20050314112543.779723EC08@www.scipy.com> Message-ID: <20050314190721.2915.qmail@web31110.mail.mud.yahoo.com> >Brendan, > >Thanks for your detailed comments. I would love to >see the covariance >matrix as a standard optional output. If you can help >us get there from >the code that would be great. > >As I look at the Fortran docstring, I think it's >pretty clear that the >leastsq Python docstring is wrong (sorry about that >since I probably >wrote it...) fjac is basically the R matrix (once you >apply some >permutations), thus, I'm pretty sure you can use the >upper triangular >part of the fjac matrix with the ipvt vector (also in >infodict) to >caculate the covariance matrix as you describe. > >So, I would construct a permutation matrix P (using >ipvt) like this: > >n = len(ipvt) >p = mat(take(eye(n),ipvt)) > >and then get R using some thing like > >r = mat(MLab.triu(fjac)) >R = r * p.T > >Then covariance matrix estimate is > >C = (R.T * R).I > > >This is totally untested, but I think you are on the >right track to use >fjac as the R matrix of the QR factorization. 
> >Thanks for your help, > >-Travis Travis, I too would like to see the covariance in the leastsq optional outputs. Using your suggestion, I whipped up the following script, which fits a simple quadratic (chosen because I can calculate the jacobian easily by hand): #---begin script-------- """script to test the fjac output matrix of scipy.optimize.leastsq Placed in public domain by Brendan Simons March 2005 """ from scipy import * import MLab from scipy.optimize import leastsq x = array([0., 1., 2., 3., 4., 5.], typecode='f') coeffs = [4., 3., 5.] yErrs = array([0.1, 0.2, -0.1, 0.05, 0, -.02], typecode='f') m = len(x) n = len(coeffs) def evalY(p, x): """a simple quadratic""" return p[2]*x**2 + p[1]*x + p[0] def J(p, x): """the jacobian of a quadratic""" result = zeros([m,n], typecode='f') result[:, 0] = ones(m, typecode='f') result[:, 1] = x result[:, 2] = x*x return result def resid(p, y, x): return y - evalY(p, x) y_true = y(coeffs, x) y_meas = y_true + yErrs p0 = [3., 2., 4.] pMin, infodict, ier, mesg = leastsq(resid, p0, args=(y_meas, x), full_output=True) fjac = infodict['fjac'] ipvt = infodict['ipvt'] #here's travis' suggestion perm = mat(take(eye(n),ipvt-1)) r = mat(MLab.triu(fjac[:,:n])) R = r * perm.T C = (R.T * R).I #check against know algorithms for computing C Q_true, R_true = linalg.qr(J(pMin, x)) C_true = linalg.inv(dot(transpose(J(pMin, x)), J(pMin, x))) C_true2 = linalg.inv(dot(transpose(R_true), R_true)) print 'pMin' print pMin print '\nfjac' print fjac print'\nipvt' print ipvt print '\nperm' print perm print '\nr' print r print '\nR' print R print '\nR_true' print R_true print '\nC' print C print '\nC_true' print C_true print '\nC_true2' print C_true2 #---end script-------- Note that I had to make the following modifications to your algorithm: a) The Fortran docstring says r is in the first nxn submatrix of fjac, so I truncated fjac with [:, :n] before applying MLab.triu b) the identity matrix columns returned by ipvt are 1-based, not 0-based (Fortran is 1-based I guess). So I subtracted 1 in which results in perm = mat(take(eye(n),ipvt-1)) The results from the script are as follows: #--- begin script output---- pMin [ 4.13714286 2.93428571 5.00714286] fjac [[-31.28897539 -0.03196014 -0.12784056 -0.28764125 -0.51136222 -0.79900346] [ -7.19103119 1.81357942 0.59589038 0.51365984 0.17797854 -0.41115309] [ -1.75780755 1.30104626 -1.10335453 0.03523648 -0.29732131 -0.95308465]] ipvt [3 2 1] perm Matrix([[ 0., 0., 1.], [ 0., 1., 0.], [ 1., 0., 0.]]) r Matrix([[-31.28897539, -0.03196014, -0.12784056], [ 0. , 1.81357942, 0.59589038], [ 0. , 0. , -1.10335453]]) R Matrix([[ -0.12784056, -0.03196014, -31.28897539], [ 0.59589038, 1.81357942, 0. ], [ -1.10335453, 0. , 0. ]]) R_true [[ -2.44948983 -6.12372446 -22.45365715] [ 0. 4.18329954 20.916502 ] [ 0. 0. 6.11010218] [ 0. 0. 0. ] [ 0. 0. 0. ] [ 0. 0. 0. ]] C Matrix([[ 8.21428627e-001, -2.69897976e-001, -3.08050728e-003], [-2.69897976e-001, 3.92718047e-001, 7.01607645e-004], [-3.08050728e-003, 7.01607645e-004, 1.03332016e-003]]) C_true [[ 0.8214277 -0.58928472 0.08928555] [-0.58928496 0.72678488 -0.13392843] [ 0.08928559 -0.13392843 0.02678569]] C_true2 [[ 0.82142925 -0.58928621 0.08928578] [-0.58928627 0.72678584 -0.13392855] [ 0.08928579 -0.13392855 0.0267857 ]] #--- end script output---- So you can see it doesn't work as expected. C does not equal C_true. More troublesome is that *no* columns of fjac resemble R_true, no matter what permutation matrix is applied. At this point I'm lost as to what is what. Any suggestions? 
-Brendan -- PS, If and when we figure out the answer, I'd like to submit the change to CVS. However, never having done such a thing, I will need some instruction. ______________________________________________________________________ Post your free ad now! http://personals.yahoo.ca From brendansimons at yahoo.ca Mon Mar 14 15:32:39 2005 From: brendansimons at yahoo.ca (Brendan Simons) Date: Mon, 14 Mar 2005 15:32:39 -0500 (EST) Subject: [SciPy-user] Re: optimize.leastsq -> fjac and the covariance redux In-Reply-To: 6667 Message-ID: <20050314203239.2304.qmail@web31108.mail.mud.yahoo.com> Aha! Great job. I do want to learn CVS at some point, but it's probably not worth the effort to get me developer access, especially as my implementation would probably not the most efficient. If you're willing to implement the Covariance output yourself, please send me a note when it's committed. Thanks again for your help. -Brendan --- Travis Oliphant wrote: > Brendan Simons wrote: > > Thanks for the example script. > > I think I've got it worked out. > > The only problem with the script is that fjac is in > "column-order" i.e. > Fortran order. So, really what we have provided is > r.T > > Note that there are several q-r factorizations for a > given matrix, so > comparing r is not useful. > > If you change this line: > > r = mat(MLab.triu(fjac[:,:n])) > > to this: > > r = mat(MLab.triu(transpose(fjac)[:n,:])) > > the results match up !!! > > With the change, I get > > C > Matrix([[ 0.821428641610194, -0.589285827415852, > 0.089285739190248], > [-0.589285827415852, 0.726785855320911, > -0.133928600229779], > [ 0.089285739190248, -0.133928600229779, > 0.026785719996822]]) > > C_true > [[ 0.821428571428571 -0.589285714285714 > 0.089285714285714] > [-0.589285714285714 0.726785714285714 > -0.133928571428571] > [ 0.089285714285714 -0.133928571428571 > 0.026785714285714]] > > C_true2 > [[ 0.82142857142857 -0.589285714285713 > 0.089285714285714] > [-0.589285714285714 0.726785714285714 > -0.133928571428571] > [ 0.089285714285714 -0.133928571428571 > 0.026785714285714]] > > > This is great. I'll see if I can't put in a new > optional output for > leastsq that makes use of this. > > If you'd like to do it, then that would be great > too. It is easy to > commit to CVS once you have developer access. If > you are interested > then I could get you set up. > > You would need to check out a "logged-in" version of > the CVS tree and > then committing is as easy as > > cvs commit -m "Some comments" > > > Thanks for your help. > > -Travis > > ______________________________________________________________________ Post your free ad now! http://personals.yahoo.ca From oliphant at ee.byu.edu Mon Mar 14 15:45:53 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 14 Mar 2005 13:45:53 -0700 Subject: [SciPy-user] Re: optimize.leastsq -> fjac and the covariance redux In-Reply-To: <20050314203239.2304.qmail@web31108.mail.mud.yahoo.com> References: <20050314203239.2304.qmail@web31108.mail.mud.yahoo.com> Message-ID: <4235F801.4020900@ee.byu.edu> Brendan Simons wrote: >Aha! Great job. > > >If you're willing to implement the Covariance output >yourself, please send me a note when it's committed. > > I made the change and now when using full_output for leastsq, the covariance matrix is returned (as a separate output). Please test it and report back problems. 
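An untested sketch of the new calling sequence (the covariance slots in as the second output, matching the unpacking used in the test scripts below):

pbest, cov_x, infodict, ier, mesg = leastsq(resid, p0, args=(y_meas, x), full_output=True)

Internally it is the construction we worked out above: writing the ipvt pivoting as a permutation matrix P, J*P = Q*R implies inv(J.T*J) = P * inv(R.T*R) * P.T, and cov_x is that matrix. Note that it estimates inv(J.T*J) only; scale it by the residual variance, e.g. sum(resid(pbest, y_meas, x)**2)/(len(y_meas)-len(pbest)), if you want absolute variances for the parameters.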
-Travis From stephen.walton at csun.edu Mon Mar 14 18:11:41 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 14 Mar 2005 15:11:41 -0800 Subject: [SciPy-user] Re: [Numpy-discussion] Current thoughts on future directions In-Reply-To: <4231076E.6090507@ims.u-tokyo.ac.jp> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> <6916ec732f2e70d1789cc0f480f82e7f@redivi.com> <422FBD4A.3030708@ee.byu.edu> <423084F6.7020804@csun.edu> <4231076E.6090507@ims.u-tokyo.ac.jp> Message-ID: <42361A2D.2030708@csun.edu> Michiel Jan Laurens de Hoon wrote: > I agree that Netlib should be in SciPy. But why should Netlib be in > scipy_base? It should not, and I'm sorry if my original message made it sound like I was advocating for that. I was mainly advocating for f2py to be in scipy_base. From stephen.walton at csun.edu Mon Mar 14 19:59:29 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 14 Mar 2005 16:59:29 -0800 Subject: [SciPy-user] Re: optimize.leastsq -> fjac and the covariance redux In-Reply-To: <4235F801.4020900@ee.byu.edu> References: <20050314203239.2304.qmail@web31108.mail.mud.yahoo.com> <4235F801.4020900@ee.byu.edu> Message-ID: <42363371.2030709@csun.edu> Travis Oliphant wrote: > I made the change and now when using full_output for leastsq, the > covariance matrix is returned (as a separate output). I tested it against Brendan's original script after making a couple of changes: at line 34 the call to y() needs to be evalY(); add the additional argument to the list of output arguments from leastsq; and add "epsfcn=0.0001" to the calling sequence of leastsq. This last is needed because otherwise the delta value for the numeric Jacobian is chosen too small for single precision arithmetic. Once that's done, the values of C_true, C_true2, and the new output from leastsq agree to within roundoff error, I believe. From brendansimons at yahoo.ca Mon Mar 14 22:39:47 2005 From: brendansimons at yahoo.ca (Brendan Simons) Date: Mon, 14 Mar 2005 22:39:47 -0500 Subject: [SciPy-user] Re: optimize.leastsq -> fjac and the covariance redux In-Reply-To: <4235F801.4020900@ee.byu.edu> References: <20050314203239.2304.qmail@web31108.mail.mud.yahoo.com> <4235F801.4020900@ee.byu.edu> Message-ID: <32f1b6832c08ecf6bada19715bfa7114@yahoo.ca> >> > I made the change and now when using full_output for leastsq, the > covariance matrix is returned (as a separate output). > Please test it and report back problems. > > -Travis > Hmmm, some issues. I ran the following updated test script: #----------begin test script-------------- from scipy import * from scipy.optimize import leastsq x = array([0., 1., 2., 3., 4., 5.], Float64) coeffs = [4., 3., 5., 2.] yErrs = array([0.1, 0.2, -0.1, 0.05, 0, -.02], Float64) m = len(x) n = len(coeffs) def evalY(p, x): """a simple cubic""" return x**3 * p[3] + x**2 * p[2] + x * p[1] + p[0] def J(p, y, x): """the jacobian of a cubic (not actually a function of y, p since resid is linear in p) """ result = zeros([m,n], Float64) result[:, 0] = ones(m, Float64) result[:, 1] = x result[:, 2] = x**2 result[:, 3] = x**3 return result def resid(p, y, x): return y - evalY(p, x) y_true = evalY(coeffs, x) y_meas = y_true + yErrs p0 = [3., 2., 4., 1.] 
pMin, C, infodict, ier, mesg = leastsq(resid, p0, args=(y_meas, x), full_output=True) #check against know algorithms for computing C JAtPMin = J(pMin, None, x) Q_true, R_true = linalg.qr(JAtPMin) C_true = linalg.inv(dot(transpose(JAtPMin), JAtPMin)) C_true2 = linalg.inv(dot(transpose(R_true), R_true)) print 'pMin' print pMin print '\nC' print C print '\nC_true' print C_true print '\nC_true2' print C_true2 #-------end test script--------- Which is basically the same as before, but with a cubic instead of a quadratic. Here's what I get: #------ begin script output------ pMin [ 4.1315873 2.95965608 4.99325397 2.00185185] C [[ 3.62323451 -1.22354445 0.21141966 -1.71957585] [-1.22354445 0.96031733 -0.04629627 0.43650769] [ 0.21141966 -0.04629627 0.01543209 -0.11574069] [-1.71957585 0.43650769 -0.11574069 0.89484084]] C_true [[ 0.96031746 -1.22354497 0.43650794 -0.0462963 ] [-1.22354497 3.62323633 -1.71957672 0.21141975] [ 0.43650794 -1.71957672 0.89484127 -0.11574074] [-0.0462963 0.21141975 -0.11574074 0.0154321 ]] C_true2 [[ 0.96031746 -1.22354497 0.43650794 -0.0462963 ] [-1.22354497 3.62323633 -1.71957672 0.21141975] [ 0.43650794 -1.71957672 0.89484127 -0.11574074] [-0.0462963 0.21141975 -0.11574074 0.0154321 ]] #----------end script output----------- Here C has all the right outputs, but in the wrong order. Evidently we're applying the ipvt permutation matrix wrong somehow. Here's something worse. If I include the jacobian in the leastsq call as follows pMin, C, infodict, ier, mesg = leastsq(resid, p0, args=(y_meas, x), Dfun = J, full_output=True) the resulting pMin is the same as p0, and C is way off of C_true. What gives? -Brendan From evelien at ster.kuleuven.ac.be Tue Mar 15 03:50:28 2005 From: evelien at ster.kuleuven.ac.be (Evelien Vanhollebeke) Date: Tue, 15 Mar 2005 09:50:28 +0100 Subject: [SciPy-user] plotting with xplt In-Reply-To: <42308663.70303@ee.byu.edu> References: <422FFECF.2030602@ster.kuleuven.ac.be> <42308663.70303@ee.byu.edu> Message-ID: <4236A1D4.6030508@ster.kuleuven.ac.be> This indeed, was what I was looking for, thanks for the help! Unfortunately most of the time when a problem gets solved another appears... I want to write my plots to a ps file. In principle this is not a problem: with hcp_file I set my default ps file and every fma() causes my plots to be written to that file. The problem is, I do not want them to be shown first on the X window, I want to directly plot them in my ps file. So I was wondering if this was possible with the subplot command. In the window command there are some options like display and so, but if I first use the subplot command and then the window command the X window still appears in between these 2 commands and also my ps file doesn't look anymore as I intended to (it should contain multiple pages, with on every page multiple plots, but not always the same amount of plots on a page). Thanks! Evelien Travis Oliphant wrote: > Evelien Vanhollebeke wrote: > >> Hi everybody, >> >> I wondered if anyone has ever tried with xplt to plot multiple >> figures on one page. I know it is easy with pylab, but I would like >> to try it in xplt. One of the reasons is that in pylab I can't make a >> ps file consisting of different pages without doing a "psmerge" or >> something like that. If anyone has ever been successful on this, I >> hope I can get some ideas/examples on how to do it. I am still in the >> process of deciding which plotting package fulfills my needs. >> > > Yes, I've done it many times. Look at xplt.subplot. 
You use > xplt.plsys to change which coordinate system you are plotting to. > > The nice thing about xplt is that it is very fast... > > -Travis > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From brendansimons at yahoo.ca Tue Mar 15 12:55:56 2005 From: brendansimons at yahoo.ca (Brendan Simons) Date: Tue, 15 Mar 2005 12:55:56 -0500 (EST) Subject: [SciPy-user] Re: optimize.leastsq -> fjac and the covariance redux In-Reply-To: 6667 Message-ID: <20050315175556.6455.qmail@web31110.mail.mud.yahoo.com> OK, solved both problems. The r matrix obtained from fjac must be post multiplied by the permutation matrix, not its transpose. Line 254 in minpack.py should be changed from: R = dot(r, transpose(perm)) to: R = dot(r, perm) That gets the ordering of the Covariance to match my test. As for the second issue, (the fact that giving Dfun=J in leastsq leads to the wrong minimum) I did some googling and found this post from John Hunter a year ago: >John Hunter scipy-user at scipy.net >Mon, 21 Apr 2003 13:55:09 -0500 > >I am using leastsq to do a best fit to a simple >exponential function. >In my test script, I find that if I use col_deriv=0, I >get a different >answer than I get if I do a col_deriv=1 with the err >func returning >the transposed jacobian. When I compare the true >parameters with the >best fit parameters, the correct answer is with >col_deriv=1 but not >with col_deriv=0. Am I misusing this parameter or is >something amiss? --example snipped-- Sure enough, providing the transpose of J to Dfun, and setting col_deriv = True gives the right answer! This is obviously a bug in how leastsq handles col_deriv, and needs to be looked at. The good news is I now have a covariance matrix! Thanks Travis! -Brendan On March 15 I wrote: > >Here C has all the right outputs, but in the wrong >order. Evidently >we're applying the >ipvt permutation matrix wrong somehow. > >Here's something worse. If I include the jacobian in >the leastsq call >as follows >pMin, C, infodict, ier, mesg = leastsq(resid, p0, >args=(y_meas, x), >Dfun = J, full_output=True) >the resulting pMin is the same as p0, and C is way off >of C_true. What >gives? -Brendan ______________________________________________________________________ Post your free ad now! http://personals.yahoo.ca From rspringuel at smcvt.edu Tue Mar 15 13:56:02 2005 From: rspringuel at smcvt.edu (R. Padraic Springuel) Date: Tue, 15 Mar 2005 13:56:02 -0500 Subject: [SciPy-user] Re: Partial Derivatives Message-ID: <42372FC2.1080300@smcvt.edu> I know its been a while since I last brought this up (Vol 19, Issue 1), but after some digging, I've discovered Scientific.Functions.Derivatives (it came with the Enthought Edition of python that I use). It defines a DerivVar type that works well for taking derivatives of functions, supporting partial derivatives (even cross partials) very well. I haven't fully tested the limits of its accuracy, but some basic tests show it to be at least as accurate as the top level derivative function in scipy. Perhaps instead of trying to modify derivative (and central_diff_weights) it'll be easier in future versions of scipy to include this type and raise it to the top level. -- R. 
Padraic Springuel From oliphant at ee.byu.edu Tue Mar 15 14:06:59 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 15 Mar 2005 12:06:59 -0700 Subject: [SciPy-user] Re: Partial Derivatives In-Reply-To: <42372FC2.1080300@smcvt.edu> References: <42372FC2.1080300@smcvt.edu> Message-ID: <42373253.1050700@ee.byu.edu> R. Padraic Springuel wrote: > I know its been a while since I last brought this up (Vol 19, Issue > 1), but after some digging, I've discovered > Scientific.Functions.Derivatives (it came with the Enthought Edition > of python that I use). It defines a DerivVar type that works well for > taking derivatives of functions, supporting partial derivatives (even > cross partials) very well. I haven't fully tested the limits of its > accuracy, but some basic tests show it to be at least as accurate as > the top level derivative function in scipy. Perhaps instead of trying > to modify derivative (and central_diff_weights) it'll be easier in > future versions of scipy to include this type and raise it to the top > level. > This is a nice class as it does automatic differentiation. I like a lot of what is in Scientific (though I don't like the naming conventions). But, does it handle all cases? What if the function is not built from standard functions? Does it perform numerical differentiation then? -Travis From Fernando.Perez at colorado.edu Tue Mar 15 20:46:46 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 15 Mar 2005 18:46:46 -0700 Subject: [Numpy-discussion] Re: [SciPy-user] Current thoughts on future directions In-Reply-To: <42310D7D.3000009@ims.u-tokyo.ac.jp> References: <422EA691.9080404@ee.byu.edu> <422F335F.8060107@csun.edu> <6916ec732f2e70d1789cc0f480f82e7f@redivi.com> <422FBD4A.3030708@ee.byu.edu> <422FD009.4020706@enthought.com> <42310D7D.3000009@ims.u-tokyo.ac.jp> Message-ID: <42379006.4070501@colorado.edu> Michiel Jan Laurens de Hoon wrote: > Perry Greenfield wrote: [weave & f2py in the core] >>That they are perfectly fine to go into the core. In fact, if they are >>used by any of the extra packages, they should be in the core to >>eliminate the extra step in the installation of those packages. >> > > -0. > 1) In der Beschraenkung zeigt sich der Meister. In other words, avoid > software bloat. > 2) f2py is a Fortran-Python interface generator, once the interface is > created there is no need for the generator. > 3) I'm sure f2py is useful, but I doubt that it has very general > usefulness. There are lots of other useful Python packages, but we're > not including them in scipy-core either. > 4) f2py and weave don't fit in well with the rest of scipy-core, which > is mainly standard numerical algorithms. I'd like to argue that these two tools are actually critically important in the core of a python for scientific computing toolkit, at its most basic layer. The reason is that python's dynamic runtime type checking makes it impossible to write efficient loop-based code, as we all know. And it is not always feasible to write all algorithms in terms of Numeric vector operations: sometimes you just need to write an indexed loop. At this point, the standard python answer is 'go write an extension module'. While writing extension modules by hand, from scratch, is not all that hard, it certainly presents a significant barrier for less experienced programmers. And yet both weave and f2py make it incredibly easy to get working compiled array code in no time at all. 
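To make this concrete, here is the flavor of example I hand people to get started -- an untested sketch using weave.inline to replace an indexed loop over a Numeric array (I am writing the type-conversion details from memory, so treat it as illustrative rather than definitive):

from scipy import weave
import Numeric

def boxcar3(a):
    # three-point moving average of a 1-d Float64 array,
    # with the inner loop written in C
    n = len(a)
    out = Numeric.zeros(n, Numeric.Float64)
    code = """
           for (int i = 1; i < n - 1; i++) {
               out[i] = (a[i-1] + a[i] + a[i+1]) / 3.0;
           }
           """
    weave.inline(code, ['a', 'n', 'out'])
    return out

The first call pays a one-time compilation cost; after that the compiled version is cached and the loop runs at C speed.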
I say this from direct experience, having pointed colleagues to weave and f2py for this very problem. After handing them some notes I have to get started, they've come back saying "I can't believe it was that easy: in a few minutes I had sped up the loop I needed with a bit of C, and now I can continue working on the problem I'm interested in". I know for a fact that if I'd told them to write a full extension module by hand, the result would have been quite different. The reality is that, in scientific work, you are likely to run into this problem at a very early stage, much more so than for other kinds of python usage. For this reason, it is important that the basic toolset provides a clean solution from the start. At least that's been my experience. Regards, f From r_hlp at yahoo.com Wed Mar 16 20:15:21 2005 From: r_hlp at yahoo.com (r help) Date: Wed, 16 Mar 2005 17:15:21 -0800 (PST) Subject: [SciPy-user] stats newbie question Message-ID: <20050317011521.51748.qmail@web42408.mail.yahoo.com> hi i am attempting to use the cdf of a lognormal distribution but i cant seem to locate the docs to do this correctly. ive been trying from scipy import * from scipy.stats import * then i try lognorm.cdf(1,1) stats.lognorm.cdf(1,1) which just give errors. can anyone shed some light on how to access the lognorm distribution. -gong From rkern at ucsd.edu Wed Mar 16 20:57:31 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 16 Mar 2005 17:57:31 -0800 Subject: [SciPy-user] stats newbie question In-Reply-To: <20050317011521.51748.qmail@web42408.mail.yahoo.com> References: <20050317011521.51748.qmail@web42408.mail.yahoo.com> Message-ID: <4238E40B.5060103@ucsd.edu> r help wrote: > hi > > i am attempting to use the cdf of a lognormal > distribution but i cant seem to locate the docs to do > this correctly. Starting from http://www.scipy.org/livedocs/ you navigate to http://oliphant.ee.byu.edu:81/scipy/stats/lognorm Alternatively, you can read the docstring of the lognorm object. > ive been trying > > from scipy import * > from scipy.stats import * > > then i try > > lognorm.cdf(1,1) > stats.lognorm.cdf(1,1) > > which just give errors. What errors? This is what I get: In [1]: from scipy import * In [2]: stats.lognorm.cdf(1,1) Out[2]: NumPy array, format: long 0.5 > can anyone shed some light on > how to access the lognorm distribution. Read the livedocs. Also, help us to help you and read http://www.catb.org/~esr/faqs/smart-questions.html -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From petr.kucera1 at seznam.cz Thu Mar 17 10:58:42 2005 From: petr.kucera1 at seznam.cz (Petr Kucera) Date: Thu, 17 Mar 2005 16:58:42 +0100 (CET) Subject: [SciPy-user] Simple 2D animations? Message-ID: <414.19-8045-306696085-1111075122@seznam.cz> Hello Scipy users! I'm new here and actually I'm looking for a tool to plot simple 2D animations. I want, for example, to do a simple 2D animation showing the earth going around the sun. Another example is to plot an x-y animation of the Lorenz attractor. I would like to have a point which will move with time and possibly draw the trail.
To give a better idea, here is a simple script: ============================ import Numeric t = 0 i = 0 while 1: t=i/100.0 x=Numeric.sin(t) y=Numeric.sin(t+0.5) print x, y i+=1 ============================ I would like to plot just the last (x,y) point. I should get a simple animation, where the (x,y) point will move in a graph. Is SciPy the right tool for me? Many thanks for any hint! Peter From nwagner at mecha.uni-stuttgart.de Thu Mar 17 11:06:23 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 17 Mar 2005 17:06:23 +0100 Subject: [SciPy-user] Simple 2D animations? In-Reply-To: <414.19-8045-306696085-1111075122@seznam.cz> References: <414.19-8045-306696085-1111075122@seznam.cz> Message-ID: <4239AAFF.3080603@mecha.uni-stuttgart.de> Petr Kucera wrote: >Hello Scipy users! > >I'm new here and actually I'm looking for a tool to plot simple 2D animations. >--rest of quoted message snipped-- You may try import Numeric from scipy import * from scipy.xplt import * import gui_thread xplt.hold('on') t = 0 i = 0 while 1: t=i/100.0 x=Numeric.sin(t) y=Numeric.sin(t+0.5) print x, y xplt.plot(x,y,'b+') xplt.pause(10) i+=1 Nils From rspringuel at smcvt.edu Thu Mar 17 11:28:25 2005 From: rspringuel at smcvt.edu (R. Padraic Springuel) Date: Thu, 17 Mar 2005 11:28:25 -0500 Subject: [SciPy-user] Re: Re: Partial Derivatives In-Reply-To: <20050317160646.5C9923EB5F@www.scipy.com> References: <20050317160646.5C9923EB5F@www.scipy.com> Message-ID: <4239B029.9050600@smcvt.edu> What do you mean by non-standard functions? -- R. Padraic Springuel From nwagner at mecha.uni-stuttgart.de Thu Mar 17 11:59:25 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 17 Mar 2005 17:59:25 +0100 Subject: [SciPy-user] Jacobian matrix Message-ID: <4239B76D.5080202@mecha.uni-stuttgart.de> Hi all, Is there a simpler way (built-in function) to compute the Jacobian of a vector function? How about higher order derivatives of vector-valued functions? from scipy import * def f(x): # tmp = zeros(5,Float) tmp[0] = x[0] + x[1] + x[2]**2 + x[3]**2 + x[4]**2 - 2 tmp[1] = x[0] - x[1] + x[2]**2 + x[3]**2 + x[4]**2 tmp[2] = -x[2]**2 + x[3]**2 + x[4]**2 tmp[3] = x[2]**2 - x[3]**2 + x[4]**2 tmp[4] = x[2]**2 + x[3]**2 - x[4]**2 return tmp x0 = array(([1.02,1.02,0.02,0.02,0.02])) eps = 1.e-5 J = zeros((5,5),Float) for i in arange(0,5): ei = zeros(5,Float) ei[i] = 1.0 J[:,i] = (f(x0+eps*ei)-f(x0))/eps Any pointer would be appreciated.
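For the moment the best I have is the obvious generalization of the loop above, wrapped as a helper (untested sketch; numjac is just a local name, not something built into scipy):

def numjac(f, x0, eps=1.e-6):
    # forward-difference approximation to the Jacobian of f at x0
    f0 = f(x0)
    m, n = len(f0), len(x0)
    J = zeros((m,n),Float)
    for i in arange(0,n):
        ei = zeros(n,Float)
        ei[i] = eps
        J[:,i] = (f(x0+ei)-f0)/eps
    return J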
Thanks in advance Nils From oliphant at ee.byu.edu Thu Mar 17 12:18:19 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 17 Mar 2005 10:18:19 -0700 Subject: [SciPy-user] Re: Re: Partial Derivatives In-Reply-To: <4239B029.9050600@smcvt.edu> References: <20050317160646.5C9923EB5F@www.scipy.com> <4239B029.9050600@smcvt.edu> Message-ID: <4239BBDB.3030503@ee.byu.edu> R. Padraic Springuel wrote: > What do you mean by non-standard functions? > I mean functions for which the Scientific.Derivatives does not have a known derivative (like a function that comes from some integration, or optimization step). -Travis From r_hlp at yahoo.com Thu Mar 17 15:57:17 2005 From: r_hlp at yahoo.com (r help) Date: Thu, 17 Mar 2005 12:57:17 -0800 (PST) Subject: [SciPy-user] stats newbie question In-Reply-To: 6667 Message-ID: <20050317205717.54289.qmail@web42407.mail.yahoo.com> hi > In [1]: from scipy import * > In [2]: stats.lognorm.cdf(1,1) > Out[2]: NumPy array, format: long > 0.5 thanks for the suggestion, here are the errors i get when i type in the suggestion. note that i installed scipy using a debian package; does this mean the problem lies in the debian scipy package? thanks for any further help -gong > What errors? here they are: Python 2.3.5 (#2, Feb 9 2005, 00:38:15) [GCC 3.3.5 (Debian 1:3.3.5-8)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> from scipy import * >>> stats.lognorm.cdf(1,1) Traceback (most recent call last): File "", line 1, in ? File "/usr/lib/python2.3/site-packages/scipy_base/ppimport.py", line 303, in __getattr__ module = self._ppimport_importer() File "/usr/lib/python2.3/site-packages/scipy_base/ppimport.py", line 273, in _ppimport_importer module = __import__(name,None,None,['*']) File "/usr/lib/python2.3/site-packages/scipy/stats/__init__.py", line 7, in ? import pstats ImportError: No module named pstats >>> __________________________________ Do you Yahoo!? Take Yahoo! Mail with you! Get it on your mobile phone. http://mobile.yahoo.com/maildemo From oliphant at ee.byu.edu Thu Mar 17 16:10:36 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 17 Mar 2005 14:10:36 -0700 Subject: [SciPy-user] stats newbie question In-Reply-To: <20050317205717.54289.qmail@web42407.mail.yahoo.com> References: <20050317205717.54289.qmail@web42407.mail.yahoo.com> Message-ID: <4239F24C.5010406@ee.byu.edu> r help wrote: >here they are: > > >Python 2.3.5 (#2, Feb 9 2005, 00:38:15) >[GCC 3.3.5 (Debian 1:3.3.5-8)] on linux2 >Type "help", "copyright", "credits" or "license" for >more information. > > >>>>from scipy import * >>>>stats.lognorm.cdf(1,1) >>>> >>>> >Traceback (most recent call last): > File "", line 1, in ? > File >"/usr/lib/python2.3/site-packages/scipy_base/ppimport.py", >line 303, in __getattr__ > module = self._ppimport_importer() > File >"/usr/lib/python2.3/site-packages/scipy_base/ppimport.py", >line 273, in _ppimport_importer > module = __import__(name,None,None,['*']) > File >"/usr/lib/python2.3/site-packages/scipy/stats/__init__.py", >line 7, in ? > import pstats >ImportError: No module named pstats > > > > > > This looks like a package installation problem. Could you tell us what version of scipy you are running? 
type scipy.__version__ From pearu at scipy.org Thu Mar 17 16:18:28 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 17 Mar 2005 15:18:28 -0600 (CST) Subject: [SciPy-user] stats newbie question In-Reply-To: <4239F24C.5010406@ee.byu.edu> References: <20050317205717.54289.qmail@web42407.mail.yahoo.com> <4239F24C.5010406@ee.byu.edu> Message-ID: On Thu, 17 Mar 2005, Travis Oliphant wrote: > r help wrote: > >> here they are: >> >>>>> from scipy import * >>>>> stats.lognorm.cdf(1,1) >>>>> >> import pstats >> ImportError: No module named pstats >> > This looks like a package installation problem. Could you tell us what > version of scipy you are running? pstats comes from python profiler package. Either one should install python-profiler or use scipy from CVS that contains a fix to this issue. Pearu From r_hlp at yahoo.com Thu Mar 17 16:21:15 2005 From: r_hlp at yahoo.com (r help) Date: Thu, 17 Mar 2005 13:21:15 -0800 (PST) Subject: [SciPy-user] stats newbie question In-Reply-To: 6667 Message-ID: <20050317212115.57983.qmail@web42406.mail.yahoo.com> hi it is version 0.3.2. however ive looked over the debian bug system and this error has been logged against the debian package python2.3-scipy. it has been closed; the problem is that the fix hasnt propagated down to my distribution yet, so this is not a problem with scipy but with the debian package. the fix is simply to comment out 'import pstats' in stats/__init__.py, which ive done and so far it seems to work. thanks anyway! -gong --- Travis Oliphant wrote: > r help wrote: > > >here they are: > > > > > >Python 2.3.5 (#2, Feb 9 2005, 00:38:15) > >[GCC 3.3.5 (Debian 1:3.3.5-8)] on linux2 > >Type "help", "copyright", "credits" or "license" > for > >more information. > > > > > >>>>from scipy import * > >>>>stats.lognorm.cdf(1,1) > >>>> > >>>> > >Traceback (most recent call last): > > File "", line 1, in ? > > File > >"/usr/lib/python2.3/site-packages/scipy_base/ppimport.py", > >line 303, in __getattr__ > > module = self._ppimport_importer() > > File > >"/usr/lib/python2.3/site-packages/scipy_base/ppimport.py", > >line 273, in _ppimport_importer > > module = __import__(name,None,None,['*']) > > File > >"/usr/lib/python2.3/site-packages/scipy/stats/__init__.py", > >line 7, in ? > > import pstats > >ImportError: No module named pstats > > > > > > > > > > > > > This looks like a package installation problem. > Could you tell us what version of scipy you are > running? > > type > > scipy.__version__ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > __________________________________ Do you Yahoo!? Yahoo! Small Business - Try our new resources site! http://smallbusiness.yahoo.com/resources/ From Fernando.Perez at colorado.edu Thu Mar 17 16:28:24 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 17 Mar 2005 14:28:24 -0700 Subject: [SciPy-user] stats newbie question In-Reply-To: <20050317212115.57983.qmail@web42406.mail.yahoo.com> References: <20050317212115.57983.qmail@web42406.mail.yahoo.com> Message-ID: <4239F678.5030408@colorado.edu> r help wrote: > hi > > it is version 0.3.2. however ive looked over the > debian bug system and this error has been logged > against the debian package python2.3-scipy. it has > been closed; the problem is that the fix hasnt > propagated down to my distribution yet, so this is not > a problem with scipy but with the debian package. 
the > fix is simply to comment out 'import pstats' in > stats/__init__.py, which ive done and so far it seems > to work. thanks anyway! Debian recently removed the profiler package for licensing reasons, you need to get it separately. http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=294500 I had to patch ipython to handle this problem myself. Best, f From gruben at bigpond.net.au Fri Mar 18 03:31:18 2005 From: gruben at bigpond.net.au (Gary Ruben) Date: Fri, 18 Mar 2005 19:31:18 +1100 Subject: [SciPy-user] Simple 2D animations? In-Reply-To: <414.19-8045-306696085-1111075122@seznam.cz> References: <414.19-8045-306696085-1111075122@seznam.cz> Message-ID: <423A91D6.7090000@bigpond.net.au> Hi Petr, Take a look at Visual Python at http://vpython.org It is a 3-D package and comes with examples of both orbiting masses and a Lorenz attractor in 3D. Gary R. Petr Kucera wrote: > Hello Scipy users! > > I'm new here and actually I'm looking for a tool to plot simple 2D animations. > --rest of quoted message snipped-- From JeanClaude.Razafindrakoto at axa-cessions.com Fri Mar 18 05:17:56 2005 From: JeanClaude.Razafindrakoto at axa-cessions.com (Razafindrakoto Jean Claude) Date: Fri, 18 Mar 2005 11:17:56 +0100 Subject: [SciPy-user] [Scipy & Python 2.4] Naive question Message-ID: Is there any release of Scipy working with Python 2.4? Thanks ______________________________________________________________ AXA CESSIONS Jean-Claude RAZAFINDRAKOTO Re-ARMS Reinsurance - Actuarial and Risk Management Services 109 rue La Boétie 75008 Paris Tel. : +33 1 56 43 78 54 Fax. : +33 1 56 43 78 70 E-mail : jeanclaude.razafindrakoto at axa-cessions.com _______________________________________________________________ From jdgleeson at mac.com Fri Mar 18 11:17:34 2005 From: jdgleeson at mac.com (John Gleeson) Date: Fri, 18 Mar 2005 09:17:34 -0700 Subject: [SciPy-user] Jacobian matrix In-Reply-To: <4239B76D.5080202@mecha.uni-stuttgart.de> References: <4239B76D.5080202@mecha.uni-stuttgart.de> Message-ID: On Mar 17, 2005, at 9:59 AM, Nils Wagner wrote:
> > > from scipy import * > > def f(x): > # > tmp = zeros(5,Float) > tmp[0] = x[0] + x[1] + x[2]**2 + x[3]**2 + x[4]**2 - 2 > tmp[1] = x[0] - x[1] + x[2]**2 + x[3]**2 + x[4]**2 > tmp[2] = -x[2]**2 + x[3]**2 + x[4]**2 > tmp[3] = x[2]**2 - x[3]**2 + x[4]**2 > tmp[4] = x[2]**2 + x[3]**2 - x[4]**2 > return tmp > > x0 = array(([1.02,1.02,0.02,0.02,0.02])) > eps = 1.e-5 > J = zeros((5,5),Float) > for i in arange(0,5): > > ei = zeros(5,Float) > ei[i] = 1.0 > J[:,i] = (f(x0+eps*ei)-f(x0))/eps > > This approach doesn't uses the ScientificPython extension to scipy, so it doesn't satisfy your request for a built-in function, but it is fairly simple, and even provides higher order derivatives: >>> def f(x): ... # ... tmp = zeros(5,PyObject) # --- Notice change in typecode --- ... tmp[0] = x[0] + x[1] + x[2]**2 + x[3]**2 + x[4]**2 - 2 ... tmp[1] = x[0] - x[1] + x[2]**2 + x[3]**2 + x[4]**2 ... tmp[2] = -x[2]**2 + x[3]**2 + x[4]**2 ... tmp[3] = x[2]**2 - x[3]**2 + x[4]**2 ... tmp[4] = x[2]**2 + x[3]**2 - x[4]**2 ... return tmp >>> x0 = array(([DerivVar(1.02,0,2),DerivVar(1.02,1,2),DerivVar(0.02,2,2),DerivVa r(0.02,3,2),DerivVar(0.02,4,2)]),PyObject) >>> D = f(x0) >>> D array([(0.041199999999999903, [1.0, 1.0, 0.040000000000000001, 0.040000000000000001, 0.040000000000000001], [[0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 2.0, 0.0, 0.0], [0.0, 0.0, 0.0, 2.0, 0.0], [0.0, 0.0, 0.0, 0.0, 2.0]]) , (0.0012000000000000001, [1.0, -1.0, 0.040000000000000001, 0.040000000000000001, 0.040000000000000001], [[0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 2.0, 0.0, 0.0], [0.0, 0.0, 0.0, 2.0, 0.0], [0.0, 0.0, 0.0, 0.0, 2.0]]) , (0.00040000000000000002, [0.0, 0.0, -0.040000000000000001, 0.040000000000000001, 0.040000000000000001], [[0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, -2.0, 0.0, 0.0], [0.0, 0.0, 0.0, 2.0, 0.0], [0.0, 0.0, 0.0, 0.0, 2.0]]) , (0.00040000000000000002, [0.0, 0.0, 0.040000000000000001, -0.040000000000000001, 0.040000000000000001], [[0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 2.0, 0.0, 0.0], [0.0, 0.0, 0.0, -2.0, 0.0], [0.0, 0.0, 0.0, 0.0, 2.0]]) , (0.00040000000000000002, [0.0, 0.0, 0.040000000000000001, 0.040000000000000001, -0.040000000000000001], [[0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0, 0.0, 0.0], [0.0, 0.0, 2.0, 0.0, 0.0], [0.0, 0.0, 0.0, 2.0, 0.0], [0.0, 0.0, 0.0, 0.0, -2.0]]) ],'O') # To get them into a more natural form: >>> f0 = array(([x[0] for x in D])) >>> f0 array([ 0.0412, 0.0012, 0.0004, 0.0004, 0.0004]) >>> Df0 = array(([x[1] for x in D])) >>> Df0 array([[ 1. , 1. , 0.04, 0.04, 0.04], [ 1. , -1. , 0.04, 0.04, 0.04], [ 0. , 0. , -0.04, 0.04, 0.04], [ 0. , 0. , 0.04, -0.04, 0.04], [ 0. , 0. 
, 0.04, 0.04, -0.04]]) >>> D2f0 = array(([x[2] for x in D])) >>> D2f0 array([[[ 0., 0., 0., 0., 0.], [ 0., 0., 0., 0., 0.], [ 0., 0., 2., 0., 0.], [ 0., 0., 0., 2., 0.], [ 0., 0., 0., 0., 2.]], [[ 0., 0., 0., 0., 0.], [ 0., 0., 0., 0., 0.], [ 0., 0., 2., 0., 0.], [ 0., 0., 0., 2., 0.], [ 0., 0., 0., 0., 2.]], [[ 0., 0., 0., 0., 0.], [ 0., 0., 0., 0., 0.], [ 0., 0., -2., 0., 0.], [ 0., 0., 0., 2., 0.], [ 0., 0., 0., 0., 2.]], [[ 0., 0., 0., 0., 0.], [ 0., 0., 0., 0., 0.], [ 0., 0., 2., 0., 0.], [ 0., 0., 0., -2., 0.], [ 0., 0., 0., 0., 2.]], [[ 0., 0., 0., 0., 0.], [ 0., 0., 0., 0., 0.], [ 0., 0., 2., 0., 0.], [ 0., 0., 0., 2., 0.], [ 0., 0., 0., 0., -2.]]]) From jdgleeson at mac.com Fri Mar 18 11:19:28 2005 From: jdgleeson at mac.com (John Gleeson) Date: Fri, 18 Mar 2005 09:19:28 -0700 Subject: [SciPy-user] Jacobian matrix In-Reply-To: <4239B76D.5080202@mecha.uni-stuttgart.de> References: <4239B76D.5080202@mecha.uni-stuttgart.de> Message-ID: I forgot to include the first line: >>> from Scientific.Functions.Derivatives import DerivVar From cookedm at physics.mcmaster.ca Fri Mar 18 16:42:18 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Fri, 18 Mar 2005 16:42:18 -0500 Subject: [SciPy-user] [Scipy & Python 2.4] Naive question In-Reply-To: (Razafindrakoto Jean Claude's message of "Fri, 18 Mar 2005 11:17:56 +0100") References: Message-ID: Razafindrakoto Jean Claude writes: > Is there any release of Scipy working with Python 2.4 ? Current CVS works fine for me. -- |>|\/|< /--------------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From rspringuel at smcvt.edu Fri Mar 18 23:41:10 2005 From: rspringuel at smcvt.edu (R. Padraic Springuel) Date: Fri, 18 Mar 2005 23:41:10 -0500 Subject: [SciPy-user] Partial Derivatives In-Reply-To: <20050318101647.314753EB8D@www.scipy.com> References: <20050318101647.314753EB8D@www.scipy.com> Message-ID: <423BAD66.20201@smcvt.edu> Well, after some figuring, I've come to the following conclusions. Scientific.Functions.Derivatives is only for analytic functions. If you have a numerical data set, you can't use it to get the derivative of the function that describes that data set at a given point unless you know what that function's analytical form. This means that it can be used in place of scipy.derivative, but not scipy.central_diff_weights. However, for its purpose it seems to work well. As written it works with most functions within Numeric. It doesn't currently have differentiation routines for arcsinh, arccosh, and arctanh, but adding those is fairly easy (and a fix I've already made). It also can't handle logn, log2, sec, csc, cot, arcsec, arccsc, arccot, sech, csch, coth, arcsech, arccsch, and arccoth but I think that's more because these functions are not defined by Numeric (likely because they all can be written in terms of functions which are defined fairly easily). I'm probably going to fix this at some point, I just have to figure out where to put the definitions of these additional functions since ufuncs (where the other basic mathematical functions are defined) is not editable. The biggest issue I have with the package is an issue it inherits from Numeric. For certain functions which have a limited domain for the real range space, you have to explicitly tell numeric that your allowing for the possibility of a complex answer by phrasing the input as a complex number with a 0 imaginary component (i.e. 
sqrt(-3) doesn't work but sqrt(-3+0j) does). In order for DerivVar to work in the part of the domain that uses the non-real range you have to pull the same trick. Now, the wrappers in scimath do quite nicely as a workaround for this difficulty for normal functions, but they don't work for DerivVar class inputs. They weren't written with the DerivVar class in mind and raise a TypeError when you try to put a DerivVar into them (even in the "normal" domain for the function). I'm not quite sure how to fix this issue yet, but I'm looking into it. I'm pretty sure it'll involve a rewrite of _tocomplex in scimath, I'm just not sure how to do so yet. If anyone has any ideas or suggestions, let me know. I'll make sure I post my progress and will make the code available when it's finished. -- R. Padraic Springuel From rspringuel at smcvt.edu Sat Mar 19 11:37:07 2005 From: rspringuel at smcvt.edu (R. Padraic Springuel) Date: Sat, 19 Mar 2005 11:37:07 -0500 Subject: [SciPy-user] Issues with scimath In-Reply-To: <20050318101647.314753EB8D@www.scipy.com> References: <20050318101647.314753EB8D@www.scipy.com> Message-ID: <423C5533.1040701@smcvt.edu> Does anyone else have problems with the scimath wrappers for arccos, arcsin, and arctanh? I'm running Python 2.3 on a Windows XP machine and every time I try to call those functions for an input that has a result in the complex range, python experiences a fatal error and is forced to close. This doesn't happen for any of the other functions in scimath and also doesn't happen when I import Numeric instead of scipy. Has anyone else experienced this problem or know how to fix it? -- R. Padraic Springuel From nwagner at mecha.uni-stuttgart.de Mon Mar 21 02:29:22 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 21 Mar 2005 08:29:22 +0100 Subject: [SciPy-user] Jacobian matrix In-Reply-To: References: <4239B76D.5080202@mecha.uni-stuttgart.de> Message-ID: <423E77D2.7090701@mecha.uni-stuttgart.de> John Gleeson wrote: > > On Mar 17, 2005, at 9:59 AM, Nils Wagner wrote: > >> Hi all, >> >> Is there a simpler way (built-in function) to compute the Jacobian >> of a vector function ? >> --rest of question snipped-- > > This approach doesn't uses the ScientificPython extension to scipy, > so it doesn't satisfy your request for a built-in function, but it > is fairly simple, and even provides higher order derivatives: Afaik, ScientificPython is an extra package and not a part of scipy. http://starship.python.net/~hinsen/ScientificPython/ScientificPythonManual/Scientific_8.html However, it would be nice if this functionality were directly available in scipy as well. Nils --quoted code and output snipped-- From jdgleeson at mac.com Mon Mar 21 09:44:09 2005 From: jdgleeson at mac.com (John Gleeson) Date: Mon, 21 Mar 2005 07:44:09 -0700 Subject: [SciPy-user] Jacobian matrix In-Reply-To: <423E77D2.7090701@mecha.uni-stuttgart.de> References: <4239B76D.5080202@mecha.uni-stuttgart.de> <423E77D2.7090701@mecha.uni-stuttgart.de> Message-ID: <812a9f08b196a91f3aec03c42f578aa2@mac.com> On Mar 21, 2005, at 12:29 AM, Nils Wagner wrote: > John Gleeson wrote: >> >> This approach doesn't uses the ScientificPython extension to scipy, >> so it doesn't satisfy your request for a built-in function, but it >> is fairly simple, and even provides higher order derivatives: > > Afaik, ScientificPython is an extra package and not a part of scipy. > ... > Nils Yes, you are correct. I tried to admit that in my first sentence, but garbled it very badly. Would you believe that English is my first language?
Really, it's true. :) From srijit at yahoo.com Mon Mar 21 11:37:23 2005 From: srijit at yahoo.com (Srijit Kumar Bhadra) Date: Mon, 21 Mar 2005 08:37:23 -0800 (PST) Subject: [SciPy-user] Python 2.4 (Windows) Binaries of SciPy Message-ID: <20050321163723.22947.qmail@web60006.mail.yahoo.com> Hello, I am looking for Python 2.4 Binaries (Windows) of SciPy. Are there any plans to upload Python 2.4 binaries? With regards, Srijit Kumar Bhadra __________________________________ Do you Yahoo!? Yahoo! Small Business - Try our new resources site! http://smallbusiness.yahoo.com/resources/ From mattb at columbia.edu Mon Mar 21 15:52:41 2005 From: mattb at columbia.edu (Matthew Bogosian) Date: Mon, 21 Mar 2005 12:52:41 -0800 Subject: [SciPy-user] Python 2.4 Win32 Binaries? Message-ID: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 This may have been asked before, but I haven't been able to find a recent reference in the mailing lists. From the installation documentation, under "Building SciPy under MS Windows": > Building Scipy is easy --- that is, provided that you have all > required software installed properly. This may very well be true, but properly installing all the required software is anything but. I know that Python 2.4 has been out for quite some time. Numeric is up to 23.8 now. If building SciPy is so easy, is there a reason why 2.4 (and more recent Numeric) binaries are not available? -- Matt -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.2.4 (Darwin) iD8DBQFCPzQinLpDzL5I7l8RAo8CAJsGZ5M+aBUvnGnmavAL5GHIbSAhoACfXUbZ 9fOXLmC/2exQ1jzWZd3iO2A= =HPtD -----END PGP SIGNATURE----- From oliphant at ee.byu.edu Mon Mar 21 19:36:16 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 21 Mar 2005 17:36:16 -0700 Subject: [SciPy-user] Python 2.4 Win32 Binaries? In-Reply-To: References: Message-ID: <423F6880.5020109@ee.byu.edu> Matthew Bogosian wrote: > -----BEGIN PGP SIGNED MESSAGE----- > Hash: SHA1 > > This may have been asked before, but I haven't been able to find a > recent reference in the mailing lists. > > From the installation documentation, under "Building SciPy under MS > Windows": > >> Building Scipy is easy --- that is, provided that you have all >> required software installed properly. > > > This may very well be true, but properly installing all the required > software is anything but. I know that Python 2.4 has been out for > quite some time. Numeric is up to 23.8 now. If building SciPy is so > easy, is there a reason why 2.4 (and more recent Numeric) binaries are > not available? > The reason: lack of willing helpers. Nobody has stepped up to do it. I, for example, don't use Windows regularly, and I'm involved in the numarray-Numeric reunification project right now. Not to mention, my main Python platform is still 2.3. For me, building on Windows is a pain to set up (because of Windows, not scipy). Enthought delivers an Enthon edition which helps -- but I don't think they are using Python 2.4 yet. Those of you who use Windows regularly and are using Python 2.4 should really be building the binaries. -Travis From srijit at yahoo.com Wed Mar 23 05:38:50 2005 From: srijit at yahoo.com (Srijit Kumar Bhadra) Date: Wed, 23 Mar 2005 02:38:50 -0800 (PST) Subject: [SciPy-user] Build Error (Python 2.4, Scipy 0.3.2) on Windows XP Message-ID: <20050323103850.50357.qmail@web60001.mail.yahoo.com> Hello, I attempted to build SciPy 0.3.2 with Python 2.4 on a Windows XP machine. I followed http://www.scipy.org/documentation/buildscipywin32.txt.
I am using MinGW-3.2.0-rc-3, Numeric-23.8.win32-py2.4, atlas3.6.0_WinNT_P4SSE2 and FFTW 2.1.3 (http://claymore.engineer.gvsu.edu/~steriana/fftw213.libs.zip). I am getting the following build error (copied and pasted below). I would appreciate any help with solving this problem. With regards, Srijit

running build_ext
customize Mingw32CCompiler
customize Mingw32CCompiler using build_ext
customize GnuFCompiler
customize GnuFCompiler
customize GnuFCompiler using build_ext
building 'scipy.fftpack._fftpack' extension
compiling C sources
gcc options: '-O2 -Wall -Wstrict-prototypes'
compile options: '-Ibuild\src -ID:\python24\include -ID:\python24\PC -c'
D:\MinGW\bin\g77.exe -shared build\temp.win32-2.4\Release\build\src\lib\fftpack\_fftpackmodule.o build\temp.win32-2.4\Release\lib\fftpack\src\zfft.o build\temp.win32-2.4\Release\lib\fftpack\src\drfft.o build\temp.win32-2.4\Release\lib\fftpack\src\zrfft.o build\temp.win32-2.4\Release\lib\fftpack\src\zfftnd.o build\temp.win32-2.4\Release\build\src\fortranobject.o -LD:/MinGW/bin/../lib/gcc/mingw32/3.4.2 -LD:\python24\libs -LD:\python24\PCBuild -Lbuild\temp.win32-2.4 -ldfftpack -lpython24 -lgcc -lg2c -o build\lib.win32-2.4\scipy\fftpack\_fftpack.pyd
D:/MinGW/bin/../lib/gcc/mingw32/3.4.2/libgcc.a(__main.o)(.text+0x4f): undefined reference to `__EH_FRAME_BEGIN__'
D:/MinGW/bin/../lib/gcc/mingw32/3.4.2/libgcc.a(__main.o)(.text+0x73): undefined reference to `__EH_FRAME_BEGIN__'
collect2: ld returned 1 exit status
error: Command "D:\MinGW\bin\g77.exe -shared build\temp.win32-2.4\Release\build\src\lib\fftpack\_fftpackmodule.o build\temp.win32-2.4\Release\lib\fftpack\src\zfft.o build\temp.win32-2.4\Release\lib\fftpack\src\drfft.o build\temp.win32-2.4\Release\lib\fftpack\src\zrfft.o build\temp.win32-2.4\Release\lib\fftpack\src\zfftnd.o build\temp.win32-2.4\Release\build\src\fortranobject.o -LD:/MinGW/bin/../lib/gcc/mingw32/3.4.2 -LD:\python24\libs -LD:\python24\PCBuild -Lbuild\temp.win32-2.4 -ldfftpack -lpython24 -lgcc -lg2c -o build\lib.win32-2.4\scipy\fftpack\_fftpack.pyd" failed with exit status 1

__________________________________ Do you Yahoo!? Yahoo! Small Business - Try our new resources site! http://smallbusiness.yahoo.com/resources/ From bgoli at sun.ac.za Wed Mar 23 14:26:00 2005 From: bgoli at sun.ac.za (Brett G. Olivier) Date: Wed, 23 Mar 2005 21:26:00 +0200 (SAST) Subject: [SciPy-user] Build Error (Python 2.4, Scipy 0.3.2) on Windows XP In-Reply-To: <20050323103850.50357.qmail@web60001.mail.yahoo.com> References: <20050323103850.50357.qmail@web60001.mail.yahoo.com> Message-ID: <44210.196.2.56.5.1111605960.squirrel@196.2.56.5> Hi, I'm not an expert with compiling stuff but I did also encounter this problem. Basically, it's a MinGW 3.4.2 issue and, according to the MinGW developers, the undefined reference to `__EH_FRAME_BEGIN__' is caused by the -lgcc option: """ If you remove the unnecessary -lgcc from the command line, the undefined reference should not be generated. The -lg2c is also unnecessary since g77 will put "-lfrtbegin -lg2c" in for you. Danny """ Taking out -lgcc does allow a full SciPy compile with GCC 3.4.2, but then I also experienced some very strange behaviour with certain Fortran functions such as scipy.optimize.fsolve, which ran and simply returned zero arrays. I don't know if/how the two problems are related and up until now haven't had time to further characterise this problem. Basically, I went back to MinGW 3.1 with GCC 3.3.3, which works fine with Win2K.
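For reference, with -lgcc (and the redundant -lg2c) dropped as Danny suggests, the failing link line from the report above would become something like this (an untested sketch, object files as in the original log):

D:\MinGW\bin\g77.exe -shared build\temp.win32-2.4\Release\build\src\lib\fftpack\_fftpackmodule.o build\temp.win32-2.4\Release\lib\fftpack\src\zfft.o build\temp.win32-2.4\Release\lib\fftpack\src\drfft.o build\temp.win32-2.4\Release\lib\fftpack\src\zrfft.o build\temp.win32-2.4\Release\lib\fftpack\src\zfftnd.o build\temp.win32-2.4\Release\build\src\fortranobject.o -LD:/MinGW/bin/../lib/gcc/mingw32/3.4.2 -LD:\python24\libs -LD:\python24\PCBuild -Lbuild\temp.win32-2.4 -ldfftpack -lpython24 -o build\lib.win32-2.4\scipy\fftpack\_fftpack.pyd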
If anyone has any suggestions on how to approach this, I'll be more than happy to help debug this problem. Regards Brett > Hello, > I attempted to build SciPy 0.3.2 with Python 2.4 on a Windows XP machine. > I followed http://www.scipy.org/documentation/buildscipywin32.txt. I am > using MinGW-3.2.0-rc-3, Numeric-23.8.win32-py2.4, > atlas3.6.0_WinNT_P4SSE2 and FFTW 2.1.3 > (http://claymore.engineer.gvsu.edu/~steriana/fftw213.libs.zip) > > I am getting the following build error (copied and pasted below). > I would appreciate any help with solving this problem. > > With regards, > Srijit > > running build_ext > customize Mingw32CCompiler > customize Mingw32CCompiler using build_ext > customize GnuFCompiler > customize GnuFCompiler > customize GnuFCompiler using build_ext > building 'scipy.fftpack._fftpack' extension > compiling C sources > gcc options: '-O2 -Wall -Wstrict-prototypes' > compile options: '-Ibuild\src -ID:\python24\include -ID:\python24\PC > -c' > D:\MinGW\bin\g77.exe -shared > build\temp.win32-2.4\Release\build\src\lib\fftpack\ > _fftpackmodule.o build\temp.win32-2.4\Release\lib\fftpack\src\zfft.o > build\temp. > win32-2.4\Release\lib\fftpack\src\drfft.o > build\temp.win32-2.4\Release\lib\fftpa > ck\src\zrfft.o build\temp.win32-2.4\Release\lib\fftpack\src\zfftnd.o > build\temp. > win32-2.4\Release\build\src\fortranobject.o > -LD:/MinGW/bin/../lib/gcc/mingw32/3. > 4.2 -LD:\python24\libs -LD:\python24\PCBuild -Lbuild\temp.win32-2.4 > -ldfftpack - > lpython24 -lgcc -lg2c -o build\lib.win32-2.4\scipy\fftpack\_fftpack.pyd > D:/MinGW/bin/../lib/gcc/mingw32/3.4.2/libgcc.a(__main.o)(.text+0x4f): > undefined > reference to `__EH_FRAME_BEGIN__' > D:/MinGW/bin/../lib/gcc/mingw32/3.4.2/libgcc.a(__main.o)(.text+0x73): > undefined > reference to `__EH_FRAME_BEGIN__' > collect2: ld returned 1 exit status > error: Command "D:\MinGW\bin\g77.exe -shared > build\temp.win32-2.4\Release\build\ > src\lib\fftpack\_fftpackmodule.o > build\temp.win32-2.4\Release\lib\fftpack\src\zf > ft.o build\temp.win32-2.4\Release\lib\fftpack\src\drfft.o > build\temp.win32-2.4\R > elease\lib\fftpack\src\zrfft.o > build\temp.win32-2.4\Release\lib\fftpack\src\zfft > nd.o build\temp.win32-2.4\Release\build\src\fortranobject.o > -LD:/MinGW/bin/../li > b/gcc/mingw32/3.4.2 -LD:\python24\libs -LD:\python24\PCBuild > -Lbuild\temp.win32- > 2.4 -ldfftpack -lpython24 -lgcc -lg2c -o > build\lib.win32-2.4\scipy\fftpack\_fftp > ack.pyd" failed with exit status 1 > > > > > > > > __________________________________ > Do you Yahoo!? > Yahoo! Small Business - Try our new resources site! > http://smallbusiness.yahoo.com/resources/ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > -- Brett G. Olivier Ph.D. National Bioinformatics Network - Stellenbosch Node Triple-J Group for Molecular Cell Physiology, Stellenbosch University bgoli at sun dot ac dot za http://glue.jjj.sun.ac.za/~bgoli Tel +27-21-8082697 Fax +27-21-8085863 Mobile +27-82-7329306 From H.FANGOHR at soton.ac.uk Wed Mar 23 18:28:53 2005 From: H.FANGOHR at soton.ac.uk (Hans Fangohr) Date: Wed, 23 Mar 2005 23:28:53 +0000 (GMT) Subject: [SciPy-user] stopping criterion for integrate.odeint? Message-ID: Dear all, we have been using odeint in a number of situations and it works great. Now we'd like to interrupt the integration of the set of ODEs once a condition is fulfilled that depends on the solution that is being computed. Here is an example.
Suppose we want to solve the ODEs associated with an object being thrown straight into the sky (i.e. against the gravitational force). Assume we can describe this by r''(t) = -g where r'' is the second derivative of the (scalar) position with respect to time. We convert this into two 1st order ODEs v' = -g and r' = v and use odeint to solve these. (So far, so good; please find attached below the complete program demonstrating this.) Here is the complication: Assume we want to interrupt the integration if r<=0, e.g. when the object hits the ground. Is there an elegant way of doing this? The source code below should demonstrate what we try to do. A few more comments: (i) the actual example I am trying to solve is more complicated (yes, I can compute the time when the object hits the ground analytically in this case ;-) but that's not the point here) (ii) options I considered include: (a) integrating the set of ODEs in small steps dt by calling odeint repeatedly. Feasible but not beautiful, as the final solution will depend on the size of this dt. (b) raising an exception in the rhs function (doesn't work) (c) setting all derivatives to zero in the rhs function (seems to do what I want since r and v don't change anymore, but is wasting CPU cycles because the integration doesn't stop). Okay, I hope I have outlined my problem. Does anyone have any advice? Many thanks, Hans ------------------------------------------------------------------------------------------------------------

import Numeric as N
from scipy.integrate import odeint

def rhs(y,t):
    """expect y = N.array([r,v])

    Right-Hand-Side function to solve 2nd order ODE r''= -g
    (with g being a constant).

    returns dr/dt = v and dv/dt = a = F/m = -g
    """
    r,v = y
    drdt = v
    dvdt = -9.81 #m/s^2

    # this is what I don't know how to do elegantly
    if r<=0:
        print "We'd like to stop integrating now"

    return N.array([drdt,dvdt])

y = odeint(rhs, N.array([0,10]), N.arrayrange(0,3,0.1))

try:
    import pylab
    pylab.plot(y[:,0])
    pylab.show()
except ImportError:
    pass #can't import matplotlib

From rng7 at cornell.edu Wed Mar 23 19:22:09 2005 From: rng7 at cornell.edu (Ryan Gutenkunst) Date: Wed, 23 Mar 2005 19:22:09 -0500 Subject: [SciPy-user] stopping criterion for integrate.odeint? In-Reply-To: References: Message-ID: <42420831.7070605@cornell.edu> Hans Fangohr wrote: > Dear all, > > we have been using odeint in a number of situations and it works > great. Now we'd like to interrupt the integration of the set of ODEs > once a condition is fulfilled that depends on the solution that is > being computed. > > Here is the complication: Assume we want to interrupt the integration > if r<=0, e.g. when the object hits the ground. Is there an elegant way > of doing this? > > > Many thanks, > > Hans > Hi Hans, In our work with biochemical networks we've encountered a very similar requirement. (We, however, want to stop the integration, change some parameters, then start again from where we left off.) Our solution isn't particularly pretty, but it does give the time and variable values when the condition is fulfilled exactly. Consider the integration of dy/dt where y can be a vector. We want to know exactly when the condition c(y) = 0 is satisfied. We integrate dy/dt from 0 to some time T, requesting a reasonably fine grid of times. Then we go back and calculate c(y, t) for each time point in our trajectory. If two adjacent time points (t1 and t2) have different signs for c(y, t), we know our condition was fulfilled sometime in between those times.
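In rough code, that scan might look something like this (an untested sketch; bracket_crossings is a made-up name, and condition is whatever c(y, t) you care about):

from scipy.integrate import odeint

def bracket_crossings(rhs, y0, times, condition):
    # one integration pass over the full, reasonably fine time grid
    traj = odeint(rhs, y0, times)
    # evaluate the condition at every output point
    cvals = [condition(traj[i], times[i]) for i in range(len(times))]
    # a sign change between adjacent points brackets a zero of c(y, t)
    brackets = []
    for i in range(len(times) - 1):
        if cvals[i] * cvals[i+1] < 0:
            brackets.append((times[i], times[i+1]))
    return brackets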
Now we can integrate backward from t2 to find the exact time the condition was fulfilled. To do so, we make a change of variables from t to c. Namely, we integrate the equations defined by: [dy/dc = dy/dt / dc/dt, dt/dc = 1 / dc/dt] from c = c(t2) to c = 0. (Note that this is one more equation that we were integrating before.) The initial conditions are [y(t2), 0]. This integration terminates with the values [y(tc), t2 - tc] where tc is the time when c crossed zero. We can then insert these values into our trajectory, and terminate the integration process if we want. So there are some obvious drawbacks to this: 1) The condition needs to be expressed in a continuous way, and we need to know dc/dt analytically. This usually isn't so bad since it's probably simply related to dy/dt. (For example, we could write your r <= 0 as c = r - 0 then dc/dt = dr/dt which we already know how to calculate.) 2) We have to integrate all the way out to T first, before we can check whether a condition fired. So if N conditions fire, we're doing about N times as much integration work. 3) If the sign of c(y) changes but then changes back before the next time point we sampled in the initial integration, we'll miss that firing of the condition. Nevertheless, it's pretty easy to write in pure Python, and it works without access to the integrator's internal approximation of the function. Doing this exactly and efficiently would require low-level access to the guts of the integrator. There do exist integrators that can handle this sort of condition-checking internally, but as far as I can tell the one odeint wraps isn't among them. For now, it's not worth it to me to wrap up another package, especially since that would be yet another thing for our users to have to install. If there's interest in moving scipy over to a more sophisticated integrator, I'd be happy to help, but I'm still somewhat of a newbie and have no clue wrt FORTRAN, so I couldn't do it alone. I also expect that it might break a lot of existing code if we aren't careful with outputs. Anyways, I hope my explanation of our solution is reasonably clear. If folks are interested, I'll clean-up, comment, and post our code. And, of course, if anyone has a better solution, I'm dying to hear it. :-) Cheers, Ryan -- Ryan Gutenkunst | Cornell Dept. of Physics | "It is not the mountain | we conquer but ourselves." Clark 535 / (607)255-6068 | -- Sir Edmund Hillary AIM: JepettoRNG | http://www.physics.cornell.edu/~rgutenkunst/ From Fernando.Perez at colorado.edu Wed Mar 23 19:34:53 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 23 Mar 2005 17:34:53 -0700 Subject: [SciPy-user] stopping criterion for integrate.odeint? In-Reply-To: <42420831.7070605@cornell.edu> References: <42420831.7070605@cornell.edu> Message-ID: <42420B2D.6000701@colorado.edu> Ryan Gutenkunst wrote: > Consider the integration of dy/dt where y can be a vector. We want to > know exactly when the condition c(y) = 0 is satisfied. We integrate > dy/dt from 0 to some time T, requesting a reasonably fine grid of times. > Then we go back and calculate c(y, t) for each time point in our > trajectory. If two adjacent time points (t1 and t2) have different signs > for c(y, t), we know our condition was fulfilled sometime in between > those times. > > Now we can integrate backward from t2 to find the exact time the > condition was fulfilled. To do so, we make a change of variables from t > to c. 
Namely, we integrate the equations defined by: > [dy/dc = dy/dt / dc/dt, dt/dc = 1 / dc/dt] > from c = c(t2) to c = 0. (Note that this is one more equation that we > were integrating before.) The initial conditions are [y(t2), 0]. This > integration terminates with the values [y(tc), t2 - tc] where tc is the > time when c crossed zero. We can then insert these values into our > trajectory, and terminate the integration process if we want. In case anyone wants a full reference: - M. Henon, Physica 5D, 412 (1982). This is a standard trick in the construction of Poincare maps in dynamical systems. The above paper by Henon is the place where I first read about it, though it is something simple enough that it has probably been 'reinvented' numerous times and in many different contexts. Best, f From rkern at ucsd.edu Wed Mar 23 19:38:30 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 23 Mar 2005 16:38:30 -0800 Subject: [SciPy-user] stopping criterion for integrate.odeint? In-Reply-To: <42420831.7070605@cornell.edu> References: <42420831.7070605@cornell.edu> Message-ID: <42420C06.1040300@ucsd.edu> Ryan Gutenkunst wrote: > For now, it's not worth it to me to wrap up another package, especially > since that would be yet another thing for our users to have to install. > If there's interest in moving scipy over to a more sophisticated > integrator, I'd be happy to help, but I'm still somewhat of a newbie and > have no clue wrt FORTRAN, so I couldn't do it alone. I also expect that > it might break a lot of existing code if we aren't careful with outputs. We probably wouldn't replace the current functions for the reason you mention. However, we have no problem with adding quality algorithms (size, license, and distributability allowing). Did you have a particular package in mind? > Anyways, I hope my explanation of our solution is reasonably clear. If > folks are interested, I'll clean-up, comment, and post our code. Please do. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rng7 at cornell.edu Wed Mar 23 21:56:38 2005 From: rng7 at cornell.edu (Ryan Gutenkunst) Date: Wed, 23 Mar 2005 21:56:38 -0500 Subject: [SciPy-user] stopping criterion for integrate.odeint? In-Reply-To: <42420C06.1040300@ucsd.edu> References: <42420831.7070605@cornell.edu> <42420C06.1040300@ucsd.edu> Message-ID: On Mar 23, 2005, at 7:38 PM, Robert Kern wrote: > Ryan Gutenkunst wrote: > >> For now, it's not worth it to me to wrap up another package, >> especially since that would be yet another thing for our users to >> have to install. If there's interest in moving scipy over to a more >> sophisticated integrator, I'd be happy to help, but I'm still >> somewhat of a newbie and have no clue wrt FORTRAN, so I couldn't do >> it alone. I also expect that it might break a lot of existing code if >> we aren't careful with outputs. > > We probably wouldn't replace the current functions for the reason you > mention. However, we have no problem with adding quality algorithms > (size, license, and distributability allowing). Did you have a > particular package in mind? Well the only example I've actually used is in Matlab. We'd really love to be able to use SUNDIALS (http://www.llnl.gov/CASC/sundials/). The sensitivity capabilities and ability to handle DAE systems are very attractive. (We've implemented some of those capabilities using odeint, but our implementations are pretty problem-specific and not that efficient.) 
The code is BSD, but the suite itself is a 7 meg download (!). In a more realistic vein, LSODAR (http://www.netlib.org/odepack/opkd-sum) would be a nice addition. It would add root-finding capabilities that would be useful to Hans and myself. Since odeint already uses LSODA, I imagine wrapping LSODAR wouldn't be that hard. >> Anyways, I hope my explanation of our solution is reasonably clear. >> If folks are interested, I'll clean-up, comment, and post our code. > > Please do. I'm attaching the code. It's rough, and I'd love to hear improvements from the many more experienced coders on this list. Ryan -------------- next part -------------- A non-text attachment was scrubbed... Name: Integration.py Type: application/octet-stream Size: 4902 bytes Desc: not available URL: -------------- next part -------------- > > -- > Robert Kern > rkern at ucsd.edu > -- Ryan Gutenkunst | Cornell Dept. of Physics | "It is not the mountain | we conquer but ourselves." Clark 535 / (607)255-6068 | -- Sir Edmund Hillary AIM: JepettoRNG | http://www.physics.cornell.edu/~rgutenkunst/ From rkern at ucsd.edu Wed Mar 23 23:19:59 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 23 Mar 2005 20:19:59 -0800 Subject: [SciPy-user] stopping criterion for integrate.odeint? In-Reply-To: References: <42420831.7070605@cornell.edu> <42420C06.1040300@ucsd.edu> Message-ID: <42423FEF.5000407@ucsd.edu> Ryan Gutenkunst wrote: > We'd really love to be able to use SUNDIALS > (http://www.llnl.gov/CASC/sundials/). The sensitivity capabilities and > ability to handle DAE systems are very attractive. (We've implemented > some of those capabilities using odeint, but our implementations are > pretty problem-specific and not that efficient.) The code is BSD, but > the suite itself is a 7 meg download (!). It's actually not so bad. Most of that is redundant PDF and PS documentation. Strip those out, the gzipped tarball is only 716k and the uncompressed directory is 4M (compared to, say 3.4M for xplt) and 1M for the installed static libraries on OS X (15M for a complete ATLAS). It's really just a matter of someone wanting it enough to put the effort in. And it's going to be a fairly large job to do it thoroughly. > In a more realistic vein, LSODAR > (http://www.netlib.org/odepack/opkd-sum) would be a nice addition. It > would add root-finding capabilities that would be useful to Hans and > myself. Since odeint already uses LSODA, I imagine wrapping LSODAR > wouldn't be that hard. Probably easier than wrapping LSODA was. Poor Travis had to do it without f2py to help him! If I used ODEs more (e.g. at all), I might do it myself. Alas, MacEnthon sucks away my Python-hours. (A testing release is coming! Soon! As soon as I write some readme-type docs and I get access to Enthought's download server.) -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From novak at ucolick.org Wed Mar 23 23:44:12 2005 From: novak at ucolick.org (Greg Novak) Date: Wed, 23 Mar 2005 20:44:12 -0800 Subject: [SciPy-user] Scipy Linear Algebra functions Message-ID: <20050324044412.GA10517@dionysus.ucolick.org> I get the following traceback when using Scipy 0.3.2 on Mac OSX 10.3. The Fink package I'm using is numbered: 1:0.3.2-2. Any suggestions are appreciated. Thanks! 
Greg

In [25]: linalg.inv([[1,0],[0,1]])
---------------------------------------------------------------------------
PPImportError                             Traceback (most recent call last)

/Users/novak/Projects/Thesis/

/sw/lib/python2.3/site-packages/scipy_base/ppimport.py in __getattr__(self, name)
    301             module = self.__dict__['_ppimport_module']
    302         except KeyError:
--> 303             module = self._ppimport_importer()
    304         return getattr(module, name)
    305

/sw/lib/python2.3/site-packages/scipy_base/ppimport.py in _ppimport_importer(self)
    260         exc_info = self.__dict__.get('_ppimport_exc_info')
    261         if exc_info is not None:
--> 262             raise PPImportError,\
    263                   ''.join(traceback.format_exception(*exc_info))
    264         else:

PPImportError: Traceback (most recent call last):
  File "/sw/lib/python2.3/site-packages/scipy_base/ppimport.py", line 273, in _ppimport_importer
    module = __import__(name,None,None,['*'])
  File "/sw/lib/python2.3/site-packages/scipy/linalg/__init__.py", line 8, in ?
    from basic import *
  File "/sw/lib/python2.3/site-packages/scipy/linalg/basic.py", line 13, in ?
    from flinalg import get_flinalg_funcs
  File "/sw/lib/python2.3/site-packages/scipy/linalg/flinalg.py", line 7, in ?
    from scipy_distutils.misc_util import PostponedException
ImportError: No module named scipy_distutils.misc_util

From srijit at yahoo.com Thu Mar 24 04:48:41 2005 From: srijit at yahoo.com (Srijit Kumar Bhadra) Date: Thu, 24 Mar 2005 01:48:41 -0800 (PST) Subject: [SciPy-user] More Information - Build Error (Python 2.4, Scipy 0.3.2) on Windows XP Message-ID: <20050324094841.60731.qmail@web60006.mail.yahoo.com> Hello, Thanks to Brett for his response. I am now using MinGW-3.1.0-1.exe (gcc version 3.2.3 and not 3.3.3 as mentioned by Brett). I am still not sure whether the build is successful. Python crashed when I ran t=scipy.test() after installation of SciPy 0.3.2. So, I am copying and pasting, below, the last portion of the build output (command: python setup.py build --compiler=mingw32) in the DOS window. After executing the install command (python setup.py build --compiler=mingw32 install), I got errors which I am also copying and pasting below. Best Regards, Srijit Outputs of the build command and install command are separately pasted below.
1) Command: python setup.py build --compiler=mingw32 building library "rootfind" sources building library "c_misc" sources building library "cephes" sources building library "dfftpack" sources building library "fitpack" sources building library "mach" sources building library "amos" sources building library "toms" sources building library "cdf" sources building library "specfun" sources building library "statlib" sources building library "quadpack_src" sources building library "linpack_lite_src" sources building library "mach_src" sources building library "odepack_src" sources building library "minpack" sources building library "superlu_src" sources running build_py copying build\src\scipy\scipy\config.py -> build\lib.win32-2.4\scipy package init file 'Lib\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\cluster\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\fftpack\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\fftpack\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\gui_thread\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\interpolate\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\io\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\linalg\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\optimize\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\signal\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\sparse\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\special\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\stats\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\xxx\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\xxx\yyy\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ scipy_core\scipy_base\tests\__init__.py' not found (or not a regular file) package init file 'scipy_core\weave\tests\__init__.py' not found (or not a regul ar file) package init file 'scipy_core\weave\tests\__init__.py' not found (or not a regul ar file) package init file 'Lib\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\cluster\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\fftpack\tests\__init__.py' not found (or not a 
regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\fftpack\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\gui_thread\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\interpolate\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\io\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\linalg\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\optimize\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\signal\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\sparse\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\special\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\stats\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\xxx\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\xxx\yyy\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ scipy_core\scipy_base\tests\__init__.py' not found (or not a regular file) package init file 'scipy_core\weave\tests\__init__.py' not found (or not a regul ar file) package init file 'scipy_core\weave\tests\__init__.py' not found (or not a regul ar file) running build_clib customize Mingw32CCompiler customize Mingw32CCompiler using build_clib customize GnuFCompiler customize GnuFCompiler customize GnuFCompiler using build_clib running build_ext customize Mingw32CCompiler customize Mingw32CCompiler using build_ext customize GnuFCompiler customize GnuFCompiler customize GnuFCompiler using build_ext 2) Command: python setup.py build --compiler=mingw32 install building library "mach_src" sources building library "odepack_src" sources building library "minpack" sources building library "superlu_src" sources running build_py copying build\src\scipy\scipy\config.py -> build\lib.win32-2.4\scipy package init file 'Lib\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\cluster\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\fftpack\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\fftpack\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\gui_thread\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\interpolate\tests\__init__.py' not found (or not a regular file) 
package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\io\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\linalg\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\optimize\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\signal\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\sparse\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\special\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\stats\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\xxx\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\xxx\yyy\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ scipy_core\scipy_base\tests\__init__.py' not found (or not a regular file) package init file 'scipy_core\weave\tests\__init__.py' not found (or not a regul ar file) package init file 'scipy_core\weave\tests\__init__.py' not found (or not a regul ar file) package init file 'Lib\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\cluster\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\fftpack\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\fftpack\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\gui_thread\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\interpolate\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\io\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\linalg\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\optimize\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\signal\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\sparse\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\special\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\stats\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ 
Lib\xxx\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ Lib\xxx\yyy\tests\__init__.py' not found (or not a regular file) package init file 'D:\srijit_data\Bhadra\Software\Python24\SciPy_complete-0.3.2\ scipy_core\scipy_base\tests\__init__.py' not found (or not a regular file) package init file 'scipy_core\weave\tests\__init__.py' not found (or not a regul ar file) package init file 'scipy_core\weave\tests\__init__.py' not found (or not a regul ar file) running build_clib customize Mingw32CCompiler customize Mingw32CCompiler using build_clib customize GnuFCompiler customize GnuFCompiler customize GnuFCompiler using build_clib running build_ext customize Mingw32CCompiler customize Mingw32CCompiler using build_ext customize GnuFCompiler customize GnuFCompiler customize GnuFCompiler using build_ext running install running install_lib copying build\lib.win32-2.4\scipy\config.py -> D:\python24\Lib\site-packages\sci py byte-compiling D:\python24\Lib\site-packages\scipy\config.py to config.pyc running install_data error: can't copy 'Lib\gplt\wgnuplot.exe': doesn't exist or not a regular file __________________________________ Do you Yahoo!? Make Yahoo! your home page http://www.yahoo.com/r/hs From bgoli at sun.ac.za Thu Mar 24 05:26:16 2005 From: bgoli at sun.ac.za (Brett G. Olivier) Date: Thu, 24 Mar 2005 12:26:16 +0200 (SAST) Subject: [SciPy-user] More Information - Build Error (Python 2.4, Scipy 0.3.2) on Windows XP In-Reply-To: <20050324094841.60731.qmail@web60006.mail.yahoo.com> References: <20050324094841.60731.qmail@web60006.mail.yahoo.com> Message-ID: <30075.165.146.209.52.1111659976.squirrel@165.146.209.52> Hi, I'm not sure about the scipy.test problem, but part of the install problem is that the scipy-0.3.2 source archive doesn't include wgnuplot.exe. If you have an older version of scipy which includes wgnuplot.exe and gnuplot_helper.exe, try copying them into the source directory Lib\gplt and then do a setup.py install. I think they are also available at http://www.scipy.org/cvs/viewcvs/map?rmurl=http%3A//scipy.net/cgi-bin/viewcvsx.cgi/scipy/Lib/gplt/) Hope this helps Brett > After executing the install command (python setup.py build > --compiler=mingw32 install), I got errors which I am also copying and > pasting below. > byte-compiling D:\python24\Lib\site-packages\scipy\config.py to > config.pyc > running install_data > error: can't copy 'Lib\gplt\wgnuplot.exe': doesn't exist or not a > regular file -- Brett G. Olivier Ph.D. National Bioinformatics Network - Stellenbosch Node Triple-J Group for Molecular Cell Physiology, Stellenbosch University bgoli at sun dot ac dot za http://glue.jjj.sun.ac.za/~bgoli Tel +27-21-8082697 Fax +27-21-8085863 Mobile +27-82-7329306 From p.schnizer at gsi.de Thu Mar 24 09:19:54 2005 From: p.schnizer at gsi.de (Pierre SCHNIZER) Date: Thu, 24 Mar 2005 15:19:54 +0100 Subject: [SciPy-user] stopping criterion for integrate.odeint? In-Reply-To: <42423FEF.5000407@ucsd.edu> References: <42420831.7070605@cornell.edu> <42420C06.1040300@ucsd.edu> <42423FEF.5000407@ucsd.edu> Message-ID: <4242CC8A.6030406@gsi.de> Robert Kern wrote: > Ryan Gutenkunst wrote: > >> We'd really love to be able to use SUNDIALS >> (http://www.llnl.gov/CASC/sundials/). > > > It's really just a matter of someone wanting it enough to put the effort > in. And it's going to be a fairly large job to do it thoroughly. > Yes, that's quite some work (considering what was necessary for the pygsl solvers.)
SWIG can help a lot to wrap such stuff, but there is no help for the automatic generation of callbacks :-( (at least I did not hear of that). >> In a more realistic vein, LSODAR >> (http://www.netlib.org/odepack/opkd-sum) would be a nice addition. It >> would add root-finding capabilities that would be useful to Hans and >> myself. Since odeint already uses LSODA, I imagine wrapping LSODAR >> wouldn't be that hard. > You can find a start for this wrapper in my f2py tutorial at http://www.itp.tu-graz.ac.at/~pierre/f2py_tutorial.tar.gz It's been quite a while since I wrote it, and I did not test it a lot, but I still hope that not too many bugs will wait for you. The heart is lsodar and that beats reliably :-). Pierre From yichunwe at usc.edu Fri Mar 25 03:21:00 2005 From: yichunwe at usc.edu (Yichun Wei) Date: Fri, 25 Mar 2005 00:21:00 -0800 Subject: [SciPy-user] typo in document of fmin_powell Message-ID: <4243C9EC.6030308@usc.edu> Hi Scipy List, I think I found a typo in the documentation for optimize.fmin_powell: Outputs: (xopt, {fopt, xi, direc, iter, funcalls, warnflag}, {allvecs}) ~~ <- is it a typo? regards, yichun From mtreiber at gmail.com Sun Mar 27 13:44:13 2005 From: mtreiber at gmail.com (Mark Treiber) Date: Sun, 27 Mar 2005 13:44:13 -0500 Subject: [SciPy-user] Scipy Linear Algebra functions In-Reply-To: <20050324044412.GA10517@dionysus.ucolick.org> References: <20050324044412.GA10517@dionysus.ucolick.org> Message-ID: <27e04e9105032710442f072012@mail.gmail.com> Hi Greg. I had the same problem a couple of weeks ago but solved it pretty easily. Looking through the fink package, it appears that the scipy setup.py script has been modified so that scipy_distutils is supposed to be installed with f2py instead of scipy. A separate scipy_distutils tgz should have been downloaded when f2py was installed. Simply unpack it and run the setup script accordingly and it should install and work. The command I used was something like "/sw/bin/python2.3 setup.py install". Mark. On Wed, 23 Mar 2005 20:44:12 -0800, Greg Novak wrote: > I get the following traceback when using Scipy 0.3.2 on Mac OSX 10.3. > The Fink package I'm using is numbered: 1:0.3.2-2. Any suggestions > are appreciated. > > Thanks! > Greg > > In [25]: linalg.inv([[1,0],[0,1]]) > --------------------------------------------------------------------------- > PPImportError Traceback (most recent call > last) > > /Users/novak/Projects/Thesis/ > > /sw/lib/python2.3/site-packages/scipy_base/ppimport.py in > __getattr__(self, name) > 301 module = self.__dict__['_ppimport_module'] > 302 except KeyError: > --> 303 module = self._ppimport_importer() > 304 return getattr(module, name) > 305 > > /sw/lib/python2.3/site-packages/scipy_base/ppimport.py in > _ppimport_importer(self) > 260 exc_info = self.__dict__.get('_ppimport_exc_info') > 261 if exc_info is not None: > --> 262 raise PPImportError,\ > 263 > ''.join(traceback.format_exception(*exc_info)) > 264 else: > > PPImportError: Traceback (most recent call last): > File "/sw/lib/python2.3/site-packages/scipy_base/ppimport.py", line > 273, in _ppimport_importer > module = __import__(name,None,None,['*']) > File "/sw/lib/python2.3/site-packages/scipy/linalg/__init__.py", > line 8, in ? > from basic import * > File "/sw/lib/python2.3/site-packages/scipy/linalg/basic.py", line > 13, in ? > from flinalg import get_flinalg_funcs > File "/sw/lib/python2.3/site-packages/scipy/linalg/flinalg.py", line > 7, in ?
> from scipy_distutils.misc_util import PostponedException > ImportError: No module named scipy_distutils.misc_util > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From novak at ucolick.org Wed Mar 30 00:04:35 2005 From: novak at ucolick.org (Greg Novak) Date: Tue, 29 Mar 2005 21:04:35 -0800 Subject: [SciPy-user] C/Fortran Integrate/minimization? Message-ID: <20050330050435.GC1924@dionysus.ucolick.org> Some time ago I inquired about the ability to integrate a system of differential equations by specifying a C function instead of a python function to compute derivatives. This speeds up the calculation enormously. There seemed to be interest in such a feature on the mailing list. I posted code to do this here: http://www.ucolick.org/~novak/integrate.tgz. Are there any plans to include such a feature in scipy? The issue comes up for me again because I'd like to do the same thing for function minimization. I have to do many fits as part of a computation and I suspect that the Python function call overhead is dominating the run time. Thanks, Greg From curtis at hindmost.LPL.Arizona.EDU Wed Mar 30 01:03:32 2005 From: curtis at hindmost.LPL.Arizona.EDU (Curtis Cooper) Date: Tue, 29 Mar 2005 23:03:32 -0700 (MST) Subject: [SciPy-user] weave.inline keeps recompiling my C++ code (fwd) Message-ID: Hi, I'm using weave.inline to compare the performance of weave with several other methods of doing the same calculation. It is simply an element-wise multiplication: Z = sin(X)*cos(Y). I have attached my program for you to peruse. I am in general very impressed with the usability and performance of weave. I have two questions, however: 1) Why does it tell me 'repairing catalog by removing key' every single time I run this program using 'python tryweave.py'? I thought the idea was for scipy-weave to only have to recompile the C++ portions if the C++ source code changes. 2) How can I use the sin and cos functions with weave.blitz? Up until now, I have had to comment out the weave_blitz version because I can't figure out how to add cmath support to my blitz++ expression. FYI, my system is Debian Linux (Sarge/Testing branch). The Python prompt says: Python 2.3.5 (#2, Feb 9 2005, 00:38:15) [GCC 3.3.5 (Debian 1:3.3.5-8)] on linux2 Type "help", "copyright", "credits" or "license" for more information.
>>>

Cheers,
Curtis

-------------- next part --------------

#!/usr/bin/env python
import weave
import time

N_ELEM = 720
M_ELEM = 720
N_ITER = 1

#----------
def main():
    from pylab import linspace, meshgrid, pi
    from pylab import figure, imshow, hot, colorbar, show
    from numarray import ravel

    x = linspace(-pi, pi, N_ELEM)
    y = linspace(-pi, pi, M_ELEM)
    [XI, YI] = meshgrid(x, y)

    print
    print "Array Shapes: ", (N_ELEM, M_ELEM), ", # Iterations: ", N_ITER
    print "-----------------------------------------------"
    print

    start = time.clock()
    for i in xrange(N_ITER):
        Z_numarray = numarray_looping(XI, YI)
    stop = time.clock()
    dt = stop - start
    print "numarray_looping: ", dt, " [s]"

    start = time.clock()
    for i in xrange(N_ITER):
        Z = naive_looping(XI, YI)
    stop = time.clock()
    dt = stop - start
    print "naive_looping: ", dt, " [s]"

    # For rest, convert to Numeric
    from Numeric import reshape, resize, size, empty, Float, asarray, array, arange, pi, exp, sin, tan, cos
    start = time.clock()
    X = empty(XI.shape, typecode = 'd')
    Y = empty(XI.shape, typecode = 'd')
    #X[:] = ravel(XI)
    #Y[:] = ravel(YI)
    #reshape(X, XI.shape)
    #reshape(Y, XI.shape)
    #print type(X), X.shape
    #X = asarray(XI, savespace = 1)
    #Y = asarray(YI, savespace = 1)
    X[:] = XI; Y[:] = YI
    stop = time.clock()
    dt = stop - start
    print "Convert to Numeric: ", dt, " [s]"

    start = time.clock()
    for i in xrange(N_ITER):
        Z = Numeric_looping(X, Y)
    stop = time.clock()
    dt = stop - start
    print "Numeric_looping: ", dt, " [s]"

    start = time.clock()
    for i in xrange(N_ITER):
        Z = weave_inline_blitz(X, Y)
    stop = time.clock()
    dt = stop - start
    print "weave_inline_blitz: ", dt, " [s]"

    start = time.clock()
    for i in xrange(N_ITER):
        Z_weave_fast = weave_fast_looping(X, Y)
    stop = time.clock()
    dt = stop - start
    print "weave_fast_looping: ", dt, " [s]"

    from numarray import array
    start = time.clock()
    Z_out = array(Z_weave_fast)
    stop = time.clock()
    dt = stop - start
    print "Convert back: ", dt, " [s]"

    #start = time.clock()
    #for i in xrange(N_ITER):
    #    Z_blitz = weave_blitz(X, Y)
    #stop = time.clock()
    #dt = stop - start
    #print "weave_blitz: ", dt, " [s]"

    start = time.clock()
    for i in xrange(N_ITER):
        Z_psyco = psyco_looping(X, Y)
    stop = time.clock()
    dt = stop - start
    print "psyco_looping: ", dt, " [s]"

    #figure()
    #imshow(Z_numarray, extent = [-pi, pi, -pi, pi])
    #hot()
    #colorbar()
    #figure()
    #imshow(Z_out, extent = [-pi, pi, -pi, pi])
    #hot()
    #colorbar()
    #show()

def Numeric_looping(X, Y):
    """ Using Numeric's built-in looping abilities """
    #print "Numeric looping: "
    from Numeric import sin, cos, tan
    Z = sin(X)*cos(Y)
    return Z

def numarray_looping(X, Y):
    """ Using numarray's built-in looping abilities """
    #print "numarray looping: "
    from numarray import sin, cos, tan
    Z = sin(X)*cos(Y)
    return Z

def naive_looping(X, Y):
    """ Naive Python loop computation """
    #print "Naive looping: "
    from Numeric import arange
    from math import sin, cos, tan
    Z = X.copy()
    for i in xrange(X.shape[0]):
        for j in xrange(X.shape[1]):
            Z[i,j] = sin(X[i,j]) * cos(Y[i,j])
    return Z

def psyco_looping(X, Y):
    """ Psyco naive Python loop computation """
    #print "Psyco looping: "
    from Numeric import arange
    from math import sin, cos, tan
    import psyco
    psyco.bind(psyco_looping)
    psyco.full()
    Z = X.copy()
    for i in xrange(X.shape[0]):
        for j in xrange(X.shape[1]):
            Z[i,j] = sin(X[i,j]) * cos(Y[i,j])
    return Z

def weave_inline_blitz(X, Y):
    """ Uses weave.inline """
    #print "Weave inline:"
    from Numeric import array, Float64, zeros
    from weave import converters
    n, m = X.shape
    Z = zeros((n, m), 'd')
    cpp_code = (
"""
int i = 0, j = 0;
for (i = 0; i < n; ++i)
    for (j = 0; j < m; ++j)
        Z(i, j) = sin(X(i, j)) * cos(Y(i, j));
""")
    #print "Weave starting"
    weave.inline(cpp_code, ['X', 'Y', 'Z', 'n', 'm'],
                 type_converters = converters.blitz)
    #print "Weave returning"
    return Z

def weave_fast_looping(X, Y):
    """ Uses weave.inline """
    #print "Weave inline:"
    from Numeric import array, Float64, zeros
    from weave import converters
    n, m = X.shape
    Z = zeros((n, m), 'd')

    # Weave module
    support_code = (
"""
#include <cmath>

template <class T>
class Array3D
{
public:
    /** Take in buffer of 3D array and store dimensions */
    Array3D(T* buffer, size_t n_rows, size_t n_cols, size_t n_depth = 1)
        : rows(n_rows), cols(n_cols), depth(n_depth), M(buffer) {}

    const T& operator()(size_t i, size_t j, size_t k = 0) const
    { return M[(i*cols+j)*depth+k]; }
    T& operator()(size_t i, size_t j, size_t k = 0)
    { return M[(i*cols+j)*depth+k]; }

    T& operator[](size_t index) {return M[index];}
    const T& operator[](size_t index) const {return M[index];}

private:
    size_t rows, cols, depth;
    T* M;
};
""")
    cpp_code = (
"""
using namespace std;
int rows = NX[0];
int cols = NX[1];
int depth = 1;
int k = 0;
Array3D<double> XH(X, rows, cols);
Array3D<double> YH(Y, rows, cols);
Array3D<double> ZH(Z, rows, cols);
for (int i = 0; i < rows; ++i)
    for (int j = 0; j < cols; ++j)
        ZH(i, j) = sin(XH(i, j)) * cos(YH(i, j));
// As fast as it gets; requires type_converters = converters.blitz
//Z = sin(X) * cos(Y);
""")
    weave.inline(cpp_code, ['X', 'Y', 'Z'], support_code = support_code)
    return Z

def weave_blitz(X, Y):
    """ Uses weave.blitz """
    #print "Weave inline:"
    from Numeric import array, Float64, zeros
    from weave import converters
    from math import sin, cos
    n, m = X.shape
    Z = zeros((n, m), 'd')
    expr = "Z = sin(X)*cos(Y)"
    weave.blitz(expr)
    return Z

if __name__ == '__main__':
    main()

From pearu at scipy.org Wed Mar 30 01:37:40 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 30 Mar 2005 00:37:40 -0600 (CST) Subject: [SciPy-user] C/Fortran Integrate/minimization? In-Reply-To: <20050330050435.GC1924@dionysus.ucolick.org> References: <20050330050435.GC1924@dionysus.ucolick.org> Message-ID: On Tue, 29 Mar 2005, Greg Novak wrote: > Some time ago I inquired about the ability to integrate a system of > differential equations by specifying a C function instead of a python > function to compute derivatives. This speeds up the calculation > enormously. There seemed to be interest in such a feature on the > mailing list. I posted code to do this here: > http://www.ucolick.org/~novak/integrate.tgz. > > Are there any plans to include such a feature in scipy? Yes, in fact, it's partly already there. f2py has a _cpointer feature that allows one to pass C/Fortran function pointers directly to computational routines taking callback arguments, giving C/Fortran speed also for callback arguments. In scipy this feature is available for codes that are wrapped with f2py 2.45.241_1926 or up. Some codes in scipy are still handwrapped, but in future they will be replaced with f2py generated wrappers. Pearu From Fernando.Perez at colorado.edu Wed Mar 30 01:59:08 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 29 Mar 2005 23:59:08 -0700 Subject: [SciPy-user] weave.inline keeps recompiling my C++ code (fwd) In-Reply-To: References: Message-ID: <424A4E3C.8030403@colorado.edu> Curtis Cooper wrote: > Hi, > > I'm using weave.inline to compare the performance of weave with several > other methods of doing the same calculation. It is simply an element-wise > multiplication: Z = sin(X)*cos(Y). I have attached my program for you to > peruse.
I am in general very impressed with the usability and > performance of weave. I have two questions, however: > > 1) Why does it tell me 'repairing catalog by removing key' every single > time I run this program using 'python tryweave.py'? I thought the idea > was for scipy-weave to only have to recompile the C++ portions if the C++ > source code changes. I think this is a bug in weave.inline. Incidentally, I ran into it just today, and also when trying to use some C code which called sin/cos. I may have a look at the weave code to fix it, but I can't make any promises as to when. weave is not quite trivial, and the last time I fixed something there, it took a bit of effort just to wrap my head around enough of the code to understand what to look for. If you can fix it quicker, by all means send a patch. > 2) How can I use the sin and cos functions with weave.blitz? Up until > now, I have had to comment out the weave_blitz version because I can't > figure out how to add cmath support to my blitz++ expression. You need to make sure that the math.h header is included, and that libm is linked in. Here's a trivial (scalar) example:

def sinC(x):
    """sinC(x) -> sin(x).

    Implemented using the C sin() function.
    """
    support = \
"""
#include <math.h>
"""
    code = \
"""
return_val = sin(x);
"""
    return inline(code,['x'],
                  type_converters = converters.blitz,
                  support_code = support,
                  libraries = ['m'],
                  )

Once you do those two things, your looping example with sin(X(i,j)) should work just fine. best, f From duncan at enthought.com Wed Mar 30 17:15:18 2005 From: duncan at enthought.com (Duncan Child) Date: Wed, 30 Mar 2005 16:15:18 -0600 Subject: [SciPy-user] Units in SciPy Message-ID: <424B24F6.4010104@enthought.com> Hi all, We have been using the units package written by Michael Aivazis which is available here: http://www.cacr.caltech.edu/projects/pyre/ http://www.cacr.caltech.edu/projects/pyre/epydoc/public/pyre.units-module.html One potential issue is that a units package has to guess whether the user wants to convert a temperature gradient or a temperature:

1Celsius = 1Kelvin (gradient)
1Celsius = 274.15Kelvin (absolute)

For a proprietary project Enthought reworked Michael's package to assume temperatures are absolute. This is maybe less surprising to a regular user but was tantamount to forking Michael's project. It also leaves the inherent ambiguity unresolved. If we put Mike's work into scipy we would probably want to use it unchanged. So my questions are: 1) any interest out there in us putting a units package into scipy? 2) any preference for a particular implementation? Unum, etc. 3) does anyone have input or suggestions regarding the temperature issue? Thanks, Duncan From pwang at enthought.com Wed Mar 30 18:13:03 2005 From: pwang at enthought.com (Peter Wang) Date: Wed, 30 Mar 2005 17:13:03 -0600 Subject: [SciPy-user] Units in SciPy In-Reply-To: <424B24F6.4010104@enthought.com> References: <424B24F6.4010104@enthought.com> Message-ID: <424B327F.7000707@enthought.com> Duncan Child wrote: > One potential issue is that a units package has to guess whether the > user wants to convert a temperature gradient or a temperature: ---snip--- > 3) does anyone have input or suggestions regarding the temperature issue? One thing I'd like to point out is that this is not specific to temperature. Whenever two unit systems have different origins, the unit framework will have to distinguish between differences and positions.
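To make the distinction concrete, here is a toy sketch (hypothetical helper functions, not the API of any of the packages mentioned):

def c_to_k_position(t_c):
    # an absolute temperature is a position: shift by the offset
    # between the two scales' origins
    return t_c + 273.15

def c_to_k_difference(dt_c):
    # a temperature difference: the origins cancel and only the
    # scale factor matters (degC and K happen to share it)
    return dt_c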
From pwang at enthought.com Wed Mar 30 18:13:03 2005 From: pwang at enthought.com (Peter Wang) Date: Wed, 30 Mar 2005 17:13:03 -0600 Subject: [SciPy-user] Units in SciPy In-Reply-To: <424B24F6.4010104@enthought.com> References: <424B24F6.4010104@enthought.com> Message-ID: <424B327F.7000707@enthought.com> Duncan Child wrote:

> One potential issue is that a units package has to guess whether the
> user wants to convert a temperature gradient or a temperature:
---snip---
> 3) does anyone have input or suggestions regarding the temperature issue?

One thing I'd like to point out is that this is not specific to temperature. Whenever two unit systems have different origins, the unit framework will have to distinguish between differences and positions. (Alternatively, the framework can handle all positional measurements as (X - X_0), and use one general-case transformation.) In general, operators are not unit preserving; it's just that we intuitively expect addition/subtraction to be better behaved than multiplication.

Off the top of my head, time measurement also suffers this problem. I'm sure there are others (although they're probably obscure).

-Peter
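A sketch of that one general-case transformation (hand-rolled for illustration, not from any of the packages mentioned): positions carry the origin shift, while differences see only the scale factor. The Fahrenheit offset below is the approximate value 255.372 K.

class AffineUnit:
    """A unit related to a base unit by x_base = scale * x + offset."""
    def __init__(self, scale, offset):
        self.scale = scale
        self.offset = offset

    def position_to_base(self, x):
        # Positional measurement: shift the origin too.
        return self.scale * x + self.offset

    def difference_to_base(self, dx):
        # Difference: the offsets cancel.
        return self.scale * dx

celsius = AffineUnit(1.0, 273.15)            # base unit: kelvin
fahrenheit = AffineUnit(5.0 / 9.0, 255.372)  # 0 F is about 255.372 K

print celsius.position_to_base(20.0)     # 293.15
print celsius.difference_to_base(20.0)   # 20.0
print fahrenheit.position_to_base(32.0)  # 273.15 (approximately)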
From Fernando.Perez at colorado.edu Wed Mar 30 18:16:00 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 30 Mar 2005 16:16:00 -0700 Subject: [SciPy-user] Units in SciPy In-Reply-To: <424B24F6.4010104@enthought.com> References: <424B24F6.4010104@enthought.com> Message-ID: <424B3330.8050400@colorado.edu> Duncan Child wrote:

> Hi all,
>
> We have been using the units package written by Michael Aivazis which is
> available here:
>
> http://www.cacr.caltech.edu/projects/pyre/
> http://www.cacr.caltech.edu/projects/pyre/epydoc/public/pyre.units-module.html
>
> One potential issue is that a units package has to guess whether the
> user wants to convert a temperature gradient or a temperature:
>
> 1 Celsius = 1 Kelvin (gradient)
> 1 Celsius = 274.15 Kelvin (absolute)
>
> For a proprietary project Enthought reworked Michael's package to assume
> temperatures are absolute. This is perhaps less surprising to a regular
> user, but it was tantamount to forking Michael's project. It also leaves the
> inherent ambiguity unresolved. If we put Mike's work into scipy we would
> probably want to use it unchanged.
>
> So my questions are:
>
> 1) any interest out there in us putting a units package into scipy?

Yes. Note that Scientific (Konrad's) also has units support. I just mention it for reference; I can't say anything in terms of comparisons.

> 3) does anyone have input or suggestions regarding the temperature issue?

Putting my physicist hat on, I'd strongly argue for simply saying that temperatures are temperatures, period. If you need to convert temperature _differences_ (gradients), you should explicitly say so. It's OK for a package to provide utility functions to do this conveniently, if it's a frequent need. But the concept of temperature should be unchanged:

T_celsius = T_kelvin - 274.15

This formula implicitly encapsulates the fact that delta_t_celsius == delta_t_kelvin, but it's the only formula with basic physical meaning. The relative size of the unit steps is a simple consequence of this formula. And I think that good scientific libraries should encode, when possible, the more basic concepts, while providing whichever higher-level utility functions are deemed necessary.

Just my $.02

Best, f

From rkern at ucsd.edu Wed Mar 30 19:15:19 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 30 Mar 2005 16:15:19 -0800 Subject: [SciPy-user] Units in SciPy In-Reply-To: <424B24F6.4010104@enthought.com> References: <424B24F6.4010104@enthought.com> Message-ID: <424B4117.6030506@ucsd.edu> Duncan Child wrote:

> 1) any interest out there in us putting a units package into scipy?

Sure, but not any of the current ones.

> 2) any preference for a particular implementation? Unum, etc.

Unum is GPL. I don't think any of the current offerings (Unum, ScientificPython's PhysicalQuantities, and pyre.units) do particularly well with this issue, although, as I note below, PhysicalQuantities gets closest.

> 3) does anyone have input or suggestions regarding the temperature issue?

Personally: computing absolute "degrees Fahrenheit" from absolute "degrees Celsius" is not a unit conversion; it is a calculation built on top of unit conversions. Converting "feet" from "meters" is a unit conversion. Computing "feet from my house" from "meters from the North Pole" is a calculation. I think that it is unwise to expect a unit conversion system to handle such a computation the same way it handles everything else.

That said, ScientificPython hacks around this creditably: "K" and "degR" are both the absolute (referenced to absolute zero) and the differential units. "degC" and "degF" are the absolute units referenced to their respective zeros. It's limited, though. You can't add a "degR" differential temperature to a "degC" absolute temperature, but you can add a "K" differential temperature. You can add "degC" to a "degC", but that's inconsistent.

Frink[1], which I hold as the gold standard for computer unit conversions (and whose "Sample Calculations"[2] ought to be repeated for any proposed unit package), sorta gets this right. Celsius[x] and Fahrenheit[x] are functions that go back and forth between systems. Give them an unadorned number, and they'll interpret it as, e.g., "10 degrees Celsius", and convert to absolute Kelvin. Give them an absolute temperature (in Kelvin or Rankine), and they will spit out the numerical value of the temperature in their own scale (although unadorned by any unit). "degC" and "degF" are the differential units.

It would be useful to have a system that would handle the general case of referenced measurements like the Celsius and Fahrenheit scales and "meters from my house". The appropriate distinctions should be made when operations mix referenced measurements and "differential" quantities.

Oh, and any system should yoink the contents of Frink's database.

[1] http://futureboy.homeip.net/frinkdocs/
[2] http://futureboy.homeip.net/frinkdocs/#SampleCalculations

-- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter
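A rough Python rendering of the Frink semantics just described (an illustration of the behaviour only, not Frink's implementation; the Kelvin wrapper class is invented for the sketch):

class Kelvin:
    """An absolute temperature, stored in kelvins."""
    def __init__(self, value):
        self.value = value

def Celsius(x):
    if isinstance(x, Kelvin):
        # Absolute temperature in -> unadorned number on the Celsius scale.
        return x.value - 273.15
    # Unadorned number in -> absolute temperature ("x degrees Celsius").
    return Kelvin(x + 273.15)

t = Celsius(10)   # Kelvin(283.15)
print Celsius(t)  # 10.0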
From cookedm at physics.mcmaster.ca Wed Mar 30 20:01:57 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Wed, 30 Mar 2005 20:01:57 -0500 Subject: [SciPy-user] Units in SciPy In-Reply-To: <424B3330.8050400@colorado.edu> (Fernando Perez's message of "Wed, 30 Mar 2005 16:16:00 -0700") References: <424B24F6.4010104@enthought.com> <424B3330.8050400@colorado.edu> Message-ID: Fernando Perez writes:

> Putting my physicist hat on, I'd strongly argue for simply saying that
> temperatures are temperatures, period. If you need to convert
> temperature _differences_ (gradients), you should explicitly say so.
> It's OK for a package to provide utility functions to do this
> conveniently, if it's a frequent need. But the concept of temperature
> should be unchanged:
>
> T_celsius = T_kelvin - 274.15

Maybe you'd better check the fit of that physicist's hat ;-)

-- |>|\/|< /--------------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca

From Fernando.Perez at colorado.edu Wed Mar 30 20:14:59 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 30 Mar 2005 18:14:59 -0700 Subject: [SciPy-user] Units in SciPy In-Reply-To: References: <424B24F6.4010104@enthought.com> <424B3330.8050400@colorado.edu> Message-ID: <424B4F13.4090000@colorado.edu> David M. Cooke wrote:

> Fernando Perez writes:
>
>> Putting my physicist hat on, I'd strongly argue for simply saying that
>> temperatures are temperatures, period. If you need to convert
>> temperature _differences_ (gradients), you should explicitly say so.
>> It's OK for a package to provide utility functions to do this
>> conveniently, if it's a frequent need. But the concept of temperature
>> should be unchanged:
>>
>> T_celsius = T_kelvin - 274.15
>
> Maybe you'd better check the fit of that physicist's hat ;-)

Sorry, it's 273.15; I didn't actually look it up, I just used the value given by the OP.

Best, f

From rkern at ucsd.edu Thu Mar 31 08:40:44 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 31 Mar 2005 05:40:44 -0800 Subject: [SciPy-user] Units in SciPy In-Reply-To: <424B24F6.4010104@enthought.com> References: <424B24F6.4010104@enthought.com> Message-ID: <424BFDDC.1070701@ucsd.edu> Duncan Child wrote:

> 3) does anyone have input or suggestions regarding the temperature issue?

I've thought about this a little more, and I would like to describe here an outline of the system that I would like to see, with examples drawn from my current research.

One of the many things that Konrad got right was the name: the system is dealing with physical quantities. The most general form would consist of "numerical quantities tagged by their measurement system." A large subset can be described as "numerical quantities with units." What we need is a system that can perform operations and conversions between measurement systems, some of which will be user-defined and arbitrarily complicated. The methods for dealing with the unit-subset are sufficiently covered by the currently available software. With the exception of certain, very standard conversions like temperature scales, most of the non-unit-subset must be user-defined.

Allow me to describe some of the many measurement systems in my current project in more detail than you probably care for; feel free to skip ahead. I am a geophysicist. I am modelling the distribution of slip over a fault surface during and after an earthquake. I am using measurements from a local GPS network and Interferometric Synthetic Aperture Radar (InSAR: I get images of the very small displacements of each pixel between two flyovers of a satellite; it's kind of like having a very dense GPS network that doesn't get sampled very often). My measurement systems include:

Location:
* Latitude and longitude of various entities: GPS stations, corners of fault patches, the corners of my InSAR images
* Local northings, eastings, up (essentially the meters north, east, and up from a given local reference point)
* X, Y, Z referenced to the fault geometry
* Pixels in the InSAR images
* Distance along-slip and along-dip on the fault (a location on the fault plane itself)

Displacements:
* Differential ground displacements of the GPS given a reference station
* Differential ground displacements from the InSAR images given a reference pixel, projected onto the line of sight of the satellite
* Absolute ground displacements given a slip model

Time:
* GPS time: seconds from an arbitrary start point, no leap seconds
* Satellite time: seconds from Jan 1 00:00 of the launch year of the satellite, no leap seconds
* Calendrical time: leap seconds!
Miscellaneous:
* Slip, rake: magnitude and direction of the slip for each fault patch
* Dip-slip, strike-slip: orthogonal slip components for each fault patch
* Green functions: observable displacements (GPS and InSAR) as a function of slip across each fault patch
* Smoothing kernel: for regularizing my inverse problem

... and probably some more. Now, I don't claim that, even given my dream physical quantities system, I would commit all of these entities to it. Some, like time, are more suited to the strategy of converting everything as soon as it enters the picture. Others don't need conversions between different measurement systems.

Skip to here if you are bored by geophysics.

I have found that I have been duplicating snippets of code, throwing around little one-to-one conversion functions, or worse, just not bothering with some task because it's too annoying to switch between these systems. Here is my sketch of a physical quantities system:

Each quantity has a value, which can be anything reasonably number-like, and a measurement system, which probably should be a full-fledged object, but perhaps could also be a string for standard shortcuts.

In[1]: x = quantity(10.5, apples)
In[2]: y = quantity(12, oranges)

There will exist a function for converting quantities from one system to another.

In[3]: z = convert(x, oranges)

In the case of real, honest-to-goodness units like "m" and "degC" (the differential kind, not absolute systems, not even Kelvin or Rankine), converting reduces to the well-known algorithms. Failing that, convert() will look for an appropriate conversion path among all of the measurement system objects registered with it. This lookup algorithm can be stolen from PyProtocols[1].

Operations that can be performed on quantities of a given measurement system will be defined by the measurement system object. For example, quantities in the (lat, lon) system can be converted to (northings, eastings). To a (lat, lon) quantity, I can add or subtract a unitted quantity of dimensions (length, length). I cannot add another (lat, lon) quantity, but I can subtract it, yielding a (length, length), which, given a suitable context, can be converted to (range, azimuth). Multiplication does not make sense, nor does division, but trigonometric functions might, depending on your use cases.

Of course, temperature scales fit neatly into this system. Since the "shift-scale-shift" type of conversion happens reasonably often, a bit of general-purpose scaffolding can be constructed for it, which the temperature scales can use.

[1] http://peak.telecommunity.com/PyProtocols.html

-- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter
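A minimal runnable rendering of that sketch (the names follow the pseudocode above; the registry is a plain dict holding direct paths only, whereas a real system would do the PyProtocols-style path lookup Robert mentions):

class Quantity:
    def __init__(self, value, system):
        self.value = value
        self.system = system

_conversions = {}  # (from_system, to_system) -> conversion function

def register(frm, to, func):
    _conversions[(frm, to)] = func

def convert(q, to):
    # Direct paths only; a real implementation would search for chains.
    return Quantity(_conversions[(q.system, to)](q.value), to)

# The "shift-scale-shift" scaffolding, reused by the temperature scales:
def shift_scale_shift(pre, scale, post):
    return lambda x: (x + pre) * scale + post

register('Celsius', 'Fahrenheit', shift_scale_shift(0.0, 9.0 / 5.0, 32.0))
register('Fahrenheit', 'Celsius', shift_scale_shift(-32.0, 5.0 / 9.0, 0.0))

x = Quantity(100.0, 'Celsius')
print convert(x, 'Fahrenheit').value  # 212.0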
From ckkart at hoc.net Thu Mar 31 08:56:14 2005 From: ckkart at hoc.net (Christian Kristukat) Date: Thu, 31 Mar 2005 15:56:14 +0200 Subject: [SciPy-user] bivariate spline Message-ID: <424C017E.3030306@hoc.net> Hi, I was trying to use interpolate.bisplrep on a series of data points and I got the following error:

numerix Numeric 23.0
iopt,kx,ky,m= 0 3 3 30
nxest,nyest,nmax= 9 9 9
lwrk1,lwrk2,kwrk= 1410 571 34
xb,xe,yb,ye= -9.68459276E-07 14.1335814 2.7324011 22.748873
eps,s 1.E-16 22.2540333
Traceback (most recent call last):
  File "relax_6.py", line 1001, in ?
    main()
  File "relax_6.py", line 976, in main
    bsp = bisplrep(x,y,h)
  File "/usr/lib/python2.3/site-packages/scipy/interpolate/fitpack.py", line 611, in bisplrep
    tx,ty,nxest,nyest,wrk,lwrk1,lwrk2)
SystemError: error return without exception set

x, y, h are all 1-dim float arrays. I'm using a quite recent version of scipy from CVS and Python 2.3 on Linux. I'm a little confused about the usage of the bisplrep function. What type of input is allowed? I successfully tried with the rank-2 arrays output of mgrid[,], but with rank-1 arrays describing some points I get the error above. Are there different Fortran subroutines being called depending on the input, and is my type of input valid, or did I misunderstand something?

Regards, Christian

From pajer at iname.com Thu Mar 31 09:29:53 2005 From: pajer at iname.com (Gary) Date: Thu, 31 Mar 2005 09:29:53 -0500 Subject: [SciPy-user] Units in SciPy In-Reply-To: <424B24F6.4010104@enthought.com> References: <424B24F6.4010104@enthought.com> Message-ID: <424C0961.9050508@iname.com> Duncan Child wrote:

> Hi all,
>
> We have been using the units package written by Michael Aivazis which
> is available here: [...]
>
> 1) any interest out there in us putting a units package into scipy?

I've tried units packages in other systems (notably Mathcad). After 30 minutes I abandon them. Much typing, little payback.

> 2) any preference for a particular implementation? Unum, etc.
> 3) does anyone have input or suggestions regarding the temperature issue?

This may be the same as the Scientific Python solution (which I don't quite understand): two units, degC and Cdeg. 100 degC = 373 K. 100 Cdeg = 100 Kdeg. The standard physics text Halliday, Resnick and Walker (and its offspring Cummings, et al.) does it that way.

-gary

> Thanks,
>
> Duncan
>
> _______________________________________________
> SciPy-user mailing list
> SciPy-user at scipy.net
> http://www.scipy.net/mailman/listinfo/scipy-user