From john.russo at poste.it Sun Oct 9 22:45:45 2005 From: john.russo at poste.it (john russo) Date: Mon, 10 Oct 2005 04:45:45 +0200 Subject: [SciPy-user] problems installing scipy Message-ID: <200510100445.46154.john.russo@poste.it> Hi, I need to build Scipy from sources on my Linux desktop (Slamd64 with Intel Fortran compiler). I've installed Python-2.4.1 in my home directory ~/Python-2.4.1 and Numpy, F2PY in another directory ~/PythonHome with the following command: python setup.py install --home=~/PythonHome/ Then I've compiled the BLAS and LAPACK packages in the directory ~/src/blas and ~/src/LAPACK so in these directory I've got libfblas.a and lapack_LINUX_IFC.a So I've setted the environment variables BLAS and LAPACK to these files: export ~/src/blas/libfblas.a export ~/src/LAPACK/lapack_LINUX_IFC.a export ATLAS=None But the when I go to the Scipy directory and execute python setup.py install --home=~/PythonHome/ I get the following error: Traceback (most recent call last): File "setup.py", line 111, in ? setup_package(ignore_packages) File "setup.py", line 85, in setup_package ignore_packages = ignore_packages) File "scipy_core/scipy_distutils/misc_util.py", line 475, in get_subpackages config = setup_module.configuration(*args) File "/home/john/SciPy_complete-0.3.2/Lib/linalg/setup_linalg.py", line 40, in configuration raise NotFoundError,'no lapack/blas resources found' scipy_distutils.system_info.NotFoundError: no lapack/blas resources found Can anyone help me? Thanks, John From strawman at astraw.com Sun Oct 2 23:16:22 2005 From: strawman at astraw.com (Andrew Straw) Date: Sun, 02 Oct 2005 20:16:22 -0700 Subject: [SciPy-user] Building scipy without fortran compiler on Windows using MSVC C++ Toolkit compiler Message-ID: <4340A286.5070505@astraw.com> Hi, I'm trying to get scipy compiled for Python 2.4 on Windows using the free (but not open source) MSVC C++ Toolkit compiler. (Setup of this compiler is detailed here: http://www.vrplumber.com/programming/mstoolkit/ ) I'm trying to use the MSVC C++ Toolkit compiler because I've got a hunk of other (C++) code compiled with it. (Although perhaps mingw will link fine to that code -- anyone know?) So far, setup.py fails when it can't find f77. I remember talk about making scipy easy to install and specifically not requiring a fortran compiler, at least for a minimal install. Is this possible today? I'm talking about the "traditional" scipy (with Numeric) here, not newcore. In particular I'd like to use scipy.sparse. Here are notes on what I've done: (Using Tortoise SVN) checked out http://svn.scipy.org/svn/scipy/trunk into "scipy". cd scipy mkdir scipy_core (Using Tortoise SVN) checked out http://svn.scipy.org/svn/scipy_core/trunk into "scipy_core" cd .. (now back in scipy) As described in http://www.scipy.org/documentation/buildscipywin32.txt installed ATLAS into C:\atlas\3.6.0\P4SSE2 (downloaded from http://www.scipy.org/download/atlasbinaries/winnt/ ) Set BLAS=c:\atlas\3.6.0\P4SSE2\libf77blas.a Set LAPACK=c:\atlas\3.6.0\P4SSE2\liblapack.a python setup.py install (multiple times -- it kept complaining about no permissions in C:\windows\temp\*, but running again seemed to fix the problem.) python setup.py install [snipped lots of stuff here which looked OK] running build_clib No module named msvccompiler in scipy_distutils, trying from distutils.. 
customize MSVCCompiler customize MSVCCompiler using build_clib 0 Could not locate executable g77 Could not locate executable f77 customize GnuFCompiler Could not locate executable f77 Executable f77 does not exist Could not locate executable f77 Executable f77 does not exist Could not locate executable f77 Executable f77 does not exist Could not locate executable ifort Could not locate executable ifc Could not locate executable efort Could not locate executable efc Could not locate executable ifort MSVCCompiler instance has no attribute 'lib' customize AbsoftFCompiler Could not locate executable f90 Executable f90 does not exist MSVCCompiler instance has no attribute 'lib' Could not locate executable ifort Could not locate executable ifc Could not locate executable efort Could not locate executable efc Could not locate executable ifort MSVCCompiler instance has no attribute 'lib' customize G95FCompiler Could not locate executable g95 Executable g95 does not exist customize GnuFCompiler Could not locate executable f77 Executable f77 does not exist Could not locate executable f77 Executable f77 does not exist Could not locate executable f77 Executable f77 does not exist customize GnuFCompiler Could not locate executable f77 Executable f77 does not exist Could not locate executable f77 Executable f77 does not exist Could not locate executable f77 Executable f77 does not exist customize GnuFCompiler using build_clib building 'mach' library compiling Fortran sources f77(f77) options: '-Wall -fno-second-underscore -O2 -funroll-loops' compile options: '-c' f77:f77: Lib\special\mach\d1mach.f Could not locate executable f77 Executable f77 does not exist error: Command "f77 -Wall -fno-second-underscore -O2 -funroll-loops -c -c Lib\sp ecial\mach\d1mach.f -o build\temp.win32-2.4\Lib\special\mach\d1mach.o" failed wi th exit status 1 From rkern at ucsd.edu Sun Oct 2 23:28:15 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sun, 02 Oct 2005 20:28:15 -0700 Subject: [SciPy-user] Building scipy without fortran compiler on Windows using MSVC C++ Toolkit compiler In-Reply-To: <4340A286.5070505@astraw.com> References: <4340A286.5070505@astraw.com> Message-ID: <4340A54F.6050708@ucsd.edu> Andrew Straw wrote: > Hi, > > I'm trying to get scipy compiled for Python 2.4 on Windows using the > free (but not open source) MSVC C++ Toolkit compiler. (Setup of this > compiler is detailed here: > http://www.vrplumber.com/programming/mstoolkit/ ) > > I'm trying to use the MSVC C++ Toolkit compiler because I've got a hunk > of other (C++) code compiled with it. (Although perhaps mingw will link > fine to that code -- anyone know?) IIRC, you can't link C++ code compiled with mingw with C++ code that was compiled with MSVC. The C++ ABIs are different. However, it should be possible to have both mingw-compiled C and FORTRAN extension modules and MSVC-compiled C++ extension modules in the same process. > So far, setup.py fails when it can't find f77. I remember talk about > making scipy easy to install and specifically not requiring a fortran > compiler, at least for a minimal install. Is this possible today? I'm > talking about the "traditional" scipy (with Numeric) here, not newcore. > In particular I'd like to use scipy.sparse. That's probably not possible with old scipy. scipy.sparse in particular has some FORTRAN in it. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From christophe.grimault at novagrid.com Mon Oct 3 05:29:06 2005 From: christophe.grimault at novagrid.com (christophe grimault) Date: Mon, 03 Oct 2005 09:29:06 +0000 Subject: [SciPy-user] Consequenses of SciPy Core 0.4 ? In-Reply-To: <433D4C80.9080003@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> <17212.63995.2169.219479@monster.linux.in> <08b0ad3bde9aedda81f752e937f45de3@uni-mainz.de> <1128095333.5231.5.camel@pandora> <433D4C80.9080003@ee.byu.edu> Message-ID: <1128331746.5313.2.camel@pandora> Le vendredi 30 septembre 2005 ? 08:32 -0600, Travis Oliphant a ?crit : > christophe grimault wrote: > > >Hi Travis, > > > >Does the new scipy core means that, at least on unix, we won't have to > >build and install Numeric anymore ? > > > > > > > Right, scipy core replaces Numeric. > > Now, you still need to get scipy core built and installed. Which > version of UNIX are you running? > > -Travis > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user I use linux Fedora Core3, with python 2.4, Scipy 0.3.2 and boost_1_33. Chris From dani.berzas at gmail.com Mon Oct 3 07:42:19 2005 From: dani.berzas at gmail.com (Daniel =?ISO-8859-1?Q?Jim=E9nez?=) Date: Mon, 03 Oct 2005 13:42:19 +0200 Subject: [SciPy-user] memory leak with Python/C API Message-ID: <1128339739.3851.56.camel@localhost.localdomain> Hi, every one, First sorry if this is not the right the place to ask this question. I'm having memory leak when working with Python/C API. A big amount of data is generated in C and transfered to Python, but after use the memory is not freed. The program run until the OS kill the process. I'm working with Ubuntu-Warty, Python 2.3.4, [GCC 3.3.4 (Debian 1:3.3.4-9ubuntu5)] on linux2 with scipy 0.3.2 The C program goes as follows: { Load the initial data (list of arrays) with PyArg_ParseTuple(). ExitList = PyList_New(..); HpList = PyList_New(n_esp); loop_1 { hp = calloc() (compute hp) PyList_SET_ITEM(HpList,i,...(char *)hp) ); } PyList_SET_ITEM(ExitList,0,(PyObject *)HpList); return ExitList; } In Python I activate garbage collector and delelete ExitList. But the used memory continues increasing until the process is killed. I've also used valgring but without results. I'll really appreciate any suggestion on this topic. Shall I try wrapping C with SWIG, Boost, weave, Pyrex, etc. ?. Thanks for your help. Dani. From rkern at ucsd.edu Mon Oct 3 07:56:43 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 03 Oct 2005 04:56:43 -0700 Subject: [SciPy-user] memory leak with Python/C API In-Reply-To: <1128339739.3851.56.camel@localhost.localdomain> References: <1128339739.3851.56.camel@localhost.localdomain> Message-ID: <43411C7B.40801@ucsd.edu> Daniel Jim?nez wrote: > Hi, every one, > First sorry if this is not the right the place to ask this question. Probably not. You'll probably get a better response at a more general forum like comp.lang.python . > I'm having memory leak when working with Python/C API. > > A big amount of data is generated in C and transfered to Python, but > after use the memory is not freed. The program run until the OS kill the > process. > > I'm working with Ubuntu-Warty, Python 2.3.4, > [GCC 3.3.4 (Debian 1:3.3.4-9ubuntu5)] on linux2 > with scipy 0.3.2 > > > The C program goes as follows: > { > Load the initial data (list of arrays) with PyArg_ParseTuple(). 
> ExitList = PyList_New(..); > HpList = PyList_New(n_esp); > loop_1 > { hp = calloc() > (compute hp) > PyList_SET_ITEM(HpList,i,...(char *)hp) ); You're eliding a lot of the code that's going to tell us where your leak is. For example, PyList_SET_ITEM doesn't take char* as an argument. Do you have a PyString_FromString() in there? It's entirely possible that the function you are calling copies the data in the char*. PyString_FromString() certainly does. You will have to free(hp) yourself. > } > > PyList_SET_ITEM(ExitList,0,(PyObject *)HpList); > > return ExitList; > } It would help if you could boil down your function to the smallest complete example that displays the problem. In the process, you'll probably find the answer yourself. If you don't, I'll be waiting on comp.lang.python . -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From swartzjp at gmail.com Mon Oct 3 09:13:12 2005 From: swartzjp at gmail.com (Julian Swartz) Date: Mon, 3 Oct 2005 15:13:12 +0200 Subject: [SciPy-user] Possible bug in bessel special function h2vp Message-ID: <583cfa120510030613w62e73048hdbdd9d6994112602@mail.gmail.com> Hi, I've encountered what appears to be a bug in the function h2vp, which calculates the nth order derivative of a hankel function of the second kind. Here is an example with a second derivative being requested. The function returns two different values on successive calls. Any subsequent calls will yield the same result as the second call shown below. This problem is also evident in function h1vp, calculating the derivatives of a hankel function of the first kind. >>> from scipy import * >>> from scipy.special import * >>> n = 1.0 >>> x = 2*0.3*pi >>> h2vp(n, x, 2) (2.9279764804171628e+165+0.22981683442664608j) >>> h2vp(n, x, 2) (-0.23515420288453626+0.54623524288750835j) Even more interesting is that given that the hankel function of the second kind is defined as: (2) H (x) = J (x) - jY (x) n n n Where J and Y are the standard cylindrical bessel functions of the first and second kind. One would therefore expect that the second derivative of the Hankel function should be the sum of the second derivatives of the ordinary bessel functions. Using the scipy functions jvp and yvp, >>> jvp(n, x, 2) -0.40831349982163045 >>> yvp(n, x, 2) -0.18651604740953789 Which is quite different to either of the results above. Any ideas on what the problem could be. (or am I just missing something?) Julian -------------- next part -------------- An HTML attachment was scrubbed... URL: From rkern at ucsd.edu Mon Oct 3 09:33:41 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 03 Oct 2005 06:33:41 -0700 Subject: [SciPy-user] Possible bug in bessel special function h2vp In-Reply-To: <583cfa120510030613w62e73048hdbdd9d6994112602@mail.gmail.com> References: <583cfa120510030613w62e73048hdbdd9d6994112602@mail.gmail.com> Message-ID: <43413335.7050308@ucsd.edu> Julian Swartz wrote: > Hi, > > I've encountered what appears to be a bug in the function h2vp, which > calculates the nth order derivative of a hankel function of the second kind. > Here is an example with a second derivative being requested. The > function returns two different values on successive calls. Any > subsequent calls will yield the same result as the second call shown > below. This problem is also evident in function h1vp, calculating the > derivatives of a hankel function of the first kind. 
> > >>>> from scipy import * >>>> from scipy.special import * >>>> n = 1.0 >>>> x = 2*0.3*pi >>>> h2vp(n, x, 2) > (2.9279764804171628e+165+0.22981683442664608j) >>>> h2vp(n, x, 2) > (-0.23515420288453626+0.54623524288750835j) > > Even more interesting is that given that the hankel function of the > second kind is defined as: > > (2) > H (x) = J (x) - jY (x) > n n n > > Where J and Y are the standard cylindrical bessel functions of the first > and second kind. One would therefore expect that the second derivative > of the Hankel function should be the sum of the second derivatives of > the ordinary bessel functions. Using the scipy functions jvp and yvp, > >>>> jvp(n, x, 2) > -0.40831349982163045 >>>> yvp(n, x, 2) > -0.18651604740953789 > > Which is quite different to either of the results above. > > Any ideas on what the problem could be. (or am I just missing something?) Not a clue. On OS X with the last CVS of scipy before the migration to SVN, I don't see differences when repeating the call, but I do see a difference from the stated definition for derivatives of order > 1: In [1]: n = 1.0 In [2]: x = 2*0.3*pi In [3]: special.h2vp(n,x,2) Out[3]: (-0.26294530063194937+0.22981683442664608j) In [4]: special.h2vp(n,x,2) Out[4]: (-0.26294530063194937+0.22981683442664608j) In [8]: special.jvp(n,x,2) - 1j*special.yvp(n,x,2) Out[8]: (-0.40831349982163045+0.18651604740953792j) In [23]: special.h2vp(n,x,1) Out[23]: (-0.017916685502942564-0.58616758520739864j) In [24]: special.jvp(n,x,1) - 1j*special.yvp(n,x,1) Out[24]: (-0.017916685502942648-0.58616758520739876j) In [21]: special.h2vp(n,x,3) Out[21]: (-0.2382065153405013+0.02078698962754455j) In [22]: special.jvp(n,x,3) - 1j*special.yvp(n,x,3) Out[22]: (0.050806004596450884+0.10554382826770611j) In [25]: special.hankel2(n,x) Out[25]: (0.58147279675872476+0.1732031480684324j) In [26]: special.jv(n,x) - 1j*special.yv(n,x) Out[26]: (0.58147279675872487+0.17320314806843251j) -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From stephen.walton at csun.edu Mon Oct 3 11:23:03 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 03 Oct 2005 08:23:03 -0700 Subject: [SciPy-user] Re: [Numpy-discussion] Release of SciPy Core 0.4 (Beta) In-Reply-To: <4340C6FA.1050307@optusnet.com.au> References: <433CE24A.6040509@ee.byu.edu> <4340B872.8020102@optusnet.com.au> <4340BE90.6080302@pfdubois.com> <4340C6FA.1050307@optusnet.com.au> Message-ID: <43414CD7.8020507@csun.edu> Tim Churches wrote: >That would eb a most welcome development. If you are in the mood for >some minor extensions to MA (you probably aren't, but just on the >off-chance...), then support for multiple masking values would be great. > > This sounds like an overly complicated addition to MA. 
If I may be so bold, it seems to me that what you might want is your own object which maintains parallel arrays of values and "mask reasons," if you will, and generates a mask array accordingly From stephen.walton at csun.edu Mon Oct 3 11:42:28 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 03 Oct 2005 08:42:28 -0700 Subject: [SciPy-user] Possible bug in bessel special function h2vp In-Reply-To: <43413335.7050308@ucsd.edu> References: <583cfa120510030613w62e73048hdbdd9d6994112602@mail.gmail.com> <43413335.7050308@ucsd.edu> Message-ID: <43415164.5090306@csun.edu> Robert Kern wrote: >Julian Swartz wrote: > > >>Hi, >> >>I've encountered what appears to be a bug in the function h2vp, which >>calculates the nth order derivative of a hankel function of the second kind. >>Here is an example with a second derivative being requested. The >>function returns two different values on successive calls. >> >Not a clue. On OS X with the last CVS of scipy before the migration to >SVN, I don't see differences when repeating the call, > I get the same answers as Robert. Fedora Core 4. From shupe at ipac.caltech.edu Mon Oct 3 12:03:39 2005 From: shupe at ipac.caltech.edu (David Shupe) Date: Mon, 3 Oct 2005 09:03:39 -0700 Subject: [SciPy-user] Re: Problems Installing SciPy on OS X 10.4.2 w/python 2.3.5 In-Reply-To: <20050930173726.A17933EB9E@www.scipy.com> References: <20050930173726.A17933EB9E@www.scipy.com> Message-ID: On Sep 30, 2005, at 10:37 AM, scipy-user-request at scipy.net wrote: > > Message: 6 > Date: Fri, 30 Sep 2005 09:44:06 -0700 > From: Zane Crawford > Subject: [SciPy-user] Problems Installing SciPy on OS X 10.4.2 w/ > python 2.3.5 > To: scipy-user at scipy.net > Message-ID: <20050930164406.GA1688 at ideotrope.org> > Content-Type: text/plain; charset=us-ascii > > With much frustration, and some patient help from Chris Fonnesbeck, I > have been trying to get SciPy installed on my Mac. > > I am running OS X 10.4.2, and am attempting to use the stock python > (2.3.5) which is installed by the Apple developer tools. I believe I > have all of the underlying dependencies installed and functioning > (Numeric 24.0b2, AquaTerm 1.0b2, fftw, GnuPlot 4.0.0, F2PY, and > probably > other things I installed along the way), and am now running into > problems with the actual build of SciPy itself. I have just tried > three > different ways: > > 1) I installed a binary package which Chris created for me, built > on python > 2.3.5. It installs fine, and I can "import scipy" without issues, > however, I can't use anything from it, because of the > __cvs_version__ > bug, which is referred to in the mailing list archives, but > which was > apparently not fixed in a persistent way. (it seems there was some > clever code which allowed the software to keep track of its own > version based on the CVS versions of the files, but this became > broken when the source was migrated to SVN?) > > 2) I tried to build the last CVS version of SciPy, from July 29th, > 2005. This fails almost immediately, regardless of whether I'm > using > GCC 3.3 or GCC 4.0, with an error from posixpath.py, line 77: > i = p.rfind('/') + 1 > AttributeError: 'NoneType' object has no attribute 'rfind' > > 3) I tried pulling the most recent version of the source from SVN. 
> This > seems to get much further than the CVS build, but fails eventually: > > In file included from Lib/xplt/src/play/all/hash0.c:8: > build/temp.darwin-8.2.0-Power_Macintosh-2.3/config_pygist/ > config.h:3:3: > #error destroy this config.h and rerun configure script > > However, there are many many warnings prior to this failure, and I > don't know what the acutal problem is > > I've completely removed (as far as I can tell) all installed SciPy > packages (from Chris' build) and think I have a clean 2.3.5 python > environment ready to install SciPy. If anyone has any other sources > they would recommend building from, I'm happy to try. My initial > attempt to install SciPy (prior to this mess) was via Fink, which > seemed > to result in its own problems. > > -- > Zane A. Crawford . + zane at ideotrope.org > Amateur Human * , * https://ideotrope.org > Boulder, Colorado __o > work : 303.735.3729 _`\<,_ "Here we are, trapped in the > cell : 303.815.6866 (*)/ (*) amber of the moment. There > PGP : 0x55E0815F is no why." I was able to install SciPy from svn last Thursday (Sep 29) on a new system with OS X 10.4.2 and XCode 2. I didn't use the stock Python because it doesn't have readline support. I wanted to have matplotlib and IPython in addition to SciPy, and couldn't get matplotlib working using Fink. If you are willing to use a different Python, here are the steps I followed: - From http://undefined.org/python/ install MacPython 2.4.1 and TigerPython24Fix. Then set PATH to use /usr/local/bin first. - From http://pythonmac.org/packages/ install these packages: Numeric numarray wxPython matplotlib - Using fink, install atlas and fftw3. - In the svn checkout of the scipy trunk, set up site.cfg to find the Atlas and FFTw3 libraries. Here's the relevant snippet of my site.cfg: [DEFAULT] # library_dirs = /usr/local/lib:/opt/lib:/usr/lib # include_dirs = /usr/local/include:/opt/include:/usr/include # src_dirs = /usr/local/src:/opt/src library_dirs = /usr/local/lib:/usr/lib:/sw/lib include_dirs = /usr/local/include:/usr/include:/sw/include # Search static libraries (.a) in preference to shared ones (.so,.sl): # XXX: will be removed in future unless proved to be useful search_static_first = 1 [atlas] # system_info.py searches atlas and lapack from the following paths # library_dir:DEFAULT/atlas*:DEFAULT/ATLAS*:DEFAULT # where DEFAULT refers to library_dir defined in [DEFAULT] section and #library_dirs = /usr/lib/3dnow # Debian Sid # For overriding the names of the atlas libraries: # atlas_libs = lapack, f77blas, cblas, atlas [lapack] # library_dirs = .. [lapack_src] # src_dirs = .. [blas] # library_dirs = .. [fftw] #fftw_libs = fftw, rfftw #fftw_opt_libs = fftw_threaded, rfftw_threaded # if the above aren't found, look for {s,d}fftw_libs and {s,d} fftw_opt_libs [x11] library_dirs = /usr/X11R6/lib include_dirs = /usr/X11R6/include - Then build scipy_core and scipy. My first build of scipy failed with a message about the loader not finding a flag "cc_dynamic". I commented out lines 107 and 108 in the scipy_distutils/ gnufcompiler.py file: ~/soft/scipy/scipy_core/scipy_distutils> diff gnufcompiler.py gnufcompiler.py.bak 107,108c107,108 < #if sys.platform == 'darwin': < # opt.append('cc_dynamic') --- > if sys.platform == 'darwin': > opt.append('cc_dynamic') Then my second build worked great! scipy.test() runs cleanly. To use matplotlib, I start my sessions using /usr/local/bin/pythonw. This was much easier than my previous attempts to build scipy on Solaris! 
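As a quick sanity check on an install done this way (a minimal sketch, assuming the MacPython 2.4.1 and pythonmac.org packages listed above), the following can be run under /usr/local/bin/pythonw:

  # sanity check after building scipy_core and scipy as described above;
  # run under /usr/local/bin/pythonw so the matplotlib GUI backend can start
  import scipy
  scipy.test()           # run the test suite; it ran cleanly on this setup
  import matplotlib      # confirms the pythonmac.org matplotlib package is visible
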
Hope this helps, David Shupe Spitzer Science Center, Caltech From rowen at cesmail.net Mon Oct 3 11:59:23 2005 From: rowen at cesmail.net (Russell E. Owen) Date: Mon, 03 Oct 2005 08:59:23 -0700 Subject: [SciPy-user] pyfits bug: SIMPLE mis-formatted Message-ID: We are using pyfits 0.9.8.2 to create some FITS files and they end up with SIMPLE = T formatted as "free format". This technically violates the standard, which insists of fixed format -- though it surprises me that any fits reading software would care about such a violation. Somebody else trying to read the same files with pyfits (I'm guessing an earlier version, but I don't have any details yet other than "it was on solaris") was getting this error: "Cannot read file: mandatory keywords are not fixed format" I hope you will consider fixing the error in formatting, trivial though it is. Also, if modern pyfits still does care about fixed format for the mandatory keywords, I hope you'll change that. But I doubt it's still an issue. Other pyfits users can read the files just fine. -- Russell From rowen at cesmail.net Mon Oct 3 12:06:36 2005 From: rowen at cesmail.net (Russell E. Owen) Date: Mon, 03 Oct 2005 09:06:36 -0700 Subject: [SciPy-user] pyfits query; how to include a bit mask Message-ID: We're writing fits files with pyfits and would like to include a few bit masks (in particular "is saturated" and "ignore this pixel"). We're a bit stumped on how to do it and would appreciate any advice. -- Russell From stephen.walton at csun.edu Mon Oct 3 16:02:34 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 03 Oct 2005 13:02:34 -0700 Subject: [SciPy-user] Re: [Numpy-discussion] Release of SciPy Core 0.4 (Beta) In-Reply-To: <4340B872.8020102@optusnet.com.au> References: <433CE24A.6040509@ee.byu.edu> <4340B872.8020102@optusnet.com.au> Message-ID: <43418E5A.5040105@csun.edu> Tim Churches wrote: >However, can I ask if there are plans to add Masked Arrays to SciPy >Core? > Eric Firing pointed out on the matplotlib mailing list that they're already there: import scipy.base.ma as ma From meesters at uni-mainz.de Mon Oct 3 16:29:22 2005 From: meesters at uni-mainz.de (Christian Meesters) Date: Mon, 3 Oct 2005 22:29:22 +0200 Subject: [SciPy-user] pydoc bug? Message-ID: Hi when applying PyDoc on modules which import (parts of) scipy (version 0.3.2) it keeps running literaly for hours until I get a long Traceback ending with ... "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/sre.py", line 137, in search return _compile(pattern, flags).search(string) RuntimeError: maximum recursion depth exceeded Outcommenting scipy imports helps. Now, since pydoc is running just fine on anything else I have, I'm wondering where the problem is. Is this a known bug (I couldn't find it anywhere)? Is there a different way to solve the problem? Is it a problem of my system (still Apple's python 2.3 on OS X 10.3.9 - I intend to change end of the week). And finally: Should I file a bug report? Cheers Christian From oliphant at ee.byu.edu Mon Oct 3 17:02:12 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 03 Oct 2005 15:02:12 -0600 Subject: [SciPy-user] Re: [Numpy-discussion] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43418E5A.5040105@csun.edu> References: <433CE24A.6040509@ee.byu.edu> <4340B872.8020102@optusnet.com.au> <43418E5A.5040105@csun.edu> Message-ID: <43419C54.6030003@ee.byu.edu> Stephen Walton wrote: > Tim Churches wrote: > >> However, can I ask if there are plans to add Masked Arrays to SciPy >> Core? 
>> > Eric Firing pointed out on the matplotlib mailing list that they're > already there: > > import scipy.base.ma as ma from scipy import ma also works (everything under scipy.base is also under scipy alone). -Travis From oliphant at ee.byu.edu Mon Oct 3 20:23:58 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 03 Oct 2005 18:23:58 -0600 Subject: [SciPy-user] Purchasing Documentation In-Reply-To: <4341C1C8.3000506@optusnet.com.au> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> Message-ID: <4341CB9E.4050306@ee.byu.edu> Tim Churches wrote: >Eric Firing wrote: > > >>>OK, thanks. In the absence of documentation, I just looked for an MA >>>subdirectory, couldn't find one and assumed that it wasn't (yet) >>>supported. >>> >>> >>Tim, >> >>Documentation is coming along, but being made available in an unusual >>way: http://www.tramy.us/ >> >> > >copy of the documentation. Most likely, one copy of the documentation >will be purchased to be shared between several (or many) users in a >workgroup within an institution. > Note that it is expressly against the agreement for one copy to be shared between multiple users at the same institution. I hope this is clear.... Of course you can let somebody else look at it a couple of times, but if they will use it regularly, they need to get their own copy. Prices are always a matter of supply and demand. The whole point of the system is to allow the price system to help coordinate what people think is valuable and what developers spend time doing. What you see currently is the maximum price (and time) I could possibly set as per the agreement with Trelgol. These things can always come down, however, as time proceeds, and the market responds. Now, obviously the cost of the documentation includes something of the cost of producing the actual code. Of course, you may disagree, but I did choose the numbers based on a little bit of market research. I don't think that 7000 copies of the documentation or 7 years is all that ridiculous given that there have been over 12000 downloads of the Numeric 24.0b2 source code since April and Numeric has been in stable use for nearly 10 years. If scipy does it's job correctly, then a user-base needing documentation of 7000 is rather low-balling it I would say. I want scipy to surpass the number of users of Numeric. I'm trying to make scipy core so that everybody can convert to it, eventually. The old Numeric manual still provides documentation, and the source is still available. I think you are still getting a great deal. Unless there is another majore re-write, the documentation will be updated as it goes (and you get the updates). >I would say that perhaps $30k or one year (after completion of the >documentation) would be more reasonable criteria for making the >documentation freely available (but then I am not writing it). > > Well, given the time I had to spend on this, that is quite a bit less than the market will bear for my services elsewhere. I suppose if I were rich, I could donate more of my time. But, I'm not.... I'm really not trying to make people upset. I'm really a huge fan of open information, and would love to see the documentation completely free. It's just that I cannot afford to create it for nothing. I have lots of demands on my time. Spending it doing scientific python has to be justified, somehow. I did not start the creation of a hybrid Numeric / numarray with the hope of making any money. 
I started it because there was a need. I thought I would get more help with its implementation. But, given the reality of people's scarce time (they need to make money...), nobody was able to help me. Out of this, the documentation idea was born to try and help fund development of scipy core. I hope people can understand that the reality of scarcity dictates that we coordinate efforts through some mechanism. The price mechanism has been the most succesful large-scale mechanism yet developed. I am interested in feedback. If you don't buy the book because you think I'm asking too much money, then let me know, as Tim has done. You can email me directly, as well. Best regards, -Travis From steve at shrogers.com Mon Oct 3 23:00:20 2005 From: steve at shrogers.com (Steven H. Rogers) Date: Mon, 03 Oct 2005 21:00:20 -0600 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <433CE24A.6040509@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> Message-ID: <4341F044.8030900@shrogers.com> "python setup.py install" for scipy_core 0.4.1 fails to build with: error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC -DNO_ATLAS_INFO=1 -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/usr/local/include/python2.4 -c scipy/corelib/blasdot/_dotblas.c -o build/temp.linux-i686-2.4/scipy/corelib/blasdot/_dotblas.o" failed with exit status 1 Red Hat 9.0, gcc 3.2.2, Python 2.4.2 Any hints? Regards, Steve Travis Oliphant wrote: > > This is to announce the release of SciPy Core 0.4.X (beta) > > It is available at sourceforge which you can access by going to > > http://numeric.scipy.org > > Thanks to all who helped me with this release, especially > > Robert Kern > Pearu Peterson > > Now, let's start getting this stable.... > > -Travis Oliphant > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > -- Steven H. Rogers, Ph.D., steve at shrogers.com Weblog: http://shrogers.com/weblog "He who refuses to do arithmetic is doomed to talk nonsense." -- John McCarthy From oliphant at ee.byu.edu Mon Oct 3 23:49:36 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 03 Oct 2005 21:49:36 -0600 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <4341F044.8030900@shrogers.com> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> Message-ID: <4341FBD0.8090107@ee.byu.edu> Steven H. Rogers wrote: > "python setup.py install" for scipy_core 0.4.1 fails to build with: > > error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 > -Wall -Wstrict-prototypes -fPIC -DNO_ATLAS_INFO=1 -Iscipy/base/include > -Ibuild/src/scipy/base > -Iscipy/base/src -I/usr/local/include/python2.4 -c > scipy/corelib/blasdot/_dotblas.c -o > build/temp.linux-i686-2.4/scipy/corelib/blasdot/_dotblas.o" failed > with exit status 1 Hmm. This is definitely a blas-related problem. What other errors do you see. A quick fix is to uncomment the blas_info=0 (line 15 of scipy/corelib/setup.py) and rerun setup (this will not try to build _dotblas.c I'd like to track down what the real problem is though. Do you see any output below blas_opt_info: when you run setup.py For example, my system shows. 
blas_opt_info: atlas_blas_threads_info: Setting PTATLAS=ATLAS NOT AVAILABLE atlas_blas_info: FOUND: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/atlas'] language = c include_dirs = ['/usr/include/atlas'] These include_dirs are needed to pick up cblas.h -Travis From steve at shrogers.com Tue Oct 4 01:22:27 2005 From: steve at shrogers.com (Steven H. Rogers) Date: Mon, 03 Oct 2005 23:22:27 -0600 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <4341FBD0.8090107@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> Message-ID: <43421193.5060202@shrogers.com> Looks like I have BLAS but not ATLAS. Here's the entire output from running setup.py: root at sojourner:/home/steve/dl/python/scipy/scipy_core-0.4.1# python setup.py install Assuming default configuration (scipy/distutils/command/{setup_command,setup}.py was not found) Appending scipy.distutils.command configuration to scipy.distutils Assuming default configuration (scipy/distutils/fcompiler/{setup_fcompiler,setup}.py was not found) Appending scipy.distutils.fcompiler configuration to scipy.distutils Appending scipy.distutils configuration to scipy Assuming default configuration (scipy/weave/{setup_weave,setup}.py was not found) Appending scipy.weave configuration to scipy Assuming default configuration (scipy/test/{setup_test,setup}.py was not found) Appending scipy.test configuration to scipy F2PY Version 2_1126 Appending scipy.f2py configuration to scipy Appending scipy.base configuration to scipy blas_opt_info: atlas_blas_threads_info: Setting PTATLAS=ATLAS NOT AVAILABLE atlas_blas_info: NOT AVAILABLE /home/steve/dl/python/scipy/scipy_core-0.4.1/scipy/distutils/system_info.py:1046: UserWarning: Atlas (http://math-atlas.sourceforge.net/) libraries not found. Directories to search for the libraries can be specified in the scipy_distutils/site.cfg file (section [atlas]) or by setting the ATLAS environment variable. warnings.warn(AtlasNotFoundError.__doc__) blas_info: FOUND: libraries = ['blas'] library_dirs = ['/usr/lib'] language = f77 FOUND: libraries = ['blas'] library_dirs = ['/usr/lib'] define_macros = [('NO_ATLAS_INFO', 1)] language = f77 lapack_opt_info: atlas_threads_info: Setting PTATLAS=ATLAS scipy.distutils.system_info.atlas_threads_info NOT AVAILABLE atlas_info: scipy.distutils.system_info.atlas_info NOT AVAILABLE /home/steve/dl/python/scipy/scipy_core-0.4.1/scipy/distutils/system_info.py:970: UserWarning: Atlas (http://math-atlas.sourceforge.net/) libraries not found. Directories to search for the libraries can be specified in the scipy_distutils/site.cfg file (section [atlas]) or by setting the ATLAS environment variable. 
warnings.warn(AtlasNotFoundError.__doc__) lapack_info: FOUND: libraries = ['lapack'] library_dirs = ['/usr/lib'] language = f77 FOUND: libraries = ['lapack', 'blas'] library_dirs = ['/usr/lib'] define_macros = [('NO_ATLAS_INFO', 1)] language = f77 Appending scipy.lib configuration to scipy Assuming default configuration (scipy/fftpack/{setup_fftpack,setup}.py was not found) Appending scipy.fftpack configuration to scipy Assuming default configuration (scipy/linalg/{setup_linalg,setup}.py was not found) Appending scipy.linalg configuration to scipy Assuming default configuration (scipy/stats/{setup_stats,setup}.py was not found) Appending scipy.stats configuration to scipy Appending scipy configuration to scipy_core version 0.4.1 running install running build running config_fc running build_src building extension "scipy.base.multiarray" sources adding 'build/src/scipy/base/config.h' to sources. adding 'build/src/scipy/base/__multiarray_api.h' to sources. adding 'build/src/scipy/base/src' to include_dirs. building extension "scipy.base.umath" sources adding 'build/src/scipy/base/config.h' to sources. adding 'build/src/scipy/base/__ufunc_api.h' to sources. adding 'build/src/scipy/base/src' to include_dirs. building extension "scipy.base._compiled_base" sources adding 'build/src/scipy/base/config.h' to sources. adding 'build/src/scipy/base/__multiarray_api.h' to sources. building extension "scipy.lib._dotblas" sources building extension "scipy.lib.fftpack_lite" sources building extension "scipy.lib.mtrand" sources building extension "scipy.lib.lapack_lite" sources running build_py running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext customize GnuFCompiler customize GnuFCompiler customize GnuFCompiler using build_ext building 'scipy.lib._dotblas' extension compiling C sources gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' compile options: '-DNO_ATLAS_INFO=1 -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/usr/local/include/python2.4 -c' gcc: scipy/corelib/blasdot/_dotblas.c In file included from scipy/corelib/blasdot/_dotblas.c:6: scipy/base/include/scipy/arrayobject.h:84: warning: redefinition of `ushort' /usr/include/sys/types.h:152: warning: `ushort' previously declared here scipy/base/include/scipy/arrayobject.h:85: warning: redefinition of `uint' /usr/include/sys/types.h:153: warning: `uint' previously declared here scipy/base/include/scipy/arrayobject.h:86: warning: redefinition of `ulong' /usr/include/sys/types.h:151: warning: `ulong' previously declared here scipy/corelib/blasdot/_dotblas.c:10:22: cblas.h: No such file or directory scipy/corelib/blasdot/_dotblas.c: In function `FLOAT_dot': scipy/corelib/blasdot/_dotblas.c:21: warning: implicit declaration of function `cblas_sdot' scipy/corelib/blasdot/_dotblas.c: In function `DOUBLE_dot': scipy/corelib/blasdot/_dotblas.c:31: warning: implicit declaration of function `cblas_ddot' scipy/corelib/blasdot/_dotblas.c: In function `CFLOAT_dot': scipy/corelib/blasdot/_dotblas.c:42: warning: implicit declaration of function `cblas_cdotu_sub' scipy/corelib/blasdot/_dotblas.c: In function `CDOUBLE_dot': scipy/corelib/blasdot/_dotblas.c:52: warning: implicit declaration of function `cblas_zdotu_sub' scipy/corelib/blasdot/_dotblas.c: In function `dotblas_matrixproduct': scipy/corelib/blasdot/_dotblas.c:235: warning: implicit declaration of function `cblas_daxpy' scipy/corelib/blasdot/_dotblas.c:239: warning: implicit declaration of function `cblas_saxpy' 
scipy/corelib/blasdot/_dotblas.c:243: warning: implicit declaration of function `cblas_zaxpy' scipy/corelib/blasdot/_dotblas.c:247: warning: implicit declaration of function `cblas_caxpy' scipy/corelib/blasdot/_dotblas.c:277: warning: implicit declaration of function `cblas_dgemv' scipy/corelib/blasdot/_dotblas.c:277: `CblasRowMajor' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c:277: (Each undeclared identifier is reported only once scipy/corelib/blasdot/_dotblas.c:277: for each function it appears in.) scipy/corelib/blasdot/_dotblas.c:278: `CblasNoTrans' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c:283: warning: implicit declaration of function `cblas_sgemv' scipy/corelib/blasdot/_dotblas.c:289: warning: implicit declaration of function `cblas_zgemv' scipy/corelib/blasdot/_dotblas.c:295: warning: implicit declaration of function `cblas_cgemv' scipy/corelib/blasdot/_dotblas.c:306: `CblasTrans' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c:334: warning: implicit declaration of function `cblas_dgemm' scipy/corelib/blasdot/_dotblas.c:341: warning: implicit declaration of function `cblas_sgemm' scipy/corelib/blasdot/_dotblas.c:348: warning: implicit declaration of function `cblas_zgemm' scipy/corelib/blasdot/_dotblas.c:355: warning: implicit declaration of function `cblas_cgemm' scipy/corelib/blasdot/_dotblas.c: In function `dotblas_innerproduct': scipy/corelib/blasdot/_dotblas.c:522: `CblasRowMajor' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c:523: `CblasNoTrans' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c:579: `CblasTrans' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c: In function `dotblas_vdot': scipy/corelib/blasdot/_dotblas.c:699: warning: implicit declaration of function `cblas_zdotc_sub' scipy/corelib/blasdot/_dotblas.c:703: warning: implicit declaration of function `cblas_cdotc_sub' In file included from scipy/corelib/blasdot/_dotblas.c:6: scipy/base/include/scipy/arrayobject.h:84: warning: redefinition of `ushort' /usr/include/sys/types.h:152: warning: `ushort' previously declared here scipy/base/include/scipy/arrayobject.h:85: warning: redefinition of `uint' /usr/include/sys/types.h:153: warning: `uint' previously declared here scipy/base/include/scipy/arrayobject.h:86: warning: redefinition of `ulong' /usr/include/sys/types.h:151: warning: `ulong' previously declared here scipy/corelib/blasdot/_dotblas.c:10:22: cblas.h: No such file or directory scipy/corelib/blasdot/_dotblas.c: In function `FLOAT_dot': scipy/corelib/blasdot/_dotblas.c:21: warning: implicit declaration of function `cblas_sdot' scipy/corelib/blasdot/_dotblas.c: In function `DOUBLE_dot': scipy/corelib/blasdot/_dotblas.c:31: warning: implicit declaration of function `cblas_ddot' scipy/corelib/blasdot/_dotblas.c: In function `CFLOAT_dot': scipy/corelib/blasdot/_dotblas.c:42: warning: implicit declaration of function `cblas_cdotu_sub' scipy/corelib/blasdot/_dotblas.c: In function `CDOUBLE_dot': scipy/corelib/blasdot/_dotblas.c:52: warning: implicit declaration of function `cblas_zdotu_sub' scipy/corelib/blasdot/_dotblas.c: In function `dotblas_matrixproduct': scipy/corelib/blasdot/_dotblas.c:235: warning: implicit declaration of function `cblas_daxpy' scipy/corelib/blasdot/_dotblas.c:239: warning: implicit declaration of function `cblas_saxpy' scipy/corelib/blasdot/_dotblas.c:243: warning: implicit declaration of function `cblas_zaxpy' 
scipy/corelib/blasdot/_dotblas.c:247: warning: implicit declaration of function `cblas_caxpy' scipy/corelib/blasdot/_dotblas.c:277: warning: implicit declaration of function `cblas_dgemv' scipy/corelib/blasdot/_dotblas.c:277: `CblasRowMajor' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c:277: (Each undeclared identifier is reported only once scipy/corelib/blasdot/_dotblas.c:277: for each function it appears in.) scipy/corelib/blasdot/_dotblas.c:278: `CblasNoTrans' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c:283: warning: implicit declaration of function `cblas_sgemv' scipy/corelib/blasdot/_dotblas.c:289: warning: implicit declaration of function `cblas_zgemv' scipy/corelib/blasdot/_dotblas.c:295: warning: implicit declaration of function `cblas_cgemv' scipy/corelib/blasdot/_dotblas.c:306: `CblasTrans' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c:334: warning: implicit declaration of function `cblas_dgemm' scipy/corelib/blasdot/_dotblas.c:341: warning: implicit declaration of function `cblas_sgemm' scipy/corelib/blasdot/_dotblas.c:348: warning: implicit declaration of function `cblas_zgemm' scipy/corelib/blasdot/_dotblas.c:355: warning: implicit declaration of function `cblas_cgemm' scipy/corelib/blasdot/_dotblas.c: In function `dotblas_innerproduct': scipy/corelib/blasdot/_dotblas.c:522: `CblasRowMajor' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c:523: `CblasNoTrans' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c:579: `CblasTrans' undeclared (first use in this function) scipy/corelib/blasdot/_dotblas.c: In function `dotblas_vdot': scipy/corelib/blasdot/_dotblas.c:699: warning: implicit declaration of function `cblas_zdotc_sub' scipy/corelib/blasdot/_dotblas.c:703: warning: implicit declaration of function `cblas_cdotc_sub' error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC -DNO_ATLAS_INFO=1 -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/usr/local/include/python2.4 -c scipy/corelib/blasdot/_dotblas.c -o build/temp.linux-i686-2.4/scipy/corelib/blasdot/_dotblas.o" failed with exit status 1 Thanks, Steve Travis Oliphant wrote: > Steven H. Rogers wrote: > >> "python setup.py install" for scipy_core 0.4.1 fails to build with: >> >> error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 >> -Wall -Wstrict-prototypes -fPIC -DNO_ATLAS_INFO=1 -Iscipy/base/include >> -Ibuild/src/scipy/base >> -Iscipy/base/src -I/usr/local/include/python2.4 -c >> scipy/corelib/blasdot/_dotblas.c -o >> build/temp.linux-i686-2.4/scipy/corelib/blasdot/_dotblas.o" failed >> with exit status 1 > > > > Hmm. This is definitely a blas-related problem. What other errors do > you see. > > A quick fix is to uncomment the blas_info=0 (line 15 of > scipy/corelib/setup.py) and rerun setup (this will not try to build > _dotblas.c > > I'd like to track down what the real problem is though. > > Do you see any output below > blas_opt_info: > when you run setup.py > > For example, my system shows. 
> > blas_opt_info: > atlas_blas_threads_info: > Setting PTATLAS=ATLAS > NOT AVAILABLE > > atlas_blas_info: > FOUND: > libraries = ['f77blas', 'cblas', 'atlas'] > library_dirs = ['/usr/lib/atlas'] > language = c > include_dirs = ['/usr/include/atlas'] > > These include_dirs are needed to pick up cblas.h > > -Travis > > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > -- Steven H. Rogers, Ph.D., steve at shrogers.com Weblog: http://shrogers.com/weblog "He who refuses to do arithmetic is doomed to talk nonsense." -- John McCarthy From brendansimons at yahoo.ca Tue Oct 4 01:49:42 2005 From: brendansimons at yahoo.ca (Brendan Simons) Date: Tue, 4 Oct 2005 01:49:42 -0400 Subject: [SciPy-user] Re: Problems Installing SciPy on OS X 10.4.2 In-Reply-To: <20051003232556.AF7D23EB52@www.scipy.com> References: <20051003232556.AF7D23EB52@www.scipy.com> Message-ID: <46F4ED57-5D45-4392-ADE3-2417C308DDAF@yahoo.ca> Thanks for the instructions David. I wouldn't have been able to figure out these tricks, but with your information, I'll try to build Scipy myself this week. Is anyone (with more skills than me) interested in building a bdist_mpkg installer, or is it still premature? (http:// undefined.org/python/py2app.html#bdist-mpkg-documentation) -Brendan On 3-Oct-05, at 7:25 PM, David Shupe wrote: > I was able to install SciPy from svn last Thursday (Sep 29) on a new > system with OS X 10.4.2 and XCode 2. I didn't use the stock Python > because it doesn't have readline support. I wanted to have > matplotlib and IPython in addition to SciPy, and couldn't get > matplotlib working using Fink. If you are willing to use a different > Python, here are the steps I followed: -------------- next part -------------- An HTML attachment was scrubbed... URL: From pearu at scipy.org Tue Oct 4 03:45:05 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 4 Oct 2005 02:45:05 -0500 (CDT) Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <4341FBD0.8090107@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> Message-ID: On Mon, 3 Oct 2005, Travis Oliphant wrote: > Steven H. Rogers wrote: > >> "python setup.py install" for scipy_core 0.4.1 fails to build with: >> >> error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall >> -Wstrict-prototypes -fPIC -DNO_ATLAS_INFO=1 -Iscipy/base/include >> -Ibuild/src/scipy/base >> -Iscipy/base/src -I/usr/local/include/python2.4 -c >> scipy/corelib/blasdot/_dotblas.c -o >> build/temp.linux-i686-2.4/scipy/corelib/blasdot/_dotblas.o" failed with >> exit status 1 > > > Hmm. This is definitely a blas-related problem. What other errors do you > see. > > A quick fix is to uncomment the blas_info=0 (line 15 of > scipy/corelib/setup.py) and rerun setup (this will not try to build > _dotblas.c > > I'd like to track down what the real problem is though. The problem is that cblas.h was missing that _dotblas.c includes (see previous message in this thread). In newcore svn I have included cblas.h to scipy/corelib/blasdot to workaround this problem. 
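For anyone chasing a similar failure, a quick way to see which BLAS/LAPACK resources the build is actually detecting (a rough sketch, assuming scipy.distutils keeps the get_info helper from the old scipy_distutils.system_info):

  # probe system_info the same way setup.py does; an empty dict means "not found"
  from scipy.distutils.system_info import get_info
  print get_info('atlas')       # wants the ATLAS libraries plus cblas.h on the include path
  print get_info('blas_opt')    # falls back to plain BLAS (NO_ATLAS_INFO) when ATLAS is absent
  print get_info('lapack_opt')
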
Pearu From perry at stsci.edu Tue Oct 4 05:15:21 2005 From: perry at stsci.edu (Perry Greenfield) Date: Tue, 4 Oct 2005 05:15:21 -0400 Subject: [SciPy-user] Re: [Numpy-discussion] Re: Purchasing Documentation In-Reply-To: <434213F0.30105@optusnet.com.au> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> <4341CB9E.4050306@ee.byu.edu> <4341D32A.5000000@optusnet.com.au> <4341F777.3030404@ee.byu.edu> <434204B8.4050308@optusnet.com.au> <43420C28.5010701@pfdubois.com> <434213F0.30105@optusnet.com.au> Message-ID: First of all, I certainly don't want to prevent Travis from recovering something for all the effort he has put into scipy_core. Nor do I wish to discourage anyone that wishes to provide alternative free documentation if they so choose. But I will note some facts about numarray documentation below that may prove useful to anyone considering the latter. On Oct 4, 2005, at 1:32 AM, Tim Churches wrote: > Paul F. Dubois wrote: >> The original sources were in Framemaker. I am not positive where they >> are. In any case they are copyrighted by the Regents of the University >> of California. I am not a lawyer and don't know what the consequences >> of >> that are. LLNL granted free distribution of the printed document with >> the Numeric source code but I don't know what their position would be >> on >> using their copyrighted text in a new document or on giving away the >> sources. The numarray User's Manual was derived from the Numpy manual that LLNL originally sponsored David Ascher to write (if that is incorrect I'm sure Paul will correct me). We dutifully propagated the legal notice in the original to the numarray version. Although IANAL I'll note that the text seems to permit changes by others, but it also seems mis-worded as it refers to software, not documentation regarding the rights. What that really means in the end I'm not sure. In any case, Jochen Kupper (I hope I have that right) converted the format from Framemaker to the Python document latex style. The source for the numarray version is currently on sourceforge (under the numarray/doc directory). I'll also note much of the capabilities in scipy_core are very similar to those of numarray. There are differences, though I don't believe it would take a great deal of work to udpate the numarray version to reflect these (e.g. the changes in the types system, how rank-0 issues, the C-API, object array details and the names of the standard packages within scipy_core.) > OK, thanks Paul. That may have implications for you, Travis, if you are > planning to base your SciPy Core book on the existing NumPy > documentation. > From what I've seen of it, I don't believe it is based at all on the original manual or any derivative, but I'll leave that for Travis to comment further on. Perry From aisaac at american.edu Tue Oct 4 05:22:05 2005 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 4 Oct 2005 05:22:05 -0400 Subject: [SciPy-user] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <4341CB9E.4050306@ee.byu.edu> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au><4341CB9E.4050306@ee.byu.edu> Message-ID: On Mon, 03 Oct 2005, Travis Oliphant apparently wrote: > I hope people can understand that the reality of scarcity > dictates that we coordinate efforts through some > mechanism. 
The price mechanism has been the most > succesful large-scale mechanism yet developed. > I am interested in feedback. If you don't buy the book > because you think I'm asking too much money, then let me > know, as Tim has done. I found this an interesting approach to supporting the project. I plan to buy the book when it is released. Hmm, why wait? I should put my money where my mouth is. Just a moment ... ok, done. I view the book as a *complement* to other documentation that will appear and as a way to support the project. I agree with Tim that freely accessible online documentation will and must become available as well. As Chris notes, some of this can happen on the Wiki. I also plan to ask our library to purchase the book, but I am concerned that your statement that multiple users each need their own copy might mean a library purchase is forbidden. I assume it did not mean that, and that you just meant that making multiple copies is restricted. (Our library supports electronic book check out.) Ruling out library purchases would, I think, be a costly mistake for many reasons, which I can list if you are interested. Finally, I agree with Tim that seven years is too long and at the price I'd hope for a paperback copy. I think a better strategy would be two years copy protection, with an updated edition every two years. (But then I am not writing the code!) The basic concept is really nice, as long as it does not make it harder for you to - fully document your code, - smile on the free documentation that emerges, and - keep your sunny disposition. Cheers, Alan Isaac From jonas at cortical.mit.edu Tue Oct 4 07:48:25 2005 From: jonas at cortical.mit.edu (Eric Jonas) Date: Tue, 4 Oct 2005 07:48:25 -0400 Subject: [SciPy-user] Re: [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> Message-ID: <20051004114825.GX5015@convolution.mit.edu> I wanted to echo isaac's point: > I also plan to ask our library to purchase the book, but > I am concerned that your statement that multiple users each > need their own copy might mean a library purchase is > forbidden. I assume it did not mean that, and that you > just meant that making multiple copies is restricted. (Our > library supports electronic book check out.) Ruling out > library purchases would, I think, be a costly mistake for > many reasons, which I can list if you are interested. I couldn't agree more, although I'm not quite sure how a license could be worded such that it would allow a library copy and prevent a lab bench copy, which I think was Travis' intent. That said, have you considered selling "site licenses" of a sort? I know my lab would pay a few hundred to get a PDF that we could just stick in our fileserver and use in perpetuity. I know that right now there's nothing -preventing- us from buying just one copy and doing that (other than pesky copyright law), but we'd like to support the project. ...Eric From steve at shrogers.com Tue Oct 4 08:02:02 2005 From: steve at shrogers.com (Steven H. Rogers) Date: Tue, 04 Oct 2005 06:02:02 -0600 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> Message-ID: <43426F3A.3030303@shrogers.com> Pearu Peterson wrote: > > > On Mon, 3 Oct 2005, Travis Oliphant wrote: > >> Steven H. 
Rogers wrote: >> >>> "python setup.py install" for scipy_core 0.4.1 fails to build with: >>> >>> error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 >>> -Wall -Wstrict-prototypes -fPIC -DNO_ATLAS_INFO=1 >>> -Iscipy/base/include -Ibuild/src/scipy/base >>> -Iscipy/base/src -I/usr/local/include/python2.4 -c >>> scipy/corelib/blasdot/_dotblas.c -o >>> build/temp.linux-i686-2.4/scipy/corelib/blasdot/_dotblas.o" failed >>> with exit status 1 >> >> >> >> Hmm. This is definitely a blas-related problem. What other errors >> do you see. >> >> A quick fix is to uncomment the blas_info=0 (line 15 of >> scipy/corelib/setup.py) and rerun setup (this will not try to build >> _dotblas.c >> >> I'd like to track down what the real problem is though. > > > The problem is that cblas.h was missing that _dotblas.c includes (see > previous message in this thread). In newcore svn I have included cblas.h > to scipy/corelib/blasdot to workaround this problem. > > Pearu > Ucommenting the blas_info=0 line allowed scipy_core to build and install. "import scipy" and "from scipy import *" seem to work, but scipy_core function names are not recognized. Also, README is an empty file. Is this expected? Steve -- Steven H. Rogers, Ph.D., steve at shrogers.com Weblog: http://shrogers.com/weblog "He who refuses to do arithmetic is doomed to talk nonsense." -- John McCarthy From rkern at ucsd.edu Tue Oct 4 08:13:34 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 04 Oct 2005 05:13:34 -0700 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43426F3A.3030303@shrogers.com> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> <43426F3A.3030303@shrogers.com> Message-ID: <434271EE.2050906@ucsd.edu> Steven H. Rogers wrote: > Ucommenting the blas_info=0 line allowed scipy_core to build and > install. "import scipy" and "from scipy import *" seem to work, but > scipy_core function names are not recognized. Could you show some code that fails and the error that results? Are you sure that you don't have an older version of scipy installed? > Also, README is an empty file. Is this expected? Yes, we haven't gotten around to writing one, yet. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From jdhunter at ace.bsd.uchicago.edu Tue Oct 4 10:09:01 2005 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Tue, 04 Oct 2005 09:09:01 -0500 Subject: [SciPy-user] Re: [Numpy-discussion] Re: Purchasing Documentation In-Reply-To: <434213F0.30105@optusnet.com.au> (Tim Churches's message of "Tue, 04 Oct 2005 15:32:32 +1000") References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> <4341CB9E.4050306@ee.byu.edu> <4341D32A.5000000@optusnet.com.au> <4341F777.3030404@ee.byu.edu> <434204B8.4050308@optusnet.com.au> <43420C28.5010701@pfdubois.com> <434213F0.30105@optusnet.com.au> Message-ID: <87wtkt2x36.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Tim" == Tim Churches writes: Tim> Yes, I agree entirely. Travis, you are at perfect liberty to Tim> create commercial documentation for SciPy Core, but please Tim> don't object if others try to organise to create free open Tim> source documentation as well. Object? I'll bet dollars to doughnuts that Travis would be delighted to see this, just as he would have been at any time over the last several years. And be careful: Perry still owes me a doughnut. 
JDH From dd55 at cornell.edu Tue Oct 4 10:30:29 2005 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 4 Oct 2005 10:30:29 -0400 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <434271EE.2050906@ucsd.edu> References: <433CE24A.6040509@ee.byu.edu> <43426F3A.3030303@shrogers.com> <434271EE.2050906@ucsd.edu> Message-ID: <200510041030.29174.dd55@cornell.edu> I had trouble compiling the release candidate, it was not locating my external LAPACK, BLAS and ATLAS libraries. It turned out that site.cfg and sample_site.cfg are not present in the distutils directory of the release candidate. I copied site.cfg over from my SciPy-0.3.2 build and then compilation was successful. It might be useful to note in the README. Darren On Tuesday 04 October 2005 8:13 am, Robert Kern wrote: > Steven H. Rogers wrote: > > Ucommenting the blas_info=0 line allowed scipy_core to build and > > install. "import scipy" and "from scipy import *" seem to work, but > > scipy_core function names are not recognized. > > Could you show some code that fails and the error that results? Are you > sure that you don't have an older version of scipy installed? > > > Also, README is an empty file. Is this expected? > > Yes, we haven't gotten around to writing one, yet. From alopez at imim.es Tue Oct 4 11:31:20 2005 From: alopez at imim.es (LOPEZ GARCIA DE LOMANA, ADRIAN) Date: Tue, 4 Oct 2005 17:31:20 +0200 Subject: [SciPy-user] odeint Message-ID: <66373AD054447F47851FCC5EB49B36113E27EA@basquet.imim.es> Hi people, I'm using odeint from integrate to solve a system of ODEs. Although the output seems correct a strange error pop up at the terminal: [alopez at thymus scipy]$ python single_global.integ.py Traceback (most recent call last): File "single_global.integ.py", line 23, in func xdot[7] = + (v_NACT * x[5]**m_NACT) / (x[5]**m_NACT + k_NACT**m_NACT) - k_deg_h * x[7] ValueError: negative number cannot be raised to a fractional power odepack.error: Error occured while calling the Python function named func Traceback (most recent call last): File "single_global.integ.py", line 23, in func xdot[7] = + (v_NACT * x[5]**m_NACT) / (x[5]**m_NACT + k_NACT**m_NACT) - k_deg_h * x[7] ValueError: negative number cannot be raised to a fractional power odepack.error: Error occured while calling the Python function named func Traceback (most recent call last): File "single_global.integ.py", line 23, in func xdot[7] = + (v_NACT * x[5]**m_NACT) / (x[5]**m_NACT + k_NACT**m_NACT) - k_deg_h * x[7] ValueError: negative number cannot be raised to a fractional power odepack.error: Error occured while calling the Python function named func Traceback (most recent call last): File "single_global.integ.py", line 23, in func xdot[7] = + (v_NACT * x[5]**m_NACT) / (x[5]**m_NACT + k_NACT**m_NACT) - k_deg_h * x[7] ValueError: negative number cannot be raised to a fractional power odepack.error: Error occured while calling the Python function named func The input file looks like this: from Numeric import * from scipy.integrate import odeint def func(x, t, *args): xdot = [0.46752920416900001, 0.0, 0.00496614094561, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.45751357598800002, 0.0, 0.035314464084199998, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0] xdot[0] = + R_n - k_deg_n * x[0] xdot[1] = - k_deg_N * x[1] + k_tln_n * x[0] - k_bind_ND * (1/64.0 * x[1] * x[12]) xdot[2] = - k_deg_d * x[2] + R_d / (1 + k_inhib_H * x[8]**s_H) xdot[3] = - k_deg_D * x[3] - k_bind_ND * (1/64.0 * x[3] * x[10]) + k_tln_d * x[2] xdot[4] = - k_deg_ND * x[4] + k_bind_ND * (1/64.0 * x[1] * x[12]) - 
k_diss_ND * (1/8.0 * x[4]) xdot[5] = + k_diss_ND * (1/8.0 * x[4]) - k_deg_NACT * x[5] xdot[6] = - k_deg_DACT * x[6] + k_diss_ND * (1/8.0 * x[13]) xdot[7] = + (v_NACT * x[5]**m_NACT) / (x[5]**m_NACT + k_NACT**m_NACT) - k_deg_h * x[7] xdot[8] = - k_deg_H * x[8] + k_tln_h * x[7] xdot[9] = + R_n - k_deg_n * x[9] xdot[10] = - k_deg_N * x[10] + k_tln_n * x[9] - k_bind_ND * (1/64.0 * x[10] * x[3]) xdot[11] = - k_deg_d * x[11] + R_d / (1 + k_inhib_H * x[17]**s_H) xdot[12] = - k_deg_D * x[12] - k_bind_ND * (1/64.0 * x[12] * x[1]) + k_tln_d * x[11] xdot[13] = - k_deg_ND * x[13] + k_bind_ND * (1/64.0 * x[10] * x[3]) - k_diss_ND * (1/8.0 * x[13]) xdot[14] = + k_diss_ND * (1/8.0 * x[13]) - k_deg_NACT * x[14] xdot[15] = - k_deg_DACT * x[15] + k_diss_ND * (1/8.0 * x[4]) xdot[16] = + (v_NACT * x[14]**m_NACT) / (x[14]**m_NACT + k_NACT**m_NACT) - k_deg_h * x[16] xdot[17] = - k_deg_H * x[17] + k_tln_h * x[16] return xdot t = arange(0, 780.1, 1.0) k_deg_d = 0.0437670312482 k_tln_d = 0.198782788861 v_NACT = 0.127802051983 k_tln_n = 0.164704331687 R_d = 0.0156035391386 k_deg_H = 0.180839336779 k_deg_h = 0.136197706451 k_tln_h = 0.177097298407 k_deg_n = 0.0306548917163 k_deg_D = 0.156002070979 k_deg_DACT = 0.136154162928 k_inhib_H = 41625892.9717 k_NACT = 5.53010217661 R_n = 0.0190046778533 k_bind_ND = 38207555.8983 s_H = 2.75203671214 k_deg_N = 0.0882819939502 k_deg_ND = 0.108276907855 m_NACT = 5.58940252216 k_deg_NACT = 0.0335928388267 k_diss_ND = 66522866.1189 parameters = [k_deg_d, k_tln_d, v_NACT, k_tln_n, R_d, k_deg_H, k_deg_h, k_tln_h, k_deg_n, k_deg_D, k_deg_DACT, k_inhib_H, k_NACT, R_n, k_bind_ND, s_H, k_deg_N, k_deg_ND, m_NACT, k_deg_NACT, k_diss_ND] x_0 = [0.467529204169, 0.0, 0.00496614094561, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.457513575988, 0.0, 0.0353144640842, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0] g = open('single_global.veloc.out', 'w') args = (parameters, g) x = odeint(func, x_0, t, args) g.close() f = open('single_global.out', 'w') f.write('#\n# generated by ByoDyn\n#\n') for i in range(len(t)): f.write('%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\t%s\n'%(t[i], x[i][0], x[i][1], x[i][2], x[i][3], x[i][4], x[i][5], x[i][6], x[i][7], x[i][8], x[i][9], x[i][10], x[i][11], x[i][12], x[i][13], x[i][14], x[i][15], x[i][16], x[i][17])) f.close() And the output file, looks OK, there is no single negative number. Can anyone reproduce the error? Can anyone tell me what's going on? Thanks in advance, Adri?n. From alopez at imim.es Tue Oct 4 11:43:30 2005 From: alopez at imim.es (LOPEZ GARCIA DE LOMANA, ADRIAN) Date: Tue, 4 Oct 2005 17:43:30 +0200 Subject: [SciPy-user] odeint Message-ID: <66373AD054447F47851FCC5EB49B36113E27EC@basquet.imim.es> By the way, I've just built the same file in octave and it runs without any errors. I've compared the two results using gnuplot and the trajectories are the same. Does anyone know why is giving me those errors? I've been able to reproduce the error in other machines. Thanks again, Adri?n. From rng7 at cornell.edu Tue Oct 4 12:00:49 2005 From: rng7 at cornell.edu (Ryan Gutenkunst) Date: Tue, 04 Oct 2005 12:00:49 -0400 Subject: [SciPy-user] odeint In-Reply-To: <66373AD054447F47851FCC5EB49B36113E27EA@basquet.imim.es> References: <66373AD054447F47851FCC5EB49B36113E27EA@basquet.imim.es> Message-ID: <4342A731.5010703@cornell.edu> LOPEZ GARCIA DE LOMANA, ADRIAN wrote: > Hi people, > > I'm using odeint from integrate to solve a system of ODEs. 
Although the output seems correct > a strange error pop up at the terminal: > > [alopez at thymus scipy]$ python single_global.integ.py > Traceback (most recent call last): > File "single_global.integ.py", line 23, in func > xdot[7] = + (v_NACT * x[5]**m_NACT) / (x[5]**m_NACT + k_NACT**m_NACT) - k_deg_h * x[7] > ValueError: negative number cannot be raised to a fractional power > odepack.error: Error occured while calling the Python function named func > And the output file, looks OK, there is no single negative number. > > Can anyone reproduce the error? Can anyone tell me what's going on? I get the same error running your code, although it stops the integration in my case. I ran into the same error running some of our own biochemical simulations. My hunch (someone correct me if this is nonsense) is that this error comes from the extrapolating steps the integrator takes. odeint is a variable step size integrator, so the routine tries a step, checks tolerances, adjusts step as necessary, and so on. If the value of your variable is getting close to zero, the extrapolated step may take it to a negative value. When odeint evaluates xdot at that point, you see those errors. The solution we came up with was to integrate in terms of the logarithm of the variable values. This ensures that they are always positive, and for our systems the speed penalty is pretty small. It's pretty easy to do since: d log(x)/dt = dx/dt * 1/x So you might want to try something like: def func_log(log_x, t, *args): x = exp(log_x) dx_dt = func(x, t, *args) return dx_dt/x Of course, your initial condition must be log(x_IC). And a zero IC will kill you, but maybe picking something like 1e-16 is reasonable for your problem. Cheers, Ryan > Thanks in advance, > > Adri?n. > -- Ryan Gutenkunst | Cornell LASSP | "It is not the mountain | we conquer but ourselves." Clark 535 / (607)227-7914 | -- Sir Edmund Hillary AIM: JepettoRNG | http://www.physics.cornell.edu/~rgutenkunst/ From alopez at imim.es Tue Oct 4 13:52:59 2005 From: alopez at imim.es (LOPEZ GARCIA DE LOMANA, ADRIAN) Date: Tue, 4 Oct 2005 19:52:59 +0200 Subject: [SciPy-user] odeint Message-ID: <66373AD054447F47851FCC5EB49B36113E27EF@basquet.imim.es> Thanks for your answer, I'll check if it works in my machine. I guess your're right, the errors appear when the values of the simulation are close to zero. The problem is that I'm not just interested on solving this specific example but it is part of a bigger program where I search for the parameters, run the simulations, and evaluate a score given some data. I'll check how if I can integrate your suggestion into the program. But, don't you think we are talking about a bug? Overall, if we want to compare scipy with Octave ... I mean, I definetly prefer SciPy, it's much faster, but I also want it to be reliable, and I don't want to be caring if the simulation is "too" close to zero. Thanks for all, Adri?n. -----Original Message----- From: scipy-user-bounces at scipy.net on behalf of Ryan Gutenkunst Sent: Tue 04/10/2005 17:00 To: SciPy Users List Subject: Re: [SciPy-user] odeint LOPEZ GARCIA DE LOMANA, ADRIAN wrote: > Hi people, > > I'm using odeint from integrate to solve a system of ODEs. 
Although the output seems correct > a strange error pop up at the terminal: > > [alopez at thymus scipy]$ python single_global.integ.py > Traceback (most recent call last): > File "single_global.integ.py", line 23, in func > xdot[7] = + (v_NACT * x[5]**m_NACT) / (x[5]**m_NACT + k_NACT**m_NACT) - k_deg_h * x[7] > ValueError: negative number cannot be raised to a fractional power > odepack.error: Error occured while calling the Python function named func > And the output file, looks OK, there is no single negative number. > > Can anyone reproduce the error? Can anyone tell me what's going on? I get the same error running your code, although it stops the integration in my case. I ran into the same error running some of our own biochemical simulations. My hunch (someone correct me if this is nonsense) is that this error comes from the extrapolating steps the integrator takes. odeint is a variable step size integrator, so the routine tries a step, checks tolerances, adjusts step as necessary, and so on. If the value of your variable is getting close to zero, the extrapolated step may take it to a negative value. When odeint evaluates xdot at that point, you see those errors. The solution we came up with was to integrate in terms of the logarithm of the variable values. This ensures that they are always positive, and for our systems the speed penalty is pretty small. It's pretty easy to do since: d log(x)/dt = dx/dt * 1/x So you might want to try something like: def func_log(log_x, t, *args): x = exp(log_x) dx_dt = func(x, t, *args) return dx_dt/x Of course, your initial condition must be log(x_IC). And a zero IC will kill you, but maybe picking something like 1e-16 is reasonable for your problem. Cheers, Ryan > Thanks in advance, > > Adri?n. > -- Ryan Gutenkunst | Cornell LASSP | "It is not the mountain | we conquer but ourselves." Clark 535 / (607)227-7914 | -- Sir Edmund Hillary AIM: JepettoRNG | http://www.physics.cornell.edu/~rgutenkunst/ _______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user From managan at llnl.gov Tue Oct 4 15:16:39 2005 From: managan at llnl.gov (Rob Managan) Date: Tue, 4 Oct 2005 12:16:39 -0700 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <4341FBD0.8090107@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> Message-ID: Does the new scipy_core support the Numeric function PyArray_FromDimsAndData? I use that to create a front end to a stand alone program that generates a lot of arrays that I want to be able to query. Not the best approach maybe but it works! -- *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*- Rob Managan email managan at llnl.gov LLNL phone: 925-423-0903 P.O. Box 808, L-095 FAX: 925-422-3389 Livermore, CA 94551-0808 From zollars at caltech.edu Tue Oct 4 16:11:15 2005 From: zollars at caltech.edu (Eric Zollars) Date: Tue, 04 Oct 2005 13:11:15 -0700 Subject: [SciPy-user] building scipy core 0.4 Message-ID: <1128456676.32448.7.camel@gaijin.mayo.caltech.edu> Could someone lay out simple build instructions for scipy core 0.4 (or point me to them)? How do I indicate local builds of ATLAS/LAPACK? Fortran compiler and options? And it appears that f2py is now included? Thanks. 
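(For reference, the 0.3.x build picked up local BLAS/LAPACK/ATLAS either from a site.cfg next to scipy_distutils or from environment variables along these lines -- the paths below are placeholders only:

    export BLAS=/usr/local/lib/libfblas.a
    export LAPACK=/usr/local/lib/liblapack.a
    export ATLAS=/usr/local/lib/atlas

Is the same mechanism honoured by the new scipy.distutils, or is there a new way to point it at local builds?)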
Eric From managan at llnl.gov Tue Oct 4 16:22:24 2005 From: managan at llnl.gov (Rob Managan) Date: Tue, 4 Oct 2005 13:22:24 -0700 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <4341FBD0.8090107@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> Message-ID: As of version 0.4.2 it appears to install on Mac OSX 10.3.9. However, scipy.test() gives a bus error Here is the end of scipy.test(verbosity=2) Found 4 tests for scipy.base.index_tricks Found 0 tests for __main__ check_basic (scipy.base.function_base.test_function_base.test_all) ... ok check_nd (scipy.base.function_base.test_function_base.test_all) ... BUFFER...3 BUFFER...3 ok check_basic (scipy.base.function_base.test_function_base.test_amax) ... ok check_basic (scipy.base.function_base.test_function_base.test_amin) ... ok check_basic (scipy.base.function_base.test_function_base.test_angle) ... ok check_basic (scipy.base.function_base.test_function_base.test_any) ... ok check_nd (scipy.base.function_base.test_function_base.test_any) ... BUFFER...3 BUFFER...3 ok check_basic (scipy.base.function_base.test_function_base.test_cumprod) ... ok check_basic (scipy.base.function_base.test_function_base.test_cumsum) ... ok check_basic (scipy.base.function_base.test_function_base.test_diff) ... ok check_nd (scipy.base.function_base.test_function_base.test_diff) ... NOBUFFER...200 NOBUFFER...200 NOBUFFER...200 NOBUFFER...200 NOBUFFER...200 ok check_basic (scipy.base.function_base.test_function_base.test_extins) ... ok check_both (scipy.base.function_base.test_function_base.test_extins) ... Bus error if I add some prints into sciy/base/tests/test_function_base.py: test_extins def check_both(self): print ' ' a = rand(10) print 'a',a mask = a > 0.5 print 'mask',mask ac = a.copy() print 'ac',ac c = extract(mask, a) print 'c',c insert(a,mask,0) print 'a',a insert(a,mask,c) print 'c',c assert_array_equal(a,ac) I get a [ 0.24629164 0.1397958 0.56355685 0.53457765 0.45035916 0.05456988 0.18843402 0.057217 0.48848317 0.89830088] mask [False False True True False False False False False True] ac [ 0.24629164 0.1397958 0.56355685 0.53457765 0.45035916 0.05456988 0.18843402 0.057217 0.48848317 0.89830088] c [ 0.56355685 0.53457765 0.89830088] Bus error which implies that the insert function fails. Any ideas as to what would cause this?? Thanks!! -- *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*- Rob Managan email managan at llnl.gov LLNL phone: 925-423-0903 P.O. Box 808, L-095 FAX: 925-422-3389 Livermore, CA 94551-0808 From oliphant at ee.byu.edu Tue Oct 4 18:08:43 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 04 Oct 2005 16:08:43 -0600 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> Message-ID: <4342FD6B.20009@ee.byu.edu> Rob Managan wrote: > Does the new scipy_core support the Numeric function > PyArray_FromDimsAndData? > > I use that to create a front end to a stand alone program that > generates a lot of arrays that I want to be able to query. Not the > best approach maybe but it works! > Yes, All of the old Numeric C-API is available (I believe...). Direct access to descr->one and descr->zero does not work anymore, though (it's replaced with a function call). 
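For example, a fragment along these lines (the shape, type, and data pointer here are only illustrative) should still compile against scipy_core:

    /* Wrap an existing C buffer as an array without copying it.
       "nrows", "ncols" and "data" are hypothetical names from the caller. */
    int dims[2];
    PyObject *arr;

    dims[0] = nrows;
    dims[1] = ncols;
    arr = PyArray_FromDimsAndData(2, dims, PyArray_DOUBLE, (char *) data);
    if (arr == NULL)
        return NULL;
    /* The array does not own "data", so the buffer must stay alive for as
       long as the array (or anything referencing it) does. */

The ownership caveat is the same as it was under Numeric: the array is only a view onto your memory.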
If you look at the source, you will see that these older calls are all special cases of the call to PyArray_New() -Travis From oliphant at ee.byu.edu Tue Oct 4 18:10:43 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 04 Oct 2005 16:10:43 -0600 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> Message-ID: <4342FDE3.3040901@ee.byu.edu> Pearu Peterson wrote: > > The problem is that cblas.h was missing that _dotblas.c includes (see > previous message in this thread). In newcore svn I have included > cblas.h to scipy/corelib/blasdot to workaround this problem. > Thanks, I think this is a good idea. -Travis From oliphant at ee.byu.edu Tue Oct 4 18:23:55 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 04 Oct 2005 16:23:55 -0600 Subject: [SciPy-user] Re: [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au><4341CB9E.4050306@ee.byu.edu> Message-ID: <434300FB.7070606@ee.byu.edu> Alan G Isaac wrote: >I also plan to ask our library to purchase the book, but >I am concerned that your statement that multiple users each >need their own copy might mean a library purchase is >forbidden. I assume it did not mean that, and that you >just meant that making multiple copies is restricted. (Our >library supports electronic book check out.) Ruling out >library purchases would, I think, be a costly mistake for >many reasons, which I can list if you are interested. > > A library purchase is fine, if that's how a single copy is shared. I'll make that more clear. But, really, if multiple users need to use it at the same time, then the library should purchase several copies. >Finally, I agree with Tim that seven years is too long and >at the price I'd hope for a paperback copy. I think >a better strategy would be two years copy protection, with >an updated edition every two years. (But then I am not >writing the code!) > Thanks for the feedback. I'm experimenting with the right combination of total price and total time so feedback is welcome. I want to encourage people who can really afford the book to spend the money on it. What is the "right" time/$$ combination that will encourage this? I'm willing to shorten the time and come down on the total price. They are set so I cannot increase them. But, there is no problem with lowering them. I could also support the idea of a cheaper total price 1st edition with a need to pay for the 2nd edition again. Thanks for the feedback. >The basic concept is really nice, as >long as it does not make it harder for you to >- fully document your code, >- smile on the free documentation that emerges, and >- keep your sunny disposition. > > Don't worry, I'm not banking my future on this little experiment, so I won't worry about what other people do. In fact, as John Hunter inferred, I would be thrilled by more contributions however they come. I just want to see more people use Python for scientific computing, and am trying some things out to help it along. My only question about writing "free" documentation is that it just seems rather wasteful to spend time writing free documentation when you can set free the documentation by spending a little money instead. If you think I'm charging too much (either in $$ or time-delay), then continue to give me feedback.
I am interested in what works. -Travis From oliphant at ee.byu.edu Tue Oct 4 18:24:57 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 04 Oct 2005 16:24:57 -0600 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43426F3A.3030303@shrogers.com> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> <43426F3A.3030303@shrogers.com> Message-ID: <43430139.10703@ee.byu.edu> Steven H. Rogers wrote: > > > Pearu Peterson wrote: > >> >> >> On Mon, 3 Oct 2005, Travis Oliphant wrote: >> >>> Steven H. Rogers wrote: >>> >>>> "python setup.py install" for scipy_core 0.4.1 fails to build with: >>>> >>>> error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 >>>> -Wall -Wstrict-prototypes -fPIC -DNO_ATLAS_INFO=1 >>>> -Iscipy/base/include -Ibuild/src/scipy/base >>>> -Iscipy/base/src -I/usr/local/include/python2.4 -c >>>> scipy/corelib/blasdot/_dotblas.c -o >>>> build/temp.linux-i686-2.4/scipy/corelib/blasdot/_dotblas.o" failed >>>> with exit status 1 >>> >>> >>> >>> >>> Hmm. This is definitely a blas-related problem. What other >>> errors do you see. >>> >>> A quick fix is to uncomment the blas_info=0 (line 15 of >>> scipy/corelib/setup.py) and rerun setup (this will not try to build >>> _dotblas.c >>> >>> I'd like to track down what the real problem is though. >> >> >> >> The problem is that cblas.h was missing that _dotblas.c includes (see >> previous message in this thread). In newcore svn I have included >> cblas.h to scipy/corelib/blasdot to workaround this problem. >> >> Pearu >> > > Ucommenting the blas_info=0 line allowed scipy_core to build and > install. "import scipy" and "from scipy import *" seem to work, but > scipy_core function names are not recognized. > Which ones are you missing? -Travis From aisaac at american.edu Tue Oct 4 18:41:32 2005 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 4 Oct 2005 18:41:32 -0400 Subject: [SciPy-user] Re[2]: [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <434300FB.7070606@ee.byu.edu> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au><4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> Message-ID: On Tue, 04 Oct 2005, Travis Oliphant apparently wrote: > A library purchase is fine. If that how a single copy is shared. I'll > make that more clear. But, really, if multiple users need to use it at > the same time, then the library should purchase several copies. Our library supports single copy checkout. I think this currently is a pretty standard library function these days. > My only question about writing "free" documentation, is that it just > seems rather wasteful to spend time writing free documentation when you > can set free the documentation by spending a little money instead. I've done my part. 
;-) Cheers, Alan Isaac From Fernando.Perez at colorado.edu Tue Oct 4 18:53:00 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 04 Oct 2005 16:53:00 -0600 Subject: [SciPy-user] Re: [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <434300FB.7070606@ee.byu.edu> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au><4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> Message-ID: <434307CC.7020303@colorado.edu> Travis Oliphant wrote: > Alan G Isaac wrote: > > >>I also plan to ask our library to purchase the book, but >>I am concerned that your statement that multiple users each >>need their own copy might mean a library purchase is >>forbidden. I assume it did not mean that, and that you >>just meant that making multiple copies is restricted. (Our >>library supports electronic book check out.) Ruling out >>library purchases would, I think, be a costly mistake for >>many reasons, which I can list if you are interested. >> >> > > A library purchase is fine. If that how a single copy is shared. I'll > make that more clear. But, really, if multiple users need to use it at > the same time, then the library should purchase several copies. Travis, I think that what confused some people (and which I believe was not your original intent) was the impression that you meant to have terms on the printed version of the book which were more restrictive than those of traditional paper books. With a single physical copy of a paper book, the rules are pretty simple and constrained by the laws of nature (one non-quantum object can't really be in more than one place at the same time). Lending, borrowing, library use, 'lab bench' use, etc, are all accepted practices because while one person is using the book, nobody else has access to it. Since your book is originally provided electronically, there is the technical possiblity to make multiple physical copies, which I believe is what you wish to prevent (and something I'm not arguing with). So perhaps a clarification along the lines of the following could help (this is my wording, of course, so you should say what _you_ want, not what I get from trying to read your mind :) 'a single printed copy can be made from the electronic version, which is subject to the same restrictions imposed on paper books (lending is OK but not wholesale photocopying for redistribution, for example)' This would put at ease a lot of people who normally buy a book in a lab or research group with the natural assumption that anyone in that lab can go to the shelf and read it. Obviously if it's a book with very frequent use, traditional book purchasers buy multiple copies. With your book, the exact same thing would be expected: just because they have the PDF doesn't mean they can print 10 copies of it for the whole lab. Or at least that's my understanding. Best regards, Fernando. 
From oliphant at ee.byu.edu Tue Oct 4 19:08:18 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 04 Oct 2005 17:08:18 -0600 Subject: [SciPy-user] Re: [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <434307CC.7020303@colorado.edu> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au><4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu> Message-ID: <43430B62.9090908@ee.byu.edu> Fernando Perez wrote: > Travis Oliphant wrote: > >> Alan G Isaac wrote: >> >> >>> I also plan to ask our library to purchase the book, but I am >>> concerned that your statement that multiple users each need their >>> own copy might mean a library purchase is forbidden. I assume it >>> did not mean that, and that you just meant that making multiple >>> copies is restricted. (Our library supports electronic book check >>> out.) Ruling out library purchases would, I think, be a costly >>> mistake for many reasons, which I can list if you are interested. >>> >> >> A library purchase is fine. If that how a single copy is shared. >> I'll make that more clear. But, really, if multiple users need to >> use it at the same time, then the library should purchase several >> copies. > > > 'a single printed copy can be made from the electronic version, which > is subject to the same restrictions imposed on paper books (lending is > OK but not wholesale photocopying for redistribution, for example)' > > This would put at ease a lot of people who normally buy a book in a > lab or research group with the natural assumption that anyone in that > lab can go to the shelf and read it. Obviously if it's a book with > very frequent use, traditional book purchasers buy multiple copies. > With your book, the exact same thing would be expected: just because > they have the PDF doesn't mean they can print 10 copies of it for the > whole lab. Or at least that's my understanding. > Thanks, I like this wording. It is exactly what I meant. I also think except for a library-checkout system (where only one digital copy is in circulation), somebody should not be able to buy an e-copy and then make electronic copies for everybody in their organization. That's really quite counter productive. If you want to share your e-copy with someone for a while (or give it away) fine... I'm really just asking that you treat the e-copy something like a physical book. I'll make some more details concerning the intent, available. Thanks for everybody's help. -Travis From fonnesbeck at gmail.com Tue Oct 4 21:39:16 2005 From: fonnesbeck at gmail.com (Chris Fonnesbeck) Date: Tue, 4 Oct 2005 21:39:16 -0400 Subject: [SciPy-user] building scipy_distutils Message-ID: <723eb6930510041839k459e7d26qe34f7321ed1cef4c@mail.gmail.com> It seems that one needs scipy_distutils to build scipy_distutils: Nelson:/usr/local/src/scipy_distutils chris$ python2.4 setup.py build Traceback (most recent call last): File "setup.py", line 47, in ? import scipy_distutils ImportError: No module named scipy_distutils This doesnt make much sense; how do you get around this for a clean build? 
Thanks, Chris From rkern at ucsd.edu Tue Oct 4 21:48:42 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 04 Oct 2005 18:48:42 -0700 Subject: [SciPy-user] building scipy_distutils In-Reply-To: <723eb6930510041839k459e7d26qe34f7321ed1cef4c@mail.gmail.com> References: <723eb6930510041839k459e7d26qe34f7321ed1cef4c@mail.gmail.com> Message-ID: <434330FA.9040207@ucsd.edu> Chris Fonnesbeck wrote: > It seems that one needs scipy_distutils to build scipy_distutils: > > Nelson:/usr/local/src/scipy_distutils chris$ python2.4 setup.py build > Traceback (most recent call last): > File "setup.py", line 47, in ? > import scipy_distutils > ImportError: No module named scipy_distutils > > This doesnt make much sense; how do you get around this for a clean build? Add /usr/local/src to the PYTHONPATH temporarily. It's a hack, but the usual way to install was to install scipy_core (the old one, not the new one) as a whole, not just scipy_distutils alone. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From steve at shrogers.com Tue Oct 4 22:50:52 2005 From: steve at shrogers.com (Steven H. Rogers) Date: Tue, 04 Oct 2005 20:50:52 -0600 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <434271EE.2050906@ucsd.edu> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> <43426F3A.3030303@shrogers.com> <434271EE.2050906@ucsd.edu> Message-ID: <43433F8C.8010808@shrogers.com> Robert Kern wrote: > Steven H. Rogers wrote: > > >>Ucommenting the blas_info=0 line allowed scipy_core to build and >>install. "import scipy" and "from scipy import *" seem to work, but >>scipy_core function names are not recognized. > > > Could you show some code that fails and the error that results? Are you > sure that you don't have an older version of scipy installed? > > When I wrote the previous post nothing seemed to work. Things are looking better now. steve at sojourner:~$ python Python 2.4.2 (#1, Sep 30 2005, 18:34:48) [GCC 3.2.2 20030222 (Red Hat Linux 3.2.2-5)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> from scipy import * >>> a = array([[1,2,3],[4,5,6]]) >>> print a [[1 2 3] [4 5 6]] >>> a[1,1] 5 >>> a.shape (2, 3) >>> a.dtype Traceback (most recent call last): File "", line 1, in ? AttributeError: dtype -- Steven H. Rogers, Ph.D., steve at shrogers.com Weblog: http://shrogers.com/weblog "He who refuses to do arithmetic is doomed to talk nonsense." -- John McCarthy From rkern at ucsd.edu Tue Oct 4 23:03:04 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 04 Oct 2005 20:03:04 -0700 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43433F8C.8010808@shrogers.com> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> <43426F3A.3030303@shrogers.com> <434271EE.2050906@ucsd.edu> <43433F8C.8010808@shrogers.com> Message-ID: <43434268.8010100@ucsd.edu> Steven H. Rogers wrote: > Robert Kern wrote: >> Could you show some code that fails and the error that results? Are you >> sure that you don't have an older version of scipy installed? > > When I wrote the previous post nothing seemed to work. Things are > looking better now. > > steve at sojourner:~$ python > Python 2.4.2 (#1, Sep 30 2005, 18:34:48) > [GCC 3.2.2 20030222 (Red Hat Linux 3.2.2-5)] on linux2 > Type "help", "copyright", "credits" or "license" for more information. 
>>>> from scipy import * >>>> a = array([[1,2,3],[4,5,6]]) >>>> print a > [[1 2 3] > [4 5 6]] >>>> a[1,1] > 5 >>>> a.shape > (2, 3) >>>> a.dtype > Traceback (most recent call last): > File "", line 1, in ? > AttributeError: dtype Are you sure that you don't have an older version of scipy installed? It should have a version number 0.4.x, not 0.3.x: In [6]: scipy.__version__ Out[6]: '0.4.2' -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Tue Oct 4 23:19:02 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 04 Oct 2005 21:19:02 -0600 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43433F8C.8010808@shrogers.com> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> <43426F3A.3030303@shrogers.com> <434271EE.2050906@ucsd.edu> <43433F8C.8010808@shrogers.com> Message-ID: <43434626.2070401@ee.byu.edu> Steven H. Rogers wrote: > Robert Kern wrote: > >> Steven H. Rogers wrote: >> >> >>> Ucommenting the blas_info=0 line allowed scipy_core to build and >>> install. "import scipy" and "from scipy import *" seem to work, but >>> scipy_core function names are not recognized. >> >> >> > > When I wrote the previous post nothing seemed to work. Things are > looking better now. > > steve at sojourner:~$ python > Python 2.4.2 (#1, Sep 30 2005, 18:34:48) > [GCC 3.2.2 20030222 (Red Hat Linux 3.2.2-5)] on linux2 > Type "help", "copyright", "credits" or "license" for more information. > >>> from scipy import * > >>> a = array([[1,2,3],[4,5,6]]) > >>> print a > [[1 2 3] > [4 5 6]] > >>> a[1,1] > 5 > >>> a.shape > (2, 3) > >>> a.dtype > Traceback (most recent call last): > File "", line 1, in ? > AttributeError: dtype You are picking up an older version of scipy. Do you have scipy installed already? If so, we are working quickly on getting a replacement so that scipy can build on top of scipy_core. For now, you probably need to delete your scipy installation, and re-install scipy_core. Again, Old scipy and new scipy core do not work together yet. We are trying to fix this, but have not done so yet. -Travis From steve at shrogers.com Tue Oct 4 23:35:37 2005 From: steve at shrogers.com (Steven H. Rogers) Date: Tue, 04 Oct 2005 21:35:37 -0600 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43434268.8010100@ucsd.edu> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> <43426F3A.3030303@shrogers.com> <434271EE.2050906@ucsd.edu> <43433F8C.8010808@shrogers.com> <43434268.8010100@ucsd.edu> Message-ID: <43434A09.8030500@shrogers.com> Didn't think so, but I did have an old version in my PYTHONPATH. With that corrected, my 0.4.1 installation seems to be working. Thanks, Steve ////////////////// Robert Kern wrote: > Steven H. Rogers wrote: > >>Robert Kern wrote: > > >>>Could you show some code that fails and the error that results? Are you >>>sure that you don't have an older version of scipy installed? >> >>When I wrote the previous post nothing seemed to work. Things are >>looking better now. >> >>steve at sojourner:~$ python >>Python 2.4.2 (#1, Sep 30 2005, 18:34:48) >>[GCC 3.2.2 20030222 (Red Hat Linux 3.2.2-5)] on linux2 >>Type "help", "copyright", "credits" or "license" for more information. 
>> >>>>>from scipy import * >>>>>a = array([[1,2,3],[4,5,6]]) >>>>>print a >> >>[[1 2 3] >> [4 5 6]] >> >>>>>a[1,1] >> >>5 >> >>>>>a.shape >> >>(2, 3) >> >>>>>a.dtype >> >>Traceback (most recent call last): >> File "", line 1, in ? >>AttributeError: dtype > > > Are you sure that you don't have an older version of scipy installed? It > should have a version number 0.4.x, not 0.3.x: > > In [6]: scipy.__version__ > Out[6]: '0.4.2' > -- Steven H. Rogers, Ph.D., steve at shrogers.com Weblog: http://shrogers.com/weblog "He who refuses to do arithmetic is doomed to talk nonsense." -- John McCarthy From worknit at gmail.com Tue Oct 4 23:49:26 2005 From: worknit at gmail.com (Jon Savian) Date: Tue, 4 Oct 2005 20:49:26 -0700 Subject: [SciPy-user] building scipy from source Message-ID: <8d9f49590510042049y78e1850ape1c2c579357f1810@mail.gmail.com> I am trying to build scipy from source. I've installed all the prerequisites using the --prefix=/path/to/installation. In this situation how would I go about installing scipy? Do I need to specify the directories of the prereqs? How would I do this? Thanks! From Fernando.Perez at colorado.edu Wed Oct 5 01:46:28 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 04 Oct 2005 23:46:28 -0600 Subject: [SciPy-user] From Python to Fortran, Mathematica and back Message-ID: <434368B4.5070107@colorado.edu> Hi all, Here are a few things which I realized might be of interest to others: http://amath.colorado.edu/faculty/fperez/python/code Some in our group still write a fair amount of Fortran source code, yet often they need constants or arrays in their source, which can be quickly computed using python/scipy's convenient resources. The py2fortran module will generate syntactically valid Fortran source (either to a string or on disk) for python variables. We also use Mathematica heavily, so I've also included a pair of files to do the same job (exporting variables from one language to another), but for Mathematica <-> Python. Note that the Mathematica stuff is older and less tested, but it works reasonably well. All of this is fairly brittle code, since it relies on delicate formatting of the output (in particular, the Mathematica-related routines use Numeric.array_repr, which may have changed in the new core). Still, it may prove to be a big time-saver for some (it is for us here). Perhaps with a bit of review/cleanup from others, this kind of functionality (if it's of interest) can be folded into the new scipy, in the i/o facilities module. I post it here mainly to gauge community interest, and to give back a bit after Travis' huge contribution with the new core. Regards, f From sylvain.gerbier at newlogic.fr Wed Oct 5 02:22:45 2005 From: sylvain.gerbier at newlogic.fr (Sylvain Gerbier) Date: Wed, 05 Oct 2005 08:22:45 +0200 Subject: [SciPy-user] Scipy with Python 2.4 on Windows - solution In-Reply-To: <1527564003.20050926222630@gmail.com> References: <1527564003.20050926222630@gmail.com> Message-ID: <43437135.4030202@newlogic.fr> Hi, Coming back to you .. I did download the exe and tried to install it, but without success .. The installation went well, but when I import scipy, the editor (PythonWin or Idle) just crashes .. I hadn't the time (and maybe the "fighting spirit" !!) to compile the sources, so I just went back to Python 2.3 and adapted my scripts. Thanks anyway for your help ..
Sylvain rainman wrote: > Hello scipy-user, > > To succesfully build SciPy with Python 2.4, replace file > \SciPy_complete-0.3.2\scipy_core\scipy_distutils\mingw32_support.py > with > http://home.uic.tula.ru/~s001180/scipy-py24/mingw32ccompiler.py > > Those who do not believe their spirit is bold enough to face > tremendous nightmares of compiling tons of sources, may take a > chance with this (beware, downloading is slow): > > http://home.uic.tula.ru/~s001180/scipy-py24/SciPy_complete-0.3.2.win32-py2.4-atlas3.6.0_WinNT_PIII.exe > > Note that this is UNofficial bugfix and UNofficial build. Just a > quick hack for those who lost all hope. > > From nwagner at mecha.uni-stuttgart.de Wed Oct 5 03:41:56 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 05 Oct 2005 09:41:56 +0200 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43434626.2070401@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> <43426F3A.3030303@shrogers.com> <434271EE.2050906@ucsd.edu> <43433F8C.8010808@shrogers.com> <43434626.2070401@ee.byu.edu> Message-ID: <434383C4.2080507@mecha.uni-stuttgart.de> Travis Oliphant wrote: > Steven H. Rogers wrote: > >> Robert Kern wrote: >> >>> Steven H. Rogers wrote: >>> >>> >>>> Ucommenting the blas_info=0 line allowed scipy_core to build and >>>> install. "import scipy" and "from scipy import *" seem to work, but >>>> scipy_core function names are not recognized. >>> >>> >>> >> >> When I wrote the previous post nothing seemed to work. Things are >> looking better now. >> >> steve at sojourner:~$ python >> Python 2.4.2 (#1, Sep 30 2005, 18:34:48) >> [GCC 3.2.2 20030222 (Red Hat Linux 3.2.2-5)] on linux2 >> Type "help", "copyright", "credits" or "license" for more information. >> >>> from scipy import * >> >>> a = array([[1,2,3],[4,5,6]]) >> >>> print a >> [[1 2 3] >> [4 5 6]] >> >>> a[1,1] >> 5 >> >>> a.shape >> (2, 3) >> >>> a.dtype >> Traceback (most recent call last): >> File "", line 1, in ? >> AttributeError: dtype > > > You are picking up an older version of scipy. Do you have scipy > installed already? If so, we are working quickly on getting a > replacement so that scipy can build on top of scipy_core. For now, > you probably need to delete your scipy installation, and re-install > scipy_core. > > > Again, > > Old scipy and new scipy core do not work together yet. > > We are trying to fix this, but have not done so yet. > > -Travis > > > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user Hi all, I have installed an old version of scipy via svn. cd /var/tmp/svn svn co http://svn.scipy.org/svn/scipy/trunk scipy cd /var/tmp/svn/scipy svn co http://svn.scipy.org/svn/scipy_core/trunk/ scipy_core python setup.py build python setup.py install (as root) Now I would like to install new scipy. What should be done prior to a new installation since old scipy and new scipy core do not work together rm -rf /usr/local/lib/python2.4/site-packages/scipy_base/ rm -rf /usr/local/lib/python2.4/site-packages/scipy rm -rf /usr/local/lib/python2.4/site-packages/scipy_distutils/ And how do I get the new scipy ? svn co http://svn.scipy.org/svn/scipy/trunk scipy svn ??? 
Nils From rkern at ucsd.edu Wed Oct 5 03:59:36 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 05 Oct 2005 00:59:36 -0700 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <434383C4.2080507@mecha.uni-stuttgart.de> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> <43426F3A.3030303@shrogers.com> <434271EE.2050906@ucsd.edu> <43433F8C.8010808@shrogers.com> <43434626.2070401@ee.byu.edu> <434383C4.2080507@mecha.uni-stuttgart.de> Message-ID: <434387E8.40101@ucsd.edu> Nils Wagner wrote: > Now I would like to install new scipy. > > What should be done prior to a new installation since old scipy and new > scipy core do not work together > > rm -rf /usr/local/lib/python2.4/site-packages/scipy_base/ > rm -rf /usr/local/lib/python2.4/site-packages/scipy > rm -rf /usr/local/lib/python2.4/site-packages/scipy_distutils/ You will also want to remove the f2py package and its script as scipy.f2py replaces it. weave, too, although that is less likely to get in the way. > And how do I get the new scipy ? > > svn co http://svn.scipy.org/svn/scipy/trunk scipy > svn ??? No, that's still the old scipy. The new scipy_core and scipy in general are being developed on branches at the moment. As has been explained in the various announcement emails: # for scipy_core: svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ cd newcore python setup.py install cd .. # for the new scipy: svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ We are in the process of porting the rest of scipy to the new scipy_core, so the latter tree doesn't build anything useful, yet. Contributions are welcome. In another email, Pearu has briefly outlined some basic information one needs to know about the new scipy.distutils, and there's more documentation on its way. There is a file TOCHANGE.txt that contains a list of things that need to be done to complete the port. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From nwagner at mecha.uni-stuttgart.de Wed Oct 5 04:12:36 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 05 Oct 2005 10:12:36 +0200 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <434387E8.40101@ucsd.edu> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> <43426F3A.3030303@shrogers.com> <434271EE.2050906@ucsd.edu> <43433F8C.8010808@shrogers.com> <43434626.2070401@ee.byu.edu> <434383C4.2080507@mecha.uni-stuttgart.de> <434387E8.40101@ucsd.edu> Message-ID: <43438AF4.8090305@mecha.uni-stuttgart.de> Robert Kern wrote: >Nils Wagner wrote: > > >>Now I would like to install new scipy. >> >>What should be done prior to a new installation since old scipy and new >>scipy core do not work together >> >>rm -rf /usr/local/lib/python2.4/site-packages/scipy_base/ >>rm -rf /usr/local/lib/python2.4/site-packages/scipy >>rm -rf /usr/local/lib/python2.4/site-packages/scipy_distutils/ >> > >You will also want to remove the f2py package and its script as >scipy.f2py replaces it. weave, too, although that is less likely to get >in the way. > > >>And how do I get the new scipy ? >> >>svn co http://svn.scipy.org/svn/scipy/trunk scipy >>svn ??? >> > >No, that's still the old scipy. The new scipy_core and scipy in general >are being developed on branches at the moment. 
As has been explained in >the various announcement emails: > ># for scipy_core: >svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ >cd newcore >python setup.py install >cd .. > ># for the new scipy: >svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ > >We are in the process of porting the rest of scipy to the new >scipy_core, so the latter tree doesn't build anything useful, yet. >Contributions are welcome. In another email, Pearu has briefly outlined >some basic information one needs to know about the new scipy.distutils, >and there's more documentation on its way. There is a file TOCHANGE.txt >that contains a list of things that need to be done to complete the port. > > Robert, Thank you very much for your helpful comments. How about /usr/local/include/python2.4/scipy/ ? Nils From rkern at ucsd.edu Wed Oct 5 04:46:39 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 05 Oct 2005 01:46:39 -0700 Subject: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43438AF4.8090305@mecha.uni-stuttgart.de> References: <433CE24A.6040509@ee.byu.edu> <4341F044.8030900@shrogers.com> <4341FBD0.8090107@ee.byu.edu> <43426F3A.3030303@shrogers.com> <434271EE.2050906@ucsd.edu> <43433F8C.8010808@shrogers.com> <43434626.2070401@ee.byu.edu> <434383C4.2080507@mecha.uni-stuttgart.de> <434387E8.40101@ucsd.edu> <43438AF4.8090305@mecha.uni-stuttgart.de> Message-ID: <434392EF.7020607@ucsd.edu> Nils Wagner wrote: > How about > > /usr/local/include/python2.4/scipy/ ? That looks like a leftover from an older installation of the new scipy_core. You should remove it. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From dani.berzas at gmail.com Wed Oct 5 05:19:08 2005 From: dani.berzas at gmail.com (Daniel =?ISO-8859-1?Q?Jim=E9nez?=) Date: Wed, 05 Oct 2005 11:19:08 +0200 Subject: [SciPy-user] SOLVED memory leak with Python/C API In-Reply-To: <43411C7B.40801@ucsd.edu> References: <1128339739.3851.56.camel@localhost.localdomain> <43411C7B.40801@ucsd.edu> Message-ID: <1128503948.26375.47.camel@localhost.localdomain> Hi ,again: only say that the problem is solved, the answer was on the manual (RTFM, me culpa). http://docs.python.org/api/memoryOverview.html Python and C uses different heaps. Python can't free memory allocated with malloc(), alloc(), etc... ---------------------------------------------------------- wrong --> hp = (double *) calloc(tama_hp ,sizeof(double)); right --> hp = (double *) PyMem_New (double, tama_hp); ---------------------------------------------------------- Sorry and thanks. Dani. > Daniel Jim?nez wrote: > > Hi, every one, > > First sorry if this is not the right the place to ask this question. > > Probably not. You'll probably get a better response at a more general > forum like comp.lang.python . > > > I'm having memory leak when working with Python/C API. > > > > A big amount of data is generated in C and transfered to Python, but > > after use the memory is not freed. The program run until the OS kill the > > process. > > > > I'm working with Ubuntu-Warty, Python 2.3.4, > > [GCC 3.3.4 (Debian 1:3.3.4-9ubuntu5)] on linux2 > > with scipy 0.3.2 > > > > > > The C program goes as follows: > > { > > Load the initial data (list of arrays) with PyArg_ParseTuple(). 
> > ExitList = PyList_New(..); > > HpList = PyList_New(n_esp); > > loop_1 > > { hp = calloc() > > (compute hp) > > PyList_SET_ITEM(HpList,i,...(char *)hp) ); > > You're eliding a lot of the code that's going to tell us where your leak > is. For example, PyList_SET_ITEM doesn't take char* as an argument. Do > you have a PyString_FromString() in there? It's entirely possible that > the function you are calling copies the data in the char*. > PyString_FromString() certainly does. You will have to free(hp) yourself. > > > } > > > > PyList_SET_ITEM(ExitList,0,(PyObject *)HpList); > > > > return ExitList; > > } > > It would help if you could boil down your function to the smallest > complete example that displays the problem. In the process, you'll > probably find the answer yourself. If you don't, I'll be waiting on > comp.lang.python . > From drohr at few.vu.nl Wed Oct 5 05:45:16 2005 From: drohr at few.vu.nl (Daniel R. Rohr) Date: Wed, 05 Oct 2005 11:45:16 +0200 Subject: [SciPy-user] plotting with gplt Message-ID: <4343A0AC.5020806@few.vu.nl> Hi list I'm trying to plot some of my data. e.g. like this >>> import scipy >>> a = [0,1,2,3] >>> b = [[0,1,2,3],[1,2,3,4],[2,3,4,5]] >>> scipy.gplt.plot(a,scipy.transpose(b)) Then I get three lines in different colors. Now I'd like to name these graphs and maybe change the colors. Also I'd like to change the range of the x- and y-axis. Basically I'd like to use all the features of gnuplot with this python interface. I couldn't find a documentation on this and looking in the source code didn't help me at all. Does anyone know a good tutorial, documentation or can help me with this? Thanks Daniel From jr at sun.ac.za Wed Oct 5 06:03:36 2005 From: jr at sun.ac.za (Johann Rohwer) Date: Wed, 5 Oct 2005 12:03:36 +0200 Subject: [SciPy-user] plotting with gplt In-Reply-To: <4343A0AC.5020806@few.vu.nl> References: <4343A0AC.5020806@few.vu.nl> Message-ID: <200510051203.36816.jr@sun.ac.za> On Wednesday, 5 October 2005 11:45, Daniel R. Rohr wrote: > I'm trying to plot some of my data. e.g. like this > > >>> import scipy > >>> a = [0,1,2,3] > >>> b = [[0,1,2,3],[1,2,3,4],[2,3,4,5]] > >>> scipy.gplt.plot(a,scipy.transpose(b)) > > Then I get three lines in different colors. Now I'd like to name > these graphs and maybe change the colors. Also I'd like to change > the range of the x- and y-axis. Basically I'd like to use all the > features of gnuplot with this python interface. > I couldn't find a documentation on this and looking in the source > code didn't help me at all. Try the Scipy plotting tutorial: http://www.scipy.org/documentation/plottutorial.html Regards Johann From rkern at ucsd.edu Wed Oct 5 06:07:50 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 05 Oct 2005 03:07:50 -0700 Subject: [SciPy-user] plotting with gplt In-Reply-To: <4343A0AC.5020806@few.vu.nl> References: <4343A0AC.5020806@few.vu.nl> Message-ID: <4343A5F6.2040906@ucsd.edu> Daniel R. Rohr wrote: > Hi list > > I'm trying to plot some of my data. e.g. like this > >>>> import scipy >>>> a = [0,1,2,3] >>>> b = [[0,1,2,3],[1,2,3,4],[2,3,4,5]] >>>> scipy.gplt.plot(a,scipy.transpose(b)) > > Then I get three lines in different colors. Now I'd like to name these > graphs and maybe change the colors. Also I'd like to change the range of > the x- and y-axis. Basically I'd like to use all the features of gnuplot > with this python interface. There are other, fuller interfaces to Gnuplot that you should be using instead. 
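For instance, matplotlib's pylab interface (mentioned below) handles named curves and explicit axis ranges directly; here is a rough sketch for the three-curve example above, with arbitrary colours and ranges:

    from pylab import plot, legend, axis, show

    a = [0, 1, 2, 3]
    b = [[0, 1, 2, 3], [1, 2, 3, 4], [2, 3, 4, 5]]

    # one plot() call per curve so each gets its own colour/style
    plot(a, b[0], 'r-')
    plot(a, b[1], 'g-')
    plot(a, b[2], 'b-')
    legend(('first', 'second', 'third'))   # names for the three lines
    axis([0, 3, 0, 6])                     # [xmin, xmax, ymin, ymax]
    show()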
gplt was designed to expose a relatively simple interface that was the same as plt and xplt, which use different backends. They are now deprecated and won't be supported in the future (although xplt seems like it will be spun out on its own). Google for "python gnuplot" and also look at some of the improvements provided by ipython. http://ipython.scipy.org/doc/manual/node15.html You may also want to consider using matplotlib, instead. http://matplotlib.sourceforge.net -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rkern at ucsd.edu Wed Oct 5 07:33:53 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 05 Oct 2005 04:33:53 -0700 Subject: [SciPy-user] odeint In-Reply-To: <66373AD054447F47851FCC5EB49B36113E27EF@basquet.imim.es> References: <66373AD054447F47851FCC5EB49B36113E27EF@basquet.imim.es> Message-ID: <4343BA21.9020208@ucsd.edu> LOPEZ GARCIA DE LOMANA, ADRIAN wrote: > Thanks for your answer, I'll check if it works on my machine. I guess you're right, the errors appear when the values of the simulation are close to zero. > > The problem is that I'm not just interested in solving this specific example but it is part of a bigger program where I search for the parameters, run the simulations, and evaluate a score given some data. I'll check if I can integrate your suggestion into the program. > > But, don't you think we are talking about a bug? Overall, if we want to compare scipy with Octave ... I mean, I definitely prefer SciPy, it's much faster, but I also want it to be reliable, and I don't want to have to care whether the simulation is "too" close to zero. What exactly is the bug? odeint can't know that some of your parameters aren't supposed to be negative and thus automatically avoid them. When you ran your simulation, you got some output warning you that the function was raising exceptions, but the simulation completed, did it not? odeint correctly determined that its guesses overshot the boundary on the domain of valid parameters, backed up and continued integrating. It might be worth adding an argument to the function to silence the output of such error messages. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From alopez at imim.es Wed Oct 5 08:16:17 2005 From: alopez at imim.es (LOPEZ GARCIA DE LOMANA, ADRIAN) Date: Wed, 5 Oct 2005 14:16:17 +0200 Subject: [SciPy-user] odeint Message-ID: <66373AD054447F47851FCC5EB49B36113E27F7@basquet.imim.es> I really agree with you, Robert. The result of the simulation is correct, but the amount of the error output is so big that I guess I'm losing performance in terms of the speed of the program. Do you have any suggestion about how to silence these errors? Thanks, Adrián. -----Original Message----- From: scipy-user-bounces at scipy.net on behalf of Robert Kern Sent: Wed 05/10/2005 12:33 To: SciPy Users List Subject: Re: [SciPy-user] odeint LOPEZ GARCIA DE LOMANA, ADRIAN wrote: > Thanks for your answer, I'll check if it works on my machine. I guess you're right, the errors appear when the values of the simulation are close to zero. > > The problem is that I'm not just interested in solving this specific example but it is part of a bigger program where I search for the parameters, run the simulations, and evaluate a score given some data. I'll check if I can integrate your suggestion into the program. > > But, don't you think we are talking about a bug?
Overall, if we want to compare scipy with Octave ... I mean, I definitely prefer SciPy, it's much faster, but I also want it to be reliable, and I don't want to have to care whether the simulation is "too" close to zero. What exactly is the bug? odeint can't know that some of your parameters aren't supposed to be negative and thus automatically avoid them. When you ran your simulation, you got some output warning you that the function was raising exceptions, but the simulation completed, did it not? odeint correctly determined that its guesses overshot the boundary on the domain of valid parameters, backed up and continued integrating. It might be worth adding an argument to the function to silence the output of such error messages. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter _______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user From worknit at gmail.com Wed Oct 5 12:13:59 2005 From: worknit at gmail.com (Jon Savian) Date: Wed, 5 Oct 2005 09:13:59 -0700 Subject: [SciPy-user] Fwd: building scipy from source In-Reply-To: <8d9f49590510042049y78e1850ape1c2c579357f1810@mail.gmail.com> References: <8d9f49590510042049y78e1850ape1c2c579357f1810@mail.gmail.com> Message-ID: <8d9f49590510050913g4fc62b30q4702a927738e628d@mail.gmail.com> ---------- Forwarded message ---------- From: Jon Savian Date: Oct 4, 2005 8:49 PM Subject: building scipy from source To: scipy-user at scipy.net I am trying to build scipy from source. I've installed all the prerequisites using the --prefix=/path/to/installation. In this situation, how would I go about installing scipy? Do I need to specify the directories of the prereqs? How would I do this? Thanks! -------------- next part -------------- An HTML attachment was scrubbed... URL: From agn at noc.soton.ac.uk Wed Oct 5 19:04:41 2005 From: agn at noc.soton.ac.uk (George Nurser) Date: Thu, 6 Oct 2005 00:04:41 +0100 Subject: [SciPy-user] wrong sign of determinants in numarray with lapack. In-Reply-To: <056D32E9B2D93B49B01256A88B3EB218766E8B@icex2.ic.ac.uk> References: <056D32E9B2D93B49B01256A88B3EB218766E8B@icex2.ic.ac.uk> Message-ID: I'm sorry this is more of a numerix (or even a lapack) problem than a direct SciPy problem. I've been trying to build Numeric & numarray on a Linux opteron box, using atlas/lapack. Numeric eventually seemed to work and pass its tests, though I had to compile ATLAS and LAPACK with -fPIC. ATLAS passed sanity tests. numarray-1.3.3 is proving more difficult. I worked out that installing numarray with python setup.py config --use_lapack install --gencode is the (undocumented) way to link to external libraries. No compilation or installation problems, but when I ran testall.test() I found 3 problems to do with determinants (messages at end). The sign of 2n x 2n determinants seems to be wrong. e.g. import numarray.linear_algebra.LinearAlgebra2 as la2 import numarray.numeric as num a = num.array([[1.,2.], [3.,4.]]) print la2.determinant(a) -> gives 2.0 (should be -2) but 2n+1 x 2n+1 seems OK. b = num.array([[1,0,0],[0,1.,-2.], [0,3.,4.]]) print la2.determinant(b) -> gives (correctly) 10.0 Default installation w/o lapack gives the right answers. So I tested lapack by compiling *all* the lapack stuff (including the test suite) instead of just the lapacklib, i.e. I did make all instead of make lapacklib. This gave 'major' (i.e.
order 10^16) errors in cgd.out, csep.out, ded.out, dgd.out, sgd.out, ssep.out,ssvd.out, zgd.out. Now [cdsz]gd.out errors are apparently expected -- see http://web.mit.edu/lapack/www/faq.html#1.24 But errors in ded.out,ssep.out, ssvd.out may be more serious. Are they causing the incorrect-sign determinants? Puzzled. --George Nurser. compilation method --------------------------- ATLAS. make xconfig ./xconfig -F f '-fPIC -fomit-frame-pointer -O -m64 -fno-second-underscore' -F c '-fPIC -fomit-frame-pointer -O -mfpmath=387 -m64' -F m '-fPIC -fomit-frame-pointer -O -mfpmath=387 -m64' -b /usr/lib/libblas.a make install arch=Linux_HAMMER64SSE2 LAPACK. use ATLAS blas; OPTS/NOOPT following http://www.scipy.org/mailinglists/mailman?fn=scipy-user/2005-February/ 004066.html so in make.inc OPTS = -funroll-all-loops -O3 -m64 -fno-second-underscore -fPIC NOOPT = -m64 -fno-second-underscore -fPIC BLASLIB = .....ATLAS/lib/Linux_HAMMER64SSE2/f77blas.a lapack errors ------------------ grep ail * cgd.out: Matrix types (see CDRGEV for details): cgd.out: CGV drivers: 63 out of 1092 tests failed to pass the threshold csep.out: Matrix types (see xDRVST for details): csep.out: CST drivers: 1 out of 11664 tests failed to pass the threshold ded.out: Matrix types (see DDRVES for details): ded.out: DES: 1 out of 3270 tests failed to pass the threshold ded.out: Matrix types (see DDRVSX for details): ded.out: DSX: 1 out of 3500 tests failed to pass the threshold dgd.out: Matrix types (see DDRGEV for details): dgd.out: DGV drivers: 8 out of 1092 tests failed to pass the threshold dgd.out: DXV drivers: 200 out of 5000 tests failed to pass the threshold sgd.out: Matrix types (see SDRGEV for details): sgd.out: SGV drivers: 8 out of 1092 tests failed to pass the threshold sgd.out: SXV drivers: 37 out of 5000 tests failed to pass the threshold ssep.out: Matrix types (see SCHKST for details): ssep.out:Test performed: see SCHKST for details. ssep.out: SST: 2 out of 4662 tests failed to pass the threshold ssep.out: Matrix types (see xDRVST for details): ssep.out: SST drivers: 1 out of 14256 tests failed to pass the threshold ssvd.out: Matrix types (see xCHKBD for details): ssvd.out: SBD: 1 out of 5510 tests failed to pass the threshold zgd.out: Matrix types (see ZDRGEV for details): zgd.out: ZGV drivers: 62 out of 1092 tests failed to pass the threshold zgd.out: ZXV drivers: 24 out of 5000 tests failed to pass the threshold From nwagner at mecha.uni-stuttgart.de Thu Oct 6 05:27:13 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 06 Oct 2005 11:27:13 +0200 Subject: [SciPy-user] Triple integral with Gaussian quadrature Message-ID: <4344EDF1.5060802@mecha.uni-stuttgart.de> Hi all, Is there a way to compute a triple integral via Gaussian quadrature in scipy ? A small example would be appreciated. What integration method is used in tplquad(func, a, b, gfun, hfun, qfun, rfun, args=(), epsabs=1.4899999999999999e-08, epsrel=1.4899999999999999e-08) ? quadrature(func, a, b, args=(), tol=1.4899999999999999e-08, maxiter=50) Nils From rkern at ucsd.edu Thu Oct 6 05:46:34 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 06 Oct 2005 02:46:34 -0700 Subject: [SciPy-user] Triple integral with Gaussian quadrature In-Reply-To: <4344EDF1.5060802@mecha.uni-stuttgart.de> References: <4344EDF1.5060802@mecha.uni-stuttgart.de> Message-ID: <4344F27A.1090804@ucsd.edu> Nils Wagner wrote: > Hi all, > > Is there a way to compute a triple integral via Gaussian quadrature in > scipy ? > A small example would be appreciated. 
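One way to do it is to nest fixed-order Gauss-Legendre quadrature, the same recursive trick tplquad() plays with quad(). This is only a rough sketch: the integrand f, the box limits and the order n below are made up for illustration and may need tuning for a real problem.

import scipy
from scipy import integrate

def f(x, y, z):
    # made-up sample integrand; its exact integral over the unit cube is 1/8
    return x * y * z

def gauss_tplquad(f, ax, bx, ay, by, az, bz, n=10):
    """Triple integral of f over the box [ax,bx]x[ay,by]x[az,bz],
    computed by nesting n-point Gaussian quadrature (fixed_quad)."""
    def fz(z, x, y):
        # fixed_quad hands the integrand an array of nodes, so evaluate pointwise
        return scipy.array([f(x, y, zi) for zi in z])
    def fy(y, x):
        return scipy.array([integrate.fixed_quad(fz, az, bz, args=(x, yi), n=n)[0]
                            for yi in y])
    def fx(x):
        return scipy.array([integrate.fixed_quad(fy, ay, by, args=(xi,), n=n)[0]
                            for xi in x])
    return integrate.fixed_quad(fx, ax, bx, n=n)[0]

print gauss_tplquad(f, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0)   # should print about 0.125

tplquad() itself does essentially the same thing, only with the adaptive quad() routine, as described below.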
> > What integration method is used in > tplquad(func, a, b, gfun, hfun, qfun, rfun, args=(), > epsabs=1.4899999999999999e-08, epsrel=1.4899999999999999e-08) ? It's just quad() run over the function defined by doing the double integral over the remaining dimensions (and the double integral is also implemented with quad() by integrating over the function defined by doing the integral using quad() to finally integrate over the last dimension). quad() comes from QUADPACK. Exactly which QUADPACK function is called depends on the inputs; infinite bounds require one function, weighted integrals require others, explicit breakpoints require yet another. You can figure which by looking at the source code in Lib/integrate/quadpack.py, and then you can read the documentation in Lib/integrate/quadpack/readme to learn about the algorithms used by those functions. In order to do triple integrals with Gaussian quadrature, you can probably do a similar recursive scheme like tplquad() does. You can read the source for details. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From a.h.jaffe at gmail.com Thu Oct 6 10:04:25 2005 From: a.h.jaffe at gmail.com (Andrew Jaffe) Date: Thu, 6 Oct 2005 15:04:25 +0100 Subject: [SciPy-user] Fwd: Scipy_core and numarray References: <17221.11905.227711.3197@imperial.ac.uk> Message-ID: <36BF0393-F2E6-46FA-8A2A-978B1D51535C@gmail.com> Hi All, I've tried to post via the newsgroup interface, but my messages have never appeared, so here's another try!... I'm currently a numarray user from the astrophysics community and I suppose I want to understand a bit more about the relationship between the two projects. Do the two projects exist side-by-side? Will numarray be superseded by scipy_core? I like the speed advantage of scipy_core, but I there may be features of numarray I don't want to give up (or legacy code I need to interfact with). Along the latter lines, I have a more practical question: one numarray capability that I used a lot was the ability to create uninitialized arrays, such as arr = numarray.array(shape=(3,5), type=Float64) Such an initializer doesn't seem to exist in Numeric/scipy.base; is this correct? If not, please consider this a feature request. Obviously, zeros() will work fine, and I'm sure that worrying about the time spent filling with zeros is premature optimization at its worst, but it's still vaguely annoying..." (Any other comments on scipy v numarray, from either camp, would be welcome, too.) Andrew ______________________________________________________________________ Andrew Jaffe a.jaffe at imperial.ac.uk Astrophysics Group +44 207 594-7526 Blackett Laboratory, Room 1013 FAX 7541 Imperial College, Prince Consort Road London SW7 2AZ ENGLAND http://astro.imperial.ac.uk/~jaffe From rkern at ucsd.edu Thu Oct 6 10:12:34 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 06 Oct 2005 07:12:34 -0700 Subject: [SciPy-user] Fwd: Scipy_core and numarray In-Reply-To: <36BF0393-F2E6-46FA-8A2A-978B1D51535C@gmail.com> References: <17221.11905.227711.3197@imperial.ac.uk> <36BF0393-F2E6-46FA-8A2A-978B1D51535C@gmail.com> Message-ID: <434530D2.8000302@ucsd.edu> Andrew Jaffe wrote: > Hi All, > > I've tried to post via the newsgroup interface, but my messages have > never appeared, so here's another try!... > > I'm currently a numarray user from the astrophysics community and I > suppose I want to understand a bit more about the relationship between > the two projects. 
Do the two projects exist side-by-side? Will numarray > be superseded by scipy_core? You can install both packages side-by-side, yes, but the intention of the new scipy_core is to *replace* both Numeric and numarray. > I like the speed advantage of scipy_core, > but I there may be features of numarray I don't want to give up (or > legacy code I need to interfact with). We intend to expand the featureset of scipy_core to include that of numarray. In particular, we have our eye on porting nd_image although I won't speculate on a timeframe for that. > Along the latter lines, I have a more practical question: one > numarray capability that I used a lot was the ability to create > uninitialized arrays, such as > > arr = numarray.array(shape=(3,5), type=Float64) arr = scipy.empty((3,5), scipy.Float64) -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Thu Oct 6 11:22:29 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 06 Oct 2005 09:22:29 -0600 Subject: [SciPy-user] [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> <4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu> <43430B62.9090908@ee.byu.edu> <1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com> Message-ID: <43454135.503@ee.byu.edu> Matthew Brett wrote: >Hi Travis, > >So, can I summarize for my own benefit? > >I think we all agree that software without documentation is not >usable, and therefore that free software requires that the basic >documentation also be free. > >Obviously if the _basic_ documentation is not free, this will >dramatically reduce software uptake. > >And I think we all also agree that there is no problem of any sort >with not-free documentation that expands in a useful way on the basic >documentation. > >So, just to clarify - does Fenando win his bet (was it dollars or >doughnuts)? Are you happy for the community to write and release free >basic documentation for scipy.base? > > Yes, yes. Fernando knows me pretty well in that regard. I'll probably contribute to it with useful selections from the book (once I have the book finished). It really motivates me to produce more (of everything) when I see people actually purchasing the documentation. And I certainly don't want "lack of free documentation" to hamper adoption of scipy core. So, it benefits everybody to have free version available. It would be great if somebody could at least post the pydoc output for the core. That is a good start. I'll add basic C-API calls to free documentation and we'll be ready to go. -Travis From oliphant at ee.byu.edu Thu Oct 6 11:29:02 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 06 Oct 2005 09:29:02 -0600 Subject: [SciPy-user] Fwd: Scipy_core and numarray In-Reply-To: <36BF0393-F2E6-46FA-8A2A-978B1D51535C@gmail.com> References: <17221.11905.227711.3197@imperial.ac.uk> <36BF0393-F2E6-46FA-8A2A-978B1D51535C@gmail.com> Message-ID: <434542BE.3010005@ee.byu.edu> Andrew Jaffe wrote: >Hi All, > >I've tried to post via the newsgroup interface, but my messages have >never appeared, so here's another try!... 
> >I'm currently a numarray user from the astrophysics community and I >suppose I want to understand a bit more about the relationship between >the two projects. Do the two projects exist side-by-side? Will numarray >be superseded by scipy_core? I like the speed advantage of scipy_core, >but I there may be features of numarray I don't want to give up (or >legacy code I need to interfact with). > > I'm interested in the "features" you don't want to give up. SciPy core should have all features of numarray. Also, scipy core, numarray, and Numeric can all live together. If you are using sufficient versions of each, you can exchange data via the array interface. The ONLY reason I started scipy core was to unify Numeric and numarray users. My selling documentation, should not be construed as anything else but trying to help people who can't find time to contribute in other ways to contribute to that goal. If I was "just trying to make money," I would be doing something far different than this... > >Along the latter lines, I have a more practical question: one >numarray capability that I used a lot was the ability to create >uninitialized arrays, such as > > arr = numarray.array(shape=(3,5), type=Float64) > > It's actually there in Numeric (since 23.7 I think). Numeric.empty((3,5),'d') and also in (new) scipy scipy.empty((3,5),scipy.float64) In (new) scipy you can also create new arrays from buffers very easily. -Travis From ryanfedora at comcast.net Thu Oct 6 11:50:25 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Thu, 6 Oct 2005 11:50:25 -0400 Subject: [SciPy-user] convert shelved object back to dictionary Message-ID: Is there an easy way to convert a shelved object back to a dictionary? When I save a dictionary using scipy.io.sve and then load it in a later session, I have an object whose property names are the keys of the dictionary used as an input to shelve. For example, instead of user['name'] I have user.name. I would prefer to have the scripts I write to analyze the saved data have a similar syntax to the ones I used to create it, so I would rather deal with a dictionary after I load the shelved data. Ryan From pajer at iname.com Thu Oct 6 11:56:12 2005 From: pajer at iname.com (Gary) Date: Thu, 06 Oct 2005 11:56:12 -0400 Subject: [SciPy-user] Install of scipy core for current scipy users In-Reply-To: <433CE67F.3060907@ee.byu.edu> References: <433CE67F.3060907@ee.byu.edu> Message-ID: <4345491C.6090309@iname.com> Travis Oliphant wrote: > > If you are a current user of scipy and want to try out the new scipy > core. You can do so. > > However, the default install will change four __init__.py files on > your system (which will probably "break" the rest of scipy). > > Hopefully we will get scipy as a whole to work with the new system > shortly. There has also been talk of putting together a simple > install to fixup the __init__ files so that old scipy and new > scipy_core can live together peacefully (anyone who can help with that > is welcomed). > > > scipy\__init__.py > scipy\linalg\__init__.py > scipy\fftpack\__init__.py > scipy\stats\__init__.py > > are the affected files. > > The ones from scipy can just be copied back, but then you won't get > the functions from new scipy. > > -Travis I anxious to try scipy core, but I don't want my system to stop working, either. So let's see if I understand: Current: WinXP, python 2.3 I would: - upgrade to python 2.4 - rename the four files above (to save them, just in case ?) 
- install scipy core - make the necessary changes to my sources Third-party apps that depend on scipy, and whose sources have not been updated, will break. Numeric and numarray won't be touched, so apps that use only them (and no scipy) won't know the difference (VPython in particular) Anyone know if matplotlib would break? (I've asked on that list, too.) Anyone know which, if any, apps would break? (Mayavi, etc ?) -gary From oliphant at ee.byu.edu Thu Oct 6 12:06:41 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 06 Oct 2005 10:06:41 -0600 Subject: [SciPy-user] convert shelved object back to dictionary In-Reply-To: References: Message-ID: <43454B91.7050701@ee.byu.edu> Ryan Krauss wrote: >Is there an easy way to convert a shelved object back to a dictionary? > When I save a dictionary using scipy.io.sve and then load it in a >later session, I have an object whose property names are the keys of >the dictionary used as an input to shelve. For example, instead of >user['name'] I have user.name. I would prefer to have the scripts I >write to analyze the saved data have a similar syntax to the ones I >used to create it, so I would rather deal with a dictionary after I >load the shelved data. > > Does user.__dict__ give you what you want? So user=user.__dict__ would allow you to do what you want to do? -Travis From ryanfedora at comcast.net Thu Oct 6 12:20:37 2005 From: ryanfedora at comcast.net (Ryan Krauss) Date: Thu, 6 Oct 2005 12:20:37 -0400 Subject: [SciPy-user] convert shelved object back to dictionary In-Reply-To: <43454B91.7050701@ee.byu.edu> References: <43454B91.7050701@ee.byu.edu> Message-ID: Just about. For some reason __dict__ didn't come up as an option when I typed user. and hit tab in Ipython, so I didn't know it was there. The only slight problem with __dict__ is that it includes some properties like __builtins__ that I would like to filter out of the dictionary. Thanks for your help. Ryan On 10/6/05, Travis Oliphant wrote: > Ryan Krauss wrote: > > >Is there an easy way to convert a shelved object back to a dictionary? > > When I save a dictionary using scipy.io.sve and then load it in a > >later session, I have an object whose property names are the keys of > >the dictionary used as an input to shelve. For example, instead of > >user['name'] I have user.name. I would prefer to have the scripts I > >write to analyze the saved data have a similar syntax to the ones I > >used to create it, so I would rather deal with a dictionary after I > >load the shelved data. > > > > > > Does user.__dict__ give you what you want? > > So > > user=user.__dict__ would allow you to do what you want to do? > > > -Travis > > > From R.Springuel at umit.maine.edu Thu Oct 6 12:28:06 2005 From: R.Springuel at umit.maine.edu (R. Padraic Springuel) Date: Thu, 06 Oct 2005 12:28:06 -0400 Subject: [SciPy-user] plotting with gplt In-Reply-To: References: Message-ID: <43455096.2070803@umit.maine.edu> If you're insistent about sticking with scipy's gnuplot, I've found that by far the best way to use the interface, and to unlock every gnuplot option, is to use the "gplt.load" command. You first need to create a file with data you're going to plot, formated exactly as you would format it to bring it into gnuplot and plot it manually. Then write a file that contains the gnuplot commands that you want to run, as you'd type them on the gnuplot command line. Finally, from within scipy execute the command gplt.load('filename.txt'). 
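For example, something along these lines (just a sketch; the file names, data and gnuplot commands are invented for illustration):

import scipy

# write the data file in plain gnuplot column format
f = open('mydata.dat', 'w')
for x, y in [(0, 0), (1, 1), (2, 4), (3, 9)]:
    f.write('%g %g\n' % (x, y))
f.close()

# write the command file exactly as you would type it at the gnuplot prompt
f = open('commands.txt', 'w')
f.write('set title "my data"\n')
f.write('set xrange [0:3]\n')
f.write('set yrange [0:10]\n')
f.write('plot "mydata.dat" using 1:2 title "run 1" with linespoints\n')
f.close()

scipy.gplt.load('commands.txt')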
This will load gnuplot and run the commands as if you had entered them in the gnuplot interface. Its not the most elegant way to interface with gnuplot, but between the simplistic interface of scipy's gplt and this work around, you can create any kind of gnuplot you want without having to learn another package. Generally I'd recommend using the simplistic gplt interface to draft your graphs, then when you know how you want them to look, or need to add features gplt doesn't support (like error bars) you use the method I described above. I've written some routines to automate some of the above described methods, though there is still one more modification that I'd like to make. However, if you're interested, I'll upload it to some web space so that it can be downloaded. -- R. Padraic Springuel Teaching Assistant Department of Physics and Astronomy University of Maine Bennet 214 Office Hours: Monday 10:00 - 11:00 am; Thursday 12:00 - 1:00 pm From Fernando.Perez at colorado.edu Thu Oct 6 12:38:17 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 06 Oct 2005 10:38:17 -0600 Subject: [SciPy-user] convert shelved object back to dictionary In-Reply-To: References: <43454B91.7050701@ee.byu.edu> Message-ID: <434552F9.9050703@colorado.edu> Ryan Krauss wrote: > Just about. For some reason __dict__ didn't come up as an option when > I typed user. and hit tab in Ipython, so I didn't know it was there. > The only slight problem with __dict__ is that it includes some > properties like __builtins__ that I would like to filter out of the > dictionary. You can control whether tab-completion shows the single and double-underscore prefixed attributes of objects. This is the relevant section in the ~/.ipython/ipythonrc file: # (iii) readline_omit__names: normally hitting after a '.' in a name # will complete all attributes of an object, including all the special methods # whose names start with single or double underscores (like __getitem__ or # __class__). # This variable allows you to control this completion behavior: # readline_omit__names 1 -> completion will omit showing any names starting # with two __, but it will still show names starting with one _. # readline_omit__names 2 -> completion will omit all names beginning with one # _ (which obviously means filtering out the double __ ones). # Even when this option is set, you can still see those names by explicitly # typing a _ after the period and hitting : 'name._' will always # complete attribute names starting with '_'. # This option is off by default so that new users see all attributes of any # objects they are dealing with. readline_omit__names 1 See what value you have for that, and set it to something you like. Note that you can still see the underscored names regardless of this setting if you type a single underscore yourself, and THEN hit tab: In [4]: a='hi' In [5]: a. a.capitalize a.expandtabs a.islower a.lower a.rstrip a.title a.center a.find a.isspace a.lstrip a.split [etc...] In [5]: a._ a.__add__ a.__getattribute__ a.__le__ a.__reduce__ a.__class__ a.__getitem__ a.__len__ a.__reduce_ex__ [etc...] At runtime, you can see your whole ipython config state by simply typing %config. 
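As for filtering out the __builtins__-style entries Ryan mentioned, something like this should do it (a sketch only; 'user' stands for the object loaded back from disk in the earlier example):

def as_plain_dict(obj):
    # rebuild a plain dictionary from the loaded object's __dict__,
    # dropping the __special__ entries such as __builtins__ and __doc__
    items = [(k, v) for k, v in obj.__dict__.items()
             if not (k.startswith('__') and k.endswith('__'))]
    return dict(items)

user = as_plain_dict(user)   # now user['name'] works as it did before saving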
Cheers, f From oliphant at ee.byu.edu Thu Oct 6 12:50:09 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 06 Oct 2005 10:50:09 -0600 Subject: [SciPy-user] Install of scipy core for current scipy users In-Reply-To: <4345491C.6090309@iname.com> References: <433CE67F.3060907@ee.byu.edu> <4345491C.6090309@iname.com> Message-ID: <434555C1.4000002@ee.byu.edu> Gary wrote: >Travis Oliphant wrote: > > > >>If you are a current user of scipy and want to try out the new scipy >>core. You can do so. >> >>However, the default install will change four __init__.py files on >>your system (which will probably "break" the rest of scipy). >> >>Hopefully we will get scipy as a whole to work with the new system >>shortly. There has also been talk of putting together a simple >>install to fixup the __init__ files so that old scipy and new >>scipy_core can live together peacefully (anyone who can help with that >>is welcomed). >> >> >>scipy\__init__.py >>scipy\linalg\__init__.py >>scipy\fftpack\__init__.py >>scipy\stats\__init__.py >> >>are the affected files. >> >>The ones from scipy can just be copied back, but then you won't get >>the functions from new scipy. >> >>-Travis >> >> > >I anxious to try scipy core, but I don't want my system to stop working, >either. So let's see if I understand: > >Current: WinXP, python 2.3 > >I would: >- upgrade to python 2.4 >- rename the four files above (to save them, just in case ?) >- install scipy core >- make the necessary changes to my sources > >Third-party apps that depend on scipy, and whose sources have not been >updated, will break. > > What I'm doing is running new scipy on Python 2.4 and old scipy on Python 2.3 --- then my SciPy installs don't conflict with each other at all. >Numeric and numarray won't be touched, so apps that use only them (and >no scipy) won't know the difference (VPython in particular) > > Yes, that's correct. The only problem might be if you have an old (Anyone know if matplotlib would break? (I've asked on that list, too.) > > No, it won't break. But, unless your Numeric is high-enough, it won't understand your scipy core arrays, either. >Anyone know which, if any, apps would break? (Mayavi, etc ?) > > Again, nothing should "break" except scipy itself (for the time being...) Worst case is that the apps won't understand the scipy array (except through the generic Python sequence interface)... -Travis From gpajer at rider.edu Thu Oct 6 12:54:37 2005 From: gpajer at rider.edu (Gary Pajer) Date: Thu, 06 Oct 2005 12:54:37 -0400 Subject: [SciPy-user] Install of scipy core for current scipy users In-Reply-To: <434555C1.4000002@ee.byu.edu> References: <433CE67F.3060907@ee.byu.edu> <4345491C.6090309@iname.com> <434555C1.4000002@ee.byu.edu> Message-ID: <434556CD.8090807@rider.edu> Travis Oliphant wrote: >Gary wrote: > > > >>Travis Oliphant wrote: >> >> >> >> >> >>>If you are a current user of scipy and want to try out the new scipy >>>core. You can do so. >>> >>>However, the default install will change four __init__.py files on >>>your system (which will probably "break" the rest of scipy). >>> >>>Hopefully we will get scipy as a whole to work with the new system >>>shortly. There has also been talk of putting together a simple >>>install to fixup the __init__ files so that old scipy and new >>>scipy_core can live together peacefully (anyone who can help with that >>>is welcomed). >>> >>> >>>scipy\__init__.py >>>scipy\linalg\__init__.py >>>scipy\fftpack\__init__.py >>>scipy\stats\__init__.py >>> >>>are the affected files. 
>>> >>>The ones from scipy can just be copied back, but then you won't get >>>the functions from new scipy. >>> >>>-Travis >>> >>> >>> >>> >>I anxious to try scipy core, but I don't want my system to stop working, >>either. So let's see if I understand: >> >>Current: WinXP, python 2.3 >> >>I would: >>- upgrade to python 2.4 >>- rename the four files above (to save them, just in case ?) >>- install scipy core >>- make the necessary changes to my sources >> >>Third-party apps that depend on scipy, and whose sources have not been >>updated, will break. >> >> >> >> >What I'm doing is running new scipy on Python 2.4 and old scipy on >Python 2.3 --- then my SciPy installs don't conflict with each other at all. > > > Can one do that on WinXP without upsetting the Registry? From aisaac at american.edu Thu Oct 6 12:34:37 2005 From: aisaac at american.edu (Alan G Isaac) Date: Thu, 6 Oct 2005 12:34:37 -0400 Subject: [SciPy-user] Fwd: Scipy_core and numarray In-Reply-To: <434530D2.8000302@ucsd.edu> References: <17221.11905.227711.3197@imperial.ac.uk><36BF0393-F2E6-46FA-8A2A-978B1D51535C@gmail.com><434530D2.8000302@ucsd.edu> Message-ID: >> Along the latter lines, I have a more practical question: one >> numarray capability that I used a lot was the ability to create >> uninitialized arrays, such as >> arr = numarray.array(shape=(3,5), type=Float64) On Thu, 06 Oct 2005, Robert Kern apparently wrote: > arr = scipy.empty((3,5), scipy.Float64) Can you please explain what "unitialized" means in this context and what the advantage is over using zeros? Thank you, Alan Isaac >>> x=scipy.empty((2,2),scipy.Float64) >>> print x [[ 3.82265070e-297 -3.13335570e-294] [ 1.93390236e-309 4.22764505e-307]] >>> From rmuller at sandia.gov Thu Oct 6 13:34:58 2005 From: rmuller at sandia.gov (Richard Muller) Date: Thu, 06 Oct 2005 11:34:58 -0600 Subject: [SciPy-user] Help compiling Scipy on Fedora Core 4 Message-ID: <43456042.2030008@sandia.gov> I get the following error when building Scipy. Can anyone suggest a workaround? building 'scipy.stats.rand' extension compiling C sources gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -O2 -g -pipe -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -m32 -march=i386 -mtune=pentium4 -fasynchronous-unwind-tables -D_GNU_SOURCE -fPIC -fPIC' compile options: '-I/usr/include/python2.4 -c' gcc: Lib/stats/ranlib_all.c In file included from /usr/include/python2.4/Python.h:8, from Lib/stats/ranlib_all.c:4: /usr/include/python2.4/pyconfig.h:835:1: warning: "_POSIX_C_SOURCE" redefined In file included from /usr/include/stdio.h:28, from Lib/stats/ranlib_all.c:2: /usr/include/features.h:150:1: warning: this is the location of the previous definition Lib/stats/ranlib_all.c: In function ?ignlgi?: Lib/stats/ranlib_all.c:2149: error: invalid storage class for function ?inrgcm? Lib/stats/ranlib_all.c: At top level: Lib/stats/ranlib_all.c:2259: warning: conflicting types for ?inrgcm? Lib/stats/ranlib_all.c:2259: error: static declaration of ?inrgcm? follows non-static declaration Lib/stats/ranlib_all.c:2158: error: previous implicit declaration of ?inrgcm? 
was here In file included from /usr/include/python2.4/Python.h:8, from Lib/stats/ranlib_all.c:4: /usr/include/python2.4/pyconfig.h:835:1: warning: "_POSIX_C_SOURCE" redefined In file included from /usr/include/stdio.h:28, from Lib/stats/ranlib_all.c:2: /usr/include/features.h:150:1: warning: this is the location of the previous definition Lib/stats/ranlib_all.c: In function ?ignlgi?: Lib/stats/ranlib_all.c:2149: error: invalid storage class for function ?inrgcm? Lib/stats/ranlib_all.c: At top level: Lib/stats/ranlib_all.c:2259: warning: conflicting types for ?inrgcm? Lib/stats/ranlib_all.c:2259: error: static declaration of ?inrgcm? follows non-static declaration Lib/stats/ranlib_all.c:2158: error: previous implicit declaration of ?inrgcm? was here error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -O2 -g -pipe -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -m32 -march=i386 -mtune=pentium4 -fasynchronous-unwind-tables -D_GNU_SOURCE -fPIC -fPIC -I/usr/include/python2.4 -c Lib/stats/ranlib_all.c -o build/temp.linux-i686-2.4/Lib/stats/ranlib_all.o" failed with exit status 1 From oliphant at ee.byu.edu Thu Oct 6 13:35:13 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 06 Oct 2005 11:35:13 -0600 Subject: [SciPy-user] Fwd: Scipy_core and numarray In-Reply-To: References: <17221.11905.227711.3197@imperial.ac.uk><36BF0393-F2E6-46FA-8A2A-978B1D51535C@gmail.com><434530D2.8000302@ucsd.edu> Message-ID: <43456051.2070409@ee.byu.edu> Alan G Isaac wrote: >>>Along the latter lines, I have a more practical question: one >>>numarray capability that I used a lot was the ability to create >>>uninitialized arrays, such as >>> arr = numarray.array(shape=(3,5), type=Float64) >>> >>> > >On Thu, 06 Oct 2005, Robert Kern apparently wrote: > > >>arr = scipy.empty((3,5), scipy.Float64) >> >> > >Can you please explain what "unitialized" means in this >context and what the advantage is over using zeros? > >Thank you, >Alan Isaac > > It's faster. Often you are going to fill the array with something. So, why waste time filling it with zeros first... -Travis From wjdandreta at att.net Thu Oct 6 13:56:34 2005 From: wjdandreta at att.net (Bill Dandreta) Date: Thu, 06 Oct 2005 13:56:34 -0400 Subject: [SciPy-user] [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <43454135.503@ee.byu.edu> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> <4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu> <43430B62.9090908@ee.byu.edu> <1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com> <43454135.503@ee.byu.edu> Message-ID: <43456552.2090405@att.net> Travis Oliphant wrote: >And I certainly don't want "lack of free documentation" to hamper >adoption of scipy core. So, it benefits everybody to have free version >available. > >It would be great if somebody could at least post the pydoc output for >the core. That is a good start. I'll add basic C-API calls to free >documentation and we'll be ready to go. > > I have a couple of suggestions. I think scipy would benefit from a book like the "Python CookBook". Most scipy users are not programmers and having a lot of sample code would make scipy easier to learn and use. The way the "Python CookBook" was created is the editors requested submission of code snippets doing common programming tasks. The editors selected the best ones and put them in a book. 
The editors of a scipy cookbook would not necessarily have to be the scipy developers but should be scipy experts. You might want to consider distributing your commercial documentation on a CD that includes all the free documentation as well as scipy and dependencies that all work together. Bill From Fernando.Perez at colorado.edu Thu Oct 6 14:07:22 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 06 Oct 2005 12:07:22 -0600 Subject: [SciPy-user] convert shelved object back to dictionary In-Reply-To: References: <43454B91.7050701@ee.byu.edu> <434552F9.9050703@colorado.edu> Message-ID: <434567DA.7050003@colorado.edu> Ryan Krauss wrote: > The weird thing is that I am seeing other __properties__ and methods > but not __dict__ > > In [68]: bd._ > bd.__doc__ bd.__file__ bd.__name__ bd.__class__ > > but > In [69]: bd.__dict__ > > returns the dictionary Travis was talking about. Yup, sorry. That's because tab completion relies on python's builtin dir() function, which filters out a few things first, and I have no control over that: In [1]: class bunch:pass ...: In [2]: a=bunch() In [3]: a.__dict__ Out[3]: {} In [4]: dir (a) Out[4]: ['__doc__', '__module__'] In [5]: a.__ a.__class__ a.__doc__ a.__module__ So this one is out of my control, I'm afraid. Cheers, f From aisaac at american.edu Thu Oct 6 14:25:03 2005 From: aisaac at american.edu (Alan G Isaac) Date: Thu, 6 Oct 2005 14:25:03 -0400 Subject: [SciPy-user] Fwd: Scipy_core and numarray In-Reply-To: <43456051.2070409@ee.byu.edu> References: <17221.11905.227711.3197@imperial.ac.uk><36BF0393-F2E6-46FA-8A2A-978B1D51535C@gmail.com><434530D2.8000302@ucsd.edu> <43456051.2070409@ee.byu.edu> Message-ID: > Alan G Isaac wrote: >> Can you please explain what "unitialized" means in this >> context and what the advantage is over using zeros? On Thu, 06 Oct 2005, Travis Oliphant apparently wrote: > It's faster. Often you are going to fill the array with > something. So, why waste time filling it with zeros > first... I am afraid this is a very basic question. So the is the way this works is that a chunk of memory is reserved without changing its current state? So if I print scipy.empty((50,50),scipy.Float64) I'll see the current state of that memory, interpreted as Float64. Is that right? Thanks, Alan Isaac From stephen.walton at csun.edu Thu Oct 6 14:30:45 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Thu, 06 Oct 2005 11:30:45 -0700 Subject: [SciPy-user] Help compiling Scipy on Fedora Core 4 In-Reply-To: <43456042.2030008@sandia.gov> References: <43456042.2030008@sandia.gov> Message-ID: <43456D55.20005@csun.edu> Richard Muller wrote: >I get the following error when building Scipy. Can anyone suggest a >workaround? > > I've built Scipy on Fedora Core 4 and never saw this error. What version of SciPy are you building? Last CVS release? Latest SVN? Also, are you aware of the earlier discussion, which may have been on SciPy-dev, that gfortran as shipped with FC4 does not compile SciPy correctly? The bug involved is fixed in gcc 4.0.2, currently in Fedora "development." This means it may or may not get shipped as an update to FC4 but will be in FC5. 
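If you just want to see which Fortran compilers are actually on your PATH before starting a build, a quick, entirely generic check is something like this (the compiler names are just the usual suspects on Linux):

import os

# report which Fortran compilers are visible on the PATH
for fc in ('g77', 'f77', 'gfortran', 'ifort'):
    found = os.popen('which %s 2>/dev/null' % fc).read().strip()
    print '%-8s %s' % (fc, found or 'not found')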
From rmuller at sandia.gov Thu Oct 6 14:41:26 2005 From: rmuller at sandia.gov (Richard Muller) Date: Thu, 06 Oct 2005 12:41:26 -0600 Subject: [SciPy-user] Help compiling Scipy on Fedora Core 4 In-Reply-To: <43456D55.20005@csun.edu> References: <43456042.2030008@sandia.gov> <43456D55.20005@csun.edu> Message-ID: <43456FD6.9090807@sandia.gov> Stephen Walton wrote: > Richard Muller wrote: > > >>I get the following error when building Scipy. Can anyone suggest a >>workaround? >> >> > > I've built Scipy on Fedora Core 4 and never saw this error. What > version of SciPy are you building? Last CVS release? Latest SVN? Scipy 0.3.2. Would it help if I used the CVS release? > > Also, are you aware of the earlier discussion, which may have been on > SciPy-dev, that gfortran as shipped with FC4 does not compile SciPy > correctly? The bug involved is fixed in gcc 4.0.2, currently in Fedora > "development." This means it may or may not get shipped as an update to > FC4 but will be in FC5. No, I wasn't aware of the discussion. Thanks for pointing it out. However, my build is dying on a gcc compile, so I don't know if that alone explains it. Maybe I should ask a different question. Right now I really just want scipy_base and scipy.optimize. Can anyone suggest a minimal set of targets I can build, or, more to the point, a maximal set of "ignore_packages" for my setup.py file? Thanks in advance. Rick From oliphant at ee.byu.edu Thu Oct 6 15:35:02 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 06 Oct 2005 13:35:02 -0600 Subject: [SciPy-user] Fwd: Scipy_core and numarray In-Reply-To: References: <17221.11905.227711.3197@imperial.ac.uk><36BF0393-F2E6-46FA-8A2A-978B1D51535C@gmail.com><434530D2.8000302@ucsd.edu> <43456051.2070409@ee.byu.edu> Message-ID: <43457C66.7040908@ee.byu.edu> Alan G Isaac wrote: >>Alan G Isaac wrote: >> >> >I am afraid this is a very basic question. >So the is the way this works is that a chunk of memory is >reserved without changing its current state? So if I >print scipy.empty((50,50),scipy.Float64) >I'll see the current state of that memory, >interpreted as Float64. Is that right > > Yes. Empty just gives you a chunk of memory that your system's malloc (C) function provides. -Travis From Fernando.Perez at colorado.edu Thu Oct 6 18:25:13 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 06 Oct 2005 16:25:13 -0600 Subject: [SciPy-user] [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <43454135.503@ee.byu.edu> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> <4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu> <43430B62.9090908@ee.byu.edu> <1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com> <43454135.503@ee.byu.edu> Message-ID: <4345A449.8040303@colorado.edu> Travis Oliphant wrote: > Yes, yes. Fernando knows me pretty well in that regard. I'll probably > contribute to it with useful selections from the book (once I have the > book finished). It really motivates me to produce more (of > everything) when I see people actually purchasing the documentation. To reiterate before my reputation is tarnished forever: the bet was with John, who is the doughnuts-eating one. Being a man of taste, I eat bagels :) [and I wasn't present when the bet was made] With that critical point out of the way, on to minor details. > It would be great if somebody could at least post the pydoc output for > the core. 
That is a good start. I'll add basic C-API calls to free > documentation and we'll be ready to go. While I fully respect the opinion of those concerned about Travis' decision, I have to say that I am personally not so worried. At scipy'05, when Travis announced the book idea, I asked a very simple question: 'are the docstrings complete?'. His answer was 'yes'. There was a good reason for my asking this question: I personally find that when coding, the most productive and efficient to learn a library is by typing import foo foo. and then foo.bar? or foo.bar?? if I want to see the source (for python-implemented things). I understand that some people prefer to read manuals/books, and there are topics beyond the single-function API that are best discussed in a book format. But given the instantaneous response of the above, I would rather use this than wade through an index in most cases. In fact, I'd like docstrings to provide (simple) examples in them, and one thing on my radar now that ipython is growing GUI/XML legs is to add support for latex in docstrings. I think that a library where all individual methods/functions have full, accurate and example-loaded docstrings, and where class declarations and top-level module docstrings provide the higher-level overview, is extremely usable as is. Especially when you can also keep pydoc running in html server mode (pydoc -p) to browse at your leisure, and trivially dump that HTML info to disk for static reference if needed. If this is done, I think that the space for a 'real book' is better defined, providing less of a detail reference and more of a teaching coverage. I sense that this is Travis' approach, and I personally have no beef with it whatsoever. If he had deliberately stripped or crippled all docstrings from new_scipy to artificially create a need for his book, that would be a different story. But he certainly didn't do such a thing. I do think that there's a lot of room for higher-level books on python for scientific computing. Langtangen's is already out there, and Perry's excellent tutorial is already free for all to use (while slightly focused on astronomy, I find it a very useful document). I guess what I'm trying to say is that the _library_ documentation can be considered to be a (good) set of docstrings. As long as scipy ships with such a thing (which can be improved with the easiest of all patches by users, since no code is touched), I am not so worried that it will be perceived as lacking real docs. Please note that I am not discussing the price/time constraints of his licensing, which is strictly his choice to make. Again, this is just my _personal_ opinion. And while I have great appreciation for Travis, he's not paying me to say any of this :) Regards, f From arnd.baecker at web.de Fri Oct 7 03:11:25 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 7 Oct 2005 09:11:25 +0200 (CEST) Subject: [SciPy-user] [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <4345A449.8040303@colorado.edu> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> <434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu> <1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com> <43454135.503@ee.byu.edu> <4345A449.8040303@colorado.edu> Message-ID: On Thu, 6 Oct 2005, Fernando Perez wrote: [...] > To reiterate before my reputation is tarnished forever: the bet was with John, > who is the doughnuts-eating one. 
Being a man of taste, I eat bagels :) [and I > wasn't present when the bet was made] > > With that critical point out of the way, on to minor details. Thanks for the clarification - this definitively made my day ;-) [...] > In fact, I'd like docstrings to provide (simple) examples in them, +(some large positive number ;-) > and one thing on my radar now that > ipython is growing GUI/XML legs is to add support for latex in docstrings. Are you thinking of restructured text http://docutils.sourceforge.net/ together with the mathhack http://docutils.sourceforge.net/sandbox/cben/rolehack/ or the tools used in the LaTeXwiki, http://mcelrath.org/Notes/LatexWiki ? Independent of this, I see two main points: - how should the examples be added - who should add the examples to the doc-strings? Getting "end-users" to help with this task sounds quite attractive to me. Would it make sense to revive the pynotes idea http://www.scipy.org/documentation/mailman?fn=scipy-dev/2004-November/002536.html which would make it possible to add remarks to any python command? However, I don't know if there is a good solution on how to submit these notes and how they finally should get merged into the doc-string (One possibility would be to combine this in some way with Travis' live-docs). It might be unavoidable that at the end some knowledgable person has to go over suggested additions and incorporate them on a case-by-case basis? Best, Arnd From d.howey at imperial.ac.uk Fri Oct 7 06:45:17 2005 From: d.howey at imperial.ac.uk (Howey, David A) Date: Fri, 7 Oct 2005 11:45:17 +0100 Subject: [SciPy-user] Python on ubuntu Message-ID: <056D32E9B2D93B49B01256A88B3EB218766FBA@icex2.ic.ac.uk> What's the easiest way to install ipython/matplotlib/scipy on ubuntu? Thanks Dave From jdhunter at ace.bsd.uchicago.edu Fri Oct 7 09:32:20 2005 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Fri, 07 Oct 2005 08:32:20 -0500 Subject: [SciPy-user] Python on ubuntu In-Reply-To: <056D32E9B2D93B49B01256A88B3EB218766FBA@icex2.ic.ac.uk> ("Howey, David A"'s message of "Fri, 7 Oct 2005 11:45:17 +0100") References: <056D32E9B2D93B49B01256A88B3EB218766FBA@icex2.ic.ac.uk> Message-ID: <87vf098nbv.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Howey," == Howey, David A writes: Howey,> What's the easiest way to install ipython/matplotlib/scipy Howey,> on ubuntu? Thanks The standard approach: > sudo apt-get install ipython python2.4-scipy For the standard matplotlib package, add these lines to your /etc/apt/sources.list: deb http://anakonda.altervista.org/debian packages/ deb-src http://anakonda.altervista.org/debian sources/ and then run: > sudo apt-get update > sudo apt-get install python-matplotlib python-matplotlib-doc These are all professional debian packages and most of them are a little out of date. If you like to live on the bleeding edge, you can use my poor man's Ubuntu Hoary packages (I've been told they also work for Breezy) which have matplotlib 0.84, ipython-0.6.16.cvs and scipy 3.3.304.4617. You need to enable universe and multiverse in /etc/apt/sources.list and then add deb http://peds-pc311.bsd.uchicago.edu binary/ to that file and do > sudo apt-get update > sudo apt-get install python-matplotlib-jdh scipy-jdh ipython-jdh I can assure you that I did not follow all the debian conventions for making packages, so no promises that these will make you happy. They work for me. The repository location is likely to change in the next couple of weeks, so ping me if you need it at some future date and I can give you the new address. 
JDH From aisaac at american.edu Fri Oct 7 07:44:19 2005 From: aisaac at american.edu (Alan G Isaac) Date: Fri, 7 Oct 2005 07:44:19 -0400 Subject: [SciPy-user] [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au><4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au><434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu><1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com><43454135.503@ee.byu.edu> <4345A449.8040303@colorado.edu> Message-ID: On Fri, 7 Oct 2005, T) Arnd Baecker apparently wrote: > Are you thinking of restructured text > http://docutils.sourceforge.net/ > together with the mathhack > http://docutils.sourceforge.net/sandbox/cben/rolehack/ Perhaps a nice approach is http://docutils.sourceforge.net/sandbox/jensj/latex_math/README.txt fwiw, Alan Isaac From arnd.baecker at web.de Fri Oct 7 10:38:28 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 7 Oct 2005 16:38:28 +0200 (CEST) Subject: [SciPy-user] [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au><4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au><434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu><1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com><43454135.503@ee.byu.edu> <4345A449.8040303@colorado.edu> Message-ID: On Fri, 7 Oct 2005, Alan G Isaac wrote: > On Fri, 7 Oct 2005, T) Arnd Baecker apparently wrote: > > Are you thinking of restructured text > > http://docutils.sourceforge.net/ > > together with the mathhack > > http://docutils.sourceforge.net/sandbox/cben/rolehack/ > > Perhaps a nice approach is > http://docutils.sourceforge.net/sandbox/jensj/latex_math/README.txt That looks very nice! In the end presumably different types of output (.ps/.pdf/html/html with mathml/...) depending on the purpose will be of interest... Best, Arnd From stefan at sun.ac.za Fri Oct 7 11:46:22 2005 From: stefan at sun.ac.za (Stefan van der Walt) Date: Fri, 7 Oct 2005 17:46:22 +0200 Subject: [SciPy-user] Python on ubuntu In-Reply-To: <87vf098nbv.fsf@peds-pc311.bsd.uchicago.edu> References: <056D32E9B2D93B49B01256A88B3EB218766FBA@icex2.ic.ac.uk> <87vf098nbv.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <20051007154622.GA19861@rajah> On Fri, Oct 07, 2005 at 08:32:20AM -0500, John Hunter wrote: > For the standard matplotlib package, add these lines to your > /etc/apt/sources.list: > > deb http://anakonda.altervista.org/debian packages/ > deb-src http://anakonda.altervista.org/debian sources/ > > and then run: > > > > sudo apt-get update > > sudo apt-get install python-matplotlib python-matplotlib-doc > > These are all professional debian packages and most of them are a > little out of date. Ubuntu is not binary compatible with Debian, so I don't know if this is a good idea. > If you like to live on the bleeding edge, you can use my poor man's > Ubuntu Hoary packages (I've been told they also work for Breezy) which > have matplotlib 0.84, ipython-0.6.16.cvs and scipy 3.3.304.4617. You > need to enable universe and multiverse in /etc/apt/sources.list and > then add > > deb http://peds-pc311.bsd.uchicago.edu binary/ I think this route is a lot safer! I just installed Breezy and am impressed -- it already contains scipy-core (don't know which version). Unfortunately no official matplotlib... (yet). 
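A quick way to see which version actually got installed (assuming the package exposes the usual __version__ attribute, as scipy releases of this era do):

import scipy
print scipy.__version__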
Regards St?fan From gerard.vermeulen at grenoble.cnrs.fr Sat Oct 8 08:25:07 2005 From: gerard.vermeulen at grenoble.cnrs.fr (Gerard Vermeulen) Date: Sat, 8 Oct 2005 14:25:07 +0200 Subject: [SciPy-user] newcore-1209: PyArray_New() returns fortran arrays when flags == CARRAY_FLAGS Message-ID: <20051008142507.16c8bc8f.gerard.vermeulen@grenoble.cnrs.fr> --- newcore/scipy/base/src/arrayobject.c.gv 2005-10-08 08:16:28.000000000 +0200 +++ newcore/scipy/base/src/arrayobject.c 2005-10-08 13:35:27.000000000 +0200 @@ -29,7 +29,7 @@ #include "structmember.h" #define _MULTIARRAYMODULE -#include "Numeric3/arrayobject.h" +#include "scipy/base/arrayobject.h" */ /* Helper functions */ @@ -3177,7 +3177,18 @@ self->dimensions = NULL; if (data == NULL) { /* strides is NULL too */ self->flags = DEFAULT_FLAGS; +#ifdef GOTCHA + /* The following test results in returning a fortran array, + if the user (me) or eg PyArray_FromDims() passes in + flags = CARRAY_FLAGS. + */ if (flags) { +#else + /* This fixes at least PyArray_FromDims(), but may break + something else :-) + */ + if (flags & FORTRAN) { +#endif self->flags |= FORTRAN; if (nd > 1) self->flags &= ~CONTIGUOUS; flags = FORTRAN; Gerard From oliphant at ee.byu.edu Sat Oct 8 15:31:45 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 08 Oct 2005 13:31:45 -0600 Subject: [SciPy-user] newcore-1209: PyArray_New() returns fortran arrays when flags == CARRAY_FLAGS In-Reply-To: <20051008142507.16c8bc8f.gerard.vermeulen@grenoble.cnrs.fr> References: <20051008142507.16c8bc8f.gerard.vermeulen@grenoble.cnrs.fr> Message-ID: <43481EA1.3040901@ee.byu.edu> Gerard Vermeulen wrote: >--- newcore/scipy/base/src/arrayobject.c.gv 2005-10-08 08:16:28.000000000 +0200 >+++ newcore/scipy/base/src/arrayobject.c 2005-10-08 13:35:27.000000000 +0200 >@@ -29,7 +29,7 @@ > #include "structmember.h" > > #define _MULTIARRAYMODULE >-#include "Numeric3/arrayobject.h" >+#include "scipy/base/arrayobject.h" > */ > > /* Helper functions */ >@@ -3177,7 +3177,18 @@ > self->dimensions = NULL; > if (data == NULL) { /* strides is NULL too */ > self->flags = DEFAULT_FLAGS; >+#ifdef GOTCHA >+ /* The following test results in returning a fortran array, >+ if the user (me) or eg PyArray_FromDims() passes in >+ flags = CARRAY_FLAGS. >+ */ > > The problem is that when data is NULL, flags should only be used to indicate a fortran-style array. Thus, the bug is in PyArray_FromDims (it should not be using CARRAY_FLAGS). Thanks for finding this. -Travis From morgan.hough at gmail.com Sat Oct 8 17:24:34 2005 From: morgan.hough at gmail.com (Morgan Hough) Date: Sat, 8 Oct 2005 22:24:34 +0100 Subject: [SciPy-user] scipy_core-0.4.1-src.rpm for fedora core 4 Message-ID: <102408b60510081424i384836fbib65f6769e1f7cb7d@mail.gmail.com> I downloaded the scipy_core-0.4.1-src.rpm and rebuilt it on an up-to-date fc4 system. The build initially failed. I installed the new Fedora Extras Atlas rpm (+devel) and it rebuilt fine. I am now wondering how I can check out a matching SciPy. What tag would I use to grab it out from subversion? Is there a schedule for a similar beta release? Thanks in advance for your time. 
Cheers, -Morgan From oliphant at ee.byu.edu Sat Oct 8 18:27:32 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 08 Oct 2005 16:27:32 -0600 Subject: [SciPy-user] scipy_core-0.4.1-src.rpm for fedora core 4 In-Reply-To: <102408b60510081424i384836fbib65f6769e1f7cb7d@mail.gmail.com> References: <102408b60510081424i384836fbib65f6769e1f7cb7d@mail.gmail.com> Message-ID: <434847D4.3090801@ee.byu.edu> Morgan Hough wrote: >I downloaded the scipy_core-0.4.1-src.rpm and rebuilt it on an >up-to-date fc4 system. The build initially failed. I installed the new >Fedora Extras Atlas rpm (+devel) and it rebuilt fine. > >I am now wondering how I can check out a matching SciPy. What tag >would I use to grab it out from subversion? Is there a schedule for a >similar beta release? > > The development version of scipy which will build on top of scipy core is under http://svn.scipy.org/svn/scipy/branches/newscipy You can get it, but it is under heavy development. Right now, a time frame is hard to say, but probably within 3-4 weeks, depending on how many issues we find. I suspect that by the time we make a beta release of new scipy, we will be ready to make a full release of newcore. We are particularly in need of people with access to 64-bit systems for testing... -Travis From rmuller at sandia.gov Sun Oct 9 18:48:23 2005 From: rmuller at sandia.gov (Rick Muller) Date: Sun, 9 Oct 2005 16:48:23 -0600 Subject: [SciPy-user] Undefined symbol __MAIN__: Help needed with Scipy on OS X Message-ID: <1AC85F5D-7A64-4AE0-8AE9-FF2C8B7BB378@sandia.gov> I'm trying to compile scipy 0.3.2 under Macintosh OS X 10.4. I'm seeing the error message when I link: /usr/bin/ld: Undefined symbols: _MAIN__ Any hint as to what's wrong? Rick From Fernando.Perez at colorado.edu Sun Oct 9 19:17:56 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Sun, 09 Oct 2005 17:17:56 -0600 Subject: [SciPy-user] [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <1e2af89e0510091306w5b435c32j3e59cc453a35be30@mail.gmail.com> References: <43418AA8.2040809@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> <4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu> <43430B62.9090908@ee.byu.edu> <1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com> <43454135.503@ee.byu.edu> <4345A449.8040303@colorado.edu> <1e2af89e0510091306w5b435c32j3e59cc453a35be30@mail.gmail.com> Message-ID: <4349A524.40208@colorado.edu> Matthew Brett wrote: >>While I fully respect the opinion of those concerned about Travis' decision, I >>have to say that I am personally not so worried. At scipy'05, when Travis >>announced the book idea, I asked a very simple question: 'are the docstrings >>complete?'. His answer was 'yes'. >> >>There was a good reason for my asking this question: I personally find that >>when coding, the most productive and efficient to learn a library is by typing >> >>import foo >>foo. >> >>and then >> >>foo.bar? > > > Just a tiny addition. I know what you mean, but my experience is of > someone recently trying to learn numarray / Numeric, and I suspect I > am not alone in printing out the documentation and reading a moderate > amount of it before I get started. Call me old-fashioned (it > wouldn't be the first time!), Certainly. As I said, I do think there is room for 'book-style' (as opposed to API reference) books for scipy. Langtangen's ($$$) and Perry's (free) are two such existing offers, and now Travis' comes in as well (and I still believe there is room for more). 
My idea was of top-level (module) docstrings which would provide a reasonable overview, along with single-function ones providing not only API reference but also a few examples. Since pydoc -w can generate permanent HTML for browsing/printing out of any module (and docutils has even more sophisticated facilities), I think this provides acceptable coverage of the basic library. And it does give anyone who wants 'material to read on the bus' (which I often need myself) a reasonable solution, I think. I just wanted to clarify that a docstring-based set of docs is not limited either to interactive usage via ipython, nor to raw API information. It can both be printed for offline use, and can cover enough overview and examples to be genuinely useful standalone. Not a substitute for a full book, but not a crippled tool either, I think. Cheers, f From ryanlists at gmail.com Mon Oct 10 08:36:58 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 10 Oct 2005 08:36:58 -0400 Subject: [SciPy-user] installation on fedora core 4 Message-ID: I have recently installed fedora core 4. What is the easiest way to get scipy installed for python 2.4? I ran fedora core 3 for a while and had to go through a bit of pain to get scipy installed (building ATLAS from source and other things). Is there an easier way to get a fairly recent version? Can I add a repo to yum? Thanks, Ryan From ryanlists at gmail.com Mon Oct 10 09:35:12 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 10 Oct 2005 09:35:12 -0400 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: References: Message-ID: I thought I was getting there. I don't know if this is the easiest or best way, but I installed atlas, blas, and lapack (and the devels) with yum from fedora extras. I also had to install compat-gcc-32-g77 from yum (FC4 comes only with f95 and not f77). Things seem to go o.k. 
for a while and then the install exited with: gcc: build/src/Lib/interpolate/dfitpackmodule.c build/src/Lib/interpolate/dfitpackmodule.c: In function 'f2py_rout_dfitpack_surf it_smth': build/src/Lib/interpolate/dfitpackmodule.c:2446: error: invalid storage class fo r function 'calc_lwrk1' build/src/Lib/interpolate/dfitpackmodule.c:2458: error: invalid storage class fo r function 'calc_lwrk2' build/src/Lib/interpolate/dfitpackmodule.c: In function 'f2py_rout_dfitpack_surf it_lsq': build/src/Lib/interpolate/dfitpackmodule.c:2882: error: invalid storage class fo r function 'calc_lwrk1' build/src/Lib/interpolate/dfitpackmodule.c:2894: error: invalid storage class fo r function 'calc_lwrk2' build/src/Lib/interpolate/dfitpackmodule.c: In function 'f2py_rout_dfitpack_surf it_smth': build/src/Lib/interpolate/dfitpackmodule.c:2446: error: invalid storage class fo r function 'calc_lwrk1' build/src/Lib/interpolate/dfitpackmodule.c:2458: error: invalid storage class fo r function 'calc_lwrk2' build/src/Lib/interpolate/dfitpackmodule.c: In function 'f2py_rout_dfitpack_surf it_lsq': build/src/Lib/interpolate/dfitpackmodule.c:2882: error: invalid storage class fo r function 'calc_lwrk1' build/src/Lib/interpolate/dfitpackmodule.c:2894: error: invalid storage class fo r function 'calc_lwrk2' error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -O2 -g -pipe -Wp,-D_F ORTIFY_SOURCE=2 -fexceptions -m32 -march=i386 -mtune=pentium4 -fasynchronous-unw ind-tables -D_GNU_SOURCE -fPIC -fPIC -Ibuild/src -I/usr/include/python2.4 -c bui ld/src/Lib/interpolate/dfitpackmodule.c -o build/temp.linux-i686-2.4/build/src/L ib/interpolate/dfitpackmodule.o" failed with exit status 1 Any help? (I should mention that I downloaded the source for scipy 0.3.2 from the website and used python setup.py install.) Ryan On 10/10/05, Ryan Krauss wrote: > I have recently installed fedora core 4. What is the easiest way to > get scipy installed for python 2.4? I ran fedora core 3 for a while > and had to go through a bit of pain to get scipy installed (building > ATLAS from source and other things). Is there an easier way to get a > fairly recent version? Can I add a repo to yum? > > Thanks, > > Ryan > From morgan.hough at gmail.com Mon Oct 10 09:48:44 2005 From: morgan.hough at gmail.com (Morgan Hough) Date: Mon, 10 Oct 2005 14:48:44 +0100 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: References: Message-ID: <102408b60510100648s4df6a180vd999993be84af56e@mail.gmail.com> Ryan, There is an ATLAS rpm in Fedora Extras now as well as a src.rpm if you want the best optimization. Once I installed that it was easy to install scipy_core4beta from src.rpm. I haven't tried it yet but I assume building an rpm from SciPy 0.3.2 will work as well. If you find a repo do let me know:) Cheers, -Morgan On 10/10/05, Ryan Krauss wrote: > > I have recently installed fedora core 4. What is the easiest way to > get scipy installed for python 2.4? I ran fedora core 3 for a while > and had to go through a bit of pain to get scipy installed (building > ATLAS from source and other things). Is there an easier way to get a > fairly recent version? Can I add a repo to yum? > > Thanks, > > Ryan > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From rkern at ucsd.edu Mon Oct 10 09:50:12 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 10 Oct 2005 06:50:12 -0700 Subject: [SciPy-user] Undefined symbol __MAIN__: Help needed with Scipy on OS X In-Reply-To: <1AC85F5D-7A64-4AE0-8AE9-FF2C8B7BB378@sandia.gov> References: <1AC85F5D-7A64-4AE0-8AE9-FF2C8B7BB378@sandia.gov> Message-ID: <434A7194.70503@ucsd.edu> Rick Muller wrote: > I'm trying to compile scipy 0.3.2 under Macintosh OS X 10.4. I'm > seeing the error message when I link: > > /usr/bin/ld: Undefined symbols: > _MAIN__ > > > Any hint as to what's wrong? Not a clue! What compilers are you using? I recommend gcc 3.3 and g77 3.4. Which Python are you using? -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From ryanlists at gmail.com Mon Oct 10 10:24:13 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 10 Oct 2005 10:24:13 -0400 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: <102408b60510100648s4df6a180vd999993be84af56e@mail.gmail.com> References: <102408b60510100648s4df6a180vd999993be84af56e@mail.gmail.com> Message-ID: I have ATLAS from Fedora Extras. Where did you get the scipy rpm from? I tried: [root at localhost downloads]# rpm -i scipy_core-0.4.0-1.i586.rpm error: Failed dependencies: python-base >= 2.4 is needed by scipy_core-0.4.0-1.i586 The rpm was from http://rpm.pbone.net/index.php3/stat/4/idpl/2250851/com/scipy_core-0.4.0-1.i586.rpm.html Any other thoughts? I also updated to gcc-4.0.2 based on another email to this list and am still getting the error I posted earlier. Ryan On 10/10/05, Morgan Hough wrote: > Ryan, > > There is an ATLAS rpm in Fedora Extras now as well as a src.rpm if you want > the best optimization. Once I installed that it was easy to install > scipy_core4beta from src.rpm. I haven't tried it yet but I assume building > an rpm from SciPy 0.3.2 will work as well. > > If you find a repo do let me know:) > > Cheers, > > -Morgan > > On 10/10/05, Ryan Krauss wrote: > > > > I have recently installed fedora core 4. What is the easiest way to > > get scipy installed for python 2.4? I ran fedora core 3 for a while > > and had to go through a bit of pain to get scipy installed (building > > ATLAS from source and other things). Is there an easier way to get a > > fairly recent version? Can I add a repo to yum? > > > > Thanks, > > > > Ryan > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > > From ryanlists at gmail.com Mon Oct 10 10:25:07 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 10 Oct 2005 10:25:07 -0400 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: References: <102408b60510100648s4df6a180vd999993be84af56e@mail.gmail.com> Message-ID: I guess I should have added that the thing I don't understand about the error python-base>-2.4 is that I am running 2.4.1. On 10/10/05, Ryan Krauss wrote: > I have ATLAS from Fedora Extras. Where did you get the scipy rpm from? 
> I tried: > [root at localhost downloads]# rpm -i scipy_core-0.4.0-1.i586.rpm > error: Failed dependencies: > python-base >= 2.4 is needed by scipy_core-0.4.0-1.i586 > > The rpm was from > http://rpm.pbone.net/index.php3/stat/4/idpl/2250851/com/scipy_core-0.4.0-1.i586.rpm.html > > Any other thoughts? > > I also updated to gcc-4.0.2 based on another email to this list and am > still getting the error I posted earlier. > > Ryan > > On 10/10/05, Morgan Hough wrote: > > Ryan, > > > > There is an ATLAS rpm in Fedora Extras now as well as a src.rpm if you want > > the best optimization. Once I installed that it was easy to install > > scipy_core4beta from src.rpm. I haven't tried it yet but I assume building > > an rpm from SciPy 0.3.2 will work as well. > > > > If you find a repo do let me know:) > > > > Cheers, > > > > -Morgan > > > > On 10/10/05, Ryan Krauss wrote: > > > > > > I have recently installed fedora core 4. What is the easiest way to > > > get scipy installed for python 2.4? I ran fedora core 3 for a while > > > and had to go through a bit of pain to get scipy installed (building > > > ATLAS from source and other things). Is there an easier way to get a > > > fairly recent version? Can I add a repo to yum? > > > > > > Thanks, > > > > > > Ryan > > > > > > _______________________________________________ > > > SciPy-user mailing list > > > SciPy-user at scipy.net > > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > > > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > > > > From morgan.hough at gmail.com Mon Oct 10 10:37:03 2005 From: morgan.hough at gmail.com (Morgan Hough) Date: Mon, 10 Oct 2005 15:37:03 +0100 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: References: <102408b60510100648s4df6a180vd999993be84af56e@mail.gmail.com> Message-ID: <102408b60510100737kb303ba7k8d14592e8c2c0eed@mail.gmail.com> I am running GCC 4.0.1 although I know the email that you are talking about. I used the src.rpm link that was posted on the list recently. I am actually thinking of uninstalling this though as I have some other packages I want to get running that only require 0.3.2. You can make rpms from the source tarballs. I think the command is something like bdist_rpm. Instructions can be found on the scipy site under instruction for packagers. I haven't tried it yet myself on fc4. On 10/10/05, Ryan Krauss wrote: > > I have ATLAS from Fedora Extras. Where did you get the scipy rpm from? > I tried: > [root at localhost downloads]# rpm -i scipy_core-0.4.0-1.i586.rpm > error: Failed dependencies: > python-base >= 2.4 is needed by scipy_core-0.4.0-1.i586 > > The rpm was from > > http://rpm.pbone.net/index.php3/stat/4/idpl/2250851/com/scipy_core-0.4.0-1.i586.rpm.html > > Any other thoughts? > > I also updated to gcc-4.0.2 based on another email to this list and am > still getting the error I posted earlier. > > Ryan > > On 10/10/05, Morgan Hough wrote: > > Ryan, > > > > There is an ATLAS rpm in Fedora Extras now as well as a src.rpm if you > want > > the best optimization. Once I installed that it was easy to install > > scipy_core4beta from src.rpm. I haven't tried it yet but I assume > building > > an rpm from SciPy 0.3.2 will work as well. > > > > If you find a repo do let me know:) > > > > Cheers, > > > > -Morgan > > > > On 10/10/05, Ryan Krauss wrote: > > > > > > I have recently installed fedora core 4. 
What is the easiest way to > > > get scipy installed for python 2.4? I ran fedora core 3 for a while > > > and had to go through a bit of pain to get scipy installed (building > > > ATLAS from source and other things). Is there an easier way to get a > > > fairly recent version? Can I add a repo to yum? > > > > > > Thanks, > > > > > > Ryan > > > > > > _______________________________________________ > > > SciPy-user mailing list > > > SciPy-user at scipy.net > > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > > > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rmuller at sandia.gov Mon Oct 10 10:42:40 2005 From: rmuller at sandia.gov (Rick Muller) Date: Mon, 10 Oct 2005 08:42:40 -0600 Subject: [SciPy-user] Undefined symbol __MAIN__: Help needed with Scipy on OS X In-Reply-To: <434A7194.70503@ucsd.edu> References: <1AC85F5D-7A64-4AE0-8AE9-FF2C8B7BB378@sandia.gov> <434A7194.70503@ucsd.edu> Message-ID: <0387F5C7-2EE6-4C7D-AD0B-F116668FF945@sandia.gov> (Pearu: copying you since this also fixes the f2py bug I emailed you about. Wasn't f2py's fault at all. Sorry about that.) PROBLEM FIXED. Here's what I did wrong: I'm using gcc 4.0.0 (which is the default under Xcode2, I believe), and g77 3.4.3. I'm using Python 2.3.5 (the "framework" build on OS X). The problem arises from the scipy_distutils package. When I try to install the packages using the stock distutils I get the error message: /usr/bin/ld: can't locate file for: -lcc_dynamic collect2: ld returned 1 exit status during the first link attempt. I did some googling around and found that the line: opt.extend(["-lcc_dynamic","-bundle"]) in the gnufcompiler.py file could be a culprit. WHAT I DID WRONG: I tried commenting out this line. That's a bad idea, and leads to the problem I reported. However, when I just removed the "-lcc_dynamic" part, it worked. Up until the time that I hit what looks like the standard scipy/gcc 4.0.0 bugs. Sigh. But at least the install and the f2py stuff is working now R. On Oct 10, 2005, at 7:50 AM, Robert Kern wrote: > Rick Muller wrote: > >> I'm trying to compile scipy 0.3.2 under Macintosh OS X 10.4. I'm >> seeing the error message when I link: >> >> /usr/bin/ld: Undefined symbols: >> _MAIN__ >> >> >> Any hint as to what's wrong? >> > > Not a clue! What compilers are you using? I recommend gcc 3.3 and g77 > 3.4. Which Python are you using? > > -- > Robert Kern > rkern at ucsd.edu > > "In the fields of hell where the grass grows high > Are the graves of dreams allowed to die." 
> -- Richard Harter > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > From ryanlists at gmail.com Mon Oct 10 10:52:30 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 10 Oct 2005 10:52:30 -0400 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: <102408b60510100737kb303ba7k8d14592e8c2c0eed@mail.gmail.com> References: <102408b60510100648s4df6a180vd999993be84af56e@mail.gmail.com> <102408b60510100737kb303ba7k8d14592e8c2c0eed@mail.gmail.com> Message-ID: I would assume that if I can't install scipy from the source, I also can't build an rpm that would be useful. Ryan On 10/10/05, Morgan Hough wrote: > I am running GCC 4.0.1 although I know the email that you are talking about. > I used the src.rpm link that was posted on the list recently. > > I am actually thinking of uninstalling this though as I have some other > packages I want to get running that only require 0.3.2. You can make rpms > from the source tarballs. I think the command is something like bdist_rpm. > Instructions can be found on the scipy site under instruction for packagers. > I haven't tried it yet myself on fc4. > > > On 10/10/05, Ryan Krauss wrote: > > > > I have ATLAS from Fedora Extras. Where did you get the scipy rpm from? > > I tried: > > [root at localhost downloads]# rpm -i scipy_core-0.4.0-1.i586.rpm > > error: Failed dependencies: > > python-base >= 2.4 is needed by scipy_core-0.4.0-1.i586 > > > > The rpm was from > > > http://rpm.pbone.net/index.php3/stat/4/idpl/2250851/com/scipy_core-0.4.0-1.i586.rpm.html > > > > Any other thoughts? > > > > I also updated to gcc-4.0.2 based on another email to this list and am > > still getting the error I posted earlier. > > > > Ryan > > > > On 10/10/05, Morgan Hough wrote: > > > Ryan, > > > > > > There is an ATLAS rpm in Fedora Extras now as well as a src.rpm if you > want > > > the best optimization. Once I installed that it was easy to install > > > scipy_core4beta from src.rpm. I haven't tried it yet but I assume > building > > > an rpm from SciPy 0.3.2 will work as well. > > > > > > If you find a repo do let me know:) > > > > > > Cheers, > > > > > > -Morgan > > > > > > On 10/10/05, Ryan Krauss wrote: > > > > > > > > I have recently installed fedora core 4. What is the easiest way to > > > > get scipy installed for python 2.4? I ran fedora core 3 for a while > > > > and had to go through a bit of pain to get scipy installed (building > > > > ATLAS from source and other things). Is there an easier way to get a > > > > fairly recent version? Can I add a repo to yum? 
> > > > > > > > Thanks, > > > > > > > > Ryan > > > > > > > > _______________________________________________ > > > > SciPy-user mailing list > > > > SciPy-user at scipy.net > > > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > > > > > > > > > > _______________________________________________ > > > SciPy-user mailing list > > > SciPy-user at scipy.net > > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > > > > > > > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > > From wjdandreta at att.net Mon Oct 10 11:49:36 2005 From: wjdandreta at att.net (Bill Dandreta) Date: Mon, 10 Oct 2005 11:49:36 -0400 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: References: Message-ID: <434A8D90.2060107@att.net> An HTML attachment was scrubbed... URL: From ryanlists at gmail.com Mon Oct 10 12:03:46 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 10 Oct 2005 12:03:46 -0400 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: <434A8D90.2060107@att.net> References: <434A8D90.2060107@att.net> Message-ID: It is good to know it works on Gentoo. That is a distro I would consider. I don't think I need to build ATLAS, blas, and lapack from source now that I have them from yum. I just don't know what to do with the error I am getting trying to build scipy from source. Ryan On 10/10/05, Bill Dandreta wrote: > Ryan Krauss wrote: > I have recently installed fedora core 4. What is the easiest way to > get scipy installed for python 2.4? I ran fedora core 3 for a while > and had to go through a bit of pain to get scipy installed (building > ATLAS from source and other things). Is there an easier way to get a > fairly recent version? Can I add a repo to yum? > > Don't be afraid to compile from source. blas and atlas take a long time > especially on a slow machine, just let them run overnight. > > I use Gentoo, not FC4 but this is what I installed for scipy. > > gnuplot gnuplot-py > numeric fftw wxGTK wxpython > blas-atlas > lapack-atlas > atlas > scipy > matplotlib > > I don't know if the exact order is important. > > I had to 'turn off' LDFLAGS to get scipy to compile. > > I had scipy 0.3.2 installed with Python 2.3.5. Gentoo just upgraded Python > to 2.4.1. After the upgrade, I ran the python updater and it recompiled all > my Python packages, including scipy. I've only used it once since then but > it worked OK. > > If you want to create binary packages (rpm,debian or slackware), > checkinstall is a handy little program that monitors the > > make install > > process and creates a binary for you. Really great for uninstalling. > > > Bill > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > > From ryanlists at gmail.com Mon Oct 10 12:09:17 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 10 Oct 2005 12:09:17 -0400 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: <434A8D90.2060107@att.net> References: <434A8D90.2060107@att.net> Message-ID: Do I need to speficy a version of gcc other than 4.0? Can I pass an option like that to python setup.py install? Has anyone out there built scipy from source with gcc 4.0? 
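For what it is worth, plain distutils does pick the C compiler up from the environment, so a rough sketch of pointing the build at a different gcc would look like the lines below (gcc32 is only a guess at the name of the compat compiler binary, and whether this is enough for a clean scipy build on FC4 is another question; the Fortran compiler is chosen separately -- scipy 0.3.x lists those options under "python setup.py build_flib --help"):

    CC=gcc32 python setup.py build
    python setup.py install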
Ryan On 10/10/05, Bill Dandreta wrote: > Ryan Krauss wrote: > I have recently installed fedora core 4. What is the easiest way to > get scipy installed for python 2.4? I ran fedora core 3 for a while > and had to go through a bit of pain to get scipy installed (building > ATLAS from source and other things). Is there an easier way to get a > fairly recent version? Can I add a repo to yum? > > Don't be afraid to compile from source. blas and atlas take a long time > especially on a slow machine, just let them run overnight. > > I use Gentoo, not FC4 but this is what I installed for scipy. > > gnuplot gnuplot-py > numeric fftw wxGTK wxpython > blas-atlas > lapack-atlas > atlas > scipy > matplotlib > > I don't know if the exact order is important. > > I had to 'turn off' LDFLAGS to get scipy to compile. > > I had scipy 0.3.2 installed with Python 2.3.5. Gentoo just upgraded Python > to 2.4.1. After the upgrade, I ran the python updater and it recompiled all > my Python packages, including scipy. I've only used it once since then but > it worked OK. > > If you want to create binary packages (rpm,debian or slackware), > checkinstall is a handy little program that monitors the > > make install > > process and creates a binary for you. Really great for uninstalling. > > > Bill > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > > From stephen.walton at csun.edu Mon Oct 10 12:31:21 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 10 Oct 2005 09:31:21 -0700 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: References: <434A8D90.2060107@att.net> Message-ID: <434A9759.1080609@csun.edu> Ryan Krauss wrote: >Do I need to speficy a version of gcc other than 4.0? > This thread came up in scipy-dev, and I've posted here before, but to clarify again: In an all-default configuration, it is not presently possible to build a working scipy on Fedora Core 4, either old or new. There is a bug in gfortran which causes it to mis-compile the i1mach, r1mach, and d1mach functions which are at the bottom of much of the Fortran code in scipy. If you use g77 instead by using the appropriate compatibility RPMs (compat-gcc-32-g77 and compat-libf2c-32), you'll find that gcc4 cannot link to libg2c because it isn't in a directory which gcc4 searches. I haven't found a way of forcing a scipy build to use gcc 3.2.3 (the compatibility version which is part of FC4). The only feasible workaround I've found is to use g77 on FC4 to build scipy (and Numeric and numarray), and create symbolic links to libg2c.a and libg2c.so so that gcc4 can find them. That is, do the following: ln -s /usr/lib/gcc-lib/i386-redhat-linux/3.2.3/libg2c.so /usr/lib/gcc/i386-redhat-linux/4.0.1 ln -s /usr/lib/gcc-lib/i386-redhat-linux/3.2.3/libg2c.a /usr/lib/gcc/i386-redhat-linux/4.0.1 This is not supported, use at your own risk, your mileage may vary, results are not guaranteed, pressure RedHat to release a gcc 4.0.2 update for FC4 in which the gfortran bug is fixed. Personally, I'm using Absoft Fortran on gcc4; if you're an educational institution, Intel will let you download and use their Fortran compiler for free. Both seem to work fine with newscipy and FC4. 
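Pulling that workaround together into one shell sketch (package names and paths exactly as in the message above; the 3.2.3 and 4.0.1 directories will drift with compiler updates, so check what your box really has):

    yum install compat-gcc-32-g77 compat-libf2c-32   # as root
    ln -s /usr/lib/gcc-lib/i386-redhat-linux/3.2.3/libg2c.so /usr/lib/gcc/i386-redhat-linux/4.0.1
    ln -s /usr/lib/gcc-lib/i386-redhat-linux/3.2.3/libg2c.a /usr/lib/gcc/i386-redhat-linux/4.0.1
    # then build scipy with g77 doing the Fortran; "python setup.py build_flib --help"
    # (scipy 0.3.x) shows the compiler-selection options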
From ryanlists at gmail.com Mon Oct 10 12:40:23 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 10 Oct 2005 12:40:23 -0400 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: <434A9759.1080609@csun.edu> References: <434A8D90.2060107@att.net> <434A9759.1080609@csun.edu> Message-ID: Thanks for this clarification Stephen. It saves me a lot of pain continuing to try and do what cannot currently be done. As another potential work around, I have scipy installed and built from source on a fedora core installation (with python 2.4 - I am almost sure). Can I in any way use it to get scipy working in fedora core 4? Ryan On 10/10/05, Stephen Walton wrote: > Ryan Krauss wrote: > > >Do I need to speficy a version of gcc other than 4.0? > > > This thread came up in scipy-dev, and I've posted here before, but to > clarify again: > > In an all-default configuration, it is not presently possible to build a > working scipy on Fedora Core 4, either old or new. There is a bug in > gfortran which causes it to mis-compile the i1mach, r1mach, and d1mach > functions which are at the bottom of much of the Fortran code in scipy. > If you use g77 instead by using the appropriate compatibility RPMs > (compat-gcc-32-g77 and compat-libf2c-32), you'll find that gcc4 cannot > link to libg2c because it isn't in a directory which gcc4 searches. I > haven't found a way of forcing a scipy build to use gcc 3.2.3 (the > compatibility version which is part of FC4). > > The only feasible workaround I've found is to use g77 on FC4 to build > scipy (and Numeric and numarray), and create symbolic links to libg2c.a > and libg2c.so so that gcc4 can find them. That is, do the following: > > ln -s /usr/lib/gcc-lib/i386-redhat-linux/3.2.3/libg2c.so > /usr/lib/gcc/i386-redhat-linux/4.0.1 > ln -s /usr/lib/gcc-lib/i386-redhat-linux/3.2.3/libg2c.a > /usr/lib/gcc/i386-redhat-linux/4.0.1 > > This is not supported, use at your own risk, your mileage may vary, > results are not guaranteed, pressure RedHat to release a gcc 4.0.2 > update for FC4 in which the gfortran bug is fixed. Personally, I'm > using Absoft Fortran on gcc4; if you're an educational institution, > Intel will let you download and use their Fortran compiler for free. > Both seem to work fine with newscipy and FC4. > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From ryanlists at gmail.com Mon Oct 10 12:57:38 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 10 Oct 2005 12:57:38 -0400 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: References: <434A8D90.2060107@att.net> <434A9759.1080609@csun.edu> Message-ID: I left out the crucial 3 in my previous post: I have scipy installed and built from source on a fedora core >>3<< installation On 10/10/05, Ryan Krauss wrote: > Thanks for this clarification Stephen. It saves me a lot of pain > continuing to try and do what cannot currently be done. > > As another potential work around, I have scipy installed and built > from source on a fedora core installation (with python 2.4 - I am > almost sure). Can I in any way use it to get scipy working in fedora > core 4? > > Ryan > > On 10/10/05, Stephen Walton wrote: > > Ryan Krauss wrote: > > > > >Do I need to speficy a version of gcc other than 4.0? 
> > > > > This thread came up in scipy-dev, and I've posted here before, but to > > clarify again: > > > > In an all-default configuration, it is not presently possible to build a > > working scipy on Fedora Core 4, either old or new. There is a bug in > > gfortran which causes it to mis-compile the i1mach, r1mach, and d1mach > > functions which are at the bottom of much of the Fortran code in scipy. > > If you use g77 instead by using the appropriate compatibility RPMs > > (compat-gcc-32-g77 and compat-libf2c-32), you'll find that gcc4 cannot > > link to libg2c because it isn't in a directory which gcc4 searches. I > > haven't found a way of forcing a scipy build to use gcc 3.2.3 (the > > compatibility version which is part of FC4). > > > > The only feasible workaround I've found is to use g77 on FC4 to build > > scipy (and Numeric and numarray), and create symbolic links to libg2c.a > > and libg2c.so so that gcc4 can find them. That is, do the following: > > > > ln -s /usr/lib/gcc-lib/i386-redhat-linux/3.2.3/libg2c.so > > /usr/lib/gcc/i386-redhat-linux/4.0.1 > > ln -s /usr/lib/gcc-lib/i386-redhat-linux/3.2.3/libg2c.a > > /usr/lib/gcc/i386-redhat-linux/4.0.1 > > > > This is not supported, use at your own risk, your mileage may vary, > > results are not guaranteed, pressure RedHat to release a gcc 4.0.2 > > update for FC4 in which the gfortran bug is fixed. Personally, I'm > > using Absoft Fortran on gcc4; if you're an educational institution, > > Intel will let you download and use their Fortran compiler for free. > > Both seem to work fine with newscipy and FC4. > > > > > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > > From stephen.walton at csun.edu Mon Oct 10 13:34:20 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 10 Oct 2005 10:34:20 -0700 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: References: <434A8D90.2060107@att.net> <434A9759.1080609@csun.edu> Message-ID: <434AA61C.7010508@csun.edu> Ryan Krauss wrote: >I left out the crucial 3 in my previous post: >I have scipy installed and built from source on a fedora core >>3<< >installation > And I think the answer to your question (will it work on FC4) is "no" due to both the Python versions (2.3 and 2.4) and the different versions of glibc. 
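Both ties are easy to see on the old box, if you want to convince yourself (paths are examples only -- point them at wherever scipy actually lives):

    # the site-packages path records which Python the build belongs to
    ls /usr/lib/python2.3/site-packages/scipy/
    # ldd shows the glibc (and libg2c) the compiled extensions are linked against
    ldd /usr/lib/python2.3/site-packages/scipy/linalg/flapack.so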
From ryanlists at gmail.com Mon Oct 10 14:12:28 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 10 Oct 2005 14:12:28 -0400 Subject: [SciPy-user] installation on fedora core 4 In-Reply-To: <434AA61C.7010508@csun.edu> References: <434A8D90.2060107@att.net> <434A9759.1080609@csun.edu> <434AA61C.7010508@csun.edu> Message-ID: Even after Stephen's symbolic link I am still getting: error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m32 -march=i386 -mtune=pentium4 -fasynchronous-unwind-tables -D_GNU_SOURCE -fPIC -fPIC -Ibuild/src -I/usr/include/python2.4 -c build/src/Lib/interpolate/dfitpackmodule.c -o build/temp.linux-i686-2.4/build/src/Lib/interpolate/dfitpackmodule.o" failed with exit status 1 The message starts with: gcc: build/src/Lib/interpolate/dfitpackmodule.c build/src/Lib/interpolate/dfitpackmodule.c: In function 'f2py_rout_dfitpack_surfit_smth': build/src/Lib/interpolate/dfitpackmodule.c:2446: error: invalid storage class for function 'calc_lwrk1' build/src/Lib/interpolate/dfitpackmodule.c:2458: error: invalid storage class for function 'calc_lwrk2' build/src/Lib/interpolate/dfitpackmodule.c: In function 'f2py_rout_dfitpack_surfit_lsq': build/src/Lib/interpolate/dfitpackmodule.c:2882: error: invalid storage class for function 'calc_lwrk1' build/src/Lib/interpolate/dfitpackmodule.c:2894: error: invalid storage class for function 'calc_lwrk2' build/src/Lib/interpolate/dfitpackmodule.c: In function 'f2py_rout_dfitpack_surfit_smth': build/src/Lib/interpolate/dfitpackmodule.c:2446: error: invalid storage class for function 'calc_lwrk1' build/src/Lib/interpolate/dfitpackmodule.c:2458: error: invalid storage class for function 'calc_lwrk2' Any other suggestions? Ryan On 10/10/05, Stephen Walton wrote: > Ryan Krauss wrote: > > >I left out the crucial 3 in my previous post: > >I have scipy installed and built from source on a fedora core >>3<< > >installation > > > And I think the answer to your question (will it work on FC4) is "no" > due to both the Python versions (2.3 and 2.4) and the different versions > of glibc. > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From gerard.vermeulen at grenoble.cnrs.fr Mon Oct 10 15:02:59 2005 From: gerard.vermeulen at grenoble.cnrs.fr (Gerard Vermeulen) Date: Mon, 10 Oct 2005 21:02:59 +0200 Subject: [SciPy-user] install location of the newcore header files Message-ID: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> I finally discovered where the header files get installed when reading CAPI.txt; on the scip-dev list it is argued that the Python include directory is not a standard place for headers. This amazes me, because the first 100-150 lines of distutils/command/install.py spell out the default install directories for different type of files; and header files are supposed to be installed in the Python include directory. Deviating from the standard is a bad idea, because it means that the programmer who wants to read the headers has to remember that they are in a strange place (much more annoying than fixing a setup.py script or other installation tool). 
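For concreteness, the standard mechanism being referred to is just the headers argument of distutils' setup(); a minimal sketch, with the package name and header file made up:

    # setup.py -- 'mypkg' and mypkg_api.h are hypothetical
    from distutils.core import setup

    setup(name='mypkg',
          version='0.1',
          headers=['src/mypkg_api.h'])

    # "python setup.py install" then runs install_headers, which by default drops
    # the file into <prefix>/include/pythonX.Y/mypkg/, i.e. inside the same include
    # tree that distutils.sysconfig.get_python_inc() reports and that holds Python.h.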
I understand that some people cannot write to the Python include directory, but distutils can take care of that in principle, see http://python.org/doc/2.4.2/inst/search-path.html The same technique works (in principle) also for header files (admittedly, there are some quirks). IMO, it is better to fix the quirks to improve distutils instead of breaking the distutils policies which are meant to facilitate life. Regards -- Gerard PS: I like to point out that the Python Imaging Library, PyGame and sip (the interface generator for PyQt) also put their headers in the standard place. wxPython is part of a heavy-weight package which is not using distutils and installs everything in its own directory (I dislike the big-package policy, but that is life). From oliphant at ee.byu.edu Mon Oct 10 15:21:34 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 10 Oct 2005 13:21:34 -0600 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> Message-ID: <434ABF3E.80803@ee.byu.edu> Gerard Vermeulen wrote: >I finally discovered where the header files get installed when reading CAPI.txt; >on the scip-dev list it is argued that the Python include directory is not a >standard place for headers. > >This amazes me, because the first 100-150 lines of distutils/command/install.py >spell out the default install directories for different type of files; and header >files are supposed to be installed in the Python include directory. > > > Thanks for this feedback. I'm not a big fan of putting the headers in a different place, myself. So, perhaps a better solution can be arrived at that still let's people who can't install to the standard header directories use the sytem. Knowing where the headers are installed is definitely important for developers. -Travis From rkern at ucsd.edu Mon Oct 10 15:24:41 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 10 Oct 2005 12:24:41 -0700 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> Message-ID: <434ABFF9.3020209@ucsd.edu> Gerard Vermeulen wrote: > I finally discovered where the header files get installed when reading CAPI.txt; > on the scip-dev list it is argued that the Python include directory is not a > standard place for headers. No one is arguing that it's not standard. I'm arguing that it's a *bad* place because frequently people can't install to it. > This amazes me, because the first 100-150 lines of distutils/command/install.py > spell out the default install directories for different type of files; and header > files are supposed to be installed in the Python include directory. > > Deviating from the standard is a bad idea, because it means that the programmer > who wants to read the headers has to remember that they are in a strange place > (much more annoying than fixing a setup.py script or other installation tool). In my experience, I usually go header-diving after something fails to compile. The scipy include directory is always explicitly listed in the compile command. > I understand that some people cannot write to the Python include directory, > but distutils can take care of that in principle, see > http://python.org/doc/2.4.2/inst/search-path.html Yes, that solves installing the headers to a new location. 
But people then have to modify every setup.py script for modules that use scipy headers to point to the new location. > The same technique works (in principle) also for header files (admittedly, there > are some quirks). IMO, it is better to fix the quirks to improve distutils > instead of breaking the distutils policies which are meant to facilitate life. The policy to install headers to the main Python include directory was created at a time when it wasn't easy to install data into the package itself. That distutils quirk got fixed in Python 2.4 (and scipy.distutils also does this easily for Python <2.4). I imagine that if package data had been easy at the beginning of distutils's life, we would have been using this solution all along. The install_headers policy has been a thorn in our side ever since it was created. There's a better way, now. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Mon Oct 10 17:35:12 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 10 Oct 2005 15:35:12 -0600 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <434ABFF9.3020209@ucsd.edu> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABFF9.3020209@ucsd.edu> Message-ID: <434ADE90.9090907@ee.byu.edu> Robert Kern wrote: >Gerard Vermeulen wrote: > > >>I finally discovered where the header files get installed when reading CAPI.txt; >>on the scip-dev list it is argued that the Python include directory is not a >>standard place for headers. >> >> > >No one is arguing that it's not standard. I'm arguing that it's a *bad* >place because frequently people can't install to it. > > > >>I understand that some people cannot write to the Python include directory, >>but distutils can take care of that in principle, see >>http://python.org/doc/2.4.2/inst/search-path.html >> >> > >Yes, that solves installing the headers to a new location. But people >then have to modify every setup.py script for modules that use scipy >headers to point to the new location. > > So, why not have the get_headers_dir() function in scipy.distutils, but just install to the "standard" place by default and only install to the non-standard place if needed. This seems like it would make everybody happy. -Travis From w.northcott at unsw.edu.au Mon Oct 10 21:57:17 2005 From: w.northcott at unsw.edu.au (Bill Northcott) Date: Tue, 11 Oct 2005 11:57:17 +1000 Subject: [SciPy-user] Undefined symbol __MAIN__: Help needed with Scipy on OS X In-Reply-To: References: Message-ID: <7AC62937-26EB-4358-B6F5-80F8799712A3@unsw.edu.au> On 11/10/2005, at 1:40 AM, Rick Muller wrote: > > > PROBLEM FIXED. Here's what I did wrong: > > I'm using gcc 4.0.0 (which is the default under Xcode2, I believe), > and g77 3.4.3. I'm using Python 2.3.5 (the "framework" build on OS X). > > The problem arises from the scipy_distutils package. When I try to > install the packages using the stock distutils I get the error > message: > > /usr/bin/ld: can't locate file for: -lcc_dynamic > collect2: ld returned 1 exit status > blah blah > I tried commenting out this line. That's a bad idea, and leads to the > problem I reported. However, when I just removed the "-lcc_dynamic" > part, it worked. Up until the time that I hit what looks like the > standard scipy/gcc 4.0.0 bugs. Sigh. But at least the install and the > f2py stuff is working now > This is not a bug. It is a documentation issue. 
To say for the n th time, g77 is NOT part of gcc-4.x compilers and is NOT compatible with them. This is not a bug. It is intentional. The Fortran compiler in gcc-4.x is gfortran, which unfortunately is not quite ready for production use. So if you want use g77 on MacOS X just make gcc-3.3 the default compiler by running 'sudo gcc_select 3.3' before starting the build. Then all these problems go away. Bill Northcott From mpf at cs.ubc.ca Tue Oct 11 03:03:23 2005 From: mpf at cs.ubc.ca (Michael P Friedlander) Date: Tue, 11 Oct 2005 00:03:23 -0700 Subject: [SciPy-user] scipy_distutils and f2py_options Message-ID: <67ED3F3A-E85A-43F8-A17F-F001E5635FDF@cs.ubc.ca> Hi Folks. I see how the Extension class argument f2py_options can be used to pass extra arguments to the f2py module. But it seems that only the first set of optional arguments to f2py [] [[[only:]||[skip:]] \ ] \ [: ...] can be set using f2py_options. Is there an Extension class argument that will let me set the optional arguments that go *after* ? I would like to specify a list of subroutines that should be "skipped". What I'm really after is to use scipy_distutils to compile and link a Fortran package and a hand-tailored extension module for that package. We'd like to use scipy_distutils to do all the compilation and linking, but not create the module. Thanks for the help! --M From martigan at gmail.com Tue Oct 11 04:02:18 2005 From: martigan at gmail.com (Mauro Cherubini) Date: Tue, 11 Oct 2005 10:02:18 +0200 Subject: [SciPy-user] [newbie] standardize a matrix Message-ID: <33501A19-2E31-49EC-AFD8-38206EBE118C@gmail.com> Dear All, sorry for the very basic question I am posing, but I could not find an exhaustive answer elsewhere ... I have a symmetrical bi-dimensional array that contains distances between a certain number of points. The matrix diagoal are all zeros because of course the distance of a point from self is zero. I would love to standardize the matrix using one or all of these methods: a) divide each attribute distance value of a point by the maximum observed absolute distance value. This should restrict the values to lie between -1 and 1. Often the values are all positive, and thus, all transformed values will lie between 0 and 1. b) for each distance value subtract off the mean of that distances and then divide by the distances' standard deviation. If the distances are normally distributed then most distance values will lie between -1 and 1. c) for each distance value subtract off the mean of the distances and divide by the distances absolute deviation. Typically most distance values will lie between -1 and 1. I looked in the SciPy documentation and what I understood is that I can use an 'ufunc' to define one of these methods. Unfortunately my knowledge of Python, Numeric and SciPy is very low, so I could not figure out how. There are very few examples in the documentation at the moment. Can anyone point me to a possible implementation or where to look up? 
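Since the three recipes are only spelled out in words, here is a rough sketch on a small made-up matrix, Numeric/scipy-0.3 style (the standard deviation and absolute deviation are written out by hand so that only scipy.stats.mean is assumed; the zero diagonal is left in the statistics, which you may or may not want):

    from scipy import *              # array, absolute, sqrt, ...
    from scipy.stats import mean

    d = array([[0.0, 1.0, 4.0],
               [1.0, 0.0, 2.0],
               [4.0, 2.0, 0.0]])
    flat = d.flat

    a = d / max(absolute(d).flat)               # (a) scale by the largest |distance|
    sd = sqrt(mean((flat - mean(flat))**2))     # standard deviation of all entries
    b = (d - mean(flat)) / sd                   # (b) subtract mean, divide by std. dev.
    mad = mean(absolute(flat - mean(flat)))     # mean absolute deviation
    c = (d - mean(flat)) / mad                  # (c) subtract mean, divide by abs. deviation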
Thanks a lot in advance Mauro -- web: http://craft.epfl.ch -- blog: http://www.i-cherubini.it/mauro/blog/ From noel.oboyle2 at mail.dcu.ie Tue Oct 11 04:37:15 2005 From: noel.oboyle2 at mail.dcu.ie (Noel O'Boyle) Date: Tue, 11 Oct 2005 09:37:15 +0100 Subject: [SciPy-user] [newbie] standardize a matrix In-Reply-To: <33501A19-2E31-49EC-AFD8-38206EBE118C@gmail.com> References: <33501A19-2E31-49EC-AFD8-38206EBE118C@gmail.com> Message-ID: <1129019835.30702.14.camel@sandwi.ch.cam.ac.uk> On Tue, 2005-10-11 at 10:02 +0200, Mauro Cherubini wrote: > I have a symmetrical bi-dimensional array that contains distances > between a certain number of points. The matrix diagoal are all > zeros > because of course the distance of a point from self is zero. A distance matrix - PyChem is developing a number of methods to deal with distance matrices (and multivariate analysis in general). At the moment it is actively under development but I recommend you to forward your question on to the pychem users mailing list (www.sf.net/projects/pychem). > I would love to standardize the matrix using one or all of these > methods: > > a) divide each attribute distance value of a point by the maximum > observed absolute distance value. This should restrict the values to > lie between -1 and 1. Often the values are all positive, and thus, > all transformed values will lie between 0 and 1. (Shouldn't all distances be positive?) >>> a = array([[0.1,0.2],[0.5,0.6]] ) >>> print a.flat [0.1,0.2,0.5,0.6] >>> print max(a.flat) 0.6 >>> divide(a,0.6,a) >>> print a array([[ 0.16666667, 0.33333333], [ 0.83333333, 1. ]]) > b) for each distance value subtract off the mean of that distances > and then divide by the distances' standard deviation. If the > distances are normally distributed then most distance values will > lie > between -1 and 1. You need to decide whether you want to include the diagonal elements. If so, then use mean(a.flat) and so on. > c) for each distance value subtract off the mean of the distances > and > divide by the distances absolute deviation. Typically most distance > values will lie between -1 and 1. > > I looked in the SciPy documentation and what I understood is that I > can use an 'ufunc' to define one of these methods. Unfortunately my > knowledge of Python, Numeric and SciPy is very low, so I could not > figure out how. There are very few examples in the documentation at > the moment. > Can anyone point me to a possible implementation or where to look up? > Thanks a lot in advance From alexander.borghgraef.rma at gmail.com Tue Oct 11 05:10:39 2005 From: alexander.borghgraef.rma at gmail.com (Alexander Borghgraef) Date: Tue, 11 Oct 2005 11:10:39 +0200 Subject: [SciPy-user] Problem while importing linalg: undefined symbol: s_wsfe Message-ID: <9e8c52a20510110210yaeaade1p9a754a9d0709da86@mail.gmail.com> Hi all, So far I thought I had a working scipy installation, until I tried out "import linalg". I then got the following error message: >>> import linalg Traceback (most recent call last): File "", line 1, in ? File "/homes/morlet/aborghgr/local/lib/python2.3/site-packages/scipy/linalg/__init__.py", line 8, in ? from basic import * File "/homes/morlet/aborghgr/local/lib/python2.3/site-packages/scipy/linalg/basic.py", line 12, in ? from lapack import get_lapack_funcs File "/homes/morlet/aborghgr/local/lib/python2.3/site-packages/scipy/linalg/lapack.py", line 15, in ? 
import flapack ImportError: /homes/morlet/aborghgr/local/lib/python2.3/site-packages/scipy/linalg/flapack.so: undefined symbol: s_wsfe I googled for this, and I found that this error could appear when you're building scipy with a non-gcc compiler whereas lapack has been compiled with g77. Which is strange, since everything I compile here uses gcc. I installed scipy 0.3.2, compiled the ATLAS libs from source, and use a local install of Numeric 24.0. I use a Fedora core 3 OS, with python 2.3 and gcc 3.4.4. Any ideas? -- Alex Borghgraef -------------- next part -------------- An HTML attachment was scrubbed... URL: From martigan at gmail.com Tue Oct 11 09:04:47 2005 From: martigan at gmail.com (Mauro Cherubini) Date: Tue, 11 Oct 2005 15:04:47 +0200 Subject: [SciPy-user] [newbie] standardize a matrix In-Reply-To: <1129019835.30702.14.camel@sandwi.ch.cam.ac.uk> References: <33501A19-2E31-49EC-AFD8-38206EBE118C@gmail.com> <1129019835.30702.14.camel@sandwi.ch.cam.ac.uk> Message-ID: <606392B9-4F80-4B88-8700-9A89B144F06B@gmail.com> Hi Noel, thanks for the reply. Follwoing you advices I implemented this method inside one of my classes: #This method should standardize the matrix def standardize_matrix(self, matrix, method='standard'): if method == 'max': max_x = max(matrix[:,1].flat) divide(matrix[:,1], max_x, stdmatrix[:,1]) max_y = max(matrix[:,2].flat) divide(matrix[:,2], max_y, stdmatrix[:,2]) return stdmatrix; if method == 'standard': mean_x = mean(matrix[:,1].flat) stddev_x = std(matrix[:,1].flat) subtract(matrix[:,1], mean_x, stdmatrix[:,1]) divide(matrix[:,1], stddev_x, stdmatrix[:,1]) mean_y = mean(matrix[:,2].flat) stddev_y = std(matrix[:,2].flat) subtract(matrix[:,2], mean_y, stdmatrix[:,2]) divide(matrix[:,2], stddev_y, stdmatrix[:,2]) return stdmatrix; When I call it from the main it throws this error: mean_x = mean(matrix[:,1].flat) NameError: global name 'mean' is not defined I tried to understand which package are you importing as I initially thought that was the common scipy base. Apparently I was able to find a mean() method only inside scipy.stats.stats. However I am not able to import this. Could you please help me to understand how to call this method mean() and std()? I have a bit of confusion in mind between NumPy and SciPy because I could not find information in the documentation for SciPy. Thanks Mauro On Oct 11, 2005, at 10:37 , Noel O'Boyle wrote: > On Tue, 2005-10-11 at 10:02 +0200, Mauro Cherubini wrote: > >> I have a symmetrical bi-dimensional array that contains distances >> between a certain number of points. The matrix diagoal are all >> zeros >> because of course the distance of a point from self is zero. >> > > A distance matrix - PyChem is developing a number of methods to deal > with distance matrices (and multivariate analysis in general). At the > moment it is actively under development but I recommend you to forward > your question on to the pychem users mailing list > (www.sf.net/projects/pychem). > > >> I would love to standardize the matrix using one or all of these >> methods: >> >> a) divide each attribute distance value of a point by the maximum >> observed absolute distance value. This should restrict the values to >> lie between -1 and 1. Often the values are all positive, and thus, >> all transformed values will lie between 0 and 1. >> > > (Shouldn't all distances be positive?) 
> > >>>> a = array([[0.1,0.2],[0.5,0.6]] ) >>>> print a.flat >>>> > [0.1,0.2,0.5,0.6] > >>>> print max(a.flat) >>>> > 0.6 > >>>> divide(a,0.6,a) >>>> print a >>>> > array([[ 0.16666667, 0.33333333], > [ 0.83333333, 1. ]]) > > > >> b) for each distance value subtract off the mean of that distances >> and then divide by the distances' standard deviation. If the >> distances are normally distributed then most distance values will >> lie >> between -1 and 1. >> > > You need to decide whether you want to include the diagonal > elements. If > so, then use mean(a.flat) and so on. > > > >> c) for each distance value subtract off the mean of the distances >> and >> divide by the distances absolute deviation. Typically most distance >> values will lie between -1 and 1. >> >> I looked in the SciPy documentation and what I understood is that I >> can use an 'ufunc' to define one of these methods. Unfortunately my >> knowledge of Python, Numeric and SciPy is very low, so I could not >> figure out how. There are very few examples in the documentation at >> the moment. >> Can anyone point me to a possible implementation or where to look up? >> Thanks a lot in advance >> > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From noel.oboyle2 at mail.dcu.ie Tue Oct 11 09:10:54 2005 From: noel.oboyle2 at mail.dcu.ie (Noel O'Boyle) Date: Tue, 11 Oct 2005 14:10:54 +0100 Subject: [SciPy-user] [newbie] standardize a matrix In-Reply-To: <606392B9-4F80-4B88-8700-9A89B144F06B@gmail.com> References: <33501A19-2E31-49EC-AFD8-38206EBE118C@gmail.com> <1129019835.30702.14.camel@sandwi.ch.cam.ac.uk> <606392B9-4F80-4B88-8700-9A89B144F06B@gmail.com> Message-ID: <1129036254.31444.3.camel@sandwi.ch.cam.ac.uk> On Tue, 2005-10-11 at 15:04 +0200, Mauro Cherubini wrote: > Hi Noel, > thanks for the reply. > Follwoing you advices I implemented this method inside one of my > classes: > > #This method should standardize the matrix > def standardize_matrix(self, matrix, method='standard'): > if method == 'max': > max_x = max(matrix[:,1].flat) > divide(matrix[:,1], max_x, stdmatrix[:,1]) > max_y = max(matrix[:,2].flat) > divide(matrix[:,2], max_y, stdmatrix[:,2]) > return stdmatrix; > if method == 'standard': > mean_x = mean(matrix[:,1].flat) > stddev_x = std(matrix[:,1].flat) > subtract(matrix[:,1], mean_x, stdmatrix[:,1]) > divide(matrix[:,1], stddev_x, stdmatrix[:,1]) > mean_y = mean(matrix[:,2].flat) > stddev_y = std(matrix[:,2].flat) > subtract(matrix[:,2], mean_y, stdmatrix[:,2]) > divide(matrix[:,2], stddev_y, stdmatrix[:,2]) > return stdmatrix; > > When I call it from the main it throws this error: > > mean_x = mean(matrix[:,1].flat) > NameError: global name 'mean' is not defined > > I tried to understand which package are you importing as I initially > thought that was the common scipy base. Apparently I was able to find > a mean() method only inside scipy.stats.stats. However I am not able > to import this. Could you please help me to understand how to call > this method mean() and std()? > > I have a bit of confusion in mind between NumPy and SciPy because I > could not find information in the documentation for SciPy. 
from scipy import * mean should then work (it's actually in stats but some of the stats stuff is also imported into the scipy namespace) > > Thanks > > > Mauro > > > On Oct 11, 2005, at 10:37 , Noel O'Boyle wrote: > > > On Tue, 2005-10-11 at 10:02 +0200, Mauro Cherubini wrote: > > > >> I have a symmetrical bi-dimensional array that contains distances > >> between a certain number of points. The matrix diagoal are all > >> zeros > >> because of course the distance of a point from self is zero. > >> > > > > A distance matrix - PyChem is developing a number of methods to deal > > with distance matrices (and multivariate analysis in general). At the > > moment it is actively under development but I recommend you to forward > > your question on to the pychem users mailing list > > (www.sf.net/projects/pychem). > > > > > >> I would love to standardize the matrix using one or all of these > >> methods: > >> > >> a) divide each attribute distance value of a point by the maximum > >> observed absolute distance value. This should restrict the values to > >> lie between -1 and 1. Often the values are all positive, and thus, > >> all transformed values will lie between 0 and 1. > >> > > > > (Shouldn't all distances be positive?) > > > > > >>>> a = array([[0.1,0.2],[0.5,0.6]] ) > >>>> print a.flat > >>>> > > [0.1,0.2,0.5,0.6] > > > >>>> print max(a.flat) > >>>> > > 0.6 > > > >>>> divide(a,0.6,a) > >>>> print a > >>>> > > array([[ 0.16666667, 0.33333333], > > [ 0.83333333, 1. ]]) > > > > > > > >> b) for each distance value subtract off the mean of that distances > >> and then divide by the distances' standard deviation. If the > >> distances are normally distributed then most distance values will > >> lie > >> between -1 and 1. > >> > > > > You need to decide whether you want to include the diagonal > > elements. If > > so, then use mean(a.flat) and so on. > > > > > > > >> c) for each distance value subtract off the mean of the distances > >> and > >> divide by the distances absolute deviation. Typically most distance > >> values will lie between -1 and 1. > >> > >> I looked in the SciPy documentation and what I understood is that I > >> can use an 'ufunc' to define one of these methods. Unfortunately my > >> knowledge of Python, Numeric and SciPy is very low, so I could not > >> figure out how. There are very few examples in the documentation at > >> the moment. > >> Can anyone point me to a possible implementation or where to look up? > >> Thanks a lot in advance > >> > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From gerard.vermeulen at grenoble.cnrs.fr Tue Oct 11 10:26:23 2005 From: gerard.vermeulen at grenoble.cnrs.fr (Gerard Vermeulen) Date: Tue, 11 Oct 2005 16:26:23 +0200 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <434ABF3E.80803@ee.byu.edu> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABF3E.80803@ee.byu.edu> Message-ID: <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> Sorry if this shows up in a strange place in the thread, but some replies did not make it into my inbox. 
Robert Kern wrote: >> I finally discovered where the header files get installed when reading CAPI.txt; >> on the scip-dev list it is argued that the Python include directory is not a >> standard place for headers. > > No one is arguing that it's not standard. I'm arguing that it's a *bad* > place because frequently people can't install to it. To anybody who can't install some files in the "standard" locations, I advise to do something like: python setup.py install --root=~/MyPythonStuff The directory layout of this installation under ~/MyPythonStuff is the same as if it would be installed in the Python tree. It is a minor adjustment for the dictates of distutils; and works for any package which tries to install headers in the "standard" locations. Gerard From jh at oobleck.astro.cornell.edu Tue Oct 11 10:33:50 2005 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Tue, 11 Oct 2005 10:33:50 -0400 Subject: [SciPy-user] ASP status, docs, and supporting development Message-ID: <200510111433.j9BEXo0p013993@oobleck.astro.cornell.edu> After SciPy '04, Perry, Janet, and a few others and I put together a roadmap and framework, called ASP, for people to contribute to all the parts of SciPy other than the software: docs, packaging, website, etc. There was much interest expressed, and a few people even signed on the electronic dotted line to be involved: http://www.scipy.org/wikis/accessible_scipy/AccessibleSciPy The roadmap is laid out in this thread, which is linked in the first paragraph of the page above: http://www.scipy.org/mailinglists/mailman?fn=scipy-dev/2004-October/002412.html The first thing we did was to gather all the external projects using SciPy that we could find, and make an index page. The community found them and Fernando Perez did the hard work (more than he bargained for) of collating everything into the index page: http://www.scipy.org/wikis/topical_software/TopicalSoftware That is now a live wiki page, so if your project isn't on there, please add it! We were gearing up for effort #2, a web-site overhaul, early this year, when Travis announced his intention to heal the rift between the small- and large-array communities. We held up our push on web development, which was a few days from kickoff, so that it wouldn't take volunteers from his more-crucial effort. We all know the story of last year: Travis worked like crazy, called for volunteers, got only a few, and finished the job anyway. Now he's publishing a book in hopes of supporting some of his work from the revenues. He's made it clear that he will not be offended by or opposed to free docs, and may even contribute to them. He's also still Head Nummie and is leading the hard work of testing and bug swatting. Of course, we all want him to continue, and most of us freely admit our getting more than we are giving, and our gratitude to Travis, Todd, Robert, and the other core developers. Meanwhile, we have the problem of needing some basic docs that are free, and there seems to be quite a bit of interest in the community for doing one or more free docs. This seems to have more community energy behind it now than a web overhaul, so let's do it. The wiki and procedures are all set up to do some docs, and have been for about a year now. 
If you're interested in doing some docs, either as the lead author of a doc, as a contributor, or as a reviewer, please sign up on http://www.scipy.org/wikis/accessible_scipy/AccessibleSciPy The goal of the signups page is to make it easy for people to find each other: for lead authors to find people who will help them, for the community to identify who is taking part in what efforts, for low-level-of-effort volunteers to become hooked up with bigger projects, etc. There should be dozens of names there, not just three! So: If you're interested in LEADING A DOC, please add your name to the page, make a page for your doc on the wiki, and hang it off the main page, as "Scipy Cookbook" has done (there's a help link at the top of the page with instructions). A project can be anything from writing a collaborative book from scratch, to writing a monograph, to editing and revising existing docs. Announce your project on scipy-dev. If you would like to do a little work but not take the lead on something, you can contribute to the Cookbook or sign up to be a reviewer or contributor, either on an existing doc or at large. Or, contact a doc lead directly and sign up under that project. Please read the roadmap for ideas of docs we thought the community needs. The roadmap document is meant to be amended. For example, is the idea of using the docstrings to make a full-blown reference manual a good idea? I think so, since it's a rather self-updating format, but it will require some substantial work to get them all fleshed out and up to par. Discuss plans, changes, and ongoing efforts for docs on the scipy-dev mailing. It would be nice to have each project have a home on scipy.org, and Plone has excellent workflow-management tools. But, it's ok to home a project on your own site and just put a link on scipy.org. Finally, PLEASE everyone buy Travis's book if you can! Wait for the price to go up. Buy copies for everyone in your lab, if you can afford that. Buy one for your grandmother. It looks like it will be a really nice book, but it's more than that. This is a way to support development, which everyone desperately needs but few have the time (and fewer the skill and experience) to do. If you're at a company that benefits from SciPy and that hasn't already contributed resources to the effort, please consider parting with a larger chunk of change, either by buying more copies or by hiring Travis or others directly to do ongoing maintenance and package development. --jh-- From martigan at gmail.com Tue Oct 11 10:40:55 2005 From: martigan at gmail.com (Mauro Cherubini) Date: Tue, 11 Oct 2005 16:40:55 +0200 Subject: [SciPy-user] [newbie] standardize a matrix In-Reply-To: <1129036254.31444.3.camel@sandwi.ch.cam.ac.uk> References: <33501A19-2E31-49EC-AFD8-38206EBE118C@gmail.com> <1129019835.30702.14.camel@sandwi.ch.cam.ac.uk> <606392B9-4F80-4B88-8700-9A89B144F06B@gmail.com> <1129036254.31444.3.camel@sandwi.ch.cam.ac.uk> Message-ID: On Oct 11, 2005, at 15:10 , Noel O'Boyle wrote: > On Tue, 2005-10-11 at 15:04 +0200, Mauro Cherubini wrote: > >> Hi Noel, >> thanks for the reply. 
>> Follwoing you advices I implemented this method inside one of my >> classes: >> >> #This method should standardize the matrix >> def standardize_matrix(self, matrix, method='standard'): >> if method == 'max': >> max_x = max(matrix[:,1].flat) >> divide(matrix[:,1], max_x, stdmatrix[:,1]) >> max_y = max(matrix[:,2].flat) >> divide(matrix[:,2], max_y, stdmatrix[:,2]) >> return stdmatrix; >> if method == 'standard': >> mean_x = mean(matrix[:,1].flat) >> stddev_x = std(matrix[:,1].flat) >> subtract(matrix[:,1], mean_x, stdmatrix[:,1]) >> divide(matrix[:,1], stddev_x, stdmatrix[:,1]) >> mean_y = mean(matrix[:,2].flat) >> stddev_y = std(matrix[:,2].flat) >> subtract(matrix[:,2], mean_y, stdmatrix[:,2]) >> divide(matrix[:,2], stddev_y, stdmatrix[:,2]) >> return stdmatrix; >> >> When I call it from the main it throws this error: >> >> mean_x = mean(matrix[:,1].flat) >> NameError: global name 'mean' is not defined >> >> I tried to understand which package are you importing as I initially >> thought that was the common scipy base. Apparently I was able to find >> a mean() method only inside scipy.stats.stats. However I am not able >> to import this. Could you please help me to understand how to call >> this method mean() and std()? >> >> I have a bit of confusion in mind between NumPy and SciPy because I >> could not find information in the documentation for SciPy. >> > > from scipy import * > mean should then work (it's actually in stats but some of the stats > stuff is also imported into the scipy namespace) > Actually this doesn't help. Trying to catch the problem I realised that I only installed the 'scipy_core-0.4.1', which is obviously missing some stuff (if I have well understood). So I tried to complete my installation with the old NumPy (which installed fine) and f2py2e (which returned some conflicts warnings). Then I tried to compile and install the last complete scipy: SciPy_complete-0.3.2. The setup doesn't complete and it exits. 
I found lots of __init__.py files missing: package init file 'scipy_core/weave/tests/__init__.py' not found (or not a regular file) package init file 'Lib/tests/__init__.py' not found (or not a regular file) package init file '/Users/mauro/Desktop/python/extra modules/ SciPy_complete-0.3.2/Lib/cluster/tests/__init__.py' not found (or not a regular file) And lots of these conflicts: gcc: Lib/special/cephes/powi.c In file included from Lib/special/cephes/mconf_BE.h:190, from Lib/special/cephes/mconf.h:3, from Lib/special/cephes/powi.c:46: Lib/special/cephes/protos.h:27: warning: conflicting types for built- in function 'cexp' Lib/special/cephes/protos.h:28: warning: conflicting types for built- in function 'csin' Lib/special/cephes/protos.h:29: warning: conflicting types for built- in function 'ccos' Lib/special/cephes/protos.h:30: warning: conflicting types for built- in function 'ctan' Finally the install exit this way: ranlib:@ build/temp.darwin-8.2.0-Power_Macintosh-2.3/libcephes.a building 'dfftpack' library compiling Fortran sources f95(f77) options: '-fixed -O4' f95(f90) options: '-O4' f95(fix) options: '-fixed -O4' creating build/temp.darwin-8.2.0-Power_Macintosh-2.3/Lib/fftpack creating build/temp.darwin-8.2.0-Power_Macintosh-2.3/Lib/fftpack/ dfftpack compile options: '-c' f95:f77: Lib/fftpack/dfftpack/dcosqb.f sh: line 1: f95: command not found sh: line 1: f95: command not found error: Command "f95 -fixed -O4 -c -c Lib/fftpack/dfftpack/dcosqb.f -o build/temp.darwin-8.2.0-Power_Macintosh-2.3/Lib/fftpack/dfftpack/ dcosqb.o" failed with exit status 127 craftmac2:/Users/mauro/Desktop/python/extra modules/ SciPy_complete-0.3.2 root# I have to add that I am running this from a Macintosh running OS X 10.4.2. Can anybody suggest something else to try? Thanks Mauro -- web: http://craft.epfl.ch -- blog: http://www.i-cherubini.it/mauro/blog/ From alexander.borghgraef.rma at gmail.com Tue Oct 11 10:52:36 2005 From: alexander.borghgraef.rma at gmail.com (Alexander Borghgraef) Date: Tue, 11 Oct 2005 16:52:36 +0200 Subject: [SciPy-user] Problem while importing linalg: undefined symbol: s_wsfe In-Reply-To: <9e8c52a20510110210yaeaade1p9a754a9d0709da86@mail.gmail.com> References: <9e8c52a20510110210yaeaade1p9a754a9d0709da86@mail.gmail.com> Message-ID: <9e8c52a20510110752q300ff393s9f31ddea01e7d701@mail.gmail.com> Never mind. I'm still curious as to what caused it, but I fixed it by rebuilding atlas and scipy. -- Alex Borghgraef -------------- next part -------------- An HTML attachment was scrubbed... URL: From noel.oboyle2 at mail.dcu.ie Tue Oct 11 11:11:17 2005 From: noel.oboyle2 at mail.dcu.ie (Noel O'Boyle) Date: Tue, 11 Oct 2005 16:11:17 +0100 Subject: [SciPy-user] [newbie] standardize a matrix In-Reply-To: References: <33501A19-2E31-49EC-AFD8-38206EBE118C@gmail.com> <1129019835.30702.14.camel@sandwi.ch.cam.ac.uk> <606392B9-4F80-4B88-8700-9A89B144F06B@gmail.com> <1129036254.31444.3.camel@sandwi.ch.cam.ac.uk> Message-ID: <1129043477.31444.22.camel@sandwi.ch.cam.ac.uk> I should have said: I'm still using scipy 0.3.2 (the Debian package). On Tue, 2005-10-11 at 16:40 +0200, Mauro Cherubini wrote: > On Oct 11, 2005, at 15:10 , Noel O'Boyle wrote: > > > On Tue, 2005-10-11 at 15:04 +0200, Mauro Cherubini wrote: > > > >> Hi Noel, > >> thanks for the reply. 
> >> Follwoing you advices I implemented this method inside one of my > >> classes: > >> > >> #This method should standardize the matrix > >> def standardize_matrix(self, matrix, method='standard'): > >> if method == 'max': > >> max_x = max(matrix[:,1].flat) > >> divide(matrix[:,1], max_x, stdmatrix[:,1]) > >> max_y = max(matrix[:,2].flat) > >> divide(matrix[:,2], max_y, stdmatrix[:,2]) > >> return stdmatrix; > >> if method == 'standard': > >> mean_x = mean(matrix[:,1].flat) > >> stddev_x = std(matrix[:,1].flat) > >> subtract(matrix[:,1], mean_x, stdmatrix[:,1]) > >> divide(matrix[:,1], stddev_x, stdmatrix[:,1]) > >> mean_y = mean(matrix[:,2].flat) > >> stddev_y = std(matrix[:,2].flat) > >> subtract(matrix[:,2], mean_y, stdmatrix[:,2]) > >> divide(matrix[:,2], stddev_y, stdmatrix[:,2]) > >> return stdmatrix; > >> > >> When I call it from the main it throws this error: > >> > >> mean_x = mean(matrix[:,1].flat) > >> NameError: global name 'mean' is not defined > >> > >> I tried to understand which package are you importing as I initially > >> thought that was the common scipy base. Apparently I was able to find > >> a mean() method only inside scipy.stats.stats. However I am not able > >> to import this. Could you please help me to understand how to call > >> this method mean() and std()? > >> > >> I have a bit of confusion in mind between NumPy and SciPy because I > >> could not find information in the documentation for SciPy. > >> > > > > from scipy import * > > mean should then work (it's actually in stats but some of the stats > > stuff is also imported into the scipy namespace) > > > > Actually this doesn't help. Trying to catch the problem I realised > that I only installed the 'scipy_core-0.4.1', which is obviously > missing some stuff (if I have well understood). So I tried to > complete my installation with the old NumPy (which installed fine) > and f2py2e (which returned some conflicts warnings). Then I tried to > compile and install the last complete scipy: SciPy_complete-0.3.2. > > The setup doesn't complete and it exits. 
I found lots of __init__.py > files missing: > > package init file 'scipy_core/weave/tests/__init__.py' not found (or > not a regular file) > package init file 'Lib/tests/__init__.py' not found (or not a regular > file) > package init file '/Users/mauro/Desktop/python/extra modules/ > SciPy_complete-0.3.2/Lib/cluster/tests/__init__.py' not found (or not > a regular file) > > And lots of these conflicts: > > gcc: Lib/special/cephes/powi.c > In file included from Lib/special/cephes/mconf_BE.h:190, > from Lib/special/cephes/mconf.h:3, > from Lib/special/cephes/powi.c:46: > Lib/special/cephes/protos.h:27: warning: conflicting types for built- > in function 'cexp' > Lib/special/cephes/protos.h:28: warning: conflicting types for built- > in function 'csin' > Lib/special/cephes/protos.h:29: warning: conflicting types for built- > in function 'ccos' > Lib/special/cephes/protos.h:30: warning: conflicting types for built- > in function 'ctan' > > Finally the install exit this way: > > ranlib:@ build/temp.darwin-8.2.0-Power_Macintosh-2.3/libcephes.a > building 'dfftpack' library > compiling Fortran sources > f95(f77) options: '-fixed -O4' > f95(f90) options: '-O4' > f95(fix) options: '-fixed -O4' > creating build/temp.darwin-8.2.0-Power_Macintosh-2.3/Lib/fftpack > creating build/temp.darwin-8.2.0-Power_Macintosh-2.3/Lib/fftpack/ > dfftpack > compile options: '-c' > f95:f77: Lib/fftpack/dfftpack/dcosqb.f > sh: line 1: f95: command not found > sh: line 1: f95: command not found > error: Command "f95 -fixed -O4 -c -c Lib/fftpack/dfftpack/dcosqb.f -o > build/temp.darwin-8.2.0-Power_Macintosh-2.3/Lib/fftpack/dfftpack/ > dcosqb.o" failed with exit status 127 > craftmac2:/Users/mauro/Desktop/python/extra modules/ > SciPy_complete-0.3.2 root# > > I have to add that I am running this from a Macintosh running OS X > 10.4.2. > Can anybody suggest something else to try? > > Thanks > > > Mauro > > > -- > web: http://craft.epfl.ch -- blog: http://www.i-cherubini.it/mauro/blog/ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From aisaac at american.edu Tue Oct 11 11:28:24 2005 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 11 Oct 2005 11:28:24 -0400 Subject: [SciPy-user] Guide not listed under documentation Message-ID: I was just trying to refer someone to Travis's Guide and discovered it is not listed under documentation http://www.scipy.org/documentation/ I also could not find it with a site search. Finally I noticed http://www.trelgol.com is on the first page of my copy, but his won't help someone seeking to purchase a copy! I believe it should be listed under documentation, perhaps in a separate category. As far as I can tell, I cannot edit this page http://www.scipy.org/documentation/ or I'd add it myself. Cheers, Alan Isaac From rkern at ucsd.edu Tue Oct 11 13:22:59 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 11 Oct 2005 10:22:59 -0700 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABF3E.80803@ee.byu.edu> <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> Message-ID: <434BF4F3.5090607@ucsd.edu> Gerard Vermeulen wrote: > Sorry if this shows up in a strange place in the thread, but some replies did > not make it into my inbox. 
> > Robert Kern wrote: > >>>I finally discovered where the header files get installed when reading CAPI.txt; >>>on the scip-dev list it is argued that the Python include directory is not a >>>standard place for headers. >> >>No one is arguing that it's not standard. I'm arguing that it's a *bad* >>place because frequently people can't install to it. > > To anybody who can't install some files in the "standard" locations, I advise > to do something like: > > python setup.py install --root=~/MyPythonStuff > > The directory layout of this installation under ~/MyPythonStuff is the > same as if it would be installed in the Python tree. > > It is a minor adjustment for the dictates of distutils; and works for > any package which tries to install headers in the "standard" locations. I know about that. You're missing my point, however, that it then requires one to modify the setup script of every *other* package that needs the scipy header files to compile to point to this alternate, nonstandard location. That interferes with automated building and installation. It's also incompatible with PythonEggs because they can't contain header files to be installed the old way. It's a burden on the user rather than the developer. It's a cost that gets paid every time he has to install something that needs the scipy includes. Putting the headers into the package means that it will work in every conceivable installation in exactly the same way everywhere without effort on the part of the user. That's worth making developers read the documentation (or the compile commands spat out by distutils) to find the headers. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From swisher at enthought.com Tue Oct 11 13:58:56 2005 From: swisher at enthought.com (Janet M. Swisher) Date: Tue, 11 Oct 2005 12:58:56 -0500 Subject: [SciPy-user] Guide not listed under documentation In-Reply-To: References: Message-ID: <434BFD60.1030806@enthought.com> Alan G Isaac wrote: >I was just trying to refer someone to Travis's Guide and >discovered it is not listed under documentation >http://www.scipy.org/documentation/ >I also could not find it with a site search. >Finally I noticed >http://www.trelgol.com >is on the first page of my copy, but his won't >help someone seeking to purchase a copy! > >I believe it should be listed under documentation, >perhaps in a separate category. As far as I can >tell, I cannot edit this page >http://www.scipy.org/documentation/ >or I'd add it myself. > > Thanks for the reminder. I changed the "Documentation" sub-heading to "Free Documentation", and added a sub-heading for "Fee-Based Documentation, with a bullet for Travis's book. -- Janet Swisher --- Senior Technical Writer Enthought, Inc. 
http://www.enthought.com From Fernando.Perez at colorado.edu Tue Oct 11 14:10:14 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 11 Oct 2005 12:10:14 -0600 Subject: [SciPy-user] Purchasing Documentation In-Reply-To: References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au><4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au><434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu><1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com><43454135.503@ee.byu.edu> <4345A449.8040303@colorado.edu> Message-ID: <434C0006.1030701@colorado.edu> Alan G Isaac wrote: > On Fri, 7 Oct 2005, T) Arnd Baecker apparently wrote: > >>Are you thinking of restructured text >> http://docutils.sourceforge.net/ >>together with the mathhack >>http://docutils.sourceforge.net/sandbox/cben/rolehack/ > > > Perhaps a nice approach is > http://docutils.sourceforge.net/sandbox/jensj/latex_math/README.txt Thanks for the links: I've started a new section in the ipython trac wiki for all new design discussions, and posted this info there. We'll slowly move forward on this front, promised :) Cheers, f From gerard.vermeulen at grenoble.cnrs.fr Tue Oct 11 15:00:34 2005 From: gerard.vermeulen at grenoble.cnrs.fr (Gerard Vermeulen) Date: Tue, 11 Oct 2005 21:00:34 +0200 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <434BF4F3.5090607@ucsd.edu> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABF3E.80803@ee.byu.edu> <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> <434BF4F3.5090607@ucsd.edu> Message-ID: <20051011210034.248fbbf5.gerard.vermeulen@grenoble.cnrs.fr> On Tue, 11 Oct 2005 10:22:59 -0700 Robert Kern wrote: > Gerard Vermeulen wrote: > > Sorry if this shows up in a strange place in the thread, but some replies did > > not make it into my inbox. > > > > Robert Kern wrote: > > > >>>I finally discovered where the header files get installed when reading CAPI.txt; > >>>on the scip-dev list it is argued that the Python include directory is not a > >>>standard place for headers. > >> > >>No one is arguing that it's not standard. I'm arguing that it's a *bad* > >>place because frequently people can't install to it. > > > > To anybody who can't install some files in the "standard" locations, I advise > > to do something like: > > > > python setup.py install --root=~/MyPythonStuff > > > > The directory layout of this installation under ~/MyPythonStuff is the > > same as if it would be installed in the Python tree. > > > > It is a minor adjustment for the dictates of distutils; and works for > > any package which tries to install headers in the "standard" locations. > > I know about that. You're missing my point, however, that it then > requires one to modify the setup script of every *other* package that > needs the scipy header files to compile to point to this alternate, > nonstandard location. Why? (1) One can always add non-standard include directories: python setup.py build_ext --help .. --include-dirs (-I) list of directories to search for header files .. and this is a worst case. I am pretty sure this can be taken care of in a config file. (2) If you recommend the 'install --root' method the location of the alternative Python include directory is implicitly known and is easy to figure out. I will use such a facility if SciPy offers it. > That interferes with automated building and > installation. 
It's also incompatible with PythonEggs because they can't > contain header files to be installed the old way. > I have refrained from commenting on PythonEggs, but I think it has a design problem if it can't handle standard distutils packages (how does PythonEggs handle C extension modules?). > > It's a burden on the user rather than the developer. It's a cost that > gets paid every time he has to install something that needs the scipy > includes. You mean a minority of users (< 5 % ?) and you should not think in terms of SciPy only. Isn't it better in the long run to educate those users so that they can cope with any distutils package that exports a C-API? > Putting the headers into the package means that it will work > in every conceivable installation in exactly the same way everywhere > without effort on the part of the user. True; but the cost is that you are breaking a standard. > That's worth making developers > read the documentation (or the compile commands spat out by distutils) > to find the headers. I see myself recompiling SciPy (or any other package) when I want to find the headers. IMO the problem can be solved with 10 lines in a README (see point 2 above) without putting a burden on developers. Gerard From pearu at scipy.org Tue Oct 11 14:47:03 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 11 Oct 2005 13:47:03 -0500 (CDT) Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <20051011210034.248fbbf5.gerard.vermeulen@grenoble.cnrs.fr> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> <20051011210034.248fbbf5.gerard.vermeulen@grenoble.cnrs.fr> Message-ID: On Tue, 11 Oct 2005, Gerard Vermeulen wrote: >> Robert Kern wrote: >>> To anybody who can't install some files in the "standard" locations, I advise >>> to do something like: >>> >>> python setup.py install --root=~/MyPythonStuff >>> >>> The directory layout of this installation under ~/MyPythonStuff is the >>> same as if it would be installed in the Python tree. >>> >>> It is a minor adjustment for the dictates of distutils; and works for >>> any package which tries to install headers in the "standard" locations. >> >> I know about that. You're missing my point, however, that it then >> requires one to modify the setup script of every *other* package that >> needs the scipy header files to compile to point to this alternate, >> nonstandard location. > > Why? > > (1) One can always add non-standard include directories: > > python setup.py build_ext --help > .. > --include-dirs (-I) list of directories to search for header files > .. > and this is a worst case. I am pretty sure this can be taken > care of in a config file. I have noticed that even experienced Python programmers hardly use distutils commands with extra options or even know about distutils possibilities that could easily help to resolve various problems, they rather write emails to mailing lists. > (2) If you recommend the 'install --root' method the location of the > alternative Python include directory is implicitly known and is > easy to figure out. I will use such a facility if SciPy offers it. > >> That interferes with automated building and >> installation. It's also incompatible with PythonEggs because they can't >> contain header files to be installed the old way. 
>> > > I have refrained from commenting on PythonEggs, but I think it has a design > problem if it can't handle standard distutils packages (how does PythonEggs > handle C extension modules?). I agree on that. >> >> It's a burden on the user rather than the developer. It's a cost that >> gets paid every time he has to install something that needs the scipy >> includes. > > You mean a minority of users (< 5 % ?) and you should not think in terms of > SciPy only. Isn't it better in the long run to educate those users so that > they can cope with any distutils package that exports a C-API? Where did you get this number? I think majority of potential users will stop using a software if they need to learn such details. >> Putting the headers into the package means that it will work >> in every conceivable installation in exactly the same way everywhere >> without effort on the part of the user. > > True; but the cost is that you are breaking a standard. Is it then so bad if breaking the standard makes the life of users easier as well as of those who try to support users? And I don't see any harm in breaking this particular bit of the Python standard. >> That's worth making developers >> read the documentation (or the compile commands spat out by distutils) >> to find the headers. > > I see myself recompiling SciPy (or any other package) when I want to find the > headers. IMO the problem can be solved with 10 lines in a README (see point 2 > above) without putting a burden on developers. Note that Scipy developers should use scipy.distutils anyway instead of standard distutils for various reasons, and when using scipy.distutils, the scipy include directory is added silently to the list of include directories and so Scipy developrs need not to worry about the current issue at all. Pearu From oliphant at ee.byu.edu Tue Oct 11 15:52:10 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 11 Oct 2005 13:52:10 -0600 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> <20051011210034.248fbbf5.gerard.vermeulen@grenoble.cnrs.fr> Message-ID: <434C17EA.9050608@ee.byu.edu> My attitude right now, given the feelings of Robert and Pearu on this issue is to leave things as they are and just place visible documentation to let the curious new-comer know where the headers are. We should probably drop this issue, though, as there are plenty-more pressing needs. Of course, discussion on PythonEggs and its relationship to scipy is still appropriate. -Travis From rkern at ucsd.edu Tue Oct 11 16:07:00 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 11 Oct 2005 13:07:00 -0700 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <20051011210034.248fbbf5.gerard.vermeulen@grenoble.cnrs.fr> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABF3E.80803@ee.byu.edu> <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> <434BF4F3.5090607@ucsd.edu> <20051011210034.248fbbf5.gerard.vermeulen@grenoble.cnrs.fr> Message-ID: <434C1B64.20107@ucsd.edu> Gerard Vermeulen wrote: > I have refrained from commenting on PythonEggs, but I think it has a design > problem if it can't handle standard distutils packages (how does PythonEggs > handle C extension modules?). Just fine. 
These are the eggs I have installed on my system: ABCGI-0.0.0-py2.4-macosx-10.4-ppc.egg CherryPy-2.1.0_betadev-py2.4.egg EventNet-9-py2.4.egg F2PY-2.46.243_1997-py2.4.egg Fnorb-1.3-py2.4.egg FontTools_TTX-2.0b1-py2.4-macosx-10.4-ppc.egg FormEncode-0.2.1-py2.4.egg MayaVi-1.5-py2.4.egg NanoThreads-5-py2.4.egg Numeric-24.0b2-py2.4-darwin-8.1.0-Power_Macintosh.egg PIL-1.1.5-py2.4-darwin-8.1.0-Power_Macintosh.egg Pmw-1.2-py2.4.egg PyChecker-0.8.15-py2.4.egg PyChecker-0.8.16-py2.4.egg PyMC-0.9.1-py2.4-macosx-10.4-ppc.egg PyProtocols-1.0a0-py2.4-macosx-10.4-ppc.egg PyXML-0.8.4-py2.4-darwin-8.1.0-Power_Macintosh.egg Pyrex-0.9.3.1-py2.4.egg Pyro-3.4-py2.4.egg Reportlab-1.20-py2.4.egg RuleDispatch-0.5a0-py2.4-macosx-10.4-ppc.egg SCALE-0.0.1-py2.4.egg SPE-0.7.5.c-py2.4.egg SQLObject-0.7b1dev-py2.4.egg ScientificPython-2.4.9-py2.4-darwin-8.1.0-Power_Macintosh.egg SimPy-1.6-py2.4.egg TestGears-0.2-py2.4.egg Testido-0.1-py2.4.egg TurboGears-0.5.1-py2.4.egg VTK-4.5.0-py2.4.egg ZODB3-3.5.0-py2.4-macosx-10.4-ppc.egg _renderPM-0.99-py2.4-macosx-10.4-ppc.egg _rl_accel-0.52-py2.4-macosx-10.4-ppc.egg biopython-1.40b-py2.4-darwin-8.1.0-Power_Macintosh.egg biopython_corba-0.3.0-py2.4.egg bzr-0.0.5-py2.4.egg cElementTree-1.0.2_20050302-py2.4-darwin-8.1.0-Power_Macintosh.egg ctypes-2.0.0.0cvs-py2.4-macosx-10.4-ppc.egg cwm-1.0.0-py2.4.egg docutils-0.3.7-py2.4.egg egenix_mx_base-2.0.6-py2.4-macosx-10.4-ppc.egg egenix_mx_experimental-0.9.0-py2.4-macosx-10.4-ppc.egg elementtree-1.2.6_20050316-py2.4.egg eyeD3-0.6.7-py2.4.egg gadfly-1.0.0-py2.4.egg ipython-0.6.16.tzanko-py2.4.egg ipython-0.6.16_svn-py2.4.egg json_py-3.2.1-py2.4.egg kid-0.6.4-py2.4.egg kid-0.7a-py2.4.egg lxml-0.7-py2.4-macosx-10.4-ppc.egg matplotlib-0.84-py2.4-macosx-10.4-ppc.egg matplotlib-0.84cvs-py2.4-macosx-10.4-ppc.egg metakit-2.4.9.3-py2.4-macosx-10.4-ppc.egg networkx-0.23-py2.4.egg networkx-0.24-py2.4.egg numarray-1.4.0-py2.4-macosx-10.4-ppc.egg paramiko-1.4-py2.4.egg pgu-0.4-py2.4.egg ply-1.3.1-py2.4.egg py-0.8.0_alpha2-py2.4.egg pycairo-0.9.0-py2.4-macosx-10.4-ppc.egg pychinko-0.1-py2.4.egg pycrypto-2.0.1-py2.4-macosx-10.4-ppc.egg pydot-0.9.10-py2.4.egg pyflakes-svn-py2.4.egg pygame-1.7.2pre-py2.4-macosx-10.4-ppc.egg pyparsing-1.3-py2.4.egg pyproj-1.0-py2.4-macosx-10.4-ppc.egg pysqlite-2.0.3-py2.4-macosx-10.4-ppc.egg pysqlite-2.0.4-py2.4-macosx-10.4-ppc.egg pythondoc-2.1b4_20050618-py2.4.egg pythonutils-0.2.2-py2.4.egg rdflib-2.2.2-py2.4.egg redfoot-2.0.0-py2.4.egg rpy-0.4.6-py2.4-macosx-10.4-ppc.egg scipy-0.4.2_1307-py2.4-macosx-10.4-ppc.egg scipy_complete-0.3.3_309.4626-py2.4-macosx-10.4-ppc.egg scipy_core-0.4.2-py2.4-macosx-10.4-ppc.egg scons-0.96.90-py2.4.egg setuptools-0.6a2-py2.4.egg setuptools-0.6a4-py2.4.egg smart-0.37-py2.4-macosx-10.4-ppc.egg spambayes-1.1a1_-py2.4.egg sparta-0.8-py2.4.egg tables-1.1.1-py2.4-macosx-10.4-ppc.egg testoob-0.6-py2.4.egg unrealtowersodoku-1.5-py2.4.egg zope_interface-3.0.1-py2.4-macosx-10.4-ppc.egg Specifically, C extension modules can be handled in one of two ways: you can unpack the zipped egg (as I have for most of the above), or you can leave it zipped and the extension modules will be unpacked to a particular directory on the first import. I usually unpack the eggs so that I can use ipython's ?? magic which needs a real file on the filesystem. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From Fernando.Perez at colorado.edu Tue Oct 11 16:18:43 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 11 Oct 2005 14:18:43 -0600 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <434C1B64.20107@ucsd.edu> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABF3E.80803@ee.byu.edu> <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> <434BF4F3.5090607@ucsd.edu> <20051011210034.248fbbf5.gerard.vermeulen@grenoble.cnrs.fr> <434C1B64.20107@ucsd.edu> Message-ID: <434C1E23.4060009@colorado.edu> Robert Kern wrote: > particular directory on the first import. I usually unpack the eggs so > that I can use ipython's ?? magic which needs a real file on the filesystem. Though if eggs become popular, we could certainly (and probably will) extend ?? so that it reads also inside the egg. There's nothing in principle that makes this impossible, as far as I understand. Cheers, f From rkern at ucsd.edu Tue Oct 11 16:22:15 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 11 Oct 2005 13:22:15 -0700 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <434C1E23.4060009@colorado.edu> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABF3E.80803@ee.byu.edu> <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> <434BF4F3.5090607@ucsd.edu> <20051011210034.248fbbf5.gerard.vermeulen@grenoble.cnrs.fr> <434C1B64.20107@ucsd.edu> <434C1E23.4060009@colorado.edu> Message-ID: <434C1EF7.1000302@ucsd.edu> Fernando Perez wrote: > Robert Kern wrote: > >>particular directory on the first import. I usually unpack the eggs so >>that I can use ipython's ?? magic which needs a real file on the filesystem. > > Though if eggs become popular, we could certainly (and probably will) extend > ?? so that it reads also inside the egg. There's nothing in principle that > makes this impossible, as far as I understand. It would involve reimplementing inspect.getsource(), I think. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From Fernando.Perez at colorado.edu Tue Oct 11 16:54:23 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 11 Oct 2005 14:54:23 -0600 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <434C1EF7.1000302@ucsd.edu> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABF3E.80803@ee.byu.edu> <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> <434BF4F3.5090607@ucsd.edu> <20051011210034.248fbbf5.gerard.vermeulen@grenoble.cnrs.fr> <434C1B64.20107@ucsd.edu> <434C1E23.4060009@colorado.edu> <434C1EF7.1000302@ucsd.edu> Message-ID: <434C267F.4010204@colorado.edu> Robert Kern wrote: > Fernando Perez wrote: > >>Robert Kern wrote: >> >> >>>particular directory on the first import. I usually unpack the eggs so >>>that I can use ipython's ?? magic which needs a real file on the filesystem. >> >>Though if eggs become popular, we could certainly (and probably will) extend >>?? so that it reads also inside the egg. There's nothing in principle that >>makes this impossible, as far as I understand. > > > It would involve reimplementing inspect.getsource(), I think. getsource is 2 lines of code, and getsourcelines (called by getsource) another 2. We just need to get the actual lines of source, which I suppose requires reading inside the egg (it's a zip archive, right?) 
and pulling the contents of the file out. It sounds relatively straightforward to me, but perhaps I'm missing something. Cheers, f From rkern at ucsd.edu Tue Oct 11 17:11:28 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 11 Oct 2005 14:11:28 -0700 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <434C267F.4010204@colorado.edu> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABF3E.80803@ee.byu.edu> <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> <434BF4F3.5090607@ucsd.edu> <20051011210034.248fbbf5.gerard.vermeulen@grenoble.cnrs.fr> <434C1B64.20107@ucsd.edu> <434C1E23.4060009@colorado.edu> <434C1EF7.1000302@ucsd.edu> <434C267F.4010204@colorado.edu> Message-ID: <434C2A80.90806@ucsd.edu> Fernando Perez wrote: > getsource is 2 lines of code, and getsourcelines (called by getsource) another > 2. We just need to get the actual lines of source, which I suppose requires > reading inside the egg (it's a zip archive, right?) and pulling the contents > of the file out. > > It sounds relatively straightforward to me, but perhaps I'm missing something. No, I think you got everything. It doesn't change much wrt the scipy_core egg, though. Besides the header files, it's not "zip-safe" so easy_install will always unpack it anyways. This isn't really a problem, though; there isn't much of a downside to unpacked eggs over zipped eggs. zip_safe flag not set; analyzing archive contents... scipy.base.setup: module references __file__ scipy.distutils.exec_command: module references __file__ scipy.distutils.misc_util: module references __file__ scipy.distutils.system_info: module references __file__ scipy.distutils.command.build_src: module references __file__ scipy.f2py.diagnose: module references __file__ scipy.f2py.f2py2e: module references __file__ scipy.f2py.setup: module references __file__ scipy.test.logging: module references __file__ scipy.test.logging: module MAY be using inspect.stack scipy.test.testing: module references __file__ scipy.weave.blitz_spec: module references __file__ scipy.weave.blitz_tools: module references __file__ scipy.weave.bytecodecompiler: module MAY be using inspect.getsource scipy.weave.bytecodecompiler: module MAY be using inspect.getcomments scipy.weave.bytecodecompiler: module MAY be using inspect.stack scipy.weave.c_spec: module references __file__ scipy.weave.catalog: module references __file__ scipy.weave.catalog: module references __path__ scipy.weave.inline_tools: module references __file__ -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From Fernando.Perez at colorado.edu Tue Oct 11 17:17:42 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 11 Oct 2005 15:17:42 -0600 Subject: [SciPy-user] install location of the newcore header files In-Reply-To: <434C2A80.90806@ucsd.edu> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABF3E.80803@ee.byu.edu> <20051011162623.66c26cfe.gerard.vermeulen@grenoble.cnrs.fr> <434BF4F3.5090607@ucsd.edu> <20051011210034.248fbbf5.gerard.vermeulen@grenoble.cnrs.fr> <434C1B64.20107@ucsd.edu> <434C1E23.4060009@colorado.edu> <434C1EF7.1000302@ucsd.edu> <434C267F.4010204@colorado.edu> <434C2A80.90806@ucsd.edu> Message-ID: <434C2BF6.5010000@colorado.edu> Robert Kern wrote: > Fernando Perez wrote: > > >>getsource is 2 lines of code, and getsourcelines (called by getsource) another >>2. 
We just need to get the actual lines of source, which I suppose requires >>reading inside the egg (it's a zip archive, right?) and pulling the contents >>of the file out. >> >>It sounds relatively straightforward to me, but perhaps I'm missing something. > > > No, I think you got everything. Good. Let me know at what point this becomes of interest, and we'll move it up the priority queue. For the time being, I've noted it in the wiki. Cheers, f From alexander.borghgraef.rma at gmail.com Wed Oct 12 04:23:55 2005 From: alexander.borghgraef.rma at gmail.com (Alexander Borghgraef) Date: Wed, 12 Oct 2005 10:23:55 +0200 Subject: [SciPy-user] Maximal element of a matrix Message-ID: <9e8c52a20510120123t43642622j873a1c485d7e58cd@mail.gmail.com> Is there a function for determining the largest element of a matrix in scipy? Max nor maximum cut it, so I have no idea. What does maximum do anyway? Documentation on it is rather minimal, help(scipy.maximum) just returns , not very helpful. -- Alex Borghgraef -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant at ee.byu.edu Wed Oct 12 04:27:40 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 02:27:40 -0600 Subject: [SciPy-user] Maximal element of a matrix In-Reply-To: <9e8c52a20510120123t43642622j873a1c485d7e58cd@mail.gmail.com> References: <9e8c52a20510120123t43642622j873a1c485d7e58cd@mail.gmail.com> Message-ID: <434CC8FC.5030500@ee.byu.edu> Alexander Borghgraef wrote: > Is there a function for determining the largest element of a matrix in > scipy? Max nor maximum > cut it, so I have no idea. What does maximum do anyway? Documentation > on it is rather > minimal, help(scipy.maximum) just returns , not very > helpful. > Yeah. You have to understand ufuncs for the maximum function to make sense. It takes the maximum element-by-element of two arrays. But maximum.reduce(ravel(a)) does what you want. In scipy this can be called as a.max() -Travis From nphala at angloresearch.com Wed Oct 12 04:36:44 2005 From: nphala at angloresearch.com (Noko Phala) Date: Wed, 12 Oct 2005 10:36:44 +0200 Subject: [SciPy-user] Maximal element of a matrix Message-ID: If your matrix is a, then "max(a.flat)" should give you the largest element, provided you have imported either Numeric or Scipy (from scipy import *) Hope that helps. Dr Noko Phala Process Research Anglo Research PO Box 106 Crown Mines 2025 Republic of South Africa Tel: +27 (11) 377 4817 e-mail: nphala at angloresearch.com -----Original Message----- From: scipy-user-bounces at scipy.net [mailto:scipy-user-bounces at scipy.net] On Behalf Of Alexander Borghgraef Sent: Wednesday, October 12, 2005 10:24 AM To: SciPy Users List Subject: [SciPy-user] Maximal element of a matrix Is there a function for determining the largest element of a matrix in scipy? Max nor maximum cut it, so I have no idea. What does maximum do anyway? Documentation on it is rather minimal, help(scipy.maximum) just returns , not very helpful. -- Alex Borghgraef -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant at ee.byu.edu Wed Oct 12 04:51:30 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 02:51:30 -0600 Subject: [SciPy-user] Maximal element of a matrix In-Reply-To: References: Message-ID: <434CCE92.5080504@ee.byu.edu> Noko Phala wrote: > If your matrix is a, then ?max(a.flat)? 
should give you the largest > element, provided you have imported either Numeric or Scipy (from > scipy import *) > Yes, this will work, but it will use a slower sequence interface. If your arrays are large: maximum.reduce(ravel(a)) will work faster. In (new) scipy core this is a.max() -Travis From alexander.borghgraef.rma at gmail.com Wed Oct 12 04:56:48 2005 From: alexander.borghgraef.rma at gmail.com (Alexander Borghgraef) Date: Wed, 12 Oct 2005 10:56:48 +0200 Subject: [SciPy-user] Maximal element of a matrix In-Reply-To: <434CC8FC.5030500@ee.byu.edu> References: <9e8c52a20510120123t43642622j873a1c485d7e58cd@mail.gmail.com> <434CC8FC.5030500@ee.byu.edu> Message-ID: <9e8c52a20510120156v130cb331qb6125f2f71171873@mail.gmail.com> On 10/12/05, Travis Oliphant wrote: > > > Yeah. You have to understand ufuncs for the maximum function to make > sense. It takes the maximum element-by-element of two arrays. > > But maximum.reduce(ravel(a)) does what you want. Ok, it does. > In scipy this can be called as > > a.max() > The new scipy 0.4 beta? Because when I run this in scipy 0.3.2, I get an attribute error. -- Alex Borghgraef -------------- next part -------------- An HTML attachment was scrubbed... URL: From alexander.borghgraef.rma at gmail.com Wed Oct 12 04:57:43 2005 From: alexander.borghgraef.rma at gmail.com (Alexander Borghgraef) Date: Wed, 12 Oct 2005 10:57:43 +0200 Subject: [SciPy-user] Maximal element of a matrix In-Reply-To: References: Message-ID: <9e8c52a20510120157s105af076o4945cf729b2efbb5@mail.gmail.com> On 10/12/05, Noko Phala wrote: > > If your matrix is a, then "max(a.flat)" should give you the largest > element, provided you have imported either Numeric or Scipy (from scipy > import *) > > Hope that helps. > That looks likes the most elegant solution right now, thanks. -- Alex Borghgraef -------------- next part -------------- An HTML attachment was scrubbed... URL: From jaroslav.stark at imperial.ac.uk Wed Oct 12 06:17:31 2005 From: jaroslav.stark at imperial.ac.uk (J. Stark) Date: Wed, 12 Oct 2005 11:17:31 +0100 Subject: [SciPy-user] Weave under OSX 10.4 Message-ID: I haven't used weave for a while, and in particular not since I upgraded to OSX 10.4. Now I find that scripts which worked fine previously fail to compile - the error messages start with ... Numeric/arrayobject.h : No such file or directory A search of my hard disk fails to find arrayobject.h. Is this a know problem, or am I doing something stupid? I have set gcc to 3.3, rather than the default 4, and run the Tiger Python fix from http://undefined.org/python/ Any help would be much appreciated. J. From giovanni.samaey at cs.kuleuven.ac.be Wed Oct 12 11:20:32 2005 From: giovanni.samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Wed, 12 Oct 2005 17:20:32 +0200 Subject: [SciPy-user] subversion repository Message-ID: <434D29C0.1060109@cs.kuleuven.ac.be> Dear all, I am wondering if any centralized information will become available about how to check out scipy from svn (what exactly needs to be checked out and installed). I realize that all the information is scattered in a number of mails on this list, but I would like to see some place (e.g. on the website) updated to reflect the switch from cvs to svn. (I don't think I am alone on this one... It must be hard for new users to try SciPy with this situation.) 
Best, Giovanni Disclaimer: http://www.kuleuven.be/cwis/email_disclaimer.htm From nwagner at mecha.uni-stuttgart.de Wed Oct 12 11:27:31 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 12 Oct 2005 17:27:31 +0200 Subject: [SciPy-user] subversion repository In-Reply-To: <434D29C0.1060109@cs.kuleuven.ac.be> References: <434D29C0.1060109@cs.kuleuven.ac.be> Message-ID: <434D2B63.6020707@mecha.uni-stuttgart.de> Giovanni Samaey wrote: >Dear all, > >I am wondering if any centralized information will become available >about how to check out scipy from svn (what exactly needs to be checked >out and installed). >I realize that all the information is scattered in a number of mails on >this list, but I would like to see some place (e.g. on the website) updated >to reflect the switch from cvs to svn. >(I don't think I am alone on this one... It must be hard for new users >to try SciPy with this situation.) > >Best, > >Giovanni > >Disclaimer: http://www.kuleuven.be/cwis/email_disclaimer.htm > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > # # New scipy # svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ From rkern at ucsd.edu Wed Oct 12 11:52:32 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 12 Oct 2005 08:52:32 -0700 Subject: [SciPy-user] Weave under OSX 10.4 In-Reply-To: References: Message-ID: <434D3140.7020505@ucsd.edu> J. Stark wrote: > I haven't used weave for a while, and in particular not since I > upgraded to OSX 10.4. Now I find that scripts which worked fine > previously fail to compile - the error messages start with > > ... Numeric/arrayobject.h : No such file or directory > > A search of my hard disk fails to find arrayobject.h. > > Is this a know problem, or am I doing something stupid? I have set > gcc to 3.3, rather than the default 4, and run the Tiger Python fix > from http://undefined.org/python/ You installation of Numeric is broken (and BTW, yes, you should stick with gcc 3.3). Are you using the system Python 2.3.5? If so, then I suspect that when /System/Library/Frameworks/Python.framework got upgraded, the Numeric headers that were installed there went *poof*, too. Reinstall Numeric, and the headers should be installed again. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From stephen.walton at csun.edu Wed Oct 12 12:19:49 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 12 Oct 2005 09:19:49 -0700 Subject: [SciPy-user] Problem while importing linalg: undefined symbol: s_wsfe In-Reply-To: <9e8c52a20510110752q300ff393s9f31ddea01e7d701@mail.gmail.com> References: <9e8c52a20510110210yaeaade1p9a754a9d0709da86@mail.gmail.com> <9e8c52a20510110752q300ff393s9f31ddea01e7d701@mail.gmail.com> Message-ID: <434D37A5.4000107@csun.edu> Alexander Borghgraef wrote: > Never mind. I'm still curious as to what caused it, but I fixed it by > rebuilding atlas and scipy. "It" being a missing s_wsfe symbol on an import of linalg. Most likely there was a missing link to -lg2c somewhere along the line. Don't worry about it. 
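For anyone who hits the same undefined-symbol failure, a quick way to confirm this diagnosis is to inspect the extension module directly (the path below is the one from the original report; substitute your own site-packages):

ldd /homes/morlet/aborghgr/local/lib/python2.3/site-packages/scipy/linalg/flapack.so | grep g2c
nm -D /homes/morlet/aborghgr/local/lib/python2.3/site-packages/scipy/linalg/flapack.so | grep s_wsfe

If the first command lists no libg2c and the second shows s_wsfe marked 'U' (undefined), the module was linked without the g77 runtime library. Rebuilding so that g77 performs the final link, or adding -lg2c to the link line, is the usual fix, which is consistent with the rebuild that resolved it here.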
From myeates at jpl.nasa.gov Wed Oct 12 16:01:37 2005 From: myeates at jpl.nasa.gov (Mathew Yeates) Date: Wed, 12 Oct 2005 13:01:37 -0700 Subject: [SciPy-user] begging for binaries Message-ID: <434D6BA1.6030707@jpl.nasa.gov> Hi Would it be possible to post binaries for Windows using Python2.4?? I've been trying to install from source and ..... well its not going too smoothly. It would be greatly appreciated. Mathew From oliphant at ee.byu.edu Wed Oct 12 16:09:53 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 14:09:53 -0600 Subject: [SciPy-user] begging for binaries In-Reply-To: <434D6BA1.6030707@jpl.nasa.gov> References: <434D6BA1.6030707@jpl.nasa.gov> Message-ID: <434D6D91.2010408@ee.byu.edu> Mathew Yeates wrote: >Hi > >Would it be possible to post binaries for Windows using Python2.4?? > > If you are talking about full scipy, then I think you've caught most developers at a time when all energy is being spent porting full scipy to newcore. There will be binaries available when that is done. In the mean time, perhaps you will get help if you describe what is not going smoothly. I know there were/are issues with Python2.4 and distutils that create some problems. -Travis From oliphant at ee.byu.edu Wed Oct 12 16:59:07 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 14:59:07 -0600 Subject: [SciPy-user] New 0.4.2 beta release of scipy core Message-ID: <434D791B.9060809@ee.byu.edu> I made another beta release of scipy core last night. There are windows binaries for Python 2.4 and Python 2.3. If you are already a user of scipy, the new __init__ file installed for newcore will break your current installation of scipy (but the problem with linalg, fftpack, and stats is no longer there). There have been many improvements: - bug fixes (including 64-bit fixes) - threading support fixes - optimizations - more random numbers (thanks Robert Kern). - more distutils fixes (thanks Pearu Peterson). More tests are welcome. We are moving towards a release (but still need to get Masked Arrays working and all of scipy to build on top of the new scipy core before a full release). -Travis From Fernando.Perez at colorado.edu Wed Oct 12 18:36:12 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 12 Oct 2005 16:36:12 -0600 Subject: [SciPy-user] [SciPy-dev] New 0.4.2 beta release of scipy core In-Reply-To: <434D791B.9060809@ee.byu.edu> References: <434D791B.9060809@ee.byu.edu> Message-ID: <434D8FDC.2050306@colorado.edu> Travis Oliphant wrote: > I made another beta release of scipy core last night. There are > windows binaries for Python 2.4 and Python 2.3. If you are already a Sorry if I missed the blindingly obvious, but a URL for the tarballs (do they exist?) would be most appreciated at this point (I have a friend looking for it, and I'm embarrassed to say that I can't seem to find it). As a hook: this friend may test the build on Itanium boxes, so I'd like to respond to him soonish :) If it's just an SVN pull that's fine, just let me know and I'll pass the info along. Cheers, f From rkern at ucsd.edu Wed Oct 12 18:39:29 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 12 Oct 2005 15:39:29 -0700 Subject: [SciPy-user] [SciPy-dev] New 0.4.2 beta release of scipy core In-Reply-To: <434D8FDC.2050306@colorado.edu> References: <434D791B.9060809@ee.byu.edu> <434D8FDC.2050306@colorado.edu> Message-ID: <434D90A1.9040105@ucsd.edu> Fernando Perez wrote: > Travis Oliphant wrote: > >>I made another beta release of scipy core last night. 
There are >>windows binaries for Python 2.4 and Python 2.3. If you are already a > > Sorry if I missed the blindingly obvious, but a URL for the tarballs (do they > exist?) would be most appreciated at this point (I have a friend looking for > it, and I'm embarrassed to say that I can't seem to find it). As a hook: this > friend may test the build on Itanium boxes, so I'd like to respond to him > soonish :) If it's just an SVN pull that's fine, just let me know and I'll > pass the info along. http://numeric.scipy.org/ leads to the Sourceforge download page: http://sourceforge.net/project/showfiles.php?group_id=1369&package_id=6222 -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From Fernando.Perez at colorado.edu Wed Oct 12 18:43:33 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 12 Oct 2005 16:43:33 -0600 Subject: [SciPy-user] [SciPy-dev] New 0.4.2 beta release of scipy core In-Reply-To: <434D90A1.9040105@ucsd.edu> References: <434D791B.9060809@ee.byu.edu> <434D8FDC.2050306@colorado.edu> <434D90A1.9040105@ucsd.edu> Message-ID: <434D9195.1000603@colorado.edu> Robert Kern wrote: > Fernando Perez wrote: > >>Travis Oliphant wrote: >> >> >>>I made another beta release of scipy core last night. There are >>>windows binaries for Python 2.4 and Python 2.3. If you are already a >> >>Sorry if I missed the blindingly obvious, but a URL for the tarballs (do they >>exist?) would be most appreciated at this point (I have a friend looking for >>it, and I'm embarrassed to say that I can't seem to find it). As a hook: this >>friend may test the build on Itanium boxes, so I'd like to respond to him >>soonish :) If it's just an SVN pull that's fine, just let me know and I'll >>pass the info along. > > > http://numeric.scipy.org/ leads to the Sourceforge download page: > > http://sourceforge.net/project/showfiles.php?group_id=1369&package_id=6222 Thanks. Obvious in retrospect, but I have to admit I hadn't updated my bookmarks to the new project page. Cheers, f From myeates at jpl.nasa.gov Wed Oct 12 20:06:34 2005 From: myeates at jpl.nasa.gov (Mathew Yeates) Date: Wed, 12 Oct 2005 17:06:34 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: <434D6D91.2010408@ee.byu.edu> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> Message-ID: <434DA50A.10803@jpl.nasa.gov> okay, I finally got everything installed. I think. Now when I run scipy.test() I get a whole bunch of output followed by python crashing Here are the 2 lines before the crash ----- check_simple_underdet_complex (scipy.linalg.decomp.test_decomp.test_svdvals) ... ok check_basic (scipy.io.array_import.test_array_import.test_numpyio) ... BOOM! Mathew Travis Oliphant wrote: >Mathew Yeates wrote: > > > >>Hi >> >>Would it be possible to post binaries for Windows using Python2.4?? >> >> >> >> >If you are talking about full scipy, then I think you've caught most >developers at a time when all energy is being spent porting full scipy >to newcore. There will be binaries available when that is done. In >the mean time, perhaps you will get help if you describe what is not >going smoothly. > >I know there were/are issues with Python2.4 and distutils that create >some problems. > >-Travis > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From oliphant at ee.byu.edu Wed Oct 12 21:16:54 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 19:16:54 -0600 Subject: [SciPy-user] begging for binaries In-Reply-To: <434DA50A.10803@jpl.nasa.gov> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> Message-ID: <434DB586.7010303@ee.byu.edu> Mathew Yeates wrote: > okay, I finally got everything installed. I think. > > Now when I run scipy.test() I get a whole bunch of output followed by > python crashing > Here are the 2 lines before the crash > ----- > check_simple_underdet_complex > (scipy.linalg.decomp.test_decomp.test_svdvals) ... > ok > check_basic (scipy.io.array_import.test_array_import.test_numpyio) ... > BOOM! This is a problem with scipy distutils picking up the wrong system IO library. I don't know why things changed in the move to Python 2.4, but for some reason, Python is being compiled against one system IO library, and scipy is being compiled against another. I don't know how to fix it, right way. Everything worked fine with Python 2.3. -Travis From rkern at ucsd.edu Wed Oct 12 21:26:21 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 12 Oct 2005 18:26:21 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: <434DB586.7010303@ee.byu.edu> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu> Message-ID: <434DB7BD.6050905@ucsd.edu> Travis Oliphant wrote: > Mathew Yeates wrote: > >>okay, I finally got everything installed. I think. >> >>Now when I run scipy.test() I get a whole bunch of output followed by >>python crashing >>Here are the 2 lines before the crash >>----- >>check_simple_underdet_complex >>(scipy.linalg.decomp.test_decomp.test_svdvals) ... >> ok >>check_basic (scipy.io.array_import.test_array_import.test_numpyio) ... >>BOOM! > > This is a problem with scipy distutils picking up the wrong system IO > library. I don't know why things changed in the move to Python 2.4, > but for some reason, Python is being compiled against one system IO > library, and scipy is being compiled against another. > > I don't know how to fix it, right way. Everything worked fine with > Python 2.3. The official Python 2.4 binaries are being compiled with MSVC 7 and against that runtime. John Hunter discovered a way to compile his matplotlib binaries with mingw, but no one has yet reported if they've tried this with scipy: http://mail.python.org/pipermail/python-list/2004-December/254826.html Some conflicting advice can be found at the bottom of this page: http://jove.prohosting.com/iwave/ipython/issues.html Please try those methods and let us know how it goes. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From sgarcia at olfac.univ-lyon1.fr Thu Oct 13 05:00:29 2005 From: sgarcia at olfac.univ-lyon1.fr (Samuel GARCIA) Date: Thu, 13 Oct 2005 11:00:29 +0200 Subject: [SciPy-user] Newbie and Old matlab user Message-ID: <434E222D.7010707@olfac.univ-lyon1.fr> hello, I am a newbie I program with matlab and I try to change for scipy/ipython/pylab. A question : I am have 2 file source1.py and source2.py, in source1.py there is : from source2 import * and source1 use source2 functions Then in ipython console, with the command run source1 this execute my code. 
My problem is when I edit source1 and source2, run command take in account the modification in source1py, but don't take in account in the modofiction in source2.py Sorry, for this simple question. samuel -- Samuel GARCIA CNRS - UMR5020 Universite Claude Bernard LYON 1 Laboratoire des Neurosciences et Systemes Sensoriels 50, avenue Tony Garnier 69366 LYON Cedex 07 04 37 28 74 64 From noel.oboyle2 at mail.dcu.ie Thu Oct 13 05:06:48 2005 From: noel.oboyle2 at mail.dcu.ie (Noel O'Boyle) Date: Thu, 13 Oct 2005 10:06:48 +0100 Subject: [SciPy-user] Newbie and Old matlab user In-Reply-To: <434E222D.7010707@olfac.univ-lyon1.fr> References: <434E222D.7010707@olfac.univ-lyon1.fr> Message-ID: <1129194408.31444.117.camel@sandwi.ch.cam.ac.uk> That's true and it is a general problem I think. Try "rm source2.pyc" before the run statement. (I think you can use bash commands in ipython). This will cause the .pyc to be recreated from the .py source. Regards Noel On Thu, 2005-10-13 at 11:00 +0200, Samuel GARCIA wrote: > hello, > I am a newbie > I program with matlab and I try to change for scipy/ipython/pylab. > > A question : > > I am have 2 file source1.py and source2.py, > in source1.py there is : > from source2 import * > and source1 use source2 functions > > Then in ipython console, with the command > run source1 > this execute my code. > > My problem is when I edit source1 and source2, run command take in > account the modification in source1py, > but don't take in account in the modofiction in source2.py > > Sorry, for this simple question. > > samuel > From sgarcia at olfac.univ-lyon1.fr Thu Oct 13 05:18:33 2005 From: sgarcia at olfac.univ-lyon1.fr (Samuel GARCIA) Date: Thu, 13 Oct 2005 11:18:33 +0200 Subject: [SciPy-user] Newbie and Old matlab user In-Reply-To: <1129194408.31444.117.camel@sandwi.ch.cam.ac.uk> References: <434E222D.7010707@olfac.univ-lyon1.fr> <1129194408.31444.117.camel@sandwi.ch.cam.ac.uk> Message-ID: <434E2669.4070409@olfac.univ-lyon1.fr> An HTML attachment was scrubbed... URL: From jaroslav.stark at imperial.ac.uk Thu Oct 13 06:40:57 2005 From: jaroslav.stark at imperial.ac.uk (J. Stark) Date: Thu, 13 Oct 2005 11:40:57 +0100 Subject: [SciPy-user] Weave under OSX 10.4 In-Reply-To: References: Message-ID: >Robert Kern wrote: >J. Stark wrote: >> I haven't used weave for a while, and in particular not since I >> upgraded to OSX 10.4. Now I find that scripts which worked fine >> previously fail to compile - the error messages start with >> >> ... Numeric/arrayobject.h : No such file or directory >> >> A search of my hard disk fails to find arrayobject.h. >> >> Is this a know problem, or am I doing something stupid? I have set >> gcc to 3.3, rather than the default 4, and run the Tiger Python fix >> from http://undefined.org/python/ > >You installation of Numeric is broken (and BTW, yes, you should stick >with gcc 3.3). Are you using the system Python 2.3.5? If so, then I >suspect that when /System/Library/Frameworks/Python.framework got >upgraded, the Numeric headers that were installed there went *poof*, too. > >Reinstall Numeric, and the headers should be installed again. > >-- >Robert Kern >rkern at ucsd.edu Thanks. Reinstalling Numeric did indeed fix things. I did a 10.4 Archive and Install, and it looks as if this installs a fresh Python.framework, and fails to copy any customization from the previous System. Thanks for your help. J. 
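A quick check would have shortened the header hunt above: Numeric's setup.py installs its headers through distutils' install_headers, so on a standard install the file weave is looking for sits under the interpreter's include directory. A minimal sketch, with the caveat that the exact location is an assumption and relocated or framework builds on OSX can differ:

    # look for the header weave needs at the standard distutils location;
    # this path is an assumption -- relocated or framework installs differ
    import sys, os
    header = os.path.join(sys.prefix, 'include',
                          'python%d.%d' % sys.version_info[:2],
                          'Numeric', 'arrayobject.h')
    print header, '->', os.path.exists(header) and 'found' or 'MISSING'

If it prints MISSING, reinstalling Numeric (python setup.py install) puts the headers back, which is exactly what resolved things in this case.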
From Jonathan.Peirce at nottingham.ac.uk Thu Oct 13 07:47:32 2005 From: Jonathan.Peirce at nottingham.ac.uk (Jon Peirce) Date: Thu, 13 Oct 2005 12:47:32 +0100 Subject: [SciPy-user] special functions in scipy 0.4 In-Reply-To: References: Message-ID: <434E4954.3030009@psychology.nottingham.ac.uk> Travis (or others?) will the *special* subpackage (special.erf etc...) be appearing in scipy 0.4x? Is it there already somewhere hidden? Or maybe I need to fetch it separately from scipy0.3? More generally is there a sensible way for users to combine Scipy Core with previous Scipy (install the old one first then install new one over top?) Jon -- Jon Peirce Nottingham University +44 (0)115 8467176 (tel) +44 (0)115 9515324 (fax) http://www.peirce.org.uk/ This message has been checked for viruses but the contents of an attachment may still contain software viruses, which could damage your computer system: you are advised to perform your own checks. Email communications with the University of Nottingham may be monitored as permitted by UK legislation. From aisaac at american.edu Thu Oct 13 09:29:51 2005 From: aisaac at american.edu (Alan G Isaac) Date: Thu, 13 Oct 2005 09:29:51 -0400 Subject: [SciPy-user] Newbie and Old matlab user In-Reply-To: <434E222D.7010707@olfac.univ-lyon1.fr> References: <434E222D.7010707@olfac.univ-lyon1.fr> Message-ID: On Thu, 13 Oct 2005, Samuel CIA apparently wrote: > in source1.py there is : > from source2 import * > and source1 use source2 functions ... > My problem is when I edit source1 and source2, run command take in > account the modification in source1py, > but don't take in account in the modofiction in source2.py To save time when working at the interpreter, modules must be explicitly reloaded: http://docs.python.org/lib/built-in-funcs.html#reload hth, Alan Isaac From sgarcia at olfac.univ-lyon1.fr Thu Oct 13 10:02:41 2005 From: sgarcia at olfac.univ-lyon1.fr (Samuel GARCIA) Date: Thu, 13 Oct 2005 16:02:41 +0200 Subject: [SciPy-user] Newbie and Old matlab user In-Reply-To: References: <434E222D.7010707@olfac.univ-lyon1.fr> Message-ID: <434E6901.2080209@olfac.univ-lyon1.fr> An HTML attachment was scrubbed... URL: From Fernando.Perez at colorado.edu Thu Oct 13 10:25:43 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 13 Oct 2005 08:25:43 -0600 Subject: [SciPy-user] Newbie and Old matlab user In-Reply-To: <434E6901.2080209@olfac.univ-lyon1.fr> References: <434E222D.7010707@olfac.univ-lyon1.fr> <434E6901.2080209@olfac.univ-lyon1.fr> Message-ID: <434E6E67.2050506@colorado.edu> Samuel GARCIA wrote: > thank you. > > but howto deal with th case from source2 import * ? > > > And the case import source2 does the reload function mean that I have to type : > reload(source2) > in the file source1.py > each time I modify source2.py and then delete this line before typing : > run source1 > in ipython > > It's llok difficult ! just to clarify a bit. Yes, module reloading can be a bit annoying in python, unfortunately. If you are working on s1.py but also on s2.py, and s1.py is your top-level file, the easiest solution (which I use a lot) is just to put import s2 reload(s2) into s1.py. You _don't_ have to delete the reload line, at least not until you know you are done with this project. It simply means that s2 is re-imported on every run of s1, but that's typically not a big deal (unless s2 is enormously complex). If s2 further requires things which may change, it gets worse. 
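To make the basic single-level pattern concrete before going further, here is a minimal pair of files (the names s1.py and s2.py are just placeholders for the poster's source1.py and source2.py):

    # s2.py -- the module that keeps being edited
    def greet():
        print 'hello from s2'

    # s1.py -- the top-level script executed with ipython's %run
    import s2
    reload(s2)      # pick up any edits to s2.py on every run
    s2.greet()

Every '%run s1' now re-executes s2.py as well, so edits to either file show up immediately.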
You can try to put further reload() statements down the chain, or you can try to use ipython's recursive reloader. In s1: from IPython.deep_reload import reload import s2 reload(s2) This will _try_ to do a full, recursive reload of s2 and everything it needs. it's slower, and not 100% reliable (in particular, extension code can't be reloaded), but it can be a big time saver in many cases. Finally, the 'from s2 import *' idiom breaks all this, I think. You can try this, but I'm not sure it works (I think it doesn't): import s2 reload(s2) from s2 import * Cheers, f From rkern at ucsd.edu Thu Oct 13 10:31:06 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 13 Oct 2005 07:31:06 -0700 Subject: [SciPy-user] special functions in scipy 0.4 In-Reply-To: <434E4954.3030009@psychology.nottingham.ac.uk> References: <434E4954.3030009@psychology.nottingham.ac.uk> Message-ID: <434E6FAA.5020409@ucsd.edu> Jon Peirce wrote: > Travis (or others?) > > will the *special* subpackage (special.erf etc...) be appearing in scipy > 0.4x? Is it there already somewhere hidden? Or maybe I need to fetch it > separately from scipy0.3? > > More generally is there a sensible way for users to combine Scipy Core > with previous Scipy (install the old one first then install new one over > top?) Don't worry, there will be scipy.special. We're still in the process of porting it. The porting is taking place in a branch on the SVN repository. The process to install the full scipy would be to install scipy_core and then install the full scipy, just as if they were separate packages. svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ I'm currently getting segfaults when trying to initialize the _cephes module, however. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From aisaac at american.edu Thu Oct 13 10:39:41 2005 From: aisaac at american.edu (Alan G Isaac) Date: Thu, 13 Oct 2005 10:39:41 -0400 Subject: [SciPy-user] Newbie and Old matlab user In-Reply-To: <434E6E67.2050506@colorado.edu> References: <434E222D.7010707@olfac.univ-lyon1.fr> <434E6901.2080209@olfac.univ-lyon1.fr><434E6E67.2050506@colorado.edu> Message-ID: On Thu, 13 Oct 2005, Fernando Perez apparently wrote: > Finally, the 'from s2 import *' idiom breaks all this, I think. You can try > this, but I'm not sure it works (I think it doesn't): The docs say: If a module imports objects from another module using from ... import ..., calling reload() for the other module does not redefine the objects imported from it -- one way around this is to re-execute the from statement, another is to use import and qualified names (module.name) instead. In any case, maybe the OP could just use execfile: http://docs.python.org/lib/built-in-funcs.html#execfile Cheers, Alan Isaac From Fernando.Perez at colorado.edu Thu Oct 13 11:39:13 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 13 Oct 2005 09:39:13 -0600 Subject: [SciPy-user] Newbie and Old matlab user In-Reply-To: References: <434E222D.7010707@olfac.univ-lyon1.fr> <434E6901.2080209@olfac.univ-lyon1.fr><434E6E67.2050506@colorado.edu> Message-ID: <434E7FA1.5070800@colorado.edu> Alan G Isaac wrote: > On Thu, 13 Oct 2005, Fernando Perez apparently wrote: > > >>Finally, the 'from s2 import *' idiom breaks all this, I think. You can try >>this, but I'm not sure it works (I think it doesn't): > > > The docs say: > If a module imports objects from another module > using from ... 
import ..., calling reload() for the > other module does not redefine the objects imported > from it -- one way around this is to re-execute the > from statement, another is to use import and > qualified names (module.name) instead. Thanks, that's what I vaguely remembered. One more reason not to use 'import *'... > In any case, maybe the OP could just use execfile: > http://docs.python.org/lib/built-in-funcs.html#execfile Yes, the only problem with this is that execfile requires a full path, so it's a bit less convenient than import. But while developing/debugging, it can do the trick. Cheers, f From sgarcia at olfac.univ-lyon1.fr Thu Oct 13 12:00:06 2005 From: sgarcia at olfac.univ-lyon1.fr (Samuel GARCIA) Date: Thu, 13 Oct 2005 18:00:06 +0200 Subject: [SciPy-user] Newbie and Old matlab user In-Reply-To: <434E7FA1.5070800@colorado.edu> References: <434E222D.7010707@olfac.univ-lyon1.fr> <434E6901.2080209@olfac.univ-lyon1.fr><434E6E67.2050506@colorado.edu> <434E7FA1.5070800@colorado.edu> Message-ID: <434E8486.50506@olfac.univ-lyon1.fr> An HTML attachment was scrubbed... URL: From Fernando.Perez at colorado.edu Thu Oct 13 12:22:10 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 13 Oct 2005 10:22:10 -0600 Subject: [SciPy-user] Newbie and Old matlab user In-Reply-To: <434E7FA1.5070800@colorado.edu> References: <434E222D.7010707@olfac.univ-lyon1.fr> <434E6901.2080209@olfac.univ-lyon1.fr><434E6E67.2050506@colorado.edu> <434E7FA1.5070800@colorado.edu> Message-ID: <434E89B2.4050807@colorado.edu> Fernando Perez wrote: > Alan G Isaac wrote: >>In any case, maybe the OP could just use execfile: >>http://docs.python.org/lib/built-in-funcs.html#execfile > > > Yes, the only problem with this is that execfile requires a full path, so it's > a bit less convenient than import. But while developing/debugging, it can do > the trick. I should clarify that execfile('/path/to/foo.py') is not quite equivalent to 'from foo import *'. In particular: - if foo declares an __all__ list of names, the import form only imports those names, while execfile brute-forces everything in the foo namespace up into that of the caller. - by default, python does NOT import underscore-prefixed names with underscore prefixes: In [1]: cat foo.py top = 0 _one = 1 __two = 2 In [2]: from foo import * In [3]: _one --------------------------------------------------------------------------- exceptions.NameError Traceback (most recent call last) /home/fperez/ NameError: name '_one' is not defined In [4]: __two --------------------------------------------------------------------------- exceptions.NameError Traceback (most recent call last) /home/fperez/ NameError: name '__two' is not defined In [5]: execfile('foo.py') In [6]: _one Out[6]: 1 In [7]: __two Out[7]: 2 So yes, for quick and dirty work execfile can be substituted for 'import *', but it's important to realize that they are NOT the same. Sorry if my previous post was imprecise. cheers, f From Fernando.Perez at colorado.edu Thu Oct 13 12:37:06 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 13 Oct 2005 10:37:06 -0600 Subject: [SciPy-user] Newbie and Old matlab user In-Reply-To: References: <434E222D.7010707@olfac.univ-lyon1.fr> <434E6901.2080209@olfac.univ-lyon1.fr><434E6E67.2050506@colorado.edu> Message-ID: <434E8D32.7050404@colorado.edu> Alan G Isaac wrote: > On Thu, 13 Oct 2005, Fernando Perez apparently wrote: > > >>Finally, the 'from s2 import *' idiom breaks all this, I think. 
You can try >>this, but I'm not sure it works (I think it doesn't): > > > The docs say: > If a module imports objects from another module > using from ... import ..., calling reload() for the > other module does not redefine the objects imported > from it -- one way around this is to re-execute the > from statement, another is to use import and > qualified names (module.name) instead. Just to clarify, my little trick does work around this stated limitation in the docs. I hadn't tested it when I sent the previous email (in a hurry to catch a bus :). Details in my other post. Cheers, f From Fernando.Perez at colorado.edu Thu Oct 13 12:37:25 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 13 Oct 2005 10:37:25 -0600 Subject: [SciPy-user] Newbie and Old matlab user In-Reply-To: <434E8486.50506@olfac.univ-lyon1.fr> References: <434E222D.7010707@olfac.univ-lyon1.fr> <434E6901.2080209@olfac.univ-lyon1.fr><434E6E67.2050506@colorado.edu> <434E7FA1.5070800@colorado.edu> <434E8486.50506@olfac.univ-lyon1.fr> Message-ID: <434E8D45.8050301@colorado.edu> Samuel GARCIA wrote: > thank a lot to Alan G Isaac and Fernando Perez > yours responses exactly anwser my questions and my poor knowledge in python! > > But I still have a conceptual problem of how to work interactively ! > How to do when I edit my source2.py when debugging and run again source1. > > execfile can be used instead off run but not instead off from ... import * > > In my case, using module.name is not a way, otherwise the code will be to heavy > to read. Keep in mind that python gives you many ways of shortening names: from your.very.nested.package import really_obnoxiously_long_name as x x.foo() x.bar() If you are going to call foo and bar a zillion times, you can either do: foo,bar = x.foo, x.bar or from your.very.nested.package.really_obnoxiously_long_name import foo, bar Note that the former gives you automatically reload capabilities, while the latter requires a little trick: import foo reload(foo) from foo import a,b,c Then, you do get a,b,c to properly refresh if foo changes. I should add that this trick (which I earlier said didn't work) does work even for *: import foo reload(foo) from foo import * I just tested it at an ipython prompt: In [18]: cat foo.py top = 1 In [19]: import foo In [20]: reload foo -------> reload(foo) Out[20]: In [21]: from foo import * In [22]: top Out[22]: 1 In [23]: cat foo.py top = 'surprise' In [24]: reload foo -------> reload(foo) Out[24]: In [25]: from foo import * In [26]: top Out[26]: 'surprise' Still, I would almost argue that the 'from foo import *' form is only acceptable for interactive work, where you really want to save up everey keystroke you can, and you are only testing ideas, visualizing data, etc. For code written in an editor and meant to survive, a bit of namespace protection will save you a LOT of work in the long run, believe me. Before anyone (John Hunter, I'm looking at you :) goes grepping the ipython sources: yes, I have used import * a lot in the past. I shouldn't. I'm older, wiser, and know better now. I wrote ipython in my 2nd week of learning python, when 'import *' looked like a really neat and convenient facility. It's just a trap, so avoid it. > How do people comming from matlab world ? I can't answer this, as I've never used matlab, but perhaps others may help you on that. 
Cheers, f From myeates at jpl.nasa.gov Thu Oct 13 13:24:55 2005 From: myeates at jpl.nasa.gov (Mathew Yeates) Date: Thu, 13 Oct 2005 10:24:55 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: <434DB7BD.6050905@ucsd.edu> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu> <434DB7BD.6050905@ucsd.edu> Message-ID: <434E9867.2030903@jpl.nasa.gov> it went nowhere. I have very little experience with scipy's build process. Python24 was released almost a year ago. Its a shame that it can't be used. Mathew Robert Kern wrote: >Travis Oliphant wrote: > > >>Mathew Yeates wrote: >> >> >> >>>okay, I finally got everything installed. I think. >>> >>>Now when I run scipy.test() I get a whole bunch of output followed by >>>python crashing >>>Here are the 2 lines before the crash >>>----- >>>check_simple_underdet_complex >>>(scipy.linalg.decomp.test_decomp.test_svdvals) ... >>>ok >>>check_basic (scipy.io.array_import.test_array_import.test_numpyio) ... >>>BOOM! >>> >>> >>This is a problem with scipy distutils picking up the wrong system IO >>library. I don't know why things changed in the move to Python 2.4, >>but for some reason, Python is being compiled against one system IO >>library, and scipy is being compiled against another. >> >>I don't know how to fix it, right way. Everything worked fine with >>Python 2.3. >> >> > >The official Python 2.4 binaries are being compiled with MSVC 7 and >against that runtime. John Hunter discovered a way to compile his >matplotlib binaries with mingw, but no one has yet reported if they've >tried this with scipy: > > http://mail.python.org/pipermail/python-list/2004-December/254826.html > >Some conflicting advice can be found at the bottom of this page: > > http://jove.prohosting.com/iwave/ipython/issues.html > >Please try those methods and let us know how it goes. > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rkern at ucsd.edu Thu Oct 13 13:27:24 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 13 Oct 2005 10:27:24 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: <434E9867.2030903@jpl.nasa.gov> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu> <434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> Message-ID: <434E98FC.30503@ucsd.edu> Mathew Yeates wrote: > it went nowhere. I have very little experience with scipy's build process. > Python24 was released almost a year ago. Its a shame that it can't be used. Can you give us some more information? Exactly what did you do? What didn't work? Did things fail to compile? link? run? -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From myeates at jpl.nasa.gov Thu Oct 13 13:33:18 2005 From: myeates at jpl.nasa.gov (Mathew Yeates) Date: Thu, 13 Oct 2005 10:33:18 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: <434E98FC.30503@ucsd.edu> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu> <434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> <434E98FC.30503@ucsd.edu> Message-ID: <434E9A5E.9060608@jpl.nasa.gov> Robert Kern wrote: >Mathew Yeates wrote: > > >>it went nowhere. I have very little experience with scipy's build process. >>Python24 was released almost a year ago. 
Its a shame that it can't be used. >> >> > >Can you give us some more information? Exactly what did you do? What >didn't work? Did things fail to compile? link? run? > > > Searched for references to msvcrt in scipy_distutils. Didn't find it. This isn't a problem I can solve. -------------- next part -------------- An HTML attachment was scrubbed... URL: From rkern at ucsd.edu Thu Oct 13 13:47:58 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 13 Oct 2005 10:47:58 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: <434E9A5E.9060608@jpl.nasa.gov> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu> <434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> <434E98FC.30503@ucsd.edu> <434E9A5E.9060608@jpl.nasa.gov> Message-ID: <434E9DCE.6010402@ucsd.edu> Mathew Yeates wrote: > Robert Kern wrote: > >>Mathew Yeates wrote: >> >>>it went nowhere. I have very little experience with scipy's build process. >>>Python24 was released almost a year ago. Its a shame that it can't be used. >> >>Can you give us some more information? Exactly what did you do? What >>didn't work? Did things fail to compile? link? run? >> > Searched for references to msvcrt in scipy_distutils. Didn't find it. > This isn't a problem I can solve. It's not a problem I can solve either, but I'm trying. Please try the following: Find the gcc specs file. If gcc.exe is %mingwpath%\bin\gcc.exe, and the version is %mingwversion%, then the specs file should be $mingwpath%\lib\gcc\%mingwversion%\specs . Change "-lmsvcrt" to "-lmsvcrt71". Now, edit scipy_distutils/mingw32ccompiler.py at around line 102 or so: """ # no additional libraries needed self.dll_libraries=[] return """ to """ # no additional libraries needed self.dll_libraries=['msvcrt71'] return """ -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From aisaac at american.edu Thu Oct 13 13:58:28 2005 From: aisaac at american.edu (Alan G Isaac) Date: Thu, 13 Oct 2005 13:58:28 -0400 Subject: [SciPy-user] begging for binaries In-Reply-To: <434E9867.2030903@jpl.nasa.gov> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu><434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu><434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> Message-ID: On Thu, 13 Oct 2005, Mathew Yeates apparently wrote: > I have very little experience with scipy's build process. > Python24 was released almost a year ago. Its a shame that > it can't be used. Are you aware that there is a Windows binary available for scipy_core? fwiw, Alan Isaac From rkern at ucsd.edu Thu Oct 13 13:58:24 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 13 Oct 2005 10:58:24 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu><434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu><434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> Message-ID: <434EA040.4070401@ucsd.edu> Alan G Isaac wrote: > On Thu, 13 Oct 2005, Mathew Yeates apparently wrote: > >>I have very little experience with scipy's build process. >>Python24 was released almost a year ago. Its a shame that >>it can't be used. > > Are you aware that there is a Windows binary available for > scipy_core? He's trying to get the full scipy working with Python 2.4. The problem lies in scipy.io. Much of the rest works fine. 
-- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From myeates at jpl.nasa.gov Thu Oct 13 13:58:51 2005 From: myeates at jpl.nasa.gov (Mathew Yeates) Date: Thu, 13 Oct 2005 10:58:51 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu> <434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> Message-ID: <434EA05B.6040005@jpl.nasa.gov> Python 2.4? Where? Alan G Isaac wrote: >On Thu, 13 Oct 2005, Mathew Yeates apparently wrote: > > >>I have very little experience with scipy's build process. >>Python24 was released almost a year ago. Its a shame that >>it can't be used. >> >> > >Are you aware that there is a Windows binary available for >scipy_core? > >fwiw, >Alan Isaac > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant at ee.byu.edu Thu Oct 13 14:00:55 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 13 Oct 2005 12:00:55 -0600 Subject: [SciPy-user] begging for binaries In-Reply-To: <434EA040.4070401@ucsd.edu> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu><434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu><434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> <434EA040.4070401@ucsd.edu> Message-ID: <434EA0D7.5070400@ee.byu.edu> Robert Kern wrote: >Alan G Isaac wrote: > > >>On Thu, 13 Oct 2005, Mathew Yeates apparently wrote: >> >> >> >>>I have very little experience with scipy's build process. >>>Python24 was released almost a year ago. Its a shame that >>>it can't be used. >>> >>> >>Are you aware that there is a Windows binary available for >>scipy_core? >> >> > >He's trying to get the full scipy working with Python 2.4. The problem >lies in scipy.io. Much of the rest works fine. > > > scipy.io is the only place where C-level writing to a file opened by Python itself is done. Apparently, the a different FILE structure is being used in the library linked to by Python, then the one linked to by scipy. -Travis From oliphant at ee.byu.edu Thu Oct 13 14:01:56 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 13 Oct 2005 12:01:56 -0600 Subject: [SciPy-user] begging for binaries In-Reply-To: <434EA05B.6040005@jpl.nasa.gov> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu> <434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> <434EA05B.6040005@jpl.nasa.gov> Message-ID: <434EA114.7050908@ee.byu.edu> Mathew Yeates wrote: > Python 2.4? Where? Rember this is the "new" scipy core. Full scipy is not yet ready for building on top of it. http://numeric.scipy.org -Travis From rkern at ucsd.edu Thu Oct 13 14:01:52 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 13 Oct 2005 11:01:52 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: <434EA05B.6040005@jpl.nasa.gov> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu> <434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> <434EA05B.6040005@jpl.nasa.gov> Message-ID: <434EA110.9090704@ucsd.edu> Mathew Yeates wrote: > Python 2.4? Where? 
At the download location on Sourceforge like all the rest. http://sourceforge.net/project/showfiles.php?group_id=1369&package_id=6222 -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From myeates at jpl.nasa.gov Thu Oct 13 20:03:39 2005 From: myeates at jpl.nasa.gov (Mathew Yeates) Date: Thu, 13 Oct 2005 17:03:39 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: <434E9DCE.6010402@ucsd.edu> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu> <434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> <434E98FC.30503@ucsd.edu> <434E9A5E.9060608@jpl.nasa.gov> <434E9DCE.6010402@ucsd.edu> Message-ID: <434EF5DB.7050305@jpl.nasa.gov> after following your instructions I get c:\msys\1.0.10\MinGW\bin\..\lib\gcc-lib\mingw32\3.2.3\..\..\..\..\mingw32\bin\ld .exe: cannot find -lmsvcrt71 I tried setting LD_LIBRARY_PATH to c:\windows\system32 where I found msvcrt71.dll but I got the same error Mathew Robert Kern wrote: >Mathew Yeates wrote: > > >>Robert Kern wrote: >> >> >> >>>Mathew Yeates wrote: >>> >>> >>> >>>>it went nowhere. I have very little experience with scipy's build process. >>>>Python24 was released almost a year ago. Its a shame that it can't be used. >>>> >>>> >>>Can you give us some more information? Exactly what did you do? What >>>didn't work? Did things fail to compile? link? run? >>> >>> >>> >>Searched for references to msvcrt in scipy_distutils. Didn't find it. >>This isn't a problem I can solve. >> >> > >It's not a problem I can solve either, but I'm trying. Please try the >following: > >Find the gcc specs file. If gcc.exe is %mingwpath%\bin\gcc.exe, and the >version is %mingwversion%, then the specs file should be >$mingwpath%\lib\gcc\%mingwversion%\specs . Change "-lmsvcrt" to >"-lmsvcrt71". > >Now, edit scipy_distutils/mingw32ccompiler.py at around line 102 or so: > >""" > # no additional libraries needed > self.dll_libraries=[] > return >""" > >to > >""" > # no additional libraries needed > self.dll_libraries=['msvcrt71'] > return >""" > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From myeates at jpl.nasa.gov Thu Oct 13 20:24:07 2005 From: myeates at jpl.nasa.gov (Mathew Yeates) Date: Thu, 13 Oct 2005 17:24:07 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: <434EF5DB.7050305@jpl.nasa.gov> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu> <434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> <434E98FC.30503@ucsd.edu> <434E9A5E.9060608@jpl.nasa.gov> <434E9DCE.6010402@ucsd.edu> <434EF5DB.7050305@jpl.nasa.gov> Message-ID: <434EFAA7.2060008@jpl.nasa.gov> Mathew Yeates wrote: I have msvcr71.dll (note the missing 't'). When I accounted for this I was able to compile and run scipy.test() without crashing. Mathew > after following your instructions I get > c:\msys\1.0.10\MinGW\bin\..\lib\gcc-lib\mingw32\3.2.3\..\..\..\..\mingw32\bin\ld > .exe: cannot find -lmsvcrt71 > > I tried setting LD_LIBRARY_PATH to c:\windows\system32 where I found > msvcrt71.dll > but I got the same error > > Mathew > > > Robert Kern wrote: > >>Mathew Yeates wrote: >> >> >>>Robert Kern wrote: >>> >>> >>> >>>>Mathew Yeates wrote: >>>> >>>> >>>> >>>>>it went nowhere. I have very little experience with scipy's build process. >>>>>Python24 was released almost a year ago. 
Its a shame that it can't be used. >>>>> >>>>> >>>>Can you give us some more information? Exactly what did you do? What >>>>didn't work? Did things fail to compile? link? run? >>>> >>>> >>>> >>>Searched for references to msvcrt in scipy_distutils. Didn't find it. >>>This isn't a problem I can solve. >>> >>> >> >>It's not a problem I can solve either, but I'm trying. Please try the >>following: >> >>Find the gcc specs file. If gcc.exe is %mingwpath%\bin\gcc.exe, and the >>version is %mingwversion%, then the specs file should be >>$mingwpath%\lib\gcc\%mingwversion%\specs . Change "-lmsvcrt" to >>"-lmsvcrt71". >> >>Now, edit scipy_distutils/mingw32ccompiler.py at around line 102 or so: >> >>""" >> # no additional libraries needed >> self.dll_libraries=[] >> return >>""" >> >>to >> >>""" >> # no additional libraries needed >> self.dll_libraries=['msvcrt71'] >> return >>""" >> >> >> > >------------------------------------------------------------------------ > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rkern at ucsd.edu Thu Oct 13 20:38:58 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 13 Oct 2005 17:38:58 -0700 Subject: [SciPy-user] begging for binaries In-Reply-To: <434EFAA7.2060008@jpl.nasa.gov> References: <434D6BA1.6030707@jpl.nasa.gov> <434D6D91.2010408@ee.byu.edu> <434DA50A.10803@jpl.nasa.gov> <434DB586.7010303@ee.byu.edu> <434DB7BD.6050905@ucsd.edu> <434E9867.2030903@jpl.nasa.gov> <434E98FC.30503@ucsd.edu> <434E9A5E.9060608@jpl.nasa.gov> <434E9DCE.6010402@ucsd.edu> <434EF5DB.7050305@jpl.nasa.gov> <434EFAA7.2060008@jpl.nasa.gov> Message-ID: <434EFE22.1050908@ucsd.edu> Mathew Yeates wrote: > Mathew Yeates wrote: > > I have msvcr71.dll (note the missing 't'). When I accounted for this I > was able to compile and run scipy.test() without crashing. My apologies for the typo. And hallelujah! -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From bgoli at sun.ac.za Fri Oct 14 03:32:16 2005 From: bgoli at sun.ac.za (Brett Olivier) Date: Fri, 14 Oct 2005 09:32:16 +0200 Subject: [SciPy-user] begging for binaries In-Reply-To: <434EF5DB.7050305@jpl.nasa.gov> References: <434D6BA1.6030707@jpl.nasa.gov> <434E9DCE.6010402@ucsd.edu> <434EF5DB.7050305@jpl.nasa.gov> Message-ID: <200510140932.16431.bgoli@sun.ac.za> I've only just caught this thread now but perhaps you should try the newer candidate MinGW with GCC 3.4.4 I've managed to compile SVN SciPy for Python 2.4 without problems ... 
perhaps I was just lucky ;-) +++ build environment +++ Python 2.4.2 (#67, Sep 28 2005, 12:41:11) [MSC v.1310 32 bit (Intel)] on win32 ATLAS 3.7.11 compiled with Cygwin+GCC 3.4.4 >>> scipy.__version__ '0.3.3_303.4573' >>> f2py2e.__version__.version '2.46.243_2019' >>> Numeric.__version__ '24.0b2' c:\work\scipy>gcc -v Reading specs from c:/mingw/bin/../lib/gcc/mingw32/3.4.4/specs gcc version 3.4.4 (mingw special) >>> scipy.test(level=10) ---------------------------------------------------------------------- Ran 1188 tests in 66.703s Brett On Friday 14 October 2005 02:03, Mathew Yeates wrote: > after following your instructions I get > c:\msys\1.0.10\MinGW\bin\..\lib\gcc-lib\mingw32\3.2.3\..\..\..\..\mingw32\b >in\ld .exe: cannot find -lmsvcrt71 > > I tried setting LD_LIBRARY_PATH to c:\windows\system32 where I found > msvcrt71.dll > but I got the same error > > Mathew > > Robert Kern wrote: > >Mathew Yeates wrote: > >>Robert Kern wrote: > >>>Mathew Yeates wrote: > >>>>it went nowhere. I have very little experience with scipy's build > >>>> process. Python24 was released almost a year ago. Its a shame that it > >>>> can't be used. > >>> > >>>Can you give us some more information? Exactly what did you do? What > >>>didn't work? Did things fail to compile? link? run? > >> > >>Searched for references to msvcrt in scipy_distutils. Didn't find it. > >>This isn't a problem I can solve. > > > >It's not a problem I can solve either, but I'm trying. Please try the > >following: > > > >Find the gcc specs file. If gcc.exe is %mingwpath%\bin\gcc.exe, and the > >version is %mingwversion%, then the specs file should be > >$mingwpath%\lib\gcc\%mingwversion%\specs . Change "-lmsvcrt" to > >"-lmsvcrt71". > > > >Now, edit scipy_distutils/mingw32ccompiler.py at around line 102 or so: > > > >""" > > # no additional libraries needed > > self.dll_libraries=[] > > return > >""" > > > >to > > > >""" > > # no additional libraries needed > > self.dll_libraries=['msvcrt71'] > > return > >""" -- Brett G. Olivier Postdoctoral Fellow Triple-J Group for Molecular Cell Physiology Stellenbosch University bgoli at sun dot ac dot za http://glue.jjj.sun.ac.za/~bgoli Tel +27-21-8082704 Fax +27-21-8085863 Mobile +27-82-7329306 Debian is the Jedi operating system: "Always two there are, a master and an apprentice". -- Simon Richter on debian-devel From d.howey at imperial.ac.uk Fri Oct 14 05:27:02 2005 From: d.howey at imperial.ac.uk (Howey, David A) Date: Fri, 14 Oct 2005 10:27:02 +0100 Subject: [SciPy-user] Newbie and Old matlab user Message-ID: <056D32E9B2D93B49B01256A88B3EB218766FE1@icex2.ic.ac.uk> Get hold of 'Learning Python' by Lutz, published by O'Reilly. I find it indispensible. 3 days sitting reading that cover to cover and I felt much more confident with python. Then read the ipython and numeric user guides. Dave ________________________________ From: scipy-user-bounces at scipy.net [mailto:scipy-user-bounces at scipy.net] On Behalf Of Samuel GARCIA Sent: 13 October 2005 17:00 To: SciPy Users List Subject: Re: [SciPy-user] Newbie and Old matlab user thank a lot to Alan G Isaac and Fernando Perez yours responses exactly anwser my questions and my poor knowledge in python! But I still have a conceptual problem of how to work interactively ! How to do when I edit my source2.py when debugging and run again source1. execfile can be used instead off run but not instead off from ... import * In my case, using module.name is not a way, otherwise the code will be to heavy to read. How do people comming from matlab world ? 
Samuel Garcia Fernando Perez a ?crit : Alan G Isaac wrote: On Thu, 13 Oct 2005, Fernando Perez apparently wrote: Finally, the 'from s2 import *' idiom breaks all this, I think. You can try this, but I'm not sure it works (I think it doesn't): The docs say: If a module imports objects from another module using from ... import ..., calling reload() for the other module does not redefine the objects imported from it -- one way around this is to re-execute the from statement, another is to use import and qualified names (module.name) instead. Thanks, that's what I vaguely remembered. One more reason not to use 'import *'... In any case, maybe the OP could just use execfile: http://docs.python.org/lib/built-in-funcs.html#execfile Yes, the only problem with this is that execfile requires a full path, so it's a bit less convenient than import. But while developing/debugging, it can do the trick. Cheers, f _______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user -- Samuel GARCIA CNRS - UMR5020 Universite Claude Bernard LYON 1 Laboratoire des Neurosciences et Systemes Sensoriels 50, avenue Tony Garnier 69366 LYON Cedex 07 04 37 28 74 64 -------------- next part -------------- An HTML attachment was scrubbed... URL: From ryanlists at gmail.com Fri Oct 14 12:43:24 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Fri, 14 Oct 2005 12:43:24 -0400 Subject: [SciPy-user] mat file problem in linux Message-ID: I have a *.mat file created by Matlab in Windows. Scipy in windows can open this file without a problem. I am now trying to run the same code in linux and I get this error: /home/ryan/datafit/bodetest.py 12 #load matlab data 13 #bob=scipy.io.read_array('iobodetest.txt') ---> 14 modeldict=scipy.io.loadmat('tmm_mat_model_params.mat') 15 bodedict=scipy.io.loadmat('bodetest.mat') 16 freq=bodedict['freq'] /usr/lib/python2.3/site-packages/scipy/io/mio.py in loadmat(name, dict, appendmat, basename) 690 if not (0 in test_vals): # MATLAB version 5 format 691 fid.rewind() --> 692 thisdict = _loadv5(fid,basename) 693 if dict is not None: 694 dict.update(thisdict) /usr/lib/python2.3/site-packages/scipy/io/mio.py in _loadv5(fid, basename) 634 dict[varname] = el 635 except EOFError: --> 636 break 637 return dict 638 /usr/lib/python2.3/site-packages/scipy/io/mio.py in _get_element(fid) 617 618 # handle miMatrix type --> 619 el, name = _parse_mimatrix(fid,numbytes) 620 return el, name 621 /usr/lib/python2.3/site-packages/scipy/io/mio.py in _parse_mimatrix(fid, bytes) 499 dclass, cmplx, nzmax =_parse_array_flags(fid) 500 dims = _get_element(fid)[0] --> 501 name = ''.join(asarray(_get_element(fid)[0]).astype('c')) 502 if dclass in mxArrays: 503 result, unused =_get_element(fid) TypeError: sequence item 0: expected string, array (scipy) found WARNING: Failure executing file: In [4]: scipy.__version__ Out[4]: '0.3.2' The version is the same. I actually have scipy installed on 3 different linux distros (the 3rd one isn't really working right now). The other 2 give this same error. The *.mat file is small, so I am attaching it. This particular linux is on fedora core 3 and I compiled everything from source and did the python setup.py install from the 0.3.2 tarball. This FC3 install in running python 2.3. The Windows version that can open the *.mat file is also python 2.3 with the binaries from the scipy website. 
The other linux distro with the same problem is a Ubuntu Hoary 5.04 with scipy from apt-get following John Hunter's previous post to this list. (The one that isn't really working is running Breezy 5.10. It gets stuck in a different spot. I will look into that.) So, is there something wrong with this mat file or do I need to do something different to make them cross-platform? I guess I thought matlab took care of that part. Thanks, Ryan -------------- next part -------------- A non-text attachment was scrubbed... Name: tmm_mat_model_params.mat Type: application/octet-stream Size: 600 bytes Desc: not available URL: From Fernando.Perez at colorado.edu Fri Oct 14 12:54:47 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 14 Oct 2005 10:54:47 -0600 Subject: [SciPy-user] mat file problem in linux In-Reply-To: References: Message-ID: <434FE2D7.80709@colorado.edu> Ryan Krauss wrote: > I have a *.mat file created by Matlab in Windows. Scipy in windows > can open this file without a problem. I am now trying to run the same > code in linux and I get this error: > > /home/ryan/datafit/bodetest.py > 12 #load matlab data > 13 #bob=scipy.io.read_array('iobodetest.txt') > ---> 14 modeldict=scipy.io.loadmat('tmm_mat_model_params.mat') > 15 bodedict=scipy.io.loadmat('bodetest.mat') > 16 freq=bodedict['freq'] [...] > TypeError: sequence item 0: expected string, array (scipy) found > WARNING: Failure executing file: > > In [4]: scipy.__version__ > Out[4]: '0.3.2' Whatever it was, it's been fixed in the last CVS before the SVN transition: In [2]: import scipy numerix Numeric 23.7 In [3]: modeldict=scipy.io.loadmat('tmm_mat_model_params.mat') In [4]: scipy.__version__ Out[4]: '0.3.3_309.4624' You can either update to that code base, for which I'm keeping this tarball available until we've fully moved over to newscipy/core: http://ipython.scipy.org/tmp/scipy_cvs_2005-07-29.tgz or try to directly update to newscipy/core, since the new codebase obviously has all the fixes up to that date (but may still be too alpha for your needs). Cheers, f From ryanlists at gmail.com Fri Oct 14 16:52:41 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Fri, 14 Oct 2005 16:52:41 -0400 Subject: [SciPy-user] mat file problem in linux In-Reply-To: <434FE2D7.80709@colorado.edu> References: <434FE2D7.80709@colorado.edu> Message-ID: Thanks Fernando. I was able to get things up and running from the tarball you posted. I ran into one or two other things that had to be switched from windows code to linux code. This may not be relevant after the new core is released, but one problem that had me confused is that in windows I can use scipy.zeros((n,n),'f') to get a floating point (double) matrix. In linux (at least on FC3), I have to make sure that n is an int (it was 2.0) and the type code needs to be 'd'. The 'f' flag seems to create the right matrix in linux but if you look at the type on an individual element, it is still an array with shape (). This caused errors about not being able to cast types when doing slice assignments. I do have one ipython question that came up while trying to run my code: I found the command "ipython -pylab -p scipy" in my in my bash history. If I use that to start ipython, my code runs fine. If I leave off the -p scipy flag I get another problem with the *.mat file: [ryan at localhost e]$ ipython -pylab Python 2.3.4 (#1, Oct 26 2004, 16:42:40) Type "copyright", "credits" or "license" for more information. IPython 0.6.15 -- An enhanced Interactive Python. ? 
-> Introduction to IPython's features. %magic -> Information about IPython's 'magic' % functions. help -> Python's own help system. object? -> Details about 'object'. ?object also works, ?? prints more. Welcome to pylab, a matplotlib-based Python environment. For more information, type 'help(pylab)'. In [1]: pwd Out[1]: '/mnt/e' In [2]: cd /home/ryan/datafit/ /home/ryan/datafit In [3]: run bodetest.py --------------------------------------------------------------------------- exceptions.TypeError Traceback (most recent call last) /home/ryan/datafit/bodetest.py 12 #load matlab data 13 #bob=scipy.io.read_array('iobodetest.txt') ---> 14 modeldict=scipy.io.loadmat('tmm_mat_model_params.mat') 15 bodedict=scipy.io.loadmat('bodetest.mat') 16 freq=bodedict['freq'] /usr/lib/python2.3/site-packages/scipy/io/mio.py in loadmat(name, dict, appendmat, basename) 745 if not (0 in test_vals): # MATLAB version 5 format 746 fid.rewind() --> 747 thisdict = _loadv5(fid,basename) 748 if dict is not None: 749 dict.update(thisdict) /usr/lib/python2.3/site-packages/scipy/io/mio.py in _loadv5(fid, basename) 688 dict[varname] = el 689 except EOFError: --> 690 break 691 return dict 692 /usr/lib/python2.3/site-packages/scipy/io/mio.py in _get_element(fid) 671 672 # handle miMatrix type --> 673 el, name = _parse_mimatrix(fid,numbytes) 674 return el, name 675 /usr/lib/python2.3/site-packages/scipy/io/mio.py in _parse_mimatrix(fid, bytes) 551 def _parse_mimatrix(fid,bytes): 552 dclass, cmplx, nzmax =_parse_array_flags(fid) --> 553 dims = _get_element(fid)[0] 554 name = asarray(_get_element(fid)[0]).tostring() 555 tupdims = tuple(dims[::-1]) /usr/lib/python2.3/site-packages/scipy/io/mio.py in _get_element(fid) 663 outarr = fid.read(numbytes,miDataTypes[dtype][2],c_is_b=1) 664 except KeyError: --> 665 raise ValueError, "Unknown data type" 666 mod8 = numbytes%8 667 if mod8: # skip past padding TypeError: unhashable type WARNING: Failure executing file: In [4]: scipy.__version__ Out[4]: '0.3.3_309.4626' What does the -p scipy command line argument do? (I must have found it somewhere but haven't been able to find it using google). And why does it save me from the error with the *.mat file? Both start up methods lead to the same version of scipy being loaded: Other method: [ryan at localhost e]$ ipython -pylab -p scipy Welcome to the SciPy Scientific Computing Environment. Python 2.3.4 (#1, Oct 26 2004, 16:42:40) Type "copyright", "credits" or "license" for more information. IPython 0.6.15 -- An enhanced Interactive Python. ? -> Introduction to IPython's features. %magic -> Information about IPython's 'magic' % functions. help -> Python's own help system. object? -> Details about 'object'. ?object also works, ?? prints more. IPython profile: scipy Welcome to pylab, a matplotlib-based Python environment. For more information, type 'help(pylab)'. In [1]: cd /home/ryan/datafit/ /home/ryan/datafit In [2]: run bodetest.py ...runs without error... In [3]: scipy.__version__ Out[3]: '0.3.3_309.4626' Thanks, Ryan On 10/14/05, Fernando Perez wrote: > Ryan Krauss wrote: > > I have a *.mat file created by Matlab in Windows. Scipy in windows > > can open this file without a problem. I am now trying to run the same > > code in linux and I get this error: > > > > /home/ryan/datafit/bodetest.py > > 12 #load matlab data > > 13 #bob=scipy.io.read_array('iobodetest.txt') > > ---> 14 modeldict=scipy.io.loadmat('tmm_mat_model_params.mat') > > 15 bodedict=scipy.io.loadmat('bodetest.mat') > > 16 freq=bodedict['freq'] > > [...] 
> > > TypeError: sequence item 0: expected string, array (scipy) found > > WARNING: Failure executing file: > > > > In [4]: scipy.__version__ > > Out[4]: '0.3.2' > > Whatever it was, it's been fixed in the last CVS before the SVN transition: > > In [2]: import scipy > numerix Numeric 23.7 > > In [3]: modeldict=scipy.io.loadmat('tmm_mat_model_params.mat') > > In [4]: scipy.__version__ > Out[4]: '0.3.3_309.4624' > > You can either update to that code base, for which I'm keeping this tarball > available until we've fully moved over to newscipy/core: > > http://ipython.scipy.org/tmp/scipy_cvs_2005-07-29.tgz > > or try to directly update to newscipy/core, since the new codebase obviously > has all the fixes up to that date (but may still be too alpha for your needs). > > Cheers, > > f > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From Fernando.Perez at colorado.edu Fri Oct 14 17:02:14 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 14 Oct 2005 15:02:14 -0600 Subject: [SciPy-user] mat file problem in linux In-Reply-To: References: <434FE2D7.80709@colorado.edu> Message-ID: <43501CD6.4020802@colorado.edu> Ryan Krauss wrote: > I do have one ipython question that came up while trying to run my > code: I found the command "ipython -pylab -p scipy" in my in my bash > history. If I use that to start ipython, my code runs fine. If I > leave off the -p scipy flag I get another problem with the *.mat file: > What does the -p scipy command line argument do? (I must have found > it somewhere but haven't been able to find it using google). And why > does it save me from the error with the *.mat file? Both start up > methods lead to the same version of scipy being loaded: ipython -p foo tells ipython to load the foo profile, which is a shorthand for saying "use the file ~/.ipython/ipythonrc-foo as your initialization file instead of ~/.ipython/ipythonrc". Since ipython config files can include one another, profiles typically are small extensions to your default config (so you always get the same colors, prompts, etc.) which tweak a few things. Have a look at ~/.ipython/ipythonrc-scipy and see what it does that may be changing things for you. Cheers, f From Fernando.Perez at colorado.edu Fri Oct 14 17:05:20 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 14 Oct 2005 15:05:20 -0600 Subject: [SciPy-user] mat file problem in linux In-Reply-To: <43501CD6.4020802@colorado.edu> References: <434FE2D7.80709@colorado.edu> <43501CD6.4020802@colorado.edu> Message-ID: <43501D90.50300@colorado.edu> Fernando Perez wrote: >>What does the -p scipy command line argument do? (I must have found >>it somewhere but haven't been able to find it using google). And why >>does it save me from the error with the *.mat file? Both start up >>methods lead to the same version of scipy being loaded: For future reference: - man ipython (under *nix) - ipython --help - http://ipython.scipy.org/doc/manual/node5.html are three ways (there's more) of getting a full, detailed list of all ipython command-line options with their description. 
Cheers, f From ryanlists at gmail.com Fri Oct 14 17:06:06 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Fri, 14 Oct 2005 17:06:06 -0400 Subject: [SciPy-user] mat file problem in linux In-Reply-To: <43501CD6.4020802@colorado.edu> References: <434FE2D7.80709@colorado.edu> <43501CD6.4020802@colorado.edu> Message-ID: I am pretty sure that it is the unmodified one that ipython installs. Here is the contents: # load our basic configuration with generic options include ipythonrc # import ... # Load SciPy by itself so that 'help scipy' works import_mod scipy # from ... import ... import_some # Now we load all of SciPy # from ... import * import_all scipy IPython.numutils # code execute print 'Welcome to the SciPy Scientific Computing Environment.' execute scipy.alter_numeric() # File with alternate printer system for Numeric Arrays. # Files in the 'Extensions' directory will be found by IPython automatically # (otherwise give the explicit path): execfile Extensions/numeric_formats.py Is the multiple step import doing somethings smarter than just import scipy? Ryan On 10/14/05, Fernando Perez wrote: > Ryan Krauss wrote: > > > I do have one ipython question that came up while trying to run my > > code: I found the command "ipython -pylab -p scipy" in my in my bash > > history. If I use that to start ipython, my code runs fine. If I > > leave off the -p scipy flag I get another problem with the *.mat file: > > > What does the -p scipy command line argument do? (I must have found > > it somewhere but haven't been able to find it using google). And why > > does it save me from the error with the *.mat file? Both start up > > methods lead to the same version of scipy being loaded: > > ipython -p foo > > tells ipython to load the foo profile, which is a shorthand for saying "use > the file ~/.ipython/ipythonrc-foo as your initialization file instead of > ~/.ipython/ipythonrc". > > Since ipython config files can include one another, profiles typically are > small extensions to your default config (so you always get the same colors, > prompts, etc.) which tweak a few things. Have a look at > ~/.ipython/ipythonrc-scipy and see what it does that may be changing things > for you. > > Cheers, > > f > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From Fernando.Perez at colorado.edu Fri Oct 14 17:12:15 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 14 Oct 2005 15:12:15 -0600 Subject: [SciPy-user] mat file problem in linux In-Reply-To: References: <434FE2D7.80709@colorado.edu> <43501CD6.4020802@colorado.edu> Message-ID: <43501F2F.10901@colorado.edu> Ryan Krauss wrote: > I am pretty sure that it is the unmodified one that ipython installs. > execute scipy.alter_numeric() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ try this, see what happens. Cheers, f From ryanlists at gmail.com Fri Oct 14 17:16:00 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Fri, 14 Oct 2005 17:16:00 -0400 Subject: [SciPy-user] mat file problem in linux In-Reply-To: <43501F2F.10901@colorado.edu> References: <434FE2D7.80709@colorado.edu> <43501CD6.4020802@colorado.edu> <43501F2F.10901@colorado.edu> Message-ID: OK, that worked. I started ipython without the -p scipy option and then imported scipy and ran the alter_numeric() command. My code then ran correctly just as it did when I started ipython with the -p scipy flag. What does this mean? 
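In case it helps someone hitting the same loadmat error later, the manual equivalent of what the profile did for me boils down to two lines (this is with the old scipy 0.3.x series, where alter_numeric() still exists):

import scipy
scipy.alter_numeric()    # the call ipythonrc-scipy executes at startup
modeldict = scipy.io.loadmat('tmm_mat_model_params.mat')    # now loads without the TypeError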
Ryan On 10/14/05, Fernando Perez wrote: > Ryan Krauss wrote: > > I am pretty sure that it is the unmodified one that ipython installs. > > > execute scipy.alter_numeric() > ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ > > try this, see what happens. > > Cheers, > > f > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From Fernando.Perez at colorado.edu Fri Oct 14 17:23:02 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 14 Oct 2005 15:23:02 -0600 Subject: [SciPy-user] mat file problem in linux In-Reply-To: References: <434FE2D7.80709@colorado.edu> <43501CD6.4020802@colorado.edu> <43501F2F.10901@colorado.edu> Message-ID: <435021B6.2000202@colorado.edu> Ryan Krauss wrote: > OK, that worked. I started ipython without the -p scipy option and > then imported scipy and ran the alter_numeric() command. My code then > ran correctly just as it did when I started ipython with the -p scipy > flag. > > What does this mean? In [3]: import scipy In [4]: scipy.alter_numeric? Type: builtin_function_or_method Base Class: String Form: Namespace: Interactive Docstring: alter_numeric() update the behavior of Numeric objects. 1. Change coercion rules so that multiplying by a scalar does not upcast. 2. Add index and mask slicing capability to Numeric arrays. 3. Add .M attribute to Numeric arrays for returning a Matrix 4. (TODO) Speed enhancements. This call changes the behavior for ALL Numeric arrays currently defined and to be defined in the future. The old behavior can be restored for ALL arrays using numeric_restore(). Cheers, f From ryanlists at gmail.com Sat Oct 15 04:22:08 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Sat, 15 Oct 2005 04:22:08 -0400 Subject: [SciPy-user] ubuntu gcc-4.0 vs. weave Message-ID: I have scipy installed on ubuntu linux 5.04 and 5.10 (I know, I need to pick an OS and stick with one - I am getting close). My test code runs perfectly in 5.04. The only thing that is not working in 5.10 is weave. I have a chunk of code with a switch for python or weave and if I set it to python everything is good. When I set it to weave I get this message: CompileError: error: Command "g++ -pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wstrict-prototypes -fPIC -I/usr/lib/python2.4/site-packages/weave -I/usr/lib/python2.4/site-packages/weave/scxx -I/usr/lib/python2.4/site-packages/weave/blitz-20001213 -I/usr/include/python2.4 -c /home/ryan/.python24_compiled/sc_7ca2e4310456126e0a9e56dc59b628f613.cpp -o /tmp/ryan/python24_intermediate/compiler_fcf4821de2ee87a8ad4d6579aeaebbc9/home/ryan/.python24_compiled/sc_7ca2e4310456126e0a9e56dc59b628f613.o" failed with exit status 1 WARNING: Failure executing file: I assume this is some gcc(or g++) 4.0 problem. I saw a post from August that Robert Kern updated to the new blitz and got 4.0 to work with weave. He mentioned something about ./configure. I installed the blitz++ package (1.0.....) from the ubuntu package manager. It put a lot of stuff in /usr/include/blitz. I did not see a configure file. I tried copying the blitz folder as Robert mentioned, but got the same message. I even tried going into /usr/bin and modifying the gcc and g++ symbolic links to point to version 3.4. This actually lead to a successful compile, but using the code caused some sort of IPP error from a python 2.4 thread. I am willing to work with ubuntu 5.04 if that is necessary, but if I can get over this last hurdle it would be nice to use 5.10. 
(I honestly don't know why I have gotten so much further with ubuntu's gcc-4.0 as opposed to fedora's.) Can anyone tell me how to make weave and ubuntu's gcc-4.0 work together? Can it be done? Thanks, Ryan From stefan at sun.ac.za Sat Oct 15 05:01:59 2005 From: stefan at sun.ac.za (Stefan van der Walt) Date: Sat, 15 Oct 2005 11:01:59 +0200 Subject: [SciPy-user] ubuntu gcc-4.0 vs. weave In-Reply-To: References: Message-ID: <20051015090158.GI2393@rajah> Hi Ryan On Sat, Oct 15, 2005 at 04:22:08AM -0400, Ryan Krauss wrote: > the blitz++ package (1.0.....) from the ubuntu package manager. It > put a lot of stuff in /usr/include/blitz. I did not see a configure > file. I tried copying the blitz folder as Robert mentioned, but got > the same message. I even tried going into /usr/bin and modifying the > gcc and g++ symbolic links to point to version 3.4. This actually > lead to a successful compile, but using the code caused some sort of > IPP error from a python 2.4 thread. The Blitz package that comes with Ubuntu (Breezy) is named blitz++ (1:0.8-4). Version 0.8 of Blitz++ does not compile with g++4. You will have to get the 0.9 sources from their homepage. I don't know if this will actually solve your problem, though. Regards St?fan From rainman.rocks at gmail.com Sat Oct 15 08:23:32 2005 From: rainman.rocks at gmail.com (rainman) Date: Sat, 15 Oct 2005 16:23:32 +0400 Subject: [SciPy-user] begging for binaries Message-ID: <1345025027.20051015162332@gmail.com> Hello scipy-user, > I know there were/are issues with Python2.4 and distutils that create > some problems. Wonderful. I just posted this a couple of weeks ago, and seems that everyone has already forgot. To succesfully build SciPy with Python 2.4 under Windows/MinGW, replace file \SciPy_complete-0.3.2\scipy_core\scipy_distutils\mingw32ccompiler.py with http://home.uic.tula.ru/~s001180/scipy-py24/mingw32ccompiler.py According to Travis, this has already been committed to SVN. Well, probably some notes should be added to site too? Maybe updating http://www.scipy.org/documentation/buildscipywin32.txt or including fix in downloadable source package? Sample binaries (work okay on my Athlon-64 2800+ / Windows XP (32-bit) SP2 box): http://home.uic.tula.ru/~s001180/scipy-py24/SciPy_complete-0.3.2.win32-py2.4-atlas3.6.0_WinNT_PIII.exe -- Best regards, rainman mailto:rainman.rocks at gmail.com From ryanlists at gmail.com Sat Oct 15 11:08:33 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Sat, 15 Oct 2005 11:08:33 -0400 Subject: [SciPy-user] ubuntu gcc-4.0 vs. weave In-Reply-To: <20051015090158.GI2393@rajah> References: <20051015090158.GI2393@rajah> Message-ID: Thanks Stefan. I guess I wasn't paying close enough attention to the package name. But it still didn't work. I downloaded 0.9 from their website and did a configure, then make lib, then make install, and copied the directory as before but get the same error: CompileError: error: Command "g++ -pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wstrict-prototypes -fPIC -I/usr/lib/python2.4/site-packages/weave -I/usr/lib/python2.4/site-packages/weave/scxx -I/usr/lib/python2.4/site-packages/weave/blitz-20001213 -I/usr/include/python2.4 -c /home/ryan/.python24_compiled/sc_7ca2e4310456126e0a9e56dc59b628f614.cpp -o /tmp/ryan/python24_intermediate/compiler_fcf4821de2ee87a8ad4d6579aeaebbc9/home/ryan/.python24_compiled/sc_7ca2e4310456126e0a9e56dc59b628f614.o" failed with exit status 1 WARNING: Failure executing file: Are you running scipy under Breezy? Does weave work for you? 
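For anyone who wants to reproduce this without my whole script, a minimal weave check is something like the following (just a sketch -- with scipy 0.3.x weave is importable as its own top-level package, and the C fragment is only an illustration):

import weave
a = 3
print weave.inline('return_val = 2*a;', ['a'])    # first call triggers a g++ compile; should print 6

which should be enough to exercise the same compile step that fails above.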
Thanks, Ryan On 10/15/05, Stefan van der Walt wrote: > Hi Ryan > > On Sat, Oct 15, 2005 at 04:22:08AM -0400, Ryan Krauss wrote: > > > the blitz++ package (1.0.....) from the ubuntu package manager. It > > put a lot of stuff in /usr/include/blitz. I did not see a configure > > file. I tried copying the blitz folder as Robert mentioned, but got > > the same message. I even tried going into /usr/bin and modifying the > > gcc and g++ symbolic links to point to version 3.4. This actually > > lead to a successful compile, but using the code caused some sort of > > IPP error from a python 2.4 thread. > > The Blitz package that comes with Ubuntu (Breezy) is named blitz++ > (1:0.8-4). Version 0.8 of Blitz++ does not compile with g++4. You > will have to get the 0.9 sources from their homepage. > > I don't know if this will actually solve your problem, though. > > Regards > St?fan > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > From dd55 at cornell.edu Sat Oct 15 12:43:01 2005 From: dd55 at cornell.edu (Darren Dale) Date: Sat, 15 Oct 2005 12:43:01 -0400 Subject: [SciPy-user] subversion repository In-Reply-To: <434D2B63.6020707@mecha.uni-stuttgart.de> References: <434D29C0.1060109@cs.kuleuven.ac.be> <434D2B63.6020707@mecha.uni-stuttgart.de> Message-ID: <200510151243.02043.dd55@cornell.edu> On Wednesday 12 October 2005 11:27 am, Nils Wagner wrote: > Giovanni Samaey wrote: > >Dear all, > > > >I am wondering if any centralized information will become available > >about how to check out scipy from svn (what exactly needs to be checked > >out and installed). > >I realize that all the information is scattered in a number of mails on > >this list, but I would like to see some place (e.g. on the website) > > updated to reflect the switch from cvs to svn. > >(I don't think I am alone on this one... It must be hard for new users > >to try SciPy with this situation.) Is there a problem with the svn server? I get the following error messages when I run these commands: > svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ svn: Unrecognized URL scheme for 'http://svn.scipy.org/svn/scipy_core/branches/newcore' > svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ svn: Unrecognized URL scheme for 'http://svn.scipy.org/svn/scipy/branches/newscipy' Darren From arnd.baecker at web.de Sat Oct 15 13:10:22 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sat, 15 Oct 2005 19:10:22 +0200 (CEST) Subject: [SciPy-user] subversion repository In-Reply-To: <200510151243.02043.dd55@cornell.edu> References: <434D29C0.1060109@cs.kuleuven.ac.be> <434D2B63.6020707@mecha.uni-stuttgart.de> <200510151243.02043.dd55@cornell.edu> Message-ID: Hi Darren, On Sat, 15 Oct 2005, Darren Dale wrote: > Is there a problem with the svn server? I get the following error messages > when I run these commands: > > > svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ > svn: Unrecognized URL scheme for > 'http://svn.scipy.org/svn/scipy_core/branches/newcore' > > > svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ > svn: Unrecognized URL scheme for > 'http://svn.scipy.org/svn/scipy/branches/newscipy' I just tried the above lines - no problems with them. Do you have a different machine from which you could try again? 
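One more thing worth checking is whether your svn client was built with http support at all: the repository access (RA) modules reported by svn should include ra_dav. Roughly (the exact wording differs between svn versions and builds):

svn --version
...
The following repository access (RA) modules are available:
* ra_dav : Module for accessing a repository via WebDAV (DeltaV) protocol.
  - handles 'http' scheme

If ra_dav is missing from that list, every http:// URL will fail with exactly this 'Unrecognized URL scheme' message.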
(svn --version gives for me version 1.2.3) Arnd From rkern at ucsd.edu Sat Oct 15 13:14:53 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sat, 15 Oct 2005 10:14:53 -0700 Subject: [SciPy-user] subversion repository In-Reply-To: <200510151243.02043.dd55@cornell.edu> References: <434D29C0.1060109@cs.kuleuven.ac.be> <434D2B63.6020707@mecha.uni-stuttgart.de> <200510151243.02043.dd55@cornell.edu> Message-ID: <4351390D.3070407@ucsd.edu> Darren Dale wrote: > Is there a problem with the svn server? I get the following error messages > when I run these commands: > >>svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ > > svn: Unrecognized URL scheme for > 'http://svn.scipy.org/svn/scipy_core/branches/newcore' > >>svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ > > svn: Unrecognized URL scheme for > 'http://svn.scipy.org/svn/scipy/branches/newscipy' Googling for "svn: Unrecognized URL scheme for" gets this: http://svn.haxx.se/users/archive-2005-06/0546.shtml I think it's a problem with how svn was built on your machine. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From dd55 at cornell.edu Sat Oct 15 14:59:17 2005 From: dd55 at cornell.edu (Darren Dale) Date: Sat, 15 Oct 2005 14:59:17 -0400 Subject: [SciPy-user] subversion repository In-Reply-To: <4351390D.3070407@ucsd.edu> References: <434D29C0.1060109@cs.kuleuven.ac.be> <200510151243.02043.dd55@cornell.edu> <4351390D.3070407@ucsd.edu> Message-ID: <200510151459.17510.dd55@cornell.edu> On Saturday 15 October 2005 1:14 pm, you wrote: > Darren Dale wrote: > > Is there a problem with the svn server? I get the following error > > messages > > > > when I run these commands: > >>svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ > > > > svn: Unrecognized URL scheme for > > 'http://svn.scipy.org/svn/scipy_core/branches/newcore' > > > >>svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ > > > > svn: Unrecognized URL scheme for > > 'http://svn.scipy.org/svn/scipy/branches/newscipy' > > Googling for "svn: Unrecognized URL scheme for" gets this: > > ? http://svn.haxx.se/users/archive-2005-06/0546.shtml > > I think it's a problem with how svn was built on your machine. Thanks Robert and Arnd. You are right, it was a subversion problem. Sorry for the noise. (For anyone interested, I was using subversion 1.2.3, and there was an incompatibility with neon-0.25.3. Dropping back to neon-0.24.7 and reinstalling subversion cleared things up.) Darren From rmb at obs.univ-lyon1.fr Mon Oct 17 03:28:03 2005 From: rmb at obs.univ-lyon1.fr (Roland Bacon) Date: Mon, 17 Oct 2005 09:28:03 +0200 Subject: [SciPy-user] Windows Binaries for Python 2.4 Message-ID: <20051017072611.97D8C4BB4F@obs.univ-lyon1.fr> I am new to this list. I have installed the 2.4 python windows version and I wonder if there are some plans to have a windows binary distribution for Python 2.4. If not, has anyone succeed to compile it in windows XP and would like to share the info. Thanks Roland From arnd.baecker at web.de Mon Oct 17 06:32:01 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Mon, 17 Oct 2005 12:32:01 +0200 (CEST) Subject: [SciPy-user] fftw, scipy ("old") install Message-ID: Hi, it seems that I failed with fftw on the opteron ;-) - scipy fft is (much) slower than the Numeric one (see output below). 
I installed fftw via:: tar xzf ../Sources/fftw-3.0.1.tar.gz cd fftw-3.0.1/ ./configure CFLAGS=-fPIC --prefix=$PHOME make make install and for `python scipy_core/scipy_distutils/system_info.py` I got (during the installation): dfftw_info: ( library_dirs = /scr/python/lib:/usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include:/scr/python/include ) dfftw not found NOT AVAILABLE dfftw_threads_info: ( library_dirs = /scr/python/lib:/usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include:/scr/python/include ) dfftw threads not found NOT AVAILABLE djbfft_info: ( library_dirs = /scr/python/lib:/usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include:/scr/python/include ) NOT AVAILABLE fftw_info: ( library_dirs = /scr/python/lib:/usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include:/scr/python/include ) ( paths: /scr/python/lib/libfftw3.a ) ( paths: /scr/python/include/fftw3.h ) ( library_dirs = /scr/python/lib:/usr/local/lib:/usr/lib ) FOUND: libraries = ['fftw3'] library_dirs = ['/scr/python/lib'] define_macros = [('SCIPY_FFTW3_H', None)] include_dirs = ['/scr/python/include'] fftw_threads_info: ( library_dirs = /scr/python/lib:/usr/local/lib:/usr/lib ) ( include_dirs = /usr/local/include:/usr/include:/scr/python/include ) fftw threads not found NOT AVAILABLE So I thought everything is ok. Reconsidering this after seeing the poor performance shows that, `nm _fftpack.so` gives me no symbols related to fftw. What did I do wrong? (I know I could have build with these additional options: --enable-float: Produces a single-precision version of FFTW (float) --enable-long-double: Produces a long-double precision --enable-threads --enable-sse2 But is it necessary?) And for future builds: Is there a way to check (on the python level) which fft is used *after* the installation? Best, Arnd Fast Fourier Transform ================================================= | real input | complex input ------------------------------------------------- size | scipy | Numeric | scipy | Numeric ------------------------------------------------- 100 | 0.07 | 0.06 | 0.94 | 0.06 (secs for 7000 calls) 1000 | 0.11 | 0.07 | 0.82 | 0.07 (secs for 2000 calls) 256 | 0.17 | 0.11 | 1.74 | 0.11 (secs for 10000 calls) 512 | 0.31 | 0.19 | 2.32 | 0.20 (secs for 10000 calls) 1024 | 0.05 | 0.04 | 0.36 | 0.03 (secs for 1000 calls) 2048 | 0.10 | 0.07 | 0.61 | 0.07 (secs for 1000 calls) 4096 | 0.10 | 0.11 | 0.57 | 0.13 (secs for 500 calls) 8192 | 0.23 | 0.49 | 1.26 | 0.48 (secs for 500 calls) ..... Multi-dimensional Fast Fourier Transform =================================================== | real input | complex input --------------------------------------------------- size | scipy | Numeric | scipy | Numeric --------------------------------------------------- 100x100 | 0.15 | 0.07 | 0.15 | 0.08 (secs for 100 calls) 1000x100 | 0.14 | 0.08 | 0.13 | 0.08 (secs for 7 calls) 256x256 | 0.14 | 0.08 | 0.14 | 0.07 (secs for 10 calls) 512x512 | 0.32 | 0.14 | 0.33 | 0.15 (secs for 3 calls) ..... 
Inverse Fast Fourier Transform =============================================== | real input | complex input ----------------------------------------------- size | scipy | Numeric | scipy | Numeric ----------------------------------------------- 100 | 0.07 | 0.15 | 1.26 | 0.15 (secs for 7000 calls) 1000 | 0.12 | 0.17 | 0.93 | 0.18 (secs for 2000 calls) 256 | 0.18 | 0.28 | 2.15 | 0.29 (secs for 10000 calls) 512 | 0.32 | 0.46 | 2.74 | 0.47 (secs for 10000 calls) 1024 | 0.05 | 0.07 | 0.41 | 0.08 (secs for 1000 calls) 2048 | 0.11 | 0.14 | 0.66 | 0.15 (secs for 1000 calls) 4096 | 0.10 | 0.19 | 0.59 | 0.19 (secs for 500 calls) 8192 | 0.23 | 0.68 | 1.30 | 0.66 (secs for 500 calls) ....... Inverse Fast Fourier Transform (real data) ================================== size | scipy | Numeric ---------------------------------- 100 | 0.08 | 0.17 (secs for 7000 calls) 1000 | 0.14 | 0.10 (secs for 2000 calls) 256 | 0.16 | 0.26 (secs for 10000 calls) 512 | 0.26 | 0.33 (secs for 10000 calls) 1024 | 0.05 | 0.05 (secs for 1000 calls) 2048 | 0.14 | 0.07 (secs for 1000 calls) 4096 | 0.14 | 0.06 (secs for 500 calls) 8192 | 0.28 | 0.19 (secs for 500 calls) .... Fast Fourier Transform (real data) ================================== size | scipy | Numeric ---------------------------------- 100 | 0.07 | 0.07 (secs for 7000 calls) 1000 | 0.11 | 0.05 (secs for 2000 calls) 256 | 0.18 | 0.12 (secs for 10000 calls) 512 | 0.30 | 0.18 (secs for 10000 calls) 1024 | 0.05 | 0.03 (secs for 1000 calls) 2048 | 0.10 | 0.04 (secs for 1000 calls) 4096 | 0.10 | 0.04 (secs for 500 calls) 8192 | 0.20 | 0.13 (secs for 500 calls) From pearu at scipy.org Mon Oct 17 05:41:09 2005 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 17 Oct 2005 04:41:09 -0500 (CDT) Subject: [SciPy-user] fftw, scipy ("old") install In-Reply-To: References: Message-ID: On Mon, 17 Oct 2005, Arnd Baecker wrote: > Hi, > > it seems that I failed with fftw on the opteron ;-) - > scipy fft is (much) slower than the Numeric one > (see output below). > > I installed fftw via:: > > tar xzf ../Sources/fftw-3.0.1.tar.gz > cd fftw-3.0.1/ > ./configure CFLAGS=-fPIC --prefix=$PHOME > make > make install The old scipy.fftpack supports only fftw2. I have a patch for fftw3 somewhere. I'll apply it as soon as I'll get a chance to work on fftpack. > Reconsidering this after seeing the poor performance > shows that, `nm _fftpack.so` gives me no symbols related to fftw. Yes, _fftpack.so should contain Fortran fftpack when fftw (v2) library is not available. > And for future builds: > Is there a way to check (on the python level) which fft is used > *after* the installation? Normally it should be unnecessary. However, scipy building scripts install config.py (old scipy) or __config__.py (newscipy) files containing the system information used in building scipy. Pearu From arnd.baecker at web.de Mon Oct 17 07:01:30 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Mon, 17 Oct 2005 13:01:30 +0200 (CEST) Subject: [SciPy-user] fftw, scipy ("old") install In-Reply-To: References: Message-ID: Hi Pearu, On Mon, 17 Oct 2005, Pearu Peterson wrote: > On Mon, 17 Oct 2005, Arnd Baecker wrote: > > it seems that I failed with fftw on the opteron ;-) - > > scipy fft is (much) slower than the Numeric one > > (see output below). > > > > I installed fftw via:: > > > > tar xzf ../Sources/fftw-3.0.1.tar.gz > > cd fftw-3.0.1/ > > ./configure CFLAGS=-fPIC --prefix=$PHOME > > make > > make install > > The old scipy.fftpack supports only fftw2. I have a patch for fftw3 > somewhere. 
I'll apply it as soon as I'll get a chance to work on fftpack. I thought it should work - at least that's how I understood your message http://aspn.activestate.com/ASPN/Mail/Message/scipy-user/2811157 So I went for svn co http://svn.scipy.org/svn/scipy/trunk scipy cd scipy svn co http://svn.scipy.org/svn/scipy_core/trunk/ scipy_core which gives: In [1]: import scipy In [2]: scipy.__version__ Out[2]: '0.3.3_303.4573' Was this the wrong branch etc.? > > Reconsidering this after seeing the poor performance > > shows that, `nm _fftpack.so` gives me no symbols related to fftw. > > Yes, _fftpack.so should contain Fortran fftpack when fftw (v2) library is > not available. > > > And for future builds: > > Is there a way to check (on the python level) which fft is used > > *after* the installation? > > Normally it should be unnecessary. However, scipy building scripts install > config.py (old scipy) or __config__.py (newscipy) files containing the > system information used in building scipy. I do think that this is important (so one can check, which capabilities are really there.) So thanks a lot for pointing out config.py - this contains what I was looking for! Best, Arnd From pearu at scipy.org Mon Oct 17 06:29:55 2005 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 17 Oct 2005 05:29:55 -0500 (CDT) Subject: [SciPy-user] fftw, scipy ("old") install In-Reply-To: References: Message-ID: On Mon, 17 Oct 2005, Arnd Baecker wrote: >> The old scipy.fftpack supports only fftw2. I have a patch for fftw3 >> somewhere. I'll apply it as soon as I'll get a chance to work on fftpack. > > I thought it should work - at least that's how I understood > your message > http://aspn.activestate.com/ASPN/Mail/Message/scipy-user/2811157 > > So I went for > svn co http://svn.scipy.org/svn/scipy/trunk scipy > cd scipy > svn co http://svn.scipy.org/svn/scipy_core/trunk/ scipy_core > > which gives: > In [1]: import scipy > In [2]: scipy.__version__ > Out[2]: '0.3.3_303.4573' > > Was this the wrong branch etc.? Oops, indeed, the patch for fftw3 is already applied. But I haven't tested it yet. Looking at system_info.py, get_info('fftw') still looks for only fftw2 libraries. So, fftw3 support needs to be finished. Pearu From meesters at uni-mainz.de Mon Oct 17 10:10:40 2005 From: meesters at uni-mainz.de (Christian Meesters) Date: Mon, 17 Oct 2005 16:10:40 +0200 Subject: [SciPy-user] force to scientific output Message-ID: <3e0fbbd5191f33d42841fbb45a548fa0@uni-mainz.de> Hi This is totally off-topic, but I guess chances are that some of you know an answer. I need to write an output file for a command line tool which requires a column of floats - in so called scientific notation (4.23e-2 instead of 0.0423). Is there a way to dump numbers like this to a file? TIA Christian From nwagner at mecha.uni-stuttgart.de Mon Oct 17 10:20:21 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 17 Oct 2005 16:20:21 +0200 Subject: [SciPy-user] force to scientific output In-Reply-To: <3e0fbbd5191f33d42841fbb45a548fa0@uni-mainz.de> References: <3e0fbbd5191f33d42841fbb45a548fa0@uni-mainz.de> Message-ID: <4353B325.4060502@mecha.uni-stuttgart.de> Christian Meesters wrote: >Hi > >This is totally off-topic, but I guess chances are that some of you >know an answer. >I need to write an output file for a command line tool which requires a >column of floats - in so called scientific notation (4.23e-2 instead of >0.0423). Is there a way to dump numbers like this to a file? 
> >TIA >Christian > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > from scipy import * a = rand(10) file=open('dump','w') io.write_array(file,a,precision=2) From meesters at uni-mainz.de Mon Oct 17 10:28:24 2005 From: meesters at uni-mainz.de (Christian Meesters) Date: Mon, 17 Oct 2005 16:28:24 +0200 Subject: [SciPy-user] force to scientific output In-Reply-To: <4353B325.4060502@mecha.uni-stuttgart.de> References: <3e0fbbd5191f33d42841fbb45a548fa0@uni-mainz.de> <4353B325.4060502@mecha.uni-stuttgart.de> Message-ID: <8b36dc3a19e9bc9c1b49c5e78d49c379@uni-mainz.de> Ouch! Stupid me. Sometimes it's better to just leave the lab, hm? Danke Dir, Christian On 17 Oct 2005, at 16:20, Nils Wagner wrote: > Christian Meesters wrote: >> Hi >> >> This is totally off-topic, but I guess chances are that some of you >> know an answer. >> I need to write an output file for a command line tool which requires >> a >> column of floats - in so called scientific notation (4.23e-2 instead >> of >> 0.0423). Is there a way to dump numbers like this to a file? >> >> TIA >> Christian >> >> _______________________________________________ >> SciPy-user mailing list >> SciPy-user at scipy.net >> http://www.scipy.net/mailman/listinfo/scipy-user >> > from scipy import * > a = rand(10) > file=open('dump','w') > io.write_array(file,a,precision=2) > > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From jdhunter at ace.bsd.uchicago.edu Mon Oct 17 10:26:51 2005 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Mon, 17 Oct 2005 09:26:51 -0500 Subject: [SciPy-user] force to scientific output In-Reply-To: <3e0fbbd5191f33d42841fbb45a548fa0@uni-mainz.de> (Christian Meesters's message of "Mon, 17 Oct 2005 16:10:40 +0200") References: <3e0fbbd5191f33d42841fbb45a548fa0@uni-mainz.de> Message-ID: <87slv0xlqs.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "Christian" == Christian Meesters writes: Christian> Hi This is totally off-topic, but I guess chances are Christian> that some of you know an answer. I need to write an Christian> output file for a command line tool which requires a Christian> column of floats - in so called scientific notation Christian> (4.23e-2 instead of 0.0423). Is there a way to dump Christian> numbers like this to a file? You need to use the '%e' format string. In [8]: '%e'%1.2 Out[8]: '1.200000e+00' You can control the precision In [11]: '%.2e'%1.2 Out[11]: '1.20e+00' You can convert an array and then write it as a list of new lines with list comprehensions In [15]: s = '\n'.join(['%e'%val for val in rand(5)]) In [16]: s Out[16]: '5.805749e-01\n8.112261e-01\n5.021213e-01\n5.141693e-01\n9.689626e-02' or use a helper function for this. matplotlib has save save('test3.out', x, fmt='%1.4e') # use exponential notation and scipy has scipy.io.write_array, which is crashing when I test with the new scipy. This could be me, since I rarely use this function, or perhaps a bug in the scipy migration(?) In [2]: import scipy.io In [3]: fh = file('somefile.dat', 'w') In [4]: x = scipy.array([1,2,3]) In [5]: scipy.io.write_array(fh, x, precision='%e') ------------------------------------------------------------ Traceback (most recent call last): File "", line 1, in ? 
File "/usr/lib/python2.4/site-packages/scipy/io/array_import.py", line 462, in write_array ss = suppress_small) File "/usr/lib/python2.4/site-packages/scipy/io/array_import.py", line 393, in str_array thistype = arr.typecode() AttributeError: 'scipy.ndarray' object has no attribute 'typecode' From Fernando.Perez at colorado.edu Mon Oct 17 14:58:46 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 17 Oct 2005 12:58:46 -0600 Subject: [SciPy-user] fftw, scipy ("old") install In-Reply-To: References: Message-ID: <4353F466.7020700@colorado.edu> On Oct 17, 2005, at 4:29 AM, Pearu Peterson wrote: > Oops, indeed, the patch for fftw3 is already applied. But I haven't > tested it yet. Looking at system_info.py, get_info('fftw') still looks > for > only fftw2 libraries. So, fftw3 support needs to be finished. If I recall correctly, an fftw3 patch was sent this summer by a user, and I applied it (back n the old_scipy CVS days). That patch actually had a bug in it, but I think you fixed the bug without reverting it, so the core fftw3 support should already be in (though it may still need some polishing). Cheers, f From arnd.baecker at web.de Mon Oct 17 16:15:32 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Mon, 17 Oct 2005 22:15:32 +0200 (CEST) Subject: [SciPy-user] fftw, scipy ("old") install In-Reply-To: <4353F466.7020700@colorado.edu> References: <4353F466.7020700@colorado.edu> Message-ID: On Mon, 17 Oct 2005, Fernando Perez wrote: > On Oct 17, 2005, at 4:29 AM, Pearu Peterson wrote: > > > Oops, indeed, the patch for fftw3 is already applied. But I haven't > > tested it yet. Looking at system_info.py, get_info('fftw') still looks > > for > > only fftw2 libraries. So, fftw3 support needs to be finished. > > If I recall correctly, an fftw3 patch was sent this summer by a user, > and I applied it (back n the old_scipy CVS days). That patch actually > had a bug in it, but I think you fixed the bug without reverting it, so > the core fftw3 support should already be in (though it may still need > some polishing). Hmm, but it did not work for me, it seems. Maybe I did something wrong (not building the right fftws etc?) OTOH, site-packages/scipy/config.py has fftw_info={'libraries': ['fftw3'], 'library_dirs': ['/scr/python/lib'], 'define_macros': [('SCIPY_FFTW3_H', None)], 'include_dirs': ['/scr/python/include']} OK, it seems to be not my day: nm _fftpack.so | grep fftw *does* show quite a few entries involving fftw (I have no idea what I grepped this morning ...) I am a bit puzzled now and see the following possibilities - for my build scipy does not use the fftw - I should have given some more flags when building fftw3 (including optimization) With my build it only created: libfftw3.a and libfftw3.la (Moving these out of the way does not change anything...) - some weird slow down on the opteron Alright, I will have to do further and better checking tomorrow - sorry for creating noise.... Best, Arnd From vbalko at gmail.com Mon Oct 17 16:47:31 2005 From: vbalko at gmail.com (Vlado Balko) Date: Mon, 17 Oct 2005 22:47:31 +0200 Subject: [SciPy-user] Windows Binaries for Python 2.4 Message-ID: <43540DE3.1080004@gmail.com> Hello, I tried to install scipy core 0.4.2 from windows binaries. Installation was succesfull. Then I tried some examples but I failed. It was no problem to import module from scipy import * but when I tried to import gui_thread or called optimize traceback comes up. Is there some installation guide to scipy core? from sources or not? 
What prerequisities is needed when installing from binaries? I`m using python 2.4.2 balky From rkern at ucsd.edu Mon Oct 17 16:50:58 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 17 Oct 2005 13:50:58 -0700 Subject: [SciPy-user] Windows Binaries for Python 2.4 In-Reply-To: <43540DE3.1080004@gmail.com> References: <43540DE3.1080004@gmail.com> Message-ID: <43540EB2.6000000@ucsd.edu> Vlado Balko wrote: > Hello, > > I tried to install scipy core 0.4.2 from windows binaries. Installation > was succesfull. Then I tried some examples but I failed. It was no > problem to import module > > from scipy import * > > but when I tried to import gui_thread or called optimize traceback comes > up. Is there some installation guide to scipy core? from sources or not? > What prerequisities is needed when installing from binaries? I`m using > python 2.4.2 scipy_core does not contain gui_thread or scipy.optimize. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From sgarcia at olfac.univ-lyon1.fr Tue Oct 18 05:38:24 2005 From: sgarcia at olfac.univ-lyon1.fr (Samuel GARCIA) Date: Tue, 18 Oct 2005 11:38:24 +0200 Subject: [SciPy-user] fromfile and offset Message-ID: <4354C290.6070400@olfac.univ-lyon1.fr> Does anyone have a trick to read a text file off data which have, for example, 10 lines of headers wich discripbe the data and then 3 columns off data. The documentation of fromfile function doesn't talk about offset. samuel -- Samuel GARCIA CNRS - UMR5020 Universite Claude Bernard LYON 1 Laboratoire des Neurosciences et Systemes Sensoriels 50, avenue Tony Garnier 69366 LYON Cedex 07 04 37 28 74 64 From arnd.baecker at web.de Tue Oct 18 08:38:08 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Tue, 18 Oct 2005 14:38:08 +0200 (CEST) Subject: [SciPy-user] fftw, scipy ("old") install In-Reply-To: References: <4353F466.7020700@colorado.edu> Message-ID: Moin, I spend the whole morning with fftw (hope none of my collaborators reads this). Status is as follows: a) fftw2 works: Multi-dimensional Fast Fourier Transform =================================================== | real input | complex input --------------------------------------------------- size | scipy | Numeric | scipy | Numeric --------------------------------------------------- 100x100 | 0.04 | 0.07 | 0.04 | 0.09 (secs for 100 calls) 1000x100 | 0.05 | 0.08 | 0.05 | 0.08 (secs for 7 calls) 256x256 | 0.05 | 0.08 | 0.06 | 0.09 (secs for 10 calls) 512x512 | 0.18 | 0.14 | 0.17 | 0.15 (secs for 3 calls) Installation on the opteron with: ./configure CFLAGS="-fPIC -O3 -fomit-frame-pointer -fno-schedule-insns -fschedule-insns2 -fstrict-aliasing" --prefix=$TSTHOME b) fftw3 works, but is slow: Multi-dimensional Fast Fourier Transform =================================================== | real input | complex input --------------------------------------------------- size | scipy | Numeric | scipy | Numeric --------------------------------------------------- 100x100 | 0.05 | 0.06 | 0.05 | 0.07 (secs for 100 calls) 1000x100 | 0.05 | 0.09 | 0.06 | 0.08 (secs for 7 calls) 256x256 | 0.10 | 0.09 | 0.10 | 0.08 (secs for 10 calls) 512x512 | 0.24 | 0.13 | 0.24 | 0.13 (secs for 3 calls) Already for 256x256 scipy.fft is slower than the Numeric one and it gets worse for 512x512 (and much worse for larger NxN). 
Installation on the opteron with ./configure CFLAGS="-fPIC -O3 -fomit-frame-pointer -fno-schedule-insns -fstrict-aliasing -mpreferred-stack-boundary=4" --prefix=$PHOME (Using "--enable-sse --enable-sse2 --enable-k7", "--enable-sse --enable-sse2", "--enable-sse" lead to compile errors). So, fftw3 *does* work for scipy, and compiled with the right flags the performance is much better than the one reported before. Still, it does not perform as good as it should. Does anyone have better compile flags for an opteron? Any ideas are very welcome! Best, Arnd Full results for fftw2 ---------------------- Fast Fourier Transform ================================================= | real input | complex input ------------------------------------------------- size | scipy | Numeric | scipy | Numeric ------------------------------------------------- 100 | 0.12 | 0.08 | 0.05 | 0.05 (secs for 7000 calls) 1000 | 0.05 | 0.07 | 0.06 | 0.08 (secs for 2000 calls) 256 | 0.09 | 0.11 | 0.26 | 0.15 (secs for 10000 calls) 512 | 0.13 | 0.19 | 0.14 | 0.20 (secs for 10000 calls) 1024 | 0.02 | 0.04 | 0.02 | 0.04 (secs for 1000 calls) 2048 | 0.04 | 0.07 | 0.06 | 0.07 (secs for 1000 calls) 4096 | 0.04 | 0.10 | 0.07 | 0.11 (secs for 500 calls) 8192 | 0.10 | 0.48 | 0.24 | 0.50 (secs for 500 calls) ..... Multi-dimensional Fast Fourier Transform =================================================== | real input | complex input --------------------------------------------------- size | scipy | Numeric | scipy | Numeric --------------------------------------------------- 100x100 | 0.04 | 0.07 | 0.04 | 0.09 (secs for 100 calls) 1000x100 | 0.05 | 0.08 | 0.05 | 0.08 (secs for 7 calls) 256x256 | 0.05 | 0.08 | 0.06 | 0.09 (secs for 10 calls) 512x512 | 0.18 | 0.14 | 0.17 | 0.15 (secs for 3 calls) ..... Inverse Fast Fourier Transform =============================================== | real input | complex input ----------------------------------------------- size | scipy | Numeric | scipy | Numeric ----------------------------------------------- 100 | 0.05 | 0.14 | 0.06 | 0.14 (secs for 7000 calls) 1000 | 0.05 | 0.17 | 0.09 | 0.18 (secs for 2000 calls) 256 | 0.10 | 0.28 | 0.12 | 0.29 (secs for 10000 calls) 512 | 0.13 | 0.48 | 0.18 | 0.46 (secs for 10000 calls) 1024 | 0.02 | 0.08 | 0.04 | 0.08 (secs for 1000 calls) 2048 | 0.04 | 0.15 | 0.07 | 0.16 (secs for 1000 calls) 4096 | 0.04 | 0.19 | 0.08 | 0.19 (secs for 500 calls) 8192 | 0.12 | 0.68 | 0.26 | 0.69 (secs for 500 calls) ....... Inverse Fast Fourier Transform (real data) ================================== size | scipy | Numeric ---------------------------------- 100 | 0.05 | 0.15 (secs for 7000 calls) 1000 | 0.05 | 0.10 (secs for 2000 calls) 256 | 0.09 | 0.24 (secs for 10000 calls) 512 | 0.13 | 0.32 (secs for 10000 calls) 1024 | 0.02 | 0.05 (secs for 1000 calls) 2048 | 0.04 | 0.07 (secs for 1000 calls) 4096 | 0.04 | 0.07 (secs for 500 calls) 8192 | 0.10 | 0.20 (secs for 500 calls) .... Fast Fourier Transform (real data) ================================== size | scipy | Numeric ---------------------------------- 100 | 0.05 | 0.07 (secs for 7000 calls) 1000 | 0.04 | 0.05 (secs for 2000 calls) 256 | 0.09 | 0.12 (secs for 10000 calls) 512 | 0.12 | 0.16 (secs for 10000 calls) 1024 | 0.01 | 0.03 (secs for 1000 calls) 2048 | 0.04 | 0.04 (secs for 1000 calls) 4096 | 0.03 | 0.05 (secs for 500 calls) 8192 | 0.09 | 0.15 (secs for 500 calls) ... 
Full results for fftw3 ---------------------- Fast Fourier Transform ================================================= | real input | complex input ------------------------------------------------- size | scipy | Numeric | scipy | Numeric ------------------------------------------------- 100 | 0.12 | 0.14 | 0.32 | 0.06 (secs for 7000 calls) 1000 | 0.04 | 0.07 | 0.36 | 0.08 (secs for 2000 calls) 256 | 0.09 | 0.11 | 0.66 | 0.11 (secs for 10000 calls) 512 | 0.15 | 0.19 | 0.94 | 0.19 (secs for 10000 calls) 1024 | 0.03 | 0.03 | 0.14 | 0.04 (secs for 1000 calls) 2048 | 0.04 | 0.07 | 0.27 | 0.07 (secs for 1000 calls) 4096 | 0.05 | 0.11 | 0.26 | 0.11 (secs for 500 calls) 8192 | 0.11 | 0.48 | 0.62 | 0.51 (secs for 500 calls) ..... Multi-dimensional Fast Fourier Transform =================================================== | real input | complex input --------------------------------------------------- size | scipy | Numeric | scipy | Numeric --------------------------------------------------- 100x100 | 0.05 | 0.06 | 0.05 | 0.07 (secs for 100 calls) 1000x100 | 0.05 | 0.09 | 0.06 | 0.08 (secs for 7 calls) 256x256 | 0.10 | 0.09 | 0.10 | 0.08 (secs for 10 calls) 512x512 | 0.24 | 0.13 | 0.24 | 0.13 (secs for 3 calls) ..... Inverse Fast Fourier Transform =============================================== | real input | complex input ----------------------------------------------- size | scipy | Numeric | scipy | Numeric ----------------------------------------------- 100 | 0.05 | 0.15 | 0.51 | 0.15 (secs for 7000 calls) 1000 | 0.04 | 0.18 | 0.44 | 0.17 (secs for 2000 calls) 256 | 0.09 | 0.28 | 0.92 | 0.28 (secs for 10000 calls) 512 | 0.16 | 0.46 | 1.20 | 0.46 (secs for 10000 calls) 1024 | 0.02 | 0.08 | 0.18 | 0.08 (secs for 1000 calls) 2048 | 0.04 | 0.14 | 0.30 | 0.15 (secs for 1000 calls) 4096 | 0.04 | 0.18 | 0.28 | 0.19 (secs for 500 calls) 8192 | 0.11 | 0.66 | 0.67 | 0.68 (secs for 500 calls) ....... Inverse Fast Fourier Transform (real data) ================================== size | scipy | Numeric ---------------------------------- 100 | 0.05 | 0.16 (secs for 7000 calls) 1000 | 0.06 | 0.09 (secs for 2000 calls) 256 | 0.10 | 0.25 (secs for 10000 calls) 512 | 0.13 | 0.33 (secs for 10000 calls) 1024 | 0.02 | 0.05 (secs for 1000 calls) 2048 | 0.05 | 0.06 (secs for 1000 calls) 4096 | 0.04 | 0.07 (secs for 500 calls) 8192 | 0.11 | 0.19 (secs for 500 calls) .... Fast Fourier Transform (real data) ================================== size | scipy | Numeric ---------------------------------- 100 | 0.05 | 0.07 (secs for 7000 calls) 1000 | 0.04 | 0.06 (secs for 2000 calls) 256 | 0.10 | 0.12 (secs for 10000 calls) 512 | 0.14 | 0.16 (secs for 10000 calls) 1024 | 0.03 | 0.02 (secs for 1000 calls) 2048 | 0.04 | 0.04 (secs for 1000 calls) 4096 | 0.04 | 0.04 (secs for 500 calls) 8192 | 0.09 | 0.14 (secs for 500 calls) ... From ryanlists at gmail.com Tue Oct 18 08:40:48 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 18 Oct 2005 08:40:48 -0400 Subject: [SciPy-user] fromfile and offset In-Reply-To: <4354C290.6070400@olfac.univ-lyon1.fr> References: <4354C290.6070400@olfac.univ-lyon1.fr> Message-ID: you might try scipy.io.read_array with lines=[10,-1] On 10/18/05, Samuel GARCIA wrote: > Does anyone have a trick to read a text file off data which have, for > example, 10 lines of headers wich discripbe the data and then 3 columns > off data. > > The documentation of fromfile function doesn't talk about offset. 
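Something along these lines, for instance (a sketch -- 'mydata.txt' is made up, and I am going from memory on the lines= semantics, so check the read_array docstring):

import scipy.io
data = scipy.io.read_array('mydata.txt', lines=[10,-1])    # skip the 10 header lines

or, keeping the header handling in plain Python:

f = open('mydata.txt')
header = [f.readline() for i in range(10)]    # the 10 descriptive lines
data = scipy.io.read_array(f)                 # read_array also accepts an open file object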
> > samuel > > > -- > Samuel GARCIA > CNRS - UMR5020 > Universite Claude Bernard LYON 1 > Laboratoire des Neurosciences et Systemes Sensoriels > 50, avenue Tony Garnier > 69366 LYON Cedex 07 > 04 37 28 74 64 > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From rolf.wester at ilt.fraunhofer.de Tue Oct 18 08:51:51 2005 From: rolf.wester at ilt.fraunhofer.de (Rolf Wester) Date: Tue, 18 Oct 2005 14:51:51 +0200 Subject: [SciPy-user] weave and MSVC .NET 2003 Message-ID: <4354EFE7.8050005@ilt.fraunhofer.de> Hello, I would like to use weave under XP with MSVC .NET 2003. My Python/weave code works under Linux/gcc but when I try it on XP I get the error message that MSVC Version 6 is needed. It seems that weave (or distutils ?) doesn't find the paths to my MSVC directories and/or that registry keys are not found. Is ist possible to pass path information to weave? Is it possible to build Python extensions using weave with MSVC .NET 2003 and is it possible to run this extension on a computer that has no MSVC installed? Thank you in advance Regards Rolf Wester From Jonathan.Peirce at nottingham.ac.uk Tue Oct 18 09:45:52 2005 From: Jonathan.Peirce at nottingham.ac.uk (Jon Peirce) Date: Tue, 18 Oct 2005 14:45:52 +0100 Subject: [SciPy-user] uint8 bug (Scipy core 0.4.1) In-Reply-To: References: Message-ID: <4354FC90.7060309@psychology.nottingham.ac.uk> Scipy arrays with dtype=uint8 or int8 seem to be mathematically-challenged on my machine (AMD64 WinXP running python 2.4.2, scipy core 0.4.1). Simple int (and various others) appear fine. >>>import scipy >>>xx=scipy.array([100,100,100],scipy.int8) >>>print xx.sum() 44 >>>xx=scipy.array([100,100,100],scipy.int) >>>print xx.sum() 300 anybody know why? -- Jon Peirce Nottingham University +44 (0)115 8467176 (tel) +44 (0)115 9515324 (fax) http://www.peirce.org.uk/ This message has been checked for viruses but the contents of an attachment may still contain software viruses, which could damage your computer system: you are advised to perform your own checks. Email communications with the University of Nottingham may be monitored as permitted by UK legislation. From ryanlists at gmail.com Tue Oct 18 11:00:07 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 18 Oct 2005 11:00:07 -0400 Subject: [SciPy-user] python+maxima+latex for symolic stuff Message-ID: A while ago I posted a question to this list about how to do symbolic stuff in open source software. I have mixed a few tools together and come up with something I am pretty happy with. It is still in early development, but I find it useful and was wondering if it would help anyone else or if anyone wanted to help me develop it. Basically, I write a LaTeX input document and when I want to do algebra I use a new environment: \begin{maxima} \parseopts{lhs='\theta_a-\theta_b=\theta_{RA}'} thr:Gp*v+M/(k+c*s) \label{thr1} \end{maxima} LaTeX never actually sees this environment. I use python as a pre-processor for LaTeX. Python takes out all of the maxima environments and builds an input script for maxima. Python calls Maxima and tells it to run the script python created, saving each equation to a seperate output file (all very small). 
Python then reads these output files and replaces the maxima environments with normal LaTeX equation environments with the output from maxima (it may have been possible to do this in a slightly cleaner way with pipes but I was originally developing it under windows and I am a hack). I could package it a little cleaner and include some simple examples if anyone is interested. Ryan From ryanlists at gmail.com Tue Oct 18 11:10:46 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 18 Oct 2005 11:10:46 -0400 Subject: [SciPy-user] python+maxima+latex for symolic stuff In-Reply-To: References: Message-ID: Sorry, I sort of left out that the main point of this is that when I do symbolic derivations I really like to look at "pretty" LaTeX output and I like to add explanations to the derivation as I go along. So, mixing LaTeX and symbolic capabilities is very important to me. On 10/18/05, Ryan Krauss wrote: > A while ago I posted a question to this list about how to do symbolic > stuff in open source software. I have mixed a few tools together and > come up with something I am pretty happy with. It is still in early > development, but I find it useful and was wondering if it would help > anyone else or if anyone wanted to help me develop it. > > Basically, I write a LaTeX input document and when I want to do > algebra I use a new environment: > \begin{maxima} > \parseopts{lhs='\theta_a-\theta_b=\theta_{RA}'} > thr:Gp*v+M/(k+c*s) > \label{thr1} > \end{maxima} > > LaTeX never actually sees this environment. I use python as a > pre-processor for LaTeX. Python takes out all of the maxima > environments and builds an input script for maxima. Python calls > Maxima and tells it to run the script python created, saving each > equation to a seperate output file (all very small). Python then > reads these output files and replaces the maxima environments with > normal LaTeX equation environments with the output from maxima (it may > have been possible to do this in a slightly cleaner way with pipes but > I was originally developing it under windows and I am a hack). > > I could package it a little cleaner and include some simple examples > if anyone is interested. > > Ryan > From aisaac at american.edu Tue Oct 18 12:35:53 2005 From: aisaac at american.edu (Alan Isaac) Date: Tue, 18 Oct 2005 12:35:53 -0400 (EDT) Subject: [SciPy-user] OT: python+maxima+latex for symolic stuff In-Reply-To: References: Message-ID: On Tue, 18 Oct 2005, Ryan Krauss wrote: > I use python as a pre-processor for LaTeX. Python takes > out all of the maxima environments and builds an input > script for maxima. Python calls Maxima and tells it to > run the script python created, saving each equation to a > seperate output file (all very small). Python then reads > these output files and replaces the maxima environments > with normal LaTeX equation environments with the output > from maxima I don't need this immediately but will remember it. What it highlights I think is the need for an equivalent of Python's subprocess module in LaTex. Btw, is there a Python equivalent to perltex? 
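For concreteness, the preprocessing Ryan describes is roughly the following (file names invented, the maxima batch flag is a guess, and the real script of course does proper per-environment bookkeeping):

import re, os

tex = open('report.tex').read()
bodies = re.findall(r'\\begin\{maxima\}(.*?)\\end\{maxima\}', tex, re.DOTALL)
open('report.mac', 'w').write('\n'.join(bodies))    # one batch script for maxima
os.system('maxima -b report.mac')    # batch flag is a guess; the real script has maxima write each result to its own file
# python then splices those outputs back in as ordinary equation environments
# before latex is run on the rewritten source

which is exactly the kind of external plumbing that a subprocess-like facility inside LaTeX would make unnecessary.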
Cheers, Alan Isaac From oliphant at ee.byu.edu Tue Oct 18 15:55:58 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 18 Oct 2005 13:55:58 -0600 Subject: [SciPy-user] uint8 bug (Scipy core 0.4.1) In-Reply-To: <4354FC90.7060309@psychology.nottingham.ac.uk> References: <4354FC90.7060309@psychology.nottingham.ac.uk> Message-ID: <4355534E.3030706@ee.byu.edu> Jon Peirce wrote: >Scipy arrays with dtype=uint8 or int8 seem to be >mathematically-challenged on my machine (AMD64 WinXP running python >2.4.2, scipy core 0.4.1). Simple int (and various others) appear fine. > > >>>import scipy > >>>xx=scipy.array([100,100,100],scipy.int8) > >>>print xx.sum() > 44 > >>>xx=scipy.array([100,100,100],scipy.int) > >>>print xx.sum() > 300 > > This is not a bug. In the first line, you are telling the computer to add up 8-bit integers. The result does not fit in an 8-bit integer --- thus you are computing modulo 256. I suspect you wanted for the first case. xx.sum(rtype=int) -- this will "reduce" using the long integer type on your platform. -Travis From sgarcia at olfac.univ-lyon1.fr Wed Oct 19 03:55:43 2005 From: sgarcia at olfac.univ-lyon1.fr (Samuel GARCIA) Date: Wed, 19 Oct 2005 09:55:43 +0200 Subject: [SciPy-user] Newbie and stats Message-ID: <4355FBFF.3010507@olfac.univ-lyon1.fr> Why stats.distributions.poisson.cdf(10,0) give nan stats.distributions.poisson.cdf(0,0) give nan an equivalent function of matlab gave me 1 for both. Samuel -- Samuel GARCIA CNRS - UMR5020 Universite Claude Bernard LYON 1 Laboratoire des Neurosciences et Systemes Sensoriels 50, avenue Tony Garnier 69366 LYON Cedex 07 04 37 28 74 64 From nwagner at mecha.uni-stuttgart.de Wed Oct 19 04:01:21 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 19 Oct 2005 10:01:21 +0200 Subject: [SciPy-user] Newbie and stats In-Reply-To: <4355FBFF.3010507@olfac.univ-lyon1.fr> References: <4355FBFF.3010507@olfac.univ-lyon1.fr> Message-ID: <4355FD51.1010302@mecha.uni-stuttgart.de> Samuel GARCIA wrote: >Why >stats.distributions.poisson.cdf(10,0) give nan >stats.distributions.poisson.cdf(0,0) give nan > >an equivalent function of matlab gave me 1 for both. > >Samuel > > > BTW, newscipy results in >>> stats.distributions.poisson.cdf(10,0) Traceback (most recent call last): File "", line 1, in ? File "/usr/local/lib/python2.4/site-packages/scipy/stats/distributions.py", line 3265, in cdf insert(output,cond,self._cdf(*goodargs)) File "/usr/local/lib/python2.4/site-packages/scipy/stats/distributions.py", line 3741, in _cdf k = floor(x) NameError: global name 'floor' is not defined Nils From sgarcia at olfac.univ-lyon1.fr Wed Oct 19 04:07:17 2005 From: sgarcia at olfac.univ-lyon1.fr (Samuel GARCIA) Date: Wed, 19 Oct 2005 10:07:17 +0200 Subject: [SciPy-user] Newbie and stats In-Reply-To: <4355FD51.1010302@mecha.uni-stuttgart.de> References: <4355FBFF.3010507@olfac.univ-lyon1.fr> <4355FD51.1010302@mecha.uni-stuttgart.de> Message-ID: <4355FEB5.4090106@olfac.univ-lyon1.fr> An HTML attachment was scrubbed... 
URL: From nwagner at mecha.uni-stuttgart.de Wed Oct 19 04:10:32 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 19 Oct 2005 10:10:32 +0200 Subject: [SciPy-user] Newbie and stats In-Reply-To: <4355FEB5.4090106@olfac.univ-lyon1.fr> References: <4355FBFF.3010507@olfac.univ-lyon1.fr> <4355FD51.1010302@mecha.uni-stuttgart.de> <4355FEB5.4090106@olfac.univ-lyon1.fr> Message-ID: <4355FF78.8050803@mecha.uni-stuttgart.de> Samuel GARCIA wrote: > In [38]: scipy.__version__ > Out[38]: '0.3.2' > >>> stats.distributions.poisson.cdf(0,0) nan >>> scipy.__version__ '0.3.3_303.4573' > > Nils Wagner a ?crit : >>Samuel GARCIA wrote: >> >>>Why >>>stats.distributions.poisson.cdf(10,0) give nan >>>stats.distributions.poisson.cdf(0,0) give nan >>> >>>an equivalent function of matlab gave me 1 for both. >>> >>>Samuel >>> >>> >>> >>> >>BTW, newscipy results in >> >> >>>>>stats.distributions.poisson.cdf(10,0) >>>>> >>Traceback (most recent call last): >> File "", line 1, in ? >> File >>"/usr/local/lib/python2.4/site-packages/scipy/stats/distributions.py", >>line 3265, in cdf >> insert(output,cond,self._cdf(*goodargs)) >> File >>"/usr/local/lib/python2.4/site-packages/scipy/stats/distributions.py", >>line 3741, in _cdf >> k = floor(x) >>NameError: global name 'floor' is not defined >> >>Nils >> >>_______________________________________________ >>SciPy-user mailing list >>SciPy-user at scipy.net >>http://www.scipy.net/mailman/listinfo/scipy-user >> > >-- >Samuel GARCIA >CNRS - UMR5020 >Universite Claude Bernard LYON 1 >Laboratoire des Neurosciences et Systemes Sensoriels >50, avenue Tony Garnier >69366 LYON Cedex 07 >04 37 28 74 64 > >------------------------------------------------------------------------ > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > From nwagner at mecha.uni-stuttgart.de Wed Oct 19 05:12:04 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 19 Oct 2005 11:12:04 +0200 Subject: [SciPy-user] construct orthogonal basis U of block Krylov subspace; [v a*v a^2*v ... a^(k+1)*v]; Message-ID: <43560DE4.4080502@mecha.uni-stuttgart.de> Hi all, Octave has a loadable function called krylov. http://www.octave.org/doc/Matrix-Factorizations.html Has someone written a similar function for scipy ? Any pointer would be appreciated. Cheers, Nils From Jonathan.Peirce at nottingham.ac.uk Wed Oct 19 11:55:05 2005 From: Jonathan.Peirce at nottingham.ac.uk (Jon Peirce) Date: Wed, 19 Oct 2005 16:55:05 +0100 Subject: [SciPy-user] operations on int8 arrays Message-ID: <43566C59.6090508@psychology.nottingham.ac.uk> An HTML attachment was scrubbed... URL: From rcsqtc at iiqab.csic.es Wed Oct 19 12:03:35 2005 From: rcsqtc at iiqab.csic.es (Ramon Crehuet) Date: Wed, 19 Oct 2005 18:03:35 +0200 Subject: [SciPy-user] appending rows to an array Message-ID: <43566E57.4070803@iiqab.csic.es> Dear all, I need to append rows to an array (many times). For example: array([[1,2,3], [4,5,6]]) and array([7,8,9]) should give: array([[1,2,3], [4,5,6],[7,8,9]]) I know I can do this using the tolist method to transform the arrays to lists, use append and then transform back to array, but is there a more efficient way to do that? I read the manual and found concatenate but I can't figure out how to use it... 
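What I am doing at the moment (it works, but the detour through lists feels clumsy and is probably slow when repeated many times) is essentially this, with a being the (2,3) array above and b the new row:

# (after the usual 'from scipy import *')
rows = a.tolist()
rows.append(b.tolist())
a = array(rows)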
Thanks in advance, Ramon From sgarcia at olfac.univ-lyon1.fr Wed Oct 19 12:10:58 2005 From: sgarcia at olfac.univ-lyon1.fr (Samuel GARCIA) Date: Wed, 19 Oct 2005 18:10:58 +0200 Subject: [SciPy-user] appending rows to an array In-Reply-To: <43566E57.4070803@iiqab.csic.es> References: <43566E57.4070803@iiqab.csic.es> Message-ID: <43567012.50007@olfac.univ-lyon1.fr> the function is : concatenate but more quickly : for row concatenate : In [14]: r_[array([[1,2,3], [4,5,6]]), array([[7,8,9]]) ] Out[14]: array([[1, 2, 3], [4, 5, 6], [7, 8, 9]]) for columns concatenate : In [22]: c_[array([[1,2,3], [4,5,6]]), array([[1],[2]]) ] Out[22]: array([[1, 2, 3, 1], [4, 5, 6, 2]]) Ramon Crehuet a ?crit : >Dear all, > I need to append rows to an array (many times). For example: >array([[1,2,3], [4,5,6]]) and array([7,8,9]) >should give: >array([[1,2,3], [4,5,6],[7,8,9]]) > >I know I can do this using the tolist method to transform the arrays to >lists, use append and then transform back to array, but is there a more >efficient way to do that? I read the manual and found concatenate but I >can't figure out how to use it... >Thanks in advance, >Ramon > > > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > -- Samuel GARCIA CNRS - UMR5020 Universite Claude Bernard LYON 1 Laboratoire des Neurosciences et Systemes Sensoriels 50, avenue Tony Garnier 69366 LYON Cedex 07 04 37 28 74 64 From joris at ster.kuleuven.ac.be Wed Oct 19 12:12:39 2005 From: joris at ster.kuleuven.ac.be (Joris De Ridder) Date: Wed, 19 Oct 2005 18:12:39 +0200 Subject: [SciPy-user] appending rows to an array In-Reply-To: <43566E57.4070803@iiqab.csic.es> References: <43566E57.4070803@iiqab.csic.es> Message-ID: <43567077.1080209@ster.kuleuven.ac.be> Hi Ramon, I guess this is what you are looking for: >>> x = array([[1,2,3], [4,5,6]]) >>> y = array([7,8,9]) >>> concatenate((x,[y])) array([[1, 2, 3], [4, 5, 6], [7, 8, 9]]) Cheers, Joris Ramon Crehuet wrote: >Dear all, > I need to append rows to an array (many times). For example: >array([[1,2,3], [4,5,6]]) and array([7,8,9]) >should give: >array([[1,2,3], [4,5,6],[7,8,9]]) > > Disclaimer: http://www.kuleuven.be/cwis/email_disclaimer.htm From stephen.walton at csun.edu Wed Oct 19 12:14:14 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 19 Oct 2005 09:14:14 -0700 Subject: [SciPy-user] appending rows to an array In-Reply-To: <43566E57.4070803@iiqab.csic.es> References: <43566E57.4070803@iiqab.csic.es> Message-ID: <435670D6.3050907@csun.edu> Ramon Crehuet wrote: >Dear all, > I need to append rows to an array (many times). > I know someone will correct me if I'm wrong here, but one of the things which makes arrays fast is that they have a fixed size. A MATLAB trick may help you: A = zeros((max_number_rows_I_need,50)); # do your thing with A, keeping track of how many rows you use. Call that nrows. Then A = A[:nrows,:] From vbalko at gmail.com Wed Oct 19 13:15:07 2005 From: vbalko at gmail.com (Vlado Balko) Date: Wed, 19 Oct 2005 19:15:07 +0200 Subject: [SciPy-user] gui_thread error Message-ID: <43567F1B.1070207@gmail.com> Hello, I installed SciPy 0.3.2 on Python 2.4 from binaries compiled for 2.4 (I found it here in mailing list). Then I installed wxPython 2.6.1.0 unicode But when I tried to run this code: # set up graphics capabilities and import scipy into the workspace import gui_thread gui_thread.start() from scipy import * # create the curve values. 
a = arange(pi,2*pi,.01) b = sin(a) # also find find the function minimum on the same range x = optimize.fminbound(sin,pi,2*pi) plt.plot(a,b,'-'); plt.hold('on'); plt.plot([x],[sin(x)],'ro'); plt.yaxis([-1.5,0]); I get this error Traceback (most recent call last): File "F:\Aplikacie\@@Editory@@\Eclipse\eclipse302\workspace\Skusky\scipy1.py", line 3, in ? gui_thread.start() File "E:\Program Files\Python\Lib\site-packages\gui_thread\__init__.py", line 73, in start wxPython_thread() File "E:\Program Files\Python\Lib\site-packages\gui_thread\wxPython_thread.py", line 160, in wxPython_thread sys.modules[name] = wrap_extmodule(module,call_holder) File "E:\Program Files\Python\Lib\site-packages\gui_thread\wxPython_thread.py", line 61, in wrap_extmodule raise NotImplementedError,`t` NotImplementedError: then I tried to run test script on gui_thread Python\Lib\site-packages\gui_thread\tests>test_gui_thread.py --- from standard install i get: Traceback (most recent call last): File "E:\Program Files\Python\Lib\site-packages\gui_thread\tests\test_gui_thre ad.py", line 349, in ? if not threaded(): File "E:\Program Files\Python\Lib\site-packages\gui_thread\tests\test_gui_thre ad.py", line 29, in threaded return gui_thread.main.running_in_second_thread AttributeError: 'module' object has no attribute 'main' What does it mean? Where is the fault? Balky From Fernando.Perez at colorado.edu Wed Oct 19 13:18:07 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 19 Oct 2005 11:18:07 -0600 Subject: [SciPy-user] gui_thread error In-Reply-To: <43567F1B.1070207@gmail.com> References: <43567F1B.1070207@gmail.com> Message-ID: <43567FCF.3060705@colorado.edu> Vlado Balko wrote: > Hello, > > I installed SciPy 0.3.2 on Python 2.4 from binaries compiled for 2.4 (I > found it here in mailing list). Then I installed wxPython 2.6.1.0 unicode > > But when I tried to run this code: > > # set up graphics capabilities and import scipy into the workspace > import gui_thread > gui_thread.start() > from scipy import * > # create the curve values. > a = arange(pi,2*pi,.01) > b = sin(a) > # also find find the function minimum on the same range > x = optimize.fminbound(sin,pi,2*pi) > plt.plot(a,b,'-'); Please note that scipy is undergoing a major reorganization, and in the process, the plt module has been deprecated and slated for removal. I suggest you look into using matplotlib for your plotting needs: http://matplotlib.sourceforge.net/ More plotting links can be found here: http://www.scipy.org/wikis/topical_software/TopicalSoftware cheers, f From vbalko at gmail.com Wed Oct 19 13:28:23 2005 From: vbalko at gmail.com (Vlado Balko) Date: Wed, 19 Oct 2005 19:28:23 +0200 Subject: [SciPy-user] gui_thread error In-Reply-To: <43567FCF.3060705@colorado.edu> References: <43567F1B.1070207@gmail.com> <43567FCF.3060705@colorado.edu> Message-ID: <43568237.1040604@gmail.com> So is there some working example and tutorial for 0.3.2 so I can learn myself basics? Only if I will know basics I can experiment with nonstandard things. I`m a little bit dissapointed with SciPy because I`m trying to find and learn it (and install it) for at least a week and still no success. I found it very interesting tool for my graduate work, but the troubles at the beginning are very demotivating. Balky Fernando Perez wrote: >Vlado Balko wrote: > > >>Hello, >> >>I installed SciPy 0.3.2 on Python 2.4 from binaries compiled for 2.4 (I >>found it here in mailing list). 
Then I installed wxPython 2.6.1.0 unicode >> >>But when I tried to run this code: >> >># set up graphics capabilities and import scipy into the workspace >>import gui_thread >>gui_thread.start() >>from scipy import * >># create the curve values. >>a = arange(pi,2*pi,.01) >>b = sin(a) >># also find find the function minimum on the same range >>x = optimize.fminbound(sin,pi,2*pi) >>plt.plot(a,b,'-'); >> >> > >Please note that scipy is undergoing a major reorganization, and in the >process, the plt module has been deprecated and slated for removal. I suggest >you look into using matplotlib for your plotting needs: > >http://matplotlib.sourceforge.net/ > >More plotting links can be found here: > >http://www.scipy.org/wikis/topical_software/TopicalSoftware > >cheers, > >f > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > > From Fernando.Perez at colorado.edu Wed Oct 19 14:21:12 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 19 Oct 2005 12:21:12 -0600 Subject: [SciPy-user] gui_thread error In-Reply-To: <43568237.1040604@gmail.com> References: <43567F1B.1070207@gmail.com> <43567FCF.3060705@colorado.edu> <43568237.1040604@gmail.com> Message-ID: <43568E98.2040804@colorado.edu> Vlado Balko wrote: > So is there some working example and tutorial for 0.3.2 so I can learn > myself basics? Only if I will know basics I can experiment with > nonstandard things. > I`m a little bit dissapointed with SciPy because I`m trying to find and > learn it (and install it) for at least a week and still no success. I > found it very interesting tool for my graduate work, but the troubles at > the beginning are very demotivating. Sorry to hear about your troubles: in all honesty, it's a bit of bad luck with timing on your part. You happen to be coming to it right in the middle of a MAJOR reorganization. The entire codebase is being updated, old (broken) tools are being removed, new ones are falling in place, etc. Trust me: things will stabilize soon and life will in fact be a lot easier than it used to be. You just happened to jump right in the middle of the storm. In the meantime, you can use the old scipy, but ignore gui_thread and the *plt plotting modules. Instead, have a look at matplotlib for plotting and things should be OK. Soon you'll be able to switch to the new system which should smooth out most of these rough edges. Cheers, f From yann.ledu at noos.fr Wed Oct 19 14:32:54 2005 From: yann.ledu at noos.fr (Yann Le Du) Date: Wed, 19 Oct 2005 20:32:54 +0200 (CEST) Subject: [SciPy-user] python+maxima+latex for symolic stuff Message-ID: Hi, Yes, I'm interested in having a go at this tool, looks great. -- Yann Le Du From vbalko at gmail.com Wed Oct 19 14:35:38 2005 From: vbalko at gmail.com (Vlado Balko) Date: Wed, 19 Oct 2005 20:35:38 +0200 Subject: [SciPy-user] gui_thread error In-Reply-To: <43568E98.2040804@colorado.edu> References: <43567F1B.1070207@gmail.com> <43567FCF.3060705@colorado.edu> <43568237.1040604@gmail.com> <43568E98.2040804@colorado.edu> Message-ID: <435691FA.8050806@gmail.com> Thank You, these are words I wanted to hear :) Now I will explore by myself and will try your suggestions. And thank you for patience once again. balky Fernando Perez wrote: >Vlado Balko wrote: > > >>So is there some working example and tutorial for 0.3.2 so I can learn >>myself basics? Only if I will know basics I can experiment with >>nonstandard things. 
>>I`m a little bit dissapointed with SciPy because I`m trying to find and >>learn it (and install it) for at least a week and still no success. I >>found it very interesting tool for my graduate work, but the troubles at >>the beginning are very demotivating. >> >> > >Sorry to hear about your troubles: in all honesty, it's a bit of bad luck with >timing on your part. You happen to be coming to it right in the middle of a >MAJOR reorganization. The entire codebase is being updated, old (broken) >tools are being removed, new ones are falling in place, etc. > >Trust me: things will stabilize soon and life will in fact be a lot easier >than it used to be. You just happened to jump right in the middle of the storm. > >In the meantime, you can use the old scipy, but ignore gui_thread and the *plt >plotting modules. Instead, have a look at matplotlib for plotting and things >should be OK. Soon you'll be able to switch to the new system which should >smooth out most of these rough edges. > >Cheers, > >f > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > > From oliphant at ee.byu.edu Wed Oct 19 14:38:14 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 12:38:14 -0600 Subject: [SciPy-user] gui_thread error In-Reply-To: <43568237.1040604@gmail.com> References: <43567F1B.1070207@gmail.com> <43567FCF.3060705@colorado.edu> <43568237.1040604@gmail.com> Message-ID: <43569296.5010809@ee.byu.edu> Vlado Balko wrote: >So is there some working example and tutorial for 0.3.2 so I can learn >myself basics? Only if I will know basics I can experiment with >nonstandard things. >I`m a little bit dissapointed with SciPy because I`m trying to find and >learn it (and install it) for at least a week and still no success. I >found it very interesting tool for my graduate work, but the troubles at >the beginning are very demotivating. > > I'm sorry for your trouble. The problem is that you are trying to learn SciPy during a time of large changes. The core library that SciPy uses has been changed and a lot of developer time is being spent on fixing scipy to use the new core library. Thus, most of the developers are leaving 0.3.2, and problems are not getting fixed. A few pointers: -gui_thread is not being maintained so I would not try using it. A better solution is to use xplt or to download matplotlib and use it for plotting. The scipy tutorial available at www.scipy.org is still valid for 0.3.2, I would look at that first. Good luck. -Travis From vbalko at gmail.com Wed Oct 19 15:03:04 2005 From: vbalko at gmail.com (Vlado Balko) Date: Wed, 19 Oct 2005 21:03:04 +0200 Subject: [SciPy-user] gui_thread error In-Reply-To: <43569296.5010809@ee.byu.edu> References: <43567F1B.1070207@gmail.com> <43567FCF.3060705@colorado.edu> <43568237.1040604@gmail.com> <43569296.5010809@ee.byu.edu> Message-ID: <43569868.5030904@gmail.com> Thank you, I`m experimenting with matplotlib now and i must say, that it is very easy to learn and use and still very powerfull. It is very good choice to plot an output data. balky Travis Oliphant wrote: >Vlado Balko wrote: > > > >>So is there some working example and tutorial for 0.3.2 so I can learn >>myself basics? Only if I will know basics I can experiment with >>nonstandard things. >>I`m a little bit dissapointed with SciPy because I`m trying to find and >>learn it (and install it) for at least a week and still no success. 
I >>found it very interesting tool for my graduate work, but the troubles at >>the beginning are very demotivating. >> >> >> >> >I'm sorry for your trouble. The problem is that you are trying to learn >SciPy during a time of large changes. The core library that SciPy uses >has been changed and a lot of developer time is being spent on fixing >scipy to use the new core library. > >Thus, most of the developers are leaving 0.3.2, and problems are not >getting fixed. > >A few pointers: > >-gui_thread is not being maintained so I would not try using it. > >A better solution is to use xplt or to download matplotlib and use it >for plotting. > >The scipy tutorial available at www.scipy.org is still valid for 0.3.2, >I would look at that first. > >Good luck. > >-Travis > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > > From oliphant at ee.byu.edu Wed Oct 19 16:21:56 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 14:21:56 -0600 Subject: [SciPy-user] operations on int8 arrays In-Reply-To: <43566C59.6090508@psychology.nottingham.ac.uk> References: <43566C59.6090508@psychology.nottingham.ac.uk> Message-ID: <4356AAE4.7020009@ee.byu.edu> Jon Peirce wrote: >>>>Scipy arrays with dtype=uint8 or int8 seem to be >>>>mathematically-challenged on my machine (AMD64 WinXP running python >>>>2.4.2, scipy core 0.4.1). Simple int (and various others) appear fine. >>>> >>> >>> >>>>>>> >>>import scipy >>>>>>> >>>xx=scipy.array([100,100,100],scipy.int8) >>>>>>> >>>print xx.sum() >>>>>> >>>>>> >>>> 44 >>> >>> >>>>>>> >>>xx=scipy.array([100,100,100],scipy.int) >>>>>>> >>>print xx.sum() >>>>>> >>>>>> >>>> 300 >>>> >>>> >>> >>> >>This is not a bug. In the first line, you are telling the computer to >>add up 8-bit integers. The result does not fit in an 8-bit integer --- >>thus you are computing modulo 256. >> >>I suspect you wanted for the first case. >> >>xx.sum(rtype=int) -- this will "reduce" using the long integer type on >>your platform. >> >>-Travis >> > > Right, yes. I find it a bit unintuitive though that the resulting > array isn't automatically converted to a suitable type where > necessary. Even less intuitive is this: > >>>import scipy > >>>xx=scipy.array([100,100,100],scipy.int8) > >>>print xx.mean() > 14.666666666666666 Hmm. This is true. But, it is consistent with the behavior of true_divide which mean uses. It would be possible to make the default reduce type for integers 32-bit on 32-bit platforms and 64-bit on 64-bit platforms. the long integer type. Do people think that would be a good idea? These kinds of questions do come up. Or, this could simply be the default when calling the .sum method (which is add.reduce under the covers). The reduce method could stay with the default of the integer type. Obviously, it's going to give "unexpected" results to somebody. Automatic upcasting can have its downsides. But, perhaps in this case (integer reductions), it is better to do the upcasting. What do people think? 
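To spell out where the numbers in this thread come from: with an int8 accumulator the sum wraps modulo 2**8, and mean() then divides that wrapped sum by the number of elements. A small sketch, assuming the 0.4.1 behaviour and the rtype= keyword described above:

import scipy

xx = scipy.array([100, 100, 100], scipy.int8)

# The accumulator is also int8, so 100 + 100 + 100 = 300 wraps to
# 300 % 256 = 44.
print xx.sum()           # -> 44

# mean() divides the wrapped sum by the length: 44 / 3.0 = 14.666...
print xx.mean()

# Reducing in the platform's long integer gives the expected result.
print xx.sum(rtype=int)  # -> 300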
-Travis From Fernando.Perez at colorado.edu Wed Oct 19 16:55:47 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 19 Oct 2005 14:55:47 -0600 Subject: [SciPy-user] operations on int8 arrays In-Reply-To: <4356AAE4.7020009@ee.byu.edu> References: <43566C59.6090508@psychology.nottingham.ac.uk> <4356AAE4.7020009@ee.byu.edu> Message-ID: <4356B2D3.6040200@colorado.edu> Travis Oliphant wrote: > Or, this could simply be the default when calling the .sum method (which > is add.reduce under the covers). The reduce method could stay with the > default of the integer type. > > Obviously, it's going to give "unexpected" results to somebody. > Automatic upcasting can have its downsides. But, perhaps in this case > (integer reductions), it is better to do the upcasting. What do people > think? I've personally always been of the opinion that accumulator methods are one case where automatic upcasting is justified. Since not doing it is almost guaranteed to produce incorrect results in most cases (esp. for small bit-size types), I'm +1 on upcasting on this one. I agree that in general we shouldn't upcast silently, but I think this is a case of 'practicality beats purity'. Think of it this way: almost anyone writing the equivalent to .sum() manually in C would write this with a wide enough accumulator, so I think it's OK for scipy to do the same. Just my 1e-2. Cheers, f From gpajer at rider.edu Wed Oct 19 17:13:10 2005 From: gpajer at rider.edu (Gary Pajer) Date: Wed, 19 Oct 2005 17:13:10 -0400 Subject: [SciPy-user] operations on int8 arrays In-Reply-To: <4356B2D3.6040200@colorado.edu> References: <43566C59.6090508@psychology.nottingham.ac.uk> <4356AAE4.7020009@ee.byu.edu> <4356B2D3.6040200@colorado.edu> Message-ID: <4356B6E6.3060007@rider.edu> Fernando Perez wrote: >Travis Oliphant wrote: > > > >>Or, this could simply be the default when calling the .sum method (which >>is add.reduce under the covers). The reduce method could stay with the >>default of the integer type. >> >>Obviously, it's going to give "unexpected" results to somebody. >>Automatic upcasting can have its downsides. But, perhaps in this case >>(integer reductions), it is better to do the upcasting. What do people >>think? >> >> > >I've personally always been of the opinion that accumulator methods are one >case where automatic upcasting is justified. Since not doing it is almost >guaranteed to produce incorrect results in most cases (esp. for small bit-size >types), I'm +1 on upcasting on this one. > >I agree that in general we shouldn't upcast silently, but I think this is a >case of 'practicality beats purity'. > > Would there be a way of preventing upcasting, if desired? You wouldn't want to eliminate functionality. My $0.02: getting bitten is easy, but if I'm doing integer arithmetic, I usually want to be in control of what exactly is going on. Having the size of my integer change without my asking for it is ... well, it's not control. My feelings on this are significantly influenced by the way I was "brought up". Might be old fashioned. BTW, off hand, I can't think of why I personally would want a small integer, and keep it small even when the number in it gets big. But maybe cryptographers or somesuch do. >Think of it this way: almost anyone writing the equivalent to .sum() manually >in C would write this with a wide enough accumulator, so I think it's OK for >scipy to do the same. > > "wide enough" *to start with*. You wouldn't change it somewhere in the middle, I don't think. >Just my 1e-2. 
> >Cheers, > >f > > right back at you, -g >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > > From oliphant at ee.byu.edu Wed Oct 19 17:53:42 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 15:53:42 -0600 Subject: [SciPy-user] operations on int8 arrays In-Reply-To: <4356B6E6.3060007@rider.edu> References: <43566C59.6090508@psychology.nottingham.ac.uk> <4356AAE4.7020009@ee.byu.edu> <4356B2D3.6040200@colorado.edu> <4356B6E6.3060007@rider.edu> Message-ID: <4356C066.70808@ee.byu.edu> Gary Pajer wrote: >>I've personally always been of the opinion that accumulator methods are one >>case where automatic upcasting is justified. Since not doing it is almost >>guaranteed to produce incorrect results in most cases (esp. for small bit-size >>types), I'm +1 on upcasting on this one. >> >>I agree that in general we shouldn't upcast silently, but I think this is a >>case of 'practicality beats purity'. >> >> >> >> >Would there be a way of preventing upcasting, if desired? >You wouldn't want to eliminate functionality. > >My $0.02: getting bitten is easy, but if I'm doing integer arithmetic, >I usually want to be in control of what exactly is going on. Having the >size of my integer change without my asking for it is ... well, it's not >control. >My feelings on this are significantly influenced by the way I was >"brought up". Might be old fashioned. > > The only thing that would change is what the default reduce type is. You can already get the desired behavior by specifying a wider reduce type explicitly. One issue, though. Upcasting to a different type will be slower. Right now the defaults is that the reduce type is the same as the type of the object coming in. We could change this to some "wide" type. But should we? It will carry a speed hit. But the user can always over-ride the default behavior by an explicit rtype= paramter. I'd like more opinions... -Travis From Fernando.Perez at colorado.edu Wed Oct 19 19:24:39 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 19 Oct 2005 17:24:39 -0600 Subject: [SciPy-user] operations on int8 arrays In-Reply-To: <4356C066.70808@ee.byu.edu> References: <43566C59.6090508@psychology.nottingham.ac.uk> <4356AAE4.7020009@ee.byu.edu> <4356B2D3.6040200@colorado.edu> <4356B6E6.3060007@rider.edu> <4356C066.70808@ee.byu.edu> Message-ID: <4356D5B7.6090802@colorado.edu> Travis Oliphant wrote: > One issue, though. Upcasting to a different type will be slower. It may be slower, but for the very small types, (8-bit), not upcasting is almost guaranteed to be wrong. That's why I think that the case of accumulators deserves special treatment: without it, the functions are time-bombs waiting to go off (and they have in the past, as we've seen on this list multiple times). For wider types you are obviously less likely to hit the problem, but for small ones it is very acute. People who know that they are adding less than 256 unsigned 8-bit integers, all so small that their sum can't be > 255 (and I don't know how they can know this in advance) could always call the routine with rtype=uint8. Most people would rather get the numerically correct answer, rather than the mysterious 'sum() % 255', which is effectively what the code does today. Anyway, I've stated my case to the best of my ability. If a choice is made not to upcast, I'll just get used to keeping a locally overridden sum() for my own usage, which isn't a blade without a handle. 
Cheers, f From oliphant at ee.byu.edu Wed Oct 19 19:35:06 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 17:35:06 -0600 Subject: [SciPy-user] operations on int8 arrays In-Reply-To: <4356D5B7.6090802@colorado.edu> References: <43566C59.6090508@psychology.nottingham.ac.uk> <4356AAE4.7020009@ee.byu.edu> <4356B2D3.6040200@colorado.edu> <4356B6E6.3060007@rider.edu> <4356C066.70808@ee.byu.edu> <4356D5B7.6090802@colorado.edu> Message-ID: <4356D82A.3080802@ee.byu.edu> Fernando Perez wrote: >Travis Oliphant wrote: > > >>One issue, though. Upcasting to a different type will be slower. >> >> > >It may be slower, but for the very small types, (8-bit), not upcasting is >almost guaranteed to be wrong. That's why I think that the case of >accumulators deserves special treatment: without it, the functions are >time-bombs waiting to go off (and they have in the past, as we've seen on this >list multiple times). For wider types you are obviously less likely to hit >the problem, but for small ones it is very acute. > >People who know that they are adding less than 256 unsigned 8-bit integers, >all so small that their sum can't be > 255 (and I don't know how they can know >this in advance) could always call the routine with rtype=uint8. Most people >would rather get the numerically correct answer, rather than the mysterious >'sum() % 255', which is effectively what the code does today. > > > By the way, the sum, prod, cumsum, and cumprod functions already upcast by default for 8-bit and 16-bit integer types and bool types (as of SVN check in). -Travis From sransom at nrao.edu Wed Oct 19 19:37:58 2005 From: sransom at nrao.edu (Scott Ransom) Date: Wed, 19 Oct 2005 19:37:58 -0400 Subject: [SciPy-user] operations on int8 arrays In-Reply-To: <4356D5B7.6090802@colorado.edu> References: <43566C59.6090508@psychology.nottingham.ac.uk> <4356AAE4.7020009@ee.byu.edu> <4356B2D3.6040200@colorado.edu> <4356B6E6.3060007@rider.edu> <4356C066.70808@ee.byu.edu> <4356D5B7.6090802@colorado.edu> Message-ID: <20051019233758.GA21970@ssh.cv.nrao.edu> On Wed, Oct 19, 2005 at 05:24:39PM -0600, Fernando Perez wrote: > Anyway, I've stated my case to the best of my ability. If a choice is made > not to upcast, I'll just get used to keeping a locally overridden sum() for my > own usage, which isn't a blade without a handle. Consider this my vote as well for the reduction operations. Scott -- Scott M. Ransom Address: NRAO Phone: (434) 296-0320 520 Edgemont Rd. email: sransom at nrao.edu Charlottesville, VA 22903 USA GPG Fingerprint: 06A9 9553 78BE 16DB 407B FFCA 9BFA B6FF FFD3 2989 From oliphant at ee.byu.edu Wed Oct 19 19:40:59 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 17:40:59 -0600 Subject: [SciPy-user] operations on int8 arrays In-Reply-To: <4356D5B7.6090802@colorado.edu> References: <43566C59.6090508@psychology.nottingham.ac.uk> <4356AAE4.7020009@ee.byu.edu> <4356B2D3.6040200@colorado.edu> <4356B6E6.3060007@rider.edu> <4356C066.70808@ee.byu.edu> <4356D5B7.6090802@colorado.edu> Message-ID: <4356D98B.5020505@ee.byu.edu> Fernando Perez wrote: >Travis Oliphant wrote: > > >>One issue, though. Upcasting to a different type will be slower. >> >> > > > O.K. the default for all reduce-like methods of ufuncs has been changed in SVN so that integer types whose size is smaller than sizeof(long) are reduced in the long data type (this is the Python integer type). rtype= is still an option for those that must have (or want reduction in a smaller integer type). 
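In other words, after this change the earlier example gives the intuitive answer by default, while the old wrap-around behaviour stays available on request. A short sketch of the resulting usage, assuming the rtype= spelling used in this thread (later releases may rename it):

import scipy

xx = scipy.array([100, 100, 100], scipy.int8)

# Small integer types are now reduced in a long accumulator by default,
# so the sum no longer wraps.
print xx.sum()                  # -> 300

# Asking explicitly for an int8 reduction reproduces the old modulo-256
# result.
print xx.sum(rtype=scipy.int8)  # -> 44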
-Travis From Fernando.Perez at colorado.edu Wed Oct 19 20:01:39 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 19 Oct 2005 18:01:39 -0600 Subject: [SciPy-user] operations on int8 arrays In-Reply-To: <4356D98B.5020505@ee.byu.edu> References: <43566C59.6090508@psychology.nottingham.ac.uk> <4356AAE4.7020009@ee.byu.edu> <4356B2D3.6040200@colorado.edu> <4356B6E6.3060007@rider.edu> <4356C066.70808@ee.byu.edu> <4356D5B7.6090802@colorado.edu> <4356D98B.5020505@ee.byu.edu> Message-ID: <4356DE63.5030901@colorado.edu> Travis Oliphant wrote: > O.K. the default for all reduce-like methods of ufuncs has been changed > in SVN so that integer types whose size is smaller than sizeof(long) are > reduced in the long data type (this is the Python integer type). > > rtype= is still an option for those that must have (or want reduction > in a smaller integer type). Thank you :) Best, f From ryanlists at gmail.com Wed Oct 19 22:05:49 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Wed, 19 Oct 2005 22:05:49 -0400 Subject: [SciPy-user] gui_thread error In-Reply-To: <43569868.5030904@gmail.com> References: <43567F1B.1070207@gmail.com> <43567FCF.3060705@colorado.edu> <43568237.1040604@gmail.com> <43569296.5010809@ee.byu.edu> <43569868.5030904@gmail.com> Message-ID: Matplotlib also has a very active mailing list if you run into any problems. The first thinkg I was told from this list was to try matplotlib and ipython. I like them both a lot. If you aren't already using ipython check it out: http://ipython.scipy.org/ It handles the things that gui_thread used to do and it has tab completion and object inspection. If you are on windows, there are a couple other python packages to install, but it is fairly straightforward. It is availbable in package form from many Linux repositories. Don't hesitate to ask the list with anything you get stuck on. Even though there is a lot of upgrading going on, I am still doing productive graduate work with the old versions. I use it for everything I used to do with Matlab and I like it much better. Ryan On 10/19/05, Vlado Balko wrote: > Thank you, > > I`m experimenting with matplotlib now and i must say, that it is very > easy to learn and use and still very powerfull. It is very good choice > to plot an output data. > > balky > > Travis Oliphant wrote: > > >Vlado Balko wrote: > > > > > > > >>So is there some working example and tutorial for 0.3.2 so I can learn > >>myself basics? Only if I will know basics I can experiment with > >>nonstandard things. > >>I`m a little bit dissapointed with SciPy because I`m trying to find and > >>learn it (and install it) for at least a week and still no success. I > >>found it very interesting tool for my graduate work, but the troubles at > >>the beginning are very demotivating. > >> > >> > >> > >> > >I'm sorry for your trouble. The problem is that you are trying to learn > >SciPy during a time of large changes. The core library that SciPy uses > >has been changed and a lot of developer time is being spent on fixing > >scipy to use the new core library. > > > >Thus, most of the developers are leaving 0.3.2, and problems are not > >getting fixed. > > > >A few pointers: > > > >-gui_thread is not being maintained so I would not try using it. > > > >A better solution is to use xplt or to download matplotlib and use it > >for plotting. > > > >The scipy tutorial available at www.scipy.org is still valid for 0.3.2, > >I would look at that first. > > > >Good luck. 
> > > >-Travis > > > >_______________________________________________ > >SciPy-user mailing list > >SciPy-user at scipy.net > >http://www.scipy.net/mailman/listinfo/scipy-user > > > > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From ted.horst at earthlink.net Wed Oct 19 23:46:36 2005 From: ted.horst at earthlink.net (Ted Horst) Date: Wed, 19 Oct 2005 22:46:36 -0500 Subject: [SciPy-user] patches for scipy_core-0.4.2 Message-ID: <7e1bd49f70649c081409f870481f4677@earthlink.net> Here are 2 patches for scipy_core-0.4.2. The first is on scipy/base/src/arraytypes.in.src. It corrects a casting problem from longdouble to other types: *** arraytypes.inc.src.orig Wed Oct 19 09:11:11 2005 --- arraytypes.inc.src Tue Oct 18 09:23:56 2005 *************** *** 373,379 **** #from=BYTE*13,UBYTE*13,SHORT*13,USHORT*13,INT*13,UINT*13,LONG*13,ULONG*1 3,LONGLONG*13,ULONGLONG*13,FLOAT*13,DOUBLE*13,LONGDOUBLE*13,CFLOAT*13,CD OUBLE*13,CLONGDOUBLE*13# #totyp=(byte, ubyte, short, ushort, int, uint, long, ulong, longlong, ulonglong, float, double, longdouble)*16# #fromtyp=byte*13, ubyte*13, short*13, ushort*13, int*13, uint*13, long*13, ulong*13, longlong*13, ulonglong*13, float*13, double*13, longdouble*13, float*13, double*13, longdouble*13# ! #incr= ip++*166,ip+=2*42# */ static void @from at _to_@to@(@fromtyp@ *ip, @totyp@ *op, intp n, PyArrayObject *aip, --- 373,379 ---- #from=BYTE*13,UBYTE*13,SHORT*13,USHORT*13,INT*13,UINT*13,LONG*13,ULONG*1 3,LONGLONG*13,ULONGLONG*13,FLOAT*13,DOUBLE*13,LONGDOUBLE*13,CFLOAT*13,CD OUBLE*13,CLONGDOUBLE*13# #totyp=(byte, ubyte, short, ushort, int, uint, long, ulong, longlong, ulonglong, float, double, longdouble)*16# #fromtyp=byte*13, ubyte*13, short*13, ushort*13, int*13, uint*13, long*13, ulong*13, longlong*13, ulonglong*13, float*13, double*13, longdouble*13, float*13, double*13, longdouble*13# ! #incr= ip++*169,ip+=2*39# */ static void @from at _to_@to@(@fromtyp@ *ip, @totyp@ *op, intp n, PyArrayObject *aip, The second patch is on scipy/base/src/umathmodule.c.src. It corrects a compilation problem on Mac OSX 10.3 (or other platforms that don't have long double functions). The inverse hyperbolic float and long dboule functions are accounted for twice, once in the generic float and long double templates and then again in a special ifdef section. The patch removes these functions from the generic templates: *** umathmodule.c.src.orig Wed Oct 19 09:10:39 2005 --- umathmodule.c.src Tue Oct 18 09:34:14 2005 *************** *** 178,187 **** /**begin repeat ! #kind=(sin,cos,tan,sinh,cosh,tanh,fabs,floor,ceil,sqrt,log10,log,exp,asi n,acos,atan,asinh,acosh,atanh)*2# ! #typ=longdouble*19, float*19# ! #c=l*19,f*19# ! #TYPE=LONGDOUBLE*19, FLOAT*19# */ #ifndef HAVE_ at TYPE@_FUNCS @typ@ @kind@@c@(@typ@ x) { --- 178,187 ---- /**begin repeat ! #kind=(sin,cos,tan,sinh,cosh,tanh,fabs,floor,ceil,sqrt,log10,log,exp,asi n,acos,atan)*2# ! #typ=longdouble*16, float*16# ! #c=l*16,f*16# ! #TYPE=LONGDOUBLE*16, FLOAT*16# */ #ifndef HAVE_ at TYPE@_FUNCS @typ@ @kind@@c@(@typ@ x) { From jean-paul.jadaud at cea.fr Thu Oct 20 03:30:51 2005 From: jean-paul.jadaud at cea.fr (Jean-Paul Jadaud) Date: Thu, 20 Oct 2005 09:30:51 +0200 Subject: [SciPy-user] python+maxima+latex for symolic stuff In-Reply-To: References: Message-ID: <200510200731.JAA19326@styx.bruyeres.cea.fr> Yann Le Du wrote: >Hi, > >Yes, I'm interested in having a go at this tool, looks great. 
> >-- >Yann Le Du > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > > Le hasard des mailing lists fait que je tombe sur ton message ... bonjour ? toi -- Jean-Paul JADAUD CEA/DIF DCRE/SCEP BP 12 91680 Bruyeres le Chatel FRANCE Phone 33 1 69 26 51 43 Fax 33 1 69 26 71 02 From oliphant at ee.byu.edu Thu Oct 20 03:40:42 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 20 Oct 2005 01:40:42 -0600 Subject: [SciPy-user] patches for scipy_core-0.4.2 In-Reply-To: <7e1bd49f70649c081409f870481f4677@earthlink.net> References: <7e1bd49f70649c081409f870481f4677@earthlink.net> Message-ID: <435749FA.1000308@ee.byu.edu> Ted Horst wrote: Thank you for these useful patches. These problems were introduced while fixing other issues. I've applied them in SVN. -Travis From oliphant at ee.byu.edu Thu Oct 20 03:41:02 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 20 Oct 2005 01:41:02 -0600 Subject: [SciPy-user] patches for scipy_core-0.4.2 In-Reply-To: <7e1bd49f70649c081409f870481f4677@earthlink.net> References: <7e1bd49f70649c081409f870481f4677@earthlink.net> Message-ID: <43574A0E.3000400@ee.byu.edu> Thank you Ted, for the useful patches. These problems were introduced while fixing other issues. I've applied them in SVN. -Travis From peter at mapledesign.co.uk Thu Oct 20 04:24:19 2005 From: peter at mapledesign.co.uk (Peter Bowyer) Date: Thu, 20 Oct 2005 09:24:19 +0100 Subject: [SciPy-user] Matlab, Scipy and teaching science Message-ID: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> Hi, I'm new to this list, so let me introduce myself. I'm currently doing a Masters investigating ways to teach physics students how to program (and to investigate how much programming knowledge they need, given 2/3 of the class don't continue on in physics). In the past as with most physics departments the course was taught as an introduction to C or Basic. This worked fine up to the late 90s when most students had experience of programming. The structure here has been to have an introductory course in the 2nd year teaching people how to program, and then an optional course in the 3rd year introducing computational physics. There's a fundamental shift in principle in what I am considering, moving from the "How to implement numerical integration and differential equation" point, to "How to use libraries by others to carry out your calculations". The course currently lasts all of 12 hours and less than 5% of the students taking it have any knowledge of programming at the start. I did not need to use the material taught during the course in the rest of my degree, something which I am hoping to improve by changing the angle the course is coming from. My idea was to teach the basics using Python and scientific modules. However, the question I am consistently coming up against is "Why not teach the students Matlab?". It's a very good point and one I can think up no clear answer for. If the students stay in Physics or a related field they will be using Matlab (and C/C++/Fortran if needed). Therefore is there any reason not to teach Matlab as the introduction to programming? My arguments at present are that Matlab is a proprietary tool so the cost to students in obtaining copies will be not inconsiderable (considering it will only be used for a short course), and that Matlab is a specialised tool, so those not interested in going on into a physics related field will not find it of any use (unlike Python). 
The arguments for Matlab are stronger: 1) It's a standard tool, widely used 2) It is easier to install and maintain (discounting the Enthought edition for a moment, Python is CRAP compared with other langauges - where is the Package manager to make life easier?) 3) The editor has a good interface (v7 and above) which IDLE lacks (no data inspector 'right there') 4) Integrated help for all the scientific functions Are there any reasons you can think of that Python makes a better choice than Matlab? I myself would far rather use Python (I have ideas about how VPython can help the students understand Python) but need a more robust reason than a handwaving argument about "3D...easier for students to visualise...". Many thanks, and apologies if this is badly off-topic for the list. Peter From noel.oboyle2 at mail.dcu.ie Thu Oct 20 04:55:10 2005 From: noel.oboyle2 at mail.dcu.ie (Noel O'Boyle) Date: Thu, 20 Oct 2005 09:55:10 +0100 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> Message-ID: <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> > The arguments for Matlab are stronger: > 1) It's a standard tool, widely used As you say, the biggest problem with Matlab is that it's neither open source nor cheap. > 2) It is easier to install and maintain (discounting the Enthought > edition for a moment, Python is CRAP compared with other langauges - > where is the Package manager to make life easier?) A package manager may be provided by the operating system - for example, redhat linux uses yum, and debian linux has a fantastic package manager (see http://packages.debian.org/stable/python/ for a list of available stable packages for Python). > 3) The editor has a good interface (v7 and above) which IDLE lacks > (no data inspector 'right there') There are other editors. For example, ipython is closer to matlab, if you install matplotlib. > 4) Integrated help for all the scientific functions I think it's there - if you set things up with ipython. But it's not flashy or anything, true. You just type help(commandname) and get some text, but this is usually sufficient if you already know which command you want. Otherwise, you need to check out the tutorial. Regards, Noel From morgan.hough at gmail.com Thu Oct 20 05:30:49 2005 From: morgan.hough at gmail.com (Morgan Hough) Date: Thu, 20 Oct 2005 10:30:49 +0100 Subject: [SciPy-user] python+maxima+latex for symolic stuff In-Reply-To: References: Message-ID: <102408b60510200230o4dc235cbu91049f80c9636786@mail.gmail.com> I agree. Sounds very interesting. Worth packaging up. -Morgan On 10/19/05, Yann Le Du wrote: > Hi, > > Yes, I'm interested in having a go at this tool, looks great. > > -- > Yann Le Du > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From rkern at ucsd.edu Thu Oct 20 05:33:12 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 20 Oct 2005 02:33:12 -0700 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> Message-ID: <43576458.7040608@ucsd.edu> Peter Bowyer wrote: > My idea was to teach the basics using Python and scientific > modules. However, the question I am consistently coming up against > is "Why not teach the students Matlab?". It's a very good point and > one I can think up no clear answer for. 
If the students stay in > Physics or a related field they will be using Matlab (and > C/C++/Fortran if needed). Therefore is there any reason not to teach > Matlab as the introduction to programming? > > My arguments at present are that Matlab is a proprietary tool so the > cost to students in obtaining copies will be not inconsiderable > (considering it will only be used for a short course), and that > Matlab is a specialised tool, so those not interested in going on > into a physics related field will not find it of any use (unlike Python). [I should state before I begin that I have nothing but the deepest resentment and animosity towards Matlab.] Matlab cripples scientists who learn it as their first introduction to "real" programming. They continue to do *everything* in Matlab even when Matlab is the most hideously wrong tool to use because they've learned Matlab and using any other language would mean spending the effort to learn the new language. When they are convinced to learn another language, it's usually a low-level language so they can write faster code. But then they only write that low-level code as interfaces to Matlab so they can continue to do high-level things relatively easily. Unfortunately, the real benefits that a high-level language ought to provide are largely absent from Matlab. You can't really do any serious modularization of your code. Data flows all over the place. If I have to work with somebody else's Matlab code and it's more than a page long, I just throw it away and rewrite it in Python from scratch. Maybe I'll pick out some snippets here and there for guidance, but as a whole, the Matlab code is worthless *for communicating ideas*. When you're a research scientist, your code is only partially about getting the right answer. It's also very important that other people can understand what the code is doing and can build on your code. And don't get me started on using taxpayer-funded grants to develop research code that requires ludicrously expensive proprietary software to run. As you say, they might have to learn Matlab anyways to deal with other researcher's code, but if you get them started right with Python, they won't be contributing to the problem. > The arguments for Matlab are stronger: > 1) It's a standard tool, widely used Undeservedly so. > 2) It is easier to install and maintain And comparison in this regard ought to include the management of Matlab's floating license BS. Ease of installation and maintenance doesn't mean anything when you can't run it because too many other people on your campus are currently using it. > (discounting the Enthought > edition for a moment, Python is CRAP compared with other langauges - > where is the Package manager to make life easier?) We're working on it. http://peak.telecommunity.com/DevCenter/EasyInstall Also, any number of Linux distributions have fairly complete collections of Python packages integrated with their own package manager. At the recent SciPy '05 conference, Aric Hagberg told me that he ran a student summer program at LANL where they had to make a choice between purchasing Matlab licenses for a few dozen students or using Python. Of course they used Python. To help with the installation issue, they built a Linux LiveCD with all of the relevant software already installed. The students could just pop the CD into their computer and boot into a nice Linux environment that was known to work. 
Since all of the software on the CD was open source, the students could take that environment with them when they left the program. > 3) The editor has a good interface (v7 and above) which IDLE lacks > (no data inspector 'right there') IDLE isn't the only editor/IDE for Python. > 4) Integrated help for all the scientific functions We have fairly good docstring coverage in scipy at least. We need to build some searching capability, though. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From peter at mapledesign.co.uk Thu Oct 20 05:45:17 2005 From: peter at mapledesign.co.uk (Peter Bowyer) Date: Thu, 20 Oct 2005 10:45:17 +0100 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> Message-ID: <6.2.3.4.0.20051020103020.03ab3160@127.0.0.1> Hello Noel, At 09:55 20/10/2005, Noel O'Boyle wrote: >As you say, the biggest problem with Matlab is that it's neither open >source nor cheap. However when the university has a multimillion pound turnover each year the cost becomes less of an issue. FWIW the aerospace department already requires its students to buy a copy for a course; so this argument isn't as strong as I'd like. >A package manager may be provided by the operating system - for example, >redhat linux uses yum, and debian linux has a fantastic package manager >(see http://packages.debian.org/stable/python/ for a list of available >stable packages for Python). I forgot to mention that the university has standardised on Windows XP now. A debian-style package manager for Windows would be wonderful, but I doubt it'll happen (I can dream :-)) >There are other editors. For example, ipython is closer to matlab, if >you install matplotlib. Thanks, I'd not tried that one. >I think it's there - if you set things up with ipython. But it's not >flashy or anything, true. You just type help(commandname) and get some >text, but this is usually sufficient if you already know which command >you want. Otherwise, you need to check out the tutorial. What's nice is when you can't remember the method name to have a search facility directly there in the editor. That's probably showing my Windows/IDE heritage, but I like the ease it offers :) Peter -- Maple Design - quality web design and programming http://www.mapledesign.co.uk From markus.weimer at gmail.com Thu Oct 20 05:57:14 2005 From: markus.weimer at gmail.com (Markus Weimer) Date: Thu, 20 Oct 2005 11:57:14 +0200 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> Message-ID: <57be15330510200257n4f79911auf3f5812237273658@mail.gmail.com> Hi all, > My arguments at present are that Matlab is a proprietary tool so the > cost to students in obtaining copies will be not inconsiderable > (considering it will only be used for a short course), and that > Matlab is a specialised tool, so those not interested in going on > into a physics related field will not find it of any use (unlike Python). As an aside, please note that Mathworks recently changed it licensing policy to give academic discount only to places where you can actually graduate. 
This leaves research bodies like Fraunhofer and Max Planck in Germany without any option to get Matlab, as the prices for the commercial edition are simply to high. I assume that other research bodies around the world face the same problem, and many will consider switching to python. This one example shows how powerless one becomes when relying on proprietary software: Mathworks could increase prices whenever they like again. Even for universities. I for myself have decided to not be trapped by this and will stick to python, even if it is less convenient (and it really is). > 1) It's a standard tool, widely used This might change when major research bodies have to switch. > 2) It is easier to install and maintain (discounting the Enthought > edition for a moment, Python is CRAP compared with other langauges - > where is the Package manager to make life easier?) That is indeed one of the weakest points of python. > 3) The editor has a good interface (v7 and above) which IDLE lacks > (no data inspector 'right there') You should have a look at SPE, a nice python IDE that could be converted into the perfect matlab replacement. Unfortunately, its current development is not at all focussed on scientific computing. Greetings, Markus From morgan.hough at gmail.com Thu Oct 20 05:55:03 2005 From: morgan.hough at gmail.com (Morgan Hough) Date: Thu, 20 Oct 2005 10:55:03 +0100 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <6.2.3.4.0.20051020103020.03ab3160@127.0.0.1> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> <6.2.3.4.0.20051020103020.03ab3160@127.0.0.1> Message-ID: <102408b60510200255v33b5d619ufa5bcdc65b06682c@mail.gmail.com> Python has been used here for physics teaching. http://www-teaching.physics.ox.ac.uk/computing/ProgrammingResources/Oxford/handbook_Python_html/handbook_Python.html Cheers, -Morgan On 10/20/05, Peter Bowyer wrote: > Hello Noel, > > At 09:55 20/10/2005, Noel O'Boyle wrote: > >As you say, the biggest problem with Matlab is that it's neither open > >source nor cheap. > > However when the university has a multimillion pound turnover each > year the cost becomes less of an issue. FWIW the aerospace > department already requires its students to buy a copy for a course; > so this argument isn't as strong as I'd like. > > >A package manager may be provided by the operating system - for example, > >redhat linux uses yum, and debian linux has a fantastic package manager > >(see http://packages.debian.org/stable/python/ for a list of available > >stable packages for Python). > > I forgot to mention that the university has standardised on Windows > XP now. A debian-style package manager for Windows would be > wonderful, but I doubt it'll happen (I can dream :-)) > > >There are other editors. For example, ipython is closer to matlab, if > >you install matplotlib. > > Thanks, I'd not tried that one. > > >I think it's there - if you set things up with ipython. But it's not > >flashy or anything, true. You just type help(commandname) and get some > >text, but this is usually sufficient if you already know which command > >you want. Otherwise, you need to check out the tutorial. > > What's nice is when you can't remember the method name to have a > search facility directly there in the editor. 
That's probably > showing my Windows/IDE heritage, but I like the ease it offers :) > > Peter > > -- > Maple Design - quality web design and programming > http://www.mapledesign.co.uk > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From peter at mapledesign.co.uk Thu Oct 20 06:12:21 2005 From: peter at mapledesign.co.uk (Peter Bowyer) Date: Thu, 20 Oct 2005 11:12:21 +0100 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <102408b60510200255v33b5d619ufa5bcdc65b06682c@mail.gmail.co m> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> <6.2.3.4.0.20051020103020.03ab3160@127.0.0.1> <102408b60510200255v33b5d619ufa5bcdc65b06682c@mail.gmail.com> Message-ID: <6.2.3.4.0.20051020110830.0393b9c8@127.0.0.1> At 10:55 20/10/2005, Morgan Hough wrote: >Python has been used here for physics teaching. > >http://www-teaching.physics.ox.ac.uk/computing/ProgrammingResources/Oxford/handbook_Python_html/handbook_Python.html Indeed, however Oxford dropped teaching Python the next year, as they found the students did not find Python any easier to learn than C (they split the class in half and taught ~80 students C and ~80 students Python). I believe this was because the notes were pretty much a straight translation from C -> Python and am not convinced by the conclusion reached. Peter -- Maple Design - quality web design and programming http://www.mapledesign.co.uk From prabhu_r at users.sf.net Thu Oct 20 06:27:14 2005 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Thu, 20 Oct 2005 15:57:14 +0530 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <43576458.7040608@ucsd.edu> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> Message-ID: <17239.28930.79965.507605@enthought.cfl.aero.iitm.ernet.in> >>>>> "Robert" == Robert Kern writes: Robert> At the recent SciPy '05 conference, Aric Hagberg told me Robert> that he ran a student summer program at LANL where they Robert> had to make a choice between purchasing Matlab licenses Robert> for a few dozen students or using Python. Of course they Robert> used Python. To help with the installation issue, they Robert> built a Linux LiveCD with all of the relevant software Is this live Linux CD available for download somewhere? I'm sure a large number of folks would find this very useful. cheers, prabhu From peter at mapledesign.co.uk Thu Oct 20 06:28:08 2005 From: peter at mapledesign.co.uk (Peter Bowyer) Date: Thu, 20 Oct 2005 11:28:08 +0100 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <43576458.7040608@ucsd.edu> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> Message-ID: <6.2.3.4.0.20051020112744.0393b880@127.0.0.1> Hi Robert, At 10:33 20/10/2005, Robert Kern wrote: >Matlab cripples scientists who learn it as their first introduction to >"real" programming. They continue to do *everything* in Matlab even when >Matlab is the most hideously wrong tool to use because they've learned >Matlab and using any other language would mean spending the effort to >learn the new language. When they are convinced to learn another >language, it's usually a low-level language so they can write faster >code. But then they only write that low-level code as interfaces to >Matlab so they can continue to do high-level things relatively easily. 
Playing the devils advocate, doesn't that apply to learning any new programming language - because there's always an inertia to overcome? If you substitute Python for Matlab in the above, do you say it doesn't hold true? One point that people always make when they've heard of Python is that "Python is slow, so to write simulations the students will have to learn anyway". I believe enough (informal) studies have been done to show that students with no programming experience learn programming faster using Python. However, then comes the point that "Python does things differently, so it's harder to use another langauge". >As you say, they might have to learn Matlab anyways to deal with other >researcher's code, but if you get them started right with Python, they >won't be contributing to the problem. From the purists point of view I agree, however one criticism of the current course (using the department's own dialect of Basic) is that the language doesn't look good on the CV and isn't useful - something that Matlab may be able to overcome. Of course a bigger problem is that the students don't want to learn to program in the first place ("We'd have done computer science if we liked computers"). >We're working on it. > http://peak.telecommunity.com/DevCenter/EasyInstall Fantastic! >We have fairly good docstring coverage in scipy at least. We need to >build some searching capability, though. Search would be good. Peter -- Maple Design - quality web design and programming http://www.mapledesign.co.uk From t.zito at biologie.hu-berlin.de Thu Oct 20 06:37:09 2005 From: t.zito at biologie.hu-berlin.de (Tiziano Zito) Date: Thu, 20 Oct 2005 12:37:09 +0200 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <102408b60510200255v33b5d619ufa5bcdc65b06682c@mail.gmail.com> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> <6.2.3.4.0.20051020103020.03ab3160@127.0.0.1> <102408b60510200255v33b5d619ufa5bcdc65b06682c@mail.gmail.com> Message-ID: <20051020103709.GA23562@itb.biologie.hu-berlin.de> We used Python for Computational Neuroscience teaching here: http://itb.biologie.hu-berlin.de/~zito/teaching/CNSIV/index.html Another strong arguments against Matlab is that it lacks any kind of Object Oriented design. As already stated by others, that makes practically impossible to reuse old code for new things and exchange code with other people. Last argument is speed and memory consumption. Part of our MDP Python library was originally written in Matlab: the corresponding Python versions are not only much shorter, but at least two times faster. Python passes arguments to functions as references, whereas Matlab uses copies: if you are dealing with large matrices this makes a difference. ciao, tiziano On Thu 20 Oct, 10:55, Morgan Hough wrote: > Python has been used here for physics teaching. 
> > http://www-teaching.physics.ox.ac.uk/computing/ProgrammingResources/Oxford/handbook_Python_html/handbook_Python.html > > Cheers, > > -Morgan > From prabhu_r at users.sf.net Thu Oct 20 06:37:40 2005 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Thu, 20 Oct 2005 16:07:40 +0530 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <6.2.3.4.0.20051020110830.0393b9c8@127.0.0.1> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> <6.2.3.4.0.20051020103020.03ab3160@127.0.0.1> <102408b60510200255v33b5d619ufa5bcdc65b06682c@mail.gmail.com> <6.2.3.4.0.20051020110830.0393b9c8@127.0.0.1> Message-ID: <17239.29556.886646.401175@enthought.cfl.aero.iitm.ernet.in> >>>>> "Peter" == Peter Bowyer writes: Peter> At 10:55 20/10/2005, Morgan Hough wrote: >> Python has been used here for physics teaching. >> >> http://www-teaching.physics.ox.ac.uk/computing/ProgrammingResources/Oxford/handbook_Python_html/handbook_Python.html Peter> Indeed, however Oxford dropped teaching Python the next Peter> year, as they found the students did not find Python any Peter> easier to learn than C (they split the class in half and Peter> taught ~80 students C and ~80 students Python). I believe Peter> this was because the notes were pretty much a straight Peter> translation from C -> Python and am not convinced by the Peter> conclusion reached. You might want to check out Hans Fangohr's experiences in this regard. http://www.python-in-business.org/ep2005/talk.chtml?talk=2777&track=774 cheers, prabhu From cimrman3 at ntc.zcu.cz Thu Oct 20 06:43:41 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 20 Oct 2005 12:43:41 +0200 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <20051020103709.GA23562@itb.biologie.hu-berlin.de> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> <6.2.3.4.0.20051020103020.03ab3160@127.0.0.1> <102408b60510200255v33b5d619ufa5bcdc65b06682c@mail.gmail.com> <20051020103709.GA23562@itb.biologie.hu-berlin.de> Message-ID: <435774DD.7070305@ntc.zcu.cz> Tiziano Zito wrote: > We used Python for Computational Neuroscience teaching here: > > http://itb.biologie.hu-berlin.de/~zito/teaching/CNSIV/index.html > > Another strong arguments against Matlab is that it lacks any kind of > Object Oriented design. As already stated by others, that makes > practically impossible to reuse old code for new things and exchange > code with other people. Last argument is speed and memory > consumption. Part of our MDP Python library was originally written in > Matlab: the corresponding Python versions are not only much shorter, > but at least two times faster. Python passes arguments to functions > as references, whereas Matlab uses copies: if you are dealing with > large matrices this makes a difference. Just a minor correction - Matlab passes arguments also by reference, but it makes a *copy* of each argument that you change inside the function - it can be (ugly)-hacked around though... r. 
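A minimal sketch of the argument-passing point above (Tiziano's claim and Robert Cimrman's correction): Python hands a function a reference to the caller's array, so an in-place change is visible to the caller and no copy is ever made, whereas Matlab copies an argument as soon as the function writes to it. The snippet uses a present-day numpy array as a stand-in for the Numeric/scipy_core arrays discussed in this thread, so treat the import as an assumption:

import numpy as np

def scale_in_place(a, factor):
    # 'a' is the caller's array, not a copy; this writes straight into its buffer
    a *= factor

x = np.arange(5, dtype=float)
scale_in_place(x, 10.0)
print(x)                        # [ 0. 10. 20. 30. 40.] -- the caller sees the change

y = np.arange(5, dtype=float)
scale_in_place(y.copy(), 10.0)  # passing an explicit copy mimics Matlab's semantics
print(y)                        # [0. 1. 2. 3. 4.] -- unchanged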
From cimrman3 at ntc.zcu.cz Thu Oct 20 06:44:42 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 20 Oct 2005 12:44:42 +0200 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <43576458.7040608@ucsd.edu> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> Message-ID: <4357751A.2030801@ntc.zcu.cz> Robert Kern wrote: > [I should state before I begin that I have nothing but the deepest > resentment and animosity towards Matlab.] > > Matlab cripples scientists who learn it as their first introduction to > "real" programming. They continue to do *everything* in Matlab even when > Matlab is the most hideously wrong tool to use because they've learned > Matlab and using any other language would mean spending the effort to > learn the new language. When they are convinced to learn another > language, it's usually a low-level language so they can write faster > code. But then they only write that low-level code as interfaces to > Matlab so they can continue to do high-level things relatively easily. > > Unfortunately, the real benefits that a high-level language ought to > provide are largely absent from Matlab. You can't really do any serious > modularization of your code. Data flows all over the place. If I have to > work with somebody else's Matlab code and it's more than a page long, I > just throw it away and rewrite it in Python from scratch. Maybe I'll > pick out some snippets here and there for guidance, but as a whole, the > Matlab code is worthless *for communicating ideas*. When you're a > research scientist, your code is only partially about getting the right > answer. It's also very important that other people can understand what > the code is doing and can build on your code. > ... As I generally agree, that Matlab is terrible as a programming language, I must say that even in it one can write a reasonably structured/modularized code. Of course, in python it's orders of magnitude easier. Btw. I am a Matlab user because of various circumstances, so I had to learn how to live with it but now I am trying to break the shackles, partially also thanks to the existence of SciPy and similar projects! r. From morgan.hough at gmail.com Thu Oct 20 06:45:39 2005 From: morgan.hough at gmail.com (Morgan Hough) Date: Thu, 20 Oct 2005 11:45:39 +0100 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <6.2.3.4.0.20051020110830.0393b9c8@127.0.0.1> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> <6.2.3.4.0.20051020103020.03ab3160@127.0.0.1> <102408b60510200255v33b5d619ufa5bcdc65b06682c@mail.gmail.com> <6.2.3.4.0.20051020110830.0393b9c8@127.0.0.1> Message-ID: <102408b60510200345x6140fa9cq33ae1a667e2d98f@mail.gmail.com> I must confess that the postgraduate Scientific Computing for DPhil Students still use MATLAB but I thought the undergraduates still had a choice. I did bring up python as an alternative for the Scientific Computing course but to no avail. I know what you mean about those python notes. There really needs to be an Enthought Python (or an easy way to put it together) for all platforms. It would make it a lot easier on Linux systems to get people to present it as a MATLAB alternative. Right now I am doing a C++ workgroup in the Maths Institute as it is the only group willing to discuss OO programming. Not the best way to do it really. 
Cheers, -Morgan On 10/20/05, Peter Bowyer wrote: > At 10:55 20/10/2005, Morgan Hough wrote: > >Python has been used here for physics teaching. > > > >http://www-teaching.physics.ox.ac.uk/computing/ProgrammingResources/Oxford/handbook_Python_html/handbook_Python.html > > Indeed, however Oxford dropped teaching Python the next year, as they > found the students did not find Python any easier to learn than C > (they split the class in half and taught ~80 students C and ~80 > students Python). I believe this was because the notes were pretty > much a straight translation from C -> Python and am not convinced by > the conclusion reached. > > Peter > > -- > Maple Design - quality web design and programming > http://www.mapledesign.co.uk > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From ryanlists at gmail.com Thu Oct 20 09:34:12 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Thu, 20 Oct 2005 09:34:12 -0400 Subject: [SciPy-user] python+maxima+latex for symolic stuff In-Reply-To: <102408b60510200230o4dc235cbu91049f80c9636786@mail.gmail.com> References: <102408b60510200230o4dc235cbu91049f80c9636786@mail.gmail.com> Message-ID: I will clean it up and create a simple example over the next few days and let people know when it is available. Ryan On 10/20/05, Morgan Hough wrote: > I agree. Sounds very interesting. Worth packaging up. > > -Morgan > > On 10/19/05, Yann Le Du wrote: > > Hi, > > > > Yes, I'm interested in having a go at this tool, looks great. > > > > -- > > Yann Le Du > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From nikolai.hlubek at mailbox.tu-dresden.de Thu Oct 20 10:18:27 2005 From: nikolai.hlubek at mailbox.tu-dresden.de (Nikolai Hlubek) Date: Thu, 20 Oct 2005 16:18:27 +0200 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> Message-ID: <4357A733.1070201@mailbox.tu-dresden.de> Noel O'Boyle wrote: [...] >>3) The editor has a good interface (v7 and above) which IDLE lacks >>(no data inspector 'right there') > > > There are other editors. For example, ipython is closer to matlab, if > you install matplotlib. [...] I'd like to promote eric a bit. I've been using it for a year now and it is really great. If you are looking for a full featured python IDE just give it a try. http://www.die-offenbachs.de/detlev/eric3.html Cheers, Nikolai -- "1984" is not a howto! 
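The "ipython is closer to matlab, if you install matplotlib" suggestion quoted above boils down to an interactive session of roughly this shape. This is only a sketch: the startup switch and the exact set of names pre-imported by the pylab profile have changed across IPython and matplotlib versions, so treat the flag and the bare function names as assumptions:

# started as:  ipython -pylab    (newer releases spell it: ipython --pylab)
x = linspace(0, 10, 200)         # pylab mode pre-imports array and plotting names
plot(x, sin(x) * exp(-x / 5.0))  # a figure window appears immediately, Matlab-style
xlabel('t')
title('interactive plotting from the shell')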
From arnd.baecker at web.de Thu Oct 20 12:00:24 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 20 Oct 2005 18:00:24 +0200 (CEST) Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <17239.28930.79965.507605@enthought.cfl.aero.iitm.ernet.in> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> <17239.28930.79965.507605@enthought.cfl.aero.iitm.ernet.in> Message-ID: On Thu, 20 Oct 2005, Prabhu Ramachandran wrote: > >>>>> "Robert" == Robert Kern writes: > > Robert> At the recent SciPy '05 conference, Aric Hagberg told me > Robert> that he ran a student summer program at LANL where they > Robert> had to make a choice between purchasing Matlab licenses > Robert> for a few dozen students or using Python. Of course they > Robert> used Python. To help with the installation issue, they > Robert> built a Linux LiveCD with all of the relevant software > > Is this live Linux CD available for download somewhere? I'm sure a > large number of folks would find this very useful. What about the debian based Quantian? http://dirk.eddelbuettel.com/quantian.html Some time ago we did some effort to get scipy included. Presently it does in addition include python 2.3, ipython, matplotlib, python-vtk, wxgtk2.4-python, atlas2-base, PIL ... More detailed list below and at http://dirk.eddelbuettel.com/quantian/quantian_0.6.9.3.quantian.packages.txt Best, Arnd ii python 2.3.4-1 An interactive high-level object-oriented language (default vers ii python-apt 0.5.10 Python interface to libapt-pkg ii python-bibtex 1.2.1-1 Python interfaces to BibTeX and the GNU Recode library ii python-biopython 1.30-2 Python library for bioinformatics ii python-biopython-martel 1.30-1 Flat file parser for the Biopython library ii python-dev 2.3.4-4 Header files and a static library for Python (default) ii python-egenix-mxdatetime 2.0.5-2 Date and time handling routines for Python [dummy package] ii python-egenix-mxtexttool 2.0.5-2 Fast text manipulation tools for Python [dummy package] ii python-gd 0.53-1 GD module is an interface to the GD library ii python-gdal 1.2.1-1 Python bindings to the Geospatial Data Abstraction Library ii python-glade2 2.2.0-3 GTK+ bindings: Glade support ii python-gnome2 2.0.3-1 Python bindings for the GNOME desktop environment ii python-gnuplot 1.7-4 A Python interface to the gnuplot plotting program ii python-gtk2 2.2.0-3 Python bindings for the GTK+ widget set ii python-imaging 1.1.4-3 Python Imaging Library ii python-matplotlib 0.64-1 python based plotting system in a style similar to matlab ii python-mode 4.62-1 Emacs-lisp python-mode for the Python language ii python-netcdf 2.4.6-2 A netCDF interface for Python ii python-numeric 23.6-2 Numerical (matrix-oriented) Mathematics for Python ii python-numeric-ext 23.6-2 Extension modules for Numeric Python ii python-pmw 1.2-2 Pmw -- Python MegaWidgets ii python-pypaint 0.3-1 pypaint provides a light Python wrapper for libart ii python-reportlab 1.19debian-0.1 ReportLab library to create PDF documents using Python ii python-scientific 2.4.6-2 Python modules useful for scientific computing ii python-stats 0.6-5 A collection of statistical functions for Python ii python-tables 0.8.1-4 A hierarchical database for Python based on HDF5 ii python-tk 2.3.4-1 Tkinter - Writing Tk applications with Python (default version) ii python-vtk 4.2.6-5 Python bindings for VTK ii python-xml 0.8.3-5 XML tools for Python [dummy package] ii python2.3 2.3.4-13 An interactive high-level object-oriented language (version 
2.3) ii python2.3-arpack 0.2-3 Python binding to ARPACK ii python2.3-biopython 1.30-1 Python library for bioinformatics ii python2.3-biopython-mart 1.30-1 Flat file parser for the Biopython library ii python2.3-cjkcodecs 1.1.1-1 Python Unicode Codecs Collection for CJK Encodings ii python2.3-dev 2.3.4-13 Header files and a static library for Python (v2.3) ii python2.3-doc 2.3.4-13 Documentation for the high-level object-oriented language Python ii python2.3-egenix-mxdatet 2.0.5-2 Date and time handling routines for Python 2.3 ii python2.3-egenix-mxtextt 2.0.5-2 Fast text manipulation tools for Python 2.3 ii python2.3-egenix-mxtools 2.0.5-2 A collection of new builtins for Python 2.3 ii python2.3-f2py 2.42.239+1844- Fortran to Python interface generator ii python2.3-glade2 2.2.0-3 GTK+ bindings: Glade support ii python2.3-gnome2 2.0.3-1 Python bindings for the GNOME desktop environment ii python2.3-gtk2 2.2.0-3 Python bindings for the GTK+ widget set ii python2.3-iconvcodec 1.1.2-1 Python universal Unicode codec, using iconv() ii python2.3-imaging 1.1.4-3 Python Imaging Library ii python2.3-japanese-codec 1.4.9-3 Japanese Codecs for Python ii python2.3-mysqldb 1.1.6-1 A Python interface to MySQL ii python2.3-numarray 1.1-2 An array processing package modelled after Python-Numeric ii python2.3-numeric 23.5-1 Numerical (matrix-oriented) Mathematics for Python ii python2.3-numeric-ext 23.5-1 Extension modules for Numeric Python ii python2.3-psycopg 1.1.15-1 Python 2.3 module for PostgreSQL ii python2.3-pyorbit 2.0.1-1 A Python language binding for the ORBit2 CORBA implementation ii python2.3-pythoncard 0.7.3.1-1 wxPython-based GUI construction framework (underlying Python 2.3 ii python2.3-reportlab 1.19debian-0.1 ReportLab library to create PDF documents using Python (2.3) ii python2.3-rpy 0.4.0-1 Python interface to the GNU R language and environment ii python2.3-scipy 0.3.2-3 scientific tools for Python 2.3 ii python2.3-scipy-core 0.3.0+108.1820 Scientific tools for Python ii python2.3-sqlite 0.5.1-3 Python interface to SQLite ii python2.3-subversion 1.0.9-2 Python modules for interfacing with Subversion (aka. svn) ii python2.3-tables 0.8.1-4 A hierarchical database for Python based on HDF5 ii python2.3-tk 2.3.4-13 Tkinter - Writing Tk applications with Python (v2.3) ii python2.3-umfpack 0.1-3 Python binding to UMFPACK ii python2.3-xml 0.8.3-5 XML tools for Python (2.3.x) From H.FANGOHR at soton.ac.uk Thu Oct 20 13:20:16 2005 From: H.FANGOHR at soton.ac.uk (Hans Fangohr) Date: Thu, 20 Oct 2005 18:20:16 +0100 (BST) Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> Message-ID: Hi all, > I'm new to this list, so let me introduce myself. I'm currently > doing a Masters investigating ways to teach physics students how to > modules. However, the question I am consistently coming up against > is "Why not teach the students Matlab?". It's a very good point and > one I can think up no clear answer for. If the students stay in > Physics or a related field they will be using Matlab (and > C/C++/Fortran if needed). Therefore is there any reason not to teach > Matlab as the introduction to programming? As Prabhu pointed out, I have recently replaced Matlab with Python in a computing module I deliver to engineering students. They have very little prior knowledge. Let me try to reply to some of your points, and provide some further thought, partly based on this experience. 
I know Matlab pretty well, and have experience teaching beginners using it. While in my eyes Matlab is a very good collection of linear algebra libraries and decent visualisation in the same package, the glueing 'matlab language' that is used to script these components together is quite broken (from a teaching point of view). For example, (i) you can have only one (globally visible) function per file. This is confusing for students because I try to introduce the concept of modularisation by suggesting they use functions to split up code. They find this not useful, because then the code is distributed over several files (!) and you can't see all the files at the same time on your screen. Also: (ii) the name of the function is determined by the label of the function inside the file that defines the function. However, other code (located outside that file) will know the function under the name of the file (!) it is saved in (without the required ".m" extension). You therefore have two names which -- if you want to keep things tidy -- should always be the same. This is another source of errors and frustration. (iii) There are more odd things, for example try these statements at the matlab command prompt: "sum 10" or "cos pi" for some entertainment. If you are a student, you might expect "10" and "-1.0" as replies but you get "97" and "0.4560 -0.2410". (I would have hoped to get an error message.) While these are very entertaining challenges for the lecturer, this is clearly confusing to the student. (Can you work out why this happens?) It boils down to this: If you know how to program 'correctly' already, how to design your programs and data structures well, then you can write useful and structured code in Matlab (or in C for that matter). If, however, you are trying to learn these abilities, then Python puts fewer obstacles in the way than Matlab. I have published a few of these thoughts here (if citing a paper helps for your case): H. Fangohr. A Comparison of C, Matlab and Python as Teaching Languages in Engineering. Lecture Notes in Computer Science 3039, 1210-1217 (2004). Above, I have outlined practical issues for complete beginners. If you look slightly ahead, then Python supports the three most common programming paradigms: imperative, functional and object oriented, whereas Matlab is (practically) procedural (and it is not really object oriented). Python allows you to use these concepts but you can mix and match as you like and ignore what you don't want to know and use. (This keeps the barrier low to start using it, but keeps the door open to do quite sophisticated stuff.) With respect to 'Matlab is THE software in industry': there is no doubt that it is widely used in certain parts of science and engineering. (And for good reason: it is an interpreted high-level language that makes your life easier than if you used Fortran or C for the same problem.) However, there is certainly a growing trend towards Python. I have spoken with a representative from Airbus and was told that a number of new projects use Python as the user interface (often with C or C++ underneath wherever speed is required). Surprisingly, one of the reasons mentioned for moving away from Matlab is that it was too expensive. They also expressed delight that students leaving our university would know (some) Python already. I hope this is some useful information.
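To make the "sum 10" / "cos pi" puzzle above concrete: Matlab's command syntax passes the words as character arrays, so sum('10') adds the character codes of '1' and '0' (49 + 48 = 97) and cos('pi') takes the cosine of the codes for 'p' and 'i'. The equivalent mistakes in Python fail loudly instead of returning a surprising number. A small sketch using only the standard library (the exact wording of the TypeError messages differs between Python versions):

import math

for label, bad_call in [("sum '10'", lambda: sum("10")),
                        ("cos 'pi'", lambda: math.cos("pi"))]:
    try:
        bad_call()
    except TypeError as err:
        print(label, "-> refused:", err)   # Python raises rather than guessing

print(sum([10]))           # 10   -- the call the student actually meant
print(math.cos(math.pi))   # -1.0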
Cheers, Hans From perry at stsci.edu Thu Oct 20 14:13:45 2005 From: perry at stsci.edu (Perry Greenfield) Date: Thu, 20 Oct 2005 14:13:45 -0400 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> Message-ID: On Oct 20, 2005, at 4:24 AM, Peter Bowyer wrote: > Hi, > > I'm new to this list, so let me introduce myself. I'm currently > doing a Masters investigating ways to teach physics students how to > program (and to investigate how much programming knowledge they need, > given 2/3 of the class don't continue on in physics). > > In the past as with most physics departments the course was taught as > an introduction to C or Basic. This worked fine up to the late 90s > when most students had experience of programming. The structure here > has been to have an introductory course in the 2nd year teaching > people how to program, and then an optional course in the 3rd year > introducing computational physics. > > There's a fundamental shift in principle in what I am considering, > moving from the "How to implement numerical integration and > differential equation" point, to "How to use libraries by others to > carry out your calculations". The course currently lasts all of 12 > hours and less than 5% of the students taking it have any knowledge > of programming at the start. I did not need to use the material > taught during the course in the rest of my degree, something which I > am hoping to improve by changing the angle the course is coming from. > > My idea was to teach the basics using Python and scientific > modules. However, the question I am consistently coming up against > is "Why not teach the students Matlab?". It's a very good point and > one I can think up no clear answer for. If the students stay in > Physics or a related field they will be using Matlab (and > C/C++/Fortran if needed). Therefore is there any reason not to teach > Matlab as the introduction to programming? > > My arguments at present are that Matlab is a proprietary tool so the > cost to students in obtaining copies will be not inconsiderable > (considering it will only be used for a short course), and that > Matlab is a specialised tool, so those not interested in going on > into a physics related field will not find it of any use (unlike > Python). > > The arguments for Matlab are stronger: > 1) It's a standard tool, widely used > 2) It is easier to install and maintain (discounting the Enthought > edition for a moment, Python is CRAP compared with other langauges - > where is the Package manager to make life easier?) > 3) The editor has a good interface (v7 and above) which IDLE lacks > (no data inspector 'right there') > 4) Integrated help for all the scientific functions > > Are there any reasons you can think of that Python makes a better > choice than Matlab? I myself would far rather use Python (I have > ideas about how VPython can help the students understand Python) but > need a more robust reason than a handwaving argument about > "3D...easier for students to visualise...". > > Many thanks, and apologies if this is badly off-topic for the list. > Peter The problem I have with this is that your goals seem murky. Is the choice centered around what is most effective to teach Physics students how to program (as your opening suggest). Or what they think will be most useful to them in their career? Or which you think will irritate them least with regard to installation or cost? 
Or something else? Who is deciding? You? Your advisor? If your goal is teaching them programming and the most modern programming concepts, the answer is easy: pick Python. If you want to give them something they can use now with good current documentation and ease of installation: pick Matlab. If you are forward looking (i.e., that you want this to persist in some way beyond just helping you finish your masters), I'd recommend Python. Things are certainly more confusing, less well integrated, changing rapidly, etc on the Python side of things. But I think Python as a scientific and engineering environment is gaining rapidly on things like Matlab and IDL and one day in the not too distant future (I hope) will leave both of these in the dust. So if you want to be a pioneer, take the leap. You may find that if the students "stay in the field", they will be using Python and not Matlab. But I'm a strong believer in not overselling the current situation. As a canned solution, Python (scipy) isn't yet at the level of either of these, and it would be a mistake to represent it as such. Perry Greenfield From rkern at ucsd.edu Thu Oct 20 15:17:13 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 20 Oct 2005 12:17:13 -0700 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <6.2.3.4.0.20051020112744.0393b880@127.0.0.1> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> <6.2.3.4.0.20051020112744.0393b880@127.0.0.1> Message-ID: <4357ED39.9060507@ucsd.edu> Peter Bowyer wrote: > Hi Robert, > > At 10:33 20/10/2005, Robert Kern wrote: > >>Matlab cripples scientists who learn it as their first introduction to >>"real" programming. They continue to do *everything* in Matlab even when >>Matlab is the most hideously wrong tool to use because they've learned >>Matlab and using any other language would mean spending the effort to >>learn the new language. When they are convinced to learn another >>language, it's usually a low-level language so they can write faster >>code. But then they only write that low-level code as interfaces to >>Matlab so they can continue to do high-level things relatively easily. > > Playing the devils advocate, doesn't that apply to learning any new > programming language - because there's always an inertia to > overcome? If you substitute Python for Matlab in the above, do you > say it doesn't hold true? No, because Python is an actual, useful programming language. Matlab is barely one. It's quite appropriate to write GUIs in Python. It is never appropriate to write GUIs in Matlab. Python isn't appropriate for *everything*, but it carves out a humongously larger territory than Matlab. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From jeremy at jeremysanders.net Thu Oct 20 15:45:50 2005 From: jeremy at jeremysanders.net (Jeremy Sanders) Date: Thu, 20 Oct 2005 20:45:50 +0100 (BST) Subject: [SciPy-user] ANN: Veusz 0.8 released Message-ID: Veusz 0.8 --------- Velvet Ember Under Sky Zenith ----------------------------- http://home.gna.org/veusz/ Veusz is Copyright (C) 2003-2005 Jeremy Sanders Licenced under the GPL (version 2 or greater) Veusz is a scientific plotting package written in Python (currently 100% Python). It uses PyQt for display and user-interfaces, and numarray for handling the numeric data. Veusz is designed to produce publication-ready Postscript output. 
Veusz provides a GUI, command line and scripting interface (based on Python) to its plotting facilities. The plots are built using an object-based system to provide a consistent interface. Changes from 0.7: Please refer to ChangeLog for all the changes. Highlights include: * Datasets can be linked together with expressions * SVG export * Edit/Copy/Cut support of widgets * Pan image with mouse * Click on graph to change settings * Lots of UI improvements Features of package: * X-Y plots (with errorbars) * Images (with colour mappings) * Stepped plots (for histograms) * Line plots * Function plots * Fitting functions to data * Stacked plots and arrays of plots * Plot keys * Plot labels * LaTeX-like formatting for text * EPS output * Simple data importing * Scripting interface * Save/Load plots * Dataset manipulation * Embed Veusz within other programs To be done: * Contour plots * UI improvements * Import filters (for qdp and other plotting packages, fits, csv) Requirements: Python (probably 2.3 or greater required) http://www.python.org/ Qt (free edition) http://www.trolltech.com/products/qt/ PyQt (SIP is required to be installed first) http://www.riverbankcomputing.co.uk/pyqt/ http://www.riverbankcomputing.co.uk/sip/ numarray http://www.stsci.edu/resources/software_hardware/numarray Microsoft Core Fonts (recommended) http://corefonts.sourceforge.net/ PyFITS (optional) http://www.stsci.edu/resources/software_hardware/pyfits For documentation on using Veusz, see the "Documents" directory. The manual is in pdf, html and text format (generated from docbook). If you enjoy using Veusz, I would love to hear from you. Please join the mailing lists at https://gna.org/mail/?group=veusz to discuss new features or if you'd like to contribute code. The newest code can always be found in CVS. From Fernando.Perez at colorado.edu Thu Oct 20 16:16:49 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 20 Oct 2005 14:16:49 -0600 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <17239.28930.79965.507605@enthought.cfl.aero.iitm.ernet.in> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> <17239.28930.79965.507605@enthought.cfl.aero.iitm.ernet.in> Message-ID: <4357FB31.4080805@colorado.edu> Prabhu Ramachandran wrote: >>>>>>"Robert" == Robert Kern writes: > > > Robert> At the recent SciPy '05 conference, Aric Hagberg told me > Robert> that he ran a student summer program at LANL where they > Robert> had to make a choice between purchasing Matlab licenses > Robert> for a few dozen students or using Python. Of course they > Robert> used Python. To help with the installation issue, they > Robert> built a Linux LiveCD with all of the relevant software > > Is this live Linux CD available for download somewhere? I'm sure a > large number of folks would find this very useful. Not really. It was prepared by Aric's excellent sysadmin for a course John Hunter and I taught this summer at LANL, at Aric's invitation. We didn't go with Quantian because we wanted up-to-date versions of various things, in addition to making sure that all the class materials would be burned on the (ubuntu-based) CD. This gave the students something to take home. The admin did an amazing job: there being no 'student lab' at LANL, he burned these CDs both for x86 and PPC, and everyone just brought in a laptop, grabbed a CD from a stack for their architecture (PC or Powerbooks), and went to town. 
I was impressed to see 30 people go from zero to fully working workstation, with 100% guaranteed identical environments, across 2 chip architectures, in 3 minutes (time to boot from CD). This is an experience worth repeating and disseminating, but I imagine Aric is too swamped with other things to push it much. You may want to contact him directly. Regards, f From rmuller at sandia.gov Thu Oct 20 16:44:42 2005 From: rmuller at sandia.gov (Rick Muller) Date: Thu, 20 Oct 2005 14:44:42 -0600 Subject: [SciPy-user] Matlab's economy size QR decomposition Message-ID: <11331403-AA04-41E8-AB21-2C337A7BCF13@sandia.gov> I'm trying to mimic matlabs "economy size" qr decomposition in scipy. Does anyone know how to achieve this using scipy.linalg.qr? If I call q,r = qr(A) Can I then do something like say q = q[:A.shape[0]] or something along these lines? Thanks in advance for any help anyone can offer. Rick From oliphant at ee.byu.edu Fri Oct 21 01:20:44 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 20 Oct 2005 23:20:44 -0600 Subject: [SciPy-user] Matlab's economy size QR decomposition In-Reply-To: <11331403-AA04-41E8-AB21-2C337A7BCF13@sandia.gov> References: <11331403-AA04-41E8-AB21-2C337A7BCF13@sandia.gov> Message-ID: <43587AAC.7030303@ee.byu.edu> Rick Muller wrote: >I'm trying to mimic matlabs "economy size" qr decomposition in scipy. >Does anyone know how to achieve this using scipy.linalg.qr? > >If I call > q,r = qr(A) >Can I then do something like say > q = q[:A.shape[0]] >or something along these lines? > > Yes, something like that. Presumably you are after the just the columns of q corresponding to the non-zero rows of r. In that case (assuming m>n) m,n = A.shape q,r = qr(A) qhat = q[:,:n] rhat = r[:n] should work. -Travis O. From cournape at atr.jp Fri Oct 21 01:31:07 2005 From: cournape at atr.jp (Cournapeau David) Date: Fri, 21 Oct 2005 14:31:07 +0900 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <435774DD.7070305@ntc.zcu.cz> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> <6.2.3.4.0.20051020103020.03ab3160@127.0.0.1> <102408b60510200255v33b5d619ufa5bcdc65b06682c@mail.gmail.com> <20051020103709.GA23562@itb.biologie.hu-berlin.de> <435774DD.7070305@ntc.zcu.cz> Message-ID: <1129872667.4692.53.camel@localhost.localdomain> On Thu, 2005-10-20 at 12:43 +0200, Robert Cimrman wrote: > Tiziano Zito wrote: > > We used Python for Computational Neuroscience teaching here: > > > > http://itb.biologie.hu-berlin.de/~zito/teaching/CNSIV/index.html > > > > Another strong arguments against Matlab is that it lacks any kind of > > Object Oriented design. As already stated by others, that makes > > practically impossible to reuse old code for new things and exchange > > code with other people. Last argument is speed and memory > > consumption. Part of our MDP Python library was originally written in > > Matlab: the corresponding Python versions are not only much shorter, > > but at least two times faster. Python passes arguments to functions > > as references, whereas Matlab uses copies: if you are dealing with > > large matrices this makes a difference. > > Just a minor correction - Matlab passes arguments also by reference, but > it makes a *copy* of each argument that you change inside the function - > it can be (ugly)-hacked around though... Matlab uses copy on write techniques, so the arguments as references against arguments as copy does not hold. 
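Returning to Rick's "economy size" QR question a little earlier in this thread: Travis's recipe just drops the columns of q (and rows of r) beyond the first n, where n is the number of columns of the tall matrix. A minimal sketch, written with current numpy/scipy names rather than the 2005 scipy_core ones, so treat the exact imports as assumptions:

import numpy as np
from scipy.linalg import qr

A = np.random.rand(6, 3)     # tall matrix: m = 6 rows, n = 3 columns
m, n = A.shape

q, r = qr(A)                 # full decomposition: q is (6, 6), r is (6, 3)
qhat = q[:, :n]              # keep the first n columns of q  -> (6, 3)
rhat = r[:n]                 # keep the first n rows of r     -> (3, 3)

assert np.allclose(np.dot(qhat, rhat), A)   # the reduced factors still reproduce A

(Later scipy releases also accept qr(A, mode='economic') and return the reduced factors directly.)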
In my opinion, the main (technical) problem of matlab, by far, is that there is only good support for arrays of double. From there comes the memory consumption problem (eg: loading an audiofile in double is stupid most of the time). From there comes the speed issue when using something else than double arrays (matlab is arguably very fast in that domain for linear algebra stuff on native arrays). Emulating list with cell array is extremely slow, and cell array are really impractical to use (indexing syntax is different for example). Now, the existing matlab codebase for domain such as signal processing is huge, and without a way to interface matlab and python, I don't see that changing soon, unfortunately. I am still using matlab in many projects for that reason; I am using some ugly hacks to use python back and forth inside matlab, but this is far from ideal. David From rkern at ucsd.edu Fri Oct 21 01:51:30 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 20 Oct 2005 22:51:30 -0700 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <1129872667.4692.53.camel@localhost.localdomain> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> <6.2.3.4.0.20051020103020.03ab3160@127.0.0.1> <102408b60510200255v33b5d619ufa5bcdc65b06682c@mail.gmail.com> <20051020103709.GA23562@itb.biologie.hu-berlin.de> <435774DD.7070305@ntc.zcu.cz> <1129872667.4692.53.camel@localhost.localdomain> Message-ID: <435881E2.9020705@ucsd.edu> Cournapeau David wrote: > Now, the existing matlab codebase for domain such as signal processing > is huge, and without a way to interface matlab and python, I don't see > that changing soon, unfortunately. I am still using matlab in many > projects for that reason; I am using some ugly hacks to use python back > and forth inside matlab, but this is far from ideal. Have you seen these projects? http://pymat.sourceforge.net/ http://mlabwrap.sourceforge.net/ -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From cournape at atr.jp Fri Oct 21 02:13:42 2005 From: cournape at atr.jp (Cournapeau David) Date: Fri, 21 Oct 2005 15:13:42 +0900 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <435881E2.9020705@ucsd.edu> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <1129798510.5197.23.camel@sandwi.ch.cam.ac.uk> <6.2.3.4.0.20051020103020.03ab3160@127.0.0.1> <102408b60510200255v33b5d619ufa5bcdc65b06682c@mail.gmail.com> <20051020103709.GA23562@itb.biologie.hu-berlin.de> <435774DD.7070305@ntc.zcu.cz> <1129872667.4692.53.camel@localhost.localdomain> <435881E2.9020705@ucsd.edu> Message-ID: <1129875222.4692.57.camel@localhost.localdomain> On Thu, 2005-10-20 at 22:51 -0700, Robert Kern wrote: > Cournapeau David wrote: > > > Now, the existing matlab codebase for domain such as signal processing > > is huge, and without a way to interface matlab and python, I don't see > > that changing soon, unfortunately. I am still using matlab in many > > projects for that reason; I am using some ugly hacks to use python back > > and forth inside matlab, but this is far from ideal. > > Have you seen these projects? > > http://pymat.sourceforge.net/ > http://mlabwrap.sourceforge.net/ > I didn't know about the second one, thanks for the information. I didn't like pymat that much, and it is not developed anymore anyway. mlabwrap looks interesting; I wonder if it would be possible to include it (or something similar) in scipy. 
I think I am not the only one who would like to use python, while still relying on some matlab code. cheers, David From peter at mapledesign.co.uk Fri Oct 21 04:19:31 2005 From: peter at mapledesign.co.uk (Peter Bowyer) Date: Fri, 21 Oct 2005 09:19:31 +0100 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> Message-ID: <6.2.3.4.0.20051020224153.04392460@127.0.0.1> At 19:13 20/10/2005, Perry Greenfield wrote: >The problem I have with this is that your goals seem murky. Is the >choice centered around what is most effective to teach Physics students >how to program (as your opening suggest). Or what they think will be >most useful to them in their career? Or which you think will irritate >them least with regard to installation or cost? Or something else? The short answer is that I don't know. The long answer is that this is another goal of the project: to decide whether programming should be taught to physics undergraduates; if so how much should be and when and in what form. At present it is not clear whether this has to fit within the departmental course structure and time-tabling (the current introductory course has got squeezed shorter and shorter) or whether I can propose a totally different course. As the optional 3rd year computational physics course uses the programming from the introductory course, whatever changes I make will affect the latter course. And should we be teaching all physics students to program (instead of just those who do the computational physics course) - in fact why not teach them how to plot graphs and analyse data instead, skills they'll use in the 2nd and 3rd year laboratories? At this point the project really goes back to the question of "What should the point of a Physics degree be?". What are we trying to create in the students - future researchers, numerically literate people to fill jobs, people with broad thinking? This question will have to be addressed to some extent by the project, as it will shape the direction of the course. In the end the project will be what I define it to be, and I am leaning to the first goal (what is most effective to teach Physics students how to program). >Who is deciding? You? Your advisor? I am, although I have been advised that whatever solution I propose, a version that fits within departmental constraints would be appropriate. Peter From peter at mapledesign.co.uk Fri Oct 21 04:19:31 2005 From: peter at mapledesign.co.uk (Peter Bowyer) Date: Fri, 21 Oct 2005 09:19:31 +0100 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> Message-ID: <6.2.3.4.0.20051020224153.04392460@127.0.0.1> At 19:13 20/10/2005, Perry Greenfield wrote: >The problem I have with this is that your goals seem murky. Is the >choice centered around what is most effective to teach Physics students >how to program (as your opening suggest). Or what they think will be >most useful to them in their career? Or which you think will irritate >them least with regard to installation or cost? Or something else? The short answer is that I don't know. The long answer is that this is another goal of the project: to decide whether programming should be taught to physics undergraduates; if so how much should be and when and in what form. 
At present it is not clear whether this has to fit within the departmental course structure and time-tabling (the current introductory course has got squeezed shorter and shorter) or whether I can propose a totally different course. As the optional 3rd year computational physics course uses the programming from the introductory course, whatever changes I make will affect the latter course. And should we be teaching all physics students to program (instead of just those who do the computational physics course) - in fact why not teach them how to plot graphs and analyse data instead, skills they'll use in the 2nd and 3rd year laboratories? At this point the project really goes back to the question of "What should the point of a Physics degree be?". What are we trying to create in the students - future researchers, numerically literate people to fill jobs, people with broad thinking? This question will have to be addressed to some extent by the project, as it will shape the direction of the course. In the end the project will be what I define it to be, and I am leaning to the first goal (what is most effective to teach Physics students how to program). >Who is deciding? You? Your advisor? I am, although I have been advised that whatever solution I propose, a version that fits within departmental constraints would be appropriate. Peter From nwagner at mecha.uni-stuttgart.de Fri Oct 21 06:50:46 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 21 Oct 2005 12:50:46 +0200 Subject: [SciPy-user] Array handling Message-ID: <4358C806.6060708@mecha.uni-stuttgart.de> >>> a=rand(4) >>> a array([ 0.53451766, 0.10459875, 0.91114967, 0.70552477]) How can I chop the components of the array a to m decimal places (e.g. m=2 --> a_new = array(([0.53, 0.10, 0.91, 0.70])) ? Nils From cimrman3 at ntc.zcu.cz Fri Oct 21 06:54:49 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Fri, 21 Oct 2005 12:54:49 +0200 Subject: [SciPy-user] Array handling In-Reply-To: <4358C806.6060708@mecha.uni-stuttgart.de> References: <4358C806.6060708@mecha.uni-stuttgart.de> Message-ID: <4358C8F9.8030007@ntc.zcu.cz> Nils Wagner wrote: >>>>a=rand(4) >>>>a > > array([ 0.53451766, 0.10459875, 0.91114967, 0.70552477]) > > How can I chop the components of the array a to m decimal places (e.g. > m=2 --> a_new = array(([0.53, 0.10, 0.91, 0.70])) ? scipy.round_( 1.234234234, 2 ) r. From rmuller at sandia.gov Fri Oct 21 07:15:51 2005 From: rmuller at sandia.gov (Rick Muller) Date: Fri, 21 Oct 2005 05:15:51 -0600 Subject: [SciPy-user] Matlab's economy QR In-Reply-To: References: Message-ID: <9E96EDCB-4F0B-4DFD-8030-65BAE9D8D968@sandia.gov> Yup. That worked. Thanks very much. On Oct 20, 2005, at 11:11 PM, Travis Oliphant wrote: >> I'm trying to mimic matlabs "economy size" qr decomposition in scipy. >> Does anyone know how to achieve this using scipy.linalg.qr? >> >> If I call >> q,r = qr(A) >> Can I then do something like say >> q = q[:A.shape[0]] >> or something along these lines? >> >> >> > Yes, something like that. Presumably you are after the just the > columns > of q corresponding to the non-zero rows of r. > > In that case (assuming m>n) > > m,n = A.shape > q,r = qr(A) > qhat = q[:,:n] > rhat = r[:n] > > should work. > From steve at shrogers.com Fri Oct 21 08:06:06 2005 From: steve at shrogers.com (Steven H.
Rogers) Date: Fri, 21 Oct 2005 06:06:06 -0600 Subject: [SciPy-user] Matlab, Scipy and teaching science In-Reply-To: <6.2.3.4.0.20051020224153.04392460@127.0.0.1> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <6.2.3.4.0.20051020224153.04392460@127.0.0.1> Message-ID: <4358D9AE.4050702@shrogers.com> I was a Physics undergrad, and my two cents is that programming is just as important as calculus for a 21st century Physics graduate. When I was a frsshman, we learned calculus as part of our introductory Physics course despite taking a calculus course concurrently from the Math department because we needed to use techniques that came later in the Math course. I think Programming should be given similar treatement. It's useful for teaching critical, logical thought in addition to producing nice tables and graphs of results. Regards, Steve Peter Bowyer wrote: > At 19:13 20/10/2005, Perry Greenfield wrote: > >>The problem I have with this is that your goals seem murky. Is the >>choice centered around what is most effective to teach Physics students >>how to program (as your opening suggest). Or what they think will be >>most useful to them in their career? Or which you think will irritate >>them least with regard to installation or cost? Or something else? > > > The short answer is that I don't know. > > The long answer is that this is another goal of the project: to > decide whether programming should be taught to physics > undergraduates; if so how much should be and when and in what > form. At present it is not clear whether this has to fit within the > departmental course structure and time-tabling (the current > introductory course has got squeezed shorter and shorter) or whether > I can propose a totally different course. As the optional 3rd year > computational physics course uses the programming from the > introductory course, whatever changes I make will affect the latter > course. And should we be teaching all physics students to program > (instead of just those who do the computational physics course) - in > fact why not teach them how to plot graphs and analyse data instead, > skills they'll use in the 2nd and 3rd year laboratories? > > At this point the project really goes back to the question of "What > should the point of a Physics degree be?". What are we trying to > create in the students - future researchers, numerically literate > people to fill jobs, people with broad thinking? This question will > have to be addressed to some extent by the project, as it will shape > the direction of the course. > > In the end the project will be what I define it to be, and I am > leaning to the first goal (what is most effective to teach Physics > students how to program). > > >>Who is deciding? You? Your advisor? > > > I am, although I have been advised that whatever solution I propose, > a version that fits within departmental constraints would be appropriate. > > Peter > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > > -- Steven H. Rogers, Ph.D., steve at shrogers.com Weblog: http://shrogers.com/weblog "He who refuses to do arithmetic is doomed to talk nonsense." -- John McCarthy From ryanlists at gmail.com Fri Oct 21 16:32:10 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Fri, 21 Oct 2005 16:32:10 -0400 Subject: [SciPy-user] root locus Message-ID: Does anyone out there have a root locus function written for scipy? 
I didn't see one in the signals package and didn't come up with anything in Google. I could use one right now and don't really want to write it myself if it has already been done. Are there other people working in feedback control out there who would want to collaborate on a controls package for scipy? Ryan From oliphant at ee.byu.edu Fri Oct 21 19:18:23 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 17:18:23 -0600 Subject: [SciPy-user] Docstring searching capability in scipy In-Reply-To: <43576458.7040608@ucsd.edu> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> Message-ID: <4359773F.6090301@ee.byu.edu> Robert Kern wrote: >We have fairly good docstring coverage in scipy at least. We need to >build some searching capability, though. > > There is a little bit of searching in scipy's info info('minimize') will look through the docstrings and report functions containing that word. In new scipy info is in *scipy.utils.helpmod* until we figure out what to do with it. If you are running new scipy >>> from scipy.utils.helpmod import info or >>> from scipy import info if you are running old scipy. info was written just before help was available from Python, but it still returns more docstring information than help does. compare help(scipy.sin) info(scipy.sin) -Travis From Fernando.Perez at colorado.edu Fri Oct 21 19:34:19 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 21 Oct 2005 17:34:19 -0600 Subject: [SciPy-user] Docstring searching capability in scipy In-Reply-To: <4359773F.6090301@ee.byu.edu> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> <4359773F.6090301@ee.byu.edu> Message-ID: <43597AFB.7070400@colorado.edu> Travis Oliphant wrote: > Robert Kern wrote: > > >>We have fairly good docstring coverage in scipy at least. We need to >>build some searching capability, though. >> >> > > There is a little bit of searching in scipy's info > > info('minimize') will look through the docstrings and report functions > containing that word. > > In new scipy info is in *scipy.utils.helpmod* until we figure out what > to do with it. In the spirit of scipy for numerics, matplotlib for plotting and ipython for interactive work, perhaps we should copy this over to ipython. Opinions? f From rkern at ucsd.edu Fri Oct 21 23:24:44 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 21 Oct 2005 20:24:44 -0700 Subject: [SciPy-user] Docstring searching capability in scipy In-Reply-To: <43597AFB.7070400@colorado.edu> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> <4359773F.6090301@ee.byu.edu> <43597AFB.7070400@colorado.edu> Message-ID: <4359B0FC.4000608@ucsd.edu> Fernando Perez wrote: > Travis Oliphant wrote: > >>Robert Kern wrote: >> >>>We have fairly good docstring coverage in scipy at least. We need to >>>build some searching capability, though. >> >>There is a little bit of searching in scipy's info >> >>info('minimize') will look through the docstrings and report functions >>containing that word. >> >>In new scipy info is in *scipy.utils.helpmod* until we figure out what >>to do with it. > > In the spirit of scipy for numerics, matplotlib for plotting and ipython for > interactive work, perhaps we should copy this over to ipython. Opinions? Yes, please! Among other things, that would allow it to scan scipy and other packages for docstrings and store the snippets of documentation (doclets?) somewhere in ~/.ipython/ . 
Then we can search without importing anything or scanning every time. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From ryanlists at gmail.com Sat Oct 22 00:37:30 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Sat, 22 Oct 2005 00:37:30 -0400 Subject: [SciPy-user] Docstring searching capability in scipy In-Reply-To: <4359B0FC.4000608@ucsd.edu> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> <4359773F.6090301@ee.byu.edu> <43597AFB.7070400@colorado.edu> <4359B0FC.4000608@ucsd.edu> Message-ID: I like the three way division of responsibilities Fernando mentioned and would certainly appreciate more search capability in ipython. On 10/21/05, Robert Kern wrote: > Fernando Perez wrote: > > Travis Oliphant wrote: > > > >>Robert Kern wrote: > >> > >>>We have fairly good docstring coverage in scipy at least. We need to > >>>build some searching capability, though. > >> > >>There is a little bit of searching in scipy's info > >> > >>info('minimize') will look through the docstrings and report functions > >>containing that word. > >> > >>In new scipy info is in *scipy.utils.helpmod* until we figure out what > >>to do with it. > > > > In the spirit of scipy for numerics, matplotlib for plotting and ipython for > > interactive work, perhaps we should copy this over to ipython. Opinions? > > Yes, please! Among other things, that would allow it to scan scipy and > other packages for docstrings and store the snippets of documentation > (doclets?) somewhere in ~/.ipython/ . Then we can search without > importing anything or scanning every time. > > -- > Robert Kern > rkern at ucsd.edu > > "In the fields of hell where the grass grows high > Are the graves of dreams allowed to die." > -- Richard Harter > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From a.h.jaffe at gmail.com Sat Oct 22 07:39:29 2005 From: a.h.jaffe at gmail.com (Andrew Jaffe) Date: Sat, 22 Oct 2005 12:39:29 +0100 Subject: [SciPy-user] new scipy and __future__ division? Message-ID: Hi All- I regularly use 'from __future__ import division' in my code, since I can never be sure that my floats don't come to me looking like integers. However, scipy seems to have a problem with this: when I try to divide a float64 scipy array by a float, I get errors like TypeError: unsupported operand type(s) for /: 'scipy.ndarray' and 'float' I don't know if this is a newly-introduced bug (it is a bug, right?), since this is occuring in code that I've ported from numarray (where this behavior did not occur), not Numeric. Thanks, Andrew From a.h.jaffe at gmail.com Sat Oct 22 12:26:51 2005 From: a.h.jaffe at gmail.com (Andrew Jaffe) Date: Sat, 22 Oct 2005 17:26:51 +0100 Subject: [SciPy-user] Masked arrays, scipy and matplotlib Message-ID: <435A684B.2060403@gmail.com> Hi (again) All, Recently I'd been plotting masked arrays with matplotlib numarray's masking features. I am attempting to switch over to scipy_core, and running into some trouble when trying to do pylab.pcolor(x, y, data), where data is a masked array. I set matplotlib's "Numerix" variable to Numeric and it seems to correctly use the scipy routines, but it balks at my masked arrays: MAError: Cannot automatically convert masked array to Numeric because data is masked in one or more locations. (This worked fine with numarray masked arrays.) Any ideas? 
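For readers who have not met the __future__ import Andrew mentions: in Python 2, / between two integers floors the result, and the module-level import switches it to true division, which is exactly what code that cannot guarantee float inputs relies on. A small sketch (under Python 3 this behaviour is already the default, so the import becomes a no-op):

from __future__ import division

print(1 / 2)     # 0.5 with the import; plain Python 2 would print 0
print(1 // 2)    # 0   -- floor division is still available explicitly
print(7.0 / 2)   # 3.5 -- float division is unaffected either way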
Andrew From oliphant at ee.byu.edu Sat Oct 22 13:41:28 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 22 Oct 2005 11:41:28 -0600 Subject: [SciPy-user] Masked arrays, scipy and matplotlib In-Reply-To: <435A684B.2060403@gmail.com> References: <435A684B.2060403@gmail.com> Message-ID: <435A79C8.4010503@ee.byu.edu> Andrew Jaffe wrote: >Hi (again) All, > >Recently I'd been plotting masked arrays with matplotlib numarray's >masking features. I am attempting to switch over to scipy_core, and >running into some trouble when trying to do pylab.pcolor(x, y, data), >where data is a masked array. I set matplotlib's "Numerix" variable to >Numeric and it seems to correctly use the scipy routines, but it balks >at my masked arrays: > >MAError: Cannot automatically convert masked array to Numeric because >data is masked in one or more locations. > >(This worked fine with numarray masked arrays.) > >Any ideas? > > Masked arrays are being worked on right now by Paul Dubois. They are not ready for prime time quite yet. -Travis From oliphant at ee.byu.edu Sat Oct 22 13:48:18 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 22 Oct 2005 11:48:18 -0600 Subject: [SciPy-user] new scipy and __future__ division? In-Reply-To: References: Message-ID: <435A7B62.2090002@ee.byu.edu> Andrew Jaffe wrote: >Hi All- > >I regularly use 'from __future__ import division' in my code, since I >can never be sure that my floats don't come to me looking like >integers. > >However, scipy seems to have a problem with this: when I try to divide >a float64 scipy array by a float, I get errors like > >TypeError: unsupported operand type(s) for /: 'scipy.ndarray' and 'float' > >I don't know if this is a newly-introduced bug (it is a bug, right?), >since this is occuring in code that I've ported from numarray (where > > Yes, this is a bug. Could you please post a small snippet of code that produces the problem on your computer. My simple tests don't reproduce it. Please print scipy.__core_version__ for comparison purposes as well. Thanks, -Travis From a.h.jaffe at gmail.com Sat Oct 22 14:18:53 2005 From: a.h.jaffe at gmail.com (Andrew Jaffe) Date: Sat, 22 Oct 2005 19:18:53 +0100 Subject: [SciPy-user] new scipy and __future__ division? In-Reply-To: <435A7B62.2090002@ee.byu.edu> References: <435A7B62.2090002@ee.byu.edu> Message-ID: <435A828D.4080809@gmail.com> Travis Oliphant wrote: > Andrew Jaffe wrote: > >>Hi All- >> >>I regularly use 'from __future__ import division' in my code, since I >>can never be sure that my floats don't come to me looking like >>integers. >> >>However, scipy seems to have a problem with this: when I try to divide >>a float64 scipy array by a float, I get errors like >> >>TypeError: unsupported operand type(s) for /: 'scipy.ndarray' and 'float' >> >>I don't know if this is a newly-introduced bug (it is a bug, right? > > Yes, this is a bug. > > Could you please post a small snippet of code that produces the problem > on your computer. My simple tests don't reproduce it. > > Please print > > scipy.__core_version__ > > for comparison purposes as well. The attached code produces the bug following, on both Mac OSX 10.4 and RHEL Linux, each running Python 2.4.1. Note that I haven't tried to simplify the offending set_Cinv routine any further. snowleopard:~...stats/MCMC% python ~/div_err.py SciPy Core version= 0.4.2.1252 Traceback (most recent call last): File "/Users/jaffe/div_err.py", line 25, in ? 
a.set_Cinv() File "/Users/jaffe/div_err.py", line 18, in set_Cinv dtype=float64)/det TypeError: unsupported operand type(s) for /: 'scipy.ndarray' and 'float' -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: div_err.py URL: From stephen.walton at csun.edu Sat Oct 22 18:56:20 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Sat, 22 Oct 2005 15:56:20 -0700 Subject: [SciPy-user] Numeric 24, matplotlib, etc. (was: Re: [SciPy-dev] TypeError: __array_data__ must return a string providing the pointer to data) In-Reply-To: <43597048.5080509@ee.byu.edu> References: <43589507.4050003@mecha.uni-stuttgart.de> <435907B4.5070506@csun.edu> <43591CB0.8040302@ee.byu.edu> <43593D35.6050005@ee.byu.edu> <43595D02.90709@csun.edu> <43597048.5080509@ee.byu.edu> Message-ID: <435AC394.10803@csun.edu> Travis Oliphant wrote: >Installing Numeric 24.0 and scipy core allows matplotlib to work with >scipy core arrays (they get quickly converted to Numeric 24.0 arrays >behind the scenes using the array interface). > Is 24.0 still around for download somewhere? I used an old archive of 24.0b2 I had lying around, but didn't see Numeric 24 on Sourceforge, either as a download or from CVS. It gets worse, because it turns out that the Ubuntu numeric packagers have broken it up into python-numeric, python2.4-numeric, and python2.4-numeric-ext. John Hunter, are you out there? I know you use Ubuntu; are you just using old scipy until you get a chance to port matplotlib to new core? Followups redirected to scipy-user, since IMHO this is no longer a scipy-dev discussion. From rkern at ucsd.edu Sat Oct 22 20:57:50 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sat, 22 Oct 2005 17:57:50 -0700 Subject: [SciPy-user] Numeric 24, matplotlib, etc. In-Reply-To: <435AC394.10803@csun.edu> References: <43589507.4050003@mecha.uni-stuttgart.de> <435907B4.5070506@csun.edu> <43591CB0.8040302@ee.byu.edu> <43593D35.6050005@ee.byu.edu> <43595D02.90709@csun.edu> <43597048.5080509@ee.byu.edu> <435AC394.10803@csun.edu> Message-ID: <435AE00E.4030008@ucsd.edu> Stephen Walton wrote: > Travis Oliphant wrote: > >>Installing Numeric 24.0 and scipy core allows matplotlib to work with >>scipy core arrays (they get quickly converted to Numeric 24.0 arrays >>behind the scenes using the array interface). > > Is 24.0 still around for download somewhere? I used an old archive of > 24.0b2 I had lying around, but didn't see Numeric 24 on Sourceforge, > either as a download or from CVS. The CVS is where it always was, but it looks like Travis made a 24.0 final release. http://sourceforge.net/project/showfiles.php?group_id=1369&package_id=1351&release_id=365163 > It gets worse, because it turns out that the Ubuntu numeric packagers > have broken it up into python-numeric, python2.4-numeric, and > python2.4-numeric-ext. John Hunter, are you out there? I know you use > Ubuntu; are you just using old scipy until you get a chance to port > matplotlib to new core? apt-get the source for those packages. python-numeric is an empty package that depends on the package for the appropriate Python version. python2.4-numeric and python2.4-numeric-ext should both come from the same source package. python2.4-numeric-ext contains LinearAlgebra et al. If you install Numeric 24.0 yourself, then you can use the equivs package to make dummy packages to satisfy dependencies if you like. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
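Back in the __future__ division thread above, Travis asked for a self-contained snippet. Something along the following lines, a hypothetical stand-in rather than the scrubbed div_err.py attachment, exercises the reported failure mode: on a build with the bug the division raises the TypeError quoted above, and on a fixed build it simply prints an array of halves. It is written with numpy names here; the scipy_core build under discussion spells array and float64 the same way.

    from __future__ import division
    from numpy import array, float64

    a = array([1.0, 2.0, 3.0], dtype=float64)
    det = 2.0
    print(a / det)   # TypeError on the affected build, [0.5 1.  1.5] otherwise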
-- Richard Harter From williams at thphys.ox.ac.uk Sun Oct 23 09:02:22 2005 From: williams at thphys.ox.ac.uk (Michael Williams) Date: Sun, 23 Oct 2005 14:02:22 +0100 Subject: [SciPy-user] SciPy on OS X installation problem: ImportError: No module named scipy_distutils Message-ID: <5834C85E-61D2-4001-ADB1-5CB6BF9BAFB8@thphys.ox.ac.uk> Hi, I'd like to install SciPy on OS X. I'm following a combination of the instructions on these two pages: http://www.physics.ucf.edu/~mdj/MacpythonScipyGnuplot.html http://www.scipy.org/documentation/Members/fonnesbeck/osx_build.txt I'm running OS X 10.4 and have previously upgraded from Tiger's stock Python 2.3.5 to Python 2.4.1 using Bob Ippolito's MacPython installer: http://bob.pythonmac.org/archives/2005/03/31/macpython-241- installer/ Most of the steps described on the two guides work as advertised. However, I had a warning while building FFTW (described below), and a show-stopping failure in SciPy's setup.py build. I checked SciPy out from SVN around twelve hours ago. When building I get the following error: > ~/Desktop/scipy mike$ python setup.py build > > Traceback (most recent call last): > File "setup.py", line 47, in ? > import scipy_distutils > ImportError: No module named scipy_distutils From Googling, I understand the scipy_distutils module comes with F2PY. I've double checked my installation of this package, and I'm sure it's there (it's in /Library/Frameworks/Python.framework/ Versions/2.4/lib/python2.4/site-packages/f2py2e, as expected). The version of F2PY I've installed is 2.45.241_1926. Does anyone have any suggestions? Should I try an earlier version of SciPy? By the way, as mentioned above, when I ran ./configure on FFTW I got the following warning: checking for Fortran 77 libraries... -lcrt1.o -lcrt2.o -L/usr/local/ lib/gcc/powerpc-apple-darwin7.9.0/3.4.4 -L/usr/local/lib/gcc/powerpc- apple-darwin7.9.0/3.4.4/../../.. -lfrtbegin -lg2c -lgcc_s -lSystem checking for dummy main to link with Fortran 77 libraries... unknown configure: WARNING: *** Couldn't figure out how to link C and Fortran; switching to --disable-fortran. FFTW went on to compile and install without further problems. I don't think this is related to SciPy's failure to build, but I describe it on the off chance that it's related, or anyone knows how to fix it, or whether it even needs fixing. Thanks for reading this far, -- Mike Williams Theoretical Physics, University of Oxford From rkern at ucsd.edu Sun Oct 23 09:12:21 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sun, 23 Oct 2005 06:12:21 -0700 Subject: [SciPy-user] SciPy on OS X installation problem: ImportError: No module named scipy_distutils In-Reply-To: <5834C85E-61D2-4001-ADB1-5CB6BF9BAFB8@thphys.ox.ac.uk> References: <5834C85E-61D2-4001-ADB1-5CB6BF9BAFB8@thphys.ox.ac.uk> Message-ID: <435B8C35.8000406@ucsd.edu> Michael Williams wrote: > Hi, > > I'd like to install SciPy on OS X. I'm following a combination of the > instructions on these two pages: > > http://www.physics.ucf.edu/~mdj/MacpythonScipyGnuplot.html > http://www.scipy.org/documentation/Members/fonnesbeck/osx_build.txt > > I'm running OS X 10.4 and have previously upgraded from Tiger's stock > Python 2.3.5 to Python 2.4.1 using Bob Ippolito's MacPython installer: > > http://bob.pythonmac.org/archives/2005/03/31/macpython-241- > installer/ > > Most of the steps described on the two guides work as advertised. > However, I had a warning while building FFTW (described below), and a > show-stopping failure in SciPy's setup.py build. 
> > I checked SciPy out from SVN around twelve hours ago. When building I > get the following error: > >>~/Desktop/scipy mike$ python setup.py build >> >>Traceback (most recent call last): >> File "setup.py", line 47, in ? >> import scipy_distutils >>ImportError: No module named scipy_distutils Install the trunk of scipy_core first. http://svn.scipy.org/svn/scipy_core/trunk/ Note, that this is not the scipy_core that we are currently developing as the replacement for Numeric. That development is taking place on a branch. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From williams at thphys.ox.ac.uk Sun Oct 23 10:16:36 2005 From: williams at thphys.ox.ac.uk (Michael Williams) Date: Sun, 23 Oct 2005 15:16:36 +0100 Subject: [SciPy-user] SciPy on OS X installation problem: ImportError: No module named scipy_distutils In-Reply-To: <435B8C35.8000406@ucsd.edu> References: <5834C85E-61D2-4001-ADB1-5CB6BF9BAFB8@thphys.ox.ac.uk> <435B8C35.8000406@ucsd.edu> Message-ID: <67E251D5-4A9D-4182-A71E-4C82E535C9DC@thphys.ox.ac.uk> Hi Robert, On 23 Oct 2005, at 14:12, Robert Kern wrote: >> I checked SciPy out from SVN around twelve hours ago. When building I >> get the following error: >> >>> ~/Desktop/scipy mike$ python setup.py build >>> >>> Traceback (most recent call last): >>> File "setup.py", line 47, in ? >>> import scipy_distutils >>> ImportError: No module named scipy_distutils >>> > > Install the trunk of scipy_core first. > > http://svn.scipy.org/svn/scipy_core/trunk/ > > Note, that this is not the scipy_core that we are currently developing > as the replacement for Numeric. That development is taking place on a > branch. Thanks. I get a lot further along now. "python setup.py build && sudo python setup.py install" worked fine on scipy_core. I also get what seems like quite a long way through the build process on scipy, before it falls over with the following error: /usr/bin/ld: can't locate file for: -lcc_dynamic collect2: ld returned 1 exit status /usr/bin/ld: can't locate file for: -lcc_dynamic collect2: ld returned 1 exit status error: Command "/usr/local/bin/g77 -undefined dynamic_lookup -bundle build/temp.darwin-8.2.0-Power_Macintosh-2.4/build/src/Lib/fftpack/ _fftpackmodule.o build/temp.darwin-8.2.0-Power_Macintosh-2.4/Lib/ fftpack/src/zfft.o build/temp.darwin-8.2.0-Power_Macintosh-2.4/Lib/ fftpack/src/drfft.o build/temp.darwin-8.2.0-Power_Macintosh-2.4/Lib/ fftpack/src/zrfft.o build/temp.darwin-8.2.0-Power_Macintosh-2.4/Lib/ fftpack/src/zfftnd.o build/temp.darwin-8.2.0-Power_Macintosh-2.4/ build/src/fortranobject.o -L/usr/local/lib -L/usr/local/lib/gcc/ powerpc-apple-darwin7.9.0/3.4.4 -Lbuild/temp.darwin-8.2.0- Power_Macintosh-2.4 -ldfftpack -lrfftw -lfftw -lg2c -lcc_dynamic -o build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/fftpack/_fftpack.so" failed with exit status 1 Is this is related to the g77 ranlib problem described on the High Performance Computing on OS X page? This says: http://hpc.sourceforge.net/ > Note: You may need to ranlib some libs after you install. The > compiler will tell you which ones when you try to use it. In that > case, simply do a sudo ranlib -s on each such library. On Tiger you > may need to add the flags -lSystemStub -lcc_dynamic for the last > linking step to be successful. I have no experience with Fortran or C beyond "Hello, World!", so am left at a loss by this sentence. 
Assuming it is the cause of my new problem building SciPy, could someone perhaps explain it to me? I am conscious of the fact that I may now be wandering off topic for this list, so if I should take things elsewhere (e.g. pythonmac-sig or a Fortran group), please don't hesitate to tell me to go away! Thanks again, -- Mike Williams From rkern at ucsd.edu Sun Oct 23 11:12:08 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sun, 23 Oct 2005 08:12:08 -0700 Subject: [SciPy-user] SciPy on OS X installation problem: ImportError: No module named scipy_distutils In-Reply-To: <67E251D5-4A9D-4182-A71E-4C82E535C9DC@thphys.ox.ac.uk> References: <5834C85E-61D2-4001-ADB1-5CB6BF9BAFB8@thphys.ox.ac.uk> <435B8C35.8000406@ucsd.edu> <67E251D5-4A9D-4182-A71E-4C82E535C9DC@thphys.ox.ac.uk> Message-ID: <435BA848.50306@ucsd.edu> Michael Williams wrote: > Hi Robert, > > On 23 Oct 2005, at 14:12, Robert Kern wrote: > >>>I checked SciPy out from SVN around twelve hours ago. When building I >>>get the following error: >>> >>> >>>>~/Desktop/scipy mike$ python setup.py build >>>> >>>>Traceback (most recent call last): >>>> File "setup.py", line 47, in ? >>>> import scipy_distutils >>>>ImportError: No module named scipy_distutils >>>> >> >>Install the trunk of scipy_core first. >> >> http://svn.scipy.org/svn/scipy_core/trunk/ >> >>Note, that this is not the scipy_core that we are currently developing >>as the replacement for Numeric. That development is taking place on a >>branch. > > Thanks. I get a lot further along now. "python setup.py build && sudo > python setup.py install" worked fine on scipy_core. > > I also get what seems like quite a long way through the build process > on scipy, before it falls over with the following error: > > /usr/bin/ld: can't locate file for: -lcc_dynamic > collect2: ld returned 1 exit status > /usr/bin/ld: can't locate file for: -lcc_dynamic > collect2: ld returned 1 exit status > error: Command "/usr/local/bin/g77 -undefined dynamic_lookup -bundle > build/temp.darwin-8.2.0-Power_Macintosh-2.4/build/src/Lib/fftpack/ > _fftpackmodule.o build/temp.darwin-8.2.0-Power_Macintosh-2.4/Lib/ > fftpack/src/zfft.o build/temp.darwin-8.2.0-Power_Macintosh-2.4/Lib/ > fftpack/src/drfft.o build/temp.darwin-8.2.0-Power_Macintosh-2.4/Lib/ > fftpack/src/zrfft.o build/temp.darwin-8.2.0-Power_Macintosh-2.4/Lib/ > fftpack/src/zfftnd.o build/temp.darwin-8.2.0-Power_Macintosh-2.4/ > build/src/fortranobject.o -L/usr/local/lib -L/usr/local/lib/gcc/ > powerpc-apple-darwin7.9.0/3.4.4 -Lbuild/temp.darwin-8.2.0- > Power_Macintosh-2.4 -ldfftpack -lrfftw -lfftw -lg2c -lcc_dynamic -o > build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/fftpack/_fftpack.so" > failed with exit status 1 > > Is this is related to the g77 ranlib problem described on the High > Performance Computing on OS X page? This says: > > http://hpc.sourceforge.net/ > >>Note: You may need to ranlib some libs after you install. The >>compiler will tell you which ones when you try to use it. In that >>case, simply do a sudo ranlib -s on each such library. On Tiger you >>may need to add the flags -lSystemStub -lcc_dynamic for the last >>linking step to be successful. > > I have no experience with Fortran or C beyond "Hello, World!", so am > left at a loss by this sentence. That sentence is unrelated to the rest of the paragraph. The GNU FORTRAN runtime needs some functions which aren't exposed by what g77 normally links in for dynamic libraries, so you need to manually add -lcc_dynamic. 
> Assuming it is the cause of my new > problem building SciPy, could someone perhaps explain it to me? Are you using gcc-4.0? For a variety of reasons, you should use gcc-3.3 instead. $ sudo gcc_select 3.3 You should then have a file /usr/lib/libcc_dynamic.a . -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From williams at thphys.ox.ac.uk Sun Oct 23 12:29:15 2005 From: williams at thphys.ox.ac.uk (Michael Williams) Date: Sun, 23 Oct 2005 17:29:15 +0100 Subject: [SciPy-user] SciPy on OS X installation problem: ImportError: No module named scipy_distutils In-Reply-To: <435BA848.50306@ucsd.edu> References: <5834C85E-61D2-4001-ADB1-5CB6BF9BAFB8@thphys.ox.ac.uk> <435B8C35.8000406@ucsd.edu> <67E251D5-4A9D-4182-A71E-4C82E535C9DC@thphys.ox.ac.uk> <435BA848.50306@ucsd.edu> Message-ID: <65339A5C-8297-4A0B-A184-DDBFD534A74A@thphys.ox.ac.uk> On 23 Oct 2005, at 16:12, Robert Kern wrote: >> http://hpc.sourceforge.net/ >>> Note: You may need to ranlib some libs after you install. The >>> compiler will tell you which ones when you try to use it. In that >>> case, simply do a sudo ranlib -s on each such library. On Tiger you >>> may need to add the flags -lSystemStub -lcc_dynamic for the last >>> linking step to be successful. >>> >> >> I have no experience with Fortran or C beyond "Hello, World!", so am >> left at a loss by this sentence. > > That sentence is unrelated to the rest of the paragraph. The GNU > FORTRAN > runtime needs some functions which aren't exposed by what g77 normally > links in for dynamic libraries, so you need to manually add - > lcc_dynamic. > >> Assuming it is the cause of my new >> problem building SciPy, could someone perhaps explain it to me? > > Are you using gcc-4.0? For a variety of reasons, you should use > gcc-3.3 > instead. > > $ sudo gcc_select 3.3 > > You should then have a file /usr/lib/libcc_dynamic.a . Thanks very much. I'm happy to remain blissfully ignorant as to why I should not be using gcc-4.0, and I have now successfully built and installed scipy. There were quite a few warnings during building, but I guess that goes with the territory. Running scipy.test() from Python also throws up a lot of errors, all of which are along the lines of: !! No test file 'test_lib.py' found for Are these anything to worry about? -- Mike From martigan at gmail.com Mon Oct 24 07:19:35 2005 From: martigan at gmail.com (Mauro Cherubini) Date: Mon, 24 Oct 2005 13:19:35 +0200 Subject: [SciPy-user] [newbie] array cloning with new size Message-ID: <1823AF3A-AC9E-4C38-A038-E63E4E7F7224@gmail.com> Dear All, sorry for the silly question, but I am struggling with the following problem: I have to clone a 2D array into another 2D array with an extra empty line. Then I want to copy the first line of the array into the empty last line of the array. I tried a couple of things to get it sorted but none of them worked. The last attempt is reported below (my matrix has three columns), where I used the resize() function, which throws back this error from scipy: ... 
File "/Users/mauro/Documents/workspace/tag-wai/ TagsNetworkMaker.py", line 52, in matrix_preparator new_matrix = resize(datamatrix, [lato +1, 3]) File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/scipy/base/oldnumeric.py", line 196, in resize total_size = up.multiply.reduce(new_shape) NameError: global name 'up' is not defined By the way, can anyone explain me how to slice and assign a submatrix from the original matrix? I tried this syntax with no luck: new_matrix[: lato, :] = datamatrix[:, :] Thanks a lot in advance for your hints Mauro MY LAST ATTEMPT: ---------------------------- # the following method should prepare the matrix for the processing. In fact, we need to have a poligon set of points # which is by definition closed so the first point should be added into the matrix as last point def matrix_preparator(self, datamatrix): # first we find the number of points in the matrix lato = datamatrix.shape[0] # if the matrix contains only one row then it exit if lato > 1: # now we create a new matrix with an extra line new_matrix = resize(datamatrix, [lato +1, 3]) print new_matrix # here we clone the first point to the last line new_matrix[lato + 1, :] = datamatrix[0, :] # now the matrix is ready for processing return new_matrix; else: return datamatrix; -- web: http://craft.epfl.ch -- blog: http://www.i-cherubini.it/mauro/blog/ From ckkart at hoc.net Mon Oct 24 08:00:07 2005 From: ckkart at hoc.net (Christian Kristukat) Date: Mon, 24 Oct 2005 14:00:07 +0200 Subject: [SciPy-user] [newbie] array cloning with new size In-Reply-To: <1823AF3A-AC9E-4C38-A038-E63E4E7F7224@gmail.com> References: <1823AF3A-AC9E-4C38-A038-E63E4E7F7224@gmail.com> Message-ID: <435CCCC7.4060903@hoc.net> Mauro Cherubini wrote: > Dear All, > > sorry for the silly question, but I am struggling with the following > problem: > I have to clone a 2D array into another 2D array with an extra empty > line. Then I want to copy the first line of the array into the empty > last line of the array. This should happen automatically when you resize the array along the rows (first dimension). The new lines are filled with the values from the beginning of the array. > I tried a couple of things to get it sorted but none of them worked. > The last attempt is reported below (my matrix has three columns), > where I used the resize() function, which throws back this error from > scipy: > ... > File "/Users/mauro/Documents/workspace/tag-wai/ > TagsNetworkMaker.py", line 52, in matrix_preparator > new_matrix = resize(datamatrix, [lato +1, 3]) > File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ > python2.3/site-packages/scipy/base/oldnumeric.py", line 196, in resize > total_size = up.multiply.reduce(new_shape) > NameError: global name 'up' is not defined I do not get that error. Does resize in general work as expected? Probably it's a problem with your numeric/scipy installation. > By the way, can anyone explain me how to slice and assign a submatrix > from the original matrix? I tried this syntax with no luck: > new_matrix[: lato, :] = datamatrix[:, :] > This is correct and should work. Check if both sides really have the same shape. > > MY LAST ATTEMPT: > ---------------------------- > > # the following method should prepare the matrix for the > processing. 
In fact, we need to have a poligon set of points > # which is by definition closed so the first point should be > added into the matrix as last point > def matrix_preparator(self, datamatrix): > # first we find the number of points in the matrix > lato = datamatrix.shape[0] > # if the matrix contains only one row then it exit > if lato > 1: > # now we create a new matrix with an extra line > new_matrix = resize(datamatrix, [lato +1, 3]) > print new_matrix > # here we clone the first point to the last line > new_matrix[lato + 1, :] = datamatrix[0, :] 'lato+1' is one too much. The last line's index is just 'lato'. > # now the matrix is ready for processing > return new_matrix; > else: > return datamatrix; > Christian From oliphant at ee.byu.edu Mon Oct 24 10:45:41 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 24 Oct 2005 08:45:41 -0600 Subject: [SciPy-user] [newbie] array cloning with new size In-Reply-To: <1823AF3A-AC9E-4C38-A038-E63E4E7F7224@gmail.com> References: <1823AF3A-AC9E-4C38-A038-E63E4E7F7224@gmail.com> Message-ID: <435CF395.6060604@ee.byu.edu> Mauro Cherubini wrote: >Dear All, > >sorry for the silly question, but I am struggling with the following >problem: >I have to clone a 2D array into another 2D array with an extra empty >line. Then I want to copy the first line of the array into the empty >last line of the array. > >I tried a couple of things to get it sorted but none of them worked. >The last attempt is reported below (my matrix has three columns), >where I used the resize() function, which throws back this error from >scipy: >... > File "/Users/mauro/Documents/workspace/tag-wai/ >TagsNetworkMaker.py", line 52, in matrix_preparator > new_matrix = resize(datamatrix, [lato +1, 3]) > File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ >python2.3/site-packages/scipy/base/oldnumeric.py", line 196, in resize > total_size = up.multiply.reduce(new_shape) >NameError: global name 'up' is not defined > > This is a typo bug in your version of new scipy. This has already been fixed in the latest version of SciPy. Which one are you using? You can fix it by relacing 'up' with 'um' >By the way, can anyone explain me how to slice and assign a submatrix >from the original matrix? I tried this syntax with no luck: >new_matrix[: lato, :] = datamatrix[:, :] > > This should work if the shapes are consistent (in fact you dont need to slice the right hand side: new_matrix[:, lato, :] = datamatrix should work). Please report any problem you find. But using the latest version of scipy core is advised. -Travis From zunzun at zunzun.com Mon Oct 24 15:31:38 2005 From: zunzun at zunzun.com (zunzun at zunzun.com) Date: Mon, 24 Oct 2005 15:31:38 -0400 Subject: [SciPy-user] Gentoo scipy install went OK for me Message-ID: <20051024193138.GA24007@localhost.members.linode.com> Just completed a Gentoo scipy install, everything went fine after I got ATLAS and lapack to compile. A couple of dependencies could be added, for example numeric and f2py. Other than that no problem. I was very pleased to see weave installed as part of the emerge. My compliments on a job well done. 
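As a footnote to the array-cloning thread above, the whole operation reduces to a few lines once resize() behaves (the NameError was the since-fixed 'up'/'um' typo in oldnumeric.py). A sketch, written against present-day numpy; Numeric and scipy_core spell resize() and slice assignment the same way, which is what the replies rely on.

    import numpy as np

    datamatrix = np.array([[0., 0., 1.],
                           [1., 0., 1.],
                           [1., 1., 1.]])
    lato = datamatrix.shape[0]

    # resize() repeats the data cyclically, so the one extra row is already
    # a copy of the first row and the polygon is closed with no further work
    closed = np.resize(datamatrix, (lato + 1, 3))

    # the explicit version; note the last valid row index is lato, not lato + 1
    closed2 = np.empty((lato + 1, 3))
    closed2[:lato, :] = datamatrix        # the slice assignment asked about
    closed2[lato, :] = datamatrix[0, :]   # clone the first point into the last row

    assert np.all(closed == closed2)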
James Phillips http://zunzun.com From Fernando.Perez at colorado.edu Mon Oct 24 16:48:37 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 24 Oct 2005 14:48:37 -0600 Subject: [SciPy-user] Docstring searching capability in scipy In-Reply-To: <4359B0FC.4000608@ucsd.edu> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> <4359773F.6090301@ee.byu.edu> <43597AFB.7070400@colorado.edu> <4359B0FC.4000608@ucsd.edu> Message-ID: <435D48A5.8080602@colorado.edu> Robert Kern wrote: > Yes, please! Among other things, that would allow it to scan scipy and > other packages for docstrings and store the snippets of documentation > (doclets?) somewhere in ~/.ipython/ . Then we can search without > importing anything or scanning every time. OK, added to the wiki. That's not much, but at least it will guilt-trip me into actually doing it. Cheers, f From mcantor at stanford.edu Tue Oct 25 02:29:54 2005 From: mcantor at stanford.edu (mike cantor) Date: Mon, 24 Oct 2005 23:29:54 -0700 Subject: [SciPy-user] plotting from within a wxPython app Message-ID: <6.0.1.1.2.20051024231812.01d7f0c0@mcantor.pobox.stanford.edu> Hi all, I am new to python and SciPy and so far I am having a great time with it! Here is my question: I am writing a GUI using wxPython to manipulate and run an ODE-based model. What is the best way to embed a plot in the application? That is, how do instantiate a plot window within a frame/window which I can still control from the application. I guess one possibility would be to save the plot to a png file and then open the file as an image in a frame but it seems like there should be a better way. It would be nice if I could open a plot window from within the app that retains its interactive properties. Of course I tried the dumb thing of just calling plot from within the "run" buttons event handler. The result is that the plot comes up in its own nice window but if I so much as click on it the whole application crashes. Any ideas? Thanks, -mike From rkern at ucsd.edu Tue Oct 25 03:16:21 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 25 Oct 2005 00:16:21 -0700 Subject: [SciPy-user] plotting from within a wxPython app In-Reply-To: <6.0.1.1.2.20051024231812.01d7f0c0@mcantor.pobox.stanford.edu> References: <6.0.1.1.2.20051024231812.01d7f0c0@mcantor.pobox.stanford.edu> Message-ID: <435DDBC5.2000002@ucsd.edu> mike cantor wrote: > Hi all, > > I am new to python and SciPy and so far I am having a great time with > it! Here is my question: > > I am writing a GUI using wxPython to manipulate and run an ODE-based > model. What is the best way to embed a plot in the application? That is, > how do instantiate a plot window within a frame/window which I can still > control from the application. I guess one possibility would be to save the > plot to a png file and then open the file as an image in a frame but it > seems like there should be a better way. It would be nice if I could open > a plot window from within the app that retains its interactive properties. The plotting capabilities in scipy are deprecated, and I'm not sure that they supported embedding in a wxPython GUI in the first place. I recommend that you take a look at matplotlib and Chaco, which do support embedding in wxPython GUIs and are actively developed. matplotlib might be easier to get going with because it has an established community with lots of people to answer your questions. 
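For a concrete picture of the embedding route recommended here, a minimal wxPython frame holding a matplotlib canvas looks roughly like the following. This is a sketch against a present-day wxPython and matplotlib pairing (FigureCanvasWxAgg and wx.App are assumptions about the installed versions); matplotlib's own embedding_in_wx examples are the better starting point for real applications.

    import wx
    from matplotlib.figure import Figure
    from matplotlib.backends.backend_wxagg import FigureCanvasWxAgg as FigureCanvas

    class PlotFrame(wx.Frame):
        def __init__(self):
            wx.Frame.__init__(self, None, -1, "model output")
            panel = wx.Panel(self)
            fig = Figure(figsize=(4, 3))
            ax = fig.add_subplot(111)
            ax.plot([0, 1, 2, 3], [0, 1, 4, 9])     # stand-in for the ODE solution
            canvas = FigureCanvas(panel, -1, fig)   # the plot lives inside the panel
            sizer = wx.BoxSizer(wx.VERTICAL)
            sizer.Add(canvas, 1, wx.EXPAND)
            panel.SetSizer(sizer)

    app = wx.App(False)
    PlotFrame().Show()
    app.MainLoop()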
http://matplotlib.sourceforge.net http://code.enthought.com/chaco/chaco.htm -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From ckkart at hoc.net Tue Oct 25 04:33:00 2005 From: ckkart at hoc.net (Christian Kristukat) Date: Tue, 25 Oct 2005 10:33:00 +0200 Subject: [SciPy-user] plotting from within a wxPython app In-Reply-To: <435DDBC5.2000002@ucsd.edu> References: <6.0.1.1.2.20051024231812.01d7f0c0@mcantor.pobox.stanford.edu> <435DDBC5.2000002@ucsd.edu> Message-ID: <435DEDBC.4000201@hoc.net> Robert Kern wrote: > mike cantor wrote: > >>Hi all, >> >>I am new to python and SciPy and so far I am having a great time with >>it! Here is my question: >> >> I am writing a GUI using wxPython to manipulate and run an ODE-based >>model. What is the best way to embed a plot in the application? That is, >>how do instantiate a plot window within a frame/window which I can still >>control from the application. I guess one possibility would be to save the >>plot to a png file and then open the file as an image in a frame but it >>seems like there should be a better way. It would be nice if I could open >>a plot window from within the app that retains its interactive properties. > > > The plotting capabilities in scipy are deprecated, and I'm not sure that > they supported embedding in a wxPython GUI in the first place. I > recommend that you take a look at matplotlib and Chaco, which do support > embedding in wxPython GUIs and are actively developed. matplotlib might > be easier to get going with because it has an established community with > lots of people to answer your questions. > > http://matplotlib.sourceforge.net > http://code.enthought.com/chaco/chaco.htm > You should consider the wxPython PyPlot module, too. It comes with some nice demonstration how to use it in an application. It is by far not as powerful as matplotlib but much faster. I use matplotlib for the final step, i.e. to produce printable output. Christian From noel.oboyle2 at mail.dcu.ie Tue Oct 25 04:40:18 2005 From: noel.oboyle2 at mail.dcu.ie (Noel O'Boyle) Date: Tue, 25 Oct 2005 09:40:18 +0100 Subject: [SciPy-user] plotting from within a wxPython app In-Reply-To: <435DEDBC.4000201@hoc.net> References: <6.0.1.1.2.20051024231812.01d7f0c0@mcantor.pobox.stanford.edu> <435DDBC5.2000002@ucsd.edu> <435DEDBC.4000201@hoc.net> Message-ID: <1130229618.7869.2.camel@sandwi.ch.cam.ac.uk> Check out pychem.sf.net for an example of a wxPython GUI using PyPlot. Noel On Tue, 2005-10-25 at 10:33 +0200, Christian Kristukat wrote: > Robert Kern wrote: > > mike cantor wrote: > > > >>Hi all, > >> > >>I am new to python and SciPy and so far I am having a great time with > >>it! Here is my question: > >> > >> I am writing a GUI using wxPython to manipulate and run an ODE-based > >>model. What is the best way to embed a plot in the application? That is, > >>how do instantiate a plot window within a frame/window which I can still > >>control from the application. I guess one possibility would be to save the > >>plot to a png file and then open the file as an image in a frame but it > >>seems like there should be a better way. It would be nice if I could open > >>a plot window from within the app that retains its interactive properties. > > > > > > The plotting capabilities in scipy are deprecated, and I'm not sure that > > they supported embedding in a wxPython GUI in the first place. 
I > > recommend that you take a look at matplotlib and Chaco, which do support > > embedding in wxPython GUIs and are actively developed. matplotlib might > > be easier to get going with because it has an established community with > > lots of people to answer your questions. > > > > http://matplotlib.sourceforge.net > > http://code.enthought.com/chaco/chaco.htm > > > > You should consider the wxPython PyPlot module, too. It comes with some nice > demonstration how to use it in an application. It is by far not as powerful as > matplotlib but much faster. I use matplotlib for the final step, i.e. to produce > printable output. > > Christian > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From pau.gargallo at gmail.com Tue Oct 25 08:45:53 2005 From: pau.gargallo at gmail.com (Pau Gargallo) Date: Tue, 25 Oct 2005 12:45:53 +0000 Subject: [SciPy-user] about weave performance evaluation Message-ID: <6ef8f3380510250545j1617fb0ekf6b48cc5b60c403f@mail.gmail.com> Hi all, I'm not completely sure but it seems to me that there is an error in the performance evaluation of pure python, scipy, weave, fortran and C++ implementations of the laplace equation solver in http://www.scipy.org/documentation/weave/weaveperformance.html the code used to solve the laplace equation in pure python is: for i in range(1, nx-1): for j in range(1, ny-1): u[i,j] = ((u[i-1, j] + u[i+1, j])*dy**2 + (u[i, j-1] + u[i, j+1])*dx**2)/(2.0*(dx**2 + dy**2)) my guess is that this implementation overwrites u while running the loop. In other terms, when computing the term u[i,j], the terms u[i-1,j] and u[i,j-1] are used, and these terms where already modified in previous iterations. The same error is done in the weave.inline, fortran and C++ implementations, so as they are all computing the same wrong algorithm this may not be a problem when evaluation performances. However the scipy (or numeric) and weave.blitz implementation is: u[1:-1, 1:-1] = ((u[0:-2, 1:-1] + u[2:, 1:-1])*dy2 + (u[1:-1,0:-2] + u[1:-1, 2:])*dx2)*dnr_inv the previous error is not in this implementation so the code is not computing the same algorithm, and comparisons in the execution times would make no sense. Is my guess true? should the performances be reevaluated? thank you, pau From grante at visi.com Tue Oct 25 11:17:22 2005 From: grante at visi.com (Grant Edwards) Date: Tue, 25 Oct 2005 15:17:22 +0000 (UTC) Subject: [SciPy-user] plotting from within a wxPython app References: <6.0.1.1.2.20051024231812.01d7f0c0@mcantor.pobox.stanford.edu> Message-ID: On 2005-10-25, mike cantor wrote: > I am writing a GUI using wxPython to manipulate and run an ODE-based > model. What is the best way to embed a plot in the application? This is just one guy's opinion, but I think the best way to embed a plot into an application is not to. After working on several wxPython based programs that do plotting, I've decied that I really prefer using Gnuplot to generate a separate plot window. I have the controls/data in the wxPython application's window and the plot in a second window. I find this to be much more flexible, and you gain all of the interactivity of Gnpulot windows. For example, using the mouse the user can rotate/zoom surface plots with the mouse, the user can find the cordinates of points on an X-Y plot. The user can hide the plot window while working on the data or settings. 
Once the plot does what the user wants, they can show/maximize the plot window and not have any screen space wasted by widgets. -- Grant Edwards grante Yow! They at collapsed... like nuns visi.com in the street... they had no teenappeal! From arnd.baecker at web.de Tue Oct 25 11:35:47 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Tue, 25 Oct 2005 17:35:47 +0200 (CEST) Subject: [SciPy-user] Docstring searching capability in scipy In-Reply-To: <435D48A5.8080602@colorado.edu> References: <6.2.3.4.0.20051020091110.03d156b8@127.0.0.1> <43576458.7040608@ucsd.edu> <4359773F.6090301@ee.byu.edu> <43597AFB.7070400@colorado.edu> <4359B0FC.4000608@ucsd.edu> <435D48A5.8080602@colorado.edu> Message-ID: On Mon, 24 Oct 2005, Fernando Perez wrote: > Robert Kern wrote: > > > Yes, please! Among other things, that would allow it to scan scipy and > > other packages for docstrings and store the snippets of documentation > > (doclets?) somewhere in ~/.ipython/ . Then we can search without > > importing anything or scanning every time. > > OK, added to the wiki. That's not much, but at least it will guilt-trip me > into actually doing it. One remark on storing docstrings: I am not sure if this is the optimal strategy because for big packages (like VTK, wxPython, and scipy itself) this would consume a lot of space. Creating a searchable index (like documancer http://documancer.sourceforge.net/ does), e.g. using http://pylucene.osafoundation.org - an alternative is http://divmod.org/projects/xapwrap (using http://www.xapian.org/) might be a more efficient approach, also in terms of search times. Documancer does provide a very nice graphical interface for searching documents from various sources, but it should be possible to separate the actual search and result generation from the graphical display (eg to provide a pure text interface, for use in ipython or a graphical interface, for use in the upcoming notebook for ipython). Actually, I think an indexed search would also be helpful for the whole of python (and one could open even more cans of worms like ReSt for python documentation/pysource/eggs support etc.etc. in this context...) Best, Arnd From jdc at uwo.ca Tue Oct 25 11:43:42 2005 From: jdc at uwo.ca (Dan Christensen) Date: Tue, 25 Oct 2005 11:43:42 -0400 Subject: [SciPy-user] about weave performance evaluation In-Reply-To: <6ef8f3380510250545j1617fb0ekf6b48cc5b60c403f@mail.gmail.com> References: <6ef8f3380510250545j1617fb0ekf6b48cc5b60c403f@mail.gmail.com> Message-ID: <87hdb5lhzl.fsf@uwo.ca> Pau Gargallo writes: > I'm not completely sure but it seems to me that there is an error in > the performance evaluation of pure python, scipy, weave, fortran and > C++ implementations of the laplace equation solver in > http://www.scipy.org/documentation/weave/weaveperformance.html > > the code used to solve the laplace equation in pure python is: > ... > > my guess is that this implementation overwrites u while running the > loop. In other terms, when computing the term u[i,j], the terms > u[i-1,j] and u[i,j-1] are used, and these terms where already modified > in previous iterations. > ... comparisons in the execution times would make no sense. This is discussed at the URL above. Here's what they say: The expression will use temporaries. Hence, during one iteration, the computed values at an already computed location will not be used during the iteration. 
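The distinction being drawn here is easier to see side by side. In the explicit double loop each cell is overwritten as the sweep proceeds, so later cells see freshly updated neighbours; the sliced expression evaluates its right-hand side into temporaries first, so every cell sees only old values. A sketch, not the benchmark script itself, with dx2 and dy2 standing for dx**2 and dy**2:

    import numpy as np

    def sweep_loops(u, dx2, dy2):
        # in-place update: values computed earlier in this same sweep are
        # reused immediately (Gauss-Seidel-like behaviour)
        nx, ny = u.shape
        for i in range(1, nx - 1):
            for j in range(1, ny - 1):
                u[i, j] = ((u[i - 1, j] + u[i + 1, j]) * dy2 +
                           (u[i, j - 1] + u[i, j + 1]) * dx2) / (2.0 * (dx2 + dy2))
        return u

    def sweep_slices(u, dx2, dy2):
        # the right-hand side is evaluated into temporaries before the
        # assignment happens, so only old values are used (Jacobi-like behaviour)
        u[1:-1, 1:-1] = ((u[:-2, 1:-1] + u[2:, 1:-1]) * dy2 +
                         (u[1:-1, :-2] + u[1:-1, 2:]) * dx2) / (2.0 * (dx2 + dy2))
        return u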
For instance, in the original for loop, once the value of u[1,1] is computed, the next value for u[1,2] will use the newly computed u[1,1] and not the old one. However, since the numeric expression uses temporaries internally, only the old value of u[1,1] will be used. This is not a serious issue in this case because it is known that even when this happens the algorithm will converge (but in twice as much time, which reduces the benefit by a factor of 2, which still leaves us with a 25 fold increase). Dan From Paul.Casteels at ua.ac.be Tue Oct 25 12:52:37 2005 From: Paul.Casteels at ua.ac.be (Paul Casteels) Date: Tue, 25 Oct 2005 18:52:37 +0200 Subject: [SciPy-user] plotting from within a wxPython app In-Reply-To: References: <6.0.1.1.2.20051024231812.01d7f0c0@mcantor.pobox.stanford.edu> Message-ID: <435E62D5.6000000@ua.ac.be> Grant Edwards schreef: > On 2005-10-25, mike cantor wrote: > > >> I am writing a GUI using wxPython to manipulate and run an ODE-based >>model. What is the best way to embed a plot in the application? > > > This is just one guy's opinion, but I think the best way to > embed a plot into an application is not to. After working on > several wxPython based programs that do plotting, I've decied > that I really prefer using Gnuplot to generate a separate plot > window. I have the controls/data in the wxPython application's > window and the plot in a second window. I find this to be much > more flexible, and you gain all of the interactivity of Gnpulot > windows. For example, using the mouse the user can rotate/zoom > surface plots with the mouse, the user can find the cordinates > of points on an X-Y plot. > > The user can hide the plot window while working on the data or > settings. Once the plot does what the user wants, they can > show/maximize the plot window and not have any screen space > wasted by widgets. > I once used DISLIN ( http://www.mps.mpg.de/dislin/, a very nice plotting library and free for Python users ) in a wxPython application and experienced similar problems. Putting the plotting code in a separate thread did the trick. wxPython seems quite picky with 'foreign' libraries : with a NI USB data acquisition library I had similar problems and solved them in the same way. From ryanlists at gmail.com Tue Oct 25 12:58:33 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 25 Oct 2005 12:58:33 -0400 Subject: [SciPy-user] plotting from within a wxPython app In-Reply-To: References: <6.0.1.1.2.20051024231812.01d7f0c0@mcantor.pobox.stanford.edu> Message-ID: There is an embedded example that comes with matplotlib that works quite nicely. I adapted it without much trouble to my purposes and was very happy with the result. Ryan On 10/25/05, Grant Edwards wrote: > On 2005-10-25, mike cantor wrote: > > > I am writing a GUI using wxPython to manipulate and run an ODE-based > > model. What is the best way to embed a plot in the application? > > This is just one guy's opinion, but I think the best way to > embed a plot into an application is not to. After working on > several wxPython based programs that do plotting, I've decied > that I really prefer using Gnuplot to generate a separate plot > window. I have the controls/data in the wxPython application's > window and the plot in a second window. I find this to be much > more flexible, and you gain all of the interactivity of Gnpulot > windows. For example, using the mouse the user can rotate/zoom > surface plots with the mouse, the user can find the cordinates > of points on an X-Y plot. 
> > The user can hide the plot window while working on the data or > settings. Once the plot does what the user wants, they can > show/maximize the plot window and not have any screen space > wasted by widgets. > > -- > Grant Edwards grante Yow! They > at collapsed... like nuns > visi.com in the street... they had > no teenappeal! > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From grante at visi.com Tue Oct 25 13:13:03 2005 From: grante at visi.com (Grant Edwards) Date: Tue, 25 Oct 2005 17:13:03 +0000 (UTC) Subject: [SciPy-user] plotting from within a wxPython app References: <6.0.1.1.2.20051024231812.01d7f0c0@mcantor.pobox.stanford.edu> <435E62D5.6000000@ua.ac.be> Message-ID: On 2005-10-25, Paul Casteels wrote: >> On 2005-10-25, mike cantor wrote: >> >>>I am writing a GUI using wxPython to manipulate and run an >>>ODE-based model. What is the best way to embed a plot in the >>>application? >> >> This is just one guy's opinion, but I think the best way to >> embed a plot into an application is not to. After working on >> several wxPython based programs that do plotting, I've decied >> that I really prefer using Gnuplot to generate a separate plot >> window. [...] >> The user can hide the plot window while working on the data or >> settings. Once the plot does what the user wants, they can >> show/maximize the plot window and not have any screen space >> wasted by widgets. > > I once used DISLIN ( http://www.mps.mpg.de/dislin/, a very > nice plotting library and free for Python users ) in a > wxPython application and experienced similar problems. Huh? What problems? > Putting the plotting code in a separate thread did the trick. > wxPython seems quite picky with 'foreign' libraries : with a > NI USB data acquisition library I had similar problems and > solved them in the same way. I wasn't aware I was having any problems with gnuplot. -- Grant Edwards grante Yow! Used staples are good at with SOY SAUCE! visi.com From pau.gargallo at gmail.com Tue Oct 25 13:40:01 2005 From: pau.gargallo at gmail.com (Pau Gargallo) Date: Tue, 25 Oct 2005 17:40:01 +0000 Subject: [SciPy-user] about weave performance evaluation In-Reply-To: <87hdb5lhzl.fsf@uwo.ca> References: <6ef8f3380510250545j1617fb0ekf6b48cc5b60c403f@mail.gmail.com> <87hdb5lhzl.fsf@uwo.ca> Message-ID: <6ef8f3380510251040x30210ebbyfa2f10d9a4017449@mail.gmail.com> > > This is discussed at the URL above. Here's what they say: > > The expression will use temporaries. Hence, during one iteration, the > computed values at an already computed location will not be used > during the iteration. For instance, in the original for loop, once > the value of u[1,1] is computed, the next value for u[1,2] will use > the newly computed u[1,1] and not the old one. However, since the > numeric expression uses temporaries internally, only the old value of > u[1,1] will be used. This is not a serious issue in this case because > it is known that even when this happens the algorithm will converge > (but in twice as much time, which reduces the benefit by a factor of > 2, which still leaves us with a 25 fold increase). > > Dan > thank you, and sorry I didn't read it carefully enough. i have then some [not important at all] questions (sorry again): as i understand from this text, the algorithm that the numeric code implements is intrinsically 2 times slower. Then, 1- i don't understand the 25 fold increase of the last sentence of the cited text. 
Shouldn't we conclude that the use of numeric speeds up by a factor 100? 2- in the final comparison the time used by numeric is 29.3s. If we want to compare the performance between the different implementations, should we divide this time by 2 ? sorry for these irrelevant questions, and thank you very much for the responses pau From kael.fischer at gmail.com Tue Oct 25 17:55:27 2005 From: kael.fischer at gmail.com (Kael Fischer) Date: Tue, 25 Oct 2005 14:55:27 -0700 Subject: [SciPy-user] leastsq not converging with tutorial example Message-ID: Hi: When running the leastsq example form the tutorial, the data are not fit and the starting parameters are returned. Looking a little deeper.... >>> leastsq(residuals, p0, args=(y_meas, x),full_output=True ) ([ 8. , 43.47826087, 1.04719755,], {'qtf': [ -1.69473087e+02, -4.72325042e+02, 2.13172580e+07,], 'nfev': 4, 'fjac': [[...blah...blah...blah...]], 'ipvt': [1,2,3,]}, 4, 'The cosine of the angle between func(x) and any column of the\n Jacobian is at most 0.000000 in absolute value') I get the same behavior if I change the starting parameters, or if fit other systems (like first order decay) OS: FreeBSD 5.4 Python: 2.3.4 scipy_version: 0.3.2 Any ideas about getting this to work? Kael -- Kael Fischer, Ph.D DeRisi Lab - Univ. Of California San Francisco 415-514-4320 From fishburn at MIT.EDU Wed Oct 26 02:34:52 2005 From: fishburn at MIT.EDU (Matt Fishburn) Date: Wed, 26 Oct 2005 02:34:52 -0400 Subject: [SciPy-user] leastsq not converging with tutorial example In-Reply-To: References: Message-ID: <435F238C.1040206@mit.edu> Kael Fischer wrote: > Hi: > > When running the leastsq example form the tutorial, the data are not > fit and the starting parameters are returned. Looking a little > deeper.... I've used the leastsq algorithm to fit to general degree polynomials and an erf function before - see code at: http://web.mit.edu/fishburn/www/021/funcFit.py I haven't tried a cosine fit, though - do you have more in-depth code, or did you just grab the code from the tutorial? I've had pretty good luck with leastsq. -Matt Fishburn From giovanni.samaey at cs.kuleuven.ac.be Wed Oct 26 06:35:39 2005 From: giovanni.samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Wed, 26 Oct 2005 12:35:39 +0200 Subject: [SciPy-user] newscipy: linalg.iterative? Message-ID: <435F5BFB.60205@cs.kuleuven.ac.be> Hi all, i have (successfully?) installed newcore and newscipy from svn. I get no compile or install errors, but I experience two problems: 1) can't run the tests: errors along the lines of !! FAILURE importing tests for I have reported this error already before, and also a couple of days ago Michael Williams asked about this on this list. It seems the type of question no one replies ... 2) scipy.linalg.iterative appears not to be there (it was in a previous version). has this been removed, permanently or temporarily, or did I make an install mistake of some kind (e.g. forget to check out something.) Giovanni Disclaimer: http://www.kuleuven.be/cwis/email_disclaimer.htm From meesters at uni-mainz.de Wed Oct 26 09:25:48 2005 From: meesters at uni-mainz.de (Christian Meesters) Date: Wed, 26 Oct 2005 15:25:48 +0200 Subject: [SciPy-user] Radians vs. degrees Message-ID: Hi, got the problem that my last version of scipy (there still 0.3.2) I need to do some trigonometry and don't know whether functions like sin or arctan use degrees or radians in advance (seems like degrees is the default, right?). How can I determine this and change it, if necessary? 
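A quick way to settle the "how can I determine this" part empirically is to evaluate the functions at values that are only special in one convention. The sketch uses the standard-library math module, which has shipped radians() and degrees() helpers since Python 2.3; the scipy and Numeric ufuncs follow the same radian convention.

    from math import sin, pi, radians, degrees

    print(sin(pi / 2))        # 1.0          -> the argument is in radians
    print(sin(90))            # 0.89399...   -> 90 is not treated as degrees
    print(sin(radians(90)))   # 1.0          -> convert explicitly when needed
    print(degrees(pi))        # 180.0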
TIA Christian From ryanlists at gmail.com Wed Oct 26 09:52:09 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Wed, 26 Oct 2005 09:52:09 -0400 Subject: [SciPy-user] Radians vs. degrees In-Reply-To: References: Message-ID: The default is going to be degrees. Try things like sin(90) and sin(pi/2). Ryan On 10/26/05, Christian Meesters wrote: > Hi, > > got the problem that my last version of scipy (there still 0.3.2) I > need to do some trigonometry and don't know whether functions like sin > or arctan use degrees or radians in advance (seems like degrees is the > default, right?). How can I determine this and change it, if necessary? > > TIA > Christian > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From ryanlists at gmail.com Wed Oct 26 09:52:23 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Wed, 26 Oct 2005 09:52:23 -0400 Subject: [SciPy-user] Radians vs. degrees In-Reply-To: References: Message-ID: Sorry, I meant radians is the default On 10/26/05, Ryan Krauss wrote: > The default is going to be degrees. Try things like sin(90) and sin(pi/2). > > Ryan > > On 10/26/05, Christian Meesters wrote: > > Hi, > > > > got the problem that my last version of scipy (there still 0.3.2) I > > need to do some trigonometry and don't know whether functions like sin > > or arctan use degrees or radians in advance (seems like degrees is the > > default, right?). How can I determine this and change it, if necessary? > > > > TIA > > Christian > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > > From meesters at uni-mainz.de Wed Oct 26 10:11:46 2005 From: meesters at uni-mainz.de (Christian Meesters) Date: Wed, 26 Oct 2005 16:11:46 +0200 Subject: [SciPy-user] Radians vs. degrees In-Reply-To: References: Message-ID: <09367602109862577880a51d640cb018@uni-mainz.de> Ok, I was confusing it, you too. Gues we are even now? ;-) But how can I change from one mode to the other? (It would be handy, if I could do that ...) Thanks, Christian On 26 Oct 2005, at 15:52, Ryan Krauss wrote: > Sorry, I meant radians is the default > > On 10/26/05, Ryan Krauss wrote: >> The default is going to be degrees. Try things like sin(90) and >> sin(pi/2). >> >> Ryan >> >> On 10/26/05, Christian Meesters wrote: >>> Hi, >>> >>> got the problem that my last version of scipy (there still 0.3.2) I >>> need to do some trigonometry and don't know whether functions like >>> sin >>> or arctan use degrees or radians in advance (seems like degrees is >>> the >>> default, right?). How can I determine this and change it, if >>> necessary? >>> >>> TIA >>> Christian >>> >>> _______________________________________________ >>> SciPy-user mailing list >>> SciPy-user at scipy.net >>> http://www.scipy.net/mailman/listinfo/scipy-user >>> >> > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From ryanlists at gmail.com Wed Oct 26 11:44:54 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Wed, 26 Oct 2005 11:44:54 -0400 Subject: [SciPy-user] Radians vs. degrees In-Reply-To: <09367602109862577880a51d640cb018@uni-mainz.de> References: <09367602109862577880a51d640cb018@uni-mainz.de> Message-ID: I don't know that you can. do sin(deg*pi/180.0) On 10/26/05, Christian Meesters wrote: > Ok, I was confusing it, you too. 
Gues we are even now? ;-) > But how can I change from one mode to the other? (It would be handy, if > I could do that ...) > > Thanks, > Christian > > On 26 Oct 2005, at 15:52, Ryan Krauss wrote: > > > Sorry, I meant radians is the default > > > > On 10/26/05, Ryan Krauss wrote: > >> The default is going to be degrees. Try things like sin(90) and > >> sin(pi/2). > >> > >> Ryan > >> > >> On 10/26/05, Christian Meesters wrote: > >>> Hi, > >>> > >>> got the problem that my last version of scipy (there still 0.3.2) I > >>> need to do some trigonometry and don't know whether functions like > >>> sin > >>> or arctan use degrees or radians in advance (seems like degrees is > >>> the > >>> default, right?). How can I determine this and change it, if > >>> necessary? > >>> > >>> TIA > >>> Christian > >>> > >>> _______________________________________________ > >>> SciPy-user mailing list > >>> SciPy-user at scipy.net > >>> http://www.scipy.net/mailman/listinfo/scipy-user > >>> > >> > > > > _______________________________________________ > > SciPy-user mailing list > > SciPy-user at scipy.net > > http://www.scipy.net/mailman/listinfo/scipy-user > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From jdc at uwo.ca Wed Oct 26 11:45:31 2005 From: jdc at uwo.ca (Dan Christensen) Date: Wed, 26 Oct 2005 11:45:31 -0400 Subject: [SciPy-user] about weave performance evaluation In-Reply-To: <6ef8f3380510251040x30210ebbyfa2f10d9a4017449@mail.gmail.com> References: <6ef8f3380510250545j1617fb0ekf6b48cc5b60c403f@mail.gmail.com> <87hdb5lhzl.fsf@uwo.ca> <6ef8f3380510251040x30210ebbyfa2f10d9a4017449@mail.gmail.com> Message-ID: <87u0f4b7tw.fsf@uwo.ca> Pau Gargallo writes: >> This is discussed at the URL above. Here's what they say: >> >> This is not a serious issue in this case because >> it is known that even when this happens the algorithm will converge >> (but in twice as much time, which reduces the benefit by a factor of >> 2, which still leaves us with a 25 fold increase). >> >> Dan > > i have then some [not important at all] questions (sorry again): > as i understand from this text, the algorithm that the numeric code > implements is intrinsically 2 times slower. Then, > > 1- i don't understand the 25 fold increase of the last sentence of the > cited text. Shouldn't we conclude that the use of numeric speeds up by > a factor 100? I think 25 is correct. When the numeric method gets err < eps, it will be farther from the correct solution than when the pure python method gets err < eps, since it is in effect taking smaller steps. To get as accurate an answer, you'd have to adjust eps to make the code run approximately twice as long. I'm not at all an expert on this, but am just trying to rephrase what is said at that URL. On the other hand, as a comparison between pure python and numeric, I think 50 is the right number. Many problems won't have this "factor of 2" issue, so with similar inner loops numeric should give about a 50-fold increase in speed. > 2- in the final comparison the time used by numeric is 29.3s. If we > want to compare the performance between the different implementations, > should we divide this time by 2 ? Multiply, I think. Dan From meesters at uni-mainz.de Wed Oct 26 12:22:56 2005 From: meesters at uni-mainz.de (Christian Meesters) Date: Wed, 26 Oct 2005 18:22:56 +0200 Subject: [SciPy-user] Radians vs. 
degrees In-Reply-To: References: <09367602109862577880a51d640cb018@uni-mainz.de> Message-ID: Thank you Ryan. Though an explicit conversion was not what I had in mind. My equations got way too long now and this is the reason why I would ask whether a simple mode change could be included into the new scipy-core? Travis, what's your opinion? (And forgive me if this is already included ;-) .) Cheers, Christian On 26 Oct 2005, at 17:44, Ryan Krauss wrote: > I don't know that you can. do sin(deg*pi/180.0) > > On 10/26/05, Christian Meesters wrote: >> Ok, I was confusing it, you too. Gues we are even now? ;-) >> But how can I change from one mode to the other? (It would be handy, >> if >> I could do that ...) >> >> Thanks, >> Christian >> >> On 26 Oct 2005, at 15:52, Ryan Krauss wrote: >> >>> Sorry, I meant radians is the default >>> >>> On 10/26/05, Ryan Krauss wrote: >>>> The default is going to be degrees. Try things like sin(90) and >>>> sin(pi/2). >>>> >>>> Ryan >>>> >>>> On 10/26/05, Christian Meesters wrote: >>>>> Hi, >>>>> >>>>> got the problem that my last version of scipy (there still 0.3.2) I >>>>> need to do some trigonometry and don't know whether functions like >>>>> sin >>>>> or arctan use degrees or radians in advance (seems like degrees is >>>>> the >>>>> default, right?). How can I determine this and change it, if >>>>> necessary? >>>>> >>>>> TIA >>>>> Christian >>>>> >>>>> _______________________________________________ >>>>> SciPy-user mailing list >>>>> SciPy-user at scipy.net >>>>> http://www.scipy.net/mailman/listinfo/scipy-user >>>>> >>>> >>> >>> _______________________________________________ >>> SciPy-user mailing list >>> SciPy-user at scipy.net >>> http://www.scipy.net/mailman/listinfo/scipy-user >>> >> >> _______________________________________________ >> SciPy-user mailing list >> SciPy-user at scipy.net >> http://www.scipy.net/mailman/listinfo/scipy-user >> > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From kael.fischer at gmail.com Wed Oct 26 12:42:32 2005 From: kael.fischer at gmail.com (Kael Fischer) Date: Wed, 26 Oct 2005 09:42:32 -0700 Subject: [SciPy-user] leastsq not converging with tutorial example In-Reply-To: <435F238C.1040206@mit.edu> References: <435F238C.1040206@mit.edu> Message-ID: Matt et al: Thanks for your reply. I'm still in the leastsq positive control stage, so for now lets just stick to stuff based on the tutorial and another example (http://linuxgazette.net/115/andreasen.html). Incidentally, my own application is simple 3-parameter first-order decay (i.e. A = (A0-B)*exp(-k/t) + B), and I'm sure if I can get the examples working, leastsq will be adequate. Although I am not sure how to report the quality of the fit, or how to estimate the uncertainty of the fit parameters. Here 2 scripts that I hope the mailer doesn't mangle. Using either (one is a sine wave; the other is a double exponential), I get back the initial parameters and the "The cosine of the angle between func(x) and any column of the\n Jacobian is at most 0.000000 in absolute value" status message. Anybody know what is going on. 
# Based on Scipy leastsq section of the tutorial
from scipy import *
from scipy.optimize import leastsq

x = arange(0,6e-2,6e-2/30)
A,k,theta = 10, 1.0/3e-2, pi/6
y_true = A*sin(2*pi*k*x+theta)
y_meas = y_true + 2*randn(len(x))

def residuals(p, y, x):
    A,k,theta = p
    err = y-A*sin(2*pi*k*x+theta)
    return err

def peval(x, p):
    return p[0]*sin(2*pi*p[1]*x+p[2])

p0 = [8, 1/2.3e-2, pi/3]
print array(p0)

plsq = leastsq(residuals, p0, args=(y_meas, x), full_output=True)
print plsq

==== or another example ====

# From kinfit.py see link above
# fits to a double exponential
#
# data file available at http://linuxgazette.net/115/misc/andreasen/tgdata.dat
#
from scipy import *
from scipy.optimize import leastsq
import scipy.io.array_import
from scipy import gplt

def residuals(p, y, x):
    err = y-peval(x,p)
    return err

def peval(x, p):
    return p[0]*(1-exp(-(p[2]*x)**p[4])) + p[1]*(1-exp(-(p[3]*(x))**p[5]))

filename='tgdata.dat'
data = scipy.io.array_import.read_array(filename)
y = data[:,1]
x = data[:,0]

A1_0=5
A2_0=3
k1_0=0.5
k2_0=0.04
n1_0=2
n2_0=1

pname = (['A1','A2','k1','k2','n1','n2'])
p0 = array([A1_0, A2_0, k1_0, k2_0, n1_0, n2_0])

plsq = leastsq(residuals, p0, args=(y, x), maxfev=2000, full_output=True)
print plsq

=== End Python Code ===

I just rebuilt Scipy by hand and made sure all the correct libraries are there... It has the same problem (initially I was using the FreeBSD port to build locally). That said - there were problems with cephes/protos.h. The following complex function declarations were defined elsewhere (?), so I commented them out:

//extern void cexp ( cmplx *z, cmplx *w );
//extern void csin ( cmplx *z, cmplx *w );
//extern void ccos ( cmplx *z, cmplx *w );
//extern void ctan ( cmplx *z, cmplx *w );
//extern void casin ( cmplx *z, cmplx *w );
//extern void cacos ( cmplx *z, cmplx *w );
//extern void catan ( cmplx *z, cmplx *w );
//extern void csqrt ( cmplx *z, cmplx *w );

Rgds, Kael

On 10/25/05, Matt Fishburn wrote:
> Kael Fischer wrote:
> > Hi:
> >
> > When running the leastsq example from the tutorial, the data are not
> > fit and the starting parameters are returned. Looking a little
> > deeper....
>
> I've used the leastsq algorithm to fit to general degree polynomials and
> an erf function before - see code at:
>
> http://web.mit.edu/fishburn/www/021/funcFit.py
>
> I haven't tried a cosine fit, though - do you have more in-depth code,
> or did you just grab the code from the tutorial? I've had pretty good
> luck with leastsq.
>
> -Matt Fishburn

-- Kael Fischer, Ph.D DeRisi Lab - Univ. Of California San Francisco 415-514-4320

From Fernando.Perez at colorado.edu Wed Oct 26 13:32:09 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 26 Oct 2005 11:32:09 -0600 Subject: [SciPy-user] Radians vs. degrees In-Reply-To: References: <09367602109862577880a51d640cb018@uni-mainz.de> Message-ID: <435FBD99.6030506@colorado.edu>

Christian Meesters wrote:
> Thank you Ryan. Though an explicit conversion was not what I had in
> mind. My equations got way too long now and this is the reason why I
> would ask whether a simple mode change could be included into the new
> scipy-core? Travis, what's your opinion? (And forgive me if this is
> already included ;-) .)

why can't you just write a couple of top-level utilities like this

r2d = pi/180.0
dsin = lambda deg: sin(deg*r2d)
dcos = lambda deg: cos(deg*r2d)

and use dsin/dcos explicitly when you need the degree versions?
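A slightly fuller, self-contained version of that sketch (the dtan and inverse wrappers below are just more of the same pattern -- illustrative names, not anything scipy itself provides):

from scipy import sin, cos, tan, arcsin, arccos, arctan, pi

r2d = pi/180.0                       # multiply a value in degrees by this to get radians
dsin = lambda deg: sin(deg*r2d)
dcos = lambda deg: cos(deg*r2d)
dtan = lambda deg: tan(deg*r2d)
darcsin = lambda x: arcsin(x)/r2d    # inverse functions divide to convert back to degrees
darccos = lambda x: arccos(x)/r2d
darctan = lambda x: arctan(x)/r2d

print dsin(90), dcos(180), darctan(1.0)   # -> 1.0 -1.0 45.0 (up to rounding)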
As far as _any_ mechanism to globally flip sin/cos/etc to defaulting silently to degrees, I'll be -1000 (and loudly) on that. There are a million mathematical reasons why the core trig functions should remain, always, in their proper units (radians - which aren't really a unit, btw). For something like this, 'explicit is better than implicit' is a good one to remember. Cheers, f From oliphant at ee.byu.edu Wed Oct 26 13:38:02 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 26 Oct 2005 11:38:02 -0600 Subject: [SciPy-user] newscipy: linalg.iterative? In-Reply-To: <435F5BFB.60205@cs.kuleuven.ac.be> References: <435F5BFB.60205@cs.kuleuven.ac.be> Message-ID: <435FBEFA.4010501@ee.byu.edu> Giovanni Samaey wrote: >Hi all, > >i have (successfully?) installed newcore and newscipy from svn. >I get no compile or install errors, but I experience two problems: > >1) can't run the tests: >errors along the lines of >!! FAILURE importing tests for 'scipy/xxx/foo.pyc'> > > Nobody responds to this error because it is not an error. It is just noise reminding us that there are no tests written yet for that module. The wording could be changed, possibly. >2) scipy.linalg.iterative appears not to be there (it was in a previous >version). >has this been removed, permanently or temporarily, or did I make an >install mistake of some kind (e.g. forget to check out something.) > > It's there on my system. What leads you to say it is not there? -Travis From meesters at uni-mainz.de Wed Oct 26 13:48:17 2005 From: meesters at uni-mainz.de (Christian Meesters) Date: Wed, 26 Oct 2005 19:48:17 +0200 Subject: [SciPy-user] Radians vs. degrees In-Reply-To: <435FBD99.6030506@colorado.edu> References: <09367602109862577880a51d640cb018@uni-mainz.de> <435FBD99.6030506@colorado.edu> Message-ID: On 26 Oct 2005, at 19:32, Fernando Perez wrote: > why can't you just write a couple of top-level utilities like this > > r2d = pi/180.0 > dsin = lambda deg: sin(deg*r2d) > dcos = lambda deg: cos(deg*r2d) That's what I'm busy with - except for another bug which keeps me busy now :( > > and use explicitly dsin/dcos when you need the degree versions? > > As far as _any_ mechanism to globally flip sin/cos/etc to defaulting > silently > to degrees, I'll be -1000 (and loudly) on that. There are a million > mathematical reasons why the core trig functions should remain, > always, in > their proper units (radians - which aren't really a unit, btw). Alright, I got the point and guess you are right. I feel convinced, think I overdid it a little. > > For something like this, 'explicit is better than implicit' is a good > one to > remember. Anyway, as long as it is explicitely mentioned in the docs ;-). Cheers, Christian From ckkart at hoc.net Wed Oct 26 16:54:12 2005 From: ckkart at hoc.net (Christian Kristukat) Date: Wed, 26 Oct 2005 22:54:12 +0200 (CEST) Subject: [SciPy-user] leastsq not converging with tutorial example Message-ID: Hi Kael, On Wed, 26 Oct 2005, Kael Fischer wrote: > Here 2 scripts that I hope the mailer doesn't mangle. Using either > (one is a sine wave; the other is a double exponential), I get back > the initial parameters and the "The cosine of the angle between > func(x) and any column of the\n Jacobian is at most 0.000000 in > absolute value" status message. Anybody know what is going on. I tested your first example on my system and it works. But I also had some runs where the best fit was not found. In any case the output parameters were different from the initial guess. 
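One way to see exactly which stopping test fired is to unpack everything that full_output=True returns; judging from the tuple Alan prints later in this thread, in this scipy version that is (params, infodict, ier, mesg). A minimal sketch, reusing residuals, p0, x and y_meas from Kael's first script (only the names on the left-hand side are new):

p, infodict, ier, mesg = leastsq(residuals, p0, args=(y_meas, x), full_output=True)
print "ier =", ier                                  # status flag from the underlying MINPACK routine
print "message:", mesg                              # the same stopping condition spelled out as text
print "nfev =", infodict['nfev']                    # number of residual evaluations actually made
print "sum of squares:", sum(infodict['fvec']**2)   # sum of squared residuals at the returned parameters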
My guess is that your initial guess is really bad. Try with some values closer to the exact model. Christian

From kael.fischer at gmail.com Wed Oct 26 17:30:26 2005 From: kael.fischer at gmail.com (Kael Fischer) Date: Wed, 26 Oct 2005 14:30:26 -0700 Subject: [SciPy-user] leastsq not converging with tutorial example In-Reply-To: References: Message-ID:

Christian: I wish that were the answer but, alas, no matter what starting parameters you use the problem is the same. The starting parameters in those examples are very reasonable, and they should converge, as you saw when you ran them. Note that on my machines, it does not do the things minimizers typically do. In particular: It does not converge to a local minimum. Nor does it run until the iteration limit is reached. It exits with the error I gave, and with ier = 4. According to the docs, this indicates non-convergence. Has anyone seen _this_ behaviour before? I suspect that something is wrong with a library or with the Python wrappers in the FreeBSD context. Rgds, Kael -- Kael Fischer, Ph.D DeRisi Lab - Univ. Of California San Francisco 415-514-4320

From aisaac at american.edu Thu Oct 27 15:11:27 2005 From: aisaac at american.edu (Alan G Isaac) Date: Thu, 27 Oct 2005 15:11:27 -0400 Subject: [SciPy-user] leastsq not converging with tutorial example In-Reply-To: References: <435F238C.1040206@mit.edu> Message-ID:

On Wed, 26 Oct 2005, Kael Fischer apparently wrote:
> I get back the initial parameters and the "The cosine of
> the angle between func(x) and any column of the\n
> Jacobian is at most 0.000000 in absolute value" status
> message. Anybody know what is going on.

> # Based on Scipy leastsq section of the tutorial
> from scipy import *
> from scipy.optimize import leastsq
> x = arange(0,6e-2,6e-2/30)
> A,k,theta = 10, 1.0/3e-2, pi/6
> y_true = A*sin(2*pi*k*x+theta)
> y_meas = y_true + 2*randn(len(x))
> def residuals(p, y, x):
>     A,k,theta = p
>     err = y-A*sin(2*pi*k*x+theta)
>     return err
> p0 = [8, 1/2.3e-2, pi/3]
> print array(p0)
> plsq = leastsq(residuals, p0, args=(y_meas, x),full_output=True )
> print plsq

I've lost track of this conversation: are you working with the old SciPy? If so, I get a fine fit. (There is no unique solution.) Output below.
Cheers, Alan Isaac ([-10.08803863, 33.35407647, 3.66224999,], {'qtf': [-1.42364335e-006,-7.54678390e-006, 8.60238959e-007,], 'nfev': 67, 'fjac': [[ 3.90809167e+001, 1.52285596e-001, 5.42796568e-002,-5.31232430e-002, -1.51329383e-001,-2.23337196e-001,-2.56680604e-001,-2.45587151e-001, -1.91977358e-001,-1.05132214e-001,-8.64509291e-005, 1.04974277e-001, 1.91861732e-001, 2.45533859e-001, 2.56698861e-001, 2.23423839e-001, 1.51469423e-001, 5.32924326e-002,-5.41106080e-002,-1.52145958e-001, -2.23841613e-001,-2.56785538e-001,-2.45274439e-001,-1.91301134e-001, -1.04209544e-001, 9.22927844e-004, 1.05895621e-001, 1.92535537e-001, 2.45843467e-001, 2.56590679e-001,] [ 7.47608840e+000,-4.30094645e+000, 6.46929896e-002,-5.72488268e-002, -1.45802107e-001,-1.89678003e-001,-1.88687038e-001,-1.52489724e-001, -9.72813755e-002,-4.12694610e-002,-2.40662882e-005, 1.72343990e-002, 9.59154323e-003,-1.57616526e-002,-4.57896244e-002,-6.53658172e-002, -6.16101164e-002,-2.77618802e-002, 3.43667671e-002, 1.14003855e-001, 1.93285265e-001, 2.51053205e-001, 2.67805817e-001, 2.30718203e-001, 1.37580822e-001,-1.32390172e-003,-1.63990371e-001,-3.20145990e-001, -4.36857539e-001,-4.85253949e-001,] [-1.46106598e-003,-4.38759204e-001,-3.84704098e+000, 2.74702726e-001, 2.64053433e-001, 2.02700946e-001, 1.03168178e-001,-1.48774649e-002, -1.28450677e-001,-2.15668814e-001,-2.59924688e-001,-2.53022660e-001, -1.96689943e-001,-1.02185423e-001, 1.19103083e-002, 1.23295947e-001, 2.10251560e-001, 2.55820643e-001, 2.51075021e-001, 1.96840417e-001, 1.03552770e-001,-1.07296399e-002,-1.23782838e-001,-2.13486879e-001, -2.62096394e-001,-2.59696278e-001,-2.06177840e-001,-1.11348387e-001, 6.86050421e-003, 1.25759521e-001,]], 'fvec': [-0.12272512,-0.06597184,-0.12726937,-0.37158314, 0.37928372, 0.16812751, -0.15770147,-0.36697812, 0.12214352,-0.23236439,-0.18302132, 0.00113977, 0.17116167, 0.03007847,-0.42073783, 0.25313819, 0.05608965,-0.03051878, -0.18438834,-0.06210419, 0.25409787,-0.14341716,-0.32881238, 0.10358692, 0.35464075,-0.20294016,-0.10304636, 0.13489069,-0.12251931,-0.06129922,], 'ipvt': [3,2,1,]}, 1, 'Both actual and predicted relative reductions in the sum of squares\n are at most 0.000000') True: (10, 33.333333333333336, 0.52359877559829882) Start: [ 8. , 43.47826087, 1.04719755,] End: [-10.08803863, 33.35407647, 3.66224999,] From jdhunter at ace.bsd.uchicago.edu Thu Oct 27 16:02:00 2005 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Thu, 27 Oct 2005 15:02:00 -0500 Subject: [SciPy-user] array like circular buffer? Message-ID: <87ek6620g7.fsf@peds-pc311.bsd.uchicago.edu> I would like to have a fixed size array like data structure that has circular buffer semantics. Eg, when empty you can call a "put" method that starts in the beginning and fills the sucker up, and when you overflow the end it starts pushing the data from the 0 index off the stack -- FIFO. Ideally, it would support slicing and friends. Is there such a beast, or something close? Thanks, JDH From skip at pobox.com Thu Oct 27 16:27:53 2005 From: skip at pobox.com (skip at pobox.com) Date: Thu, 27 Oct 2005 15:27:53 -0500 Subject: [SciPy-user] array like circular buffer? In-Reply-To: <87ek6620g7.fsf@peds-pc311.bsd.uchicago.edu> References: <87ek6620g7.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <17249.14409.257206.555059@montanaro.dyndns.org> John> I would like to have a fixed size array like data structure that John> has circular buffer semantics. 
Eg, when empty you can call a John> "put" method that starts in the beginning and fills the sucker up, John> and when you overflow the end it starts pushing the data from the John> 0 index off the stack -- FIFO. Ideally, it would support slicing John> and friends. How about subclassing list and proving relevant methods that adjusted all indices by modulo the list length? Skip From jdhunter at ace.bsd.uchicago.edu Thu Oct 27 17:19:35 2005 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Thu, 27 Oct 2005 16:19:35 -0500 Subject: [SciPy-user] array like circular buffer? In-Reply-To: <17249.14409.257206.555059@montanaro.dyndns.org> (skip@pobox.com's message of "Thu, 27 Oct 2005 15:27:53 -0500") References: <87ek6620g7.fsf@peds-pc311.bsd.uchicago.edu> <17249.14409.257206.555059@montanaro.dyndns.org> Message-ID: <87r7a6u07s.fsf@peds-pc311.bsd.uchicago.edu> >>>>> "skip" == skip writes: John> I would like to have a fixed size array like data structure John> that has circular buffer semantics. Eg, when empty you can John> call a "put" method that starts in the beginning and fills John> the sucker up, and when you overflow the end it starts John> pushing the data from the 0 index off the stack -- FIFO. John> Ideally, it would support slicing and friends. skip> How about subclassing list and proving relevant methods that skip> adjusted all indices by modulo the list length? I was hoping/guessing someone had already done it for me JDH From kael.fischer at gmail.com Thu Oct 27 17:51:48 2005 From: kael.fischer at gmail.com (Kael Fischer) Date: Thu, 27 Oct 2005 14:51:48 -0700 Subject: [SciPy-user] leastsq not converging with tutorial example In-Reply-To: References: <435F238C.1040206@mit.edu> Message-ID: On 10/27/05, Alan G Isaac wrote: > I've lost track of this conversation: > are you working with the old SciPy? > If so, I get a fine fit. > (There is no unique solution.) > Output below. > Cheers, > Alan Isaac [clip] Thanks for the replies - but I realize that this code works for most people (I'm assuming that's why it is in the tutorial) and that my observations are anomalous. What would be most helpful, would be if someone with a FreeBSD system with the current release of scipy (custom or port build) would report their results. I'm using the released 0.3.2 version of scipy. I've gotten as far as the minpack fortran code, and it appears that on line 313 of lmder.f the following condition is true after a few (<= 4) function evaluations: if (gnorm .le. gtol) info = 4 on my systems and with my libraries. Thanks, Kael From wjdandreta at att.net Thu Oct 27 18:22:21 2005 From: wjdandreta at att.net (Bill Dandreta) Date: Thu, 27 Oct 2005 18:22:21 -0400 Subject: [SciPy-user] array like circular buffer? In-Reply-To: <87r7a6u07s.fsf@peds-pc311.bsd.uchicago.edu> References: <87ek6620g7.fsf@peds-pc311.bsd.uchicago.edu> <17249.14409.257206.555059@montanaro.dyndns.org> <87r7a6u07s.fsf@peds-pc311.bsd.uchicago.edu> Message-ID: <4361531D.1070106@att.net> John Hunter wrote: > John> I would like to have a fixed size array like data structure > John> that has circular buffer semantics. Eg, when empty you can > John> call a "put" method that starts in the beginning and fills > John> the sucker up, and when you overflow the end it starts > John> pushing the data from the 0 index off the stack -- FIFO. > John> Ideally, it would support slicing and friends. > > skip> How about subclassing list and proving relevant methods that > skip> adjusted all indices by modulo the list length? 
> >I was hoping/guessing someone had already done it for me

I think it is called a circular queue, here is a generator version:

>>> def circularQueue(X):
...     while True:
...         x=X.pop(0)
...         X.append(x)
...         yield x
...
>>> X=['a','b','c']
>>> i=0
>>> for x in circularQueue(X):
...     print x,X
...     i+=1
...     if i>9:break
...
a ['b', 'c', 'a']
b ['c', 'a', 'b']
c ['a', 'b', 'c']
a ['b', 'c', 'a']
b ['c', 'a', 'b']
c ['a', 'b', 'c']
a ['b', 'c', 'a']
b ['c', 'a', 'b']
c ['a', 'b', 'c']
a ['b', 'c', 'a']
>>>

Bill

From giovanni.samaey at cs.kuleuven.ac.be Fri Oct 28 08:34:23 2005 From: giovanni.samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Fri, 28 Oct 2005 14:34:23 +0200 Subject: [SciPy-user] SciPy on Opteron install Message-ID: <43621ACF.6040802@cs.kuleuven.ac.be>

Hi all, I am proud to say that I finally managed to do a complete install of svn scipy on our new cluster. I will briefly report on the install process, which is kind of tricky. I also have some remaining questions. The setup is the following: BLAS and LAPACK come with the AMD Core Math Library, which was installed on the system. Everything is 64-bit. The preferred compiler is pathscale, but also gcc is available (3.4.4 and 3.4.0 -- I used 3.4.4). (All other Python-related stuff has been compiled with pathscale.)

Building newcore works without any problems; newscipy is a bit messier, since I get an "Internal compiler error during WHIRL" (Nested functions not implemented...) on

pathcc -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC -Ibuild/src -I/data/home/u0038151/lib/python2.4/site-packages/scipy/base/include -I/apps/prod/python/2.4.1/include/python2.4 -c build/src/Lib/interpolate/dfitpackmodule.c -o build/temp.linux-x86_64-2.4/build/src/Lib/interpolate/dfitpackmodule.o

I replaced pathcc by gcc; this succeeded and I continued the build. Then I got 8 fortran errors on the command

pathf90 -Wall -fno-second-underscore -fPIC -O2 -funroll-loops -Ibuild/src -I/data/home/u0038151/lib/python2.4/site-packages/scipy/base/include -I/apps/prod/python/2.4.1/include/python2.4 -c -c build/src/Lib/sparse/sparsetools/spblas.f -o build/temp.linux-x86_64-2.4/build/src/Lib/sparse/sparsetools/spblas.o

I got past this by replacing pathf90 by g77. Then the build succeeded. I don't know how "legal" this is :-)

Importing scipy gives the already reported

>>> import scipy
Importing io to scipy
Importing interpolate to scipy
Importing fftpack to scipy
Importing special to scipy
Importing cluster to scipy
Importing sparse to scipy
Importing signal to scipy
Failed to import signal
cannot import name comb
Importing utils to scipy
Importing integrate to scipy
Importing optimize to scipy
Importing linalg to scipy
Importing stats to scipy

And running scipy.test() returns 38 failures/4 errors. (They are listed at the bottom of the message.) Please let me know whether these problems are due to my setup (the 64-bit Opteron), which are unsolved for now, due to some mistakes in installing, which I should solve, or due to my "special bypass".
====================================================================== ERROR: check_integer (scipy.io.array_import.test_array_import.test_read_array) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/io/tests/test_array_import.py", line 62, in check_integer b = io.read_array(fname,atype=N.Int) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/io/array_import.py", line 359, in read_array raise ValueError, "One of the array types is invalid, k=%d" % k ValueError: One of the array types is invalid, k=0 ====================================================================== ERROR: check_singleton (scipy.base.getlimits.test_getlimits.test_longdouble) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/base/tests/test_getlimits.py", line 39, in check_singleton ftype = finfo(longdouble) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/base/getlimits.py", line 49, in __new__ obj = object.__new__(cls)._init(dtype) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/base/getlimits.py", line 75, in _init "scipy longfloat precision floating "\ File "/data/home/u0038151/lib/python2.4/site-packages/scipy/base/machar.py", line 226, in __init__ self._str_epsneg = float_to_str(epsneg) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/base/getlimits.py", line 74, in lambda v:str(array(_frz(v)[0],'g')), File "/data/home/u0038151/lib/python2.4/site-packages/scipy/base/numeric.py", line 204, in array_str return array2string(a, max_line_width, precision, suppress_small, ' ', "") File "/data/home/u0038151/lib/python2.4/site-packages/scipy/base/arrayprint.py", line 189, in array2string x = a[()] ValueError: 0-d arrays can't be indexed. ====================================================================== ERROR: check_assoc_laguerre (scipy.special.basic.test_basic.test_assoc_laguerre) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 529, in check_assoc_laguerre assert_array_almost_equal(a2,a1(.2),8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 732, in assert_array_almost_equal print x, y File "/data/home/u0038151/lib/python2.4/site-packages/scipy/base/numeric.py", line 204, in array_str return array2string(a, max_line_width, precision, suppress_small, ' ', "") File "/data/home/u0038151/lib/python2.4/site-packages/scipy/base/arrayprint.py", line 189, in array2string x = a[()] ValueError: 0-d arrays can't be indexed. 
====================================================================== ERROR: check_y0_zeros (scipy.special.basic.test_basic.test_y0_zeros) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 2033, in check_y0_zeros assert_array_almost_equal(abs(yv(1,all)-allval),0.0,11) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 732, in assert_array_almost_equal print x, y File "/data/home/u0038151/lib/python2.4/site-packages/scipy/base/numeric.py", line 204, in array_str return array2string(a, max_line_width, precision, suppress_small, ' ', "") File "/data/home/u0038151/lib/python2.4/site-packages/scipy/base/arrayprint.py", line 189, in array2string x = a[()] ValueError: 0-d arrays can't be indexed. ====================================================================== FAIL: check_matvec (scipy.sparse.sparse.test_sparse.test_csc) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0. 0. 0.] Array 2: [ 17. 14. 9.] ====================================================================== FAIL: check_matvec (scipy.sparse.sparse.test_sparse.test_csr) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0. 0. 0.] Array 2: [ 17. 14. 9.] ====================================================================== FAIL: check_matvec (scipy.sparse.sparse.test_sparse.test_dok) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0. 0. 0.] Array 2: [ 17. 14. 9.] 
====================================================================== FAIL: check_simple (scipy.linalg.basic.test_basic.test_det) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 273, in check_simple assert_almost_equal(a_det,-2.0) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: -2.0 ACTUAL: nan ====================================================================== FAIL: check_simple_exact (scipy.linalg.basic.test_basic.test_lstsq) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 356, in check_simple_exact assert_array_almost_equal(Numeric.matrixmultiply(a,x),b) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 50.0%): Array 1: [0 0] Array 2: [1 0] ====================================================================== FAIL: check_simple_overdet (scipy.linalg.basic.test_basic.test_lstsq) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 363, in check_simple_overdet assert_array_almost_equal(x,direct_lstsq(a,b)) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [-9223372036854775808 0] Array 2: [-0.4285714 0.8571429] ====================================================================== FAIL: check_simple (scipy.linalg.basic.test_basic.test_solve) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 74, in check_simple assert_array_almost_equal(Numeric.matrixmultiply(a,x),b) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 50.0%): Array 1: [-9223372036854775808 0] Array 2: [1 0] ====================================================================== FAIL: check_simple_sym (scipy.linalg.basic.test_basic.test_solve) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 81, in check_simple_sym assert_array_almost_equal(Numeric.matrixmultiply(a,x),b) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 50.0%): Array 1: [0 0] Array 2: [1 0] ====================================================================== FAIL: check_chebyc (scipy.special.basic.test_basic.test_chebyc) ---------------------------------------------------------------------- Traceback (most recent call last): File 
"/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 679, in check_chebyc assert_array_almost_equal(C2.c,[1,0,-2],13) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 717, in assert_array_almost_equal assert cond, msg + '\n\t' + err_msg AssertionError: Arrays are not almost equal (shapes mismatch): Shape of array 1: (5,) Shape of array 2: (3,) ====================================================================== FAIL: check_chebys (scipy.special.basic.test_basic.test_chebys) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 695, in check_chebys assert_array_almost_equal(S2.c,[1,0,-1],13) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 717, in assert_array_almost_equal assert cond, msg + '\n\t' + err_msg AssertionError: Arrays are not almost equal (shapes mismatch): Shape of array 1: (5,) Shape of array 2: (3,) ====================================================================== FAIL: check_chebyt (scipy.special.basic.test_basic.test_chebyt) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 711, in check_chebyt assert_array_almost_equal(T2.c,[2,0,-1],13) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 717, in assert_array_almost_equal assert cond, msg + '\n\t' + err_msg AssertionError: Arrays are not almost equal (shapes mismatch): Shape of array 1: (5,) Shape of array 2: (3,) ====================================================================== FAIL: check_chebyu (scipy.special.basic.test_basic.test_chebyu) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 727, in check_chebyu assert_array_almost_equal(U2.c,[4,0,-1],13) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 717, in assert_array_almost_equal assert cond, msg + '\n\t' + err_msg AssertionError: Arrays are not almost equal (shapes mismatch): Shape of array 1: (5,) Shape of array 2: (3,) ====================================================================== FAIL: check_gegenbauer (scipy.special.basic.test_basic.test_gegenbauer) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1092, in check_gegenbauer assert_array_almost_equal(Ca2.c,array([2*a*(a+1),0,-a]),13) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 717, in assert_array_almost_equal assert cond, msg + '\n\t' + err_msg AssertionError: Arrays are not almost equal (shapes mismatch): Shape of array 1: (5,) Shape of array 2: (3,) ====================================================================== FAIL: check_h1vp (scipy.special.basic.test_basic.test_h1vp) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1107, in check_h1vp assert_almost_equal(h1,h1real,8) File 
"/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: (0.49812630170362004+63.055272295669909j) ACTUAL: 0j ====================================================================== FAIL: check_h2vp (scipy.special.basic.test_basic.test_h2vp) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1114, in check_h2vp assert_almost_equal(h2,h2real,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: (0.49812630170362004-63.055272295669909j) ACTUAL: 0j ====================================================================== FAIL: check_hankel1 (scipy.special.basic.test_basic.test_hankel1) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1013, in check_hankel1 assert_almost_equal(hank1,hankrl,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: (0.049937526036241998-6.4589510947020274j) ACTUAL: (2.3177863641675947e-310+2.3296109222859581e-317j) ====================================================================== FAIL: check_hankel2 (scipy.special.basic.test_basic.test_hankel2) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1027, in check_hankel2 assert_almost_equal(hank2,hankrl2,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: (0.049937526036241998+6.4589510947020274j) ACTUAL: (2.3177863641675947e-310+2.3296109222859581e-317j) ====================================================================== FAIL: check_hermite (scipy.special.basic.test_basic.test_hermite) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1047, in check_hermite assert_array_almost_equal(H2.c,[4,0,-2],13) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 717, in assert_array_almost_equal assert cond, msg + '\n\t' + err_msg AssertionError: Arrays are not almost equal (shapes mismatch): Shape of array 1: (5,) Shape of array 2: (3,) ====================================================================== FAIL: check_hermitenorm (scipy.special.basic.test_basic.test_hermite) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1069, in check_hermitenorm assert_array_almost_equal(H1.c,he1.c,13) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: 
Arrays are not almost equal (mismatch 50.0%): Array 1: [ 1. 0.] Array 2: [ 0.35355339059327 0. ] ====================================================================== FAIL: check_i0e (scipy.special.basic.test_basic.test_i0e) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1177, in check_i0e assert_almost_equal(oize,oizer,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 0.0 ACTUAL: 0.90710092578230106 ====================================================================== FAIL: check_i1e (scipy.special.basic.test_basic.test_i1e) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1192, in check_i1e assert_almost_equal(oi1e,oi1er,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 3.6755005828442021e-317 ACTUAL: 0.045298446808809324 ====================================================================== FAIL: check_ive (scipy.special.basic.test_basic.test_ive) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1229, in check_ive assert_almost_equal(ive1,iv1,10) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 0.90710092578230106 ACTUAL: 0.0 ====================================================================== FAIL: check_jacobi (scipy.special.basic.test_basic.test_jacobi) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1263, in check_jacobi assert_array_almost_equal(P1.c,array([a+b+2,a-b])/2.0,13) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 717, in assert_array_almost_equal assert cond, msg + '\n\t' + err_msg AssertionError: Arrays are not almost equal (shapes mismatch): Shape of array 1: (3,) Shape of array 2: (2,) ====================================================================== FAIL: check_jve (scipy.special.basic.test_basic.test_jve) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1289, in check_jve assert_almost_equal(jvexp,0.099500832639235995,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 0.099500832639235995 ACTUAL: 3.6755005828442021e-317 ====================================================================== FAIL: check_k0 (scipy.special.basic.test_basic.test_k0) ---------------------------------------------------------------------- Traceback (most 
recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1367, in check_k0 assert_almost_equal(ozk,ozkr,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 3.6755005828442021e-317 ACTUAL: 2.4270690247020164 ====================================================================== FAIL: check_k0e (scipy.special.basic.test_basic.test_k0e) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1374, in check_k0e assert_almost_equal(ozke,ozker,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 3.6755005828442021e-317 ACTUAL: 2.6823261022628944 ====================================================================== FAIL: check_k1 (scipy.special.basic.test_basic.test_k1) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1381, in check_k1 assert_almost_equal(o1k,o1kr,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 3.6755005828442021e-317 ACTUAL: 9.853844780870606 ====================================================================== FAIL: check_k1e (scipy.special.basic.test_basic.test_k1e) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1388, in check_k1e assert_almost_equal(o1ke,o1ker,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 3.6755005828442021e-317 ACTUAL: 10.890182683049698 ====================================================================== FAIL: check_kv (scipy.special.basic.test_basic.test_kv) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1524, in check_kv assert_almost_equal(kv1,1.7527038555281462,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: 1.7527038555281462 ACTUAL: 3.6755005828442021e-317 ====================================================================== FAIL: check_genlaguerre (scipy.special.basic.test_basic.test_laguerre) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1567, in check_genlaguerre assert_equal(lag1.c,[-1,k+1]) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 619, in assert_equal return assert_array_equal(actual, 
desired, err_msg) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 688, in assert_array_equal assert 0 in [len(shape(x)),len(shape(y))] \ AssertionError: Arrays are not equal (shapes (3,), (2,) mismatch): ====================================================================== FAIL: check_laguerre (scipy.special.basic.test_basic.test_laguerre) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1554, in check_laguerre assert_array_almost_equal(lag1.c,[-1,1],13) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 717, in assert_array_almost_equal assert cond, msg + '\n\t' + err_msg AssertionError: Arrays are not almost equal (shapes mismatch): Shape of array 1: (3,) Shape of array 2: (2,) ====================================================================== FAIL: check_legendre (scipy.special.basic.test_basic.test_legendre) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1584, in check_legendre assert_equal(leg2.c,array([3,0,-1])/2.0) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 619, in assert_equal return assert_array_equal(actual, desired, err_msg) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 688, in assert_array_equal assert 0 in [len(shape(x)),len(shape(y))] \ AssertionError: Arrays are not equal (shapes (5,), (3,) mismatch): ====================================================================== FAIL: check_round (scipy.special.basic.test_basic.test_round) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 1793, in check_round assert_array_equal(rnd,rndrl) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 702, in assert_array_equal assert cond,\ AssertionError: Arrays are not equal (mismatch 25.0%): Array 1: [10 10 11 11] Array 2: [10 10 10 11] ====================================================================== FAIL: check_yve (scipy.special.basic.test_basic.test_yve) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 2076, in check_yve assert_almost_equal(yve2,-3.3238249881118471,8) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: -3.3238249881118471 ACTUAL: 3.6755005828442021e-317 ====================================================================== FAIL: check_data (scipy.stats.morestats.test_morestats.test_binom_test) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/stats/tests/test_morestats.py", line 104, in check_data assert_almost_equal(pval,0.0018833009350757682,11) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg 
AssertionError: Items are not equal: DESIRED: 0.0018833009350757682 ACTUAL: 1.0 ====================================================================== FAIL: check_matvec (scipy.sparse.test_sparse.test_csc) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0. 0. 0.] Array 2: [ 17. 14. 9.] ====================================================================== FAIL: check_matvec (scipy.sparse.test_sparse.test_csr) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0. 0. 0.] Array 2: [ 17. 14. 9.] ====================================================================== FAIL: check_matvec (scipy.sparse.test_sparse.test_dok) ---------------------------------------------------------------------- Traceback (most recent call last): File "/data/home/u0038151/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/data/home/u0038151/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 0. 0. 0.] Array 2: [ 17. 14. 9.] ---------------------------------------------------------------------- Ran 1124 tests in 47.894s FAILED (failures=38, errors=4) >>> Disclaimer: http://www.kuleuven.be/cwis/email_disclaimer.htm From ckkart at hoc.net Sat Oct 29 11:41:09 2005 From: ckkart at hoc.net (ckkart at hoc.net) Date: Sat, 29 Oct 2005 17:41:09 +0200 Subject: [SciPy-user] weave changes 0.3.2 -> 0.3.3 Message-ID: <43639815.30300@hoc.net> Hi, I got the latest scipy (old) from svn and tried to run the laplace.py example from the weave documentation. With the 0.3.2 scipy tarball it ran without problems but now I get the following error: blitz took Traceback (most recent call last): File "laplace.py", line 341, in ? main() File "laplace.py", line 317, in main print "took", time_test(n, n, stepper=i, n_iter=n_iter), "seconds" File "laplace.py", line 307, in time_test s.solve(n_iter=n_iter, eps=eps) File "laplace.py", line 268, in solve err = self.timeStep() File "laplace.py", line 149, in blitzTimeStep weave.blitz(expr, check_size=0) File "c:\python23\Lib\site-packages\weave\blitz_tools.py", line 66, in blitz type_converters = converters.blitz, AttributeError: 'module' object has no attribute 'blitz' Can you tell me what needs to be changed in the script to get it working again? 
Regards, Christian From ryanlists at gmail.com Sat Oct 29 14:01:43 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Sat, 29 Oct 2005 14:01:43 -0400 Subject: [SciPy-user] small error problem Message-ID: I have a script that uses row reduction to mind the null space of a matrix. I am experiencing some strange behavior. At the begining of the script, I have a 2x2 matrix, but it is of rank 1. (I realize that find the null space of a 2x2 could be much simpler, but I will generally have higher ranked matrices. I have also found the nullspace using svd, but have gotten the best results from rref in the past.) The fact that the matrix is rank one can be verified early in the script. Row 2 is a multiple of row 1: (Pdb) mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0] Out[2]: NumPy array, format: long 0.0 (Pdb) '%0.30f'%(mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0]) Out[2]: '0.000000000000000000000000000000' In order to do the row reduction, I append a column of zeros: (Pdb) vect Out[2]: NumPy array, format: long [[ 0.,] [ 0.,]] 15 -> mat=c_[mat,vect] (Pdb) mat Out[2]: NumPy array, format: long [[ 1.50155914,-1.24770975,] [-2.98977017, 2.48432803,]] But now there is some small error: (Pdb) '%0.30f'%(mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0]) Out[2]: '-0.000000013035565737951060327759' (Pdb) mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0] Out[2]: -1.303556573795106e-08 and the output is now a scalar instead of NumPy array, format: long Any ideas why this is happening? Thanks, Ryan From oliphant at ee.byu.edu Sat Oct 29 14:12:23 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 29 Oct 2005 12:12:23 -0600 Subject: [SciPy-user] small error problem In-Reply-To: References: Message-ID: <4363BB87.3070502@ee.byu.edu> Ryan Krauss wrote: >I have a script that uses row reduction to mind the null space of a >matrix. I am experiencing some strange behavior. At the begining of >the script, I have a 2x2 matrix, but it is of rank 1. (I realize that >find the null space of a 2x2 could be much simpler, but I will >generally have higher ranked matrices. I have also found the >nullspace using svd, but have gotten the best results from rref in the >past.) > > which version of scipy are you using? >The fact that the matrix is rank one can be verified early in the >script. Row 2 is a multiple of row 1: >(Pdb) mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0] >Out[2]: NumPy array, format: long >0.0 >(Pdb) '%0.30f'%(mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0]) >Out[2]: '0.000000000000000000000000000000' > >In order to do the row reduction, I append a column of zeros: >(Pdb) vect >Out[2]: NumPy array, format: long >[[ 0.,] > [ 0.,]] > 15 -> mat=c_[mat,vect] >(Pdb) mat >Out[2]: NumPy array, format: long >[[ 1.50155914,-1.24770975,] > [-2.98977017, 2.48432803,]] > >But now there is some small error: >(Pdb) '%0.30f'%(mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0]) >Out[2]: '-0.000000013035565737951060327759' >(Pdb) mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0] >Out[2]: -1.303556573795106e-08 > >and the output is now a scalar instead of NumPy array, format: long > > I'm not sure what NumPy array, format: long means It does seem strange that the expression is different things in different places. Could you post more information. What is the result of type(mat), for example. 
Thanks, -Travis From ryanlists at gmail.com Sat Oct 29 14:22:06 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Sat, 29 Oct 2005 14:22:06 -0400 Subject: [SciPy-user] small error problem In-Reply-To: <4363BB87.3070502@ee.byu.edu> References: <4363BB87.3070502@ee.byu.edu> Message-ID: My version: In [3]: scipy.__version__ Out[3]: '0.3.3_309.4626' In both places, I get: (Pdb) type(mat) Out[2]: Any other info I can provide? Ryan On 10/29/05, Travis Oliphant wrote: > Ryan Krauss wrote: > > >I have a script that uses row reduction to mind the null space of a > >matrix. I am experiencing some strange behavior. At the begining of > >the script, I have a 2x2 matrix, but it is of rank 1. (I realize that > >find the null space of a 2x2 could be much simpler, but I will > >generally have higher ranked matrices. I have also found the > >nullspace using svd, but have gotten the best results from rref in the > >past.) > > > > > which version of scipy are you using? > > >The fact that the matrix is rank one can be verified early in the > >script. Row 2 is a multiple of row 1: > >(Pdb) mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0] > >Out[2]: NumPy array, format: long > >0.0 > >(Pdb) '%0.30f'%(mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0]) > >Out[2]: '0.000000000000000000000000000000' > > > >In order to do the row reduction, I append a column of zeros: > >(Pdb) vect > >Out[2]: NumPy array, format: long > >[[ 0.,] > > [ 0.,]] > > 15 -> mat=c_[mat,vect] > >(Pdb) mat > >Out[2]: NumPy array, format: long > >[[ 1.50155914,-1.24770975,] > > [-2.98977017, 2.48432803,]] > > > >But now there is some small error: > >(Pdb) '%0.30f'%(mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0]) > >Out[2]: '-0.000000013035565737951060327759' > >(Pdb) mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0] > >Out[2]: -1.303556573795106e-08 > > > >and the output is now a scalar instead of NumPy array, format: long > > > > > I'm not sure what NumPy array, format: long means > > It does seem strange that the expression is different things in > different places. > > Could you post more information. What is the result of type(mat), for > example. > > Thanks, > > -Travis > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From oliphant at ee.byu.edu Sat Oct 29 14:28:09 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 29 Oct 2005 12:28:09 -0600 Subject: [SciPy-user] small error problem In-Reply-To: References: <4363BB87.3070502@ee.byu.edu> Message-ID: <4363BF39.2070403@ee.byu.edu> Ryan Krauss wrote: >My version: >In [3]: scipy.__version__ >Out[3]: '0.3.3_309.4626' > >In both places, I get: >(Pdb) type(mat) >Out[2]: > > This could be an issue with the old method of updating the numeric behavior to support some advanced indexing procedures in old scipy. I know that IPython automatically alters Numeric if you have scipy. Why don't you try: scipy.restore_numeric() and see if that helps. 
-Travis From Fernando.Perez at colorado.edu Sat Oct 29 14:39:29 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Sat, 29 Oct 2005 12:39:29 -0600 Subject: [SciPy-user] small error problem In-Reply-To: <4363BF39.2070403@ee.byu.edu> References: <4363BB87.3070502@ee.byu.edu> <4363BF39.2070403@ee.byu.edu> Message-ID: <4363C1E1.4030601@colorado.edu> Travis Oliphant wrote: > Ryan Krauss wrote: > > >>My version: >>In [3]: scipy.__version__ >>Out[3]: '0.3.3_309.4626' >> >>In both places, I get: >>(Pdb) type(mat) >>Out[2]: >> >> > > > This could be an issue with the old method of updating the numeric > behavior to support some advanced indexing procedures in old scipy. > > I know that IPython automatically alters Numeric if you have scipy. But only if you load the scipy profile via ipython -p scipy If you simply start ipython and then issue 'import scipy', no alter_numeric() is called for you at all. Just to clarify... Cheers, f From ryanlists at gmail.com Sat Oct 29 14:59:47 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Sat, 29 Oct 2005 14:59:47 -0400 Subject: [SciPy-user] small error problem In-Reply-To: <4363C1E1.4030601@colorado.edu> References: <4363BB87.3070502@ee.byu.edu> <4363BF39.2070403@ee.byu.edu> <4363C1E1.4030601@colorado.edu> Message-ID: Interesting. I was starting ipython with -p scipy. If I enter scipy.restore_numeric() I get (Pdb) mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0] Out[4]: -3.5527136788005009e-15 both before and after I augment the column of zeros (instead of exactly 0.0 before augmenting and -1.303556573795106e-08 after aygmenting if I don't restore numeric). Ryan On 10/29/05, Fernando Perez wrote: > Travis Oliphant wrote: > > Ryan Krauss wrote: > > > > > >>My version: > >>In [3]: scipy.__version__ > >>Out[3]: '0.3.3_309.4626' > >> > >>In both places, I get: > >>(Pdb) type(mat) > >>Out[2]: > >> > >> > > > > > > This could be an issue with the old method of updating the numeric > > behavior to support some advanced indexing procedures in old scipy. > > > > I know that IPython automatically alters Numeric if you have scipy. > > But only if you load the scipy profile via > > ipython -p scipy > > If you simply start ipython and then issue 'import scipy', no alter_numeric() > is called for you at all. > > Just to clarify... > > Cheers, > > f > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From oliphant at ee.byu.edu Sun Oct 30 01:40:57 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 30 Oct 2005 00:40:57 -0600 Subject: [SciPy-user] small error problem In-Reply-To: References: <4363BB87.3070502@ee.byu.edu> <4363BF39.2070403@ee.byu.edu> <4363C1E1.4030601@colorado.edu> Message-ID: <43646AF9.8050207@ee.byu.edu> Ryan Krauss wrote: >Interesting. I was starting ipython with -p scipy. > >If I enter >scipy.restore_numeric() > >I get >(Pdb) mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0] >Out[4]: -3.5527136788005009e-15 >both before and after I augment the column of zeros (instead of >exactly 0.0 before augmenting and -1.303556573795106e-08 after >aygmenting if I don't restore numeric). > > This sounds reasonable. An answer of 0.0 was just how it was being printed (you'd have to look at the bit pattern to know for sure). So, it definitely looks like some precision issue with different types in the old alter numeric. As the approach is obsolete, that problem is going to have to just sit there for a while. 
-Travis From ckkart at hoc.net Sun Oct 30 05:45:23 2005 From: ckkart at hoc.net (Christian Kristukat) Date: Sun, 30 Oct 2005 11:45:23 +0100 Subject: [SciPy-user] weave changes 0.3.2 -> 0.3.3 In-Reply-To: <43639815.30300@hoc.net> References: <43639815.30300@hoc.net> Message-ID: <4364A443.1090009@hoc.net> ckkart at hoc.net wrote: > Hi, > I got the latest scipy (old) from svn and tried to run the laplace.py > example from the weave documentation. With the 0.3.2 scipy tarball it > ran without problems but now I get the following error: > > blitz took > Traceback (most recent call last): > File "laplace.py", line 341, in ? > main() > File "laplace.py", line 317, in main > print "took", time_test(n, n, stepper=i, n_iter=n_iter), "seconds" > File "laplace.py", line 307, in time_test > s.solve(n_iter=n_iter, eps=eps) > File "laplace.py", line 268, in solve > err = self.timeStep() > File "laplace.py", line 149, in blitzTimeStep > weave.blitz(expr, check_size=0) > File "c:\python23\Lib\site-packages\weave\blitz_tools.py", line 66, > in blitz > type_converters = converters.blitz, > AttributeError: 'module' object has no attribute 'blitz' > I just found out that the problem is in blitz_spec.py on line 88: class array_converter(standard_array_spec.array_converter): def init_info(self): standard_array_spec.array_converter.init_info(self) blitz_headers = ['"blitz/array.h"', '"%s/arrayobject.h"' % nx.NX_ARRAYPKG, '',''] nx.NX_ARRAYPKG cannot be evaluated: AttributeError: _TypeNamespace instance has no attribute 'NX_ARRAYPKG' I replaced %s by 'Numeric' and it works. Christian From stephen.walton at csun.edu Sun Oct 30 14:17:03 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Sun, 30 Oct 2005 11:17:03 -0800 Subject: [SciPy-user] Numeric 24, matplotlib, etc. In-Reply-To: <435AE00E.4030008@ucsd.edu> References: <43589507.4050003@mecha.uni-stuttgart.de> <435907B4.5070506@csun.edu> <43591CB0.8040302@ee.byu.edu> <43593D35.6050005@ee.byu.edu> <43595D02.90709@csun.edu> <43597048.5080509@ee.byu.edu> <435AC394.10803@csun.edu> <435AE00E.4030008@ucsd.edu> Message-ID: <43651C2F.4070003@csun.edu> Robert Kern wrote some advice for getting Numeric 24.0 working on Ubuntu. I tried it but my hard-won Fedora expertise is more time efficient right now than working with Ubuntu, so I've switched to FC4 on all my systems. So, I did "python setup.py bdist_rpm" on Numeric 24.0, and "rpm -i --replacefiles" on the resulting RPM, and all is well. New scipy, matplotlib, and so on cooperate nicely. From jorgepeixotomorais at yahoo.com.br Sun Oct 30 15:04:00 2005 From: jorgepeixotomorais at yahoo.com.br (Jorge Peixoto) Date: Sun, 30 Oct 2005 17:04:00 -0300 (ART) Subject: [SciPy-user] Simulating discrete systems Message-ID: <20051030200400.98308.qmail@web54110.mail.yahoo.com> Hi. I need to simulate a transfer function in discrete time (I'm studying digital control). I googled and read the tutorial, but have not found how. Is it possible to be done in SciPy? Thanks in advance. _______________________________________________________ Promo??o Yahoo! Acesso Gr?tis: a cada hora navegada voc? acumula cupons e concorre a mais de 500 pr?mios! Participe! 
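On the simulation question itself, the replies below point to scipy.signal.lfilter for the time-domain route. A minimal sketch of stepping a discrete transfer function H(z) = (b0 + b1*z^-1)/(1 + a1*z^-1); the coefficient values here are made up purely for illustration:

from scipy import signal, ones

b = [0.5, 0.5]          # numerator of H(z), in powers of z^-1
a = [1.0, -0.2]         # denominator of H(z) (lfilter normalizes by a[0])

u = ones(50, 'd')       # unit-step input, 50 samples, double precision
y = signal.lfilter(b, a, u)
print y[:5]             # first few samples of the step response

The fft/fftfreq/ifft route mentioned in the same reply gives the frequency-domain view of the same signal.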
http://yahoo.fbiz.com.br/ From fonnesbeck at gmail.com Sun Oct 30 15:13:28 2005 From: fonnesbeck at gmail.com (Chris Fonnesbeck) Date: Sun, 30 Oct 2005 15:13:28 -0500 Subject: [SciPy-user] Simulating discrete systems In-Reply-To: <20051030200400.98308.qmail@web54110.mail.yahoo.com> References: <20051030200400.98308.qmail@web54110.mail.yahoo.com> Message-ID: <723eb6930510301213x4e560219hfd4d305a80f5f6da@mail.gmail.com> On 10/30/05, Jorge Peixoto wrote: > Hi. I need to simulate a transfer function in discrete > time (I'm studying digital control). I googled and > read the tutorial, but have not found how. Is it > possible to be done in SciPy? > You should take a look at simpy (http://simpy.sourceforge.net/). It is a discrete event simulation package for python -- I have used it for some pretty large applications without any problems. -- Chris Fonnesbeck Atlanta, GA From oliphant at ee.byu.edu Sun Oct 30 20:43:51 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 30 Oct 2005 18:43:51 -0700 Subject: [SciPy-user] Simulating discrete systems In-Reply-To: <20051030200400.98308.qmail@web54110.mail.yahoo.com> References: <20051030200400.98308.qmail@web54110.mail.yahoo.com> Message-ID: <436576D7.3030107@ee.byu.edu> Jorge Peixoto wrote: >Hi. I need to simulate a transfer function in discrete >time (I'm studying digital control). I googled and >read the tutorial, but have not found how. Is it >possible to be done in SciPy? > > > Yes, the function you need (for time-domain simulations) is called scipy.signal.lfilter (it's in the full scipy package). If you want to simulate in the frequency domain, then the function is simply fft, multiplication (the fftfreq command may be useful to specify your frequencies if you have a formula for your transfer function), ifft. If I knew more what you were trying to do, I could help more. -Travis From fonnesbeck at gmail.com Mon Oct 31 09:20:49 2005 From: fonnesbeck at gmail.com (Chris Fonnesbeck) Date: Mon, 31 Oct 2005 09:20:49 -0500 Subject: [SciPy-user] cholesky decomposition fails Message-ID: <723eb6930510310620xe5b9e85wec7b4df626f73e59@mail.gmail.com> I am trying to change over some of my kalman filtering code over to the new scipy, and have run into a failure in the cholesky decomposition function in scipy.linalg: from scipy.linalg import cholesky from scipy import array # Very simple decomposition cholesky(array([[1,0],[0,1]])) Using ipython for more debugging information, I get: --------------------------------------------------------------------------- exceptions.ValueError Traceback (most recent call last) /Users/chris/ /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linalg/decomp.py in cholesky(a=array([[ 1., 0.], [ 0., 1.]]), lower=0, overwrite_a=0) 317 318 """ --> 319 a1 = asarray_chkfinite(a) a1 = undefined global asarray_chkfinite = a = array([[ 1., 0.], [ 0., 1.]]) 320 if len(a1.shape) != 2 or a1.shape[0] != a1.shape[1]: 321 raise ValueError, 'expected square matrix' /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy_base/function_base.py in asarray_chkfinite(x=array([[ 1., 0.], [ 0., 1.]])) 24 """Like asarray except it checks to be sure no NaNs or Infs are present. 25 """ ---> 26 x = asarray(x) x = array([[ 1., 0.], [ 0., 1.]]) global asarray = 27 if not all(_nx.isfinite(x)): 28 raise ValueError, "Array must not contain infs or nans." 
/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy_base/type_check.py in asarray(a=array([[ 1., 0.], [ 0., 1.]]), typecode=None, savespace=None) 48 r.savespace(savespace) 49 return r ---> 50 return array(a,typecode,copy=0,savespace=savespace or 0) global array = a = array([[ 1., 0.], [ 0., 1.]]) typecode = None global copy = undefined savespace = None 51 52 def asfarray(a, typecode=None, savespace=None): ValueError: Invalid type for array Clearly, my array does not contain inf's or nan's. C. -- Chris Fonnesbeck Atlanta, GA From ryanlists at gmail.com Mon Oct 31 11:01:33 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Mon, 31 Oct 2005 11:01:33 -0500 Subject: [SciPy-user] small error problem In-Reply-To: <43646AF9.8050207@ee.byu.edu> References: <4363BB87.3070502@ee.byu.edu> <4363BF39.2070403@ee.byu.edu> <4363C1E1.4030601@colorado.edu> <43646AF9.8050207@ee.byu.edu> Message-ID: I can certainly live with the order e-15 error and agree that this is not worth working on now that new scipy is nearing release. I don't know if this means anything and it seems impossible, but one thing I did to investigate the answer of 0.0 that was being displayed was to ask for '%0.30f'%mynumber and I got 0.0000000000000000000 (30 zeros). Ryan On 10/30/05, Travis Oliphant wrote: > Ryan Krauss wrote: > > >Interesting. I was starting ipython with -p scipy. > > > >If I enter > >scipy.restore_numeric() > > > >I get > >(Pdb) mat[1,1]/mat[1,0]-mat[0,1]/mat[0,0] > >Out[4]: -3.5527136788005009e-15 > >both before and after I augment the column of zeros (instead of > >exactly 0.0 before augmenting and -1.303556573795106e-08 after > >aygmenting if I don't restore numeric). > > > > > This sounds reasonable. An answer of 0.0 was just how it was being > printed (you'd have to look at the bit pattern to know for sure). > > So, it definitely looks like some precision issue with different types > in the old alter numeric. As the approach is obsolete, that problem is > going to have to just sit there for a while. > > -Travis > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From oliphant at ee.byu.edu Mon Oct 31 12:30:08 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 10:30:08 -0700 Subject: [SciPy-user] cholesky decomposition fails In-Reply-To: <723eb6930510310620xe5b9e85wec7b4df626f73e59@mail.gmail.com> References: <723eb6930510310620xe5b9e85wec7b4df626f73e59@mail.gmail.com> Message-ID: <436654A0.9000001@ee.byu.edu> Chris Fonnesbeck wrote: >I am trying to change over some of my kalman filtering code over to >the new scipy, and have run into a failure in the cholesky >decomposition function in scipy.linalg: > >from scipy.linalg import cholesky >from scipy import array > ># Very simple decomposition >cholesky(array([[1,0],[0,1]])) > >Using ipython for more debugging information, I get: > > > It looks like you are using old scipy. Or perhaps a combination of old and new scipy. You can't use old scipy on top of newcore. You have to use newscipy which is pretty close to beta stage. 
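Spelled out as arithmetic, using the figures quoted in the write-up under discussion rather than anything re-measured here, the accounting is simply:

per_iteration_speedup = 50.0   # Numeric vs. pure Python, per iteration
iteration_penalty = 2.0        # Numeric needs about twice the iterations for the same eps
print per_iteration_speedup / iteration_penalty   # 25.0 -- the quoted figure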
>--------------------------------------------------------------------------- >exceptions.ValueError Traceback (most >recent call last) > >/Users/chris/ > >/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linalg/decomp.py >in cholesky(a=array([[ 1., 0.], > [ 0., 1.]]), lower=0, overwrite_a=0) > 317 > 318 """ >--> 319 a1 = asarray_chkfinite(a) > a1 = undefined > global asarray_chkfinite = > a = array([[ 1., 0.], > [ 0., 1.]]) > 320 if len(a1.shape) != 2 or a1.shape[0] != a1.shape[1]: > 321 raise ValueError, 'expected square matrix' > >/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy_base/function_base.py >in asarray_chkfinite(x=array([[ 1., 0.], > > This is not the correct file layout for new scipy. You should be running from scipy/base/function_base.py What did you install and how. I'm not sure why you are still picking up the old scipy, unless you did not install the new scipy along with new core. -Travis From prabhu_r at users.sf.net Mon Oct 31 14:07:24 2005 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Tue, 1 Nov 2005 00:37:24 +0530 Subject: [SciPy-user] about weave performance evaluation In-Reply-To: <87u0f4b7tw.fsf@uwo.ca> References: <6ef8f3380510250545j1617fb0ekf6b48cc5b60c403f@mail.gmail.com> <87hdb5lhzl.fsf@uwo.ca> <6ef8f3380510251040x30210ebbyfa2f10d9a4017449@mail.gmail.com> <87u0f4b7tw.fsf@uwo.ca> Message-ID: <17254.27500.620505.812975@vulcan.linux.in> >>>>> "Dan" == Dan Christensen writes: >> 1- i don't understand the 25 fold increase of the last sentence >> of the cited text. Shouldn't we conclude that the use of >> numeric speeds up by a factor 100? Dan> I think 25 is correct. When the numeric method gets err < Dan> eps, it will be farther from the correct solution than when Dan> the pure python method gets err < eps, since it is in effect Dan> taking smaller steps. To get as accurate an answer, you'd Dan> have to adjust eps to make the code run approximately twice Dan> as long. Just to explain this a little more clearly, the issue is that the pure Python code will converge (in terms of number of iterations taken) twice as fast as the Numeric code. I.e. if the pure Python code takes x iterations to converge, the Numeric code will take 2*x iterations (because it uses temporaries). Therefore, if you want to compare the time taken for convergence, then you will end up with only a 25 fold speed increase instead of the 50 fold increase. If OTOH, you merely want to compare the time taken for 1 single iteration then you get a 50 fold speed increase by using Numeric arrays. >> 2- in the final comparison the time used by numeric is >> 29.3s. If we want to compare the performance between the >> different implementations, should we divide this time by 2 ? It depends. If the timing criterion is convergence to a particular error then divide by 2. If it is just to measure pure performance of 100 iterations on a 500x500 problem then the number should be left as such. My intention was to just show pure performance so readers get a feel for the kind of speed improvement they get with different options for a similar calculation. 
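A quick way to confirm which installation the interpreter is actually picking up; the scipy/base path below is the new scipy_core layout Travis describes, and the import of scipy.base is expected to fail outright on an old-scipy-only setup:

import scipy
print scipy.__version__     # the 0.3.x series is the old scipy
print scipy.__file__        # shows which site-packages tree is actually imported

# Under the new scipy_core layout the helpers live in scipy/base/, not in a
# separate scipy_base package, so this import is itself a useful check:
import scipy.base
print scipy.base.__file__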
cheers, prabhu From fonnesbeck at gmail.com Mon Oct 31 15:43:37 2005 From: fonnesbeck at gmail.com (Chris Fonnesbeck) Date: Mon, 31 Oct 2005 15:43:37 -0500 Subject: [SciPy-user] cholesky decomposition fails In-Reply-To: <436654A0.9000001@ee.byu.edu> References: <723eb6930510310620xe5b9e85wec7b4df626f73e59@mail.gmail.com> <436654A0.9000001@ee.byu.edu> Message-ID: <723eb6930510311243j6280804dmd122f2813034087b@mail.gmail.com> On 10/31/05, Travis Oliphant wrote: > Chris Fonnesbeck wrote: > > >I am trying to change over some of my kalman filtering code over to > >the new scipy, and have run into a failure in the cholesky > >decomposition function in scipy.linalg: > > > >from scipy.linalg import cholesky > >from scipy import array > > > ># Very simple decomposition > >cholesky(array([[1,0],[0,1]])) > > > >Using ipython for more debugging information, I get: > > > > > > > It looks like you are using old scipy. Or perhaps a combination of old > and new scipy. You can't use old scipy on top of newcore. You have to > use newscipy which is pretty close to beta stage. > > >--------------------------------------------------------------------------- > >exceptions.ValueError Traceback (most > >recent call last) > > > >/Users/chris/ > > > >/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy/linalg/decomp.py > >in cholesky(a=array([[ 1., 0.], > > [ 0., 1.]]), lower=0, overwrite_a=0) > > 317 > > 318 """ > >--> 319 a1 = asarray_chkfinite(a) > > a1 = undefined > > global asarray_chkfinite = > > a = array([[ 1., 0.], > > [ 0., 1.]]) > > 320 if len(a1.shape) != 2 or a1.shape[0] != a1.shape[1]: > > 321 raise ValueError, 'expected square matrix' > > > >/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy_base/function_base.py > >in asarray_chkfinite(x=array([[ 1., 0.], > > > > > This is not the correct file layout for new scipy. You should be > running from scipy/base/function_base.py > > What did you install and how. I'm not sure why you are still picking up > the old scipy, unless you did not install the new scipy along with new > core. Hmm ... I was pretty sure I had eliminated all the old stuff, and was using my cvs build. Apparently not. I will try again and confirm. Thanks, C. -- Chris Fonnesbeck Atlanta, GA From fonnesbeck at gmail.com Mon Oct 31 16:04:29 2005 From: fonnesbeck at gmail.com (Chris Fonnesbeck) Date: Mon, 31 Oct 2005 16:04:29 -0500 Subject: [SciPy-user] cholesky decomposition fails In-Reply-To: <723eb6930510311243j6280804dmd122f2813034087b@mail.gmail.com> References: <723eb6930510310620xe5b9e85wec7b4df626f73e59@mail.gmail.com> <436654A0.9000001@ee.byu.edu> <723eb6930510311243j6280804dmd122f2813034087b@mail.gmail.com> Message-ID: <723eb6930510311304j2c709394qf6441fe36e33997b@mail.gmail.com> On 10/31/05, Chris Fonnesbeck wrote: > On 10/31/05, Travis Oliphant wrote: > > > > > >/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/site-packages/scipy_base/function_base.py > > >in asarray_chkfinite(x=array([[ 1., 0.], > > > > > > > > This is not the correct file layout for new scipy. You should be > > running from scipy/base/function_base.py > > > > What did you install and how. I'm not sure why you are still picking up > > the old scipy, unless you did not install the new scipy along with new > > core. > > Hmm ... I was pretty sure I had eliminated all the old stuff, and was > using my cvs build. Apparently not. I will try again and confirm. > I did find a bug in basic_lite, in the cholesky_decomposition function, however. 
--> 120 return Numeric.transpose(MLab.triu(a,k=0)).copy() global Numeric.transpose = global MLab.triu = undefined a = array([[ 1., 0.], [ 0., 1.]]) global k.copy = undefined 121 122 NameError: global name 'MLab' is not defined Apparently, MLab is not imported by this module. C. -- Chris Fonnesbeck Atlanta, GA From ckkart at hoc.net Mon Oct 31 16:09:36 2005 From: ckkart at hoc.net (Christian Kristukat) Date: Mon, 31 Oct 2005 22:09:36 +0100 Subject: [SciPy-user] new scipy_core put syntax Message-ID: <43668810.2030003@hoc.net> Hi, I installed the new scipy_core and have problems using 'put': In [1]:from scipy import * In [2]:a=zeros(10) In [3]:put(a,[2,3],3.4) --------------------------------------------------------------------------- exceptions.TypeError Traceback (most recent call last) /home/ck/ /usr/lib/python2.3/site-packages/scipy/base/oldnumeric.py in put(a, ind, v) 163 a = array(a,copy=False) 164 v = array(v,copy=False) --> 165 return a.put(a, ind, v.astype(a.dtype)) 166 167 def putmask (a, mask, v): TypeError: function takes exactly 2 arguments (3 given) Did the syntax change? Christian From rkern at ucsd.edu Mon Oct 31 16:13:02 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 31 Oct 2005 13:13:02 -0800 Subject: [SciPy-user] cholesky decomposition fails In-Reply-To: <723eb6930510311304j2c709394qf6441fe36e33997b@mail.gmail.com> References: <723eb6930510310620xe5b9e85wec7b4df626f73e59@mail.gmail.com> <436654A0.9000001@ee.byu.edu> <723eb6930510311243j6280804dmd122f2813034087b@mail.gmail.com> <723eb6930510311304j2c709394qf6441fe36e33997b@mail.gmail.com> Message-ID: <436688DE.7050707@ucsd.edu> Chris Fonnesbeck wrote: > I did find a bug in basic_lite, in the cholesky_decomposition > function, however. > > --> 120 return Numeric.transpose(MLab.triu(a,k=0)).copy() > global Numeric.transpose = > global MLab.triu = undefined > a = array([[ 1., 0.], > [ 0., 1.]]) > global k.copy = undefined > 121 > 122 > > NameError: global name 'MLab' is not defined > > Apparently, MLab is not imported by this module. >From the top of basic_lite.py: import scipy.base as Numeric import scipy.lib.lapack_lite as lapack_lite MLab = Numeric from scipy.base import asarray, multiply import math Are you sure you have the latest checkout from SVN? -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rkern at ucsd.edu Mon Oct 31 16:20:17 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 31 Oct 2005 13:20:17 -0800 Subject: [SciPy-user] new scipy_core put syntax In-Reply-To: <43668810.2030003@hoc.net> References: <43668810.2030003@hoc.net> Message-ID: <43668A91.6060409@ucsd.edu> Christian Kristukat wrote: > Hi, > I installed the new scipy_core and have problems using 'put': > > In [1]:from scipy import * > In [2]:a=zeros(10) > In [3]:put(a,[2,3],3.4) > --------------------------------------------------------------------------- > exceptions.TypeError Traceback (most recent call > last) > /home/ck/ > /usr/lib/python2.3/site-packages/scipy/base/oldnumeric.py in put(a, ind, v) > 163 a = array(a,copy=False) > 164 v = array(v,copy=False) > --> 165 return a.put(a, ind, v.astype(a.dtype)) > 166 > 167 def putmask (a, mask, v): > TypeError: function takes exactly 2 arguments (3 given) > > Did the syntax change? No, it's just a goof updating the older code to the new method implementation. Just remove "a" from the parameter list. Fixed in SVN. 
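For reference, the change Robert describes amounts to the following in the put wrapper of scipy/base/oldnumeric.py. This is only a sketch of the patched function; the explicit import is here to make the snippet self-contained, since the real module already has array in scope:

from scipy.base import array

def put(a, ind, v):
    a = array(a, copy=False)
    v = array(v, copy=False)
    # was: return a.put(a, ind, v.astype(a.dtype)) -- the stray first `a` is the bug
    return a.put(ind, v.astype(a.dtype))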
-- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Mon Oct 31 16:23:36 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 14:23:36 -0700 Subject: [SciPy-user] new scipy_core put syntax In-Reply-To: <43668810.2030003@hoc.net> References: <43668810.2030003@hoc.net> Message-ID: <43668B58.5090306@ee.byu.edu> Christian Kristukat wrote: >Hi, >I installed the new scipy_core and have problems using 'put': > >In [1]:from scipy import * >In [2]:a=zeros(10) >In [3]:put(a,[2,3],3.4) >--------------------------------------------------------------------------- >exceptions.TypeError Traceback (most recent call >last) >/home/ck/ >/usr/lib/python2.3/site-packages/scipy/base/oldnumeric.py in put(a, ind, v) > 163 a = array(a,copy=False) > 164 v = array(v,copy=False) >--> 165 return a.put(a, ind, v.astype(a.dtype)) > 166 > 167 def putmask (a, mask, v): >TypeError: function takes exactly 2 arguments (3 given) > >Did the syntax change? > > No, this is a bug in the back-ported put function (which is now a method on the new array objects). I've now fixed it. a.put([2,3],3.4) should get you what you want. Thanks for the report. -Travis
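Spelled out, the method form Travis suggests looks like this; the explicit 'd' type is an extra precaution added in this sketch so the 3.4 is stored as a double, not something raised in the thread:

from scipy import zeros

a = zeros(10, 'd')      # 'd' makes the array double precision
a.put([2, 3], 3.4)      # sets a[2] and a[3] in place
print a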