From jose.martin at wanadoo.es Sun Feb 1 20:28:47 2004 From: jose.martin at wanadoo.es (=?iso-8859-1?Q?Jos=E9_A_Mart=EDn_H?=) Date: Mon, 2 Feb 2004 02:28:47 +0100 Subject: [SciPy-user] PYTHON23 WINDOWS BINARIES Message-ID: <000f01c3e92b$e8d0a800$ba16243e@CASA> IN ORDER TO USE THE SCIPY WITH PYTHON23 IN WIN32 FROM BINARY INSTALATION YOU MUST INSTALL THE SCIPY CORE PACKAGE I HAVE INSTALLED THIS, I DONT HAVE ANY GARRANTY THAT THIS PACKAGE DONT HAVE VIRUS BUT I HAVE INSTALLED AND SCIPY NOW WORK WELL. THANKS TO GARY RUBEN... http://users.bigpond.net.au/gazzar/Scipy_core-0.2.1_alpha_40.1444.win32-py23.exe Please download this package and install it before using scipy binary installation for python 23. Jose. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gazzar at email.com Sun Feb 1 23:11:29 2004 From: gazzar at email.com (Gary Ruben) Date: Mon, 02 Feb 2004 14:11:29 +1000 Subject: [SciPy-user] PYTHON23 WINDOWS BINARIES Message-ID: <20040202041129.11254.qmail@email.com> I noticed José was having what seemed to be the same problem as Gey-Hong Gweon and I, so I made the scipy-core installer which Travis sent me offlist available for him to download. The simplest way to do this was to make it available on my site. However, I didn't intend this to be made public. Actually, I wouldn't mind if only a handful of people downloaded this, but after a few tens of people, my ISP will probably limit access, so it's not really practical to leave it there. So, can whoever built the scipy Windows binary installer for Python 2.3 and included it on the download page please include scipy-core with it, or at least host the scipy-core installer on scipy.org and link to it from the downloads page. thanks, Gary ----- Original Message ----- From: José
A Martín H Date: Mon, 2 Feb 2004 02:28:47 +0100 To: "SciPy Users List" Subject: [SciPy-user] PYTHON23 WINDOWS BINARIES > IN ORDER TO USE THE SCIPY WITH PYTHON23 IN WIN32 FROM BINARY INSTALATION > YOU MUST INSTALL THE SCIPY CORE PACKAGE > > I HAVE INSTALLED THIS, I DONT HAVE ANY GARRANTY THAT THIS PACKAGE DONT HAVE VIRUS BUT I HAVE INSTALLED AND SCIPY NOW WORK WELL. > > THANKS TO GARY RUBEN... > > http://users.bigpond.net.au/gazzar/Scipy_core-0.2.1_alpha_40.1444.win32-py23.exe > > Please download this package and install it before using scipy binary installation for python 23. > > Jose. -- ___________________________________________________________ Sign-up for Ads Free at Mail.com http://promo.mail.com/adsfreejump.htm From pearu at scipy.org Mon Feb 2 17:54:24 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 2 Feb 2004 16:54:24 -0600 (CST) Subject: [SciPy-user] Scipy_core win32 installer available In-Reply-To: <20040202041129.11254.qmail@email.com> Message-ID: Hi, You can find Scipy_core windows installer at my scipy page: http://cens.ioc.ee/~pearu/scipy/ Scipy_core requires that the following win32 software is installed: Python: 2.3.3, I got it from www.python.org Numeric: 23.1, I got it from www.numpy.org After installing Scipy_core, recent Scipy win32 port from scipy.org should work (though I haven't tested this myself, yet). Enjoy! Pearu PS: this is my first python extension built for win32 platform, I shall celebrate this with a cup of tea.. From chris at fonnesbeck.org Wed Feb 4 14:19:17 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Wed, 4 Feb 2004 14:19:17 -0500 Subject: [SciPy-user] distutils problem on cvs install under python 2.3 Message-ID: <06CCAB30-5747-11D8-838F-000A956FDAC0@fonnesbeck.org> When I attempt a build of the latest cvs on a clean Debian Linux install of python2.3, I get this old error message: chris at redhead:/usr/src/scipy$ python setup.py build Traceback (most recent call last): File "setup.py", line 25, in ?
from scipy_distutils.core import setup File "scipy_core/scipy_distutils/core.py", line 1, in ? from distutils.core import * ImportError: No module named distutils.core I thought this bug in distutils was squashed long ago. Am I missing something here? Thanks, C. -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From pearu at scipy.org Wed Feb 4 14:39:57 2004 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 4 Feb 2004 13:39:57 -0600 (CST) Subject: [SciPy-user] distutils problem on cvs install under python 2.3 In-Reply-To: <06CCAB30-5747-11D8-838F-000A956FDAC0@fonnesbeck.org> Message-ID: On Wed, 4 Feb 2004, Christopher Fonnesbeck wrote: > When I attempt a build of the latest cvs on a clean Debian Linux > install of python2.3, I get this old error message: > > chris at redhead:/usr/src/scipy$ python setup.py build > Traceback (most recent call last): > File "setup.py", line 25, in ? > from scipy_distutils.core import setup > File "scipy_core/scipy_distutils/core.py", line 1, in ? > from distutils.core import * > ImportError: No module named distutils.core Could you verify that adding `import distutils` before the line #1 fixes this import error? > I thought this bug in distutils was squashed long ago. Am I missing > something here? Strange, I am on debian too with Python 2.3.3 and I don't get this error. Btw, do you have write permissions to /usr/src/scipy/scipy_core/scipy_distutils directory? I noticed that the traceback comes from .py file and not from .pyc file. 
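A generic diagnostic for an ImportError like the one above (not from the thread, and the function name is mine): ask the interpreter where it would load a module from, and inspect sys.path for a stray entry shadowing the standard library. In the Python 2.3 of the period, imp.find_module served this role; the sketch uses the modern importlib API:

```python
import importlib.util
import sys

def where(modname):
    """Return the file a top-level module would be imported from, or None."""
    spec = importlib.util.find_spec(modname)
    return getattr(spec, 'origin', None)

# On the failing machine one would try where('distutils') and then look
# through sys.path for an entry shadowing the standard library.
print(where('os'))
print(sys.path[:3])
```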
Pearu From chris at fonnesbeck.org Wed Feb 4 15:58:10 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Wed, 4 Feb 2004 15:58:10 -0500 Subject: [SciPy-user] distutils problem on cvs install under python 2.3 In-Reply-To: References: Message-ID: On Feb 4, 2004, at 2:39 PM, Pearu Peterson wrote: > > > On Wed, 4 Feb 2004, Christopher Fonnesbeck wrote: > >> When I attempt a build of the latest cvs on a clean Debian Linux >> install of python2.3, I get this old error message: >> >> chris at redhead:/usr/src/scipy$ python setup.py build >> Traceback (most recent call last): >> File "setup.py", line 25, in ? >> from scipy_distutils.core import setup >> File "scipy_core/scipy_distutils/core.py", line 1, in ? >> from distutils.core import * >> ImportError: No module named distutils.core > > Could you verify that adding `import distutils` before the line #1 > fixes this import error? It does not! Very strange. > >> I thought this bug in distutils was squashed long ago. Am I missing >> something here? > > Strange, I am on debian too with Python 2.3.3 and I don't get this > error. > Btw, do you have write permissions to > /usr/src/scipy/scipy_core/scipy_distutils directory? I noticed that > the traceback comes from .py file and not from .pyc file. I tried this running as root, so permissions should not be a problem. C. -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From ehaux at ucmerced.edu Thu Feb 5 17:06:48 2004 From: ehaux at ucmerced.edu (ehaux at ucmerced.edu) Date: Thu, 05 Feb 2004 14:06:48 -0800 Subject: [SciPy-user] Re: need help with gcc error( digest Vol 5, Issue 7) Message-ID: <8282887c2b.87c2b82828@ucmerced.edu> following up from Jan 27.... 
my system_info.py output is this: atlas_info: FOUND: libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/atlas', '/usr/lib'] blas_info: NOT AVAILABLE blas_src_info: NOT AVAILABLE dfftw_info: NOT AVAILABLE dfftw_threads_info: NOT AVAILABLE djbfft_info: NOT AVAILABLE fftw_info: FOUND: libraries = ['rfftw', 'fftw'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_FFTW_H', None)] include_dirs = ['/usr/include'] fftw_threads_info: FOUND: libraries = ['rfftw_threads', 'fftw_threads'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_FFTW_THREADS_H', None)] include_dirs = ['/usr/include'] lapack_info: NOT AVAILABLE lapack_src_info: NOT AVAILABLE sfftw_info: NOT AVAILABLE sfftw_threads_info: NOT AVAILABLE x11_info: NOT AVAILABLE I'm running Debian on a pIII and from $ uname -a : Linux 2.6.1 #2 Tue Jan 20 15:33:43 PST 2004 i686 GNU/Linux locate did not find Xlib.h Eric > Message: 1 > Date: Tue, 27 Jan 2004 01:35:02 -0600 (CST) > From: Pearu Peterson > Subject: Re: [SciPy-user] need help with gcc error > To: SciPy Users List > > > can anyone help with this?? > > i downloaded from cvs and did: > > $python setup.py build > > > > but got the following errors: > > > > uild/temp.linux-i686-2.3/Lib/xplt/src/play/x11/colors.o - > DGISTPATH="/usr/lib/python2.3/site-packages/scipy/xplt/gistdata" > > In file included from Lib/xplt/src/play/x11/colors.c:9: > > Lib/xplt/src/play/x11/playx.h:11:22: X11/Xlib.h: No such file or > directory > ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^What is the output of > > python scipy_core/scipy_distutils/system_info.py > > ? > Where the header file Xlib.h is located in your system? > And could you give more details about your system > (sys.platform,os.name,uname -a, etc)? 
> > Pearu From ehaux at ucmerced.edu Thu Feb 5 17:22:00 2004 From: ehaux at ucmerced.edu (ehaux at ucmerced.edu) Date: Thu, 05 Feb 2004 14:22:00 -0800 Subject: [SciPy-user] Re: need help with gcc error (Digest, Vol 5, Issue 7) Message-ID: <84d44854ec.854ec84d44@ucmerced.edu> Also following up from Jan 27... I get the following when using this script: :/usr/lib/python2.3/site-packages/scipy$ python setup.py /usr/bin/python scipy_core/setup.py SciPy Core Version 0.2.1_alpha_40.1448 usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...] or: setup.py --help [cmd1 cmd2 ...] or: setup.py --help-commands or: setup.py cmd --help error: no commands supplied Traceback (most recent call last): File "setup.py", line 127, in ? setup_package(ignore_packages) File "setup.py", line 77, in setup_package assert not s,'failed on scipy_core' AssertionError: failed on scipy_core > > Message: 3 > Date: Tue, 27 Jan 2004 06:17:31 -500 > From: scipy at zunzun.com (James R. Phillips) > Subject: Re: [SciPy-user] need help with gcc error > To: scipy-user at scipy.net > Message-ID: <401648cb9456c1.76981309 at mercury.sabren.com> > Content-Type: text/plain; charset="iso-8859-1" > > > i downloaded from cvs and did $python setup.py build > > but got the following errors: > > > > Lib/xplt/src/play/x11/playx.h:11:22: X11/Xlib.h: No such file or > directory > You may need to configure the setup.py file for > scipy to not build anything with X windows. Here > is the file I used as an example: > > http://zunzun.com/no_X_setup.py > > James Phillips > From pearu at scipy.org Fri Feb 6 03:26:08 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 6 Feb 2004 02:26:08 -0600 (CST) Subject: [SciPy-user] Re: need help with gcc error( digest Vol 5, Issue 7) In-Reply-To: <8282887c2b.87c2b82828@ucmerced.edu> Message-ID: On Thu, 5 Feb 2004 ehaux at ucmerced.edu wrote: > following up from Jan 27.... 
> > my system_info.py output is this: > x11_info: > NOT AVAILABLE > > > I'm running Debian on a pIII and from $ uname -a : > > Linux 2.6.1 #2 Tue Jan 20 15:33:43 PST 2004 i686 GNU/Linux > > locate did not find Xlib.h You'll need to install xlibs-dev. Pearu From pearu at scipy.org Fri Feb 6 03:28:28 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 6 Feb 2004 02:28:28 -0600 (CST) Subject: [SciPy-user] Re: need help with gcc error (Digest, Vol 5, Issue 7) In-Reply-To: <84d44854ec.854ec84d44@ucmerced.edu> Message-ID: On Thu, 5 Feb 2004 ehaux at ucmerced.edu wrote: > Also following up from Jan 27... > > I get the following when using this script: > > :/usr/lib/python2.3/site-packages/scipy$ python setup.py > /usr/bin/python scipy_core/setup.py > SciPy Core Version 0.2.1_alpha_40.1448 > usage: setup.py [global_opts] cmd1 [cmd1_opts] [cmd2 [cmd2_opts] ...] > or: setup.py --help [cmd1 cmd2 ...] > or: setup.py --help-commands > or: setup.py cmd --help > > error: no commands supplied ^^^^^^^^^^^^^^^^^^^^^^^^^^^ Read: http://www.python.org/doc/current/inst/inst.html Pearu From christoph.ortner at comlab.ox.ac.uk Fri Feb 6 05:19:35 2004 From: christoph.ortner at comlab.ox.ac.uk (Christoph Ortner) Date: Fri, 06 Feb 2004 10:19:35 +0000 Subject: [SciPy-user] Sparse Matrices Message-ID: <40236A37.8030307@comlab.ox.ac.uk> Hi all, Is there a chance that scipy will have support for sparse matrices in the near future? (e.g. pysparse -> http://people.web.psi.ch/geus/pyfemax/) With the advantage, that a complete, (hopefully optimized) package exists, optimization routines should be made able to use sparse Hessians, etc. Christoph From bob at redivi.com Fri Feb 6 17:50:44 2004 From: bob at redivi.com (Bob Ippolito) Date: Fri, 6 Feb 2004 17:50:44 -0500 Subject: [SciPy-user] H E L P !!! PLEASE In-Reply-To: <000f01c3e76c$c560de50$52806750@CASA> References: <000f01c3e76c$c560de50$52806750@CASA> Message-ID: On Jan 30, 2004, at 3:08 PM, José A Martín H wrote: > IDLE 1.0.2
> >>> import scipy > > Traceback (most recent call last): > File "", line 1, in -toplevel- > import scipy > File "C:\Python23\Lib\site-packages\scipy\__init__.py", line 11, in > -toplevel- > from scipy_base import * > ImportError: No module named scipy_base > >>> > > i have upgraded to python 2.3 and downloaded the last binaries for > scipy !!!. > I'm not a Windows user, but I have heard that this is a separate download (scipy_core maybe?). In any case, your best bet might be to download the Enthought Python 2.3 distribution; I believe that it includes both PIL and SciPy. -bob From pearu at scipy.org Fri Feb 6 19:17:46 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 6 Feb 2004 18:17:46 -0600 (CST) Subject: [SciPy-user] H E L P !!! PLEASE In-Reply-To: <000f01c3e76c$c560de50$52806750@CASA> Message-ID: On Fri, 30 Jan 2004, José A Martín H wrote: > IDLE 1.0.2 > >>> import scipy > > Traceback (most recent call last): > File "", line 1, in -toplevel- > import scipy > File "C:\Python23\Lib\site-packages\scipy\__init__.py", line 11, in -toplevel- > from scipy_base import * > ImportError: No module named scipy_base > >>> > > i have upgraded to python 2.3 and downloaded the last binaries for scipy !!!. You'll need to install scipy_core. A few days old snapshot is available at http://cens.ioc.ee/~pearu/scipy/ Pearu From oliphant at ee.byu.edu Fri Feb 6 07:49:23 2004 From: oliphant at ee.byu.edu (Travis E. Oliphant) Date: Fri, 06 Feb 2004 05:49:23 -0700 Subject: [SciPy-user] Sparse Matrices In-Reply-To: <40236A37.8030307@comlab.ox.ac.uk> References: <40236A37.8030307@comlab.ox.ac.uk> Message-ID: <40238D53.5010408@ee.byu.edu> Christoph Ortner wrote: > Hi all, > > Is there a chance that scipy will have support for sparse matrices > in the near future? > (e.g. pysparse -> http://people.web.psi.ch/geus/pyfemax/) It already does, but currently the automatic build on Windows is turned off (I'm not sure it builds on windows).
Sparse support on Linux works, however, though it's not well tested. -Travis From pearu at scipy.org Sat Feb 7 13:24:45 2004 From: pearu at scipy.org (Pearu Peterson) Date: Sat, 7 Feb 2004 12:24:45 -0600 (CST) Subject: [SciPy-user] Sparse Matrices In-Reply-To: <40238D53.5010408@ee.byu.edu> Message-ID: On Fri, 6 Feb 2004, Travis E. Oliphant wrote: > Christoph Ortner wrote: > > Hi all, > > > > Is there a chance that scipy will have support for sparse matrices > > in the near future? > > (e.g. pysparse -> http://people.web.psi.ch/geus/pyfemax/) > > It already does, but currently the automatic build on Windows is turned > off (I'm not sure it builds on windows). > scipy.sparse does not build on Windows - with mingw because some of the sources in SuperLU2.0 use sys/times.h, for instance, that is not available on non-posix system like Win32. This issue can be fixed by implementing sys/times.h features for windows. - with cygwin because for some reasons _S and _C (if I remember correctly) in the definition of Dtype_t enum (in supermatrix.h) are defined by compiler and thus causing syntax error. This issue can be fixed by renaming Dtype_t items and replacing the corresponding occurrences elsewhere with the new names. Pearu From christoph.ortner at comlab.ox.ac.uk Sat Feb 7 18:55:35 2004 From: christoph.ortner at comlab.ox.ac.uk (Christoph Ortner) Date: Sat, 07 Feb 2004 23:55:35 +0000 Subject: [SciPy-user] Sparse Matrices In-Reply-To: References: Message-ID: <40257AF7.1010101@comlab.ox.ac.uk> Pearu Peterson wrote: >On Fri, 6 Feb 2004, Travis E. Oliphant wrote: > > > >>Christoph Ortner wrote: >> >> >>>Hi all, >>> >>>Is there a chance that scipy will have support for sparse matrices >>>in the near future? >>>(e.g. pysparse -> http://people.web.psi.ch/geus/pyfemax/) >>> >>> >>It already does, but currently the automatic build on Windows is turned >>off (I'm not sure it builds on windows). 
>> >> >> > >scipy.sparse does not build on Windows >- with mingw because some of the sources in SuperLU2.0 use sys/times.h, > for instance, that is not available on non-posix system like Win32. > This issue can be fixed by implementing sys/times.h features > for windows. >- with cygwin because for some reasons _S and _C (if I remember correctly) > in the definition of Dtype_t enum (in supermatrix.h) are defined > by compiler and thus causing syntax error. This issue can be fixed by > renaming Dtype_t items and replacing the corresponding occurrences > elsewhere with the new names. > >Pearu > > Do you mean the pysparse package of Roman Geus? I built it both under cygwin and VC6.0 without any problems. I didn't succeed in mingw though, because I wasn't able to create the python23.a. From paxcalpt at sapo.pt Sat Feb 7 20:02:19 2004 From: paxcalpt at sapo.pt (Ricardo Henriques) Date: Sun, 8 Feb 2004 01:02:19 -0000 Subject: [SciPy-user] How to make a Gaussian fit? Message-ID: <01ae01c3eddf$35dc8440$0100a8c0@xcal> Is it possible to make a Gaussian fit to histogram data using Scipy? Tks everyone... Paxcal From pearu at scipy.org Sun Feb 8 02:07:08 2004 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 8 Feb 2004 01:07:08 -0600 (CST) Subject: [SciPy-user] Sparse Matrices In-Reply-To: <40257AF7.1010101@comlab.ox.ac.uk> Message-ID: On Sat, 7 Feb 2004, Christoph Ortner wrote: > Pearu Peterson wrote: > >scipy.sparse does not build on Windows ^^^^^ > Do you mean the pysparse package of Roman Geus? No. See ^^^ above. > I built it both under cygwin and VC6.0 without any problems. > I didn't succeed in mingw though, because I wasn't able to > create the python23.a. Hint: when using scipy_distutils instead of distutils, libpython23.a will be created for you. Also, google'ing gives a number of references how to create this file.
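Ricardo's Gaussian-fit question above never gets an answer in this digest. One common approach is to estimate the Gaussian's parameters from the histogram's moments; the sketch below is plain Python (the function and variable names are mine, not SciPy's), and in the SciPy of the period scipy.optimize.leastsq could then refine such an estimate by nonlinear least squares:

```python
import math

def gaussian_moments(centers, counts):
    """Estimate (amplitude, mu, sigma) of a Gaussian from histogram data.

    `centers` are bin centers, `counts` the bin counts.  Moment estimates
    like these are a common starting point for a least-squares refinement.
    """
    total = float(sum(counts))
    mu = sum(c * x for x, c in zip(centers, counts)) / total
    var = sum(c * (x - mu) ** 2 for x, c in zip(centers, counts)) / total
    sigma = math.sqrt(var)
    width = centers[1] - centers[0]          # assumes equal-width bins
    amplitude = total * width / (sigma * math.sqrt(2 * math.pi))
    return amplitude, mu, sigma

# a symmetric, roughly Gaussian histogram centred on 0.0
centers = [-2.0, -1.0, 0.0, 1.0, 2.0]
counts = [5, 25, 40, 25, 5]
amp, mu, sigma = gaussian_moments(centers, counts)
```

For noisy data the moment estimates are only a starting point, but they are usually close enough for an iterative fitter to converge.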
Pearu > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From chris at fonnesbeck.org Sun Feb 8 13:15:01 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Sun, 8 Feb 2004 13:15:01 -0500 Subject: [SciPy-user] extra_compile_args in weave Message-ID: I am trying to write a GRASS GIS module (ANSI C) using weave.inline, so that I can interface my python simulation code with a GIS to get spatial information. As such, I need some GRASS header files. However, I am having trouble setting the GRASS include directory for weave. Here is a very simple example that simply initializes the GIS: import os,weave 'Set GRASS environment variables' os.putenv('GISBASE','/usr/local/grass57') os.putenv('GISRC','$HOME/.grassrc57') os.putenv('PATH','$PATH:$GISBASE/bin:$GISBASE/scripts') def test(foo): code = """ #include "gis.h" #include <stdio.h> #include <string.h> #include <math.h> G_gisinit(argv[0]); return foo; """ weave.inline(code,[foo], extra_compile_args=["-I/usr/local/grass57/include"]) if __name__=='__main__': test(3) However, when I run this, I get: Traceback (most recent call last): File "grasstest.py", line 27, in ?
test(3) File "grasstest.py", line 23, in test extra_compile_args=["-I/usr/local/grass57/include"]) File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/weave/inline_tools.py", line 335, in inline auto_downcast = auto_downcast, File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/weave/inline_tools.py", line 421, in compile_function type_converters = type_converters) File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/weave/ext_tools.py", line 176, in __init__ auto_downcast, type_converters) File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/weave/ext_tools.py", line 391, in assign_variable_types errors[var] = ("The type and dimensionality specifications" + TypeError: cannot concatenate 'str' and 'int' objects It doesnt seem to like the extra_compile_args. How else am I to set this include directory? Thanks, Chris -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From prabhu at aero.iitm.ernet.in Sun Feb 8 13:36:53 2004 From: prabhu at aero.iitm.ernet.in (Prabhu Ramachandran) Date: Mon, 9 Feb 2004 00:06:53 +0530 Subject: [SciPy-user] extra_compile_args in weave In-Reply-To: References: Message-ID: <16422.33221.862765.834906@monster.linux.in> >>>>> "CF" == Christopher Fonnesbeck writes: CF> I am trying to write a GRASS GIS module (ANSI C) using CF> weave.inline, so that I can interface my python simulation CF> code with a GIS to get spatial information. As such, I need CF> some GRASS header files. However, I am having trouble setting CF> the GRASS include directory for weave. 
Here is a very simple CF> example that simply initializes the GIS: Try something like this: def test(foo): code = """ G_gisinit(argv[0]); return foo; """ weave.inline(code, ['foo'], include_dirs=["/usr/local/grass57/include"], headers=["gis.h", "stdio.h", "string.h", "math.h"]) All the options available can be seen here: http://www.scipy.org/site_content/weave/tutorial.html in the Keyword options section. I'm not sure your os.putenv statements are going to be of any use. IIRC these changes do not propagate to your shell. cheers, prabhu From chris at fonnesbeck.org Sun Feb 8 13:50:18 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Sun, 8 Feb 2004 13:50:18 -0500 Subject: [SciPy-user] extra_compile_args in weave In-Reply-To: <16422.33221.862765.834906@monster.linux.in> References: <16422.33221.862765.834906@monster.linux.in> Message-ID: On Feb 8, 2004, at 1:36 PM, Prabhu Ramachandran wrote: >>>>>> "CF" == Christopher Fonnesbeck writes: > > CF> I am trying to write a GRASS GIS module (ANSI C) using > CF> weave.inline, so that I can interface my python simulation > CF> code with a GIS to get spatial information. As such, I need > CF> some GRASS header files. However, I am having trouble setting > CF> the GRASS include directory for weave. Here is a very simple > CF> example that simply initializes the GIS: > > Try something like this: > > def test(foo): > code = """ > G_gisinit(argv[0]); > > return foo; > """ > > weave.inline(code, ['foo'], > include_dirs=["/usr/local/grass57/include"], > headers=["gis.h", "stdio.h", "string.h", > "math.h"]) > Thanks for the correction. However, I still get a similar error message: Traceback (most recent call last): File "grasstest.py", line 19, in ? 
test(3) File "grasstest.py", line 15, in test headers=["gis.h", "stdio.h", "string.h", "math.h"]) File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/weave/inline_tools.py", line 335, in inline auto_downcast = auto_downcast, File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/weave/inline_tools.py", line 421, in compile_function type_converters = type_converters) File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/weave/ext_tools.py", line 176, in __init__ auto_downcast, type_converters) File "/System/Library/Frameworks/Python.framework/Versions/2.3/lib/ python2.3/site-packages/weave/ext_tools.py", line 391, in assign_variable_types errors[var] = ("The type and dimensionality specifications" + TypeError: cannot concatenate 'str' and 'int' objects I am using a recent CVS build of scipy (i.e. since weave was integrated into the scipy code base). C. -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From prabhu at aero.iitm.ernet.in Mon Feb 9 12:45:39 2004 From: prabhu at aero.iitm.ernet.in (Prabhu Ramachandran) Date: Mon, 9 Feb 2004 23:15:39 +0530 Subject: [SciPy-user] extra_compile_args in weave In-Reply-To: References: <16422.33221.862765.834906@monster.linux.in> Message-ID: <16423.51011.511913.836045@monster.linux.in> >>>>> "CF" == Christopher Fonnesbeck writes: CF> On Feb 8, 2004, at 1:36 PM, Prabhu Ramachandran wrote: [...] >> weave.inline(code, ['foo'], ^^^^^ Note the quotes! Your original code did not have the quotes and just had [foo] which is incorrect. I suspect (read: guess) that is your problem. 
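A note for readers hitting the same TypeError: weave.inline expects the *names* of the Python variables as strings, so [foo] fails where ['foo'] works. The toy function below imitates that lookup to show why; it is an illustration of the mistake, not weave's actual code:

```python
def resolve(arg_names, namespace):
    # simplified stand-in for weave.inline's variable lookup: each entry
    # must be the *name* of a Python variable, as a string
    if not all(isinstance(n, str) for n in arg_names):
        raise TypeError("pass variable names as strings, e.g. ['foo'], not [foo]")
    return [namespace[n] for n in arg_names]

foo = 3
assert resolve(['foo'], {'foo': foo}) == [3]   # correct: the name, quoted
try:
    resolve([foo], {'foo': foo})               # the mistake behind the traceback above
except TypeError:
    pass                                       # weave instead trips while formatting its error
```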
prabhu From jeremy.gore at yale.edu Mon Feb 9 13:53:04 2004 From: jeremy.gore at yale.edu (Jeremy Gore) Date: Mon, 9 Feb 2004 13:53:04 -0500 Subject: [SciPy-user] scipy OS X compiling on a G3 Message-ID: <31850D8C-5B31-11D8-8720-000A95843714@yale.edu> I've tried all sorts of combinations of flags and so forth to get scipy to compile on OS X 10.3.2, but it still won't work for me. With CFLAGS="-lcc_dynamic -framework vecLib" (and I've also tried with -faltivec and -framework Accelerate) set and using sudo python setup.py install (and have also tried with the build build_ext first) for the CVS version, I get error messages along the lines of: In the build.py phase: package init file '/Users/jmg75/Programming/scipy/scipy_core/scipy_distutils/tests/ __init__.py' not found (or not a regular file) package init file '/Users/jmg75/Programming/scipy/scipy_core/scipy_base/tests/ __init__.py' not found (or not a regular file) In the install_data phase: fftw_info: NOT AVAILABLE dfftw_info: NOT AVAILABLE atlas_info: scipy_distutils.system_info.atlas_info NOT AVAILABLE blas_info: NOT AVAILABLE Clearly it is not finding the vecLib BLAS libraries. Whether this is because I'm on a G3 or some other problem I don't know; I was under the impression that although not optimized for altivec the vecLib libraries still processed for G3s. Any suggestions? My system: Reading specs from /usr/local/lib/gcc/powerpc-apple-darwin7.2.0/3.4.0/specs Configured with: ../gcc/configure --enable-threads=posix --enable-languages=f77 Thread model: posix gcc version 3.4.0 20040107 (experimental) Reading specs from /usr/libexec/gcc/darwin/ppc/3.3/specs Thread model: posix gcc version 3.3 20030304 (Apple Computer, Inc. build 1495) on the run_all.py test for f2py: Traceback (most recent call last): File "/Users/jmg75/Downloads/Expanded/F2PY-2.37.233-1545/tests/f90/ return_integer.py", line 136, in ? 
test_functions = build(f2py_opts) File "/Users/jmg75/Downloads/Expanded/F2PY-2.37.233-1545/tests/f90/ return_integer.py", line 16, in build assert not f2py2e.compile('''\ AssertionError TEST FAILURE (status=1) -- Jeremy Gore 1515 State St New Haven, CT 06511 203-530-9157 From pearu at scipy.org Mon Feb 9 14:37:03 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 9 Feb 2004 13:37:03 -0600 (CST) Subject: [SciPy-user] scipy OS X compiling on a G3 In-Reply-To: <31850D8C-5B31-11D8-8720-000A95843714@yale.edu> Message-ID: On Mon, 9 Feb 2004, Jeremy Gore wrote: > I've tried all sorts of combinations of flags and so forth to get scipy > to compile on OS X 10.3.2, but it still won't work for me. With > CFLAGS="-lcc_dynamic -framework vecLib" (and I've also tried with > -faltivec and -framework Accelerate) set and using sudo python setup.py > install (and have also tried with the build build_ext first) for the > CVS version, I get error messages along the lines of: > > In the build.py phase: > package init file > '/Users/jmg75/Programming/scipy/scipy_core/scipy_distutils/tests/ > __init__.py' not found (or not a regular file) > package init file > '/Users/jmg75/Programming/scipy/scipy_core/scipy_base/tests/ > __init__.py' not found (or not a regular file) > > In the install_data phase: > fftw_info: > NOT AVAILABLE > dfftw_info: > NOT AVAILABLE > atlas_info: > scipy_distutils.system_info.atlas_info > NOT AVAILABLE > blas_info: > NOT AVAILABLE > > Clearly it is not finding the vecLib BLAS libraries. Whether this is > because I'm on a G3 or some other problem I don't know; I was under the > impression that although not optimized for altivec the vecLib libraries > still processed for G3s. Any suggestions? Detecting vecLib BLAS libraries is not implemented in system_info.py yet. Could you send me the information how one can detect vecLib OSX and I'll add its support to system_info.py? E.g. 
which files it should look for, or which programs to run and what should be the expected output, the values of os.name and sys.platform, etc. Pearu From chris at fonnesbeck.org Mon Feb 9 14:35:33 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Mon, 9 Feb 2004 14:35:33 -0500 Subject: [SciPy-user] scipy OS X compiling on a G3 In-Reply-To: References: Message-ID: <206F34D4-5B37-11D8-A63F-000A956FDAC0@fonnesbeck.org> On Feb 9, 2004, at 2:37 PM, Pearu Peterson wrote: >> Clearly it is not finding the vecLib BLAS libraries. Whether this is >> because I'm on a G3 or some other problem I don't know; I was under >> the >> impression that although not optimized for altivec the vecLib >> libraries >> still processed for G3s. Any suggestions? > > Detecting vecLib BLAS libraries is not implemented in system_info.py > yet. > Could you send me the information how one can detect vecLib OSX > and I'll add its support to system_info.py? E.g. which files it should > look for, or which programs to run and what should be the expected > output, > the values of os.name and sys.platform, etc. Adding the following compile flag does the trick for me: -framework vecLib C., -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From pearu at scipy.org Mon Feb 9 14:45:20 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 9 Feb 2004 13:45:20 -0600 (CST) Subject: [SciPy-user] scipy OS X compiling on a G3 In-Reply-To: <206F34D4-5B37-11D8-A63F-000A956FDAC0@fonnesbeck.org> Message-ID: On Mon, 9 Feb 2004, Christopher Fonnesbeck wrote: > > On Feb 9, 2004, at 2:37 PM, Pearu Peterson wrote: > > >> Clearly it is not finding the vecLib BLAS libraries. Whether this is > >> because I'm on a G3 or some other problem I don't know; I was under > >> the > >> impression that although not optimized for altivec the vecLib > >> libraries > >> still processed for G3s. Any suggestions? 
> > > > Detecting vecLib BLAS libraries is not implemented in system_info.py > > yet. > > Could you send me the information how one can detect vecLib OSX > > and I'll add its support to system_info.py? E.g. which files it should > > look for, or which programs to run and what should be the expected > > output, > > the values of os.name and sys.platform, etc. > > Adding the following compile flag does the trick for me: > > -framework vecLib Ok but will it work on all OSX systems? Pearu From chris at fonnesbeck.org Mon Feb 9 14:54:21 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Mon, 9 Feb 2004 14:54:21 -0500 Subject: [SciPy-user] scipy OS X compiling on a G3 In-Reply-To: References: Message-ID: On Feb 9, 2004, at 2:45 PM, Pearu Peterson wrote: >> >>>> Clearly it is not finding the vecLib BLAS libraries. Whether this >>>> is >>>> because I'm on a G3 or some other problem I don't know; I was under >>>> the >>>> impression that although not optimized for altivec the vecLib >>>> libraries >>>> still processed for G3s. Any suggestions? >>> >>> Detecting vecLib BLAS libraries is not implemented in system_info.py >>> yet. >>> Could you send me the information how one can detect vecLib OSX >>> and I'll add its support to system_info.py? E.g. which files it >>> should >>> look for, or which programs to run and what should be the expected >>> output, >>> the values of os.name and sys.platform, etc. >> >> Adding the following compile flag does the trick for me: >> >> -framework vecLib > > Ok but will it work on all OSX systems? I believe so, as long as the developer tools are installed: http://developer.apple.com/documentation/ReleaseNotes/MacOSX/vecLib.html C. -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . 
o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From pearu at scipy.org Mon Feb 9 15:07:30 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 9 Feb 2004 14:07:30 -0600 (CST) Subject: [SciPy-user] scipy OS X compiling on a G3 In-Reply-To: Message-ID: On Mon, 9 Feb 2004, Christopher Fonnesbeck wrote: > >> Adding the following compile flag does the trick for me: > >> > >> -framework vecLib > > > > Ok but will it work on all OSX systems? > > I believe so, as long as the developer tools are installed: > > http://developer.apple.com/documentation/ReleaseNotes/MacOSX/vecLib.html So, could you confirm that the following approach is correct: if os.path.exists('/System/Library/Frameworks/vecLib.framework/'): # use the following compiler flags: # -faltivec -framework vecLib # and no additional libraries need to be specified when linking. ? Thanks, Pearu From chris at fonnesbeck.org Mon Feb 9 15:23:25 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Mon, 9 Feb 2004 15:23:25 -0500 Subject: [SciPy-user] scipy OS X compiling on a G3 In-Reply-To: References: Message-ID: On Feb 9, 2004, at 3:07 PM, Pearu Peterson wrote: > So, could you confirm that the following approach is correct: > > if os.path.exists('/System/Library/Frameworks/vecLib.framework/'): > # use the following compiler flags: > # -faltivec -framework vecLib > # and no additional libraries need to be specified when linking. I see no reason why that shouldn't work ... C. -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . 
o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From bob at redivi.com Mon Feb 9 20:07:28 2004 From: bob at redivi.com (Bob Ippolito) Date: Mon, 9 Feb 2004 20:07:28 -0500 Subject: [SciPy-user] scipy OS X compiling on a G3 In-Reply-To: References: Message-ID: <7F189200-5B65-11D8-BF45-000A95686CD8@redivi.com> On Feb 9, 2004, at 3:23 PM, Christopher Fonnesbeck wrote: > > On Feb 9, 2004, at 3:07 PM, Pearu Peterson wrote: > >> So, could you confirm that the following approach is correct: >> >> if os.path.exists('/System/Library/Frameworks/vecLib.framework/'): >> # use the following compiler flags: >> # -faltivec -framework vecLib >> # and no additional libraries need to be specified when linking. > > I see no reason why that shouldn't work ... If you don't have developer tools installed, you probably don't have a compiler anyway. You should link to Accelerate.framework if it exists, and vecLib otherwise. -bob -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 2357 bytes Desc: not available URL: From chris at fonnesbeck.org Mon Feb 9 23:29:52 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Mon, 9 Feb 2004 23:29:52 -0500 Subject: [SciPy-user] forcing linking in weave Message-ID: I need to link against a particular set of libraries for a weave.inline extension I am making. However, on compile, I get messages saying that linking is not done. How can I force linking with weave? I have specified include_dirs, library_dirs, libraries and headers, but to no avail. Thanks for any advice, C. -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From scipy at zunzun.com Tue Feb 10 05:08:35 2004 From: scipy at zunzun.com (James R. 
Phillips) Date: Tue, 10 Feb 2004 05:08:35 -500 Subject: [SciPy-user] forcing linking in weave References: Message-ID: <4028ada3934f13.47178150@mercury.sabren.com> > I need to link against a particular set of libraries for a weave.inline > extension I am making. How can I force linking with weave? The version of the tutorial distributed with weave appears to be more up-to-date than the online version. It is located in the Lib/site-packages/weave/doc directory of your Python install. Look for the "Distutils keywords" section, where it refers to lists of strings as in option = ['string1', 'string2'] For example, I use extra_compile_args = ['-O3'] From the *offline* tutorial: inline() also accepts a number of distutils keywords for controlling how the code is compiled. library_dirs [string] list of directories to search for C/C++ libraries at link time libraries [string] list of library names (not filenames or paths) to link against runtime_library_dirs [string] list of directories to search for C/C++ libraries at run time (for shared extensions, this is when the extension is loaded) extra_objects [string] list of extra files to link with (eg. object files not implied by 'sources', static library that must be explicitly specified, binary resource files, etc.) extra_compile_args [string] any extra platform- and compiler-specific information to use when compiling the source files in 'sources'. For platforms and compilers where "command line" makes sense, this is typically a list of command-line arguments, but for other platforms it could be anything. extra_link_args [string] any extra platform- and compiler-specific information to use when linking object files together to create the extension (or to create a new static Python interpreter). Similar interpretation as for 'extra_compile_args'.
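The original question (forcing a particular set of libraries into a weave.inline build) maps onto those keywords roughly as below. This is only a sketch: the library name and paths are invented, and weave itself must be installed for the commented-out call to compile anything.

```python
# Hypothetical values -- substitute your own library name and paths.
build_kwargs = {
    'library_dirs': ['/opt/mylib/lib'],          # -L search paths at link time
    'libraries': ['mylib'],                      # linked as -lmylib
    'runtime_library_dirs': ['/opt/mylib/lib'],  # search paths when the module loads
    'extra_compile_args': ['-O3'],
}

# With weave installed, the keywords pass straight through to distutils:
# from scipy import weave
# result = weave.inline('return_val = 2 * x;', ['x'], x=21, **build_kwargs)
```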
James Phillips http://zunzun.com From pearu at scipy.org Tue Feb 10 08:12:11 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 10 Feb 2004 07:12:11 -0600 (CST) Subject: [SciPy-user] scipy OS X compiling on a G3 In-Reply-To: <7F189200-5B65-11D8-BF45-000A95686CD8@redivi.com> Message-ID: On Mon, 9 Feb 2004, Bob Ippolito wrote: > > On Feb 9, 2004, at 3:23 PM, Christopher Fonnesbeck wrote: > > > > > On Feb 9, 2004, at 3:07 PM, Pearu Peterson wrote: > > > >> So, could you confirm that the following approach is correct: > >> > >> if os.path.exists('/System/Library/Frameworks/vecLib.framework/'): > >> # use the following compiler flags: > >> # -faltivec -framework vecLib > >> # and no additional libraries need to be specified when linking. > > > > I see no reason why that shouldn't work ... > > If you don't have developer tools installed, you probably don't have a > compiler anyway. > > You should link to Accelerate.framework if it exists, and vecLib > otherwise. As you may have noticed then I am ignorant what comes to OS X. So, how your last remark applies to my suggestion? Is the following correct: if os.path.exists('/System/Library/Frameworks/Accelerate.framework/'): # use the following compiler flags: # -faltivec -framework Accelerate # and no additional libraries need to be specified when linking. elif os.path.exists('/System/Library/Frameworks/vecLib.framework/'): # use the following compiler flags: # -faltivec -framework vecLib # and no additional libraries need to be specified when linking. ? 
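Pearu's two-branch proposal can be packaged as a small helper along these lines. This is only a sketch: the framework paths are the macOS defaults quoted in the thread, and the `exists` parameter is a hypothetical hook added here so the branching can be exercised on non-OSX machines.

```python
import os

def veclib_compile_flags(exists=os.path.exists):
    """Pick compile flags for the preferred BLAS framework, if any."""
    if exists('/System/Library/Frameworks/Accelerate.framework/'):
        return ['-faltivec', '-framework', 'Accelerate']
    if exists('/System/Library/Frameworks/vecLib.framework/'):
        return ['-faltivec', '-framework', 'vecLib']
    return []  # neither framework present; fall back to other BLAS detection
```

As Bob notes below, Accelerate is preferred when present and vecLib is the fallback.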
Thanks, Pearu From bob at redivi.com Tue Feb 10 08:26:24 2004 From: bob at redivi.com (Bob Ippolito) Date: Tue, 10 Feb 2004 08:26:24 -0500 Subject: [SciPy-user] scipy OS X compiling on a G3 In-Reply-To: References: Message-ID: On Feb 10, 2004, at 8:12 AM, Pearu Peterson wrote: > > > On Mon, 9 Feb 2004, Bob Ippolito wrote: > >> >> On Feb 9, 2004, at 3:23 PM, Christopher Fonnesbeck wrote: >> >>> >>> On Feb 9, 2004, at 3:07 PM, Pearu Peterson wrote: >>> >>>> So, could you confirm that the following approach is correct: >>>> >>>> if os.path.exists('/System/Library/Frameworks/vecLib.framework/'): >>>> # use the following compiler flags: >>>> # -faltivec -framework vecLib >>>> # and no additional libraries need to be specified when linking. >>> >>> I see no reason why that shouldn't work ... >> >> If you don't have developer tools installed, you probably don't have a >> compiler anyway. >> >> You should link to Accelerate.framework if it exists, and vecLib >> otherwise. > > As you may have noticed then I am ignorant what comes to OS X. So, how > your last remark applies to my suggestion? Is the following correct: > > if os.path.exists('/System/Library/Frameworks/Accelerate.framework/'): > # use the following compiler flags: > # -faltivec -framework Accelerate > # and no additional libraries need to be specified when linking. > elif os.path.exists('/System/Library/Frameworks/vecLib.framework/'): > # use the following compiler flags: > # -faltivec -framework vecLib > # and no additional libraries need to be specified when linking. > ? That is correct as far as I know. -bob -------------- next part -------------- A non-text attachment was scrubbed... 
From pearu at scipy.org Tue Feb 10 11:46:45 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 10 Feb 2004 10:46:45 -0600 (CST) Subject: [SciPy-user] "Building Scipy on Windows" notes available Message-ID: Hi all, Detailed "from scratch" notes about "Building Scipy on Windows" are available at http://cens.ioc.ee/~pearu/scipy/BUILD_WIN32.html It also contains a section on building the Win32 port of the Atlas libraries, which seems to be a major show-stopper for Windows people, and indeed, I found it a rather sensitive subject too. Hopefully these notes will help Windows people to overcome the threshold of building Scipy themselves and enjoy the most recent Scipy features from CVS. Best regards, Pearu From chris at fonnesbeck.org Tue Feb 10 14:35:41 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Tue, 10 Feb 2004 14:35:41 -0500 Subject: [SciPy-user] zero-size graphic files from scipy.gplt Message-ID: <4FE357AF-5C00-11D8-A63F-000A956FDAC0@fonnesbeck.org> I am using scipy.gplt to generate plots of simulation time series. Every so often, however, plots that are saved to file using output() are of zero size, even though the plot looks fine on screen. Has anyone run into this problem before? There are no error messages, so this is a tough one to debug. scipy cvs build on OSX 10.3 with python 2.3.3 Thanks, Chris -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From paxcalpt at sapo.pt Wed Feb 11 14:50:45 2004 From: paxcalpt at sapo.pt (Ricardo Henriques) Date: Wed, 11 Feb 2004 19:50:45 -0000 Subject: [SciPy-user] Where is my error when using scipy.optimize.leastsq Message-ID: <001101c3f0d8$5effcb10$3eb6c151@xcal> Hi... Does anyone know how I can get the error (R**2) when using the scipy.optimize.leastsq fitting? Tks everyone...
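On the R**2 question: leastsq only returns the fitted parameters (plus optional extra output), so the coefficient of determination has to be computed from the residuals by hand. A generic sketch with invented data, since no model was given; the residuals at the solution are exactly what the objective function handed to leastsq returns, so subtracting them from y gives y_fit.

```python
import numpy as np

def r_squared(y, y_fit):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y - y_fit) ** 2)        # residual sum of squares
    ss_tot = np.sum((y - np.mean(y)) ** 2)   # total sum of squares
    return 1.0 - ss_res / ss_tot

y = np.array([1.0, 2.0, 3.1, 3.9])       # invented observations
y_fit = np.array([1.1, 1.9, 3.0, 4.0])   # invented model values at the solution
print(r_squared(y, y_fit))               # close to 1 for a good fit
```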
Paxcal From nwagner at mecha.uni-stuttgart.de Fri Feb 13 07:30:36 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 13 Feb 2004 13:30:36 +0100 Subject: [SciPy-user] det(A-\lambda B) is identically zero Message-ID: <402CC36C.89C2480F@mecha.uni-stuttgart.de> Hi all, Please find attached a small example for a singular pencil A-\lambda B. det(A-\lambda B) is identically zero. However, linalg.eig(A,B) computes certain eigenvalues, although \lambda can be chosen arbitrarily. There should be at least a warning in such a case. Any comment or suggestion? Nils from scipy import * A = array(( [0,7,-1,1], [4,2,3,1], [2,0,0,2], [6,9,2,4])) B = array(( [3,4,1,2], [-1,3,5,-6], [1,0,0,1], [15,7,6,9])) print 'A' print A print 'B' print B # # Eigenvalue problem # w,vr = linalg.eig(A,B) for i in w: print i # # det(A-\lambda B) is identically zero # for l in arange(0,10.,0.1): d = linalg.det(A-l*B) print l,d From pearu at scipy.org Fri Feb 13 08:56:15 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 13 Feb 2004 07:56:15 -0600 (CST) Subject: [SciPy-user] det(A-\lambda B) is identically zero In-Reply-To: <402CC36C.89C2480F@mecha.uni-stuttgart.de> Message-ID: On Fri, 13 Feb 2004, Nils Wagner wrote: > Please find attached a small example for a singular pencil > A-\lambda B. det(A-\lambda B) is identically zero. > However, linalg.eig(A,B) computes certain eigenvalues. And one of them is inf. That should make a user careful when manipulating the (A,B) eigendata. > although \lambda can be chosen arbitrarily. Well, if lambda is small then det is indeed close to a small number. However, note that >>> linalg.det(A-1e5*B) -966.29150970670446 for instance. > There should be at least a warning in such a case. I think it is a computer numerics issue rather than a scipy one. What would be the criterion when such a message should be displayed?
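Nils's claim is easy to spot-check numerically for moderate \lambda, and Pearu's large-\lambda counterexample shows where rounding takes over. A sketch using the modern numpy namespace instead of the 2004 `from scipy import *` style:

```python
import numpy as np

A = np.array([[0., 7., -1., 1.],
              [4., 2., 3., 1.],
              [2., 0., 0., 2.],
              [6., 9., 2., 4.]])
B = np.array([[3., 4., 1., 2.],
              [-1., 3., 5., -6.],
              [1., 0., 0., 1.],
              [15., 7., 6., 9.]])

# For small lambda the determinant stays at rounding-error level,
# consistent with det(A - lambda*B) being identically zero ...
for lam in (0.0, 0.5, 1.0, 2.0):
    print(lam, np.linalg.det(A - lam * B))

# ... while for huge lambda cancellation errors dominate, which is
# Pearu's point about det(A - 1e5*B) coming out far from zero.
```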
Pearu From h.jansen at fel.tno.nl Fri Feb 13 10:08:27 2004 From: h.jansen at fel.tno.nl (H Jansen) Date: Fri, 13 Feb 2004 16:08:27 +0100 Subject: [SciPy-user] ... object has no attribute 'zgelss' Message-ID: <402CE86B.8050605@fel.tno.nl> Many Lapack routines can't be found when I do: import scipy scipy.test(level=1) For instance: ====================================================================== ERROR: check_random_complex_exact (test_basic.test_lstsq) ---------------------------------------------------------------------- Traceback (most recent call last): File "/opt/inst/python/2.3/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 346, in check_random_complex_exact x = lstsq(a,b)[0] File "/opt/inst/python/2.3/lib/python2.3/site-packages/scipy/linalg/basic.py", line 281, in lstsq gelss, = get_lapack_funcs(('gelss',),(a1,b1)) File "/opt/inst/python/2.3/lib/python2.3/site-packages/scipy/linalg/lapack.py", line 55, in get_lapack_funcs func = getattr(m2,func_name) AttributeError: 'module' object has no attribute 'zgelss' ... << more of the same, all related to lapack subroutines >> When I do "nm liblapack.a | grep zgelss" code for "zgelss_"; likewise with all other routines. I'm using ATLAS-3.6.0 and my liblapack is complete (appr. 6.8 Mb, routines added from CLAPACK). Apparently, there's an interface mismatch between ATLAS and scipy (SciPy-0.2.0_alpha_200.4161). The scipy source builds with the command python setup.py build build_ext `pkg-config atlas --cflags --libs` resulting in the linalg shared modules being linked with -L/opt/src/math/numerical/matrix/atlas/current/lib/Linux_P4SSE2 -llapack -lf77blas -lcblas -latlas -lg2c Some help will be highly appreciated! Thanks.
-- Henk -- ------------------------------------------------------------------------------ The disclaimer that applies to e-mail from TNO Physics and Electronics Laboratory can be found on: http://www.tno.nl/disclaimer/email.html ------------------------------------------------------------------------------ From nwagner at mecha.uni-stuttgart.de Fri Feb 13 10:16:58 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 13 Feb 2004 16:16:58 +0100 Subject: [SciPy-user] Problems with latest cvs Message-ID: <402CEA6A.3C229E0F@mecha.uni-stuttgart.de> Hi all, I have some trouble with latest cvs Traceback (most recent call last): File "qr.py", line 79, in ? eval=linalg.eigvals(A) File "/usr/local/lib/python2.3/site-packages/scipy_base/ppimport.py", line 243, in __getattr__ module = self._ppimport_importer() File "/usr/local/lib/python2.3/site-packages/scipy_base/ppimport.py", line 216, in _ppimport_importer module = __import__(name,None,None,['*']) File "/usr/local/lib/python2.3/site-packages/scipy/linalg/__init__.py", line 8, in ? from basic import * File "/usr/local/lib/python2.3/site-packages/scipy/linalg/basic.py", line 12, in ? from flinalg import get_flinalg_funcs File "/usr/local/lib/python2.3/site-packages/scipy/linalg/flinalg.py", line 7, in ? from scipy_distutils.misc_util import PostponedException File "/usr/local/lib/python2.3/site-packages/scipy_distutils/__init__.py", line 14, in ? assert not sys.modules.has_key('distutils.ccompiler'),\ AssertionError: distutils has been imported before scipy_distutils >>> scipy.__version__ '0.2.1_253.3711' >>> Nils From pearu at scipy.org Fri Feb 13 10:23:30 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 13 Feb 2004 09:23:30 -0600 (CST) Subject: [SciPy-user] ... object has no attribute 'zgelss' In-Reply-To: <402CE86B.8050605@fel.tno.nl> Message-ID: On Fri, 13 Feb 2004, H Jansen wrote: > Many Lapack routines can't be found when I do: > > import scipy > scipy.test(level=1) > > For instance: > ... 
<< more of the same, all related to lapack subroutines >> > > When I do "nm liblapack.a | grep zgelss" code for "zgelss_"; ??? So what is the result then? Are zgelss_ and others defined or not? > likewise > with all other routines. I'm using ATLAS-3.6.0 and my liblapack is > complete (appr. 6.8 Mb, routines added from CLAPACK). CLAPACK??? Unless CLAPACK provides symbol names with '_' suffix, you should complete ATLAS liblapack.a with Fortran LAPACK. > Apparently, > there's an interface mismatch between ATLAS and scipy > (SciPy-0.2.0_alpha_200.4161). > > The scipy source builds with the command > > python setup.py build build_ext `pkg-config atlas --cflags --libs` > > resulting in the linalg shared modules being linked with > > -L/opt/src/math/numerical/matrix/atlas/current/lib/Linux_P4SSE2 > -llapack -lf77blas -lcblas -latlas -lg2c I suspect that scipy setup finds a different atlas/lapack on your system than the one you specify with `pkg-config`. Try building scipy as follows rm -rf build export ATLAS=/opt/src/math/numerical/matrix/atlas/current/lib/Linux_P4SSE2 python setup.py build HTH, Pearu From pearu at scipy.org Fri Feb 13 10:30:17 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 13 Feb 2004 09:30:17 -0600 (CST) Subject: [SciPy-user] Problems with latest cvs In-Reply-To: <402CEA6A.3C229E0F@mecha.uni-stuttgart.de> Message-ID: On Fri, 13 Feb 2004, Nils Wagner wrote: > I have some trouble with latest cvs > > assert not sys.modules.has_key('distutils.ccompiler'),\ > AssertionError: distutils has been imported before scipy_distutils So, have you imported distutils before scipy_distutils or scipy? The fix depends on the answer..
Thanks, Pearu From nwagner at mecha.uni-stuttgart.de Fri Feb 13 10:28:03 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 13 Feb 2004 16:28:03 +0100 Subject: [SciPy-user] Problems with latest cvs References: Message-ID: <402CED03.864AD1B2@mecha.uni-stuttgart.de> Pearu Peterson schrieb: > > On Fri, 13 Feb 2004, Nils Wagner wrote: > > > I have some trouble with latest cvs > > > > assert not sys.modules.has_key('distutils.ccompiler'),\ > > AssertionError: distutils has been imported before scipy_distutils > > So, have you imported distutils before scipy_distutils or scipy? > > The fix depends on the answer.. > > Thanks, > Pearu Pearu, just from scipy import * Nils > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From pearu at scipy.org Fri Feb 13 10:46:58 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 13 Feb 2004 09:46:58 -0600 (CST) Subject: [SciPy-user] Problems with latest cvs In-Reply-To: <402CED03.864AD1B2@mecha.uni-stuttgart.de> Message-ID: On Fri, 13 Feb 2004, Nils Wagner wrote: > > So, have you imported distutils before scipy_distutils or scipy? > > The fix depends on the answer.. > > just > > from scipy import * Strange, the site.py or something has imported distutils for you. It does not happen here. Anyway, a fix is in CVS. You'll only need to reinstall scipy_core. Pearu From nwagner at mecha.uni-stuttgart.de Fri Feb 13 10:51:46 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 13 Feb 2004 16:51:46 +0100 Subject: [SciPy-user] Problems with latest cvs References: Message-ID: <402CF292.850BA75E@mecha.uni-stuttgart.de> Pearu Peterson schrieb: > > On Fri, 13 Feb 2004, Nils Wagner wrote: > > > > So, have you imported distutils before scipy_distutils or scipy? > > > The fix depends on the answer.. > > > > just > > > > from scipy import * > > Strange, the site.py or something has imported distutils for you.
> It does not happen here. Anyway, a fix is in CVS. You'll only need to > reinstall scipy_core. > > Pearu Sorry, but the problem is still there. Nils Traceback (most recent call last): File "qr.py", line 89, in ? eval=linalg.eigvals(A) File "/usr/local/lib/python2.3/site-packages/scipy_base/ppimport.py", line 243, in __getattr__ module = self._ppimport_importer() File "/usr/local/lib/python2.3/site-packages/scipy_base/ppimport.py", line 216, in _ppimport_importer module = __import__(name,None,None,['*']) File "/usr/local/lib/python2.3/site-packages/scipy/linalg/__init__.py", line 8, in ? from basic import * File "/usr/local/lib/python2.3/site-packages/scipy/linalg/basic.py", line 12, in ? from flinalg import get_flinalg_funcs File "/usr/local/lib/python2.3/site-packages/scipy/linalg/flinalg.py", line 7, in ? from scipy_distutils.misc_util import PostponedException File "/usr/local/lib/python2.3/site-packages/scipy_distutils/__init__.py", line 14, in ? assert not sys.modules.has_key('distutils.ccompiler'),\ AssertionError: distutils has been imported before scipy_distutils From pearu at scipy.org Fri Feb 13 11:02:05 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 13 Feb 2004 10:02:05 -0600 (CST) Subject: [SciPy-user] Problems with latest cvs In-Reply-To: <402CF292.850BA75E@mecha.uni-stuttgart.de> Message-ID: On Fri, 13 Feb 2004, Nils Wagner wrote: > Sorry, but the problem is still there. Impossible.. > File > "/usr/local/lib/python2.3/site-packages/scipy_distutils/__init__.py", > line 14, in ? > assert not sys.modules.has_key('distutils.ccompiler'),\ > AssertionError: distutils has been imported before scipy_distutils In the CVS version of __init__.py, line #14 contains a comment and the assert statement is on line #15. So, you have failed to update scipy_core.
Pearu From h.jansen at fel.tno.nl Fri Feb 13 10:58:47 2004 From: h.jansen at fel.tno.nl (H Jansen) Date: Fri, 13 Feb 2004 16:58:47 +0100 Subject: [SciPy-user] ... object has no attribute 'zgelss' In-Reply-To: References: Message-ID: <1076687927.27613.16.camel@pc50002760.fel.tno.nl> On Fri, 2004-02-13 at 16:23, Pearu Peterson wrote: > On Fri, 13 Feb 2004, H Jansen wrote: > > > Many Lapack routines can't be found when I do: > > > > import scipy > > scipy.test(level=1) > > > > For instance: > > > ... << more of the same, all related to lapack subroutines >> > > > > When I do "nm liblapack.a | grep zgelss" code for "zgelss_"; > > ??? So what is the result then? Are zgelss_ and others defined or not? > > > likewise > > with all other routines. I'm using ATLAS-3.6.0 and my liblapack is > > complete (appr. 6.8 Mb, routines added from CLAPACK). > > CLAPACK??? Unless CLAPACK provides symbol names with '_' suffix, you > should complete ATLAS liblapack.a with Fortran LAPACK. With Fortran Lapack everything now works fine! Thanks! Henk. > > Apparently, > > there's an interface mismatch between ATLAS and scipy > > (SciPy-0.2.0_alpha_200.4161). > > > > The scipy source builts with the command > > > > python setup.py build build_ext `pkg-config atlas --cflags --libs` > > > > resulting the linalg shared modules to be linked as > > > > -L/opt/src/math/numerical/matrix/atlas/current/lib/Linux_P4SSE2 > > -llapack -lf77blas -lcblas -latlas -lg2c > > I suspect that scipy setup finds different atlas/lapack from you system > than that you specify with `pkd-config`. 
Try building scipy as follows > > rm -rf build > export ATLAS=/opt/src/math/numerical/matrix/atlas/current/lib/Linux_P4SSE2 > python setup.py build > > HTH, > Pearu -- Henk Jansen TNO Physics and Electronics Laboratory From j_r_fonseca at yahoo.co.uk Sun Feb 15 14:16:18 2004 From: j_r_fonseca at yahoo.co.uk (=?iso-8859-1?Q?Jos=E9?= Fonseca) Date: Sun, 15 Feb 2004 19:16:18 +0000 (UTC) Subject: [SciPy-user] New Scipy Debian packages - Testers wanted! Message-ID: New Scipy & F2PY Debian packages are available at . These no longer have Pentium 4 specific code; should compile perfectly from sources; and no longer get any warning from lintian. Note that chaco is no longer included, though. They should be almost ready for inclusion in the official Debian repository, but before that I'd like them to be more widely tested. So please test and let me know (privately) if they work ok for you! Many thanks to Pearu Peterson for all his efforts in making the packaging much easier. Jose Fonseca From arnd.baecker at web.de Mon Feb 16 10:26:29 2004 From: arnd.baecker at web.de (Arnd Baecker) Date: Mon, 16 Feb 2004 16:26:29 +0100 (CET) Subject: [SciPy-user] scipy.xplt.ghelp missing. Message-ID: Hi, after a recent CVS download scipy.xplt.ghelp is missing. I tracked this down to a removed "from helpmod import help as ghelp" in gist.py (see below).
Of course I think it would be great if the xplt documentation gets integrated into the docstrings so that tools like pydoc can make use of it (also it would be one `help` type command less to `teach` to the students). For the commands themselves this works fine, e.g. scipy.xplt.ghelp("plg") vs. help("scipy.xplt.plg") but for the keywords, e.g. scipy.xplt.ghelp("type") I can't find any equivalent with the "normal" help. So, did I just do a checkout during some reorganization of the help for scipy.xplt, or should the `from helpmod import help as ghelp` be put back into gist.py? Many thanks, Arnd __version__ = "$Id: gist.py,v 1.15 2003/06/25 11:02:01 travo Exp $" from gistC import * from helpmod import help as ghelp from pydoc import help __version__ = "1.5.18" from gistC import * from pydoc import help From scipy at zunzun.com Mon Feb 16 15:13:00 2004 From: scipy at zunzun.com (James R. Phillips) Date: Mon, 16 Feb 2004 15:13:00 -500 Subject: [SciPy-user] Night of the Living Wgnuplot.exe Message-ID: <4031244c256b35.29068471@mercury.sabren.com> I have a lot of undying wgnuplot.exe processes on my XP box that I must manually kill; they appear each time I use scipy. Has anyone else seen this? James Phillips http://zunzun.com From jdhunter at ace.bsd.uchicago.edu Tue Feb 17 09:19:21 2004 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Tue, 17 Feb 2004 08:19:21 -0600 Subject: [SciPy-user] ANN: matplotlib-0.50 - python plotting Message-ID: matplotlib is a python 2D plotting library which produces publication quality figures in a variety of hardcopy formats (PNG, JPG, TIFF, PS) and interactive GUI environments (WX, GTK) across platforms. matplotlib can be used in python scripts, interactively from the python shell (ala matlab or mathematica), in web application servers generating dynamic charts, or embedded in GTK or WX applications.
http://matplotlib.sourceforge.net What's new in matplotlib 0.50 Antigrain backend: Agg Adding to the growing list of image backends is Antigrain -- http://antigrain.com. This is a backend written mostly in extension code and is the fastest of all the image backends. Agg supports freetype fonts, antialiased drawing, alpha blending, and much more. The windows installer contains everything you need except Numeric to use the agg backend out of the box; for other platforms see http://matplotlib.sourceforge.net/backends.html#Agg Paint/libart backend David Moore wrote a backend for pypaint, a libart wrapper. libart is a high quality, cross platform image renderer that supports antialiased lines, freetype fonts, and other capabilities to soon be exploited. Thanks David! See http://matplotlib.sourceforge.net/backends.html#Paint for more information and install instructions The Matplotlib FAQ Matplotlib now has a FAQ -- http://matplotlib.sourceforge.net/faq.html Alpha channel attribute All the figure elements now have an alpha attribute to allow blending and translucency. Not all backends are currently capable of supporting alpha - currently only Agg, but Paint should be able to support this soon - see the scatter screenshot for an example of alpha at work http://matplotlib.sourceforge.net/screenshots.html#scatter_demo2 Table class added John Gill has developed a very nice Table class and table function that plays well with bar charts and stacked bar charts. See example code and screenshot table_demo at http://matplotlib.sourceforge.net/screenshots.html#table_demo New plot commands cla and clf Clear the current axes or figure. Useful in interactive plotting from a python shell GD module on win32 With much weeping and gnashing of teeth and help from half the people on this globe, I built a gdmodule win32 installer. Special thanks to Stefan Kuzminski for putting up with my endless windows confusions.
See the win32 quickstart at installing the GD backend - http://matplotlib.sourceforge.net/backends.html#GDWIN32 GD supports clipping and antialiased line drawing See instructions about upgrading gd and gdmodule at Installing the GD backend. The line object has a new 'antialiased' property, that if True, the backend will render the line antialiased if supported. Note antialiased drawing under GD is slow, so be sure to turn the property off set(lines, 'antialiased', False) if you experience performance problems. If you need performance and antialiasing, use the agg backend. Wild and wonderful bar charts You can provide an optional argument bottom to the bar command to determine where the bottom of each bar is, default 0 for all. This enables stacked bar plots and candlestick plots -- examples/bar_stacked.py. Thanks to David Moore and John Gill for suggestions and code. Figure backend refactored The figure functionality was split into a backend independent component Figure and a backend dependent component FigureCanvasBase. This completes the transition to a totally abstract figure interface and improves the ability to switch backends and attach a figure to multiple backends.
See API_CHANGES for information on migrating applications to the new API at http://matplotlib.sourceforge.net/API_CHANGES Tons of bug fixes and optimizations detailed at http://matplotlib.sourceforge.net/whats_new.html From chris at fonnesbeck.org Tue Feb 17 09:58:18 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Tue, 17 Feb 2004 09:58:18 -0500 Subject: [SciPy-user] fftpack fails on cvs build (OSX) Message-ID: After updating from cvs and building, I get the following error: gcc options: "-fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -faltivec" compile options: '-DSCIPY_FFTW_H -I/usr/local/include -Ibuild/src -I/System/Library/Frameworks/Python.framework/Versions/2.3/include/ python2.3 -c' /usr/local/bin/g77 build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/Lib/fftpack/ _fftpackmodule.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/drfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zrfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfftnd.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/fortranobject.o -L/usr/local/lib -L/usr/local/lib/gcc/powerpc-apple-darwin7.0.0/3.4 -Lbuild/temp.darwin-7.2.0-Power_Macintosh-2.3 -ldfftpack -lrfftw -lfftw -lg2c -o build/lib.darwin-7.2.0-Power_Macintosh-2.3/scipy/fftpack/_fftpack.so /usr/bin/ld: Undefined symbols: _PyArg_ParseTuple _PyArg_ParseTupleAndKeywords _PyCObject_AsVoidPtr _PyCObject_Type _PyComplex_Type _PyDict_GetItemString _PyDict_SetItemString _PyErr_Clear _PyErr_NewException _PyErr_Occurred _PyErr_Print _PyErr_SetString _PyImport_ImportModule _PyInt_Type _PyModule_GetDict _PyNumber_Int _PyObject_GetAttrString _PySequence_Check _PySequence_GetItem _PyString_FromString _PyString_Type _PyType_IsSubtype _PyType_Type _Py_BuildValue _Py_FatalError _Py_InitModule4 __Py_NoneStruct 
_PyDict_DelItemString _PyDict_New _PyErr_Format _PyExc_AttributeError _PyExc_RuntimeError _PyExc_TypeError _PyObject_Free _PyString_ConcatAndDel _Py_FindMethod __PyObject_New restFP saveFP _MAIN__ collect2: ld returned 1 exit status error: Command "/usr/local/bin/g77 build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/Lib/fftpack/ _fftpackmodule.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/drfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zrfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfftnd.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/fortranobject.o -L/usr/local/lib -L/usr/local/lib/gcc/powerpc-apple-darwin7.0.0/3.4 -Lbuild/temp.darwin-7.2.0-Power_Macintosh-2.3 -ldfftpack -lrfftw -lfftw -lg2c -o build/lib.darwin-7.2.0-Power_Macintosh-2.3/scipy/fftpack/_fftpack.so" failed with exit status 1 Not sure why -- worked fine before the update. C. -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . 
o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From bob at redivi.com Tue Feb 17 10:09:40 2004 From: bob at redivi.com (Bob Ippolito) Date: Tue, 17 Feb 2004 10:09:40 -0500 Subject: [SciPy-user] fftpack fails on cvs build (OSX) In-Reply-To: References: Message-ID: <4F62AE0C-615B-11D8-B0CA-000A95686CD8@redivi.com> On Feb 17, 2004, at 9:58 AM, Christopher Fonnesbeck wrote: > After updating from cvs and building, I get the following error: > > gcc options: "-fno-strict-aliasing -Wno-long-double -no-cpp-precomp > -mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 -Wall > -Wstrict-prototypes -faltivec" > compile options: '-DSCIPY_FFTW_H -I/usr/local/include -Ibuild/src > -I/System/Library/Frameworks/Python.framework/Versions/2.3/include/ > python2.3 -c' > /usr/local/bin/g77 > build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/Lib/fftpack/ > _fftpackmodule.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/drfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zrfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfftnd.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/fortranobject.o > -L/usr/local/lib -L/usr/local/lib/gcc/powerpc-apple-darwin7.0.0/3.4 > -Lbuild/temp.darwin-7.2.0-Power_Macintosh-2.3 -ldfftpack -lrfftw > -lfftw -lg2c -o > build/lib.darwin-7.2.0-Power_Macintosh-2.3/scipy/fftpack/_fftpack.so > /usr/bin/ld: Undefined symbols: > _PyArg_ParseTuple > .... Extensions need to link to Python (-framework Python). 
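For reference, the Darwin-specific linker flags that this thread eventually settles on can be sketched in isolation. The helper below (`linker_options`) is a hypothetical illustration, not actual scipy_distutils code; the flags themselves are the ones quoted later in the exchange (the Python framework passed through g77 to ld via `-Wl,`, plus `-bundle` so ld builds a loadable bundle rather than an executable that would demand a `MAIN__`):

```python
# Hypothetical standalone sketch (not scipy_distutils API) of the
# platform-conditional g77 link options discussed in this thread.

def linker_options(platform):
    """Return extra g77 link options for building a Python extension module."""
    opt = []
    if platform == "darwin":
        # opt.append('-framework', 'Python') would be a TypeError:
        # list.append takes exactly one argument, hence extend()
        opt.extend(["-Wl,-framework", "-Wl,Python", "-lcc_dynamic", "-bundle"])
    return opt

print(linker_options("darwin"))
print(linker_options("linux2"))  # other platforms: no extra options
```

Using `extend` with a list (rather than a single space-joined string) matters because the options are passed to the compiler driver as separate argv entries.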
-bob From pearu at scipy.org Tue Feb 17 10:13:48 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 17 Feb 2004 09:13:48 -0600 (CST) Subject: [SciPy-user] fftpack fails on cvs build (OSX) In-Reply-To: Message-ID: On Tue, 17 Feb 2004, Christopher Fonnesbeck wrote: > After updating from cvs and building, I get the following error: > > gcc options: "-fno-strict-aliasing -Wno-long-double -no-cpp-precomp > -mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 -Wall > -Wstrict-prototypes -faltivec" > compile options: '-DSCIPY_FFTW_H -I/usr/local/include -Ibuild/src > -I/System/Library/Frameworks/Python.framework/Versions/2.3/include/ > python2.3 -c' > /usr/local/bin/g77 > build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/Lib/fftpack/ > _fftpackmodule.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/drfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zrfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfftnd.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/fortranobject.o > -L/usr/local/lib -L/usr/local/lib/gcc/powerpc-apple-darwin7.0.0/3.4 > -Lbuild/temp.darwin-7.2.0-Power_Macintosh-2.3 -ldfftpack -lrfftw -lfftw > -lg2c -o > build/lib.darwin-7.2.0-Power_Macintosh-2.3/scipy/fftpack/_fftpack.so Could you run this command with -shared option? Does it work? Btw, previous gnu fcompiler implementation had a comment "Darwin g77 cannot be used as a linker", is it really true and what could be the backround of this comment if this is true? 
Pearu From pearu at scipy.org Tue Feb 17 10:26:11 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 17 Feb 2004 09:26:11 -0600 (CST) Subject: [SciPy-user] fftpack fails on cvs build (OSX) In-Reply-To: Message-ID: On Tue, 17 Feb 2004, Christopher Fonnesbeck wrote: > After updating from cvs and building, I get the following error: > > gcc options: "-fno-strict-aliasing -Wno-long-double -no-cpp-precomp > -mno-fused-madd -fno-common -dynamic -DNDEBUG -g -O3 -Wall > -Wstrict-prototypes -faltivec" > compile options: '-DSCIPY_FFTW_H -I/usr/local/include -Ibuild/src > -I/System/Library/Frameworks/Python.framework/Versions/2.3/include/ > python2.3 -c' > /usr/local/bin/g77 > build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/Lib/fftpack/ > _fftpackmodule.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/drfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zrfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfftnd.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/fortranobject.o > -L/usr/local/lib -L/usr/local/lib/gcc/powerpc-apple-darwin7.0.0/3.4 > -Lbuild/temp.darwin-7.2.0-Power_Macintosh-2.3 -ldfftpack -lrfftw -lfftw > -lg2c -o > build/lib.darwin-7.2.0-Power_Macintosh-2.3/scipy/fftpack/_fftpack.so > /usr/bin/ld: Undefined symbols: > _PyArg_ParseTuple Could you update cvs and try again? I have added Bob's suggestion there. Let me know if it works or not. Thanks, Pearu From chris at fonnesbeck.org Tue Feb 17 10:36:27 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Tue, 17 Feb 2004 10:36:27 -0500 Subject: [SciPy-user] fftpack fails on cvs build (OSX) In-Reply-To: References: Message-ID: <0CB9A903-615F-11D8-86FB-000A956FDAC0@fonnesbeck.org> On Feb 17, 2004, at 10:26 AM, Pearu Peterson wrote: > Could you update cvs and try again? I have added Bob's suggestion > there. > Let me know if it works or not. 
You added: opt.append('-framework','Python') which gives an error, since append only takes 1 argument. Substituting: opt.append("-framework Python") gives the following error: g77: Python: No such file or directory C. -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From pearu at scipy.org Tue Feb 17 10:47:48 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 17 Feb 2004 09:47:48 -0600 (CST) Subject: [SciPy-user] fftpack fails on cvs build (OSX) In-Reply-To: <0CB9A903-615F-11D8-86FB-000A956FDAC0@fonnesbeck.org> Message-ID: On Tue, 17 Feb 2004, Christopher Fonnesbeck wrote: > > On Feb 17, 2004, at 10:26 AM, Pearu Peterson wrote: > > > Could you update cvs and try again? I have added Bob's suggestion > > there. > > Let me know if it works or not. > > You added: > > opt.append('-framework','Python') Oops. I meant what's below. > which gives an error, since append only takes 1 argument. Substituting: > > opt.append("-framework Python") > > gives the following error: > > g77: Python: No such file or directory Could you try: opt.extend(['-framework','Python']) ? Pearu From chris at fonnesbeck.org Tue Feb 17 11:35:12 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Tue, 17 Feb 2004 11:35:12 -0500 Subject: [SciPy-user] fftpack fails on cvs build (OSX) In-Reply-To: References: Message-ID: <41C9C8E1-6167-11D8-86FB-000A956FDAC0@fonnesbeck.org> On Feb 17, 2004, at 10:47 AM, Pearu Peterson wrote: >> >> On Feb 17, 2004, at 10:26 AM, Pearu Peterson wrote: >> >>> Could you update cvs and try again? I have added Bob's suggestion >>> there. >>> Let me know if it works or not. >> >> You added: >> >> opt.append('-framework','Python') > > Oops. I meant what's below. > >> which gives an error, since append only takes 1 argument. 
>> Substituting: >> >> opt.append("-framework Python") >> >> gives the following error: >> >> g77: Python: No such file or directory > > Could you try: > > opt.extend(['-framework','Python']) > Doesnt work ... it can't seem to find python. I believe the appropriate way to link frameworks with g77 with "-Wl,-framework -Wl,Python". Using opt.append("-Wl,-framework -Wl,Python -lcc_dynamic") I am able to eliminate most of the errors, except for the following: /usr/local/bin/g77 -Wl,-framework -Wl,Python -lcc_dynamic build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/Lib/fftpack/ _fftpackmodule.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/drfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zrfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfftnd.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/fortranobject.o -L/usr/local/lib -L/usr/local/lib/gcc/powerpc-apple-darwin7.0.0/3.4 -Lbuild/temp.darwin-7.2.0-Power_Macintosh-2.3 -ldfftpack -lrfftw -lfftw -lg2c -o build/lib.darwin-7.2.0-Power_Macintosh-2.3/scipy/fftpack/_fftpack.so /usr/bin/ld: Undefined symbols: _MAIN__ collect2: ld returned 1 exit status error: Command "/usr/local/bin/g77 -Wl,-framework -Wl,Python -lcc_dynamic build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/Lib/fftpack/ _fftpackmodule.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/drfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zrfft.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfftnd.o build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/fortranobject.o -L/usr/local/lib -L/usr/local/lib/gcc/powerpc-apple-darwin7.0.0/3.4 -Lbuild/temp.darwin-7.2.0-Power_Macintosh-2.3 -ldfftpack -lrfftw -lfftw -lg2c -o build/lib.darwin-7.2.0-Power_Macintosh-2.3/scipy/fftpack/_fftpack.so" failed with exit status 1 -- Christopher 
J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From bob at redivi.com Tue Feb 17 11:51:46 2004 From: bob at redivi.com (Bob Ippolito) Date: Tue, 17 Feb 2004 11:51:46 -0500 Subject: [SciPy-user] fftpack fails on cvs build (OSX) In-Reply-To: <41C9C8E1-6167-11D8-86FB-000A956FDAC0@fonnesbeck.org> References: <41C9C8E1-6167-11D8-86FB-000A956FDAC0@fonnesbeck.org> Message-ID: <9279BE5E-6169-11D8-B0CA-000A95686CD8@redivi.com> On Feb 17, 2004, at 11:35 AM, Christopher Fonnesbeck wrote: > > On Feb 17, 2004, at 10:47 AM, Pearu Peterson wrote: > >>> >>> On Feb 17, 2004, at 10:26 AM, Pearu Peterson wrote: >>> >>>> Could you update cvs and try again? I have added Bob's suggestion >>>> there. >>>> Let me know if it works or not. >>> >>> You added: >>> >>> opt.append('-framework','Python') >> >> Oops. I meant what's below. >> >>> which gives an error, since append only takes 1 argument. >>> Substituting: >>> >>> opt.append("-framework Python") >>> >>> gives the following error: >>> >>> g77: Python: No such file or directory >> >> Could you try: >> >> opt.extend(['-framework','Python']) >> > > Doesnt work ... it can't seem to find python. I believe the > appropriate way to link frameworks with g77 with "-Wl,-framework > -Wl,Python". 
Using > > opt.append("-Wl,-framework -Wl,Python -lcc_dynamic") > > I am able to eliminate most of the errors, except for the following: > > /usr/local/bin/g77 -Wl,-framework -Wl,Python -lcc_dynamic > build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/Lib/fftpack/ > _fftpackmodule.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/drfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zrfft.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfftnd.o > build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/fortranobject.o > -L/usr/local/lib -L/usr/local/lib/gcc/powerpc-apple-darwin7.0.0/3.4 > -Lbuild/temp.darwin-7.2.0-Power_Macintosh-2.3 -ldfftpack -lrfftw > -lfftw -lg2c -o > build/lib.darwin-7.2.0-Power_Macintosh-2.3/scipy/fftpack/_fftpack.so > /usr/bin/ld: Undefined symbols: > _MAIN__ It probably needs a -bundle, right now it's probably trying to link as an executable.. -bob From chris at fonnesbeck.org Tue Feb 17 12:17:42 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Tue, 17 Feb 2004 12:17:42 -0500 Subject: [SciPy-user] fftpack fails on cvs build (OSX) In-Reply-To: <9279BE5E-6169-11D8-B0CA-000A95686CD8@redivi.com> References: <41C9C8E1-6167-11D8-86FB-000A956FDAC0@fonnesbeck.org> <9279BE5E-6169-11D8-B0CA-000A95686CD8@redivi.com> Message-ID: <32027C9A-616D-11D8-86FB-000A956FDAC0@fonnesbeck.org> On Feb 17, 2004, at 11:51 AM, Bob Ippolito wrote: >>> >> >> Doesnt work ... it can't seem to find python. I believe the >> appropriate way to link frameworks with g77 with "-Wl,-framework >> -Wl,Python". 
Using >> >> opt.append("-Wl,-framework -Wl,Python -lcc_dynamic") >> >> I am able to eliminate most of the errors, except for the following: >> >> /usr/local/bin/g77 -Wl,-framework -Wl,Python -lcc_dynamic >> build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/Lib/fftpack/ >> _fftpackmodule.o >> build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfft.o >> build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/drfft.o >> build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zrfft.o >> build/temp.darwin-7.2.0-Power_Macintosh-2.3/Lib/fftpack/src/zfftnd.o >> build/temp.darwin-7.2.0-Power_Macintosh-2.3/build/src/fortranobject.o >> -L/usr/local/lib -L/usr/local/lib/gcc/powerpc-apple-darwin7.0.0/3.4 >> -Lbuild/temp.darwin-7.2.0-Power_Macintosh-2.3 -ldfftpack -lrfftw >> -lfftw -lg2c -o >> build/lib.darwin-7.2.0-Power_Macintosh-2.3/scipy/fftpack/_fftpack.so >> /usr/bin/ld: Undefined symbols: >> _MAIN__ > > It probably needs a -bundle, right now it's probably trying to link as > an executable.. > > -bob Right, that would do it. This appears to work: if sys.platform=='darwin': opt.extend(["-Wl,-framework","-Wl,Python","-lcc_dynamic","-bundle"]) -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From chris at fonnesbeck.org Tue Feb 17 13:22:12 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Tue, 17 Feb 2004 13:22:12 -0500 Subject: [SciPy-user] building fortran extensions in scipy Message-ID: <34AD4702-6176-11D8-86FB-000A956FDAC0@fonnesbeck.org> As I indicated in a previous post, I am trying to move some of the distributions pdf and likelihood methods to FORTRAN in order to speed them up for use in statistical simulation procedures (such as MCMC). 
I have put together one such extension, and added it to the list of modules in stats/setup_stats.py, as follows: # add futil and fdist modules sources = ['futil.f','fdist.f'] sources = [os.path.join(local_path,x) for x in sources] for x in sources: ext = Extension(dot_join(parent_package,package,x),sources) config['ext_modules'].append(ext) However, when I build scipy, I see that neither futil.so or fdist.so exist in the scipy installation. How can I persuade scipy to install them? Thanks, Chris -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From chris at fonnesbeck.org Tue Feb 17 14:11:07 2004 From: chris at fonnesbeck.org (Christopher Fonnesbeck) Date: Tue, 17 Feb 2004 14:11:07 -0500 Subject: [SciPy-user] building fortran extensions in scipy In-Reply-To: <34AD4702-6176-11D8-86FB-000A956FDAC0@fonnesbeck.org> References: <34AD4702-6176-11D8-86FB-000A956FDAC0@fonnesbeck.org> Message-ID: <0A2226E7-617D-11D8-86FB-000A956FDAC0@fonnesbeck.org> On Feb 17, 2004, at 1:22 PM, Christopher Fonnesbeck wrote: > As I indicated in a previous post, I am trying to move some of the > distributions pdf and likelihood methods to FORTRAN in order to speed > them up for use in statistical simulation procedures (such as MCMC). I > have put together one such extension, and added it to the list of > modules in stats/setup_stats.py, as follows: > > # add futil and fdist modules > sources = ['futil.f','fdist.f'] > sources = [os.path.join(local_path,x) for x in sources] > for x in sources: > ext = Extension(dot_join(parent_package,package,x),sources) > config['ext_modules'].append(ext) > > However, when I build scipy, I see that neither futil.so or fdist.so > exist in the scipy installation. How can I persuade scipy to install > them? > Ignore this ... I see my mistake. Sorry. -- Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k . 
o r g ) Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia From pearu at scipy.org Wed Feb 18 03:53:00 2004 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 18 Feb 2004 02:53:00 -0600 (CST) Subject: [SciPy-user] fftpack fails on cvs build (OSX) In-Reply-To: <32027C9A-616D-11D8-86FB-000A956FDAC0@fonnesbeck.org> Message-ID: On Tue, 17 Feb 2004, Christopher Fonnesbeck wrote:

> Right, that would do it. This appears to work:
>
> if sys.platform=='darwin':
>     opt.extend(["-Wl,-framework","-Wl,Python","-lcc_dynamic","-bundle"])

Thanks, it is in CVS now. Pearu From Jonathan.Peirce at nottingham.ac.uk Wed Feb 11 12:55:09 2004 From: Jonathan.Peirce at nottingham.ac.uk (Jon Peirce) Date: Wed, 11 Feb 2004 17:55:09 +0000 Subject: [SciPy-user] PIL and scipy conflicting Message-ID: <402A6C7D.40604@psychology.nottingham.ac.uk> Hi all, I have scipy (0.2.0_alpha_212.4128) and PIL (1.1.4) installed on a winXP machine and I'm getting a strange conflict. If scipy is imported second (using a clean namespace) then Image can no longer 'open':

# this works fine
import scipy
import Image
im = Image.open('somefile.jpg')

# this returns an error
import Image
import scipy
im = Image.open('somefile.jpg')

> File "C:\Python23\lib\site-packages\PIL\Image.py", line 1571, in open
>     raise IOError("cannot identify image file")
> IOError: cannot identify image file

Does anyone know why? I don't want just to go with the former - it's presumably doing some other damage to scipy that I haven't come across yet!
-- Jon Peirce Nottingham University +44 (0)115 8467176 (tel) +44 (0)115 9515324 (fax) http://www.psychology.nottingham.ac.uk/staff/jwp/ From pearu at cens.ioc.ee Thu Feb 19 07:01:28 2004 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Thu, 19 Feb 2004 14:01:28 +0200 (EET) Subject: [SciPy-user] scipy OS X compiling on a G3 In-Reply-To: <31850D8C-5B31-11D8-8720-000A95843714@yale.edu> Message-ID: Hi all, OSX users, could you try out the recent scipy_distutils for building the linalg module? It now uses '-framework Accelerate ..' stuff for atlas_info, but I cannot test this myself. Here are the basic steps:

Install the new scipy_core:
    cd scipy/scipy_core
    python setup.py install

Build the linalg module:
    cd scipy/Lib/linalg
    # undefine the ATLAS environment variable if any
    python setup_linalg.py build
    # pay attention to the availability of `lapack_opt_info`
    python tests/test_basic.py

Let me know if it works or not. Note that building scipy will not yet work without atlas libraries, as scipy.integrate still uses the old hooks; if someone gets scipy.linalg buildable under darwin, I'll finish scipy support for darwin-without-atlas asap. Regards, Pearu From covino at mi.astro.it Thu Feb 19 13:28:46 2004 From: covino at mi.astro.it (Stefano Covino) Date: Thu, 19 Feb 2004 19:28:46 +0100 Subject: [SciPy-user] function minimization with constraints Message-ID: <1077215326.6534.42.camel@sirrah.merate.mi.astro.it> Dear all, is there in the SciPy package a library to minimize a multi-variable function with constraints on each parameter? Thanks, Stefano From jose.martin at wanadoo.es Thu Feb 19 17:50:56 2004 From: jose.martin at wanadoo.es (=?iso-8859-1?Q?Jos=E9_A_Mart=EDn_H?=) Date: Thu, 19 Feb 2004 23:50:56 +0100 Subject: [SciPy-user] PIL and scipy conflicting References: <402A6C7D.40604@psychology.nottingham.ac.uk> Message-ID: <007201c3f73a$d72efad0$ea93253e@CASA> Yes, that's true. This error dates from... a long time ago, in a scipy far away... but it still happens today.
Like you, I import scipy first, and then Image...

----- Original Message ----- From: "Jon Peirce" To: Sent: Wednesday, February 11, 2004 6:55 PM Subject: [SciPy-user] PIL and scipy conflicting

> Hi all
>
> I have scipy (0.2.0_alpha_212.4128) and PIL (1.1.4) installed on a winXP
> machine and I'm getting a strange conflict. If scipy is imported second
> (using a clean namespace) then Image can no longer 'open':
>
> #this works fine
> import scipy
> import Image
> im = Image.open('somefile.jpg')
>
> #this returns error
> import Image
> import scipy
> im = Image.open('somefile.jpg')
>
> > File "C:\Python23\lib\site-packages\PIL\Image.py", line 1571, in open
> >     raise IOError("cannot identify image file")
> > IOError: cannot identify image file
>
> Does anyone know why? I don't want just to go with the former - its
> presumably doing some other damage to scipy that i haven't come across yet!
>
> --
> Jon Peirce
> Nottingham University
> +44 (0)115 8467176 (tel)
> +44 (0)115 9515324 (fax)
>
> http://www.psychology.nottingham.ac.uk/staff/jwp/
>
> _______________________________________________
> SciPy-user mailing list
> SciPy-user at scipy.net
> http://www.scipy.net/mailman/listinfo/scipy-user

From nwagner at mecha.uni-stuttgart.de Fri Feb 20 08:13:25 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 20 Feb 2004 14:13:25 +0100 Subject: [SciPy-user] Urgent: Bug in linalg.lu_factor linalg.lu_solve Message-ID: <403607F5.C043379@mecha.uni-stuttgart.de> Hi all, I have observed a very strange behaviour within linalg.lu_factor / linalg.lu_solve. Any idea or suggestion?
Nils

from scipy import *

def F(A,l):
    tmp = zeros((n,n),Complex)
    tmp[:n,:n] = A + l*identity(n) + exp(-l)*identity(n)
    return tmp

def Fs(l):
    tmp = zeros((n,n),Complex)
    tmp[:n,:n] = identity(n) - l*exp(-l)*identity(n)
    return tmp

def T(n):
    tmp = zeros((n,n),Complex)
    for i in arange(0,n):
        tmp[i,i] = -2.0
        if i < n-1:
            tmp[i+1,i] = 1.0
            tmp[i,i+1] = 1.0
    return tmp

n = 4
A = T(n)
gamma = 2.0+0j
rho = 2.0
ns = 180
#for i in arange(0,ns):
for i in arange(0,2):
    l = gamma + rho*exp(2.*pi*i*1j/ns)
    D1 = F(A,l)
    Ds = Fs(l)
    Dinv = linalg.inv(D1)
    lu, piv = linalg.lu_factor(D1,overwrite_a=0)
    tr = 0.0
    for k in arange(0,n):
        rhs = Ds[:,k]
        g1 = linalg.lu_solve((lu,piv),rhs,trans=0,overwrite_b=0)
        tr = tr + g1[k]
    #
    # tr and trace(dot(Dinv,Ds)) should be the same !!!!!
    #
    print tr, trace(dot(Dinv,Ds))

From nwagner at mecha.uni-stuttgart.de Fri Feb 20 08:19:01 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 20 Feb 2004 14:19:01 +0100 Subject: [SciPy-user] Bug in linalg.lu_factor linalg.lu_solve Message-ID: <40360945.8C842433@mecha.uni-stuttgart.de> Hi all, the problem with linalg.lu_factor and linalg.lu_solve seems to be restricted to complex matrices... Nils From nwagner at mecha.uni-stuttgart.de Fri Feb 20 08:56:30 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 20 Feb 2004 14:56:30 +0100 Subject: [SciPy-user] linalg.lu is Message-ID: <4036120E.5B9DE9B8@mecha.uni-stuttgart.de> Hi all, linalg.lu cannot handle complex equations, where the rhs is complex. Nils

from scipy import *
from RandomArray import *

n = 2
A = rand(n,n) + rand(n,n)*1j
X = zeros((n,n),Complex)
Ainv = linalg.inv(A)
print linalg.norm(dot(Ainv,A))
I = identity(n) + 0j*identity(n)
#
# linalg.solve cannot handle complex rhs !!
#
#I = I.real
#
for i in arange(0,n):
    X[:,i] = linalg.solve(A,I[:,i])
#
# Should be the inverse of A as well !!!!
#
print X
#
# Inverse of A
#
print Ainv

From elcortostrash at gmx.net Fri Feb 20 11:01:04 2004 From: elcortostrash at gmx.net (el corto) Date: Fri, 20 Feb 2004 17:01:04 +0100 (MET) Subject: [SciPy-user] install SciPy with cygwin and Python 2.3x Message-ID: <23254.1077292864@www45.gmx.net> I've downloaded and installed the latest version of cygwin which comes with Python 2.3.3. The install instructions at www.scipy.org/site_content/tutorials/install_binaries say to unpack the SciPy-binary for cygwin in /usr/lib/python2.3/site-packages/ which I did. The binary includes the folders

scipy
gui_thread
scipy_base
scipy_distutils

Since I'm new to the cygwin thing I really don't know if that was the whole installation (sry for this obviously stupid question). When I try to

>>> import scipy

I get error-messages like

File "/usr/lib/python2.3/site-packages/scipy/__init__.py", line 36, in ?
    from scipy_base import *
File "/usr/lib/python2.3/site-packages/scipy_base/__init__.py", line 110, in ?
    raise ImportError, mess
ImportError: dlopen: Win32 error 126

OK, now what does that mean ?? I really hope that someone can give me a little hint :-) thx, bye. From pearu at scipy.org Fri Feb 20 11:40:49 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 20 Feb 2004 10:40:49 -0600 (CST) Subject: [SciPy-user] install SciPy with cygwin and Python 2.3x In-Reply-To: <23254.1077292864@www45.gmx.net> Message-ID: On Fri, 20 Feb 2004, el corto wrote:

> I've downloaded and installed the latest version of cygwin which comes with
> Python 2.3.3. The install instructions at
> www.scipy.org/site_content/tutorials/install_binaries
>
> I get error-messages like
>
> File "/usr/lib/python2.3/site-packages/scipy/__init__.py", line 36, in ?
>     from scipy_base import *
> File "/usr/lib/python2.3/site-packages/scipy_base/__init__.py", line 110, in ?
> raise ImportError, mess > ImportError: dlopen: Win32 error 126 > > OK, now what does that mean ?? I really hope that someone can give me a > little hint :-) I cannot say what causes the above error but I suggest building scipy under cygwin yourself, it is not so hard. See http://cens.ioc.ee/~pearu/scipy/BUILD_WIN32.html HTH, Pearu From oliphant at ee.byu.edu Fri Feb 20 11:37:28 2004 From: oliphant at ee.byu.edu (Travis E. Oliphant) Date: Fri, 20 Feb 2004 09:37:28 -0700 Subject: [SciPy-user] PIL and scipy conflicting In-Reply-To: <402A6C7D.40604@psychology.nottingham.ac.uk> References: <402A6C7D.40604@psychology.nottingham.ac.uk> Message-ID: <403637C8.7030409@ee.byu.edu> Jon Peirce wrote: > Hi all > > I have scipy (0.2.0_alpha_212.4128) and PIL (1.1.4) installed on a winXP > machine and I'm getting a strange conflict. If scipy is imported second > (using a clean namespace) then Image can no longer 'open': > You may be interested to know that if you have PIL installed then scipy has some commands that make opening and saving images to and from Numeric arrays even easier. I have used these commands without incident . arr = imread('somefile.jpg') imsave('newfile.jpg',arr) One big difference is that the image is read in to a Numeric array (which of course an image should be thought of as anyway :-) ) Best, Travis O. From pearu at scipy.org Fri Feb 20 12:12:33 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 20 Feb 2004 11:12:33 -0600 (CST) Subject: [SciPy-user] linalg.lu is In-Reply-To: <4036120E.5B9DE9B8@mecha.uni-stuttgart.de> Message-ID: On Fri, 20 Feb 2004, Nils Wagner wrote: > linalg.lu cannot handle complex equations, where the rhs is complex. This is not generally true but you are right, something goes wrong. 
For example,

In [4]: from scipy.linalg import *
In [5]: a = [[1,1],[1,0]]   # ok
In [6]: x0 = solve(a,[1,0j])
In [7]: dot(a,x0)
Out[7]: array([ 1.+0.j, 0.+0.j])
In [8]: a = [[1,1],[1.2,0]]  # not ok
In [9]: x0 = solve(a,[1,0j])
In [10]: dot(a,x0)
Out[10]: array([ 0.83333333+0.j, 0. +0.j])

When I disable ATLAS, then the second case gives the correct answer. So, I suspect it is an ATLAS bug. I am using 3.6.0. What version of ATLAS are you using? Pearu From dr.nwagner at web.de Fri Feb 20 13:28:08 2004 From: dr.nwagner at web.de (Nils Wagner) Date: Fri, 20 Feb 2004 19:28:08 +0100 Subject: [SciPy-user] linalg.lu is Message-ID: <200402201828.i1KIS8Q32517@mailgate5.cinetic.de> SciPy Users List wrote on 20.02.04 18:06:43:

On Fri, 20 Feb 2004, Nils Wagner wrote:
> linalg.lu cannot handle complex equations, where the rhs is complex.

This is not generally true but you are right, something goes wrong. For example,

In [4]: from scipy.linalg import *
In [5]: a = [[1,1],[1,0]]   # ok
In [6]: x0 = solve(a,[1,0j])
In [7]: dot(a,x0)
Out[7]: array([ 1.+0.j, 0.+0.j])
In [8]: a = [[1,1],[1.2,0]]  # not ok
In [9]: x0 = solve(a,[1,0j])
In [10]: dot(a,x0)
Out[10]: array([ 0.83333333+0.j, 0. +0.j])

When I disable ATLAS, then the second case gives the correct answer. So, I suspect it is an ATLAS bug. I am using 3.6.0. What version of ATLAS are you using?

I am using 3.6 as well. How do I disable the usage of ATLAS? Did you send an email to the maintainer of ATLAS?
Nils Pearu From dr.nwagner at web.de Sat Feb 21 01:37:18 2004 From: dr.nwagner at web.de (Nils Wagner) Date: Sat, 21 Feb 2004 07:37:18 +0100 Subject: [SciPy-user] Disable ATLAS Message-ID: <200402210637.i1L6bIQ32423@mailgate5.cinetic.de> Hi all, How can I disable the usage of ATLAS in scipy? Nils From pearu at scipy.org Sat Feb 21 01:52:46 2004 From: pearu at scipy.org (Pearu Peterson) Date: Sat, 21 Feb 2004 00:52:46 -0600 (CST) Subject: [SciPy-user] Disable ATLAS In-Reply-To: <200402210637.i1L6bIQ32423@mailgate5.cinetic.de> Message-ID: On Sat, 21 Feb 2004, Nils Wagner wrote:

> Hi all,
>
> How can I disable the usage of ATLAS in scipy ?

There are several ways to do it, all are hackish currently. E.g. set atlas_info={} in the lapack_opt_info.calc_info method in the system_info.py file, or hide the location of the atlas libraries from system_info.py. Then you'll need Fortran blas and lapack installed. But considering the current issue, you can either try out different ATLAS versions or set force_clapack = 0 in the get_lapack_funcs() argument list. See the linalg/lapack.py file.
HTH, Pearu From dr.nwagner at web.de Sat Feb 21 08:27:17 2004 From: dr.nwagner at web.de (Nils Wagner) Date: Sat, 21 Feb 2004 14:27:17 +0100 Subject: [SciPy-user] Disable ATLAS Message-ID: <200402211327.i1LDRGQ23314@mailgate5.cinetic.de> SciPy Users List schrieb am 21.02.04 07:46:32: On Sat, 21 Feb 2004, Nils Wagner wrote: > Hi all, > > How can I disable the usage of ATLAS in scipy ? There are several ways to do it, all are hackish currently. E.g. set atlas_info={} in lapack_opt_info.calc_info method in system_info.py file or hide the location of atlas libraries from system_info.py. Then you'll need Fortran blas and lapack installed. But considering the current issue, you can either try out different ALTAS versions or set force_clapack = 0 in the get_lapack_funcs() argument list. See linalg/lapack.py file. I have disabled ATLAS. The problem with linalg.lu didn't vanish. What can we do ? from scipy import * from RandomArray import * n = 2 A = rand(n,n)+rand(n,n)*1j X = zeros((n,n),Complex) Ainv = linalg.inv(A) print linalg.norm(dot(Ainv,A)) I = identity(n)+identity(n)*0j # # linalg.solve cannot handle complex rhs !! # #I = I.real # for i in arange(0,n): r = I[:,i] X[:,i] = linalg.solve(A,r) # # Should be the inverse of A as well !!!! # print print 'X' print print X # # Inverse of A # print print 'inv(A)' print print Ainv print dot(A,Ainv) print dot(A,X) HTH, Pearu _______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user ______________________________________________________________________________ ... and the winner is... WEB.DE FreeMail! - Deutschlands beste E-Mail ist zum 39. 
Mal Testsieger (PC Praxis 03/04) http://f.web.de/?mc=021191 From pearu at scipy.org Sat Feb 21 08:40:56 2004 From: pearu at scipy.org (Pearu Peterson) Date: Sat, 21 Feb 2004 07:40:56 -0600 (CST) Subject: [SciPy-user] Disable ATLAS In-Reply-To: <200402211327.i1LDRGQ23314@mailgate5.cinetic.de> Message-ID: On Sat, 21 Feb 2004, Nils Wagner wrote: > I have disabled ATLAS. The problem with linalg.lu didn't vanish. > What can we do ? Are you absolutely sure that ATLAS is not being used (rebuilded linalg from scratch, removed earlier installations, eg `ldd blas.so` does not contain references to ATLAS symbols, etc etc)? Have you tried the two examples of mine from the earlier thread? Pearu From dr.nwagner at web.de Sat Feb 21 12:32:15 2004 From: dr.nwagner at web.de (Nils Wagner) Date: Sat, 21 Feb 2004 18:32:15 +0100 Subject: [SciPy-user] Disable ATLAS Message-ID: <200402211732.i1LHWFQ13708@mailgate5.cinetic.de> SciPy Users List schrieb am 21.02.04 14:35:12: On Sat, 21 Feb 2004, Nils Wagner wrote: > I have disabled ATLAS. The problem with linalg.lu didn't vanish. > What can we do ? Are you absolutely sure that ATLAS is not being used (rebuilded linalg from scratch, removed earlier installations, eg `ldd blas.so` does not contain references to ATLAS symbols, etc etc)? Yes. nwagner at linux:/usr/local/lib/python2.3/site-packages/scipy/linalg> ldd cblas.so libg2c.so.0 => /usr/lib/libg2c.so.0 (0x40024000) libm.so.6 => /lib/i686/libm.so.6 (0x40042000) libgcc_s.so.1 => /lib/libgcc_s.so.1 (0x40065000) libc.so.6 => /lib/i686/libc.so.6 (0x4006d000) /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x80000000) Have you tried the two examples of mine from the earlier thread? Yes. they work fine for me. but lu.py failed again Please can you run lu.py from scipy import * from RandomArray import * n = 2 A = rand(n,n)+rand(n,n)*1j X = zeros((n,n),Complex) Ainv = linalg.inv(A) R = identity(n)+identity(n)*0j # # linalg.solve cannot handle complex rhs !! 
#
#R = R.real
#
for i in arange(0,n):
    r = R[:,i]
    X[:,i] = linalg.solve(A,r)
#
# Should be the inverse of A as well !!!!
#
print
print 'X'
print
print X
#
# Inverse of A
#
print
print 'inv(A)'
print
print Ainv
print
print 'Identity ???'
print
print dot(A,Ainv)
print
print 'Identity ???'
print
print dot(A,X)

Thanks in advance

Nils

Pearu

From pearu at scipy.org Sat Feb 21 15:09:09 2004
From: pearu at scipy.org (Pearu Peterson)
Date: Sat, 21 Feb 2004 14:09:09 -0600 (CST)
Subject: [SciPy-user] Disable ATLAS
In-Reply-To: <200402211732.i1LHWFQ13708@mailgate5.cinetic.de>
Message-ID:

On Sat, 21 Feb 2004, Nils Wagner wrote:
> Please can you run lu.py

When I set force_clapack=0 then the output of lu.py is
---------------
X

[[ 1.28518872-0.09428482j -0.46708202-0.37104511j]
 [-0.31203795+0.19175932j  0.73496379-0.19732829j]]

inv(A)

[[ 0.91414361+0.3727972j  -0.46708202-0.37104511j]
 [-0.50936624-0.54320447j  0.73496379-0.19732829j]]

Identity ???

[[ 1.00000000e+00 +2.90295132e-17j -1.99086624e-17 -3.05202912e-17j]
 [ 7.09610322e-17 +1.95427442e-16j  1.00000000e-00 +2.33103467e-18j]]

Identity ???

[[ 1.00000000e+00 +1.72794721e-17j -1.99086624e-17 -3.05202912e-17j]
 [ -2.36085023e-17 +1.00000000e+00j  1.00000000e-00 +2.33103467e-18j]]
--------------
Everything is correct up to round-off errors. So, check again that ATLAS
is not being used. For the record, with force_clapack=1 the results are
incorrect.

As for a possible Atlas bug, I haven't created a simple C program that
would demonstrate the bug, which is necessary to file a bug report to
the Atlas people. Any takers?
Pearu

From dr.nwagner at web.de Sat Feb 21 16:57:20 2004
From: dr.nwagner at web.de (Nils Wagner)
Date: Sat, 21 Feb 2004 22:57:20 +0100
Subject: [SciPy-user] Disable ATLAS
Message-ID: <200402212157.i1LLvKQ24535@mailgate5.cinetic.de>

SciPy Users List wrote on 21.02.04 21:03:36:

On Sat, 21 Feb 2004, Nils Wagner wrote:
> Please can you run lu.py

When I set force_clapack=0 then the output of lu.py is
---------------
X

[[ 1.28518872-0.09428482j -0.46708202-0.37104511j]
 [-0.31203795+0.19175932j  0.73496379-0.19732829j]]

inv(A)

[[ 0.91414361+0.3727972j  -0.46708202-0.37104511j]
 [-0.50936624-0.54320447j  0.73496379-0.19732829j]]

Identity ???

[[ 1.00000000e+00 +2.90295132e-17j -1.99086624e-17 -3.05202912e-17j]
 [ 7.09610322e-17 +1.95427442e-16j  1.00000000e-00 +2.33103467e-18j]]

Identity ???

[[ 1.00000000e+00 +1.72794721e-17j -1.99086624e-17 -3.05202912e-17j]
 [ -2.36085023e-17 +1.00000000e+00j  1.00000000e-00 +2.33103467e-18j]]
----^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^----------

How about -2.36085023e-17 +1.00000000e+00j ?
The imaginary part should be zero !

Nils

Everything is correct up to round-off errors. So, check again that ATLAS
is not being used. For the record, with force_clapack=1 the results are
incorrect.

As for a possible Atlas bug, I haven't created a simple C program that
would demonstrate the bug, which is necessary to file a bug report to
the Atlas people. Any takers?

Pearu

From pearu at scipy.org Sat Feb 21 18:12:52 2004
From: pearu at scipy.org (Pearu Peterson)
Date: Sat, 21 Feb 2004 17:12:52 -0600 (CST)
Subject: [SciPy-user] Disable ATLAS
In-Reply-To: <200402212157.i1LLvKQ24535@mailgate5.cinetic.de>
Message-ID:

On Sat, 21 Feb 2004, Nils Wagner wrote:
> Identity ???
>
> [[ 1.00000000e+00 +1.72794721e-17j -1.99086624e-17 -3.05202912e-17j]
>  [ -2.36085023e-17 +1.00000000e+00j  1.00000000e-00 +2.33103467e-18j]]
> ----^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^----------
>
> How about -2.36085023e-17 +1.00000000e+00j ?
> The imaginary part should be zero !

Funny that I missed that, sorry.

I think I found the cause of these incorrect results. It seems to be due
to solve() ignoring the piv output from the gesv function.

Pearu

From pearu at scipy.org Sun Feb 22 04:53:18 2004
From: pearu at scipy.org (Pearu Peterson)
Date: Sun, 22 Feb 2004 03:53:18 -0600 (CST)
Subject: [SciPy-user] Incorrect results in linalg.solve and friends
In-Reply-To: <200402211732.i1LHWFQ13708@mailgate5.cinetic.de>
Message-ID:

Hi,

This is a heads-up for linalg crunchers. Incorrect results appear when
the following condition holds:

  an input array is a *slice* of a complex array.

> for i in arange(0,n):

For example, changing

> r = R[:,i]

to `r = R[:,i].copy()` will give correct results.

> X[:,i] = linalg.solve(A,r)

This gives a workaround to avoid this problem until it gets resolved.
Also, this is not due to some bug in Atlas but rather to copy_ND_array in
fortranobject.c or PyArray_CopyFromObject, but most probably
copy_ND_array.
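In today's terms the workaround looks like this (a sketch with modern NumPy; numpy.linalg does not go through the buggy 2004 f2py path, so the explicit copy() is purely an illustration of the pattern of handing the wrapped routine a contiguous array):

```python
import numpy as np

def solve_columns(A, R):
    """Solve A @ x = r for each column r of R, column by column."""
    X = np.zeros_like(R)
    for i in range(R.shape[1]):
        # the workaround: pass a contiguous *copy* of the 1-D complex
        # slice instead of the non-contiguous view R[:, i]
        r = R[:, i].copy()
        X[:, i] = np.linalg.solve(A, r)
    return X
```

With R the identity, X reproduces inv(A), which is exactly the check in Nils's lu.py script.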
Pearu

From chris at fonnesbeck.org Mon Feb 23 12:37:12 2004
From: chris at fonnesbeck.org (Christopher Fonnesbeck)
Date: Mon, 23 Feb 2004 12:37:12 -0500
Subject: [SciPy-user] distutils problem on cvs install under python 2.3
In-Reply-To: References: Message-ID:

On Feb 4, 2004, at 2:39 PM, Pearu Peterson wrote:
> On Wed, 4 Feb 2004, Christopher Fonnesbeck wrote:
>> When I attempt a build of the latest cvs on a clean Debian Linux
>> install of python2.3, I get this old error message:
>>
>> chris at redhead:/usr/src/scipy$ python setup.py build
>> Traceback (most recent call last):
>>   File "setup.py", line 25, in ?
>>     from scipy_distutils.core import setup
>>   File "scipy_core/scipy_distutils/core.py", line 1, in ?
>>     from distutils.core import *
>> ImportError: No module named distutils.core
>
> Could you verify that adding `import distutils` before line #1
> fixes this import error?
>
>> I thought this bug in distutils was squashed long ago. Am I missing
>> something here?
>
> Strange, I am on Debian too with Python 2.3.3 and I don't get this
> error. Btw, do you have write permissions to the
> /usr/src/scipy/scipy_core/scipy_distutils directory? I noticed that
> the traceback comes from a .py file and not from a .pyc file.

I have tried reinstalling python2.3 on Debian, but when I try to
import distutils, it tells me that there is no module named distutils!
It seems that the current testing/stable distributions of python2.3
are broken.

C.

--
Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k .
o r g )
Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia

From chris at fonnesbeck.org Mon Feb 23 16:21:22 2004
From: chris at fonnesbeck.org (Christopher Fonnesbeck)
Date: Mon, 23 Feb 2004 16:21:22 -0500
Subject: IGNORE [SciPy-user] distutils problem on cvs install under python 2.3
In-Reply-To: References: Message-ID: <3AE1A59C-6646-11D8-B3C2-000A956FDAC0@fonnesbeck.org>

On Feb 23, 2004, at 12:37 PM, Christopher Fonnesbeck wrote:
> I have tried reinstalling python2.3 on Debian, but when I try to
> import distutils, it tells me that there is no module named distutils!
> It seems that the current testing/stable distributions of python2.3
> are broken.

Ignore this; I didn't have the proper python-dev installed.

C.

--
Christopher J. Fonnesbeck ( c h r i s @ f o n n e s b e c k .
o r g )
Georgia Cooperative Fish & Wildlife Research Unit, University of Georgia

From nwagner at mecha.uni-stuttgart.de Mon Feb 23 02:54:01 2004
From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
Date: Mon, 23 Feb 2004 08:54:01 +0100
Subject: [SciPy-user] Incorrect results in linalg.solve and friends
References: Message-ID: <4039B199.EA7FCDFD@mecha.uni-stuttgart.de>

Pearu Peterson schrieb:
> Hi,
>
> This is a heads-up for linalg crunchers. Incorrect results appear when
> the following condition holds:
>
>   an input array is a *slice* of a complex array.
>
> > for i in arange(0,n):
>
> For example, changing
>
> > r = R[:,i]
>
> to `r = R[:,i].copy()` will give correct results.
>
> > X[:,i] = linalg.solve(A,r)
>
> This gives a workaround to avoid this problem until it gets resolved.
> Also, this is not due to some bug in Atlas but rather to copy_ND_array in
> fortranobject.c or PyArray_CopyFromObject, but most probably
> copy_ND_array.
>
> Pearu

Pearu,

Afaik fortranobject.c belongs to your f2py project. Therefore, it would
be very kind of you if you put a new F2PY release on your website.

Thanks in advance

Nils

BTW, the cvs repository of scipy is currently not available.

From nwagner at mecha.uni-stuttgart.de Mon Feb 23 02:41:58 2004
From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
Date: Mon, 23 Feb 2004 08:41:58 +0100
Subject: [SciPy-user] CVS repository is not available
Message-ID: <4039AEC6.490930D@mecha.uni-stuttgart.de>

From pearu at scipy.org Mon Feb 23 18:04:36 2004
From: pearu at scipy.org (Pearu Peterson)
Date: Mon, 23 Feb 2004 17:04:36 -0600 (CST)
Subject: [SciPy-user] Incorrect results in linalg.solve and friends
In-Reply-To: Message-ID:

Hi,

On Sun, 22 Feb 2004, Pearu Peterson wrote:
> This is a heads-up for linalg crunchers.
Incorrect results appear when
> the following condition holds:
>
> an input array is a *slice* of a complex array.

A more accurate condition for the bug to show up is: an input array is a
one-dimensional slice of a complex array.

> This gives a workaround to avoid this problem until it gets resolved.
> Also, this is not due to some bug in Atlas but rather to copy_ND_array in
> fortranobject.c or PyArray_CopyFromObject, but most probably
> copy_ND_array.

This issue is now resolved. Upgrading f2py to the latest version is
highly recommended, especially when computations use complex inputs to
f2py generated wrapper functions.

Regards,
Pearu

From nwagner at mecha.uni-stuttgart.de Tue Feb 24 03:02:11 2004
From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
Date: Tue, 24 Feb 2004 09:02:11 +0100
Subject: [SciPy-user] Binomial coefficients
Message-ID: <403B0503.AE69099@mecha.uni-stuttgart.de>

Hi all,

I am wondering if binomial coefficients are currently available in scipy.
If that is not the case, I would like to add this feature to scipy.
Any comment?

Nils

from scipy import *

def binomial(a,b):
    return int(special.gamma(a+1)/(special.gamma(b+1)*special.gamma(a-b+1)))

#
# The following table is given in Bronstein, Taschenbuch der Mathematik
#
for n in arange(0,21):
    for k in arange(0,11):
        print binomial(n,k)
    print

From rkern at ucsd.edu Tue Feb 24 03:53:09 2004
From: rkern at ucsd.edu (Robert Kern)
Date: Tue, 24 Feb 2004 00:53:09 -0800
Subject: [SciPy-user] Binomial coefficients
In-Reply-To: <403B0503.AE69099@mecha.uni-stuttgart.de>
References: <403B0503.AE69099@mecha.uni-stuttgart.de>
Message-ID:

On Feb 24, 2004, at 12:02 AM, Nils Wagner wrote:
> Hi all,
>
> I am wondering if binomial coefficients are currently available in
> scipy.

It's named scipy.comb()

--
Robert Kern
rkern at ucsd.edu

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
-- Richard Harter

From Bob.Cowdery at CGI-Europe.com Tue Feb 24 04:41:32 2004
From: Bob.Cowdery at CGI-Europe.com (Bob.Cowdery at CGI-Europe.com)
Date: Tue, 24 Feb 2004 09:41:32 -0000
Subject: [SciPy-user] Help with filter design
Message-ID: <9B40DB5FBF7D2048AD79C2A91F862A5106DCFF@STEVAPPS05.Stevenage.CGI-Europe.com>

Forgive these questions if they are rather basic. I am trying to get up
to speed with python and the scientific packages. I have an
implementation of a dsp process for a software radio implemented using
Intel IPP. I am converting this to python. One of the things I need to
do is fast convolution filtering. To create the filter in IPP I do
something like:

ippsFIRGenLowpass_64f( ( high_cutoff - low_cutoff )/2, taps, no_of_taps, window_type )

where taps is the output array.

Is there something similar in SciPy or other packages? I found the
tutorial in the signal processing area too difficult to follow, and I
can't seem to locate any other documentation for SciPy except the help,
where you really need to know what you are looking for.

Any assistance appreciated

Bob

Bob Cowdery
CGI Senior Technical Architect
+44(0)1438 791517 Mobile: +44(0)7771 532138
bob.cowdery at cgi-europe.com

*** Confidentiality Notice *** Proprietary/Confidential Information
belonging to CGI Group Inc. and its affiliates may be contained in this
message. If you are not a recipient indicated or intended in this
message (or responsible for delivery of this message to such person), or
you think for any reason that this message may have been addressed to
you in error, you may not use or copy or deliver this message to anyone
else. In such case, you should destroy this message and are asked to
notify the sender by reply email.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From pearu at cens.ioc.ee Tue Feb 24 05:02:08 2004
From: pearu at cens.ioc.ee (Pearu Peterson)
Date: Tue, 24 Feb 2004 12:02:08 +0200 (EET)
Subject: [SciPy-user] ANN: F2PY - Fortran to Python Interface Generator
Message-ID:

F2PY - Fortran to Python Interface Generator
--------------------------------------------

I am pleased to announce the seventh public release of F2PY, version
2.39.235_1642.

The purpose of the F2PY project is to provide the connection between the
Python and Fortran programming languages. For more information, see

  http://cens.ioc.ee/projects/f2py2e/

Download source:

  http://cens.ioc.ee/projects/f2py2e/2.x/F2PY-2-latest.tar.gz

What's new?
------------

Since the last public release, which was more than two years ago, the
F2PY project has been actively maintained and many useful features have
been implemented. Here are some highlights:

* The new statement ``usercode`` allows inserting user-defined C code
  into F2PY generated extension module sources at various relevant
  places: before wrapper functions, after variable declarations, and at
  the end of the initialization function of the module. This makes using
  F2PY very flexible.

* The new statement ``pymethoddef`` allows adding items to the
  PyMethodDef array.

* Full support for character arrays and arrays of strings is finally
  implemented.

* A number of feature requests from users are implemented. For example:

  - the auxiliary function ``as_column_major_storage()`` that
    efficiently converts an array to column storage order
  - the F2PY_REPORT_ON_ARRAY_COPY macro that, when defined, sends a
    message to stderr whenever a copy is made of an array with a size
    larger than a specified threshold
  - Numarray support, thanks to Todd Miller.

* Support for the Win32 and Mac OSX platforms is considerably improved.
The list of compilers supported by the scipy_distutils package is longer
than ever:

  GNU Fortran Compiler
  Portland Group Fortran Compiler
  Absoft Corp Fortran Compiler
  MIPSpro Fortran Compiler
  Sun|Forte Fortran 95 Compiler
  Intel Fortran Compiler for 32-bit apps
  Intel Visual Fortran Compiler for 32-bit apps
  Intel Fortran Compiler for Itanium apps
  NAGWare Fortran 95 Compiler
  Compaq Fortran Compiler
  DIGITAL|Compaq Visual Fortran Compiler
  Pacific-Sierra Research Fortran 90 Compiler
  HP Fortran 90 Compiler
  Lahey/Fujitsu Fortran 95 Compiler
  IBM XL Fortran Compiler

* Numerous bugs are fixed. You should definitely update F2PY when using
  complex input arrays; there was a nasty bug that in certain cases
  caused incorrect results.

* F2PY now has a man page and its documentation is kept up to date.

Many other improvements to the F2PY algorithm and usage are implemented;
see HISTORY.txt for more details.

Enjoy,
	Pearu Peterson

---------------
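As an illustration of the ``usercode`` statement announced above, a signature-file fragment might look like the following sketch (the module name and the C helper are invented for illustration; see the F2PY documentation for the exact placement rules):

```
python module example             ! in example.pyf
  usercode '''
  /* user-defined C code inserted before the wrapper functions */
  static double square(double x) { return x*x; }
  '''
  interface
    ! wrapped Fortran routine signatures would go here
  end interface
end python module
```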

F2PY 2.39.235_1642 - The Fortran to Python Interface Generator (24-Feb-04)

From oliphant at ee.byu.edu Tue Feb 24 13:33:44 2004
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Tue, 24 Feb 2004 11:33:44 -0700
Subject: [SciPy-user] Help with filter design
In-Reply-To: <9B40DB5FBF7D2048AD79C2A91F862A5106DCFF@STEVAPPS05.Stevenage.CGI-Europe.com>
References: <9B40DB5FBF7D2048AD79C2A91F862A5106DCFF@STEVAPPS05.Stevenage.CGI-Europe.com>
Message-ID: <403B9908.50705@ee.byu.edu>

Bob.Cowdery at CGI-Europe.com wrote:
> Forgive these questions if they are rather basic. I am trying to get
> up to speed with python and the scientific packages. I have an
> implementation of a dsp process for a software radio implemented using
> Intel IPP. I am converting this to python. One of the things I need to
> do is fast convolution filtering. To create the filter in IPP I do
> something like:
>
> ippsFIRGenLowpass_64f( ( high_cutoff - low_cutoff )/2, taps,
> no_of_taps, window_type ) where taps is the output array.
>
> Is there something similar in SciPy or other packages. I found the
> tutorial in the signal processing area too difficult to follow and I
> can't seem to locate any other documentation for SciPy except the help
> where you really need to know what you are looking for.

I would be interested to know which parts of the tutorial you found
difficult to follow and how you think it could be improved.

To answer your question... Yes, there are quite a few filtering
functions in SciPy.

There is signal.convolve, which will convolve two functions together
(an FIR filter).

There is also signal.lfilter, which will implement an IIR filter
(given the numerator and denominator polynomials).

It looks like FIRGenLowPass is creating an FIR filter. Do you know what
algorithm it is using? What are the possibilities in window_type?
If you want to create an FIR filter then you should look at
signal.remez, which will construct an FIR filter to minimize the maximum
distance between your desired response in the pass band and the obtained
response. You can design low-pass, high-pass and bandpass filters using
this method. info(signal.remez) will tell you how to call it, but it
does assume some familiarity with designing digital filters.

You can also construct low-pass filters using the Fourier Transform and
signal.get_window(). signal.get_window() will construct a variety of
windows for you. You could use these windows as low-pass filters
themselves or use them in the Fourier domain to construct an FIR filter
(more common).

Let me know which approach you would prefer to use and I can give you
more specific examples if you like.

-Travis Oliphant

From paxcalpt at sapo.pt Wed Feb 25 02:58:38 2004
From: paxcalpt at sapo.pt (Ricardo Henriques)
Date: Wed, 25 Feb 2004 07:58:38 -0000
Subject: [SciPy-user] Does Scipy work well with py2exe?
Message-ID: <002501c3fb75$2f6a31e0$16b2c151@xcal>

Can anyone tell me if py2exe supports scipy importing?

Tks...

Paxcal

From Bob.Cowdery at CGI-Europe.com Wed Feb 25 08:19:25 2004
From: Bob.Cowdery at CGI-Europe.com (Bob.Cowdery at CGI-Europe.com)
Date: Wed, 25 Feb 2004 13:19:25 -0000
Subject: [SciPy-user] FIR Filters
Message-ID: <9B40DB5FBF7D2048AD79C2A91F862A51772F66@STEVAPPS05.Stevenage.CGI-Europe.com>

I am trying to get a windowed FIR filter to use in fast convolution
filtering. I have tried using:

  signal.remez(2048,[300,3000],[1],Hz=44100)

to generate a 2048-tap filter, which is what I have used previously in
Intel IPP. Firstly, this gives me an array full of QNAN errors. In fact,
the only time I don't get errors is by using a very small number of
taps, say 10. Various other values of taps fail to converge or give QNAN
errors. With the 2048 taps it is also very, very slow (about 75 seconds
on a 1.2GHz). The IPP implementation is sub-second.
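The window-method design Travis mentions sidesteps remez's convergence trouble at large tap counts: truncate the ideal sinc response and taper it with a window. A sketch in modern NumPy (the 3 kHz cutoff and 44.1 kHz sample rate are example values taken from this thread; today scipy.signal.firwin packages the same idea):

```python
import numpy as np

def windowed_sinc_lowpass(cutoff_hz, fs_hz, numtaps):
    """Low-pass FIR taps via the window method (Hamming window)."""
    n = np.arange(numtaps) - (numtaps - 1) / 2.0    # centered tap indices
    fc = cutoff_hz / fs_hz                          # normalized cutoff, cycles/sample
    h = 2.0 * fc * np.sinc(2.0 * fc * n)            # truncated ideal response
    h *= np.hamming(numtaps)                        # taper to tame truncation ripple
    return h / h.sum()                              # unity gain at DC

taps = windowed_sinc_lowpass(3000.0, 44100.0, 101)
```

For fast convolution filtering, these taps would then be zero-padded to the FFT block size and applied in the frequency domain, block by block.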
I don't know if it's throwing an exception on every QNAN or quite what
is happening. I am also confused by not being able to specify a window
type, and I really need lowpass, not bandpass. I can see there is a
function get_window() but I don't know how this relates to an FIR filter
when the only other parameter is fftbins. Do I do something like tell it
the number of fftbins I want the filter to cover (I have 2048 bins of
around 10Hz each) then shift it to the correct part of the spectrum?

Any help appreciated.

Bob

Bob Cowdery
CGI Senior Technical Architect
+44(0)1438 791517 Mobile: +44(0)7771 532138
bob.cowdery at cgi-europe.com

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From arnd.baecker at web.de Wed Feb 25 12:12:07 2004
From: arnd.baecker at web.de (Arnd Baecker)
Date: Wed, 25 Feb 2004 18:12:07 +0100 (CET)
Subject: [SciPy-user] scipy.xplt.ghelp (2nd try ;-)
Message-ID:

Hi,

this is a reformulation of my recent question with the same subject,
I hope I get it right this time ;-)

If I understand things correctly, scipy.xplt.ghelp is gone because the
information from xplt/gistdata/help.help has been moved to the doc
strings of the commands. If this is correct, how do I get information on
the keywords for a plot command?

E.g. help("scipy.xplt.plg") works, but I don't know how to get
information on the keywords, like `color`, `closed` etc.
I think all the information is given here:

  http://bonsai.ims.u-tokyo.ac.jp/~mdehoon/software/python/pygist_html/node57.html

How can I access this type of information from within python?
(Before, scipy.xplt.ghelp("color") did the job.)

Many thanks,

Arnd

From gry at ll.mit.edu Wed Feb 25 14:01:50 2004
From: gry at ll.mit.edu (george young)
Date: Wed, 25 Feb 2004 14:01:50 -0500
Subject: [SciPy-user] data smoothing: interpolate.splrep ignores s parameter
Message-ID: <20040225140150.0c97d070.gry@ll.mit.edu>

[SciPy-0.2.0_alpha_200.4161, Numeric-23.1, Python 2.3.3, x86 Linux]

My goal really is to smooth some noisy measurement data without messing
up its *shape*. My first attempt was 1d splines. I did:

from Numeric import *
import scipy
f = file('rough')
data = array([(float(x),float(y)) for (x,y) in [l.split() for l in f]])
rep = scipy.interpolate.splrep(data[:,0], data[:,1], s=s)
smooth_y = scipy.interpolate.splev(data[:,0], rep)

"s" is supposed to vary the amount of smoothing. For s=0, I get my
original data back, as expected. But for all other values, including the
recommended range of (m-sqrt(2.0*m)) to (m+sqrt(2.0*m)), I get a single,
too-much-smoothed result. It seems like splrep is not using the "s"
value to adjust the splines, except to sense that it's not zero.

Splines may not be the right method anyway, since they tend to warp the
shape of the curve, and I need to get the data's derivatives. Is there
some way to fashion a low pass filter? It seems like fft should be
useful here, but I have very little experience with fft's.

--
George Young

-- "Are the gods not just?" "Oh no, child. What would become of us if
they were?"
(CSL)

From gry at ll.mit.edu Wed Feb 25 14:21:26 2004
From: gry at ll.mit.edu (george young)
Date: Wed, 25 Feb 2004 14:21:26 -0500
Subject: [SciPy-user] Re: data smoothing: interpolate.splrep ignores s parameter
Message-ID: <20040225142126.096ed95e.gry@ll.mit.edu>

[[reposting with "rough" data file appended, sorry]]

[SciPy-0.2.0_alpha_200.4161, Numeric-23.1, Python 2.3.3, x86 Linux]

My goal really is to smooth some noisy measurement data without messing
up its *shape*. My first attempt was 1d splines. I did:

from Numeric import *
import scipy
f = file('rough')
data = array([(float(x),float(y)) for (x,y) in [l.split() for l in f]])
rep = scipy.interpolate.splrep(data[:,0], data[:,1], s=s)
smooth_y = scipy.interpolate.splev(data[:,0], rep)

"s" is supposed to vary the amount of smoothing. For s=0, I get my
original data back, as expected. But for all other values, including the
recommended range of (m-sqrt(2.0*m)) to (m+sqrt(2.0*m)), I get a single,
too-much-smoothed result. It seems like splrep is not using the "s"
value to adjust the splines, except to sense that it's not zero.

Splines may not be the right method anyway, since they tend to warp the
shape of the curve, and I need to get the data's derivatives. Is there
some way to fashion a low pass filter? It seems like fft should be
useful here, but I have very little experience with fft's.
-- George Young "rough" file: 0.5 -1.7355e-10 0.455 -4.6185e-10 0.41 -3.2125e-10 0.365 -4.035e-10 0.32 2.951e-10 0.275 2.628e-10 0.23 2.922e-10 0.185 -3.1775e-10 0.14 -3.3209e-10 0.095 -3.6665e-10 0.05 -2.7394e-10 0.0 2.98e-10 -0.04 3.228e-10 -0.085 3.219e-10 -0.13 4.093e-10 -0.175 2.7406e-10 -0.22 1.102e-10 -0.265 -6.012e-10 -0.31 -3.9674e-09 -0.355 -1.2145e-08 -0.4 -3.9299e-08 -0.445 -1.2074e-07 -0.49 -3.2109e-07 -0.535 -7.3798e-07 -0.58 -1.4695e-06 -0.625 -2.548e-06 -0.67 -3.9104e-06 -0.715 -5.4895e-06 -0.76 -7.1758e-06 -0.805 -8.9267e-06 -0.85 -1.0675e-05 -0.895 -1.2425e-05 -0.94 -1.4115e-05 -0.985 -1.579e-05 -1.03 -1.7415e-05 -1.075 -1.9005e-05 -1.12 -2.0505e-05 -1.165 -2.2e-05 -1.21 -2.346e-05 -1.255 -2.4885e-05 -1.3 -2.624e-05 -1.345 -2.757e-05 -1.39 -2.886e-05 -1.435 -3.015e-05 -1.48 -3.1404e-05 -1.525 -3.2604e-05 -1.57 -3.385e-05 -1.615 -3.498e-05 -1.66 -3.6105e-05 -1.705 -3.718e-05 -1.75 -3.8279e-05 -1.795 -3.932e-05 -1.84 -4.033e-05 -1.885 -4.13e-05 -1.93 -4.2304e-05 -1.975 -4.3275e-05 -2.02 -4.429e-05 -2.065 -4.5205e-05 -2.11 -4.6069e-05 -2.155 -4.697e-05 -2.2 -4.7825e-05 -2.245 -4.8639e-05 -2.29 -4.9515e-05 -2.335 -5.0319e-05 -2.38 -5.1094e-05 -2.425 -5.188e-05 -2.47 -5.264e-05 -2.515 -5.3454e-05 -2.56 -5.4169e-05 -2.605 -5.4905e-05 -2.65 -5.5633e-05 -2.695 -5.6334e-05 -2.74 -5.7043e-05 -2.785 -5.7749e-05 -2.83 -5.8343e-05 -2.875 -5.9085e-05 -2.92 -5.9729e-05 -2.965 -6.0344e-05 -3.01 -6.094e-05 -3.055 -6.1564e-05 -3.1 -6.2104e-05 -3.145 -6.2749e-05 -3.19 -6.3348e-05 -3.235 -6.3956e-05 -3.28 -6.4414e-05 -3.325 -6.4958e-05 -3.37 -6.555e-05 -3.415 -6.6075e-05 -3.46 -6.656e-05 -3.505 -6.71e-05 -3.55 -6.7603e-05 -3.595 -6.8083e-05 -3.64 -6.8623e-05 -3.685 -6.9048e-05 -3.73 -6.9518e-05 -3.775 -6.998e-05 -3.82 -7.0508e-05 -3.865 -7.0929e-05 -3.91 -7.1343e-05 -3.955 -7.1824e-05 -4.0 -7.2218e-05 From pearu at scipy.org Wed Feb 25 15:45:53 2004 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 25 Feb 2004 14:45:53 -0600 (CST) Subject: [SciPy-user] Re: data 
smoothing: interpolate.splrep ignores s parameter
In-Reply-To: <20040225142126.096ed95e.gry@ll.mit.edu>
Message-ID:

On Wed, 25 Feb 2004, george young wrote:
> [[reposting with "rough" data file appended, sorry]]
> [SciPy-0.2.0_alpha_200.4161, Numeric-23.1, Python 2.3.3, x86 Linux]

I recommend using the CVS version of Scipy.

> My goal really is to smooth some noisy measurement data without messing
> up its *shape*. My first attempt was 1d splines. I did:
>
> from Numeric import *
> import scipy
> f = file('rough')
> data = array([(float(x),float(y)) for (x,y) in [l.split() for l in f]])
> rep = scipy.interpolate.splrep(data[:,0], data[:,1], s=s)
> smooth_y = scipy.interpolate.splev(data[:,0], rep)
>
> "s" is supposed to vary the amount of smoothing. For s=0, I get
> my original data back, as expected. But for all other values, including
> the recommended range of (m-sqrt(2.0*m)) to (m+sqrt(2.0*m)), I get a
> single, too-much-smoothed result. It seems like splrep is not using
> the "s" value to adjust the splines, except to sense that it's not zero.

First, note that you have to specify x data as an increasing sequence of
numbers; currently data[:,0] is decreasing. Second, your data is quite
smooth, so you must use very small s values to see the effect, e.g. try
s=1e-11 or smaller.

> Splines may not be the right method anyway, since they tend to warp the
> shape of the curve, and I need to get the data's derivatives. Is there
> some way to fashion a low pass filter? It seems like fft should be
> useful here, but I have very little experience with fft's.

FFT is not suitable here as your data is not periodic. But if you need
only derivatives, then use

  rep = \
    interpolate.fitpack2.UnivariateSpline(data[::-1,0],data[::-1,1],s=1e-13)
  y_der = rep(data[:,0],1)  # first derivative

You may need to play with the s value a bit, and maybe also split your
data around x=-0.5 and do the smoothing on these parts separately,
because the data behaves differently on those two parts.
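A spline-free route to the low-pass filtering george asks about is a short boxcar (moving-average) filter followed by finite differences. A sketch in modern NumPy, where the window width is a tuning knob playing much the same role as s (and, like the spline fit, x is assumed sorted in increasing order):

```python
import numpy as np

def boxcar_smooth(y, width):
    """Moving-average low-pass filter; width should be odd."""
    pad = width // 2
    ypad = np.pad(y, pad, mode='edge')   # repeat edge values to avoid end dips
    kernel = np.ones(width) / width
    return np.convolve(ypad, kernel, mode='valid')  # same length as y

def smoothed_derivative(x, y, width=5):
    """Smooth y, then differentiate with central differences."""
    return np.gradient(boxcar_smooth(y, width), x)
```

On exactly linear data a boxcar average is transparent away from the edges, which makes the shape-preservation property easy to check.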
HTH,
Pearu

From pearu at scipy.org Wed Feb 25 16:09:55 2004
From: pearu at scipy.org (Pearu Peterson)
Date: Wed, 25 Feb 2004 15:09:55 -0600 (CST)
Subject: [SciPy-user] scipy.xplt.ghelp (2nd try ;-)
In-Reply-To: Message-ID:

On Wed, 25 Feb 2004, Arnd Baecker wrote:
> Hi,
>
> this is a reformulation of my recent question with the same subject,
> I hope I get it right this time ;-)
>
> If I understand things correctly, scipy.xplt.ghelp is gone
> because the information from xplt/gistdata/help.help
> has been moved to the doc strings of the commands.
> If this is correct,
> how do I get information on the keywords for a plot command ?
>
> E.g. help("scipy.xplt.plg") works, but I don't know how to get
> information on the keywords, like `color`, `closed` etc.
> I think all the information is given here
> http://bonsai.ims.u-tokyo.ac.jp/~mdehoon/software/python/pygist_html/node57.html
> How can I access this type of information from within python?
> (Before scipy.xplt.ghelp("color") did the job).

I am not familiar with the background of losing scipy.xplt.ghelp, but
the following works:

  import scipy.xplt.helpmod
  scipy.xplt.helpmod.help('color')

Pearu

From oliphant at ee.byu.edu Wed Feb 25 17:29:44 2004
From: oliphant at ee.byu.edu (Travis E. Oliphant)
Date: Wed, 25 Feb 2004 15:29:44 -0700
Subject: [SciPy-user] scipy.xplt.ghelp (2nd try ;-)
In-Reply-To: References: Message-ID: <403D21D8.1000500@ee.byu.edu>

Arnd Baecker wrote:
> Hi,
>
> this is a reformulation of my recent question with the same subject,
> I hope I get it right this time ;-)
>
> If I understand things correctly, scipy.xplt.ghelp is gone
> because the information from xplt/gistdata/help.help
> has been moved to the doc strings of the commands.
> If this is correct,

Yes, that was the idea.

> how do I get information on the keywords for a plot command ?

It looks like we have a problem, as the docstrings aren't complete yet.
I should add back the ghelp command in the meantime.
You can get it from an older version of the code. I will try to update
the CVS (but it may be a week or two).

Thanks for the heads-up.

-Travis

From val at vtek.com Wed Feb 25 17:47:20 2004
From: val at vtek.com (val)
Date: Wed, 25 Feb 2004 17:47:20 -0500
Subject: [SciPy-user] Re: data smoothing: interpolate.splrep ignores s parameter
References: <20040225142126.096ed95e.gry@ll.mit.edu>
Message-ID: <070f01c3fbf1$54654940$6400a8c0@vt1000>

Well,

Maybe first you'd set up the realistic precision eps you want. For now,
you have a range of 5 orders of magnitude. Scale your data based on eps
and graph it to see how noisy the data is. For physical variables, the
noise should be within a reasonable range. Smoothing 'per se' does not
help much with unreasonably noisy data. Or am I missing something?

val

----- Original Message -----
From: "george young"
To:
Sent: Wednesday, February 25, 2004 2:21 PM
Subject: [SciPy-user] Re: data smoothing: interpolate.splrep ignores s parameter

> [[reposting with "rough" data file appended, sorry]]
> [SciPy-0.2.0_alpha_200.4161, Numeric-23.1, Python 2.3.3, x86 Linux]
> My goal really is to smooth some noisy measurement data without messing
> up its *shape*. My first attempt was 1d splines. I did:
>
> from Numeric import *
> import scipy
> f = file('rough')
> data = array([(float(x),float(y)) for (x,y) in [l.split() for l in f]])
> rep = scipy.interpolate.splrep(data[:,0], data[:,1], s=s)
> smooth_y = scipy.interpolate.splev(data[:,0], rep)
>
> "s" is supposed to vary the amount of smoothing. For s=0, I get
> my original data back, as expected. But for all other values, including
> the recommended range of (m-sqrt(2.0*m)) to (m+sqrt(2.0*m)), I get a
> single, too-much-smoothed result. It seems like splrep is not using
> the "s" value to adjust the splines, except to sense that it's not zero.
>
> Splines may not be the right method anyway, since they tend to warp the
> shape of the curve, and I need to get the data's derivatives.
Is there > some way to fashion a low pass filter? It seems like fft should be useful > here, but I have very little experience with fft's. > > -- George Young > > "rough" file: > 0.5 -1.7355e-10 > 0.455 -4.6185e-10 > 0.41 -3.2125e-10 > 0.365 -4.035e-10 > 0.32 2.951e-10 > 0.275 2.628e-10 > 0.23 2.922e-10 > 0.185 -3.1775e-10 > 0.14 -3.3209e-10 > 0.095 -3.6665e-10 > 0.05 -2.7394e-10 > 0.0 2.98e-10 > -0.04 3.228e-10 > -0.085 3.219e-10 > -0.13 4.093e-10 > -0.175 2.7406e-10 > -0.22 1.102e-10 > -0.265 -6.012e-10 > -0.31 -3.9674e-09 > -0.355 -1.2145e-08 > -0.4 -3.9299e-08 > -0.445 -1.2074e-07 > -0.49 -3.2109e-07 > -0.535 -7.3798e-07 > -0.58 -1.4695e-06 > -0.625 -2.548e-06 > -0.67 -3.9104e-06 > -0.715 -5.4895e-06 > -0.76 -7.1758e-06 > -0.805 -8.9267e-06 > -0.85 -1.0675e-05 > -0.895 -1.2425e-05 > -0.94 -1.4115e-05 > -0.985 -1.579e-05 > -1.03 -1.7415e-05 > -1.075 -1.9005e-05 > -1.12 -2.0505e-05 > -1.165 -2.2e-05 > -1.21 -2.346e-05 > -1.255 -2.4885e-05 > -1.3 -2.624e-05 > -1.345 -2.757e-05 > -1.39 -2.886e-05 > -1.435 -3.015e-05 > -1.48 -3.1404e-05 > -1.525 -3.2604e-05 > -1.57 -3.385e-05 > -1.615 -3.498e-05 > -1.66 -3.6105e-05 > -1.705 -3.718e-05 > -1.75 -3.8279e-05 > -1.795 -3.932e-05 > -1.84 -4.033e-05 > -1.885 -4.13e-05 > -1.93 -4.2304e-05 > -1.975 -4.3275e-05 > -2.02 -4.429e-05 > -2.065 -4.5205e-05 > -2.11 -4.6069e-05 > -2.155 -4.697e-05 > -2.2 -4.7825e-05 > -2.245 -4.8639e-05 > -2.29 -4.9515e-05 > -2.335 -5.0319e-05 > -2.38 -5.1094e-05 > -2.425 -5.188e-05 > -2.47 -5.264e-05 > -2.515 -5.3454e-05 > -2.56 -5.4169e-05 > -2.605 -5.4905e-05 > -2.65 -5.5633e-05 > -2.695 -5.6334e-05 > -2.74 -5.7043e-05 > -2.785 -5.7749e-05 > -2.83 -5.8343e-05 > -2.875 -5.9085e-05 > -2.92 -5.9729e-05 > -2.965 -6.0344e-05 > -3.01 -6.094e-05 > -3.055 -6.1564e-05 > -3.1 -6.2104e-05 > -3.145 -6.2749e-05 > -3.19 -6.3348e-05 > -3.235 -6.3956e-05 > -3.28 -6.4414e-05 > -3.325 -6.4958e-05 > -3.37 -6.555e-05 > -3.415 -6.6075e-05 > -3.46 -6.656e-05 > -3.505 -6.71e-05 > -3.55 -6.7603e-05 > -3.595 -6.8083e-05 > 
-3.64 -6.8623e-05 > -3.685 -6.9048e-05 > -3.73 -6.9518e-05 > -3.775 -6.998e-05 > -3.82 -7.0508e-05 > -3.865 -7.0929e-05 > -3.91 -7.1343e-05 > -3.955 -7.1824e-05 > -4.0 -7.2218e-05 > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From oliphant at ee.byu.edu Wed Feb 25 18:40:34 2004 From: oliphant at ee.byu.edu (Travis E. Oliphant) Date: Wed, 25 Feb 2004 16:40:34 -0700 Subject: [SciPy-user] FIR Filters In-Reply-To: <9B40DB5FBF7D2048AD79C2A91F862A51772F66@STEVAPPS05.Stevenage.CGI-Europe.com> References: <9B40DB5FBF7D2048AD79C2A91F862A51772F66@STEVAPPS05.Stevenage.CGI-Europe.com> Message-ID: <403D3272.5050009@ee.byu.edu> Bob.Cowdery at CGI-Europe.com wrote: > I am trying to get a windowed FIR filter to use in fast convolution > filtering. > > I have tried using : > signal.remez(2048,[300,3000],[1],Hz=44100) > to generate a 2048 tap filter, which is what I have used previously in > Intel IPP. Firstly, this gives me an array full of QNAN errors. That is a lot of taps. In theory remez should work, but I admit it has not been tested for so many taps. And, the "bandpass" option can generate a lowpass filter by making the lower band 0. Why are you doing convolution filtering with 2048 taps? Why are you not using the FFT which should be much faster? > In fact > the only time I don't get errors is by using a very small number of > taps, say 10. Various other values of taps fail to converge or give > QNAN errors. With the 2048 taps it is also very, very slow (about 75 > seconds on a 1.2GHz). The IPP implementation is sub-second. I don't know > if it's throwing an exception on every QNAN or quite what is happening. I don't think IPP is using the remez algorithm for getting its filter. It's probably doing windowed filter design. > I am also confused by not being able to specify a window type and I > really need lowpass not bandpass.
I can see there is a function > get_window() but I don't know how this relates to an FIR filter when the > only other parameter is fftbins. Do I do something like tell it the > number of fftbins I want the filter to cover (I have 2048 bins of around > 10Hz each) then shift it to the correct part of the spectrum? There is not currently an FIR-filter design program in SciPy. One should be constructed as it is not hard to implement (of course making it generic with all the options you might want would take some time). What kind of window are you currently using? For your purposes this is what I would do: w = fftfreq(2048,T) # give frequency bins in Hz, T is sample spacing myfilter = where((abs(w)< cutoff),1,0) # cutoff is low-pass filter h = ifft(myfilter) # ideal filter beta = 11.7 myh = fftshift(h) * get_window(beta,2048) # beta implies Kaiser Filter Someday this will get wrapped up into a nice FIR filter design routine --- any suggestions on the interface that routine should take would be useful. -Travis O. > > Any help appreciated. > > Bob > > > Bob Cowdery > CGI Senior Technical Architect > +44(0)1438 791517 > Mobile: +44(0)7771 532138 > bob.cowdery at cgi-europe.com > > > > > **** Confidentiality Notice **** Proprietary/Confidential > Information belonging to CGI Group Inc. and its affiliates > may be contained in this message. If you are not a recipient > indicated or intended in this message (or responsible for > delivery of this message to such person), or you think for > any reason that this message may have been addressed to you > in error, you may not use or copy or deliver this message > to anyone else. In such case, you should destroy this > message and are asked to notify the sender by reply email. 
> > > ------------------------------------------------------------------------ > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From oliphant at ee.byu.edu Wed Feb 25 18:43:30 2004 From: oliphant at ee.byu.edu (Travis E. Oliphant) Date: Wed, 25 Feb 2004 16:43:30 -0700 Subject: [SciPy-user] Re: data smoothing: interpolate.splrep ignores s parameter In-Reply-To: References: Message-ID: <403D3322.9080304@ee.byu.edu> Pearu Peterson wrote: > > On Wed, 25 Feb 2004, george young wrote: > > FFT is not suitable here as your data is not periodic. But if you need > only derivatives, then use This is not necessarily true. The FFT can be used for non-periodic data -- and is in fact used quite often for such cases as long as you are careful. You just have to realize that the FFT is assuming your data is periodic and so any filtering (i.e. smoothing) will be using wrap-around boundary conditions. You can window your data, zero-pad, mirror it on the boundaries and many other things in order to get rid of artifacts that result from the periodic assumption of the FFT. -Travis O. From jbenson at sextans.lowell.edu Wed Feb 25 22:53:26 2004 From: jbenson at sextans.lowell.edu (Jim Benson) Date: Wed, 25 Feb 2004 20:53:26 -0700 (MST) Subject: [SciPy-user] scipy cvs snapshot problem Message-ID: Hi, I downloaded a cvs snapshot on 2004-02-25 and installed it over a previous cvs snapshot (from 2003-12-03). The previous one did pass all scipy.test levels. The recent version: Found scipy_distutils version '0.2.2_alpha_26.374' in '/usr/local/lib/python2.3/ site-packages/scipy_distutils/__init__.pyc' Importing scipy_distutils.command.build_flib ... 
ok failed the first test: ====================================================================== FAIL: check_nils_20Feb04 (scipy.linalg.basic.test_basic.test_solve) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 112, in check_nils_20Feb04 assert_array_almost_equal(X,Ainv) File "/usr/local/lib/python2.3/site-packages/scipy_test/testing.py", line 636, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 50.0%): Array 1: [[-2.3261187+2.4103813j 2.6175944-0.5570154j] [ 2.9991264-1.7766662j -2.0232508-0.2587063j]] Array 2: [[-2.8831341-0.2072131j 2.6175944-0.5570154j] [ 2.7404201+0.2465847j -2.0232508-0.2587063j]] ---------------------------------------------------------------------- Ran 1198 tests in 1.918s FAILED (failures=1) >>> My atlas version: floyd:/usr/local/scipy/Lib/linalg>python -c 'import atlas_version' ATLAS version 3.4.2 built by root on Tue Dec 2 20:35:31 MST 2003: ...am I forgetting something critical? Thanks, Jim From oliphant at ee.byu.edu Thu Feb 26 02:09:56 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 26 Feb 2004 00:09:56 -0700 Subject: [SciPy-user] FIR Filters In-Reply-To: <9B40DB5FBF7D2048AD79C2A91F862A51772F66@STEVAPPS05.Stevenage.CGI-Europe.com> References: <9B40DB5FBF7D2048AD79C2A91F862A51772F66@STEVAPPS05.Stevenage.CGI-Europe.com> Message-ID: <403D9BC4.8000808@ee.byu.edu> > I am trying to get a windowed FIR filter to use in fast convolution > filtering. > > I have tried using : > signal.remez(2048,[300,3000],[1],Hz=44100) > to generate a 2048 tap filter, which is what I have used previously in > Intel IPP. Firstly, this gives me an array full of QNAN errors. In fact > the only time I don't get errors is by using a very small number of > taps, say 10. Various other values of taps fail to converge or give > QNAN errors.
With the 2048 taps it is also very, very slow (about 75 > seconds on a 1.2GHz). The IPP implementation is sub-second. I don't > know if it's throwing an exception on every QNAN or quite what is > happening. > > I am also confused by not being able to specify a window type and I > really need lowpass not bandpass. I can see there is a function > get_window() but I don't know how this relates to an FIR filter when > the only other parameter is fftbins. Do I do something like tell it > the number of fftbins I want the filter to cover (I have 2048 bins of > around 10Hz each) then shift it to the correct part of the spectrum? Bob, Here is a simple windowed ideal lowpass filter algorithm. The two important parameters are the number of taps and the cutoff frequency normalized so that 1 corresponds to pi radians/sample (i.e. the Nyquist frequency). If you specify width it will choose a nice Kaiser window for you to try and reach that width of transition region (from passband to stopband). Otherwise, you can specify the window type (see signal.get_window for types). This is now in SciPy CVS. def firwin(N, cutoff, width=None, window='hamming'): """FIR Filter Design using windowed ideal filter method. Inputs: N -- order of filter (number of taps) cutoff -- cutoff frequency of filter (normalized so that 1 corresponds to Nyquist or pi radians / sample) width -- if width is not None, then assume it is the approximate width of the transition region (normalized so that 1 corresponds to pi) for use in kaiser FIR filter design. window -- desired window to use. Outputs: h -- coefficients of length N fir filter.
""" from signaltools import get_window if isinstance(width,float): A = 2.285*N*width + 8 if (A < 21): beta = 0.0 elif (A <= 50): beta = 0.5842*(A-21)**0.4 + 0.07886*(A-21) else: beta = 0.1102*(A-8.7) window=('kaiser',beta) win = get_window(window,N,fftbins=1) alpha = N//2 m = Num.arange(0,N) return win*special.sinc(cutoff*(m-alpha)) From pearu at scipy.org Thu Feb 26 03:52:26 2004 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 26 Feb 2004 02:52:26 -0600 (CST) Subject: [SciPy-user] scipy cvs snapshot problem In-Reply-To: Message-ID: On Wed, 25 Feb 2004, Jim Benson wrote: > I downloaded a cvs snapshot on 2004-02-25 and installed it > over a previous cvs snapshot (from 2003-12-03). The previous > one did pass all scipy.test levels. The recent version > failed the first test: > > ====================================================================== > FAIL: check_nils_20Feb04 (scipy.linalg.basic.test_basic.test_solve) > ---------------------------------------------------------------------- > AssertionError: > Arrays are not almost equal (mismatch 50.0%): > Array 1: [[-2.3261187+2.4103813j 2.6175944-0.5570154j] > [ 2.9991264-1.7766662j -2.0232508-0.2587063j]] > Array 2: [[-2.8831341-0.2072131j 2.6175944-0.5570154j] > [ 2.7404201+0.2465847j -2.0232508-0.2587063j]] > ...am i forgetting something critical? Yes, you missed recent thread about the corner case where linalg.solve could return wrong results. The above test covers this case. You should certainly update f2py to the latest if somewhere in your applications the following conditions may be holding: a complex one-dimensional slice of an array is passed to f2py generated wrapper. Actually as far as I am concerned, the origin of this bug lies in all versions of Numeric CDOUBLE_to_CDOUBLE,etc functions for any(!) non-contiguous complex array. 
This bug was reported to the numpy list years ago, see http://sourceforge.net/mailarchive/message.php?msg_id=1282411 but since there was no response or interest in it :(, f2py provided a workaround to this problem but missed the above important corner case. Regards, Pearu From arnd.baecker at web.de Thu Feb 26 05:34:34 2004 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 26 Feb 2004 11:34:34 +0100 (CET) Subject: [SciPy-user] scipy.xplt.ghelp (2nd try ;-) In-Reply-To: References: Message-ID: On Wed, 25 Feb 2004, Pearu Peterson wrote: [...] > > How can I access this type of information from within python? > > (Before scipy.xplt.ghelp("color") did the job). > > I am not familiar with the background of losing scipy.xplt.ghelp but the > following works: > > import scipy.xplt.helpmod > scipy.xplt.helpmod.help('color') Thanks, Pearu! I also just checked that putting a from helpmod import help as ghelp into gist.py restores the original behaviour, i.e. scipy.xplt.ghelp("color") works as before. So this is an easy temporary fix, but I think now that all the doc-strings are there for the commands we could with little effort also get the keywords into one of the doc-strings (see my next message) Best, Arnd From arnd.baecker at web.de Thu Feb 26 05:57:03 2004 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 26 Feb 2004 11:57:03 +0100 (CET) Subject: [SciPy-user] scipy.xplt.ghelp (2nd try ;-) In-Reply-To: <403D21D8.1000500@ee.byu.edu> References: <403D21D8.1000500@ee.byu.edu> Message-ID: On Wed, 25 Feb 2004, Travis E. Oliphant wrote: > Arnd Baecker wrote: [...] > > If I understand things correctly, scipy.xplt.ghelp is gone > > because the information from xplt/gistdata/help.help > > has been moved to the doc strings of the commands. > > If this is correct, > > Yes, that was the idea. I am very much in favour of this! > > how do I get information on the keywords for a plot command ? > > It looks like we have a problem, as the docstrings aren't complete yet.
As far as I can see it is really only the keywords. Thinking further about it, there might be a simple solution to this: a) add the information for the different keywords to the docstring of scipy.xplt itself. One could then add the text "see help("scipy.xplt") for information on the keywords" in the doc-strings for the corresponding plot routines. b) add the information for the different keywords to the docstring of scipy.xplt.plg. One could then add the text "see help("scipy.xplt.plg") for information on the keywords" in the doc-strings for the corresponding other plot routines. c) Invent a dummy routine scipy.xplt.keywords which contains the information on the keywords. Personally I don't like c) very much, so maybe b) is the best solution as the doc-string will be pretty long (i.e. a reason not to put it into scipy.xplt). If you think that this is a reasonable (also long-term) solution then I can prepare a corresponding doc-string. One more general question on the doc-strings of xplt: Some of the doc-strings (e.g. get_style(...) and many more) are so long that for 80 characters display width the corresponding lines get wrapped. To be precise: the doc-strings themselves are (mostly) <80 chars, but the indent when using the help brings the whole line to > 80 chars. If you think this is worth doing, I could take care of changing this in gist.py (and presumably some other files). > I should add back the ghelp command in the meantime. > > You can get it from an older version of the code. I will try to update > the CVS (but it may be a week or two). That's perfectly fine with me - many thanks, Arnd From Bob.Cowdery at CGI-Europe.com Thu Feb 26 08:27:40 2004 From: Bob.Cowdery at CGI-Europe.com (Bob.Cowdery at CGI-Europe.com) Date: Thu, 26 Feb 2004 13:27:40 -0000 Subject: [SciPy-user] FIR Filters Message-ID: <9B40DB5FBF7D2048AD79C2A91F862A51772F68@STEVAPPS05.Stevenage.CGI-Europe.com> Travis Thanks very much for the responses.
Don't confuse me with someone who knows what they are doing (but I am working on it!). To give you some background I am converting a .NET application that uses Intel IPP to Python. The application is for a software defined radio. At the moment I have audio capture which contains my quadrature samples in blocks of 4096 (2048 each for I&Q) with 44100 sampling rate (46ms per block). I have a simple dsp process to test each signal processing call. At the moment this should output exactly what it receives, i.e. just the audio input, as the demodulator needs the FIR filter and some magic applied at the point indicated in the code. When I have written the demodulator I can test firwin() properly. At the moment I can see it outputs valid data, and it's fast which is good. The code below is the very simple path. I have had to write two functions for polar/cartesian conversions as I didn't find any in the library. I don't know if these work right as I can hear clicking sounds on the audio that are not there when they are commented out. However, it could be audio overrun as I am up at 75% processor. The functions are also quite slow (adding 30% to my processor loading). Is there a more efficient way to do this?
def sigproc( self, samples ) :
    # convert to a numarray array of type double
    x = asarray(samples, typecode=Float64)
    # first sort into real and imaginary arrays from the interleaved array
    realin = x[1::2] # every other element starting at index 1
    imagin = x[::2]  # every other element starting at index 0
    # convert to frequency domain
    realfd = fft(realin)
    imagfd = fft(imagin)
    # cartesian to polar in frequency domain
    magn,phase = self.cartesian_to_polar(realfd, imagfd)
    # apply filter and demodulate
    # polar to cartesian in frequency domain
    realfd, imagfd = self.polar_to_cartesian(magn, phase)
    # convert to time domain
    realtd = ifft(realfd)
    imagtd = ifft(imagfd)
    # convert back to interleaved array
    x[::2] = realtd.astype(Float64)
    x[1::2] = imagtd.astype(Float64)
    # convert back to an integer array
    x = x.astype('l')
    # return python list of processed samples
    return x.tolist()

# utility functions not in the library
def polar_to_cartesian(self, magn, phase):
    return magn * cos(phase), magn * sin(phase)

def cartesian_to_polar(self, real, imag):
    # convert the arrays real, imag into the polar system
    # magnitude array
    magn = sqrt(real**2 + imag**2)
    # phase array
    phase = choose(equal(real,0),(arctan(imag/real),(choose(less(imag,0),((pi/2),-(pi/2))))))
    return magn, phase

-----Original Message----- From: Travis Oliphant [mailto:oliphant at ee.byu.edu] Sent: 26 February 2004 07:10 To: SciPy Users List Subject: Re: [SciPy-user] FIR Filters > I am trying to get a windowed FIR filter to use in fast convolution > filtering. > > I have tried using : > signal.remez(2048,[300,3000],[1],Hz=44100) > to generate a 2048 tap filter, which is what I have used previously in > Intel IPP. Firstly, this gives me an array full of QNAN errors. In fact > the only time I don't get errors is by using a very small number of > taps, say 10. Various other values of taps fail to converge or give > QNAN errors. With the 2048 taps it is also very, very slow (about 75 > seconds on a 1.2GHz).
The IPP implementation is sub-second. I don't > know if it's throwing an exception on every QNAN or quite what is > happening. > > I am also confused by not being able to specify a window type and I > really need lowpass not bandpass. I can see there is a function > get_window() but I don't know how this relates to an FIR filter when > the only other parameter is fftbins. Do I do something like tell it > the number of fftbins I want the filter to cover (I have 2048 bins of > around 10Hz each) then shift it to the correct part of the spectrum? Bob, Here is a simple windowed ideal lowpass filter algorithm. The two important parameters are the number of taps and the cutoff frequency normalized so that 1 corresponds to pi radians/sample (i.e. the Nyquist frequency). If you specify width it will choose a nice Kaiser window for you to try and reach that width of transition region (from passband to stopband). Otherwise, you can specify the window type (see signal.get_window for types). This is now in SciPy CVS. def firwin(N, cutoff, width=None, window='hamming'): """FIR Filter Design using windowed ideal filter method. Inputs: N -- order of filter (number of taps) cutoff -- cutoff frequency of filter (normalized so that 1 corresponds to Nyquist or pi radians / sample) width -- if width is not None, then assume it is the approximate width of the transition region (normalized so that 1 corresponds to pi) for use in kaiser FIR filter design. window -- desired window to use. Outputs: h -- coefficients of length N fir filter.
""" from signaltools import get_window if isinstance(width,float): A = 2.285*N*width + 8 if (A < 21): beta = 0.0 elif (A <= 50): beta = 0.5842*(A-21)**0.4 + 0.07886*(A-21) else: beta = 0.1102*(A-8.7) window=('kaiser',beta) win = get_window(window,N,fftbins=1) alpha = N//2 m = Num.arange(0,N) return win*special.sinc(cutoff*(m-alpha)) _______________________________________________ SciPy-user mailing list SciPy-user at scipy.net http://www.scipy.net/mailman/listinfo/scipy-user *** Confidentiality Notice *** Proprietary/Confidential Information belonging to CGI Group Inc. and its affiliates may be contained in this message. If you are not a recipient indicated or intended in this message (or responsible for delivery of this message to such person), or you think for any reason that this message may have been addressed to you in error, you may not use or copy or deliver this message to anyone else. In such case, you should destroy this message and are asked to notify the sender by reply email. -------------- next part -------------- An HTML attachment was scrubbed... URL: From magnew at math.uio.no Thu Feb 26 09:12:20 2004 From: magnew at math.uio.no (Magne Westlie) Date: Thu, 26 Feb 2004 15:12:20 +0100 (CET) Subject: [SciPy-user] weave, including headers in generated file In-Reply-To: References: Message-ID: Hello. I'm having a problem with using weave which I was wondering if someone could help me out with. I've got a class Hello declared in a file hw.h, but how do I tell weave to include this file in the generated cpp-file? If I try: code = """ Hello h; h.hw(); """ weave.inline(code, [], compiler='gcc', verbose=2, force=1, sources=['/home/mw/weave/inline-ex/hw.cpp'], include_dirs=['/home/mw/weave/inline-ex/'], ) Compiling fails because Hello is not defined: /home/mw/.python23_compiled/sc_dbdc99b5e1f621cd206ee575625e18f15.cpp:644: error: ` Hello' undeclared (first use this function) If I open the cpp file manually and puts: #include in it, everything works. 
Magne Westlie From barrett at stsci.edu Thu Feb 26 09:15:58 2004 From: barrett at stsci.edu (Paul Barrett) Date: Thu, 26 Feb 2004 09:15:58 -0500 Subject: [SciPy-user] Re: data smoothing: interpolate.splrep ignores s parameter In-Reply-To: <20040225142126.096ed95e.gry@ll.mit.edu> References: <20040225142126.096ed95e.gry@ll.mit.edu> Message-ID: <403DFF9E.20403@stsci.edu> george young wrote: > [[reposting with "rough" data file appended, sorry]] > [SciPy-0.2.0_alpha_200.4161, Numeric-23.1, Python 2.3.3, x86 Linux] > My goal really is to smooth some noisy measurement data without messing > up it's *shape*. My first attempt was 1d splines. I did: > > from Numeric import * > import scipy > f = file('rough') > data = array([(float(x),float(y)) for (x,y) in [l.split() for l in f]]) > rep = scipy.interpolate.splrep(data[:,0], data[:,1], s=s) > smooth_y = scipy.interpolate.splev(data[:,0], rep) > > "s" is supposed to vary the amount of smoothing. For s=0, i get > my original data back, as expected. But for all other values, including > the recommended range of (m-sqrt(2.0*m)) to (m+sqrt(2.0*m), I get a > single, too much smoothed, result. It seems like splrep is not using > the "s" value to adjust the splines, except to sense that it's not zero. > > Splines may not be the right method anyway, since they tend to warp the > shape of the curve, and I need to get the data's derivatives. Is there > some way to fashion a low pass filter? It seems like fft should be useful > here, but I have very little experience with fft's. > [snip, snip] Have you considered wavelet smoothing or denoising? The 'a-trous' wavelet method is a simple, understandable technique that can be done by convolving a function with the data. The difference between your raw data and your smoothed (convolved) data is the wavelet at that particular spatial resolution. You can filter the wavelet values using thresholding and then add them back into the smoothed data to get your smoothed or denoised result. 
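One level of that procedure is only a few lines of code — a rough sketch (numpy is used here for illustration; the function name and the threshold value are placeholders that must be tuned to the noise level of the data):

```python
import numpy as np

B3 = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0   # the B3-spline kernel

def atrous_denoise(y, threshold):
    smooth = np.convolve(y, B3, mode='same')     # smoothed version of the data
    wavelet = y - smooth                         # the "wavelet" (detail) at this scale
    wavelet[np.abs(wavelet) < threshold] = 0.0   # discard coefficients that are just noise
    return smooth + wavelet                      # add the significant detail back
```

An isolated feature well above the threshold passes through unchanged, while fluctuations below it collapse onto the smoothed curve.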
This can be done at different spatial resolutions depending on what scale you wish to smooth. If you set the filtering threshold correctly, you'll be able to keep the significant features of the data without losing resolution as is common with averaging techniques, or adding spurious features as is common with FFT techniques. A B3-spline is a good kernel function and usually does a good job in this situation. Its 1-dimensional representation is [1, 4, 6, 4, 1]/16. -- Paul -- Paul Barrett, PhD Space Telescope Science Institute Phone: 410-338-4475 ESS/Science Software Branch FAX: 410-338-4767 Baltimore, MD 21218 From prabhu at aero.iitm.ernet.in Thu Feb 26 12:06:21 2004 From: prabhu at aero.iitm.ernet.in (Prabhu Ramachandran) Date: Thu, 26 Feb 2004 22:36:21 +0530 Subject: [SciPy-user] weave, including headers in generated file In-Reply-To: References: Message-ID: <16446.10125.224440.819230@monster.linux.in> >>>>> "MW" == Magne Westlie writes: MW> I've got a class Hello declared in a file hw.h, but how do I MW> tell weave to include this file in the generated cpp-file? If MW> I try: MW> code = """ Hello h; h.hw(); """ MW> weave.inline(code, MW> [], compiler='gcc', verbose=2, force=1, MW> sources=['/home/mw/weave/inline-ex/hw.cpp'], MW> include_dirs=['/home/mw/weave/inline-ex/'], ) Use the headers keyword and put hw.h into it. Also add the path to your header in include_dirs. cheers, prabhu From jbenson at nofs.navy.mil Thu Feb 26 12:12:13 2004 From: jbenson at nofs.navy.mil (James A. Benson) Date: Thu, 26 Feb 2004 10:12:13 -0700 (MST) Subject: [SciPy-user] scipy cvs snapshot problem (solved) In-Reply-To: Message-ID: On Thu, 26 Feb 2004, Pearu Peterson wrote: > > On Wed, 25 Feb 2004, Jim Benson wrote: > > > I downloaded a cvs snapshot on 2004-02-25 and installed it > > over a previous cvs snapshot (from 2003-12-03). The previous > > one did pass all scipy.test levels.
The recent version > > failed the first test: > > > > ====================================================================== > > FAIL: check_nils_20Feb04 (scipy.linalg.basic.test_basic.test_solve) > > ---------------------------------------------------------------------- > > > ...am i forgetting something critical? > > Yes, you missed recent thread about the corner case where linalg.solve > could return wrong results. The above test covers this case. > You should certainly update f2py to the latest if somewhere in your > applications the following conditions may be holding: Thanks! I updated to F2Py-2.39.235_1644, reinstalled the scipy cvs snapshot, and it now passed all the tests. Thanks, Jim From gry at ll.mit.edu Thu Feb 26 13:20:36 2004 From: gry at ll.mit.edu (george young) Date: Thu, 26 Feb 2004 13:20:36 -0500 Subject: [SciPy-user] Re: data smoothing: interpolate.splrep ignores s parameter In-Reply-To: <403DFF9E.20403@stsci.edu> References: <20040225142126.096ed95e.gry@ll.mit.edu> <403DFF9E.20403@stsci.edu> Message-ID: <20040226132036.436b4c05.gry@ll.mit.edu> On Thu, 26 Feb 2004 09:15:58 -0500 Paul Barrett threw this fish to the penguins: > george young wrote: > > [[reposting with "rough" data file appended, sorry]] > > [SciPy-0.2.0_alpha_200.4161, Numeric-23.1, Python 2.3.3, x86 Linux] > > My goal really is to smooth some noisy measurement data without messing > > up it's *shape*. My first attempt was 1d splines. I did: > > ... > > Splines may not be the right method anyway, since they tend to warp the > > shape of the curve, and I need to get the data's derivatives. Is there > > some way to fashion a low pass filter? It seems like fft should be useful > > here, but I have very little experience with fft's. > > > > [snip, snip] > > Have you considered wavelet smoothing or denoising? > > The 'a-trous' wavelet method is a simple, understandable technique that can be > done by convolving a function with the data. 
The difference between your raw > data and your smoothed (convolved) data is the wavelet at that particular > spatial resolution. You can filter the wavelet values using thresholding and > then add them back into the smoothed data to get your smoothed or denoised > result. This can be done at different spatial resolutions depending on what > scale you wish to smooth. > > If you set the filtering threshold correctly, you'll be able to keep the > significant features of the data without losing resolution as is common with > averaging techniques, or adding spurious features as is common with FFT > techniques. A B3-spline is a good kernel function and usually does a good job in > this situation. Its 1-dimensional representation is [1, 4, 6, 4, 1]/16. That works very nicely, thanks! It decreases the noise locally without warping the overall shape. I just did:

ytmp = zeros(len(data[:, 1]), Float)
xmin=0; xmax=len(data)-1
fn= [[-2,1], [-1,4], [0,6], [1,4], [2,1]]
for i in range(len(data)):
    y = w = 0.0
    for f in fn:
        if xmin <= (i + f[0]) <= xmax:
            y += f[1] * data[i + f[0], 1]
            w += f[1]
    ytmp[i] = y / w
data[:, 1] = ytmp

I know nothing of wavelets or dsp. Is there something I could read in a few hours to understand a little of "a-trous" and "kernels"? Is it safe to play with the numbers, e.g. [1,5,8,5,1]/20 or is there subtlety that will bite me? I assume there's functionality in scipy for doing this instead of hard coding it? I noticed a convolve function, but not much documentation. Thanks very much again, George Young -- "Are the gods not just?" "Oh no, child. What would become of us if they were?"
(CSL)

From oliphant at ee.byu.edu Thu Feb 26 17:38:17 2004
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Thu, 26 Feb 2004 15:38:17 -0700
Subject: [SciPy-user] FIR Filters
In-Reply-To: <9B40DB5FBF7D2048AD79C2A91F862A51772F68@STEVAPPS05.Stevenage.CGI-Europe.com>
References: <9B40DB5FBF7D2048AD79C2A91F862A51772F68@STEVAPPS05.Stevenage.CGI-Europe.com>
Message-ID: <403E7559.6000309@ee.byu.edu>

Bob.Cowdery at CGI-Europe.com wrote:

> Travis
>
> Thanks very much for the responses. Don't confuse me with someone who
> knows what they are doing (but I am working on it!). To give you some
> background I am converting a .NET application that uses Intel IPP to
> Python. The application is for a software defined radio. At the moment
> I have audio capture which contains my quadrature samples in blocks of
> 4096 (2048 each for I&Q) with 44100 sampling rate (46ms per block). I
> have a simple dsp process to test each signal processing call. At the
> moment this should output exactly what it receives, i.e just the audio
> input as the demodulator needs the FIR filter and some magic applied
> at the point indicated in the code. When I have written the
> demodulator I can test firwin() properly. At the moment I can see it
> outputs valid data, and it's fast which is good. The code below is the
> very simple path. I have had to write two functions for
> polar/cartesian conversions as I didn't find any in the library. I
> don't know if these work right as I can hear clicking sounds on the
> audio not there when they are commented out. However, it could be
> audio overrun as I am up at 75% processor. The functions are also
> quite slow (adding 30% to my processor loading). Is there a more
> efficient way to do this?

All the best in your education...

First, are you using numarray? If so then that might cause a slower
application than using Numeric (an older but often faster array module).

Comments on your code below:

>     def sigproc( self, samples ) :
>
>         # convert to a numarray array of type double
>         x = asarray(samples, typecode=Float64)
>

is this numarray.asarray or Numeric.asarray?

>         # first to do is sort into real and image arrays from interleaved array
>         realin = x[1::2]  # every other element starting at index 1
>         imagin = x[::2]   # every other element starting at index 0
>
>         # convert to frequency domain
>         realfd = fft(realin)
>         imagfd = fft(imagin)
>

Normally, real and imaginary parts of the data are handled together when a
complex data type is available. I would do

    in = x[1::2] + 1j*x[::2]
    fd = fft(in)

>
>         # cartesian to polar in frequency domain
>         magn,phase = self.cartesian_to_polar(realfd, imagfd)
>

This seems strange to me: why are you filtering the magnitude and phase
separately? This is very non-standard.

>
>         # apply filter and demodulate
>

I doubt you want a 2048 tap filter at this point. Probably something a lot
less. What kind of modulation are you demodulating? Are you doing
frequency-domain filtering then?

>
>         # polar to cartesian in frequency domain
>         realfd, imagfd = self.polar_to_cartesian(magn, phase)
>
>         # convert to time domain
>         realtd = ifft(realfd)
>         imagtd = ifft(imagfd)
>
>         # convert back to interleaved array
>         x[::2] = realtd.astype(Float64)
>         x[1::2] = imagtd.astype(Float64)
>
>         # convert back to an integer array
>         x = x.astype('l')
>
>         # return python list of processed samples
>         return x.tolist()
>

Why are you going back to a python list? A numeric array is a valid
sequence. Why does it need to be a list? You have a lot of memory copying
going on which is going to slow down everything.

>
>     # utility functions not in the library
>     def polar_to_cartesian(self, magn, phase):
>         return magn * cos(phase), magn * sin(phase)
>
>     def cartesian_to_polar(self, real, imag):
>         # convert the arrays real, imag into the polar system
>         # magnitude array
>         magn = sqrt(real**2 + imag**2)
>         # phase array
>         phase = choose(equal(real,0),
>                        (arctan(imag/real),
>                         (choose(less(imag,0),((pi/2),-(pi/2))))))
>         return magn, phase
>

If you have a complex-valued array, then abs(x) is the magnitude and
angle(x) is the phase.

My colleagues down the hall do software radio and I have a lot of
background in signal processing. Let me know if you have more specific
questions.

Thanks,

-Travis O.

From paxcalpt at sapo.pt Thu Feb 26 18:31:15 2004
From: paxcalpt at sapo.pt (Ricardo Henriques)
Date: Thu, 26 Feb 2004 23:31:15 -0000
Subject: [SciPy-user] Boa Constructor Plug-in for ploting
Message-ID: <003501c3fcc0$a28fa870$5181c151@xcal>

Is there a Boa Constructor plug-in using a scipy based plot interface like
Chaco or scipy.plt out there?
I have been using wxPyPlot, which is based on scipy.plot but lacks many of
its features...

Tks...

Paxcal

From pajer at iname.com Thu Feb 26 23:11:30 2004
From: pajer at iname.com (Gary Pajer)
Date: Thu, 26 Feb 2004 23:11:30 -0500
Subject: [SciPy-user] Boa Constructor Plug-in for ploting
References: <003501c3fcc0$a28fa870$5181c151@xcal>
Message-ID: <001901c3fce7$c741d090$01fd5644@playroom>

----- Original Message -----
From: "Ricardo Henriques"

> Is there a Boa Constructor plug-in using a scipy based plot interface like
> Chaco or scipy.plt out there?
> I have been using wxPyPlot witch is based on scipy.plot but laks many of its
> features...

I'll second the question, and add something: I've been experimenting with
BLT/Tkinter, and I'm growing fond of it. Lots of features, including a
stripchart widget that works for me nicely.
-gary

From arnd.baecker at web.de Fri Feb 27 05:12:09 2004
From: arnd.baecker at web.de (Arnd Baecker)
Date: Fri, 27 Feb 2004 11:12:09 +0100 (CET)
Subject: [SciPy-user] help and lazy-import (?)
Message-ID:

Hi,

there seems to be something weird going on when trying to access help in
the following way

>>> import scipy
>>> help("scipy.linalg")
Help on instance of _ModuleLoader in scipy:

scipy.linalg = 

Doing another

>>> help("scipy.linalg")

one gets the expected help.

I'd guess that this somehow has to do with the lazy importer of scipy.
But maybe there is a workaround?

Many thanks in advance,

Arnd

From pearu at scipy.org Fri Feb 27 06:08:01 2004
From: pearu at scipy.org (Pearu Peterson)
Date: Fri, 27 Feb 2004 05:08:01 -0600 (CST)
Subject: [SciPy-user] help and lazy-import (?)
In-Reply-To:
Message-ID:

On Fri, 27 Feb 2004, Arnd Baecker wrote:

> there seems to be something weird
> going on when trying to access help in the following way
>
> >>> import scipy
> >>> help("scipy.linalg")
> Help on instance of _ModuleLoader in scipy:
>
> scipy.linalg = 
> e-packages/scipy/linalg/__init__.pyc' [imported]>
>
> Doing another
> >>> help("scipy.linalg")
>
> one gets the expected help.
>
> I'd guess that this somehow has to do with the lazy importer of scipy.
> But maybe there is a workaround?

I get similar behaviour when using ipython. But in a traditional Python
prompt

>>> import scipy
>>> help("scipy.linalg")

I get the expected help. This is because ppimport redefines built-in help:

>>> help
Type help() for interactive help, or help(object) for help about object.
>>> help.__module__
'scipy_base.ppimport'

So, for some reason, in my case when using ipython, the redefined help is
not effective, probably because help is referenced in ipython before
importing scipy. It seems that you are not using ipython, so, what are the
contents of your PYTHONSTARTUP?

Pearu

From arnd.baecker at web.de Fri Feb 27 06:24:51 2004
From: arnd.baecker at web.de (Arnd Baecker)
Date: Fri, 27 Feb 2004 12:24:51 +0100 (CET)
Subject: [SciPy-user] help and lazy-import (?)
In-Reply-To:
References:
Message-ID:

On Fri, 27 Feb 2004, Pearu Peterson wrote:

> On Fri, 27 Feb 2004, Arnd Baecker wrote:
>
> > there seems to be something weird
> > going on when trying to access help in the following way
> >
> > >>> import scipy
> > >>> help("scipy.linalg")
> > Help on instance of _ModuleLoader in scipy:
> >
> > scipy.linalg = 
> > e-packages/scipy/linalg/__init__.pyc' [imported]>
> >
> > Doing another
> > >>> help("scipy.linalg")
> >
> > one gets the expected help.
> >
> > I'd guess that this somehow has to do with the lazy importer of scipy.
> > But maybe there is a workaround?
>
> I get similar behaviour when using ipython.

Well, that's where I first saw this as well (honestly, I always use
ipython ;-) As I also see this with the normal python prompt I thought it
is unrelated to ipython.

> But in a traditional Python prompt
>
> >>> import scipy
> >>> help("scipy.linalg")
>
> I get the expected help. This is because ppimport redefines built-in help:
>
> >>> help
> Type help() for interactive help, or help(object) for help about object.
> >>> help.__module__
> 'scipy_base.ppimport'

I get precisely the same output ((to be clear here: apart from that
help("scipy.linalg") gives the above _ModuleLoader message on the first
invocation, and only the second time I get the expected help.))

If this matters:
- scipy.__version__: '0.2.1_253.3701'
- Python: 2.3.3

> So, for some reason, in my case when using ipython, the redefined help is
> not effective, probably because help is referenced in ipython before
> importing scipy.

With IPython 0.5.1.cvs I get the same as described above, however with

In [4]: help.__module__
Out[4]: 'site'

> It seems that you are not using ipython, so, what are the
> contents of your PYTHONSTARTUP?

no contents, I don't use it.

What further tests could I do to solve this issue? (maybe we need further
tests by other users to find the origin?)

Many thanks,

Arnd

P.S.: just another "screen shot":

Python 2.3.3 (#1, Feb 3 2004, 22:05:11)
[GCC 3.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> help.__module__
'site'
>>> import scipy
>>> help.__module__
'scipy_base.ppimport'
>>> help("scipy.linalg")
Help on instance of _ModuleLoader in scipy:

scipy.linalg = 

>>> help("scipy.linalg")
Help on package scipy.linalg in scipy:

NAME
    scipy.linalg

FILE
[..]

From Bob.Cowdery at CGI-Europe.com Fri Feb 27 09:10:39 2004
From: Bob.Cowdery at CGI-Europe.com (Bob.Cowdery at CGI-Europe.com)
Date: Fri, 27 Feb 2004 14:10:39 -0000
Subject: [SciPy-user] FIR Filters
Message-ID: <9B40DB5FBF7D2048AD79C2A91F862A51772F6B@STEVAPPS05.Stevenage.CGI-Europe.com>

Travis

With your help I am making progress. I started off using numarray, then
ran into problems, so I am now using Numeric. I only import scipy so I am
using whatever it imports, which I believe is Numeric. I have tried to
follow your advice and move to complex types as I can see it makes sense.
I don't think I have done a complete job yet. The audio output has a pulse
type modulation on it which should not be there; however, selecting
different passbands is definitely having the desired effect. I have copied
all the code in below for context. Yes, the filtering is in the frequency
domain, which I am told gives better stopband attenuation and good shape.
The way the filter and demodulation works is still a bit of a mystery to
me as I didn't write the original code. I would love to work out exactly
what it does but I'm not quite there yet. I can't really ask very targeted
questions yet until I understand more. The code clearly still has problems
but I don't know why or where. The good news is the processor is down to
50% now from 80%. If you do have a few moments perhaps you could point out
any more obvious errors.
Thanks
Bob

import time
import threading
import fastaudio as f
from scipy import *

class dsp( threading.Thread ):

    # signal processing
    def sig( self, samples ) :
        # allocate the filter arrays
        global m_filterMagn
        global m_filterPhase
        # convert to a numeric array of type double
        x = asarray(samples, typecode=Float64)
        # get a complex array ordered from the I,Q source
        complexin = x[1::2] + 1j*x[::2]
        # convert to complex frequency domain
        cart_complex_in_fd = fft(complexin)
        # apply filter to magnitude and phase
        magn = abs(complexin)*m_filterMagn
        phase = angle(complexin)*m_filterPhase
        # polar to cartesian in frequency domain
        demodulated = magn * cos(phase) + 1j*(magn * sin(phase))
        # convert to time domain
        complexout = ifft(demodulated)
        # convert back to interleaved array
        x[1::2] = real(complexout)
        x[::2] = real(complexout)
        # convert back to an integer array
        x = x.astype('l')
        # return sequence of processed samples
        return x

    def firwin(self, N, cutoff, width=None, window='blackman'):
        """
        FIR Filter Design using windowed ideal filter method.

        Inputs:
            N      -- order of filter (number of taps)
            cutoff -- cutoff frequency of filter (normalized so that 1
                      corresponds to Nyquist or pi radians / sample)
            width  -- if width is not None, then assume it is the
                      approximate width of the transition region
                      (normalized so that 1 corresponds to pi) for use
                      in kaiser FIR filter design.
            window -- desired window to use.

        Outputs:
            h -- coefficients of length N fir filter.
        """
        if isinstance(width,float):
            A = 2.285*N*width + 8
            if (A < 21):
                beta = 0.0
            elif (A <= 50):
                beta = 0.5842*(A-21)**0.4 + 0.07886*(A-21)
            else:
                beta = 0.1102*(A-8.7)
            window = ('kaiser',beta)
        win = signal.get_window(window,N,fftbins=1)
        alpha = N//2
        m = signal.Num.arange(0,N)
        return win*special.sinc(cutoff*(m-alpha))

    # calculate the complex filter
    def filter_calc( self, fLow, fHigh ):
        global m_filterMagn
        global m_filterPhase
        m_filterMagn = zeros(2048)
        m_filterPhase = zeros(2048)
        # create empty arrays to hold results
        Rh = zeros(2048)
        Ih = zeros(2048)
        # calculate the high/low cutoff freq as a fraction of the sample rate
        fhc = fHigh/44100.0
        flc = fLow/44100.0
        # calculate fft bin size
        bin_size = 44100.0/4096.0
        # calculate no of bins in filter width
        bins = fHigh/bin_size
        # calculate FIR lowpass filter coefficients
        Rh = self.firwin(2048.0,(fhc-flc)/2.0)
        # compute USB filter by modulating the lowpass filter up
        filterPhase = 0.0
        filterFreq = (flc+fhc)*pi
        phaseSin = 0
        phaseCos = 0
        # modulate to USB centre frequency
        i = 0
        while i < 2048 :
            phaseSin = sin(filterPhase)
            phaseCos = cos(filterPhase)
            Ih[i] = Rh[i]*phaseSin
            Rh[i] = Rh[i]*phaseCos
            filterPhase = filterPhase+filterFreq
            if filterPhase >= 6.28318530717959 :
                filterPhase = filterPhase - 6.28318530717959
            i = i+1
        # for LSB create conjugate of USB filter
        # comment out for LSB
        Ih = -Ih
        # compute complex frequency domain response of LSB or USB filter
        complexin = Rh + 1j*Ih
        complexfd = fft(complexin)
        # Cartesian to polar gives the magnitude and phase components
        # of the fft result
        m_filterMagn = abs(complexfd)
        m_filterPhase = angle(complexfd)

    # start streaming
    def run( self ):
        global running
        running = 1
        #s = f.stream(7500,    # sample rate
        #             2,       # number of channels
        #             'int16', # sample format, choose from 'int8', 'int16', 'int32'.
        #             4096,    # frames per buffer
        #             16)      # maximum number of input buffers
        # open stream for input/output
        print "Create, open and start stream"
        global s
        s = f.stream(44100,2,'int16',2048,4)
        # open and start streaming
        s.open()
        s.start()
        print "Thread running..."
        # calculate the SSB filter
        self.filter_calc(300.0,3000.0)
        while( running ):
            # read next block of data from the input buffer
            samples = s.read()
            if( len( samples ) > 0 ):
                demod = self.sig( samples )
                # now, write the data back
                s.write( demod )
            # sleep for 10 ms
            # a stereo block of 4096 samples takes 46 ms so 10ms should
            # avoid dropped blocks
            time.sleep(0.01)
        print "Thread exiting..."

def start():
    global DSP
    DSP = dsp()
    DSP.start()

def stop():
    # stop the stream
    global running
    running = 0
    global s
    s.stop()
    # and close it
    s.close()

def filter(id):
    # set the filter passband
    global DSP
    if id == 0:
        DSP.filter_calc(300.0,3000.0)
    elif id == 1:
        DSP.filter_calc(300.0,2000.0)
    elif id == 2:
        DSP.filter_calc(500.0,700.0)

-----Original Message-----
From: Travis Oliphant [mailto:oliphant at ee.byu.edu]
Sent: 26 February 2004 22:38
To: SciPy Users List
Subject: Re: [SciPy-user] FIR Filters

Bob.Cowdery at CGI-Europe.com wrote:

> Travis
>
> Thanks very much for the responses. Don't confuse me with someone who
> knows what they are doing (but I am working on it!). To give you some
> background I am converting a .NET application that uses Intel IPP to
> Python. The application is for a software defined radio. At the moment
> I have audio capture which contains my quadrature samples in blocks of
> 4096 (2048 each for I&Q) with 44100 sampling rate (46ms per block). I
> have a simple dsp process to test each signal processing call. At the
> moment this should output exactly what it receives, i.e just the audio
> input as the demodulator needs the FIR filter and some magic applied
> at the point indicated in the code. When I have written the
> demodulator I can test firwin() properly.
> At the moment I can see it
> outputs valid data, and it's fast which is good. The code below is the
> very simple path. I have had to write two functions for
> polar/cartesian conversions as I didn't find any in the library. I
> don't know if these work right as I can hear clicking sounds on the
> audio not there when they are commented out. However, it could be
> audio overrun as I am up at 75% processor. The functions are also
> quite slow (adding 30% to my processor loading). Is there a more
> efficient way to do this?

All the best in your education...

First, Are you using numarray? if so then that might cause a slower
application than using Numeric (older but often faster arraymodule)

Comments on your code below:

>     def sigproc( self, samples ) :
>
>         # convert to a numarray array of type double
>         x = asarray(samples, typecode=Float64)
>

is this numarray.asarray or Numeric.asarray?

>         # first to do is sort into real and image arrays from interleaved array
>         realin = x[1::2]  # every other element starting at index 1
>         imagin = x[::2]   # every other element starting at index 0
>
>         # convert to frequency domain
>         realfd = fft(realin)
>         imagfd = fft(imagin)
>

Normally, real and imaginary parts of the data are handled together when a
complex data type is available. I would do

    in = x[1::2] + 1j*x[::2]
    fd = fft(in)

>
>         # cartesian to polar in frequency domain
>         magn,phase = self.cartesian_to_polar(realfd, imagfd)
>

This seems strange to me, why are you filtering the magnitude and phase
separately? This is very non-standard.

>
>         # apply filter and demodulate
>

I doubt you want a 2048 tap filter at this point. Probably something a lot
less. What kind of modulation are you demodulating? Are you doing
frequency-domain filtering then?

>
>         # polar to cartesian in frequency domain
>         realfd, imagfd = self.polar_to_cartesian(magn, phase)
>
>         # convert to time domain
>         realtd = ifft(realfd)
>         imagtd = ifft(imagfd)
>
>         # convert back to interleaved array
>         x[::2] = realtd.astype(Float64)
>         x[1::2] = imagtd.astype(Float64)
>
>         # convert back to an integer array
>         x = x.astype('l')
>
>         # return python list of processed samples
>         return x.tolist()
>

Why are you going back to a python list? A numeric array is a valid
sequence. Why does it need to be a list. You have a lot of memory copying
going on which is going to slow down everything.

>
>     # utility functions not in the library
>     def polar_to_cartesian(self, magn, phase):
>         return magn * cos(phase), magn * sin(phase)
>
>     def cartesian_to_polar(self, real, imag):
>         # convert the arrays real, imag into the polar system
>         # magnitude array
>         magn = sqrt(real**2 + imag**2)
>         # phase array
>         phase = choose(equal(real,0),
>                        (arctan(imag/real),
>                         (choose(less(imag,0),((pi/2),-(pi/2))))))
>         return magn, phase
>

If you have a complex-valued array, then abs(x) is the magnitude and
angle(x) is the phase.

My colleagues down the hall do software radio and I have a lot of
background in signal processing. Let me know if you have more specific
questions.

Thanks,

-Travis O.

_______________________________________________
SciPy-user mailing list
SciPy-user at scipy.net
http://www.scipy.net/mailman/listinfo/scipy-user

*** Confidentiality Notice *** Proprietary/Confidential Information
belonging to CGI Group Inc. and its affiliates may be contained in this
message. If you are not a recipient indicated or intended in this message
(or responsible for delivery of this message to such person), or you think
for any reason that this message may have been addressed to you in error,
you may not use or copy or deliver this message to anyone else. In such
case, you should destroy this message and are asked to notify the sender
by reply email.
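[Editorial aside on the abs()/angle() point quoted above: the hand-written
cartesian_to_polar is just the polar decomposition of a complex array, and
the equivalence is easy to check numerically. A sketch with modern NumPy
rather than the 2004-era Numeric of the thread; note it uses arctan2 for the
manual phase, which handles the real==0 and quadrant cases that the quoted
choose(...) expression patches up by hand:]

```python
import numpy as np

rng = np.random.default_rng(0)
real = rng.standard_normal(1024)
imag = rng.standard_normal(1024)
z = real + 1j * imag

# hand-rolled cartesian -> polar, as in the code quoted above
magn = np.sqrt(real**2 + imag**2)
phase = np.arctan2(imag, real)  # arctan2 covers all four quadrants

# the library equivalents Travis recommends
assert np.allclose(magn, np.abs(z))
assert np.allclose(phase, np.angle(z))

# and polar -> cartesian is just magn * exp(1j * phase)
assert np.allclose(z, magn * np.exp(1j * phase))
print("abs/angle match the manual conversion")
```

Besides being shorter, abs() and angle() run as single vectorized loops in C,
which is why they beat the sqrt/choose construction on large blocks.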
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From Fernando.Perez at colorado.edu Fri Feb 27 12:21:31 2004
From: Fernando.Perez at colorado.edu (Fernando Perez)
Date: Fri, 27 Feb 2004 10:21:31 -0700
Subject: [SciPy-user] help and lazy-import (?)
In-Reply-To:
References:
Message-ID: <403F7C9B.4030909@colorado.edu>

Pearu Peterson wrote:

> So, for some reasons, in my case when using ipython, redefined help is
> not effective and probably because help is referenced in ipython before
> importing scipy. It seems that you are not using ipython, so, what is the
> contents of your PYTHONSTARTUP?

Indeed, ipython imports help upon starting. So by the time scipy tries to
modify things, the help ipython sees is already the default one.

I've noticed in general that, because of the lazy import mechanisms of
scipy, often I need to do things like hitting tab twice, at least the
first time I try them. I realize the rationale for the lazy import system,
so I don't know if this is something which can even be fixed at all. But
if there are any changes which ipython can make to improve the situation,
and someone can at least point me in the right direction, I'll gladly give
it a try.

Regards,

f.
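[Editorial aside: the _ModuleLoader behaviour discussed in this thread is
characteristic of lazy importing in general. The sketch below, in modern
Python 3, is not scipy_base.ppimport's actual code; it is a hypothetical
minimal proxy illustrating the mechanism: a placeholder sits in sys.modules,
so the first introspection (like help()) sees the loader object, and only an
attribute access swaps in the real module:]

```python
import sys
import types

class LazyModule(types.ModuleType):
    """Stand-in that loads the real module on first attribute access."""

    def __init__(self, name, factory):
        super().__init__(name)
        # store in __dict__ directly so __getattr__ is never invoked for it
        self.__dict__['_factory'] = factory

    def __getattr__(self, attr):
        # first attribute access: build the real module and replace the proxy
        real = self._factory()
        sys.modules[self.__name__] = real
        return getattr(real, attr)

def make_real():
    real = types.ModuleType('demo')
    real.answer = 42
    return real

sys.modules['demo'] = LazyModule('demo', make_real)
proxy = sys.modules['demo']

print(type(proxy).__name__)                 # LazyModule -- what help() sees first
print(proxy.answer)                         # 42: this access triggers the swap
print(type(sys.modules['demo']).__name__)   # module -- a second lookup gets the real one
```

This is exactly the "works on the second try" pattern reported above: the
first help() call inspects the proxy before any attribute access has forced
the real import, and the second call finds the genuine module.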