From pearu at scipy.org Mon Nov 1 02:11:42 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 1 Nov 2004 01:11:42 -0600 (CST) Subject: [SciPy-user] Undefined symbol "ATL_cpttrsm" In-Reply-To: References: Message-ID: On Mon, 1 Nov 2004, Lee Harr wrote: >>> but I still get ... >>> >>> ImportError: >>> /usr/local/lib/python2.3/site-packages/scipy/linalg/clapack.so: Undefined >>> symbol "clapack_sgesv" >> >> Symbol "clapack_sgesv" should be defined in atlas liblapack.{a,so}. But >> this may be different in freebsd. Try changing 'lapack' to 'alapack' or >> 'alapack_r'. >> >> Note that I am just guessing here, so, if you get more undefined symbol >> errors then try to find out yourself where these symbols are defined using >> nm, for instance, and modify library name lists accordingly. Let us know >> about the results. >> > > Does the phrase 'woot!' mean anything to you? > > def calc_info(self): > lib_dirs = self.get_lib_dirs() > info = {} > atlas_libs = self.get_libs('atlas_libs', > self._lib_names + ['atlas_r']) > lapack_libs = self.get_libs('lapack_libs',['alapack_r']) > > > did the trick... Great! Btw, what is the value of sys.platform in your system? > The other patch I needed to get scipy to build under > FreeBSD was to remove references to the arctanh_data, > arccosh_data, and arcsinh_data from > scipy_core/scipy_base/fastumath_unsigned.inc > > Should I post complete patches here, or create a FreeBSD > ports problem report? Either is fine. I can apply your patches asap provided that they do not break other platforms. Pearu From david at dwavesys.com Mon Nov 1 13:41:18 2004 From: david at dwavesys.com (David Grant) Date: Mon, 01 Nov 2004 10:41:18 -0800 Subject: [SciPy-user] Looking for zggsvd Message-ID: <4186834E.5030606@dwavesys.com> There are a few functions from lapack which I was wondering if I could run from python using scipy. zgetrf: This seems to be provided by clapack and flapack python modules. Hmm, I'm wondering why one would use flapack over clapack, from a python perspective. zgetri: ditto, in flapack and clapack zggsvd: I can't find this anywhere in scipy. Anyone know if there is an interface to this function? -- David J. Grant Scientific Officer Intellectual Property D-Wave Systems Inc. tel: 604.732.6604 fax: 604.732.6614 ************************** CONFIDENTIAL COMMUNICATION This electronic transmission, and any documents attached hereto, is confidential. The information is intended only for use by the recipient named above. If you have received this electronic message in error, please notify the sender and delete the electronic message. Any disclosure, copying, distribution, or use of the contents of information received in error is strictly prohibited. -------------- next part -------------- A non-text attachment was scrubbed... 
Name: david.vcf Type: text/x-vcard Size: 334 bytes Desc: not available URL: From birkenes at iet.ntnu.no Mon Nov 1 15:22:55 2004 From: birkenes at iet.ntnu.no (Oystein Birkenes) Date: Mon, 01 Nov 2004 21:22:55 +0100 Subject: [SciPy-user] Building SciPy Fails Message-ID: <1099340575.2677.5.camel@localhost.localdomain> The command "python setup.py build" exits with the following error message: error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC -Ibuild/temp.linux-i686-2.3/config_pygist -ILib/xplt/src/gist -ILib/xplt/src/play -ILib/xplt/src/play/unix -I/usr/local/include/python2.3 -c Lib/xplt/src/play/x11/colors.c -o build/temp.linux-i686-2.3/Lib/xplt/src/play/x11/colors.o -DGISTPATH="\"/usr/local/lib/python2.3/site-packages/scipy/xplt/gistdata\""" failed with exit status 1 I'm running Fedora Core 2 on a Dell Latitude C640 laptop. I have Python 2.3.4, Numeric version 23.1, numarray version 1.1, f2py2e version 2.43.239_1844 and scipy_distutils version 0.3.2. I've put some additional information on http://www.tele.ntnu.no/users/birkenes/scipy/ including: build.out: stdout of "python setup.py build" build.err: stderr of "python setup.py build" diagnose.out: stdout of "python -c 'from f2py2e.diagnose import run;run()'" atlas_version.out: stdout of "python -c 'import atlas_version'" system_info.out: stdout of "python scipy_core/scipy_distutils/system_info.py" Do you have any suggestions for what I can do to install SciPy? Thanking you in advance! Cheers, Oystein Birkenes From pearu at scipy.org Mon Nov 1 15:58:24 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 1 Nov 2004 14:58:24 -0600 (CST) Subject: [SciPy-user] Building SciPy Fails In-Reply-To: <1099340575.2677.5.camel@localhost.localdomain> References: <1099340575.2677.5.camel@localhost.localdomain> Message-ID: On Mon, 1 Nov 2004, Oystein Birkenes wrote: > The command "python setup.py build" exits with the following error > message: > > error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 > -Wall -Wstrict-prototypes -fPIC > -Ibuild/temp.linux-i686-2.3/config_pygist -ILib/xplt/src/gist > -ILib/xplt/src/play -ILib/xplt/src/play/unix > -I/usr/local/include/python2.3 -c Lib/xplt/src/play/x11/colors.c -o > build/temp.linux-i686-2.3/Lib/xplt/src/play/x11/colors.o > -DGISTPATH="\"/usr/local/lib/python2.3/site-packages/scipy/xplt/gistdata\""" > failed with exit status 1 > > I'm running Fedora Core 2 on a Dell Latitude C640 laptop. I have > Python 2.3.4, Numeric version 23.1, numarray version 1.1, f2py2e version > 2.43.239_1844 and scipy_distutils version 0.3.2. > > I've put some additional information on > http://www.tele.ntnu.no/users/birkenes/scipy/ including: > build.out: stdout of "python setup.py build" > build.err: stderr of "python setup.py build" > diagnose.out: stdout of "python -c 'from f2py2e.diagnose import > run;run()'" > atlas_version.out: stdout of "python -c 'import atlas_version'" > system_info.out: stdout of "python > scipy_core/scipy_distutils/system_info.py" > > Do you have any suggestions for what I can do to install SciPy? The problem is that X11 library is not found as seen in system_info.out and this is also reflected at the end of build.out. Where are libX11.a and X11/X.h located in your system? 
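A quick way to check for those files from Python, assuming the usual install locations (on most Linux distributions the header and the static library only appear once an X11/xorg development package is installed):

# hypothetical quick check; the paths below are common defaults and
# may differ on your system
import glob
for pattern in ('/usr/X11R6/lib/libX11*', '/usr/lib/libX11*',
                '/usr/X11R6/include/X11/X.h', '/usr/include/X11/X.h'):
    print pattern, '->', glob.glob(pattern)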
Pearu From birkenes at iet.ntnu.no Mon Nov 1 15:49:09 2004 From: birkenes at iet.ntnu.no (Oystein Birkenes) Date: Mon, 01 Nov 2004 21:49:09 +0100 Subject: [SciPy-user] Building SciPy Fails In-Reply-To: References: <1099340575.2677.5.camel@localhost.localdomain> Message-ID: <1099342149.2677.11.camel@localhost.localdomain> On Mon, 2004-11-01 at 21:58, Pearu Peterson wrote: > On Mon, 1 Nov 2004, Oystein Birkenes wrote: > > > The command "python setup.py build" exits with the following error > > message: > > > > error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 > > -Wall -Wstrict-prototypes -fPIC > > -Ibuild/temp.linux-i686-2.3/config_pygist -ILib/xplt/src/gist > > -ILib/xplt/src/play -ILib/xplt/src/play/unix > > -I/usr/local/include/python2.3 -c Lib/xplt/src/play/x11/colors.c -o > > build/temp.linux-i686-2.3/Lib/xplt/src/play/x11/colors.o > > -DGISTPATH="\"/usr/local/lib/python2.3/site-packages/scipy/xplt/gistdata\""" > > failed with exit status 1 > > > > I'm running Fedora Core 2 on a Dell Latitude C640 laptop. I have > > Python 2.3.4, Numeric version 23.1, numarray version 1.1, f2py2e version > > 2.43.239_1844 and scipy_distutils version 0.3.2. > > > > I've put some additional information on > > http://www.tele.ntnu.no/users/birkenes/scipy/ including: > > build.out: stdout of "python setup.py build" > > build.err: stderr of "python setup.py build" > > diagnose.out: stdout of "python -c 'from f2py2e.diagnose import > > run;run()'" > > atlas_version.out: stdout of "python -c 'import atlas_version'" > > system_info.out: stdout of "python > > scipy_core/scipy_distutils/system_info.py" > > > > Do you have any suggestions for what I can do to install SciPy? > > The problem is that X11 library is not found as seen in system_info.out > and this is also reflected at the end of build.out. > > Where are libX11.a and X11/X.h located in your system? > > Pearu > I can't find the files you asked for, but I have a file called /usr/X11R6/lib/libX11.so.6.2 Oystein From oliphant at ee.byu.edu Mon Nov 1 15:56:44 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 01 Nov 2004 13:56:44 -0700 Subject: [SciPy-user] Building SciPy Fails In-Reply-To: <1099342149.2677.11.camel@localhost.localdomain> References: <1099340575.2677.5.camel@localhost.localdomain> <1099342149.2677.11.camel@localhost.localdomain> Message-ID: <4186A30C.4020208@ee.byu.edu> Oystein Birkenes wrote: >On Mon, 2004-11-01 at 21:58, Pearu Peterson wrote: > > >>On Mon, 1 Nov 2004, Oystein Birkenes wrote: >> >> >> >>>The command "python setup.py build" exits with the following error >>>message: >>> >>>error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 >>>-Wall -Wstrict-prototypes -fPIC >>>-Ibuild/temp.linux-i686-2.3/config_pygist -ILib/xplt/src/gist >>>-ILib/xplt/src/play -ILib/xplt/src/play/unix >>>-I/usr/local/include/python2.3 -c Lib/xplt/src/play/x11/colors.c -o >>>build/temp.linux-i686-2.3/Lib/xplt/src/play/x11/colors.o >>>-DGISTPATH="\"/usr/local/lib/python2.3/site-packages/scipy/xplt/gistdata\""" >>>failed with exit status 1 >>> >>>I'm running Fedora Core 2 on a Dell Latitude C640 laptop. I have >>>Python 2.3.4, Numeric version 23.1, numarray version 1.1, f2py2e version >>>2.43.239_1844 and scipy_distutils version 0.3.2. 
>>> >>>I've put some additional information on >>>http://www.tele.ntnu.no/users/birkenes/scipy/ including: >>>build.out: stdout of "python setup.py build" >>>build.err: stderr of "python setup.py build" >>>diagnose.out: stdout of "python -c 'from f2py2e.diagnose import >>>run;run()'" >>>atlas_version.out: stdout of "python -c 'import atlas_version'" >>>system_info.out: stdout of "python >>>scipy_core/scipy_distutils/system_info.py" >>> >>>Do you have any suggestions for what I can do to install SciPy? >>> >>> >>The problem is that X11 library is not found as seen in system_info.out >>and this is also reflected at the end of build.out. >> >>Where are libX11.a and X11/X.h located in your system? >> >>Pearu >> >> >> >I can't find the files you asked for, but I have a file called >/usr/X11R6/lib/libX11.so.6.2 > > You probably need the libX11-devel rpm or something equivalent. -Travis From missive at hotmail.com Mon Nov 1 17:32:44 2004 From: missive at hotmail.com (Lee Harr) Date: Tue, 02 Nov 2004 03:02:44 +0430 Subject: [SciPy-user] Undefined symbol "ATL_cpttrsm" Message-ID: >> self._lib_names + ['atlas_r']) >> lapack_libs = self.get_libs('lapack_libs',['alapack_r']) >> >> >>did the trick... > >Great! >Btw, what is the value of sys.platform in your system? > >>The other patch I needed to get scipy to build under >>FreeBSD was to remove references to the arctanh_data, >>arccosh_data, and arcsinh_data from >>scipy_core/scipy_base/fastumath_unsigned.inc >> >>Should I post complete patches here, or create a FreeBSD >>ports problem report? > >Either is fine. I can apply your patches asap provided that they do not >break other platforms. > Well... I have a feeling my patches will wreak havoc for other people. I am not so sure how to do this for just FreeBSD-4 without causing other trouble... >>>import sys >>>sys.platform 'freebsd4' I created a FreeBSD problem report with this patch attached: begin 644 py-scipy.patch [uuencoded patch attachment; the encoded contents are garbled in the archive and are omitted here] end At least that way folks will be able to build it, and maybe someone will come along who can figure out how to really fix the problem. From david.grant at telus.net Mon Nov 1 17:51:05 2004 From: david.grant at telus.net (David Grant) Date: Mon, 01 Nov 2004 14:51:05 -0800 Subject: [SciPy-user] FYI: C++ Extensions for Python Message-ID: <4186BDD9.5020606@telus.net> Just thought I'd share my experiences concerning looking for a way to make python interfaces for C++ code (see my previous thread about this): pycxx - Looks like it requires one to modify all the C++ source, adding PyObject pointers everywhere. I stayed away from this. I don't want to modify the code if I have to (why should I have to? the reason I'm making an interface is so that I can use the C++ code, and not touch it, thus avoiding creating bugs in the C++ code. If I WANTED to touch the C++ code I'd just re-write in python). boost.python - Horrible documentation and horrible setup. Nothing good to say about this. I spent more time on this than any other method and made the least progress. weave - was going to write a python routine which had some small piece weave code. The weave code would be talking to a C++ library or compiled python extension or something.
I was able to make a C++ library containing all my C++ code but then abandoned it after that, to see if there were simpler methods python.org method - http://docs.python.org/ext/simpleExample.html - Similar to pycxx it involved altering the code swig - at first it seemed like a lot of work to copy everything from the .h files into the .i file. But then I realized you can just use %include and it will use that stuff for swig. Simple examples are given in the docs for both C and C++ which work well. I like how it just generates an extra .c wrapper file which is then compiled to a .o file and linked together will all the other .o files and libraries (like -llapack) into a python extension, very simply. Extend any methods or functions you want. I had a problem because the C++ class overloaded the operator+ method, but there is any easy to fix that in the docs, calling the python wrapper __add__ instead. So far swig is my choice. Boost.python looked good but I simple couldn't get bjam to work! David From bob at redivi.com Mon Nov 1 18:51:14 2004 From: bob at redivi.com (Bob Ippolito) Date: Mon, 1 Nov 2004 18:51:14 -0500 Subject: [SciPy-user] FYI: C++ Extensions for Python In-Reply-To: <4186BDD9.5020606@telus.net> References: <4186BDD9.5020606@telus.net> Message-ID: On Nov 1, 2004, at 17:51, David Grant wrote: > Just thought I'd share my experiences concerning looking for a way to > make python interfaces for C++ code (see my previous thread about > this): > > pycxx - Looks like it requires one to modify all the C++ source, > adding PyObject pointers everywhere. I stayed away from this. I > don't want to modify the code if I have to (why should I have to? the > reason I'm making an interface is so that I can use the C++ code, and > not touch it, thus avoiding creating bugs in the C++ code. If I > WANTED to touch the C++ code I'd just re-write in python). > > boost.python - Horrible documentation and horrible setup. Nothing > good to say about this. I spent more time on this than any other > method and made the least progress. > > weave - was going to write a python routine which had some small piece > weave code. The weave code would be talking to a C++ library or > compiled python extension or something. I was able to make a C++ > library containing all my C++ code but then abandoned it after that, > to see if there were simpler methods > > python.org method - http://docs.python.org/ext/simpleExample.html - > Similar to pycxx it involved altering the code > > swig - at first it seemed like a lot of work to copy everything from > the .h files into the .i file. But then I realized you can just use > %include and it will use that stuff for swig. Simple examples are > given in the docs for both C and C++ which work well. I like how it > just generates an extra .c wrapper file which is then compiled to a .o > file and linked together will all the other .o files and libraries > (like -llapack) into a python extension, very simply. Extend any > methods or functions you want. I had a problem because the C++ class > overloaded the operator+ method, but there is any easy to fix that in > the docs, calling the python wrapper __add__ instead. > > So far swig is my choice. Boost.python looked good but I simple > couldn't get bjam to work! Uh, where does it say that you need to use bjam? It works just fine outside of that environment... take a look at Fusion for example. 
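For reference, a Boost.Python extension can be driven from an ordinary distutils setup.py instead of bjam. The following is only a rough sketch (not Fusion's actual setup.py); it assumes a hello.cpp that defines BOOST_PYTHON_MODULE(hello) and a Boost.Python runtime library installed as boost_python -- both the paths and the library name vary between installations:

# setup.py -- minimal sketch for building a Boost.Python extension
# without bjam; hello.cpp and the boost_python library name are assumptions
from distutils.core import setup
from distutils.extension import Extension

setup(name='hello',
      ext_modules=[Extension('hello',
                             sources=['hello.cpp'],
                             include_dirs=['/usr/include'],   # Boost headers
                             library_dirs=['/usr/lib'],
                             libraries=['boost_python'])])

After "python setup.py build_ext --inplace" you get a hello.so; the file name has to match the module name declared in BOOST_PYTHON_MODULE, and from there it is an ordinary import (import hello; print hello.greet(), assuming greet() was exposed with def()).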
-bob From david at dwavesys.com Mon Nov 1 19:03:58 2004 From: david at dwavesys.com (David Grant) Date: Mon, 01 Nov 2004 16:03:58 -0800 Subject: [SciPy-user] FYI: C++ Extensions for Python In-Reply-To: References: <4186BDD9.5020606@telus.net> Message-ID: <4186CEEE.5060408@dwavesys.com> Bob Ippolito wrote: > > On Nov 1, 2004, at 17:51, David Grant wrote: > >> Just thought I'd share my experiences concerning looking for a way to >> make python interfaces for C++ code (see my previous thread about this): >> >> pycxx - Looks like it requires one to modify all the C++ source, >> adding PyObject pointers everywhere. I stayed away from this. I >> don't want to modify the code if I have to (why should I have to? >> the reason I'm making an interface is so that I can use the C++ code, >> and not touch it, thus avoiding creating bugs in the C++ code. If I >> WANTED to touch the C++ code I'd just re-write in python). >> >> boost.python - Horrible documentation and horrible setup. Nothing >> good to say about this. I spent more time on this than any other >> method and made the least progress. >> >> weave - was going to write a python routine which had some small >> piece weave code. The weave code would be talking to a C++ library >> or compiled python extension or something. I was able to make a C++ >> library containing all my C++ code but then abandoned it after that, >> to see if there were simpler methods >> >> python.org method - http://docs.python.org/ext/simpleExample.html - >> Similar to pycxx it involved altering the code >> >> swig - at first it seemed like a lot of work to copy everything from >> the .h files into the .i file. But then I realized you can just use >> %include and it will use that stuff for swig. Simple examples are >> given in the docs for both C and C++ which work well. I like how it >> just generates an extra .c wrapper file which is then compiled to a >> .o file and linked together will all the other .o files and libraries >> (like -llapack) into a python extension, very simply. Extend any >> methods or functions you want. I had a problem because the C++ class >> overloaded the operator+ method, but there is any easy to fix that in >> the docs, calling the python wrapper __add__ instead. >> >> So far swig is my choice. Boost.python looked good but I simple >> couldn't get bjam to work! > > > Uh, where does it say that you need to use bjam? It works just fine > outside of that environment... take a look at Fusion > for example. > This page calls itself a quickstart but it doesn't say how to build the shared library or what to do with the "Boost.Python" wrapper: http://www.boost.org/libs/python/doc/tutorial/doc/quickstart.html The next page in the tutorial which is very Windows specific (I don't use Windows) and doesn't get me from A to B. ie. it should get a hello.cpp python module but it doesn't. It seems to want me to run boost in an example directory which is under /usr/share. Not the right place to be building anything! This page also doesn't talk about making the wrapper and/or what to name it, or what to do with it. It only talks about bjam -- David J. Grant Scientific Officer Intellectual Property D-Wave Systems Inc. tel: 604.732.6604 fax: 604.732.6614 ************************** CONFIDENTIAL COMMUNICATION This electronic transmission, and any documents attached hereto, is confidential. The information is intended only for use by the recipient named above. 
If you have received this electronic message in error, please notify the sender and delete the electronic message. Any disclosure, copying, distribution, or use of the contents of information received in error is strictly prohibited. -------------- next part -------------- A non-text attachment was scrubbed... Name: david.vcf Type: text/x-vcard Size: 334 bytes Desc: not available URL: From pearu at scipy.org Mon Nov 1 19:42:11 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 1 Nov 2004 18:42:11 -0600 (CST) Subject: [SciPy-user] Undefined symbol "ATL_cpttrsm" In-Reply-To: References: Message-ID: On Tue, 2 Nov 2004, Lee Harr wrote: > Well... I have a feeling my patches will wreak havoc for other > people. I am not so sure how to do this for just FreeBSD-4 > without causing other trouble... > >>>> import sys >>>> sys.platform > 'freebsd4' > > > I created a FreeBSD problem report with this patch attached: > > begin 644 py-scipy.patch > end > > At least that way folks will be able to build it, and maybe someone > will come along who can figure out how to really fix the problem. Thanks for the information. Could you try the latest scipy from CVS? I just commited a patch fixing atlas,lapack names for freebsd and implementation of missing atanh, asinh, acosh functions. Pearu From missive at hotmail.com Mon Nov 1 20:25:55 2004 From: missive at hotmail.com (Lee Harr) Date: Tue, 02 Nov 2004 05:55:55 +0430 Subject: [SciPy-user] Undefined symbol "ATL_cpttrsm" Message-ID: >Thanks for the information. Could you try the latest scipy from CVS? >I just commited a patch fixing atlas,lapack names for freebsd and >implementation of missing atanh, asinh, acosh functions. > With CVS just checked out ... (and no other patches applied): [...] creating build/temp.freebsd-4.10-STABLE-i386-2.3/scipy_core creating build/temp.freebsd-4.10-STABLE-i386-2.3/scipy_core/scipy_base compile options: '-DUSE_MCONF_LITE_LE -I/usr/local/include/python2.3 -c' cc: scipy_core/scipy_base/fastumathmodule.c In file included from scipy_core/scipy_base/fastumathmodule.c:417: scipy_core/scipy_base/fastumath_unsigned.inc:2508: `acosh' undeclared here (not in a function) scipy_core/scipy_base/fastumath_unsigned.inc:2508: initializer element is not constant scipy_core/scipy_base/fastumath_unsigned.inc:2508: (near initialization for `arccosh_data[0]') scipy_core/scipy_base/fastumath_unsigned.inc:2508: `acosh' undeclared here (not in a function) scipy_core/scipy_base/fastumath_unsigned.inc:2508: initializer element is not constant scipy_core/scipy_base/fastumath_unsigned.inc:2508: (near initialization for `arccosh_data[1]') scipy_core/scipy_base/fastumath_unsigned.inc:2509: `asinh' undeclared here (not in a function) scipy_core/scipy_base/fastumath_unsigned.inc:2509: initializer element is not constant scipy_core/scipy_base/fastumath_unsigned.inc:2509: (near initialization for `arcsinh_data[0]') scipy_core/scipy_base/fastumath_unsigned.inc:2509: `asinh' undeclared here (not in a function) scipy_core/scipy_base/fastumath_unsigned.inc:2509: initializer element is not constant scipy_core/scipy_base/fastumath_unsigned.inc:2509: (near initialization for `arcsinh_data[1]') scipy_core/scipy_base/fastumath_unsigned.inc:2510: `atanh' undeclared here (not in a function) scipy_core/scipy_base/fastumath_unsigned.inc:2510: initializer element is not constant scipy_core/scipy_base/fastumath_unsigned.inc:2510: (near initialization for `arctanh_data[0]') scipy_core/scipy_base/fastumath_unsigned.inc:2510: `atanh' undeclared here (not in a function) 
scipy_core/scipy_base/fastumath_unsigned.inc:2510: initializer element is not constant scipy_core/scipy_base/fastumath_unsigned.inc:2510: (near initialization for `arctanh_data[1]') [...] _________________________________________________________________ Don't just search. Find. Check out the new MSN Search! http://search.msn.com/ From bob at redivi.com Mon Nov 1 20:59:41 2004 From: bob at redivi.com (Bob Ippolito) Date: Mon, 1 Nov 2004 20:59:41 -0500 Subject: [SciPy-user] FYI: C++ Extensions for Python In-Reply-To: <4186CEEE.5060408@dwavesys.com> References: <4186BDD9.5020606@telus.net> <4186CEEE.5060408@dwavesys.com> Message-ID: On Nov 1, 2004, at 19:03, David Grant wrote: > Bob Ippolito wrote: > >> >> On Nov 1, 2004, at 17:51, David Grant wrote: >> >>> Just thought I'd share my experiences concerning looking for a way >>> to make python interfaces for C++ code (see my previous thread about >>> this): >>> >>> pycxx - Looks like it requires one to modify all the C++ source, >>> adding PyObject pointers everywhere. I stayed away from this. I >>> don't want to modify the code if I have to (why should I have to? >>> the reason I'm making an interface is so that I can use the C++ >>> code, and not touch it, thus avoiding creating bugs in the C++ code. >>> If I WANTED to touch the C++ code I'd just re-write in python). >>> >>> boost.python - Horrible documentation and horrible setup. Nothing >>> good to say about this. I spent more time on this than any other >>> method and made the least progress. >>> >>> weave - was going to write a python routine which had some small >>> piece weave code. The weave code would be talking to a C++ library >>> or compiled python extension or something. I was able to make a C++ >>> library containing all my C++ code but then abandoned it after that, >>> to see if there were simpler methods >>> >>> python.org method - http://docs.python.org/ext/simpleExample.html - >>> Similar to pycxx it involved altering the code >>> >>> swig - at first it seemed like a lot of work to copy everything from >>> the .h files into the .i file. But then I realized you can just use >>> %include and it will use that stuff for swig. Simple examples are >>> given in the docs for both C and C++ which work well. I like how it >>> just generates an extra .c wrapper file which is then compiled to a >>> .o file and linked together will all the other .o files and >>> libraries (like -llapack) into a python extension, very simply. >>> Extend any methods or functions you want. I had a problem because >>> the C++ class overloaded the operator+ method, but there is any easy >>> to fix that in the docs, calling the python wrapper __add__ instead. >>> >>> So far swig is my choice. Boost.python looked good but I simple >>> couldn't get bjam to work! >> >> >> Uh, where does it say that you need to use bjam? It works just fine >> outside of that environment... take a look at Fusion >> for example. >> > This page calls itself a quickstart but it doesn't say how to build > the shared library or what to do with the "Boost.Python" wrapper: > http://www.boost.org/libs/python/doc/tutorial/doc/quickstart.html > > The next page in the tutorial which is very Windows specific (I don't > use Windows) and doesn't get me from A to B. ie. it should get a > hello.cpp python module but it doesn't. It seems to want me to run > boost in an example directory which is under /usr/share. Not the > right place to be building anything! 
> > This page also doesn't talk about making the wrapper and/or what to > name it, or what to do with it. It only talks about bjam Ok, fine, so the tutorials and quickstart aren't the greatest. No real surprise there. Take a look at Fusion's setup.py. -bob From david.grant at telus.net Mon Nov 1 21:10:35 2004 From: david.grant at telus.net (David Grant) Date: Mon, 01 Nov 2004 18:10:35 -0800 Subject: [SciPy-user] FYI: C++ Extensions for Python In-Reply-To: References: <4186BDD9.5020606@telus.net> <4186CEEE.5060408@dwavesys.com> Message-ID: <4186EC9B.9000503@telus.net> Bob Ippolito wrote: > > On Nov 1, 2004, at 19:03, David Grant wrote: > >> Bob Ippolito wrote: >> >>> >>> On Nov 1, 2004, at 17:51, David Grant wrote: >>> >>>> Just thought I'd share my experiences concerning looking for a way >>>> to make python interfaces for C++ code (see my previous thread >>>> about this): >>>> >>>> pycxx - Looks like it requires one to modify all the C++ source, >>>> adding PyObject pointers everywhere. I stayed away from this. I >>>> don't want to modify the code if I have to (why should I have to? >>>> the reason I'm making an interface is so that I can use the C++ >>>> code, and not touch it, thus avoiding creating bugs in the C++ >>>> code. If I WANTED to touch the C++ code I'd just re-write in python). >>>> >>>> boost.python - Horrible documentation and horrible setup. Nothing >>>> good to say about this. I spent more time on this than any other >>>> method and made the least progress. >>>> >>>> weave - was going to write a python routine which had some small >>>> piece weave code. The weave code would be talking to a C++ library >>>> or compiled python extension or something. I was able to make a >>>> C++ library containing all my C++ code but then abandoned it after >>>> that, to see if there were simpler methods >>>> >>>> python.org method - http://docs.python.org/ext/simpleExample.html - >>>> Similar to pycxx it involved altering the code >>>> >>>> swig - at first it seemed like a lot of work to copy everything >>>> from the .h files into the .i file. But then I realized you can >>>> just use %include and it will use that stuff for swig. Simple >>>> examples are given in the docs for both C and C++ which work well. >>>> I like how it just generates an extra .c wrapper file which is then >>>> compiled to a .o file and linked together will all the other .o >>>> files and libraries (like -llapack) into a python extension, very >>>> simply. Extend any methods or functions you want. I had a problem >>>> because the C++ class overloaded the operator+ method, but there is >>>> any easy to fix that in the docs, calling the python wrapper >>>> __add__ instead. >>>> >>>> So far swig is my choice. Boost.python looked good but I simple >>>> couldn't get bjam to work! >>> >>> >>> >>> Uh, where does it say that you need to use bjam? It works just fine >>> outside of that environment... take a look at Fusion >>> for example. >>> >> This page calls itself a quickstart but it doesn't say how to build >> the shared library or what to do with the "Boost.Python" wrapper: >> http://www.boost.org/libs/python/doc/tutorial/doc/quickstart.html >> >> The next page in the tutorial which is very Windows specific (I don't >> use Windows) and doesn't get me from A to B. ie. it should get a >> hello.cpp python module but it doesn't. It seems to want me to run >> boost in an example directory which is under /usr/share. Not the >> right place to be building anything! 
>> >> This page also doesn't talk about making the wrapper and/or what to >> name it, or what to do with it. It only talks about bjam > > > Ok, fine, so the tutorials and quickstart aren't the greatest. No > real surprise there. Take a look at Fusion's setup.py. > Well I'll give them a 7/10 just for being pretty. From jdhunter at ace.bsd.uchicago.edu Mon Nov 1 22:14:49 2004 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Mon, 01 Nov 2004 21:14:49 -0600 Subject: [SciPy-user] FYI: C++ Extensions for Python In-Reply-To: <4186BDD9.5020606@telus.net> (David Grant's message of "Mon, 01 Nov 2004 14:51:05 -0800") References: <4186BDD9.5020606@telus.net> Message-ID: >>>>> "David" == David Grant writes: David> pycxx - Looks like it requires one to modify all the C++ David> source, adding PyObject pointers everywhere. I stayed away David> from this. I don't want to modify the code if I have to David> (why should I have to? the reason I'm making an interface David> is so that I can use the C++ code, and not touch it, thus David> avoiding creating bugs in the C++ code. If I WANTED to David> touch the C++ code I'd just re-write in python). Not at all. Typically, you do not modify any of the C++ code in the library you are wrapping; I never have. Suppose you are wrapping a C++ function that takes a pointer to x and a pointer to y and returns void, using the pointers as the return value. Suppose x and y aren't used in calculating the return, but are merely used to handle the return value. void SomeClass::somefunc(double *x, double *y); In python you would typically want to do x, y = o.somefunc() In pycxx, you would include the header that defines the class that somefunc lives in, and wrap it like Py::Object SomeClassPy::somefunc(const Py::Tuple & args) { args.verify_length(0); double x(0),y(0); _p->somefunc(&x, &y); Py::Tuple xy(2); xy[0] = Py::Float(x); xy[1] = Py::Float(y); return xy; } Here _p is a pointer to a SomeClass instance. Your wrapper class SomeClassPy wraps SomeClass -- you do not alter SomeClass in any way. True, it requires some additional boilerplate to hand wrap the function. I like it because 1) it gives you total control on how to process the input and output args, 2) allows you to write pythonic c++ with Py::Tuple, Py::Dict, Py::Float, etc. 3) handles all the reference counting 4) supports all the special methods for emulating numeric types, comparisons, mapping/seq types etc. Other wrapping libraries do more to automate the process of wrapping. But can any of them handle changing around the input and output args as above to make the interface more pythonic, w/o having to write some dedicated wrapper code. Ie, in pyfort I believe you can declare arguments as input or output and the wrapper will rearrange everything for you. Do any of the C++ wrapper generators support this? JDH From prabhu_r at users.sf.net Mon Nov 1 22:58:44 2004 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Tue, 2 Nov 2004 09:28:44 +0530 Subject: [SciPy-user] FYI: C++ Extensions for Python In-Reply-To: <4186BDD9.5020606@telus.net> References: <4186BDD9.5020606@telus.net> Message-ID: <16775.1524.976199.80138@monster.linux.in> >>>>> "DG" == David Grant writes: [...] DG> weave - was going to write a python routine which had some DG> small piece weave code. The weave code would be talking to a DG> C++ library or compiled python extension or something. 
I was DG> able to make a C++ library containing all my C++ code but then DG> abandoned it after that, to see if there were simpler methods This is a common thing you'd need to do with any other approach as well since all of them would link to your C++ code. Plus, building a library out of .o's is straightforward. I also recommend using SCons to build your code. DG> swig - at first it seemed like a lot of work to copy [...] In my experience SWIG "just works" and works fast and well. Oh, BTW, if you do have SWIG wrapped code you can weave SWIG wrapped objects too! Look at weave/examples/swig2_example.py and read the docstrings. cheers, prabhu From prabhu_r at users.sf.net Mon Nov 1 23:13:42 2004 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Tue, 2 Nov 2004 09:43:42 +0530 Subject: [SciPy-user] FYI: C++ Extensions for Python In-Reply-To: References: <4186BDD9.5020606@telus.net> Message-ID: <16775.2422.208335.284232@monster.linux.in> >>>>> "JH" == John Hunter writes: JH> Other wrapping libraries do more to automate the process of JH> wrapping. But can any of them handle changing around the JH> input and output args as above to make the interface more JH> pythonic, w/o having to write some dedicated wrapper code. JH> Ie, in pyfort I believe you can declare arguments as input or JH> output and the wrapper will rearrange everything for you. Do JH> any of the C++ wrapper generators support this? SWIG lets you define your own "typemaps" to do pretty much anything you want to. http://www.swig.org/Doc1.3/Typemaps.html Besides, to do what you wanted to do in your example there are predefined SWIG libraries that let you do this easily. http://www.swig.org/Doc1.3/Arguments.html#Arguments_nn3 cheers, prabhu From david.grant at telus.net Tue Nov 2 01:14:25 2004 From: david.grant at telus.net (David Grant) Date: Mon, 01 Nov 2004 22:14:25 -0800 Subject: [SciPy-user] FYI: C++ Extensions for Python In-Reply-To: <16775.1524.976199.80138@monster.linux.in> References: <4186BDD9.5020606@telus.net> <16775.1524.976199.80138@monster.linux.in> Message-ID: <418725C1.3040501@telus.net> Prabhu Ramachandran wrote: >>>>>>"DG" == David Grant writes: >>>>>> >>>>>> > >[...] > DG> weave - was going to write a python routine which had some > DG> small piece weave code. The weave code would be talking to a > DG> C++ library or compiled python extension or something. I was > DG> able to make a C++ library containing all my C++ code but then > DG> abandoned it after that, to see if there were simpler methods > >This is a common thing you'd need to do with any other approach as >well since all of them would link to your C++ code. Plus, building a >library out of .o's is straightforward. > Actually you're probably right. I was trying to do it with full automake/autoconf/configure support and that was what I was getting bogged down in. I guess making a library is the same as is done with swig right? Just call g++ with the -shared argument? > I also recommend using SCons >to build your code. > > DG> swig - at first it seemed like a lot of work to copy >[...] > >In my experience SWIG "just works" and works fast and well. Oh, BTW, >if you do have SWIG wrapped code you can weave SWIG wrapped objects >too! Look at weave/examples/swig2_example.py and read the >docstrings. > > > I'll check that out! >cheers, >prabhu > > > > > -- David J. Grant http://www.davidgrant.ca:81 -------------- next part -------------- A non-text attachment was scrubbed... 
Name: david.grant.vcf Type: text/x-vcard Size: 200 bytes Desc: not available URL: From pearu at scipy.org Tue Nov 2 02:10:48 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 2 Nov 2004 01:10:48 -0600 (CST) Subject: [SciPy-user] Undefined symbol "ATL_cpttrsm" In-Reply-To: References: Message-ID: On Tue, 2 Nov 2004, Lee Harr wrote: >> Thanks for the information. Could you try the latest scipy from CVS? >> I just commited a patch fixing atlas,lapack names for freebsd and >> implementation of missing atanh, asinh, acosh functions. >> > > > With CVS just checked out ... (and no other patches applied): > > [...] > creating build/temp.freebsd-4.10-STABLE-i386-2.3/scipy_core > creating build/temp.freebsd-4.10-STABLE-i386-2.3/scipy_core/scipy_base > compile options: '-DUSE_MCONF_LITE_LE -I/usr/local/include/python2.3 -c' > cc: scipy_core/scipy_base/fastumathmodule.c > In file included from scipy_core/scipy_base/fastumathmodule.c:417: > scipy_core/scipy_base/fastumath_unsigned.inc:2508: `acosh' undeclared here > (not in a function) Ok, try again the updated scipy from CVS. Seems like on freebsd HAVE_INVERSE_HYPERBOLIC is defined somewhere but inverse hyberbolic functions may have different names from acosh, asing, atanh. I have fixed this by forcing -UHAVE_INVERSE_HYPERBOLIC. Pearu From prabhu_r at users.sf.net Tue Nov 2 04:40:34 2004 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Tue, 2 Nov 2004 15:10:34 +0530 Subject: [SciPy-user] FYI: C++ Extensions for Python In-Reply-To: <418725C1.3040501@telus.net> References: <4186BDD9.5020606@telus.net> <16775.1524.976199.80138@monster.linux.in> <418725C1.3040501@telus.net> Message-ID: <16775.22034.55380.553210@monster.linux.in> >>>>> "DG" == David Grant writes: DG> Prabhu Ramachandran wrote: >> This is a common thing you'd need to do with any other approach >> as well since all of them would link to your C++ code. Plus, >> building a library out of .o's is straightforward. >> DG> Actually you're probably right. I was trying to do it with DG> full automake/autoconf/configure support and that was what I DG> was getting bogged down in. I guess making a library is the DG> same as is done with swig right? Just call g++ with the DG> -shared argument? Its easier with SCons. Here is an untested example to give you a taste: # ------------------------------ import glob ccflags = '-Wall -O3 -ggdb' # and whatever else inc_dirs = ['.', 'sub_dir/'] lib_dirs = ['/usr/X11R6/lib', '.'] env = Environment(CCFLAGS = ccflags, CPPPATH=inc_dirs, LIBPATH=lib_dirs) # If you want to build with SWIG support use something like this # instead. env = Environment(CCFLAGS = ccflags, CPPPATH=inc_dirs, LIBPATH=lib_dirs, SWIGFLAGS='-c++ -python ') lib_src = glob.glob('src/*.c') # Build a shared library from these sources. env.SharedLibrary(target='my_library', source=lib_src) # Now build main.c libs = ['lapack', 'atlas', 'my_library'] env.Program(target='main', source='main.c', LIBS=libs) # Then for a swig module: env.SharedLibrary('_foo.so', 'foo.i', LIBS=libs) # ------------------------------ Obviously there are more options, but this shows you the basics of a library, executable and a swig shared module. 
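As a usage note for the sketch above (all names hypothetical): with SWIGFLAGS='-c++ -python', building foo.i produces both the compiled extension _foo.so and a generated proxy module foo.py, and it is the proxy that you import:

# hypothetical usage of the SWIG-wrapped module built above
import foo                  # the foo.py proxy, which loads _foo.so internally
obj = foo.MyClass()         # assumes foo.i wraps a C++ class MyClass
print obj.some_method(2.0)  # dispatches to the C++ implementation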
cheers, prabhu From birkenes at iet.ntnu.no Tue Nov 2 05:09:23 2004 From: birkenes at iet.ntnu.no (Oystein Birkenes) Date: Tue, 02 Nov 2004 11:09:23 +0100 Subject: [SciPy-user] Building SciPy Fails In-Reply-To: <4186A30C.4020208@ee.byu.edu> References: <1099340575.2677.5.camel@localhost.localdomain> <1099342149.2677.11.camel@localhost.localdomain> <4186A30C.4020208@ee.byu.edu> Message-ID: <1099390163.7619.26.camel@lydlab236.tele.ntnu.no> On Mon, 2004-11-01 at 21:56, Travis Oliphant wrote: > Oystein Birkenes wrote: > > >On Mon, 2004-11-01 at 21:58, Pearu Peterson wrote: > > > > > >>On Mon, 1 Nov 2004, Oystein Birkenes wrote: > >> > >> > >> > >>>The command "python setup.py build" exits with the following error > >>>message: > >>> > >>>error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 > >>>-Wall -Wstrict-prototypes -fPIC > >>>-Ibuild/temp.linux-i686-2.3/config_pygist -ILib/xplt/src/gist > >>>-ILib/xplt/src/play -ILib/xplt/src/play/unix > >>>-I/usr/local/include/python2.3 -c Lib/xplt/src/play/x11/colors.c -o > >>>build/temp.linux-i686-2.3/Lib/xplt/src/play/x11/colors.o > >>>-DGISTPATH="\"/usr/local/lib/python2.3/site-packages/scipy/xplt/gistdata\""" > >>>failed with exit status 1 > >>> > >>>I'm running Fedora Core 2 on a Dell Latitude C640 laptop. I have > >>>Python 2.3.4, Numeric version 23.1, numarray version 1.1, f2py2e version > >>>2.43.239_1844 and scipy_distutils version 0.3.2. > >>> > >>>I've put some additional information on > >>>http://www.tele.ntnu.no/users/birkenes/scipy/ including: > >>>build.out: stdout of "python setup.py build" > >>>build.err: stderr of "python setup.py build" > >>>diagnose.out: stdout of "python -c 'from f2py2e.diagnose import > >>>run;run()'" > >>>atlas_version.out: stdout of "python -c 'import atlas_version'" > >>>system_info.out: stdout of "python > >>>scipy_core/scipy_distutils/system_info.py" > >>> > >>>Do you have any suggestions for what I can do to install SciPy? > >>> > >>> > >>The problem is that X11 library is not found as seen in system_info.out > >>and this is also reflected at the end of build.out. > >> > >>Where are libX11.a and X11/X.h located in your system? > >> > >>Pearu > >> > >> > >> > >I can't find the files you asked for, but I have a file called > >/usr/X11R6/lib/libX11.so.6.2 > > > > > You probably need the libX11-devel rpm or something equivalent. > > -Travis > Great, now I managed to install SciPy and everything seems to be fine! What I did is that I installed the xorg-x11-devel package using yum (didn't find a package named libX11-devel for Fedora) and the X Software Development Tools using the Package Management. Thanks again for help! Oystein From sseefeld at art.ca Tue Nov 2 08:40:25 2004 From: sseefeld at art.ca (Stefan Seefeld) Date: Tue, 2 Nov 2004 08:40:25 -0500 Subject: [SciPy-user] FYI: C++ Extensions for Python Message-ID: <20DCDD8F0FCED411AC4D001083CF504501AA9821@MTL-EXCHANGE> > From: David Grant [mailto:david.grant at telus.net] > Sent: November 1, 2004 17:51 > boost.python - Horrible documentation and horrible setup. > Nothing good > to say about this. I spent more time on this than any other > method and > made the least progress. Just to balance this a bit: I must agree that the documentation is very sparse, but otherwise it's a wonderful tool. It's totally non-intrusive in that it doesn't require you to design your C++ API for use with python, but instead it lets you express (via template traits) how objects are created, how memory management should be done, etc. 
It's extremely powerful yet easy to write (as an example: is there any other binding that lets you define classes in C++ and derive from them in python ?) > So far swig is my choice. Boost.python looked good but I simple > couldn't get bjam to work! I don't use bjam either. Use the boost.python tests to see what compiler / linker options to use and then use them with your own build system. That's what I'v been doing with make/scons/MSVC. Works like a charm. Regards, Stefan From prabhu_r at users.sf.net Tue Nov 2 10:48:29 2004 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Tue, 2 Nov 2004 21:18:29 +0530 Subject: [SciPy-user] FYI: C++ Extensions for Python In-Reply-To: <20DCDD8F0FCED411AC4D001083CF504501AA9821@MTL-EXCHANGE> References: <20DCDD8F0FCED411AC4D001083CF504501AA9821@MTL-EXCHANGE> Message-ID: <16775.44109.829189.807407@monster.linux.in> >>>>> "SS" == Stefan Seefeld writes: SS> Just to balance this a bit: I must agree that the SS> documentation is very sparse, but otherwise it's a wonderful I agree that it is an excellent tool. [...] SS> extremely powerful yet easy to write (as an example: is there SS> any other binding that lets you define classes in C++ and SS> derive from them in python ?) Yes, SWIG with the "directors" option. Support for directors has been around for atleast a year now! SWIG seems to have a bad reputation from its 1.1 era, the new versions are really powerful and support a large subset of C++. I admire boost.python and tried very hard to use it for my work. Unfortunately it did not work out. I needed to wrap close to 80 C++ classes so hand wrapping them was out of question. I had to use Pyste and fixed a bunch of bugs with that. The biggest problem for me was boost.python wrappers take *forever* to build with GCC. In the end, I was dealing with raw pointers in my C++ code and for reasons too complex to discuss here that caused problems that were not trivial to handle without writing code by hand. At that point I had spent close to 3 weeks on trying to get wrappers working and finally gave up. I then tried SWIG and it worked really well. Getting working wrappers took very little time and then improving on that (there were bugs that were fixed etc.) took a some time but eventually it worked beautifully. In all I spent less time on SWIG than boost.python and it did the job well. SWIG wrapper code is ANSI C so it takes close to a tenth of the time to build the wrappers. For a fairly trivial example I made some simple comparisons between the two and presented them last year at SciPy'03. You can find the code and slides here: http://www.ae.iitm.ac.in/~prabhu/research/papers/swig_boost.zip Currently, I use SWIG to build Python wrappers for about 80 classes. Classes can be subclassed in Python. Documentation from doxygen comments in the C++ code is extracted and inserted into the shadow Python docstrings using a script[1] that parses the XML generated by doxygen and generates the necessary SWIG interface file. Building the wrappers takes about 10-15 minutes to build on an outdated machine. Generating the wrappers takes around 500 lines of SWIG interface code. Of course, SWIG also has support for so many other languages (atleast 10 other languages apart from Python!). On the downside, SWIG does not yet support nested classes. AFAIK it does not support expression templates. However, wrapping code that uses expression templates might not be very common. I use templates but quite sparingly so that is not a problem for me. 
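To illustrate the directors feature mentioned above, cross-language subclassing ends up looking like plain Python. This is only a sketch; it assumes an interface file declared with %module(directors="1") shapes and %feature("director") Shape;, wrapping a C++ class Shape that has a virtual method area():

# hypothetical Python side of a SWIG "directors" wrapper
import shapes

class Square(shapes.Shape):          # derive from the wrapped C++ class
    def __init__(self, side):
        shapes.Shape.__init__(self)  # initialize the underlying C++ object
        self.side = side
    def area(self):                  # overrides the C++ virtual method
        return self.side * self.side

# C++ code holding a Shape* will now dispatch area() calls back into Python.
s = Square(3.0)
print s.area()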
The other potential issue is that SWIG uses proxy classes to represent the Python classes, whereas boost.python generates everything in the extension module. In reality this works both ways for SWIG, because the proxy classes are pure Python you can do some neat magic there. However, some folks have reported slowdowns because of this. I don't have a problem with this though. I initially thought that the size of the Python proxy files would cause a lot of bloat and problems. In practice, I have not had any prolems. Also, it was quite easy (a days effort) to make SWIG2 and weave work well together, so I can do inline C++ from Python on any of my wrapped code. All of these make SWIG incredibly attractive for my work. Before you think otherwise, I'd like to mention that I think boost.python is an excellent tool! It has its place and I would think there are some things that it would do better than any other tool. Its just that SWIG is often underestimated and for me SWIG has worked *really* nicely. For the most part it is true that SWIG "just works". cheers, prabhu [1] http://www.ae.iitm.ac.in/~prabhu/software/python.html and http://www.ae.iitm.ac.in/~prabhu/software/code/python/doxy2swig.py From missive at hotmail.com Tue Nov 2 16:28:38 2004 From: missive at hotmail.com (Lee Harr) Date: Wed, 03 Nov 2004 01:58:38 +0430 Subject: [SciPy-user] Undefined symbol "ATL_cpttrsm" Message-ID: >Ok, try again the updated scipy from CVS. > >Seems like on freebsd HAVE_INVERSE_HYPERBOLIC is defined somewhere but >inverse hyberbolic functions may have different names from acosh, asing, >atanh. I have fixed this by forcing -UHAVE_INVERSE_HYPERBOLIC. > Still the same error ... Here is the end of the message... error: Command "cc -fno-strict-aliasing -DNDEBUG -O -pipe -D_THREAD_SAFE -DTHREAD_STACK_SIZE=0x20000 -O -pipe -fPIC -DUSE_MCONF_LITE_LE -UHAVE_INVERSE_HYPERBOLIC -I/usr/local/include/python2.3 -c scipy_core/scipy_base/fastumathmodule.c -o build/temp.freebsd-4.10-STABLE-i386-2.3/scipy_core/scipy_base/fastumathmodule.o" failed with exit status 1 so HAVE_INVERSE_HYPERBOLIC is there, but apparently that is not the fix. Here is a snip from the acosh man page: LIBRARY Math Library (libm, -lm) SYNOPSIS #include double acosh(double x); Actually, I just made a test program to see if I could find acosh and it needs -lm to link in the math library. I am looking around in system_info.py but I am not sure where to add this ... _________________________________________________________________ Express yourself instantly with MSN Messenger! Download today it's FREE! http://messenger.msn.com/ From pearu at scipy.org Tue Nov 2 17:10:52 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 2 Nov 2004 16:10:52 -0600 (CST) Subject: [SciPy-user] Undefined symbol "ATL_cpttrsm" In-Reply-To: References: Message-ID: On Wed, 3 Nov 2004, Lee Harr wrote: >> Ok, try again the updated scipy from CVS. >> >> Seems like on freebsd HAVE_INVERSE_HYPERBOLIC is defined somewhere but >> inverse hyberbolic functions may have different names from acosh, asing, >> atanh. I have fixed this by forcing -UHAVE_INVERSE_HYPERBOLIC. >> > > > Still the same error ... Here is the end of the message... 
> > error: Command "cc -fno-strict-aliasing -DNDEBUG -O -pipe -D_THREAD_SAFE > -DTHREAD_STACK_SIZE=0x20000 -O -pipe -fPIC -DUSE_MCONF_LITE_LE > -UHAVE_INVERSE_HYPERBOLIC -I/usr/local/include/python2.3 -c > scipy_core/scipy_base/fastumathmodule.c -o > build/temp.freebsd-4.10-STABLE-i386-2.3/scipy_core/scipy_base/fastumathmodule.o" > failed with exit status 1 > > > so HAVE_INVERSE_HYPERBOLIC is there, but apparently that is not > the fix. Strange.. > Here is a snip from the acosh man page: > > LIBRARY > Math Library (libm, -lm) > > SYNOPSIS > #include > > double > acosh(double x); > > Actually, I just made a test program to see if I could find acosh > and it needs -lm to link in the math library. I am looking around > in system_info.py but I am not sure where to add this ... Update scipy_base from CVS and try again. It's setup file now uses -lm to link fastumath module. Pearu From missive at hotmail.com Tue Nov 2 17:15:37 2004 From: missive at hotmail.com (Lee Harr) Date: Wed, 03 Nov 2004 02:45:37 +0430 Subject: [SciPy-user] Undefined symbol "ATL_cpttrsm" Message-ID: >>Actually, I just made a test program to see if I could find acosh >>and it needs -lm to link in the math library. I am looking around >>in system_info.py but I am not sure where to add this ... > >Update scipy_base from CVS and try again. It's setup file now uses -lm >to link fastumath module. > Well... that change did not make the section use -lm but I copied and pasted the line and added it by hand and still get the same error message. Erm... sorry, but I think that is a wild goose chase. The error message from my test program without -lm looks very different: /tmp/ccRC0xKG.o: In function `main': /tmp/ccRC0xKG.o(.text+0x16): undefined reference to `acosh' whereas for scipy the error is: `acosh' undeclared here (not in a function) initializer element is not constant (near initialization for `arccosh_data[0]') Could it be a compiler error? I am using: gcc version 2.95.4 20020320 [FreeBSD] _________________________________________________________________ FREE pop-up blocking with the new MSN Toolbar - get it now! http://toolbar.msn.com/ From pearu at scipy.org Tue Nov 2 17:51:35 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 2 Nov 2004 16:51:35 -0600 (CST) Subject: [SciPy-user] Undefined symbol "ATL_cpttrsm" In-Reply-To: References: Message-ID: On Wed, 3 Nov 2004, Lee Harr wrote: >>> Actually, I just made a test program to see if I could find acosh >>> and it needs -lm to link in the math library. I am looking around >>> in system_info.py but I am not sure where to add this ... >> >> Update scipy_base from CVS and try again. It's setup file now uses -lm >> to link fastumath module. >> > > Well... that change did not make the section use -lm > but I copied and pasted the line and added it by hand and > still get the same error message. > > Erm... sorry, but I think that is a wild goose chase. The error > message from my test program without -lm looks very different: > > /tmp/ccRC0xKG.o: In function `main': > /tmp/ccRC0xKG.o(.text+0x16): undefined reference to `acosh' > > whereas for scipy the error is: > > `acosh' undeclared here (not in a function) > initializer element is not constant > (near initialization for `arccosh_data[0]') The errors are different because acosh are used differently in those two cases. Try removing `#if !defined(HAVE_INVERSE_HYPERBOLIC)` block from fastumath_unsigned.inc. This should ensure that acosh, etc are defined. If that still does not work then resend build.log to me. 
May be I can find there something off.. > Could it be a compiler error? > > I am using: > gcc version 2.95.4 20020320 [FreeBSD] I tried gcc version 2.95.4 20011002 (Debian prerelease) and it works fine on a Debian box. Pearu From missive at hotmail.com Tue Nov 2 17:46:29 2004 From: missive at hotmail.com (Lee Harr) Date: Wed, 03 Nov 2004 03:16:29 +0430 Subject: [SciPy-user] Undefined symbol "ATL_cpttrsm" Message-ID: >Try removing `#if !defined(HAVE_INVERSE_HYPERBOLIC)` block from >fastumath_unsigned.inc. This should ensure that acosh, etc are defined. > Ok. That works. >>>from scipy import * >>>arccosh(10) 2.9932228461263808 >>>arcsinh(10) 2.9982229502979698 >>>arctanh(10) (0.10033534773107562-1.5707963267948966j) _________________________________________________________________ FREE pop-up blocking with the new MSN Toolbar - get it now! http://toolbar.msn.com/ From nwagner at mecha.uni-stuttgart.de Wed Nov 3 07:20:44 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 03 Nov 2004 13:20:44 +0100 Subject: [SciPy-user] Iterative solvers and complex arithmetic Message-ID: <4188CD1C.5040501@mecha.uni-stuttgart.de> Hi all, Who can resolve this problem ? python2.3 -i failure.py Traceback (most recent call last): File "failure.py", line 9, in ? x,info = linalg.gmres(A,r) File "/usr/lib/python2.3/site-packages/scipy/linalg/iterative.py", line 499, in gmres work[slice2] += sclr1*matvec(work[slice1]) TypeError: return array has incorrect type Nils -------------- next part -------------- A non-text attachment was scrubbed... Name: failure.py Type: text/x-python Size: 218 bytes Desc: not available URL: From Giovanni.Samaey at cs.kuleuven.ac.be Wed Nov 3 08:48:28 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Wed, 03 Nov 2004 14:48:28 +0100 Subject: [SciPy-user] Derivative() usage? In-Reply-To: <4177EE97.2060701@ucsd.edu> References: <200410211803.24608.falted@pytables.org> <4177EE97.2060701@ucsd.edu> Message-ID: <4188E1AC.2030400@cs.kuleuven.ac.be> > > 'n' as in d^n/dx^n . > > Somewhat confusingly, 'N' is being used in the docstring as a synonym > for 'order' and is the number of discrete points used to evaluate the > numerical derivative. I'm going to fix that. > > For example, when n=2, and order=3, one is computing the second > central derivative using 3 points [x0-dx, x0, x0+dx]. Normally, one would expect "order" to be an indication of the accuracy of the derivative. So, order=p should mean d^n/dx^n (exact answer) - (what's computed) = O(dx^p) where dx is the mesh spacing. Clearly, with three points for the second derivative, the order is only two, so this is clearly not a mathematically correct use of the word "order"... -- Giovanni Samaey From rkern at ucsd.edu Wed Nov 3 13:53:59 2004 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 03 Nov 2004 10:53:59 -0800 Subject: [SciPy-user] Derivative() usage? In-Reply-To: <4188E1AC.2030400@cs.kuleuven.ac.be> References: <200410211803.24608.falted@pytables.org> <4177EE97.2060701@ucsd.edu> <4188E1AC.2030400@cs.kuleuven.ac.be> Message-ID: <41892947.10006@ucsd.edu> Giovanni Samaey wrote: > >> >> 'n' as in d^n/dx^n . >> >> Somewhat confusingly, 'N' is being used in the docstring as a synonym >> for 'order' and is the number of discrete points used to evaluate the >> numerical derivative. I'm going to fix that. >> >> For example, when n=2, and order=3, one is computing the second >> central derivative using 3 points [x0-dx, x0, x0+dx]. 
> > > Normally, one would expect "order" to be an indication of the accuracy > of the derivative. So, order=p should mean > d^n/dx^n (exact answer) - (what's computed) = O(dx^p) where dx is the > mesh spacing. > Clearly, with three points for the second derivative, the order is only > two, so this is clearly not a mathematically correct use of the word > "order"... Not really. "Order" is an incredibly overloaded word in this area. Its usage here is one of several acceptable ones. "numpoints" may be clearer, though. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From missive at hotmail.com Wed Nov 3 17:21:16 2004 From: missive at hotmail.com (Lee Harr) Date: Thu, 04 Nov 2004 02:51:16 +0430 Subject: [SciPy-user] freebsd inverse hyperbolic patch Message-ID: How about something like this: Create a new variable called NEED_INVERSE_HYPERBOLIC and check for that one in fastumathmodule.c It would look something like this... begin 644 patches.tgz M'XL(`!18B4$``^W4RVZ;0!0&8&_+4YRJ"X.Y>+B9A*J2VQ0ID2*W2J5472$, M8YL6,Q2&EJAY^(R-76JWCK.)NSG?@IMG#@"TNP4#9V6KNM0 MQ6EQ%\:LI,/V;YJ^[8)%B".I MJOKT6CME1KY[[COG;9GQ&'3'UCQ0U]OQ6(*M.D_H+%Q&<36Z#FT]!>/GE8W#S[L/UU45?:>?1K*)^5R)+IV54IK2;OQ0C]>W/HGZ: MT[U;'+Z'-F$Y511)/3)_$@3O'YL/FUX;#F\@:#C-JY3EA!"""&$$$(((8000@@AA!!"""&$CGH`&B0D/0`H```` ` end _________________________________________________________________ Express yourself instantly with MSN Messenger! Download today it's FREE! http://messenger.msn.com/ From david at dwavesys.com Wed Nov 3 18:38:53 2004 From: david at dwavesys.com (David Grant) Date: Wed, 03 Nov 2004 15:38:53 -0800 Subject: [SciPy-user] Derivative() usage? In-Reply-To: <41892947.10006@ucsd.edu> References: <200410211803.24608.falted@pytables.org> <4177EE97.2060701@ucsd.edu> <4188E1AC.2030400@cs.kuleuven.ac.be> <41892947.10006@ucsd.edu> Message-ID: <41896C0D.7010701@dwavesys.com> Robert Kern wrote: > Giovanni Samaey wrote: > >> >>> >>> 'n' as in d^n/dx^n . >>> >>> Somewhat confusingly, 'N' is being used in the docstring as a >>> synonym for 'order' and is the number of discrete points used to >>> evaluate the numerical derivative. I'm going to fix that. >>> >>> For example, when n=2, and order=3, one is computing the second >>> central derivative using 3 points [x0-dx, x0, x0+dx]. >> >> >> >> Normally, one would expect "order" to be an indication of the >> accuracy of the derivative. So, order=p should mean >> d^n/dx^n (exact answer) - (what's computed) = O(dx^p) where dx is the >> mesh spacing. >> Clearly, with three points for the second derivative, the order is >> only two, so this is clearly not a mathematically correct use of the >> word "order"... > > > Not really. "Order" is an incredibly overloaded word in this area. Its > usage here is one of several acceptable ones. "numpoints" may be > clearer, though. > Where is the derivative function in scipy? From david at dwavesys.com Wed Nov 3 18:44:33 2004 From: david at dwavesys.com (David Grant) Date: Wed, 03 Nov 2004 15:44:33 -0800 Subject: [SciPy-user] Derivative() usage? In-Reply-To: <41896C0D.7010701@dwavesys.com> References: <200410211803.24608.falted@pytables.org> <4177EE97.2060701@ucsd.edu> <4188E1AC.2030400@cs.kuleuven.ac.be> <41892947.10006@ucsd.edu> <41896C0D.7010701@dwavesys.com> Message-ID: <41896D61.5080006@dwavesys.com> David Grant wrote: > Robert Kern wrote: > >> Giovanni Samaey wrote: >> >>> >>>> >>>> 'n' as in d^n/dx^n . 
>>>> >>>> Somewhat confusingly, 'N' is being used in the docstring as a >>>> synonym for 'order' and is the number of discrete points used to >>>> evaluate the numerical derivative. I'm going to fix that. >>>> >>>> For example, when n=2, and order=3, one is computing the second >>>> central derivative using 3 points [x0-dx, x0, x0+dx]. >>> >>> >>> >>> >>> Normally, one would expect "order" to be an indication of the >>> accuracy of the derivative. So, order=p should mean >>> d^n/dx^n (exact answer) - (what's computed) = O(dx^p) where dx is >>> the mesh spacing. >>> Clearly, with three points for the second derivative, the order is >>> only two, so this is clearly not a mathematically correct use of the >>> word "order"... >> >> >> >> Not really. "Order" is an incredibly overloaded word in this area. >> Its usage here is one of several acceptable ones. "numpoints" may be >> clearer, though. >> > Where is the derivative function in scipy? > doh, found it. It would be nice if the api docs page had an index of all functions, instead of having to "guess" that it was in common and clicking there. From pearu at scipy.org Thu Nov 4 06:09:25 2004 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 4 Nov 2004 05:09:25 -0600 (CST) Subject: [SciPy-user] freebsd inverse hyperbolic patch In-Reply-To: References: Message-ID: On Thu, 4 Nov 2004, Lee Harr wrote: > How about something like this: > > Create a new variable called NEED_INVERSE_HYPERBOLIC > and check for that one in fastumathmodule.c This would force acosh,.. to be implemented for all platforms but I can see your point. However, what bugs me is that -UHAVE_INVERSE_HYPERBOLIC method did not worked while it should have. Btw, can you compile Numeric on freebsd? It has exactly the same issue and we should copy the workaround from there. Pearu From nwagner at mecha.uni-stuttgart.de Thu Nov 4 06:54:37 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 04 Nov 2004 12:54:37 +0100 Subject: [SciPy-user] Sparse matrix operations Message-ID: <418A187D.9040204@mecha.uni-stuttgart.de> Hi all, How can I compute the Rayleigh quotient R = x' A x/(x' B x), where A and B are sparse matrices. How do I compute the matrix vector product of a sparse matrix with a vector ? Can I use dot(A,x) ? Nils From nuttyputtyzzzzt at hotmail.com Thu Nov 4 07:32:59 2004 From: nuttyputtyzzzzt at hotmail.com (zzzzt tzzzz) Date: Thu, 04 Nov 2004 13:32:59 +0100 Subject: [SciPy-user] Help! More instances of "AttributeError: 'module' object has no attribute 'eig'" Message-ID: Hi, I am trying to install SciPy.0.3.x on a beowulf cluster frontend running Redhat 7.x (I don't know the exact version). I install everything in a separate top dir, using a prebuilt ATLAS package for Linux.AthlonSSE1 and the gnu f77 compiler. Installing python 2.3.4, Numeric 23-6 and f2py2e goes smoothly; all tests run ok. Installing SciPy runs fine, with few serious error messages. However, when I import scipy in python (which is also not a problem) and try to run the tests, 15 tests fail with the error message "AttributeError: 'module' object has no attribute 'eig'" A web search led me to andrewl's message (link: http://www.scipy.net/pipermail/scipy-user/2004-October/003486.html ). Running >python setup.py clean --all; rm -rf build/ does nothing for me however (not even when I run it, or any combination of it, for each separate package after each reinstall). 
The clean script seems to run fine, but for the error message >'build/bdist.linux-i686' does not exist -- can't clean it >'build/scripts-2.3' does not exist -- can't clean it I have no idea whether this is relevant or not. Has anyone else had this problem? What log outputs should I submit to help solve the problem? As I am entirely new to the world of python, I would be very grateful for literal and simplified replies. Sincerely, Erik McNellis _________________________________________________________________ Hitta r?tt p? n?tet med MSN S?k http://search.msn.se/ From pearu at scipy.org Thu Nov 4 07:43:43 2004 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 4 Nov 2004 06:43:43 -0600 (CST) Subject: [SciPy-user] Help! More instances of "AttributeError: 'module' object has no attribute 'eig'" In-Reply-To: References: Message-ID: On Thu, 4 Nov 2004, zzzzt tzzzz wrote: > Hi, > > I am trying to install SciPy.0.3.x on a beowulf cluster frontend running > Redhat 7.x (I don't know the exact version). I install everything in a > separate top dir, using a prebuilt ATLAS package for Linux.AthlonSSE1 and Could you try building scipy against ATLAS libraries from http://www.scipy.org/download/atlasbinaries/linux/ ? See the relevant section in http://www.scipy.org/documentation/buildscipy.txt > the gnu f77 compiler. Installing python 2.3.4, Numeric 23-6 and f2py2e > goes smoothly; all tests run ok. Installing SciPy runs fine, with few serious > error messages. However, when I import scipy in python (which is also not > a problem) and try to run the tests, 15 tests fail with the error message > > "AttributeError: 'module' object has no attribute 'eig'" > > A web search led me to andrewl's message (link: > http://www.scipy.net/pipermail/scipy-user/2004-October/003486.html > ). Running > >> python setup.py clean --all; rm -rf build/ You don't need to run `clean --all`. `rm -rf build` is sufficient. > does nothing for me however (not even when I run it, or any combination > of it, for each separate package after each reinstall). The clean script > seems to run fine, but for the error message > >> 'build/bdist.linux-i686' does not exist -- can't clean it >> 'build/scripts-2.3' does not exist -- can't clean it > > I have no idea whether this is relevant or not. Has anyone else had this > problem? What log outputs should I submit to help solve the problem? As > I am entirely new to the world of python, I would be very grateful for > literal and simplified replies. > > Sincerely, Erik McNellis What is the output of python system_info.py when you run it inside scipy_distutils directory? Pearu From pajer at iname.com Thu Nov 4 10:48:26 2004 From: pajer at iname.com (Gary) Date: Thu, 04 Nov 2004 10:48:26 -0500 Subject: [SciPy-user] Derivative() usage? In-Reply-To: <41896D61.5080006@dwavesys.com> References: <200410211803.24608.falted@pytables.org> <4177EE97.2060701@ucsd.edu> <4188E1AC.2030400@cs.kuleuven.ac.be> <41892947.10006@ucsd.edu> <41896C0D.7010701@dwavesys.com> <41896D61.5080006@dwavesys.com> Message-ID: <418A4F4A.7060404@iname.com> >> > doh, found it. It would be nice if the api docs page had an index of > all functions, instead of having to "guess" that it was in common and > clicking there. Timely comment ... I just spent 15 minutes looking for bisect(), which I knew had to be somewhere, but I didn't know where. (optimize, btw). 
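The last couple of messages about hunting for functions (derivative() turning out to live in scipy.common, bisect() in scipy.optimize) come down to the same pain point: not knowing which scipy subpackage exposes a given routine. One generic way to hunt a name down is plain Python introspection over a list of candidate subpackages, as in the sketch below. This is only a sketch: the candidate list is an illustrative assumption drawn from packages mentioned in this thread, not an official index of scipy 0.3.x.

# Hedged sketch: report which scipy subpackages expose a given name.
# The candidate list below is only an illustration (it includes packages
# mentioned in this thread, e.g. optimize and common); adjust as needed.

def find_name(name, subpackages=('optimize', 'common', 'linalg',
                                 'integrate', 'interpolate',
                                 'special', 'stats')):
    """Return the scipy subpackages whose namespace contains `name`."""
    hits = []
    for sub in subpackages:
        try:
            module = __import__('scipy.%s' % sub, {}, {}, [sub])
        except ImportError:
            continue          # subpackage not installed or not built
        if hasattr(module, name):
            hits.append('scipy.%s' % sub)
    return hits

if __name__ == '__main__':
    for wanted in ('bisect', 'derivative'):
        print wanted, '->', (find_name(wanted) or 'not found')

For the two examples raised in this thread, the answers reported by the posters were scipy.optimize for bisect() and scipy.common for derivative().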
From david.grant at telus.net Thu Nov 4 11:27:41 2004 From: david.grant at telus.net (David Grant) Date: Thu, 04 Nov 2004 08:27:41 -0800 Subject: [SciPy-user] Sparse matrix operations In-Reply-To: <418A187D.9040204@mecha.uni-stuttgart.de> References: <418A187D.9040204@mecha.uni-stuttgart.de> Message-ID: <418A587D.4090005@telus.net> I thought there was a matvec() function there somewhere... Dave Nils Wagner wrote: > Hi all, > > How can I compute the Rayleigh quotient > > R = x' A x/(x' B x), > > where A and B are sparse matrices. > > How do I compute the matrix vector product of a sparse matrix with a > vector ? > Can I use dot(A,x) ? > > > Nils > > > > > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > -- David J. Grant http://www.davidgrant.ca:81 -------------- next part -------------- A non-text attachment was scrubbed... Name: david.grant.vcf Type: text/x-vcard Size: 177 bytes Desc: not available URL: From bldrake at ieee.org Thu Nov 4 15:12:49 2004 From: bldrake at ieee.org (unangnamu) Date: Thu, 4 Nov 2004 12:12:49 -0800 (PST) Subject: [SciPy-user] Problems installing on Suse 9.1 Message-ID: <20041104201249.71425.qmail@web202.biz.mail.re2.yahoo.com> I'm new to installing on a Linux system. When I try to install the Numeric .rpm using YaST2, I get a conflict error: python-base not available Has anyone seen this error before? System: Suse 9.1 Pro Python 2.3.3-85 Dell Inspiron, 1GB RAM Thank you, Barry Drake From tfogal at apollo.sr.unh.edu Thu Nov 4 16:30:13 2004 From: tfogal at apollo.sr.unh.edu (tom fogal) Date: Thu, 04 Nov 2004 16:30:13 -0500 Subject: [SciPy-user] Problems installing on Suse 9.1 In-Reply-To: Your message of "Thu, 04 Nov 2004 12:12:49 PST." 
<20041104201249.71425.qmail@web202.biz.mail.re2.yahoo.com> References: <20041104201249.71425.qmail@web202.biz.mail.re2.yahoo.com> Message-ID: <200411042130.iA4LUDa3029265@apollo.sr.unh.edu> <20041104201249.71425.qmail at web202.biz.mail.re2.yahoo.com>unangnamu writes: >When I try to install the Numeric .rpm using YaST2, I >get a conflict error: > >python-base not available Sounds like you haven't installed python first, and yast can't find the package for some reason. >Has anyone seen this error before? Try installing python, python-dev, python-*, etc. and see if that helps. -tom From missive at hotmail.com Thu Nov 4 17:23:52 2004 From: missive at hotmail.com (Lee Harr) Date: Fri, 05 Nov 2004 02:53:52 +0430 Subject: [SciPy-user] freebsd inverse hyperbolic patch Message-ID: >Btw, can you compile Numeric on freebsd? It has exactly the same issue >and we should copy the workaround from there. > Good call... Here is the FreeBSD patch for Numeric: --- Src/umathmodule.c.orig Sat Aug 2 01:10:09 2003 +++ Src/umathmodule.c Sat Aug 2 01:10:43 2003 @@ -1,9 +1,9 @@ +#include #include "Python.h" #include "Numeric/arrayobject.h" #include "Numeric/ufuncobject.h" #include "abstract.h" -#include #ifndef CHAR_BIT #define CHAR_BIT 8 Doing a similar maneuver in fastumathmodule.c allows the module to compile and link successfully. _________________________________________________________________ Don't just search. Find. Check out the new MSN Search! http://search.msn.com/ From oliphant at ee.byu.edu Thu Nov 4 17:46:29 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 04 Nov 2004 15:46:29 -0700 Subject: [SciPy-user] Sparse matrix operations In-Reply-To: <418A187D.9040204@mecha.uni-stuttgart.de> References: <418A187D.9040204@mecha.uni-stuttgart.de> Message-ID: <418AB145.5000106@ee.byu.edu> Nils Wagner wrote: > Hi all, > > How can I compute the Rayleigh quotient > > R = x' A x/(x' B x), > > where A and B are sparse matrices. > > How do I compute the matrix vector product of a sparse matrix with a > vector ? > Can I use dot(A,x) ? No, This won't work. dot(x,A*x ) should work -Travis From oliphant at ee.byu.edu Thu Nov 4 17:51:09 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 04 Nov 2004 15:51:09 -0700 Subject: [SciPy-user] Iterative solvers and complex arithmetic In-Reply-To: <4188CD1C.5040501@mecha.uni-stuttgart.de> References: <4188CD1C.5040501@mecha.uni-stuttgart.de> Message-ID: <418AB25D.6060002@ee.byu.edu> Nils Wagner wrote: > Hi all, > > Who can resolve this problem ? > > python2.3 -i failure.py > > Traceback (most recent call last): > File "failure.py", line 9, in ? > x,info = linalg.gmres(A,r) > File "/usr/lib/python2.3/site-packages/scipy/linalg/iterative.py", > line 499, in gmres > work[slice2] += sclr1*matvec(work[slice1]) > TypeError: return array has incorrect type > > Nils I will try to fix this, for now make sure A and r have the same type. By the way, you don't need from RandomArray import * rand is available from scipy -Travis From bldrake at adaptcs.com Thu Nov 4 18:05:47 2004 From: bldrake at adaptcs.com (Barry Drake) Date: Thu, 4 Nov 2004 15:05:47 -0800 (PST) Subject: [SciPy-user] Problems installing on Suse 9.1 In-Reply-To: <200411042130.iA4LUDa3029265@apollo.sr.unh.edu> Message-ID: <20041104230547.34220.qmail@web203.biz.mail.re2.yahoo.com> Thanks Tom. Python is installed and working fine. I have other packages working from the site-packages directory, e.g., drPython. I decided to ignore the error and let YaST install Numeric. 
I then installed scipy and from Idle imported the packages. Everything seems to be found ok. Are there some simple examples I can try to test the installation? Barry --- tom fogal wrote: > > <20041104201249.71425.qmail at web202.biz.mail.re2.yahoo.com>unangnamu > writes: > >When I try to install the Numeric .rpm using YaST2, > I > >get a conflict error: > > > >python-base not available > > Sounds like you haven't installed python first, and > yast can't find the > package for some reason. > > >Has anyone seen this error before? > > Try installing python, python-dev, python-*, etc. > and see if that > helps. > > -tom > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From pearu at scipy.org Thu Nov 4 18:13:02 2004 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 4 Nov 2004 17:13:02 -0600 (CST) Subject: [SciPy-user] freebsd inverse hyperbolic patch In-Reply-To: References: Message-ID: On Fri, 5 Nov 2004, Lee Harr wrote: >> Btw, can you compile Numeric on freebsd? It has exactly the same issue >> and we should copy the workaround from there. >> > > Good call... Here is the FreeBSD patch for Numeric: > > --- Src/umathmodule.c.orig Sat Aug 2 01:10:09 2003 > +++ Src/umathmodule.c Sat Aug 2 01:10:43 2003 > @@ -1,9 +1,9 @@ > > +#include > #include "Python.h" > #include "Numeric/arrayobject.h" > #include "Numeric/ufuncobject.h" > #include "abstract.h" > -#include > > #ifndef CHAR_BIT > #define CHAR_BIT 8 > > > Doing a similar maneuver in fastumathmodule.c > allows the module to compile and link successfully. This patch is now applied to scipy CVS. Thanks, Pearu From david at dwavesys.com Thu Nov 4 18:20:26 2004 From: david at dwavesys.com (David Grant) Date: Thu, 04 Nov 2004 15:20:26 -0800 Subject: [SciPy-user] Problems installing on Suse 9.1 In-Reply-To: <20041104230547.34220.qmail@web203.biz.mail.re2.yahoo.com> References: <20041104230547.34220.qmail@web203.biz.mail.re2.yahoo.com> Message-ID: <418AB93A.6060508@dwavesys.com> Barry Drake wrote: >Thanks Tom. > >Python is installed and working fine. I have other >packages working from the site-packages directory, >e.g., drPython. > >I decided to ignore the error and let YaST install >Numeric. I then installed scipy and from Idle >imported the packages. Everything seems to be found >ok. > >Are there some simple examples I can try to test the >installation? > > from scipy.gplt import plot plot([1,2,3,4],[1,4,9,16]) from scipy.common import derivative a=2 derivative(lambda x: x**2, a, 3) #Finds the 3-point discrete derivative at x=a of the function x**2 From pearu at scipy.org Thu Nov 4 18:20:53 2004 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 4 Nov 2004 17:20:53 -0600 (CST) Subject: [SciPy-user] Problems installing on Suse 9.1 In-Reply-To: <20041104230547.34220.qmail@web203.biz.mail.re2.yahoo.com> References: <20041104230547.34220.qmail@web203.biz.mail.re2.yahoo.com> Message-ID: On Thu, 4 Nov 2004, Barry Drake wrote: > Python is installed and working fine. I have other > packages working from the site-packages directory, > e.g., drPython. > > I decided to ignore the error and let YaST install > Numeric. I then installed scipy and from Idle > imported the packages. Everything seems to be found > ok. > > Are there some simple examples I can try to test the > installation? 
import scipy scipy.test() Pearu From tfogal at apollo.sr.unh.edu Thu Nov 4 19:06:59 2004 From: tfogal at apollo.sr.unh.edu (tom fogal) Date: Thu, 04 Nov 2004 19:06:59 -0500 Subject: [SciPy-user] Problems installing on Suse 9.1 In-Reply-To: Your message of "Thu, 04 Nov 2004 15:05:47 PST." <20041104230547.34220.qmail@web203.biz.mail.re2.yahoo.com> References: <20041104230547.34220.qmail@web203.biz.mail.re2.yahoo.com> Message-ID: <200411050006.iA506x24029772@apollo.sr.unh.edu> <20041104230547.34220.qmail at web203.biz.mail.re2.yahoo.com>Barry Drake writes: >I decided to ignore the error and let YaST install >Numeric. I then installed scipy and from Idle >imported the packages. Everything seems to be found >ok. Excellent. >Are there some simple examples I can try to test the >installation? Have you tried scipy.test ? tfogal at thetis tfogal $ python Python 2.3.4 (#1, Sep 29 2004, 14:29:41) [GCC 3.3.4 20040623 (Gentoo Linux 3.3.4-r2, ssp-3.3.2-2, pie-8.7.6)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import scipy >>> scipy.test() -tom >> >When I try to install the Numeric .rpm using YaST2, >> I >> >get a conflict error: >> > >> >python-base not available >> >> Sounds like you haven't installed python first, and >> yast can't find the >> package for some reason. >> >> >Has anyone seen this error before? >> >> Try installing python, python-dev, python-*, etc. >> and see if that >> helps. >> >> -tom From bldrake at adaptcs.com Thu Nov 4 19:14:13 2004 From: bldrake at adaptcs.com (Barry Drake) Date: Thu, 4 Nov 2004 16:14:13 -0800 (PST) Subject: [SciPy-user] Problems installing on Suse 9.1 In-Reply-To: <200411050006.iA506x24029772@apollo.sr.unh.edu> Message-ID: <20041105001414.21703.qmail@web201.biz.mail.re2.yahoo.com> Thank you to all who answered my questions. I tried all of the suggested test code from Idle and all seems to be working well. Running scipy.test() resulted in a couple of errors that I'll check into. Thanks! Barry Drake --- tom fogal wrote: > > <20041104230547.34220.qmail at web203.biz.mail.re2.yahoo.com>Barry > Drake writes: > >I decided to ignore the error and let YaST install > >Numeric. I then installed scipy and from Idle > >imported the packages. Everything seems to be > found > >ok. > > Excellent. > > >Are there some simple examples I can try to test > the > >installation? > > Have you tried scipy.test ? > > tfogal at thetis tfogal $ python > Python 2.3.4 (#1, Sep 29 2004, 14:29:41) > [GCC 3.3.4 20040623 (Gentoo Linux 3.3.4-r2, > ssp-3.3.2-2, pie-8.7.6)] on > linux2 > Type "help", "copyright", "credits" or "license" for > more information. > >>> import scipy > >>> scipy.test() > > -tom > > >> >When I try to install the Numeric .rpm using > YaST2, > >> I > >> >get a conflict error: > >> > > >> >python-base not available > >> > >> Sounds like you haven't installed python first, > and > >> yast can't find the > >> package for some reason. > >> > >> >Has anyone seen this error before? > >> > >> Try installing python, python-dev, python-*, etc. > >> and see if that > >> helps. 
> >> > >> -tom > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From nwagner at mecha.uni-stuttgart.de Fri Nov 5 04:30:15 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 05 Nov 2004 10:30:15 +0100 Subject: [SciPy-user] Sparse matrix operations In-Reply-To: <418AB145.5000106@ee.byu.edu> References: <418A187D.9040204@mecha.uni-stuttgart.de> <418AB145.5000106@ee.byu.edu> Message-ID: <418B4827.2010601@mecha.uni-stuttgart.de> Travis Oliphant wrote: > Nils Wagner wrote: > >> Hi all, >> >> How can I compute the Rayleigh quotient >> >> R = x' A x/(x' B x), >> >> where A and B are sparse matrices. >> >> How do I compute the matrix vector product of a sparse matrix with a >> vector ? >> Can I use dot(A,x) ? > > > No, This won't work. > > dot(x,A*x ) should work > > -Travis > Now, assume that x is a rectangular (dense) matrix and A is a sparse matrix ared = dot(transpose(x0),ma*x0) File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line 179, in __mul__ res = csc * other File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line 403, in __mul__ return self.matvec(other) File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line 492, in matvec raise ValueError, "Dimension mismatch" ValueError: Dimension mismatch >>> shape(x0) (715, 10) >>> shape(transpose(x0)) (10, 715) >>> shape(ma) (715, 715) Any suggestion ? Nils > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From nuttyputtyzzzzt at hotmail.com Fri Nov 5 05:37:42 2004 From: nuttyputtyzzzzt at hotmail.com (zzzzt tzzzz) Date: Fri, 05 Nov 2004 11:37:42 +0100 Subject: [SciPy-user] No luck: "AttributeError: 'module' object has no attribute 'eig'" Message-ID: Hi again. First of all, I apologize for spamming. I thought my first two tries bounced. Second, yes, I use the prebuilt atlas binaries from http://www.scipy.org/download/atlasbinaries/linux/ , for my architecture. I have, to the extent of my ability, tried to follow the instructions in http://www.scipy.org/documentation/buildscipy.txt to the letter, and I still get 15 failed tests with the error message above. I am enclosing the output of the >python system_info.py; run (the file is probably easiest viewed with emacs or vi or an equivalent program). I am very grateful for your quick response and great patience with us less experienced users; thank you so much. Sincerely, Erik McNellis. _________________________________________________________________ L?ttare att hitta dr?mresan med MSN Resor http://www.msn.se/resor/ -------------- next part -------------- A non-text attachment was scrubbed... Name: SysInfoOutput Type: application/octet-stream Size: 10926 bytes Desc: not available URL: From pearu at scipy.org Fri Nov 5 08:09:15 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 5 Nov 2004 07:09:15 -0600 (CST) Subject: [SciPy-user] No luck: "AttributeError: 'module' object has no attribute 'eig'" In-Reply-To: References: Message-ID: On Fri, 5 Nov 2004, zzzzt tzzzz wrote: > Hi again. > > First of all, I apologize for spamming. I thought my first two tries bounced. > Second, yes, I use the prebuilt atlas binaries from > http://www.scipy.org/download/atlasbinaries/linux/ , for my architecture. 
I > have, to the extent of my ability, tried to follow the instructions in > http://www.scipy.org/documentation/buildscipy.txt to the letter, and I still > get 15 failed tests with the error message above. I am enclosing the output > of the >> python system_info.py; > run (the file is probably easiest viewed with emacs or vi or an equivalent > program). The problem is that system_info picks up threaded version of atlas libraries from /usr/local/lib (that probably do not contain a complete lapack and has version 3.4.1) instead of the ones from /home/erik/NewInstall/lib. Are you using scipy_distutils from CVS? I thought I have fixed such problems in CVS. I'll look into again to find out a proper solution to avoid such problems in future.. One workaround would be to use threaded version of atlas libraries that you can also download from http://www.scipy.org/download/atlasbinaries/linux/ Pearu From nuttyputtyzzzzt at hotmail.com Fri Nov 5 09:10:50 2004 From: nuttyputtyzzzzt at hotmail.com (zzzzt tzzzz) Date: Fri, 05 Nov 2004 15:10:50 +0100 Subject: [SciPy-user] No luck: "AttributeError: 'module' object has no attribute 'eig'" Message-ID: Hi Pearu, I appreciate the quick reply very much. I used the SciPy sources in the .tar.gz archive from http://www.scipy.org/download/scipy/src/SciPy_complete-0.3.2.tar.gz . I'm sorry but I don't quite understand; threaded atlas libs? Do they come prebuilt for single AthlonSSE1 from http://www.scipy.org/download/atlasbinaries/linux/ (and in that case, which one is it?) or do I have to build them myself? I'll post another message once the problem is solved... Regards, Erik McNellis. _________________________________________________________________ Chat: Ha en fest p? Habbo Hotel http://habbohotel.msn.se/habbo/sv/channelizer Checka in h?r! From pearu at scipy.org Fri Nov 5 09:17:15 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 5 Nov 2004 08:17:15 -0600 (CST) Subject: [SciPy-user] No luck: "AttributeError: 'module' object has no attribute 'eig'" In-Reply-To: References: Message-ID: On Fri, 5 Nov 2004, zzzzt tzzzz wrote: > I appreciate the quick reply very much. I used the SciPy sources in the > .tar.gz archive > from http://www.scipy.org/download/scipy/src/SciPy_complete-0.3.2.tar.gz . > I'm sorry but I don't quite understand; threaded atlas libs? Do they come > prebuilt > for single AthlonSSE1 from > http://www.scipy.org/download/atlasbinaries/linux/ (and in that case, which > one is it?) or do I have to build them myself? I'll post another message once > the problem is solved... You can use atlas3.6.0_Linux_ATHLONSSE1_2.tgz that contain threaded atlas libraries, i.e. libpt{f77,c}blas.a files. I presume that you are running on machine with several CPUs, otherwise it does not make sense that your system has threaded atlas libraries installed.. Pearu From nuttyputtyzzzzt at hotmail.com Fri Nov 5 09:35:46 2004 From: nuttyputtyzzzzt at hotmail.com (zzzzt tzzzz) Date: Fri, 05 Nov 2004 15:35:46 +0100 Subject: [SciPy-user] No luck: "AttributeError: 'module' object has no attribute 'eig'" Message-ID: Hi Pearu. Well, I use SciPy as a requirement for ASE, which I in turn need to run Dacapo. The python and ASE code is used to generate input files for the Dacapo code, which then runs the calculations in parallell on the cluster nodes. It makes sense to me to run SciPy on the cluster frontend only, which is a single CPU machine. I dont know, maybe the threaded code would work for a single CPU system as well? 
Regards, Erik McNellis _________________________________________________________________ Hitta r?tt p? n?tet med MSN S?k http://search.msn.se/ From pearu at scipy.org Fri Nov 5 09:45:31 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 5 Nov 2004 08:45:31 -0600 (CST) Subject: [SciPy-user] No luck: "AttributeError: 'module' object has no attribute 'eig'" In-Reply-To: References: Message-ID: On Fri, 5 Nov 2004, zzzzt tzzzz wrote: > Well, I use SciPy as a requirement for ASE, which I in turn need to run > Dacapo. The python and ASE code is used to generate input files for the > Dacapo code, which then runs the calculations in parallell on the cluster > nodes. It makes sense to me to run SciPy on the cluster frontend only, which > is a single CPU machine. I dont know, maybe the threaded code would work for > a single CPU system as well? Give it a try. Pearu From nuttyputtyzzzzt at hotmail.com Fri Nov 5 10:43:43 2004 From: nuttyputtyzzzzt at hotmail.com (zzzzt tzzzz) Date: Fri, 05 Nov 2004 16:43:43 +0100 Subject: [SciPy-user] Solved!!! "AttributeError: 'module' object has no attribute 'eig'" Message-ID: Hi, It worked!! The threaded code http://www.scipy.org/download/atlasbinaries/linux/atlas3.6.0_Linux_ATHLONSSE1_2.tgz runs all tests without glitches. Thank you very much for your advice, Pearu. I hope this has been of some use to the scipy community, too... Regards, Erik McNellis _________________________________________________________________ Auktioner: Tj?na en hacka p? gamla prylar http://tradera.msn.se From oliphant at ee.byu.edu Fri Nov 5 12:11:14 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 05 Nov 2004 10:11:14 -0700 Subject: [SciPy-user] Sparse matrix operations In-Reply-To: <418B4827.2010601@mecha.uni-stuttgart.de> References: <418A187D.9040204@mecha.uni-stuttgart.de> <418AB145.5000106@ee.byu.edu> <418B4827.2010601@mecha.uni-stuttgart.de> Message-ID: <418BB432.5060607@ee.byu.edu> Nils Wagner wrote: > Travis Oliphant wrote: > >> Nils Wagner wrote: >> >>> Hi all, >>> >>> How can I compute the Rayleigh quotient >>> >>> R = x' A x/(x' B x), >>> >>> where A and B are sparse matrices. >>> >>> How do I compute the matrix vector product of a sparse matrix with a >>> vector ? >>> Can I use dot(A,x) ? >> >> >> >> No, This won't work. >> >> dot(x,A*x ) should work >> >> -Travis >> > Now, assume that x is a rectangular (dense) matrix and A is a sparse > matrix > > ared = dot(transpose(x0),ma*x0) > File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line > 179, in __mul__ > res = csc * other > File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line > 403, in __mul__ > return self.matvec(other) > File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line > 492, in matvec > raise ValueError, "Dimension mismatch" > ValueError: Dimension mismatch > >>> shape(x0) > (715, 10) > >>> shape(transpose(x0)) > (10, 715) > >>> shape(ma) > (715, 715) > > Any suggestion ? sparse matrix * dense matrix is not yet supported. You will have to loop over the columns of the dense matrix. -Travis From falted at pytables.org Fri Nov 5 13:28:32 2004 From: falted at pytables.org (Francesc Altet) Date: Fri, 5 Nov 2004 19:28:32 +0100 Subject: [SciPy-user] ANN: PyTables-0.9 released Message-ID: <200411051928.32583.falted@pytables.org> Announcing PyTables 0.9 ----------------------- I'm pleased to announce the availability of the latest incarnation of PyTables. 
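Before the background sections that follow, a minimal usage sketch may help make the feature list concrete. It is a sketch only: the openFile/createTable calls, the IsDescription column classes (StringCol, IntCol, FloatCol) and every file and field name here are my assumptions about the 0.8/0.9-era tutorial API rather than text from the announcement; the User's Manual remains the authoritative reference.

# Hedged sketch of the basic table workflow (assumed 0.8/0.9-era API:
# openFile/createTable plus the generic IsDescription column classes).
from tables import openFile, IsDescription, StringCol, IntCol, FloatCol

class Measurement(IsDescription):      # fixed-length record layout
    sensor   = StringCol(16)           # 16-character string field
    sample   = IntCol()                # integer field
    pressure = FloatCol()              # double-precision field

h5file = openFile("demo.h5", mode="w", title="Demo file")
table = h5file.createTable(h5file.root, "readings", Measurement,
                           "Sensor readings")

row = table.row
for i in range(10):                    # append a few records
    row['sensor'] = 'probe-%d' % (i % 3)
    row['sample'] = i
    row['pressure'] = 1000.0 + i
    row.append()
table.flush()

# iterrows() is the iteration entry point named in this announcement
high = [r['sample'] for r in table.iterrows() if r['pressure'] > 1005.0]
print high

h5file.close()

The column indexing and in-kernel selections described in this release note operate on this same Table object, so the sketch is meant only as orientation for readers new to the package.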
On this release you will find a series of quite exciting new features, being the most important the indexing capabilities, in-kernel selections, support for complex datatypes and the possibility to modify values in both tables *and* arrays (yeah, finally :). What is ------- PyTables is a hierarchical database package designed to efficiently manage extremely large amounts of data (supports full 64-bit file addressing). It features an object-oriented interface that, combined with C extensions for the peformance-critical parts of the code, makes it a very easy to use tool for high performance data saving and retrieving. It is built on top of the HDF5 library and the numarray package, and provides containers for both heterogeneous data (Tables) and homogeneous data (Array, EArray). It also sports a container for keeping lists of objects of variable length on a very efficient way (VLArray). A flexible support of filters allows you to compress your data on-the-flight by using different compressors and compression enablers. Moreover, its powerful browsing and searching capabilities allow you to do data selections over tables exceeding gigabytes of data in just tenths of second. Changes more in depth --------------------- New features: - Indexing of columns in tables. That allow to make data selections on tables up to 500 times faster than standard selections (for ex. doing a selection along an indexed column of 100 milion of rows takes less than 1 second on a modern CPU). Perhaps the most interesting thing about the indexing algorithm implemented by PyTables is that the time taken to index grows *lineraly* with the length of the data, so, making the indexation process to be *scalable* (quite differently to many relational databases). This means that it can index, in a relatively quick way, arbitrarily large table columns (for ex. indexing a column of 100 milion of rows takes just 100 seconds, i.e. at a rate of 1 Mrow/sec). See more detailed info about that in http://pytables.sourceforge.net/doc/SciPy04.pdf. - In-kernel selections. This feature allow to make data selections on tables up to 5 times faster than standard selections (i.e. pre-0.9 selections), without a need to create an index. As a hint of how fast these selections can be, they are up to 10 times faster than a traditional relational database. Again, see http://pytables.sourceforge.net/doc/SciPy04.pdf for some experiments on that matter. - Support of complex datatypes for all the data objects (i.e. Table, Array, EArray and VLArray). With that, the complete set of datatypes of Numeric and numarray packages are supported. Thanks to Tom Hedley for providing the patches for Array, EArray and VLArray objects, as well as updating the User's Manual and adding unit tests for the new functionality. - Modification of values. You can modifiy Table, Array, EArray and VLArray values. See Table.modifyRows, Table.modifyColumns() and the newly introduced __setitem__() method for Table, Array, EArray and VLArray entities in the Library Reference of User's Manual. - A new sub-package called "nodes" is there. On it, there will be included different modules to make more easy working with different entities (like images, files, ...). The first module that has been added to this sub-package is "FileNode", whose mission is to enable the creation of a database of nodes which can be used like regular opened files in Python. In other words, you can store a set of files in a PyTables database, and read and write it as you would do with any other file in Python. 
Thanks to Ivan Vilata i Balaguer for contributing this. Improvements: - New __len__(self) methods added in Arrays, Tables and Columns. This, in combination with __getitem__(self,key) allows to better emulate sequences. - Better capabilities to import generic HDF5 files. In particular, Table objects (in the HDF5_HL naming schema) with "holes" in their compound type definition are supported. That allows to read certain files produced by NASA (thanks to Stephen Walton for reporting this). - Much improved test units. More than 2000 different tests has been implemented which accounts for more than 13000 loc (this represents twice of the PyTables library code itself (!)). Backward-incompatible API changes: - The __call__ special method has been removed from objects File, Group, Table, Array, EArray and VLArray. Now, you should use walkNodes() in File and Group and iterrows in Table, Array, EArray and VLArray to get the same functionality. This would provide better compatibility with IPython as well. 'nctoh5', a new importing utility: - Jeff Whitaker has contributed a script to easily convert NetCDF files into HDF5 files using Scientific Python and PyTables. It has been included and documented as a new utility. Bug fixes: - A call to File.flush() now invoke a call to H5Fflush() so to effectively flushing all the file contents to disk. Thanks to Shack Toms for reporting this and providing a patch. - SF #1054683: Security hole in utils.checkNameValidity(). Reported in 2004-10-26 by ivilata - SF #1049297: Suggestion: new method File.delAttrNode(). Reported in 2004-10-18 by ivilata - SF #1049285: Leak in AttributeSet.__delattr__(). Reported in 2004-10-18 by ivilata - SF #1014298: Wrong method call in examples/tutorial1-2.py. Reported in 2004-08-23 by ivilata - SF #1013202: Cryptic error appending to EArray on RO file. Reported in 2004-08-21 by ivilata - SF #991715: Table.read(field="var1", flavor="List") fails. Reported in 2004-07-15 by falted - SF #988547: Wrong file type assumption in File.__new__. Reported in 2004-07-10 by ivilata Where PyTables can be applied? ------------------------------ PyTables is not designed to work as a relational database competitor, but rather as a teammate. If you want to work with large datasets of multidimensional data (for example, for multidimensional analysis), or just provide a categorized structure for some portions of your cluttered RDBS, then give PyTables a try. It works well for storing data from data acquisition systems (DAS), simulation software, network data monitoring systems (for example, traffic measurements of IP packets on routers), very large XML files, or for creating a centralized repository for system logs, to name only a few possible uses. What is a table? ---------------- A table is defined as a collection of records whose values are stored in fixed-length fields. All records have the same structure and all values in each field have the same data type. The terms "fixed-length" and "strict data types" seem to be quite a strange requirement for a language like Python that supports dynamic data types, but they serve a useful function if the goal is to save very large quantities of data (such as is generated by many scientific applications, for example) in an efficient manner that reduces demand on CPU time and I/O resources. What is HDF5? ------------- For those people who know nothing about HDF5, it is a general purpose library and file format for storing scientific data made at NCSA. HDF5 can store two primary objects: datasets and groups. 
A dataset is essentially a multidimensional array of data elements, and a group is a structure for organizing objects in an HDF5 file. Using these two basic constructs, one can create and store almost any kind of scientific data structure, such as images, arrays of vectors, and structured and unstructured grids. You can also mix and match them in HDF5 files according to your needs. Platforms --------- I'm using Linux (Intel 32-bit) as the main development platform, but PyTables should be easy to compile/install on many other UNIX machines. This package has also passed all the tests on a UltraSparc platform with Solaris 7 and Solaris 8. It also compiles and passes all the tests on a SGI Origin2000 with MIPS R12000 processors, with the MIPSPro compiler and running IRIX 6.5. It also runs fine on Linux 64-bit platforms, like AMD Opteron running GNU/Linux 2.4.21 Server, Intel Itanium (IA64) running GNU/Linux 2.4.21 or PowerPC G5 with Linux 2.6.x in 64bit mode. It has also been tested in MacOSX platforms (10.2 but should also work on newer versions). Regarding Windows platforms, PyTables has been tested with Windows 2000 and Windows XP (using the Microsoft Visual C compiler), but it should also work with other flavors as well. Web site -------- Go to the PyTables web site for more details: http://pytables.sourceforge.net/ To know more about the company behind the PyTables development, see: http://www.carabos.com/ Share your experience --------------------- Let me know of any bugs, suggestions, gripes, kudos, etc. you may have. Bon profit! -- Francesc Altet From pearu at scipy.org Sat Nov 6 05:17:18 2004 From: pearu at scipy.org (Pearu Peterson) Date: Sat, 6 Nov 2004 04:17:18 -0600 (CST) Subject: [SciPy-user] No luck: "AttributeError: 'module' object has no attribute 'eig'" In-Reply-To: References: Message-ID: On Fri, 5 Nov 2004, Pearu Peterson wrote: > The problem is that system_info picks up threaded version of atlas libraries > from /usr/local/lib (that probably do not contain a complete lapack and has > version 3.4.1) instead of the ones from /home/erik/NewInstall/lib. > > Are you using scipy_distutils from CVS? I thought I have fixed such problems > in CVS. I'll look into again to find out a proper solution to avoid such > problems in future.. To disable threaded atlas, one should define PTATLAS=None environment variable when using recent scipy_distutils from CVS. Pearu From yaroslavvb at gmail.com Thu Nov 4 20:33:07 2004 From: yaroslavvb at gmail.com (Yaroslav Bulatov) Date: Thu, 4 Nov 2004 17:33:07 -0800 Subject: [SciPy-user] fmin won't optimize Message-ID: Hi I'm fitting Gibbs distribution to data using maximum likelihood. If I use fmin_powell, it works, but when I use fmin, it terminates after 1 iteration regardless of starting point. The log-likelihood function is convex. Any idea why this would happen? 
The code below terminates with likelihood 4.158883, whereas the minimum is 3.296169 # Train simple 3 parameter Gibbs Distribution import math from scipy import * from scipy.optimize import * train_set=[(0,0),(0,1),(1,1)] #test_set=[(0,0),(0,1),(1,1)] # Negative log-likelihood def neg_ll(lambdas): unnormalized = lambda(x): math.exp(lambdas[0]*x[0]+lambdas[1]*x[1]+lambdas[2]*x[0]*x[1]) Z = sum([unnormalized(x) for x in [(0,0),(0,1),(1,0),(1,1)]]) ll = 0 for x in train_set: ll+=math.log(unnormalized(x))-math.log(Z) return -ll def main(): x0 = [0,0,0] xopt = fmin(neg_ll, x0) print 'Before training: %f' %(neg_ll(x0),) print 'After training: %f' %(neg_ll(xopt),) if __name__=='__main__': main() From pearu at scipy.org Sat Nov 6 12:14:25 2004 From: pearu at scipy.org (Pearu Peterson) Date: Sat, 6 Nov 2004 11:14:25 -0600 (CST) Subject: [SciPy-user] fmin won't optimize In-Reply-To: References: Message-ID: On Thu, 4 Nov 2004, Yaroslav Bulatov wrote: > I'm fitting Gibbs distribution to data using maximum likelihood. If I > use fmin_powell, it works, but when I use fmin, it terminates after 1 > iteration regardless of starting point. The log-likelihood function is > convex. Any idea why this would happen? Try `x0=[0.0, 0.0, 0.0]` in your code. The problem that integer initial value to fmin terminates iteration immidiately should be fixed in scipy CVS. Here is the output from your program that I get: Optimization terminated successfully. Current function value: 3.295837 Iterations: 269 Function evaluations: 510 Before training: 4.158883 After training: 3.295837 Pearu From yaroslavvb at gmail.com Sat Nov 6 22:11:04 2004 From: yaroslavvb at gmail.com (Yaroslav Bulatov) Date: Sat, 6 Nov 2004 19:11:04 -0800 Subject: [SciPy-user] fmin won't optimize In-Reply-To: References: Message-ID: Hi I'm fitting Gibbs distribution to data using maximum likelihood. If I use fmin_powell, it works, but when I use fmin, it terminates after 1 iteration regardless of starting point. The log-likelihood function is convex. Any idea why this would happen? The code below terminates with likelihood 4.158883 if I use fmin, but with 3.296169 if "fmin" is replaced with "fmin_powell" # Train simple 3 parameter Gibbs Distribution import math from scipy import * from scipy.optimize import * train_set=[(0,0),(0,1),(1,1)] #test_set=[(0,0),(0,1),(1,1)] # Negative log-likelihood def neg_ll(lambdas): unnormalized = lambda(x): math.exp(lambdas[0]*x[0]+lambdas[1]*x[1]+lambdas[2]*x[0]*x[1]) Z = sum([unnormalized(x) for x in [(0,0),(0,1),(1,0),(1,1)]]) ll = 0 for x in train_set: ll+=math.log(unnormalized(x))-math.log(Z) return -ll def main(): x0 = [0,0,0] xopt = fmin(neg_ll, x0) print 'Before training: %f' %(neg_ll(x0),) print 'After training: %f' %(neg_ll(xopt),) if __name__=='__main__': main() From oliphant at ee.byu.edu Sat Nov 6 23:58:19 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 06 Nov 2004 21:58:19 -0700 Subject: [SciPy-user] fmin won't optimize In-Reply-To: References: Message-ID: <418DAB6B.9020003@ee.byu.edu> Yaroslav Bulatov wrote: >Hi > >I'm fitting Gibbs distribution to data using maximum likelihood. If I >use fmin_powell, it works, but when I use fmin, it terminates after 1 >iteration regardless of starting point. The log-likelihood function is >convex. Any idea why this would happen? 
> >The code below terminates with likelihood 4.158883 if I use fmin, but >with 3.296169 if "fmin" is replaced with "fmin_powell" > ># Train simple 3 parameter Gibbs Distribution > >import math >from scipy import * >from scipy.optimize import * > >train_set=[(0,0),(0,1),(1,1)] >#test_set=[(0,0),(0,1),(1,1)] > ># Negative log-likelihood >def neg_ll(lambdas): > unnormalized = lambda(x): > > Note, this line break needed to be fixed for it to work. > math.exp(lambdas[0]*x[0]+lambdas[1]*x[1]+lambdas[2]*x[0]*x[1]) > Z = sum([unnormalized(x) for x in [(0,0),(0,1),(1,0),(1,1)]]) > ll = 0 > for x in train_set: > ll+=math.log(unnormalized(x))-math.log(Z) > return -ll > >def main(): > x0 = [0,0,0] > xopt = fmin(neg_ll, x0) > print 'Before training: %f' %(neg_ll(x0),) > print 'After training: %f' %(neg_ll(xopt),) > >if __name__=='__main__': main() > > > I tried your code and it worked like this for me: >>> xopt = fmin_powell(neg_ll,x0) Optimization terminated successfully. Current function value: 3.296169 Iterations: 7 Function evaluations: 168 >>> print xopt [-8.2385 -0.0122 8.254 ] >>> xopt = fmin(neg_ll,x0) Optimization terminated successfully. Current function value: 3.295837 Iterations: 269 Function evaluations: 510 >>> print xopt [-34.2701 -0. 34.2701] From aisaac at american.edu Sat Nov 6 18:31:50 2004 From: aisaac at american.edu (Alan G Isaac) Date: Sat, 6 Nov 2004 18:31:50 -0500 (Eastern Standard Time) Subject: [SciPy-user] Iterative solvers and complex arithmetic In-Reply-To: <418AB25D.6060002@ee.byu.edu> References: <4188CD1C.5040501@mecha.uni-stuttgart.de><418AB25D.6060002@ee.byu.edu> Message-ID: On Thu, 04 Nov 2004, Travis Oliphant apparently wrote: > By the way, you don't need from RandomArray import * > rand is available from scipy Just for feedback from a newbie: such observations are very helpful. In fact I read that early on in the tutorial, but I had forgotten it. Thanks, Alan Isaac PS The only mention I saw was one sentence under the Common functions section: "The functions rand and randn are used so often that they warranted a place at the top level." This does not serve as documentation. If there is an easy way for me to add to the tutorial when I run across such instances, I would be happy to do so. From nuttyputtyzzzzt at hotmail.com Sun Nov 7 10:01:43 2004 From: nuttyputtyzzzzt at hotmail.com (zzzzt tzzzz) Date: Sun, 07 Nov 2004 16:01:43 +0100 Subject: [SciPy-user] Whatever happened to the Scientific_netcdf module?!? Message-ID: Hi, I am trying to get SciPy.0.3.2 to work for another program which requires the Scientific_netcdf module (and a couple of other netcdf related modules). I have found instructions for building these modules for Scientific Python 2.4.6, but I'm still aiming for 0.3.2. If possible, how do I build these modules for SciPy.0.3.2? Regards, Erik McNellis _________________________________________________________________ L?ttare att hitta dr?mresan med MSN Resor http://www.msn.se/resor/ From jdgleeson at mac.com Sun Nov 7 10:30:58 2004 From: jdgleeson at mac.com (John Gleeson) Date: Sun, 7 Nov 2004 08:30:58 -0700 Subject: [SciPy-user] fmin won't optimize In-Reply-To: <418DAB6B.9020003@ee.byu.edu> References: <418DAB6B.9020003@ee.byu.edu> Message-ID: <05B4E5BD-30D2-11D9-A9A4-000A95B4710C@mac.com> On Nov 6, 2004, at 9:58 PM, Travis Oliphant wrote: > Yaroslav Bulatov wrote: > >> Hi >> >> I'm fitting Gibbs distribution to data using maximum likelihood. 
If I >> use fmin_powell, it works, but when I use fmin, it terminates after 1 >> iteration regardless of starting point. The log-likelihood function is >> convex. Any idea why this would happen? >> >> The code below terminates with likelihood 4.158883 if I use fmin, but >> with 3.296169 if "fmin" is replaced with "fmin_powell" >> >> # Train simple 3 parameter Gibbs Distribution >> >> import math >> from scipy import * >> from scipy.optimize import * >> >> train_set=[(0,0),(0,1),(1,1)] >> #test_set=[(0,0),(0,1),(1,1)] >> >> # Negative log-likelihood >> def neg_ll(lambdas): >> unnormalized = lambda(x): >> > Note, this line break needed to be fixed for it to work. > >> math.exp(lambdas[0]*x[0]+lambdas[1]*x[1]+lambdas[2]*x[0]*x[1]) >> Z = sum([unnormalized(x) for x in [(0,0),(0,1),(1,0),(1,1)]]) >> ll = 0 >> for x in train_set: >> ll+=math.log(unnormalized(x))-math.log(Z) >> return -ll >> >> def main(): >> x0 = [0,0,0] >> xopt = fmin(neg_ll, x0) >> print 'Before training: %f' %(neg_ll(x0),) >> print 'After training: %f' %(neg_ll(xopt),) >> >> if __name__=='__main__': main() >> >> > I tried your code and it worked like this for me: > > >>> xopt = fmin_powell(neg_ll,x0) > Optimization terminated successfully. > Current function value: 3.296169 > Iterations: 7 > Function evaluations: 168 > >>> print xopt > [-8.2385 -0.0122 8.254 ] > > >>> xopt = fmin(neg_ll,x0) > Optimization terminated successfully. > Current function value: 3.295837 > Iterations: 269 > Function evaluations: 510 > >>> print xopt > [-34.2701 -0. 34.2701] > Yaroslav, maybe you missed Pearu's response to your earlier email? I get 1 iteration and 4.158883 from fmin also. But if I change x0 = [0,0,0] to x0 = [0.0,0.0,0.0], as Pearu suggested , then I get exactly the same results as Travis. John From bhoel at web.de Sun Nov 7 15:06:43 2004 From: bhoel at web.de (=?iso-8859-15?q?Berthold_H=F6llmann?=) Date: Sun, 07 Nov 2004 21:06:43 +0100 Subject: [SciPy-user] Re: Whatever happened to the Scientific_netcdf module?!? In-Reply-To: (zzzzt tzzzz's message of "Sun, 07 Nov 2004 16:01:43 +0100") References: Message-ID: "zzzzt tzzzz" writes: > Hi, > > I am trying to get SciPy.0.3.2 to work for another program which > requires the Scientific_netcdf module (and a couple of other netcdf > related modules). I have found instructions for building these modules > for Scientific Python 2.4.6, but I'm still aiming for 0.3.2. > If possible, how do I build these modules for SciPy.0.3.2? ScientificPython is not SciPy. Get it at . Regards Berthold -- bhoel at web.de / http://starship.python.net/crew/bhoel/ From stephen.walton at csun.edu Sun Nov 7 15:03:01 2004 From: stephen.walton at csun.edu (Stephen Walton) Date: Sun, 07 Nov 2004 12:03:01 -0800 Subject: [SciPy-user] ANN: PyTables-0.9 released In-Reply-To: <200411051928.32583.falted@pytables.org> References: <200411051928.32583.falted@pytables.org> Message-ID: <1099857781.3511.14.camel@localhost.localdomain> On Fri, 2004-11-05 at 19:28 +0100, Francesc Altet wrote: > Announcing PyTables 0.9 Francesc, I hit a problem building PyTables 0.9 on FC2 which I didn't see at PyTables 0.8.3: it is attempting to link against the lzo libraries even though setup.py correctly detected I don't have them installed. A short excerpt from the build output: Found HDF5 libraries at /usr/lib Found HDF5 header files at /usr/include Optional lzo libraries or include files not found. Disabling support for them. Optional ucl libraries or include files not found. Disabling support for them. 
Found numarray 1.1 package installed running build running build_py creating build creating build/lib.linux-i686-2.3 creating build/lib.linux-i686-2.3/tables [...many lines deleted, including a lot of warnings about mismatched pointer types..] build/temp.linux-i686-2.3/src/H5TB-opt.o -lhdf5 -llzo -lucl -o build/lib.linux-i686-2.3/tables/hdf5Extension.so /usr/bin/ld: cannot find -llzo collect2: ld returned 1 exit status error: command 'gcc' failed with exit status 1 -- Stephen Walton Physics & Astronomy CSUN From yaroslavvb at gmail.com Mon Nov 8 04:40:53 2004 From: yaroslavvb at gmail.com (Yaroslav Bulatov) Date: Mon, 8 Nov 2004 01:40:53 -0800 Subject: [SciPy-user] fmin won't optimize In-Reply-To: <05B4E5BD-30D2-11D9-A9A4-000A95B4710C@mac.com> References: <418DAB6B.9020003@ee.byu.edu> <05B4E5BD-30D2-11D9-A9A4-000A95B4710C@mac.com> Message-ID: Sorry, for the double send, I was confused by the "message awaits moderator approval" replies. Yes, if I change starting parameter to be [0.,0.,0.] instead of [0,0,0] fmin and fmin_powell find the same optimum. I guess fmin doesn't cast them to floats? Yaroslav On Sun, 7 Nov 2004 08:30:58 -0700, John Gleeson wrote: > > > > On Nov 6, 2004, at 9:58 PM, Travis Oliphant wrote: > > > Yaroslav Bulatov wrote: > > > >> Hi > >> > >> I'm fitting Gibbs distribution to data using maximum likelihood. If I > >> use fmin_powell, it works, but when I use fmin, it terminates after 1 > >> iteration regardless of starting point. The log-likelihood function is > >> convex. Any idea why this would happen? > >> > >> The code below terminates with likelihood 4.158883 if I use fmin, but > >> with 3.296169 if "fmin" is replaced with "fmin_powell" > >> > >> # Train simple 3 parameter Gibbs Distribution > >> > >> import math > >> from scipy import * > >> from scipy.optimize import * > >> > >> train_set=[(0,0),(0,1),(1,1)] > >> #test_set=[(0,0),(0,1),(1,1)] > >> > >> # Negative log-likelihood > >> def neg_ll(lambdas): > >> unnormalized = lambda(x): > >> > > Note, this line break needed to be fixed for it to work. > > > >> math.exp(lambdas[0]*x[0]+lambdas[1]*x[1]+lambdas[2]*x[0]*x[1]) > >> Z = sum([unnormalized(x) for x in [(0,0),(0,1),(1,0),(1,1)]]) > >> ll = 0 > >> for x in train_set: > >> ll+=math.log(unnormalized(x))-math.log(Z) > >> return -ll > >> > >> def main(): > >> x0 = [0,0,0] > >> xopt = fmin(neg_ll, x0) > >> print 'Before training: %f' %(neg_ll(x0),) > >> print 'After training: %f' %(neg_ll(xopt),) > >> > >> if __name__=='__main__': main() > >> > >> > > I tried your code and it worked like this for me: > > > > >>> xopt = fmin_powell(neg_ll,x0) > > Optimization terminated successfully. > > Current function value: 3.296169 > > Iterations: 7 > > Function evaluations: 168 > > >>> print xopt > > [-8.2385 -0.0122 8.254 ] > > > > >>> xopt = fmin(neg_ll,x0) > > Optimization terminated successfully. > > Current function value: 3.295837 > > Iterations: 269 > > Function evaluations: 510 > > >>> print xopt > > [-34.2701 -0. 34.2701] > > > > Yaroslav, maybe you missed Pearu's response to your > earlier email? > > I get 1 iteration and 4.158883 from fmin also. But if > I change x0 = [0,0,0] to x0 = [0.0,0.0,0.0], as Pearu suggested , then > I get exactly the same > results as Travis. 
> > John > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From falted at pytables.org Mon Nov 8 05:30:17 2004 From: falted at pytables.org (Francesc Altet) Date: Mon, 8 Nov 2004 11:30:17 +0100 Subject: [SciPy-user] ANN: PyTables-0.9 released In-Reply-To: <1099857781.3511.14.camel@localhost.localdomain> References: <200411051928.32583.falted@pytables.org> <1099857781.3511.14.camel@localhost.localdomain> Message-ID: <200411081130.17736.falted@pytables.org> A Diumenge 07 Novembre 2004 21:03, Stephen Walton va escriure: > On Fri, 2004-11-05 at 19:28 +0100, Francesc Altet wrote: > Francesc, I hit a problem building PyTables 0.9 on FC2 which I didn't > see at PyTables 0.8.3: it is attempting to link against the lzo > libraries even though setup.py correctly detected I don't have them > installed. A short excerpt from the build output: > > [...many lines deleted, including a lot of warnings about mismatched > pointer types..] > build/temp.linux-i686-2.3/src/H5TB-opt.o -lhdf5 -llzo -lucl -o > build/lib.linux-i686-2.3/tables/hdf5Extension.so > /usr/bin/ld: cannot find -llzo > collect2: ld returned 1 exit status > error: command 'gcc' failed with exit status 1 Ooops, my bad. I must recognize that I don't check frequently enough the pytables installation without lzo and ucl present. The cure is in CVS now. You can also apply this patch: --- ../exports/pytables-0.9/setup.py 2004-11-05 16:33:58.000000000 +0100 +++ setup.py 2004-11-08 11:23:21.000000000 +0100 @@ -94,6 +94,7 @@ else: if not incdir or not libdir: print "Optional %s libraries or include files not found. Disabling support for them." % (libname,) + return else: # Necessary to include code for optional libs def_macros.append(("HAVE_"+libname.upper()+"_LIB", 1)) Cheers, -- Francesc Altet From pearu at scipy.org Mon Nov 8 05:31:02 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 8 Nov 2004 04:31:02 -0600 (CST) Subject: [SciPy-user] fmin won't optimize In-Reply-To: References: <418DAB6B.9020003@ee.byu.edu> Message-ID: On Mon, 8 Nov 2004, Yaroslav Bulatov wrote: > Sorry, for the double send, I was confused by the "message awaits > moderator approval" replies. > > Yes, if I change starting parameter to be [0.,0.,0.] instead of > [0,0,0] fmin and fmin_powell find the same optimum. I guess fmin > doesn't cast them to floats? As I said, this is fixed in Scipy. The issue was related to the fact that in fmin `x0 = asarray(x0)` caused x0 to be integer array and latter x0.typecode() was used to create an array that should have been a float array. Pearu From nwagner at mecha.uni-stuttgart.de Mon Nov 8 10:03:08 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 08 Nov 2004 16:03:08 +0100 Subject: [SciPy-user] Bug in io.mmread Message-ID: <418F8AAC.2070002@mecha.uni-stuttgart.de> Hi all, I have some trouble when using mmread; k0.mtx is large and sparse matrix. %%MatrixMarket matrix coordinate real general % Generated 08-Nov-2004 67986 67986 4222171 ... Traceback (most recent call last): File "test1.py", line 4, in ? 
ma = io.mmread('k0.mtx') File "/usr/lib/python2.3/site-packages/scipy/io/mmio.py", line 208, in mmread a = coo_matrix(data,(row,col),M=rows,N=cols) File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line 1366, in __init__ self._check() File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line 1376, in _check self.ftype = _transtabl[self.typecode] KeyError: 'O' >>> Any pointer would be appreciated. Nils From Giovanni.Samaey at cs.kuleuven.ac.be Mon Nov 8 12:02:38 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Mon, 08 Nov 2004 18:02:38 +0100 Subject: [SciPy-user] installing SciPy without plotting facilities... Message-ID: <418FA6AE.60309@cs.kuleuven.ac.be> Hi, I am trying to install Python + SciPy for use on an Opteron cluster. (with the aim of running parallel later on). However, due to an "incompatible" X11, the linking step fails when building scipy submodules which depend on X11. I trust these are only gplt, xplt, etc. Since they cannot be used anyway (the cluster is remote, and there is no x-forwarding), I might as well shut down these features. How do I tell the installation script that I do not want plotting features? Giovanni -- Giovanni Samaey http://www.cs.kuleuven.ac.be/~giovanni/ Katholieke Universiteit Leuven email: giovanni at cs.kuleuven.ac.be Departement Computerwetenschappen phone: +32-16-327081 Celestijnenlaan 200A, B-3001 Heverlee, Belgium fax: +32-16-327996 Office: A04.36 From pearu at scipy.org Mon Nov 8 12:58:57 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 8 Nov 2004 11:58:57 -0600 (CST) Subject: [SciPy-user] installing SciPy without plotting facilities... In-Reply-To: <418FA6AE.60309@cs.kuleuven.ac.be> References: <418FA6AE.60309@cs.kuleuven.ac.be> Message-ID: On Mon, 8 Nov 2004, Giovanni Samaey wrote: > Hi, > > I am trying to install Python + SciPy for use on an Opteron cluster. > (with the aim of running parallel later on). However, due to > an "incompatible" X11, the linking step fails when building scipy > submodules which depend on X11. I trust these are only gplt, xplt, etc. > > Since they cannot be used anyway (the cluster is remote, and there is no > x-forwarding), > I might as well shut down these features. > How do I tell the installation script that I do not want plotting features? List the corresponing packages in `ignore_packages` list at the end of setup.py file. E.g. set ignore_packages = ['xplt','plt','gplt','gui_thread'] Pearu From pearu at scipy.org Mon Nov 8 13:07:00 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 8 Nov 2004 12:07:00 -0600 (CST) Subject: [SciPy-user] Bug in io.mmread In-Reply-To: <418F8AAC.2070002@mecha.uni-stuttgart.de> References: <418F8AAC.2070002@mecha.uni-stuttgart.de> Message-ID: On Mon, 8 Nov 2004, Nils Wagner wrote: > Hi all, > > I have some trouble when using mmread; k0.mtx is large and sparse matrix. > > %%MatrixMarket matrix coordinate real general > % Generated 08-Nov-2004 > 67986 67986 4222171 > ... > > Traceback (most recent call last): > File "test1.py", line 4, in ? > ma = io.mmread('k0.mtx') > File "/usr/lib/python2.3/site-packages/scipy/io/mmio.py", line 208, in > mmread > a = coo_matrix(data,(row,col),M=rows,N=cols) > File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line 1366, > in __init__ > self._check() > File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line 1376, > in _check > self.ftype = _transtabl[self.typecode] > KeyError: 'O' >>>> > > Any pointer would be appreciated. 
I suspect that k0.mtx contains only integer values (some maybe larger that sys.maxint) and so asarray(data) returns array object with typecode=='O'. Could you send the file k0.mtx to me (don't send large files to the list) and I'll see if I can workaround this problem. Pearu From nwagner at mecha.uni-stuttgart.de Mon Nov 8 14:30:39 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 08 Nov 2004 20:30:39 +0100 Subject: [SciPy-user] Bug in io.mmread In-Reply-To: References: <418F8AAC.2070002@mecha.uni-stuttgart.de> Message-ID: On Mon, 8 Nov 2004 12:07:00 -0600 (CST) Pearu Peterson wrote: > > > On Mon, 8 Nov 2004, Nils Wagner wrote: > >> Hi all, >> >> I have some trouble when using mmread; k0.mtx is large >>and sparse matrix. >> >> %%MatrixMarket matrix coordinate real general >> % Generated 08-Nov-2004 >> 67986 67986 4222171 >> ... >> >> Traceback (most recent call last): >> File "test1.py", line 4, in ? >> ma = io.mmread('k0.mtx') >> File >>"/usr/lib/python2.3/site-packages/scipy/io/mmio.py", line >>208, in >> mmread >> a = coo_matrix(data,(row,col),M=rows,N=cols) >> File >>"/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", >>line 1366, >> in __init__ >> self._check() >> File >>"/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", >>line 1376, >> in _check >> self.ftype = _transtabl[self.typecode] >> KeyError: 'O' >>>>> >> >> Any pointer would be appreciated. > > I suspect that k0.mtx contains only integer values (some >maybe larger that sys.maxint) and so asarray(data) >returns array object with typecode=='O'. Could you send >the file k0.mtx to me (don't send large files to the >list) and I'll see if I can workaround this problem. > > Pearu > Pearu, Thank you for your reply. Actually, the file is very large (60 MB). How should I send it to you ? Nils > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From pearu at scipy.org Mon Nov 8 15:22:24 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 8 Nov 2004 14:22:24 -0600 (CST) Subject: [SciPy-user] Bug in io.mmread In-Reply-To: References: <418F8AAC.2070002@mecha.uni-stuttgart.de> Message-ID: On Mon, 8 Nov 2004, Nils Wagner wrote: >>> Any pointer would be appreciated. >> >> I suspect that k0.mtx contains only integer values (some maybe larger that >> sys.maxint) and so asarray(data) returns array object with typecode=='O'. >> Could you send the file k0.mtx to me (don't send large files to the list) >> and I'll see if I can workaround this problem. >> >> Pearu >> > Pearu, > > Thank you for your reply. Actually, the file > is very large (60 MB). How should I send it to you ? If you would gzip it, the size should reduce to 6-10 MB that should be ok for sending it by E-mail (DON'T send it to list). However, if you could just send an url where to download it, that would be perfect. Pearu From david at dwavesys.com Mon Nov 8 17:20:17 2004 From: david at dwavesys.com (David Grant) Date: Mon, 08 Nov 2004 14:20:17 -0800 Subject: [SciPy-user] swig+numeric+typemaps Message-ID: <418FF121.7040302@dwavesys.com> Is using typemaps with swig still a necessity to allow passing of Numeric arrays back and forth between C/C++ functions? 
See here (last updated 1999) http://www.scripps.edu/mb/olson/people/sanner/html/Python/NumericTypemaps/ Thanks, David From david at dwavesys.com Mon Nov 8 20:04:55 2004 From: david at dwavesys.com (David Grant) Date: Mon, 08 Nov 2004 17:04:55 -0800 Subject: [SciPy-user] C++ extensions In-Reply-To: <16771.17794.93175.156932@monster.linux.in> References: <4182C39A.3060901@telus.net> <16771.17794.93175.156932@monster.linux.in> Message-ID: <419017B7.7050601@dwavesys.com> Prabhu Ramachandran wrote: >>>>>>"DG" == David Grant writes: >>>>>> >>>>>> > > DG> I looked at some Weave information, but that didn't help, > DG> because I obviously can't run this in-line since it is many > DG> files. > >That seems like an incorrect assumption. Weave will let you call any >function you want from within the inlined code. From the information >you have provided, I assume you want to do something like this in >Python: > > a = Numeric.array(...) > # Setup your matrix the way you want. > > call_c_code_that_does_stuff_to_array(a) > > # Continue work from Python... > >So simply define call_c_code_that_does_stuff_to_array and write some >simple C++ code that does what you want. This is definitely doable >from weave. The question is what 'output' does your blackbox produce? >If its a number/array then they are easy to handle since weave will >let you do that. When you call inline you can pass headers and link >to libraries that you have built. So in theory you can build your >blackbox C++ code as a library, and call its functions from your >inlined code. > > Prabhu, I tried doing what you describe. I basically am doing the following as a test to see if I can get this to work. I have the following test python script: import weave inc_dirs=['/home/david/working_dir/python/qcd', '/usr/include'] lib_dirs=['/home/david/working_dir/python/qcd', '/usr/lib'] libs=['dcomplex'] code = """ #include "DComplex.h" DComplex z(2,3); """ weave.inline(code, include_dirs=inc_dirs, library_dirs=lib_dirs, libraries=libs, verbose=2) DComplex.cpp and DComplex.h define a class for complex numbers. I build DComplex.cpp separately using a makefile. This does: gcc -c DComplex.cpp ar rc libdcomplex.a DComplex.o ranlib libdcomplex.a This builds a static library libdcomplex.a Then, as shown in my test python script above, I use libs=['dcomplex'] and that will include libdcomplex.a in the current directory. The linking actually seems to work ok! The prooblem seems to be when python imports the file: running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext building 'sc_89dfe56fc9f6dd6ee19b60174c27c7511' extension compiling C++ sources g++ options: '-fno-strict-aliasing -DNDEBUG -fPIC' compile options: '-I/home/david/working_dir/python/qcd -I/usr/include -I/usr/lib/python2.3/site-packages/weave -I/usr/lib/python2.3/site-packages/weave/scxx -I/usr/include/python2.3 -c' g++: /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.cpp g++ -pthread -shared /tmp/david/python23_intermediate/compiler_d039d257d176c3731283eef034e62e43/home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.o /tmp/david/python23_intermediate/compiler_d039d257d176c3731283eef034e62e43/usr/lib/python2.3/site-packages/weave/scxx/weave_imp.o -L/home/david/working_dir/python/qcd -L/usr/lib -ldcomplex -o /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.so Traceback (most recent call last): File "test_weave.py", line 20, in ? 
weave.inline(code,['arr','_Narr'], include_dirs=inc_dirs, library_dirs=lib_dirs, libraries=libs, verbose=2) File "/usr/lib/python2.3/site-packages/weave/inline_tools.py", line 335, in inline auto_downcast = auto_downcast, File "/usr/lib/python2.3/site-packages/weave/inline_tools.py", line 445, in compile_function exec 'import ' + module_name File "", line 1, in ? ImportError: /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.so: undefined symbol: _ZZ13compiled_funcP7_objectS0_EN8DComplexC1Edd As you can see above there is an undefined symbol in the python .so extension, which ends in ...DComplex... I've been stuck on this for an hour and I have no clue what is wrong. If you or anything else can help me, it would be appreciated. >Of course, you can use SWIG or Boost.Python also. But weave is still >an option. > >cheers, >prabhu > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > > -- David J. Grant Scientific Officer Intellectual Property D-Wave Systems Inc. tel: 604.732.6604 fax: 604.732.6614 ************************** CONFIDENTIAL COMMUNICATION This electronic transmission, and any documents attached hereto, is confidential. The information is intended only for use by the recipient named above. If you have received this electronic message in error, please notify the sender and delete the electronic message. Any disclosure, copying, distribution, or use of the contents of information received in error is strictly prohibited. -------------- next part -------------- A non-text attachment was scrubbed... Name: david.vcf Type: text/x-vcard Size: 334 bytes Desc: not available URL: From rkern at ucsd.edu Mon Nov 8 20:25:44 2004 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 08 Nov 2004 17:25:44 -0800 Subject: [SciPy-user] C++ extensions In-Reply-To: <419017B7.7050601@dwavesys.com> References: <4182C39A.3060901@telus.net> <16771.17794.93175.156932@monster.linux.in> <419017B7.7050601@dwavesys.com> Message-ID: <41901C98.2070803@ucsd.edu> David Grant wrote: > I tried doing what you describe. I basically am doing the following as > a test to see if I can get this to work. I have the following test > python script: > > import weave > inc_dirs=['/home/david/working_dir/python/qcd', '/usr/include'] > lib_dirs=['/home/david/working_dir/python/qcd', '/usr/lib'] > libs=['dcomplex'] > code = """ > #include "DComplex.h" > DComplex z(2,3); > """ > weave.inline(code, include_dirs=inc_dirs, library_dirs=lib_dirs, > libraries=libs, verbose=2) I'm pretty sure that you want to put all #include's as support code. Everything in 'code' gets pasted into a function body. E.g. support_code = """#include "DComplex.h" """ code = """DComplex z(2,3); """ weave.inline(code, support_code=support_code, ...) > DComplex.cpp and DComplex.h define a class for complex numbers. I build > DComplex.cpp separately using a makefile. This does: > > gcc -c DComplex.cpp > ar rc libdcomplex.a DComplex.o > ranlib libdcomplex.a > > This builds a static library libdcomplex.a Then, as shown in my test > python script above, I use libs=['dcomplex'] and that will include > libdcomplex.a in the current directory. The linking actually seems to > work ok! The prooblem seems to be when python imports the file: When linking shared libraries, the linker will often (always?) not check for missing symbols. 
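Putting those two points together, the call can keep only statements in `code` and move the #include to file scope through the headers (or support_code) argument. A rough sketch reusing the paths and library names from the example above; one assumption here is that each headers entry carries its own quotes or angle brackets:

import weave

inc_dirs = ['/home/david/working_dir/python/qcd', '/usr/include']
lib_dirs = ['/home/david/working_dir/python/qcd', '/usr/lib']
libs = ['dcomplex']

# Only statements live in `code`; the include is emitted at file scope,
# so DComplex is no longer a function-local class and its constructor
# mangles to the same symbol that libdcomplex exports.
code = """
DComplex z(2,3);
"""
weave.inline(code,
             headers=['"DComplex.h"'],
             include_dirs=inc_dirs,
             library_dirs=lib_dirs,
             libraries=libs,
             verbose=2)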
> running build_ext > customize UnixCCompiler > customize UnixCCompiler using build_ext > building 'sc_89dfe56fc9f6dd6ee19b60174c27c7511' extension > compiling C++ sources > g++ options: '-fno-strict-aliasing -DNDEBUG -fPIC' > compile options: '-I/home/david/working_dir/python/qcd -I/usr/include > -I/usr/lib/python2.3/site-packages/weave > -I/usr/lib/python2.3/site-packages/weave/scxx -I/usr/include/python2.3 -c' > g++: > /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.cpp > g++ -pthread -shared > /tmp/david/python23_intermediate/compiler_d039d257d176c3731283eef034e62e43/home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.o > /tmp/david/python23_intermediate/compiler_d039d257d176c3731283eef034e62e43/usr/lib/python2.3/site-packages/weave/scxx/weave_imp.o > -L/home/david/working_dir/python/qcd -L/usr/lib -ldcomplex -o > /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.so > > Traceback (most recent call last): > File "test_weave.py", line 20, in ? > weave.inline(code,['arr','_Narr'], include_dirs=inc_dirs, > library_dirs=lib_dirs, libraries=libs, verbose=2) > File "/usr/lib/python2.3/site-packages/weave/inline_tools.py", line > 335, in inline > auto_downcast = auto_downcast, > File "/usr/lib/python2.3/site-packages/weave/inline_tools.py", line > 445, in compile_function > exec 'import ' + module_name > File "", line 1, in ? > ImportError: > /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.so: > undefined symbol: _ZZ13compiled_funcP7_objectS0_EN8DComplexC1Edd > > As you can see above there is an undefined symbol in the python .so > extension, which ends in ...DComplex... The stuff before it, ...compiled_func... points, I believe to the fact that the #include came inside the function body whereas the library needs it outside of the function body. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From david at dwavesys.com Mon Nov 8 20:29:48 2004 From: david at dwavesys.com (David Grant) Date: Mon, 08 Nov 2004 17:29:48 -0800 Subject: [SciPy-user] C++ extensions In-Reply-To: <419017B7.7050601@dwavesys.com> References: <4182C39A.3060901@telus.net> <16771.17794.93175.156932@monster.linux.in> <419017B7.7050601@dwavesys.com> Message-ID: <41901D8C.4010207@dwavesys.com> David Grant wrote: > Prabhu, > > I tried doing what you describe. I basically am doing the following > as a test to see if I can get this to work. I have the following test > python script: > > import weave > inc_dirs=['/home/david/working_dir/python/qcd', '/usr/include'] > lib_dirs=['/home/david/working_dir/python/qcd', '/usr/lib'] > libs=['dcomplex'] > code = """ > #include "DComplex.h" > DComplex z(2,3); > """ > weave.inline(code, include_dirs=inc_dirs, library_dirs=lib_dirs, > libraries=libs, verbose=2) > > DComplex.cpp and DComplex.h define a class for complex numbers. I > build DComplex.cpp separately using a makefile. This does: > > gcc -c DComplex.cpp > ar rc libdcomplex.a DComplex.o > ranlib libdcomplex.a > > This builds a static library libdcomplex.a Then, as shown in my test > python script above, I use libs=['dcomplex'] and that will include > libdcomplex.a in the current directory. The linking actually seems to > work ok! 
The prooblem seems to be when python imports the file: > > running build_ext > customize UnixCCompiler > customize UnixCCompiler using build_ext > building 'sc_89dfe56fc9f6dd6ee19b60174c27c7511' extension > compiling C++ sources > g++ options: '-fno-strict-aliasing -DNDEBUG -fPIC' > compile options: '-I/home/david/working_dir/python/qcd -I/usr/include > -I/usr/lib/python2.3/site-packages/weave > -I/usr/lib/python2.3/site-packages/weave/scxx -I/usr/include/python2.3 > -c' > g++: > /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.cpp > g++ -pthread -shared > /tmp/david/python23_intermediate/compiler_d039d257d176c3731283eef034e62e43/home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.o > /tmp/david/python23_intermediate/compiler_d039d257d176c3731283eef034e62e43/usr/lib/python2.3/site-packages/weave/scxx/weave_imp.o > -L/home/david/working_dir/python/qcd -L/usr/lib -ldcomplex -o > /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.so > > Traceback (most recent call last): > File "test_weave.py", line 20, in ? > weave.inline(code,['arr','_Narr'], include_dirs=inc_dirs, > library_dirs=lib_dirs, libraries=libs, verbose=2) > File "/usr/lib/python2.3/site-packages/weave/inline_tools.py", line > 335, in inline > auto_downcast = auto_downcast, > File "/usr/lib/python2.3/site-packages/weave/inline_tools.py", line > 445, in compile_function > exec 'import ' + module_name > File "", line 1, in ? > ImportError: > /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.so: > undefined symbol: _ZZ13compiled_funcP7_objectS0_EN8DComplexC1Edd > > As you can see above there is an undefined symbol in the python .so > extension, which ends in ...DComplex... > > I've been stuck on this for an hour and I have no clue what is wrong. > If you or anything else can help me, it would be appreciated. Grr....I figured it out. I need to use the headers=[] arg. It isn't documented on scipy.org. Grrr.... I'll email Eric Jones and politely inform him. :-) Dave From david at dwavesys.com Mon Nov 8 20:37:18 2004 From: david at dwavesys.com (David Grant) Date: Mon, 08 Nov 2004 17:37:18 -0800 Subject: [SciPy-user] C++ extensions In-Reply-To: <41901C98.2070803@ucsd.edu> References: <4182C39A.3060901@telus.net> <16771.17794.93175.156932@monster.linux.in> <419017B7.7050601@dwavesys.com> <41901C98.2070803@ucsd.edu> Message-ID: <41901F4E.2000505@dwavesys.com> Robert Kern wrote: > David Grant wrote: > >> I tried doing what you describe. I basically am doing the following >> as a test to see if I can get this to work. I have the following >> test python script: >> >> import weave >> inc_dirs=['/home/david/working_dir/python/qcd', '/usr/include'] >> lib_dirs=['/home/david/working_dir/python/qcd', '/usr/lib'] >> libs=['dcomplex'] >> code = """ >> #include "DComplex.h" >> DComplex z(2,3); >> """ >> weave.inline(code, include_dirs=inc_dirs, library_dirs=lib_dirs, >> libraries=libs, verbose=2) > > > I'm pretty sure that you want to put all #include's as support code. > Everything in 'code' gets pasted into a function body. > > E.g. > > support_code = """#include "DComplex.h" > """ > code = """DComplex z(2,3); > """ > weave.inline(code, support_code=support_code, ...) Cool, thanks. This is similar to headers! Probably headers just gets wrapped into support_code I'm guessing, which might explain why it is undocumented? > >> DComplex.cpp and DComplex.h define a class for complex numbers. I >> build DComplex.cpp separately using a makefile. 
This does: >> >> gcc -c DComplex.cpp >> ar rc libdcomplex.a DComplex.o >> ranlib libdcomplex.a >> >> This builds a static library libdcomplex.a Then, as shown in my test >> python script above, I use libs=['dcomplex'] and that will include >> libdcomplex.a in the current directory. The linking actually seems >> to work ok! The prooblem seems to be when python imports the file: > > > When linking shared libraries, the linker will often (always?) not > check for missing symbols. Right. > >> running build_ext >> customize UnixCCompiler >> customize UnixCCompiler using build_ext >> building 'sc_89dfe56fc9f6dd6ee19b60174c27c7511' extension >> compiling C++ sources >> g++ options: '-fno-strict-aliasing -DNDEBUG -fPIC' >> compile options: '-I/home/david/working_dir/python/qcd -I/usr/include >> -I/usr/lib/python2.3/site-packages/weave >> -I/usr/lib/python2.3/site-packages/weave/scxx >> -I/usr/include/python2.3 -c' >> g++: >> /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.cpp >> g++ -pthread -shared >> /tmp/david/python23_intermediate/compiler_d039d257d176c3731283eef034e62e43/home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.o >> /tmp/david/python23_intermediate/compiler_d039d257d176c3731283eef034e62e43/usr/lib/python2.3/site-packages/weave/scxx/weave_imp.o >> -L/home/david/working_dir/python/qcd -L/usr/lib -ldcomplex -o >> /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.so >> >> Traceback (most recent call last): >> File "test_weave.py", line 20, in ? >> weave.inline(code,['arr','_Narr'], include_dirs=inc_dirs, >> library_dirs=lib_dirs, libraries=libs, verbose=2) >> File "/usr/lib/python2.3/site-packages/weave/inline_tools.py", line >> 335, in inline >> auto_downcast = auto_downcast, >> File "/usr/lib/python2.3/site-packages/weave/inline_tools.py", line >> 445, in compile_function >> exec 'import ' + module_name >> File "", line 1, in ? >> ImportError: >> /home/david/.python23_compiled/sc_89dfe56fc9f6dd6ee19b60174c27c7511.so: >> undefined symbol: _ZZ13compiled_funcP7_objectS0_EN8DComplexC1Edd > > > > >> As you can see above there is an undefined symbol in the python .so >> extension, which ends in ...DComplex... > > > The stuff before it, ...compiled_func... points, I believe to the fact > that the #include came inside the function body whereas the library > needs it outside of the function body. > Thanks. -- David J. Grant Scientific Officer Intellectual Property D-Wave Systems Inc. tel: 604.732.6604 fax: 604.732.6614 ************************** CONFIDENTIAL COMMUNICATION This electronic transmission, and any documents attached hereto, is confidential. The information is intended only for use by the recipient named above. If you have received this electronic message in error, please notify the sender and delete the electronic message. Any disclosure, copying, distribution, or use of the contents of information received in error is strictly prohibited. -------------- next part -------------- A non-text attachment was scrubbed... 
Name: david.vcf Type: text/x-vcard Size: 334 bytes Desc: not available URL: From rkern at ucsd.edu Mon Nov 8 21:07:59 2004 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 08 Nov 2004 18:07:59 -0800 Subject: [SciPy-user] C++ extensions In-Reply-To: <41901F4E.2000505@dwavesys.com> References: <4182C39A.3060901@telus.net> <16771.17794.93175.156932@monster.linux.in> <419017B7.7050601@dwavesys.com> <41901C98.2070803@ucsd.edu> <41901F4E.2000505@dwavesys.com> Message-ID: <4190267F.9080603@ucsd.edu> David Grant wrote: > Cool, thanks. This is similar to headers! Probably headers just gets > wrapped into support_code I'm guessing, which might explain why it is > undocumented? Lack of time on the part of the person/people who have the knowledge is more likely. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From stephen.walton at csun.edu Mon Nov 8 23:47:39 2004 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 08 Nov 2004 20:47:39 -0800 Subject: [SciPy-user] ANN: PyTables-0.9 released In-Reply-To: <200411081130.17736.falted@pytables.org> References: <200411051928.32583.falted@pytables.org> <1099857781.3511.14.camel@localhost.localdomain> <200411081130.17736.falted@pytables.org> Message-ID: <1099975659.3088.0.camel@localhost.localdomain> On Mon, 2004-11-08 at 11:30 +0100, Francesc Altet wrote: > Ooops, my bad. I must recognize that I don't check frequently enough > the pytables installation without lzo and ucl present. > > The cure is in CVS now. You can also apply this patch: Thanks again, Francesc. -- Stephen Walton Physics & Astronomy CSUN From prabhu_r at users.sf.net Tue Nov 9 01:35:49 2004 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Tue, 9 Nov 2004 12:05:49 +0530 Subject: [SciPy-user] C++ extensions In-Reply-To: <41901D8C.4010207@dwavesys.com> References: <4182C39A.3060901@telus.net> <16771.17794.93175.156932@monster.linux.in> <419017B7.7050601@dwavesys.com> <41901D8C.4010207@dwavesys.com> Message-ID: <16784.25925.343605.707663@monster.linux.in> >>>>> "DG" == David Grant writes: DG> David Grant wrote: >> wrong. If you or anything else can help me, it would be >> appreciated. DG> Grr....I figured it out. I need to use the headers=[] arg. DG> It isn't documented on scipy.org. Grrr.... I'll email Eric DG> Jones and politely inform him. :-) Well, the inline command is fully documented in the sources. :) Look at inline_tools.py:: 132 def inline(code,arg_names=[],local_dict = None, global_dict = None, 133 force = 0, 134 compiler='', 135 verbose = 0, 136 support_code = None, 137 headers = [], 138 customize=None, 139 type_converters = None, 140 auto_downcast=1, 141 **kw): 142 """ Inline C/C++ code within Python scripts. [... snipped long docs ...] Also, if you look at the examples directory the swig2_example.py and vtk_example.py use the headers. I know, this is perhaps no excuse for lack of docs but using the sources is generally a good idea. :) I also think Fernando's examples ought to be cleaned up sometime and put in the examples directory. It sure covers a lot of ground! For now perhaps a link to his examples would be a good idea. 
This is from an older mail from him on this list:: http://amath.colorado.edu/faculty/fperez/python/weave_examples.html original source here: http://amath.colorado.edu/faculty/fperez/python/weave_examples.py HTH, cheers, prabhu From prabhu_r at users.sf.net Tue Nov 9 01:37:34 2004 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Tue, 9 Nov 2004 12:07:34 +0530 Subject: [SciPy-user] C++ extensions In-Reply-To: <41901C98.2070803@ucsd.edu> References: <4182C39A.3060901@telus.net> <16771.17794.93175.156932@monster.linux.in> <419017B7.7050601@dwavesys.com> <41901C98.2070803@ucsd.edu> Message-ID: <16784.26030.589975.130285@monster.linux.in> >>>>> "RK" == Robert Kern writes: RK> I'm pretty sure that you want to put all #include's as support RK> code. Everything in 'code' gets pasted into a function body. Well support code could be used when you want to write functions that you don't want to wrap but want to call in the wrapped function. I think headers are the way to go if you just want to specify the headers to include. cheers, prabhu From prabhu_r at users.sf.net Tue Nov 9 01:39:47 2004 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Tue, 9 Nov 2004 12:09:47 +0530 Subject: [SciPy-user] swig+numeric+typemaps In-Reply-To: <418FF121.7040302@dwavesys.com> References: <418FF121.7040302@dwavesys.com> Message-ID: <16784.26163.443527.196108@monster.linux.in> >>>>> "DG" == David Grant writes: DG> Is using typemaps with swig still a necessity to allow passing DG> of Numeric arrays back and forth between C/C++ functions? DG> See here (last updated 1999) DG> http://www.scripps.edu/mb/olson/people/sanner/html/Python/NumericTypemaps/ I would think so, but am not 100% sure. This is however something that is worth putting into a nice swig library that ships with swig, so users can include it and get on with their work. Perhaps a question/suggestion on the swig lists would be a good idea? cheers, prabhu From rkern at ucsd.edu Tue Nov 9 01:57:04 2004 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 08 Nov 2004 22:57:04 -0800 Subject: [SciPy-user] C++ extensions In-Reply-To: <16784.26030.589975.130285@monster.linux.in> References: <4182C39A.3060901@telus.net> <16771.17794.93175.156932@monster.linux.in> <419017B7.7050601@dwavesys.com> <41901C98.2070803@ucsd.edu> <16784.26030.589975.130285@monster.linux.in> Message-ID: <41906A40.5020108@ucsd.edu> Prabhu Ramachandran wrote: >>>>>>"RK" == Robert Kern writes: > > > RK> I'm pretty sure that you want to put all #include's as support > RK> code. Everything in 'code' gets pasted into a function body. > > Well support code could be used when you want to write functions that > you don't want to wrap but want to call in the wrapped function. I > think headers are the way to go if you just want to specify the > headers to include. Yes, indeed. Last time I had to delve into weave, I just reflexively used the first thing I saw (somewhat more forgivably on my part, I needed both headers and support functions). And then I reflexively passed it on to an unsuspecting David. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From david.grant at telus.net Tue Nov 9 03:43:21 2004 From: david.grant at telus.net (David Grant) Date: Tue, 09 Nov 2004 00:43:21 -0800 Subject: [SciPy-user] C++ extensions In-Reply-To: <16784.25925.343605.707663@monster.linux.in> References: <4182C39A.3060901@telus.net> <16771.17794.93175.156932@monster.linux.in> <419017B7.7050601@dwavesys.com> <41901D8C.4010207@dwavesys.com> <16784.25925.343605.707663@monster.linux.in> Message-ID: <41908329.9010508@telus.net> Prabhu Ramachandran wrote: >Also, if you look at the examples directory the swig2_example.py and >vtk_example.py use the headers. > > Yeah, that was how I discovered it. Looked in the examples directory and grepped for "#include" -- David J. Grant http://www.davidgrant.ca:81 -------------- next part -------------- A non-text attachment was scrubbed... Name: david.grant.vcf Type: text/x-vcard Size: 177 bytes Desc: not available URL: From pearu at scipy.org Tue Nov 9 04:39:52 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 9 Nov 2004 03:39:52 -0600 (CST) Subject: [SciPy-user] Bug in io.mmread In-Reply-To: References: <418F8AAC.2070002@mecha.uni-stuttgart.de> Message-ID: On Mon, 8 Nov 2004, Pearu Peterson wrote: > > On Mon, 8 Nov 2004, Nils Wagner wrote: > >> Hi all, >> >> I have some trouble when using mmread; k0.mtx is large and sparse matrix. >> >> %%MatrixMarket matrix coordinate real general >> % Generated 08-Nov-2004 >> 67986 67986 4222171 >> ... >> >> Traceback (most recent call last): >> File "test1.py", line 4, in ? >> ma = io.mmread('k0.mtx') >> File "/usr/lib/python2.3/site-packages/scipy/io/mmio.py", line 208, in >> mmread >> a = coo_matrix(data,(row,col),M=rows,N=cols) >> File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line 1366, >> in __init__ >> self._check() >> File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line 1376, >> in _check >> self.ftype = _transtabl[self.typecode] >> KeyError: 'O' >>>>> >> >> Any pointer would be appreciated. > > I suspect that k0.mtx contains only integer values (some maybe larger that > sys.maxint) and so asarray(data) returns array object with typecode=='O'. > Could you send the file k0.mtx to me (don't send large files to the list) and > I'll see if I can workaround this problem. Thanks for sending the file. It contains a line 34362 34362 559000591904384 where 559000591904384 in python evaluates to 559000591904384L and asarray returned an array with typecode=='O' as a result. This issue is now fixed in scipy CVS. Pearu From nwagner at mecha.uni-stuttgart.de Tue Nov 9 04:50:12 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 09 Nov 2004 10:50:12 +0100 Subject: [SciPy-user] Bug in io.mmread In-Reply-To: References: <418F8AAC.2070002@mecha.uni-stuttgart.de> Message-ID: <419092D4.9090301@mecha.uni-stuttgart.de> Pearu Peterson wrote: > > > On Mon, 8 Nov 2004, Pearu Peterson wrote: > >> >> On Mon, 8 Nov 2004, Nils Wagner wrote: >> >>> Hi all, >>> >>> I have some trouble when using mmread; k0.mtx is large and sparse >>> matrix. >>> >>> %%MatrixMarket matrix coordinate real general >>> % Generated 08-Nov-2004 >>> 67986 67986 4222171 >>> ... >>> >>> Traceback (most recent call last): >>> File "test1.py", line 4, in ? 
>>> ma = io.mmread('k0.mtx') >>> File "/usr/lib/python2.3/site-packages/scipy/io/mmio.py", line 208, >>> in mmread >>> a = coo_matrix(data,(row,col),M=rows,N=cols) >>> File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line >>> 1366, in __init__ >>> self._check() >>> File "/usr/lib/python2.3/site-packages/scipy/sparse/Sparse.py", line >>> 1376, in _check >>> self.ftype = _transtabl[self.typecode] >>> KeyError: 'O' >>> >>>>>> >>> >>> Any pointer would be appreciated. >> >> >> I suspect that k0.mtx contains only integer values (some maybe larger >> that sys.maxint) and so asarray(data) returns array object with >> typecode=='O'. Could you send the file k0.mtx to me (don't send large >> files to the list) and I'll see if I can workaround this problem. > > > Thanks for sending the file. It contains a line > > 34362 34362 559000591904384 > > where 559000591904384 in python evaluates to 559000591904384L and > asarray returned an array with typecode=='O' as a result. > > This issue is now fixed in scipy CVS. > Great ! Thank you very much ! BTW, are iterative solvers also affected with respect to this typecode ? Nils > Pearu > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From david.grant at telus.net Tue Nov 9 05:22:53 2004 From: david.grant at telus.net (David Grant) Date: Tue, 09 Nov 2004 02:22:53 -0800 Subject: [SciPy-user] swig+numeric+typemaps In-Reply-To: <418FF121.7040302@dwavesys.com> References: <418FF121.7040302@dwavesys.com> Message-ID: <41909A7D.1050006@telus.net> I've found several references from exhaustive google searching which indicates that this code is out of date and needs updating, but no one has done it yet. I found this: https://geodoc.uchicago.edu/climatewiki/SwigTypemaps which is cool. I got it working right away. Now, I only wonder if it will work with complex numbers and other situations... I expect problems, but maybe NumPtr will surprise me! David David Grant wrote: > Is using typemaps with swig still a necessity to allow passing of > Numeric arrays back and forth between C/C++ functions? > > See here (last updated 1999) > http://www.scripps.edu/mb/olson/people/sanner/html/Python/NumericTypemaps/ > > > Thanks, > David > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > -- David J. Grant http://www.davidgrant.ca:81 -------------- next part -------------- A non-text attachment was scrubbed... Name: david.grant.vcf Type: text/x-vcard Size: 177 bytes Desc: not available URL: From Giovanni.Samaey at cs.kuleuven.ac.be Tue Nov 9 05:22:55 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Tue, 09 Nov 2004 11:22:55 +0100 Subject: [SciPy-user] error linking to libg2c on 64-bit machine Message-ID: <41909A7F.9090204@cs.kuleuven.ac.be> Hi, I get an error linking to lg2c when building scipy on a 64 bit machine. (machine and compiler characteristics below...) 
The error I get is the following: /usr/bin/g77 -shared build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o -Lbuild/temp.linux-x86_64-2.3 -lfitpack -lg2c -o build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so /usr/bin/ld: /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a(lread.o): relocation R_X86_64_32 can not be used when making a shared object; recompile with -fPIC /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a: could not read symbols: Bad valuecollect2: ld returned 1 exit status /usr/bin/ld: /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a(lread.o): relocation R_X86_64_32 can not be used when making a shared object; recompile with -fPIC /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a: could not read symbols: Bad valuecollect2: ld returned 1 exit status error: Command "/usr/bin/g77 -shared build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o -Lbuild/temp.linux-x86_64-2.3 -lfitpack -lg2c -o build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so" failed with exit status 1 So there is an incompatibility between what g77 is compiling and what is in this version of libg2c. Recompiling libg2c (as suggested) is not an option. But there are other libg2c versions available. Only I don't know (and didn't find in the doc's) how to change the standard choice in the installation script. What could I do? I am trying to install scipy on a 64 bit machine with the following characteristics posix linux2 Linux inti 2.4.27 #2 SMP Mon Aug 9 13:36:15 CEST 2004 x86_64 GNU/Linux with compiler version gcc 3.3.2, configured as follows Reading specs from /usr/lib/gcc-lib/i486-linux/3.3.2/specs Configured with: ../src/configure -v --enable-languages=c,c++,java,f77,pascal,objc,ada,treelang --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --with-gxx-include-dir=/usr/include/c++/3.3 --enable-shared --with-system-zlib --enable-nls --without-included-gettext --enable-__cxa_atexit --enable-clocale=gnu --enable-debug --enable-java-gc=boehm --enable-java-awt=xlib --enable-objc-gc i486-linux Thread model: posix gcc version 3.3.2 (Debian) -- Giovanni Samaey http://www.cs.kuleuven.ac.be/~giovanni/ Katholieke Universiteit Leuven email: giovanni at cs.kuleuven.ac.be Departement Computerwetenschappen phone: +32-16-327081 Celestijnenlaan 200A, B-3001 Heverlee, Belgium fax: +32-16-327996 Office: A04.36 From Giovanni.Samaey at cs.kuleuven.ac.be Tue Nov 9 05:24:23 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Tue, 09 Nov 2004 11:24:23 +0100 Subject: [SciPy-user] installing SciPy without plotting facilities... In-Reply-To: References: <418FA6AE.60309@cs.kuleuven.ac.be> Message-ID: <41909AD7.5050905@cs.kuleuven.ac.be> Pearu Peterson wrote: > > > On Mon, 8 Nov 2004, Giovanni Samaey wrote: > >> Hi, >> >> I am trying to install Python + SciPy for use on an Opteron cluster. >> (with the aim of running parallel later on). However, due to >> an "incompatible" X11, the linking step fails when building scipy >> submodules which depend on X11. I trust these are only gplt, xplt, etc. >> >> Since they cannot be used anyway (the cluster is remote, and there is >> no x-forwarding), >> I might as well shut down these features. >> How do I tell the installation script that I do not want plotting >> features? > > > List the corresponing packages in `ignore_packages` list at the end of > setup.py file. E.g. set > > ignore_packages = ['xplt','plt','gplt','gui_thread'] Thanks, this works smoothly... 
The next problem is in a separate mail ;-) Also, I have located the correct 64 bit xlib files in a different place then standard. What should I do to include those instead of standard? Thanks a lot for all this quick help! Giovanni > > Pearu > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user -- Giovanni Samaey http://www.cs.kuleuven.ac.be/~giovanni/ Katholieke Universiteit Leuven email: giovanni at cs.kuleuven.ac.be Departement Computerwetenschappen phone: +32-16-327081 Celestijnenlaan 200A, B-3001 Heverlee, Belgium fax: +32-16-327996 Office: A04.36 From Giovanni.Samaey at cs.kuleuven.ac.be Tue Nov 9 05:49:53 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Tue, 09 Nov 2004 11:49:53 +0100 Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: <41909A7F.9090204@cs.kuleuven.ac.be> References: <41909A7F.9090204@cs.kuleuven.ac.be> Message-ID: <4190A0D1.80502@cs.kuleuven.ac.be> Giovanni Samaey wrote: > So there is an incompatibility between what g77 is compiling and what > is in this version of libg2c. > Recompiling libg2c (as suggested) is not an option. Of course, providing a recompiled libg2c in a different place would be an option if I can tell the install script where to look... From pearu at scipy.org Tue Nov 9 05:56:05 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 9 Nov 2004 04:56:05 -0600 (CST) Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: <4190A0D1.80502@cs.kuleuven.ac.be> References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be> Message-ID: On Tue, 9 Nov 2004, Giovanni Samaey wrote: > Giovanni Samaey wrote: > > > >> So there is an incompatibility between what g77 is compiling and what is in >> this version of libg2c. >> Recompiling libg2c (as suggested) is not an option. > > Of course, providing a recompiled libg2c in a different place would be an > option if I can tell the install script where > to look... .. this would also mean compiling compiler, afaik. Do you have g2c-pic library installed in your system? For example, debian provides both g2c and g2c-pic. Scipy setup scripts use g2c-pic when available. HTH, Pearu From Giovanni.Samaey at cs.kuleuven.ac.be Tue Nov 9 07:21:56 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Tue, 09 Nov 2004 13:21:56 +0100 Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be> Message-ID: <4190B664.3050503@cs.kuleuven.ac.be> Pearu Peterson wrote: > > > On Tue, 9 Nov 2004, Giovanni Samaey wrote: > >> Giovanni Samaey wrote: >> >> >> >>> So there is an incompatibility between what g77 is compiling and >>> what is in this version of libg2c. >>> Recompiling libg2c (as suggested) is not an option. >> >> >> Of course, providing a recompiled libg2c in a different place would >> be an option if I can tell the install script where >> to look... > > > .. this would also mean compiling compiler, afaik. > > Do you have g2c-pic library installed in your system? > For example, debian provides both g2c and g2c-pic. Scipy setup scripts > use g2c-pic when available. Despite the fact that the system is debian, I do not have g2c-pic... As far as I can tell (I am no expert), the problem is that the library libg2c.a is not position independent. However, there is a libg2c.so file sitting next to it, which should be a shared library? 
> > HTH, > Pearu > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user -- Giovanni Samaey http://www.cs.kuleuven.ac.be/~giovanni/ Katholieke Universiteit Leuven email: giovanni at cs.kuleuven.ac.be Departement Computerwetenschappen phone: +32-16-327081 Celestijnenlaan 200A, B-3001 Heverlee, Belgium fax: +32-16-327996 Office: A04.36 From pearu at scipy.org Tue Nov 9 07:32:02 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 9 Nov 2004 06:32:02 -0600 (CST) Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: <4190B664.3050503@cs.kuleuven.ac.be> References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be> <4190B664.3050503@cs.kuleuven.ac.be> Message-ID: On Tue, 9 Nov 2004, Giovanni Samaey wrote: > Pearu Peterson wrote: > >> >> >> On Tue, 9 Nov 2004, Giovanni Samaey wrote: >> >>> Giovanni Samaey wrote: >>> >>> >>> >>>> So there is an incompatibility between what g77 is compiling and what is >>>> in this version of libg2c. >>>> Recompiling libg2c (as suggested) is not an option. >>> >>> >>> Of course, providing a recompiled libg2c in a different place would be an >>> option if I can tell the install script where >>> to look... >> >> >> .. this would also mean compiling compiler, afaik. >> >> Do you have g2c-pic library installed in your system? >> For example, debian provides both g2c and g2c-pic. Scipy setup scripts use >> g2c-pic when available. > > Despite the fact that the system is debian, I do not have g2c-pic... > As far as I can tell (I am no expert), the problem is that the library > libg2c.a is not position independent. Right. What debian version are you using? In debian sid g2c-pic is provided by g77-3.3, for instance. Try: cd /var/lib/dpkg/info grep libg2c *.list > However, there is a libg2c.so file sitting next to it, which should be > a shared library? That could be an option but it would be better to use g2c-pic. Pearu From Giovanni.Samaey at cs.kuleuven.ac.be Tue Nov 9 07:45:23 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Tue, 09 Nov 2004 13:45:23 +0100 Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be> <4190B664.3050503@cs.kuleuven.ac.be> Message-ID: <4190BBE3.1060302@cs.kuleuven.ac.be> Pearu Peterson wrote: > > That could be an option but it would be better to use g2c-pic. The debian version is "stable" (so not testing). I located g2c-pic in /usr/lib/gcc-lib/i486-linux/3.3.2/libg2c-pic.a However, since the script is trying to do a 64-bit compilation, it is searching in /usr/lib/gcc-lib/i486-linux/3.3.2/64/, where it is not available. I asked the system administrator to provide this file. I should probably wait until this has happened before proceeding? 
Giovanni > > Pearu > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user -- Giovanni Samaey http://www.cs.kuleuven.ac.be/~giovanni/ Katholieke Universiteit Leuven email: giovanni at cs.kuleuven.ac.be Departement Computerwetenschappen phone: +32-16-327081 Celestijnenlaan 200A, B-3001 Heverlee, Belgium fax: +32-16-327996 Office: A04.36 From pearu at scipy.org Tue Nov 9 08:00:30 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 9 Nov 2004 07:00:30 -0600 (CST) Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: <4190BBE3.1060302@cs.kuleuven.ac.be> References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be> <4190BBE3.1060302@cs.kuleuven.ac.be> Message-ID: On Tue, 9 Nov 2004, Giovanni Samaey wrote: > Pearu Peterson wrote: > >> >> That could be an option but it would be better to use g2c-pic. > > The debian version is "stable" (so not testing). > I located g2c-pic in /usr/lib/gcc-lib/i486-linux/3.3.2/libg2c-pic.a > > However, since the script is trying to do a 64-bit compilation, it is > searching in > /usr/lib/gcc-lib/i486-linux/3.3.2/64/, where it is not available. > I asked the system administrator to provide this file. I should probably > wait until this has happened before proceeding? Yes. Let us know who it goes.. I must note that you are running on an interesting system: 64-bit x86 debian woody. Debian supports only 32-bit for x86 architecture. Pearu From nwagner at mecha.uni-stuttgart.de Tue Nov 9 11:26:01 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 09 Nov 2004 17:26:01 +0100 Subject: [SciPy-user] No module named distutils Message-ID: <4190EF99.7070802@mecha.uni-stuttgart.de> Hi all, I am going to build scipy from scratch. Installing Numeric-23.6 via python setup.py install failed Traceback (most recent call last): File "setup.py", line 11, in ? import distutils ImportError: No module named distutils Any pointer ? Nils From tfogal at apollo.sr.unh.edu Tue Nov 9 11:43:02 2004 From: tfogal at apollo.sr.unh.edu (tom fogal) Date: Tue, 09 Nov 2004 11:43:02 -0500 Subject: [SciPy-user] No module named distutils In-Reply-To: Your message of "Tue, 09 Nov 2004 17:26:01 +0100." <4190EF99.7070802@mecha.uni-stuttgart.de> References: <4190EF99.7070802@mecha.uni-stuttgart.de> Message-ID: <200411091643.iA9Gh2xs031078@apollo.sr.unh.edu> <4190EF99.7070802 at mecha.uni-stuttgart.de>Nils Wagner writes: >Traceback (most recent call last): > File "setup.py", line 11, in ? > import distutils >ImportError: No module named distutils *blink* I thought distutils was a standard part of python. In any case, for whatever reason you seem not to have it. You can get it here: http://www.python.org/sigs/distutils-sig/download.html After installing distutils I should hope everything goes smoothly... -tom From sseefeld at art.ca Tue Nov 9 11:39:48 2004 From: sseefeld at art.ca (Stefan Seefeld) Date: Tue, 9 Nov 2004 11:39:48 -0500 Subject: [SciPy-user] No module named distutils Message-ID: <20DCDD8F0FCED411AC4D001083CF504501AA9834@MTL-EXCHANGE> > From: tom fogal [mailto:tfogal at apollo.sr.unh.edu] > Sent: November 9, 2004 11:43 > *blink* I thought distutils was a standard part of python. > In any case, for whatever reason you seem not to have it. distutils has been part of the standard distribution since Python version 2.0, I believe. May be he's using 1.5.2. (This may be worth an entry in the installation notes.) 
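Either way it is easy to check which case applies before touching any packages; a small probe using nothing beyond the standard library prints where distutils lives, or reports that it is absent:

try:
    import distutils
    print "distutils found in:", distutils.__file__
except ImportError:
    # Python releases before 2.0 did not ship distutils, and on SuSE it
    # is packaged separately in python-devel.
    print "distutils is not installed"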
Regards, Stefan From nwagner at mecha.uni-stuttgart.de Tue Nov 9 11:47:40 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 09 Nov 2004 17:47:40 +0100 Subject: [SciPy-user] No module named distutils In-Reply-To: <200411091643.iA9Gh2xs031078@apollo.sr.unh.edu> References: <4190EF99.7070802@mecha.uni-stuttgart.de> <200411091643.iA9Gh2xs031078@apollo.sr.unh.edu> Message-ID: <4190F4AC.3090308@mecha.uni-stuttgart.de> tom fogal wrote: > <4190EF99.7070802 at mecha.uni-stuttgart.de>Nils Wagner writes: > > > > > >>Traceback (most recent call last): >>File "setup.py", line 11, in ? >>import distutils >>ImportError: No module named distutils >> >> > >*blink* I thought distutils was a standard part of python. > > Just now I saw a short note via rpm -qi python >If you want to install third party modules using distutils, you need to >install python-devel package. This seems to be a SuSE specific problem. Nils >In any case, for whatever reason you seem not to have it. You can get >it here: > >http://www.python.org/sigs/distutils-sig/download.html > >After installing distutils I should hope everything goes smoothly... > >-tom > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > From yaroslavvb at gmail.com Tue Nov 9 12:28:08 2004 From: yaroslavvb at gmail.com (Yaroslav Bulatov) Date: Tue, 9 Nov 2004 09:28:08 -0800 Subject: [SciPy-user] Installing SciPy in user directory? Message-ID: Hi, I'm trying to run SciPy on a Linux cluster, which doesn't have SciPy installed and for which I don't have admin priviledges. Is that possible? In particular I'm wondering if I can install SciPy into home directory. There's a 2 year old thread describing how to do it with Numeric, but has anyone done it with whole SciPy? Yaroslav From pearu at scipy.org Tue Nov 9 15:13:44 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 9 Nov 2004 14:13:44 -0600 (CST) Subject: [SciPy-user] No module named distutils In-Reply-To: <4190F4AC.3090308@mecha.uni-stuttgart.de> References: <4190EF99.7070802@mecha.uni-stuttgart.de> <4190F4AC.3090308@mecha.uni-stuttgart.de> Message-ID: On Tue, 9 Nov 2004, Nils Wagner wrote: > tom fogal wrote: > >> <4190EF99.7070802 at mecha.uni-stuttgart.de>Nils Wagner writes: >> >> >> >> >>> Traceback (most recent call last): >>> File "setup.py", line 11, in ? >>> import distutils >>> ImportError: No module named distutils >>> >> >> *blink* I thought distutils was a standard part of python. >> > Just now I saw a short note via rpm -qi python > >> If you want to install third party modules using distutils, you need to >> install python-devel package. > > This seems to be a SuSE specific problem. This is not a problem. You just need to install python-dev (or whatever is the corresponding package in SUSE) package that contains also distutils. Pearu From pearu at scipy.org Tue Nov 9 15:17:15 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 9 Nov 2004 14:17:15 -0600 (CST) Subject: [SciPy-user] Installing SciPy in user directory? In-Reply-To: References: Message-ID: On Tue, 9 Nov 2004, Yaroslav Bulatov wrote: > Hi, I'm trying to run SciPy on a Linux cluster, which doesn't have > SciPy installed and for which I don't have admin priviledges. Is that > possible? Yes. > In particular I'm wondering if I can install SciPy into home > directory. There's a 2 year old thread describing how to do it with > Numeric, but has anyone done it with whole SciPy? No problem. 
You can either do it by python setup.py install --prefix=~ or python setup.py install --home=~ In either case you may need to specify the installed location also in PYTHONPATH environment variable. See also `python setup.py install --help`. Pearu From Fernando.Perez at colorado.edu Tue Nov 9 15:25:12 2004 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 09 Nov 2004 13:25:12 -0700 Subject: [SciPy-user] Installing SciPy in user directory? In-Reply-To: References: Message-ID: <419127A8.8030605@colorado.edu> Pearu Peterson schrieb: > On Tue, 9 Nov 2004, Yaroslav Bulatov wrote: >>In particular I'm wondering if I can install SciPy into home >>directory. There's a 2 year old thread describing how to do it with >>Numeric, but has anyone done it with whole SciPy? > > > No problem. You can either do it by > > python setup.py install --prefix=~ > > or > > python setup.py install --home=~ As a simple suggestion, here's how I handle this: I have a ~/usr directory which I can pass straight into --prefix=~/usr. This has the advantage of having the entire substructure (bin/, lib/, share/, etc) of the system's /usr directory, and allows me to handle personal installs of stuff without cluttering my ~ dir with all these subdirs. I even make a distinction between ~/usr and ~/usr/local: I use the first for things _I_ wrote (but which I don't want system-wide) and the latter for third-party software I install. Sort of like the /usr vs /usr/local distinction made by most Linux distros. Once this is in place, a few PATH/PYTHONPATH adjustments are all that's needed for a very smoothly running environment, which allows me to handle fairly complex user-specific installs with minimal hassles (and good isolation). HTH, f ps. I'd _strongly_ suggest using --prefix instead of --home, since --prefix uses properly versioned python2.X subdirs. This is critical for C extensions, whose magic API number changes with Python releases. From Fernando.Perez at colorado.edu Tue Nov 9 16:59:54 2004 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 09 Nov 2004 14:59:54 -0700 Subject: [SciPy-user] Scientific computing links list Message-ID: <41913DDA.8070704@colorado.edu> Hi all, at scipy'04 I promised to put together a page of links for scientific computing. At long last, here it is: http://amath.colorado.edu/faculty/fperez/python/scicomp Many apologies for the long delay, and thanks to those who sent me their lists of links. I don't expect this to be the permanent location of the page. Ideally, it probably should be integrated with the topical wiki at: http://www.scipy.org/wikis/topical_software/TopicalSoftware However, a) this took a *lot* more time than I expected, and now I need to move on to other pressing concerns. b) I'm not too sure how the wiki organization decisions are to be made. So I'd like to ask of those handling the website that you simply grab the above page and put it into the Wiki in whichever location/form you find most convenient. And from now on (once it's wikified), I hope the community will pick up and continue to maintain/enhance this page. 
Best, f From david at dwavesys.com Tue Nov 9 17:44:51 2004 From: david at dwavesys.com (David Grant) Date: Tue, 09 Nov 2004 14:44:51 -0800 Subject: [SciPy-user] swig+numeric+typemaps In-Reply-To: <41909A7D.1050006@telus.net> References: <418FF121.7040302@dwavesys.com> <41909A7D.1050006@telus.net> Message-ID: <41914863.5010503@dwavesys.com> David Grant wrote: > I've found several references from exhaustive google searching which > indicates that this code is out of date and needs updating, but no one > has done it yet. > > I found this: https://geodoc.uchicago.edu/climatewiki/SwigTypemaps > which is cool. I got it working right away. > > Now, I only wonder if it will work with complex numbers and other > situations... I expect problems, but maybe NumPtr will surprise me! Hmm, well I can't pass a Numeric array of complex type (isn't allowed by NumPtr) but I can write a wrapper in C++ around my code which will receive the pure real part and pure imaginary part, then I can combine them. I think I'd have to do this anyways, to get the data massaged into the form that my C++ classes need. From oliphant at ee.byu.edu Tue Nov 9 20:27:21 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 09 Nov 2004 18:27:21 -0700 Subject: [SciPy-user] SciPy LiveDocs Message-ID: <41916E79.6010305@ee.byu.edu> You may be interested in the online, linked documentation located here: http://www.scipy.org/livedocs/ (don't forget the / at the end) Eventually, this site will support user feedback, but for the moment, it is a hierarchial documentation index of what is available using the docstrings in the code itself. Corrections, additions, etc. are welcome. -Travis Oliphant From nwagner at mecha.uni-stuttgart.de Wed Nov 10 02:28:05 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 10 Nov 2004 08:28:05 +0100 Subject: [SciPy-user] Type handling of matrices Message-ID: <4191C305.8080400@mecha.uni-stuttgart.de> Hi all, Is it currently possible (by a built-in function) to verify whether a matrix is symmetric or not. I am also interested in a built-in function for verifying the definiteness of a matrix ? Any suggestion or comment ? Nils From pearu at scipy.org Wed Nov 10 04:17:34 2004 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 10 Nov 2004 03:17:34 -0600 (CST) Subject: [SciPy-user] SciPy LiveDocs In-Reply-To: <41916E79.6010305@ee.byu.edu> References: <41916E79.6010305@ee.byu.edu> Message-ID: On Tue, 9 Nov 2004, Travis Oliphant wrote: > > You may be interested in the online, linked documentation located here: > > http://www.scipy.org/livedocs/ > > (don't forget the / at the end) > > Eventually, this site will support user feedback, but for the moment, it is a > hierarchial documentation index of what is available using the docstrings in > the code itself. I like livedocs/ very much. It's fast (I hope you are not planning to expose livedocs/ through plone:) and well accessible compared to http://www.scipy.org/documentation/apidocs Will livedocs/ support class documentation? See e.g. 
http://oliphant.ee.byu.edu:81/scipy_base/ParallelExec Pearu From Giovanni.Samaey at cs.kuleuven.ac.be Wed Nov 10 10:56:43 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Wed, 10 Nov 2004 16:56:43 +0100 Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be> <4190BBE3.1060302@cs.kuleuven.ac.be> Message-ID: <41923A3B.9070908@cs.kuleuven.ac.be> Pearu Peterson wrote: > > > On Tue, 9 Nov 2004, Giovanni Samaey wrote: > >> Pearu Peterson wrote: >> >>> >>> That could be an option but it would be better to use g2c-pic. >> >> >> The debian version is "stable" (so not testing). >> I located g2c-pic in /usr/lib/gcc-lib/i486-linux/3.3.2/libg2c-pic.a >> >> However, since the script is trying to do a 64-bit compilation, it is >> searching in >> /usr/lib/gcc-lib/i486-linux/3.3.2/64/, where it is not available. >> I asked the system administrator to provide this file. I should >> probably wait until this has happened before proceeding? > Apparently the system adminstration does not consider this a priority, since I have access to libg2c.so. What should I do to use that file instead? Giovanni From nwagner at mecha.uni-stuttgart.de Wed Nov 10 11:44:41 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 10 Nov 2004 17:44:41 +0100 Subject: [SciPy-user] Sparse matrices Message-ID: <41924579.5080408@mecha.uni-stuttgart.de> Hi all, from scipy import * A = sparse.dok_matrix() for i in range(3): A[i,i] = 0.0+0.0j # A[i,i] = 0.1+0.0j n,m = A.shape print n,m print A n, m should be equal to 3, but this is only valid for non-zero entries. Am I missing something ? Nils From pearu at scipy.org Wed Nov 10 11:58:10 2004 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 10 Nov 2004 10:58:10 -0600 (CST) Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: <41923A3B.9070908@cs.kuleuven.ac.be> References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be> <4190BBE3.1060302@cs.kuleuven.ac.be> <41923A3B.9070908@cs.kuleuven.ac.be> Message-ID: On Wed, 10 Nov 2004, Giovanni Samaey wrote: > Apparently the system adminstration does not consider this a priority, That's typical :) > since I have access to libg2c.so. What should I do to use that file instead? First, try manually if libg2c.so is usable. After scipy build crashes then execute the linking command using libg2c.so. For example, /usr/bin/g77 -shared build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o -Lbuild/temp.linux-x86_64-2.3 -lfitpack /path/to/libg2c.so -o build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so and then try to import _fitpack to python: cd build/lib.linux-x86_64-2.3/scipy/interpolate/ python -c 'import _fitpack' If this works then I can make a patch for scipy_distutils/gnufcompiler.py. Pearu From Giovanni.Samaey at cs.kuleuven.ac.be Wed Nov 10 12:20:50 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Wed, 10 Nov 2004 18:20:50 +0100 Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be> <4190BBE3.1060302@cs.kuleuven.ac.be> <41923A3B.9070908@cs.kuleuven.ac.be> Message-ID: <41924DF2.10102@cs.kuleuven.ac.be> > First, try manually if libg2c.so is usable. After scipy build crashes > then execute the linking command using libg2c.so. 
For example, > > /usr/bin/g77 -shared > build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o > -Lbuild/temp.linux-x86_64-2.3 -lfitpack /path/to/libg2c.so -o > build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so > > and then try to import _fitpack to python: > > cd build/lib.linux-x86_64-2.3/scipy/interpolate/ > python -c 'import _fitpack' > > If this works then I can make a patch for > scipy_distutils/gnufcompiler.py. This works, on some conditions: there are several libg2c.so files on the system. The one that was sitting next to the libg2c.a file that was detected, did not work. I had to supply another one. Basically, I have to be able to supply the complete filename of the correct libg2c.so file to continue. Also, as you might have seen in the thread on the plotting (which is also me, on the same machine), I have the same problem with x11 libraries. There are some valid x11 libraries in a freaky place, which I have to link to... Anyway, I have passed another step. What should I do next? Thanks a lot already! You've been a great help! Giovanni From pearu at scipy.org Wed Nov 10 12:40:09 2004 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 10 Nov 2004 11:40:09 -0600 (CST) Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: <41924DF2.10102@cs.kuleuven.ac.be> References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be> <41923A3B.9070908@cs.kuleuven.ac.be> <41924DF2.10102@cs.kuleuven.ac.be> Message-ID: On Wed, 10 Nov 2004, Giovanni Samaey wrote: > >> First, try manually if libg2c.so is usable. After scipy build crashes >> then execute the linking command using libg2c.so. For example, >> >> /usr/bin/g77 -shared >> build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o >> -Lbuild/temp.linux-x86_64-2.3 -lfitpack /path/to/libg2c.so -o >> build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so >> >> and then try to import _fitpack to python: >> >> cd build/lib.linux-x86_64-2.3/scipy/interpolate/ >> python -c 'import _fitpack' >> >> If this works then I can make a patch for scipy_distutils/gnufcompiler.py. > > This works, on some conditions: there are several libg2c.so files on the > system. > The one that was sitting next to the libg2c.a file that was detected, did not > work. > I had to supply another one. > > Basically, I have to be able to supply the complete filename of the correct > libg2c.so file to continue. Could you be more specific? What is the location of the libg2c.so that works? > Also, as you might have seen in the thread on the plotting (which is also me, > on the same machine), > I have the same problem with x11 libraries. There are some valid x11 > libraries in a freaky place, which I have to link to... Again, since yours machine configuration is rather unique, be more specific on the location of libraries that work. Pearu From Giovanni.Samaey at cs.kuleuven.ac.be Wed Nov 10 12:47:45 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Wed, 10 Nov 2004 18:47:45 +0100 Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be> <41923A3B.9070908@cs.kuleuven.ac.be> <41924DF2.10102@cs.kuleuven.ac.be> Message-ID: <41925441.4050404@cs.kuleuven.ac.be> OK, sorry. 
I presumed that being specific would be "too much info" :-) The working libg2c.so file is /apps/prod/local64/lib64/libg2c.so According to the sys-admin, working x11 libraries are to be found (not tested by me) in /apps/prod/xlib64/ and ls /apps/prod/xlib64/ gives libICE.so.6 -> libICE.so.6.3 libICE.so.6.3 libSM.so.6 -> libSM.so.6.0 libSM.so.6.0 libX11.so.6 -> libX11.so.6.2 libX11.so.6.2 libXTrap.so.6 -> libXTrap.so.6.4 libXTrap.so.6.4 libXext.so.6 -> libXext.so.6.4 libXext.so.6.4 libXft.so.1 -> libXft.so.1.1 libXft.so.1.1 libXi.so.6 -> libXi.so.6.0 libXi.so.6.0 libXmu.so.6 -> libXmu.so.6.2 libXmu.so.6.2 libXmuu.so.1 -> libXmuu.so.1.0 libXmuu.so.1.0 libXp.so.6 -> libXp.so.6.2 libXp.so.6.2 libXpm.so.4 -> libXpm.so.4.11 libXpm.so.4.11 libXrandr.so.2 -> libXrandr.so.2.0 libXrandr.so.2.0 libXt.so.6 -> libXt.so.6.0 libXt.so.6.0 libXtst.so.6 -> libXtst.so.6.1 libXtst.so.6.1 libXv.so.1 -> libXv.so.1.0 libXv.so.1.0 libxrx.so.6 -> libxrx.so.6.3 libxrx.so.6.3 Pearu Peterson wrote: > > > On Wed, 10 Nov 2004, Giovanni Samaey wrote: > >> >>> First, try manually if libg2c.so is usable. After scipy build crashes >>> then execute the linking command using libg2c.so. For example, >>> >>> /usr/bin/g77 -shared >>> build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o >>> -Lbuild/temp.linux-x86_64-2.3 -lfitpack /path/to/libg2c.so -o >>> build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so >>> >>> and then try to import _fitpack to python: >>> >>> cd build/lib.linux-x86_64-2.3/scipy/interpolate/ >>> python -c 'import _fitpack' >>> >>> If this works then I can make a patch for >>> scipy_distutils/gnufcompiler.py. >> >> >> This works, on some conditions: there are several libg2c.so files on >> the system. >> The one that was sitting next to the libg2c.a file that was detected, >> did not work. >> I had to supply another one. >> >> Basically, I have to be able to supply the complete filename of the >> correct libg2c.so file to continue. > > > Could you be more specific? What is the location of the libg2c.so that > works? > >> Also, as you might have seen in the thread on the plotting (which is >> also me, on the same machine), >> I have the same problem with x11 libraries. There are some valid x11 >> libraries in a freaky place, which I have to link to... > > > Again, since yours machine configuration is rather unique, be more > specific on the location of libraries that work. > > Pearu > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user -- Giovanni Samaey http://www.cs.kuleuven.ac.be/~giovanni/ Katholieke Universiteit Leuven email: giovanni at cs.kuleuven.ac.be Departement Computerwetenschappen phone: +32-16-327081 Celestijnenlaan 200A, B-3001 Heverlee, Belgium fax: +32-16-327996 Office: A04.36 From pearu at scipy.org Wed Nov 10 13:15:29 2004 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 10 Nov 2004 12:15:29 -0600 (CST) Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: <41925441.4050404@cs.kuleuven.ac.be> References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be><41924DF2.10102@cs.kuleuven.ac.be> <41925441.4050404@cs.kuleuven.ac.be> Message-ID: On Wed, 10 Nov 2004, Giovanni Samaey wrote: > OK, sorry. 
I presumed that being specific would be "too much info" :-) > The working libg2c.so file is > /apps/prod/local64/lib64/libg2c.so > > According to the sys-admin, working x11 libraries are to be found (not tested > by me) in > /apps/prod/xlib64/ > and ls /apps/prod/xlib64/ gives Thanks. These locations look strange enough to be considered as nonstandard locations (I tried google(/apps/prod/local64) and got no results) and using site.cfg might be the proper way to go. See the header of scipy_distutils/system_info.py file as well as of sample_site.cfg. Another approach would be to use build_ext -L flag, e.g try python setup.py build_ext -L/apps/prod/local64/lib64 build to specify the location of libg2c.so. To resolve x11 libraries problem, you should probably use site.cfg file. Pearu From oliphant at ee.byu.edu Wed Nov 10 14:50:02 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 10 Nov 2004 12:50:02 -0700 Subject: [SciPy-user] SciPy LiveDocs In-Reply-To: References: <41916E79.6010305@ee.byu.edu> Message-ID: <419270EA.80307@ee.byu.edu> Pearu Peterson wrote: > > > On Tue, 9 Nov 2004, Travis Oliphant wrote: > >> >> You may be interested in the online, linked documentation located here: >> >> http://www.scipy.org/livedocs/ >> >> (don't forget the / at the end) >> >> Eventually, this site will support user feedback, but for the moment, >> it is a hierarchial documentation index of what is available using >> the docstrings in the code itself. > > > I like livedocs/ very much. It's fast (I hope you are not planning to > expose livedocs/ through plone:) and well accessible compared to > http://www.scipy.org/documentation/apidocs > > Will livedocs/ support class documentation? See e.g. > http://oliphant.ee.byu.edu:81/scipy_base/ParallelExec > Yes, it's just a matter of handling all the different types of objects that scipy uses. I was handling old-style classes but had to add a simple check for the new-style class that this one is. It should work now. Notice, that it uses info() to construct the documentation that is printed. -Travis O. From oliphant at ee.byu.edu Wed Nov 10 14:52:19 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 10 Nov 2004 12:52:19 -0700 Subject: [SciPy-user] SciPy LiveDocs In-Reply-To: References: <41916E79.6010305@ee.byu.edu> Message-ID: <41927173.6000104@ee.byu.edu> Pearu Peterson wrote: > > > On Tue, 9 Nov 2004, Travis Oliphant wrote: > >> >> You may be interested in the online, linked documentation located here: >> >> http://www.scipy.org/livedocs/ >> >> (don't forget the / at the end) >> >> Eventually, this site will support user feedback, but for the moment, >> it is a hierarchial documentation index of what is available using >> the docstrings in the code itself. > > > I like livedocs/ very much. It's fast (I hope you are not planning to > expose livedocs/ through plone:) and well accessible compared to > http://www.scipy.org/documentation/apidocs The thought was to do something simple outside of plone, and so I don't have any such plans. The Python code that runs the site is very simple right now (not including the nevow and twisted.web software itself) so I can grasp it and make changes easily. I suspect most Python users could also make changes easily. -Travis O. 
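As a quick local check of what livedocs renders: the docstring-driven help described above can be pulled up with the info() helper that ships with scipy_base. A minimal sketch only, assuming the 0.3.x-era info() utility; scipy.linalg.lu is just an arbitrary target picked for illustration:

    >>> import scipy
    >>> scipy.info(scipy.linalg.lu)   # prints docstring-based help, much like what livedocs shows
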
From oliphant at ee.byu.edu Wed Nov 10 14:55:02 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 10 Nov 2004 12:55:02 -0700 Subject: [SciPy-user] Sparse matrices In-Reply-To: <41924579.5080408@mecha.uni-stuttgart.de> References: <41924579.5080408@mecha.uni-stuttgart.de> Message-ID: <41927216.7040606@ee.byu.edu> Nils Wagner wrote: > Hi all, > > from scipy import * > A = sparse.dok_matrix() > > for i in range(3): > A[i,i] = 0.0+0.0j > # A[i,i] = 0.1+0.0j > > n,m = A.shape > print n,m > print A > > > n, m should be equal to 3, but this is only valid for non-zero entries. > > Am I missing something ? Entering a zero value into a sparse matrix causes no storage and therefore no shape changes. A remains an empty sparse matrix. What are you trying to do? -Travis From david at dwavesys.com Wed Nov 10 16:08:13 2004 From: david at dwavesys.com (David Grant) Date: Wed, 10 Nov 2004 13:08:13 -0800 Subject: [SciPy-user] Sparse matrices In-Reply-To: <41927216.7040606@ee.byu.edu> References: <41924579.5080408@mecha.uni-stuttgart.de> <41927216.7040606@ee.byu.edu> Message-ID: <4192833D.9090601@dwavesys.com> Travis Oliphant wrote: > Nils Wagner wrote: > >> Hi all, >> >> from scipy import * >> A = sparse.dok_matrix() >> >> for i in range(3): >> A[i,i] = 0.0+0.0j >> # A[i,i] = 0.1+0.0j >> >> n,m = A.shape >> print n,m >> print A >> >> >> n, m should be equal to 3, but this is only valid for non-zero entries. >> >> Am I missing something ? > > > Entering a zero value into a sparse matrix causes no storage and > therefore no shape changes. A remains an empty sparse matrix. What > are you trying to do? > > One thing that it is interesting. These sparse arrays are dynamically allocated I assume? In the PySparse package, arrays are dynamically allocated, but the shape must be set upon initialization. It looks like you've made a more flexible package here. From gazzar at email.com Wed Nov 10 19:09:27 2004 From: gazzar at email.com (Gary Ruben) Date: Thu, 11 Nov 2004 10:09:27 +1000 Subject: [SciPy-user] Type handling of matrices Message-ID: <20041111000927.82273164005@ws1-4.us4.outblaze.com> Hi Nils, How about transposing and subtracting it? >>> a=array([[1,2,3],[2,1,4],[3,4,1]]) >>> a array([[1, 2, 3], [2, 1, 4], [3, 4, 1]]) >>> b=transpose(a) >>> b array([[1, 2, 3], [2, 1, 4], [3, 4, 1]]) >>> a-b array([[0, 0, 0], [0, 0, 0], [0, 0, 0]]) Alternatively, maybe you could sum along each axis of your matrix and compare the results? eg. >>> c=sum(a) >>> d=sum(a,1) >>> c array([6, 7, 8]) >>> d array([6, 7, 8]) >>> c-d array([0, 0, 0]) I'm not sure whether there might be cases where this could falsely identify a matrix as symmetric - you'll have to have a think about this. Gary R. ----- Original Message ----- > > Hi all, > > Is it currently possible (by a built-in function) to verify whether a > matrix > is symmetric or not. I am also interested in a > built-in function for verifying the definiteness of a matrix ? > > Any suggestion or comment ? 
> > Nils -- ___________________________________________________________ Sign-up for Ads Free at Mail.com http://promo.mail.com/adsfreejump.htm From nwagner at mecha.uni-stuttgart.de Thu Nov 11 03:06:09 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 11 Nov 2004 09:06:09 +0100 Subject: [SciPy-user] Type handling of matrices In-Reply-To: <20041111000927.82273164005@ws1-4.us4.outblaze.com> References: <20041111000927.82273164005@ws1-4.us4.outblaze.com> Message-ID: <41931D71.5060404@mecha.uni-stuttgart.de> Gary Ruben wrote: >Hi Nils, >How about transposing and subtracting it? > > > That's the definition of a real symmetric matrix A = transpose(A) In case of complex matrices we have A = conj(transpose(A)) I am looking for a built-in function A.issym which returns 1 if A is hermitian 0 if A is non-hermitian How about similar functions for A.issingular (singular from a numerical point of view) A.isspd (symmetric positive definite) This might be useful with respect to iterative solvers. AFAIK linalg.cg is restricted to spd matrices. A.isindefinite Any comments or suggestions ? Nils >>>>a=array([[1,2,3],[2,1,4],[3,4,1]]) >>>>a >>>> >>>> >array([[1, 2, 3], > [2, 1, 4], > [3, 4, 1]]) > > >>>>b=transpose(a) >>>>b >>>> >>>> >array([[1, 2, 3], > [2, 1, 4], > [3, 4, 1]]) > > >>>>a-b >>>> >>>> >array([[0, 0, 0], > [0, 0, 0], > [0, 0, 0]]) > > >Alternatively, maybe you could sum along each axis of your matrix and compare the results? >eg. > > > >>>>c=sum(a) >>>>d=sum(a,1) >>>>c >>>> >>>> >array([6, 7, 8]) > > >>>>d >>>> >>>> >array([6, 7, 8]) > > >>>>c-d >>>> >>>> >array([0, 0, 0]) > >I'm not sure whether there might be cases where this could falsely identify a matrix as symmetric - you'll have to have a think about this. >Gary R. > >----- Original Message ----- > > >>Hi all, >> >>Is it currently possible (by a built-in function) to verify whether a >>matrix >>is symmetric or not. I am also interested in a >>built-in function for verifying the definiteness of a matrix ? >> >>Any suggestion or comment ? >> >>Nils >> >> > > > From pearu at scipy.org Thu Nov 11 03:44:40 2004 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 11 Nov 2004 02:44:40 -0600 (CST) Subject: [SciPy-user] Type handling of matrices In-Reply-To: <41931D71.5060404@mecha.uni-stuttgart.de> References: <20041111000927.82273164005@ws1-4.us4.outblaze.com> <41931D71.5060404@mecha.uni-stuttgart.de> Message-ID: On Thu, 11 Nov 2004, Nils Wagner wrote: > Gary Ruben wrote: > >> Hi Nils, >> How about transposing and subtracting it? >> >> > That's the definition of a real symmetric matrix > A = transpose(A) > > In case of complex matrices we have > > A = conj(transpose(A)) > > I am looking for a built-in function > > A.issym > > which returns > 1 if A is hermitian > 0 if A is non-hermitian > > How about similar functions for > > A.issingular (singular from a numerical point of view) > A.isspd (symmetric positive definite) This might be useful with respect to > iterative solvers. AFAIK linalg.cg is restricted to spd matrices. > A.isindefinite Methods (not attributes) A.is can be defined for sparse matrices, in fact, sparse.spmatrix should take advantage of using these properties. For full matrices, is must probably be a function as we cannot change Numeric of numarray in this way (unless we subclass Matrix). They can be implemented in scipy_base/matrix_base.py, for instance. 
= symmetric | hermitian | singular | positive | negative | nonnegative | nonpositive All these predicate functions must have optional tolerance argument to take into account numerical errors that accumulate when computing eigenvalues.. Pearu From Fernando.Perez at colorado.edu Thu Nov 11 03:51:45 2004 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 11 Nov 2004 01:51:45 -0700 Subject: [SciPy-user] Type handling of matrices In-Reply-To: References: <20041111000927.82273164005@ws1-4.us4.outblaze.com> <41931D71.5060404@mecha.uni-stuttgart.de> Message-ID: <41932821.2090200@colorado.edu> Pearu Peterson wrote: > Methods (not attributes) A.is can be defined for sparse > matrices, in fact, sparse.spmatrix should take advantage of using > these properties. > > For full matrices, is must probably be a function as we cannot > change Numeric of numarray in this way (unless we subclass Matrix). They > can be implemented in scipy_base/matrix_base.py, for instance. How about just using functions for everything? I'd really hate to have to remember that if A is sparse, A.is works, but if it's dense I need to instead use is(A). Another alternative would be to suggest these modifications be made to numarray proper, if the method approach appears preferable. But I'd really like to vote for a _single_ syntax for property queries. Ultimately which one doesn't matter that much to me, but I don't want to see two different syntaxes for the same thing. Best, f From rkern at ucsd.edu Thu Nov 11 04:30:25 2004 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 11 Nov 2004 01:30:25 -0800 Subject: [SciPy-user] Type handling of matrices In-Reply-To: <41932821.2090200@colorado.edu> References: <20041111000927.82273164005@ws1-4.us4.outblaze.com> <41931D71.5060404@mecha.uni-stuttgart.de> <41932821.2090200@colorado.edu> Message-ID: <41933131.7010408@ucsd.edu> Fernando Perez wrote: > How about just using functions for everything? I'd really hate to have > to remember that if A is sparse, A.is works, but if it's dense I > need to instead use is(A). The big benefit for a property (not method) on sparse matrices is that one can set A.ishermitian on initialization and all get/set operations will obey that property. A = sparse.dok_matrix() A.ishermitian = True A[1,2] = 1. assert A[2,1] == 1. Some sparse formats may be able to hold the property information and use it effectively without duplicating the individual elements themselves. There should definitely be a function interface for querying these properties that will work for both dense and sparse matrices. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From nwagner at mecha.uni-stuttgart.de Thu Nov 11 04:47:07 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 11 Nov 2004 10:47:07 +0100 Subject: [SciPy-user] Possibly bug in iterative.py Message-ID: <4193351B.9040509@mecha.uni-stuttgart.de> Hi all, I am going to solve large scale linear systems with linalg.gmres This is the output of my test program... Traceback (most recent call last): File "test1.py", line 12, in ? 
x1 = linalg.gmres(ma-s*mb,r) File "/usr/lib/python2.3/site-packages/scipy/linalg/iterative.py", line 472, in gmres work = sb.zeros((6+restrt)*n,typ) ValueError: negative dimensions are not allowed >>> ma <67986x67986 sparse matrix of type 'd' with 4222171 stored elements (space for 4222171) in COOrdinate format> >>> mb <67986x67986 sparse matrix of type 'd' with 68150 stored elements (space for 68150) in COOrdinate format> >>> s (3000000+1j) >>> type(r) >>> shape(r) (67986,) Any suggestion ? Nils From pearu at scipy.org Thu Nov 11 04:49:18 2004 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 11 Nov 2004 03:49:18 -0600 (CST) Subject: [SciPy-user] Type handling of matrices In-Reply-To: <41933131.7010408@ucsd.edu> References: <20041111000927.82273164005@ws1-4.us4.outblaze.com> <41932821.2090200@colorado.edu> <41933131.7010408@ucsd.edu> Message-ID: On Thu, 11 Nov 2004, Robert Kern wrote: > Fernando Perez wrote: > >> How about just using functions for everything? I'd really hate to have to >> remember that if A is sparse, A.is works, but if it's dense I need to >> instead use is(A). > > The big benefit for a property (not method) on sparse matrices is that one > can set A.ishermitian on initialization and all get/set operations will obey > that property. > > A = sparse.dok_matrix() > A.ishermitian = True > A[1,2] = 1. > assert A[2,1] == 1. The problem with this approach is that it is not safe. It is too easy to make the following error: A.ishermitian = True A[1,2] = 1. A[2,1] = 2. I would introduce sethermitian(flag=True) method that for an empty matrix sets private A._ishermitian and for a non-empty matrix it first checks if A can be hermitian (if not, it raises an exception). Hmm, should traits package or Python property features be used here? > Some sparse formats may be able to hold the property information and use it > effectively without duplicating the individual elements themselves. I agree. > There should definitely be a function interface for querying these properties > that will work for both dense and sparse matrices. Ok. Pearu From rkern at ucsd.edu Thu Nov 11 05:31:05 2004 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 11 Nov 2004 02:31:05 -0800 Subject: [SciPy-user] Type handling of matrices In-Reply-To: References: <20041111000927.82273164005@ws1-4.us4.outblaze.com> <41932821.2090200@colorado.edu> <41933131.7010408@ucsd.edu> Message-ID: <41933F69.5040909@ucsd.edu> Pearu Peterson wrote: > > > On Thu, 11 Nov 2004, Robert Kern wrote: > >> Fernando Perez wrote: >> >>> How about just using functions for everything? I'd really hate to >>> have to remember that if A is sparse, A.is works, but if it's >>> dense I need to instead use is(A). >> >> >> The big benefit for a property (not method) on sparse matrices is that >> one can set A.ishermitian on initialization and all get/set operations >> will obey that property. >> >> A = sparse.dok_matrix() >> A.ishermitian = True >> A[1,2] = 1. >> assert A[2,1] == 1. > > > The problem with this approach is that it is not safe. It is too easy > to make the following error: > > A.ishermitian = True > A[1,2] = 1. > A[2,1] = 2. > > I would introduce sethermitian(flag=True) method that for an empty matrix > sets private A._ishermitian and for a non-empty matrix it first checks > if A can be hermitian (if not, it raises an exception). Actually, now that I think some more on it, I'd probably make them keyword arguments in __init__ and the corresponding properties read-only. A[1,2] = 1. A[2,1] = 2. 
A.ishermitian = True This should raise an error since I can't think of a reasonable action to perform. > Hmm, should traits package or Python property features be used here? traits may be overkill, but properties certainly can be used here. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Thu Nov 11 05:48:09 2004 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 11 Nov 2004 04:48:09 -0600 (CST) Subject: [SciPy-user] Type handling of matrices In-Reply-To: <41933F69.5040909@ucsd.edu> References: <20041111000927.82273164005@ws1-4.us4.outblaze.com> <41932821.2090200@colorado.edu> <41933131.7010408@ucsd.edu> <41933F69.5040909@ucsd.edu> Message-ID: On Thu, 11 Nov 2004, Robert Kern wrote: > Pearu Peterson wrote: >> >> >> On Thu, 11 Nov 2004, Robert Kern wrote: >> >>> Fernando Perez wrote: >>> >>>> How about just using functions for everything? I'd really hate to have >>>> to remember that if A is sparse, A.is works, but if it's dense I >>>> need to instead use is(A). >>> >>> >>> The big benefit for a property (not method) on sparse matrices is that one >>> can set A.ishermitian on initialization and all get/set operations will >>> obey that property. >>> >>> A = sparse.dok_matrix() >>> A.ishermitian = True >>> A[1,2] = 1. >>> assert A[2,1] == 1. >> >> >> The problem with this approach is that it is not safe. It is too easy >> to make the following error: >> >> A.ishermitian = True >> A[1,2] = 1. >> A[2,1] = 2. >> >> I would introduce sethermitian(flag=True) method that for an empty matrix >> sets private A._ishermitian and for a non-empty matrix it first checks >> if A can be hermitian (if not, it raises an exception). > > Actually, now that I think some more on it, I'd probably make them keyword > arguments in __init__ and the corresponding properties read-only. > > A[1,2] = 1. > A[2,1] = 2. > A.ishermitian = True > > This should raise an error since I can't think of a reasonable action to > perform. I think I like the idea of setting attributes in __init__ very much. This actually simplifies the code: no need to test if a matrix still has a specific property everytime the matrix is changed. Btw, sparse.dok_matrix should also have N,M optional arguments (like other spmatrix derivatives) to fix the shape. Otherwise, it is rather difficult to perform matrix-matrix multiplication, for exmaple, with dok_matrix'ses if the shape depends on its contents. Pearu From nwagner at mecha.uni-stuttgart.de Thu Nov 11 07:10:24 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 11 Nov 2004 13:10:24 +0100 Subject: [SciPy-user] Sparse identity matrix speye() Message-ID: <419356B0.8000507@mecha.uni-stuttgart.de> Hi all, Is there a better way to generate a sparse identity matrix I have used A=sparse.csc_matrix(identity(3)) Nils From nwagner at mecha.uni-stuttgart.de Thu Nov 11 08:01:28 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 11 Nov 2004 14:01:28 +0100 Subject: [SciPy-user] Automatically retreiving Matrix Market files Message-ID: <419362A8.5080002@mecha.uni-stuttgart.de> Hi all, I am going to retreive test matrices from the Matrix Market page from scipy import * from tempfile import mktemp from urllib import urlretrieve url="http://math.nist.gov/MatrixMarket/data/SPARSKIT/fidap" fname = mktemp(".mtx.gz") print "Downloading Matrix Market; this may take a minute..." 
urlretrieve(url, fname) How can I decompress the file afterwards and how do I use io.mmread() to import the matrix under consideration. Sorry, if this inquiry is not purely related to scipy. Nils From pearu at scipy.org Thu Nov 11 08:13:33 2004 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 11 Nov 2004 07:13:33 -0600 (CST) Subject: [SciPy-user] Automatically retreiving Matrix Market files In-Reply-To: <419362A8.5080002@mecha.uni-stuttgart.de> References: <419362A8.5080002@mecha.uni-stuttgart.de> Message-ID: On Thu, 11 Nov 2004, Nils Wagner wrote: > Hi all, > > I am going to retreive test matrices from the Matrix Market page > > from scipy import * > from tempfile import mktemp > from urllib import urlretrieve > url="http://math.nist.gov/MatrixMarket/data/SPARSKIT/fidap" > fname = mktemp(".mtx.gz") > print "Downloading Matrix Market; this may take a minute..." > urlretrieve(url, fname) > > How can I decompress the file afterwards and how do I use > > io.mmread() > > to import the matrix under consideration. Try: import gzip io.mmread(gzip.open(fname)) Pearu From nwagner at mecha.uni-stuttgart.de Thu Nov 11 08:39:47 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 11 Nov 2004 14:39:47 +0100 Subject: [SciPy-user] Automatically retreiving Matrix Market files In-Reply-To: References: <419362A8.5080002@mecha.uni-stuttgart.de> Message-ID: <41936BA3.8040204@mecha.uni-stuttgart.de> Pearu Peterson wrote: > > > On Thu, 11 Nov 2004, Nils Wagner wrote: > >> Hi all, >> >> I am going to retreive test matrices from the Matrix Market page >> >> from scipy import * >> from tempfile import mktemp >> from urllib import urlretrieve >> url="http://math.nist.gov/MatrixMarket/data/SPARSKIT/fidap" >> fname = mktemp(".mtx.gz") >> print "Downloading Matrix Market; this may take a minute..." >> urlretrieve(url, fname) >> >> How can I decompress the file afterwards and how do I use >> >> io.mmread() >> >> to import the matrix under consideration. > > > Try: > > import gzip > io.mmread(gzip.open(fname)) > > Pearu Hi Pearu, from scipy import * from tempfile import mktemp from urllib import urlretrieve import os path = os.path.join(os.environ['HOME'], 'matrices') import gzip url="ftp://math.nist.gov/pub/MatrixMarket2/SPARSKIT/fidap/fidap005.mtx.gz" fname = mktemp(".mtx.gz") print "Downloading Matrix Market; this may take a minute..." urlretrieve(url, fname) A = io.mmread(gzip.open(fname)) Now it works fine. Thank you very much ! BTW, how can I remove the *.mtx.gz files in /tmp automatically ? Nils > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From rkern at ucsd.edu Thu Nov 11 09:03:06 2004 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 11 Nov 2004 06:03:06 -0800 Subject: [SciPy-user] Automatically retreiving Matrix Market files In-Reply-To: <41936BA3.8040204@mecha.uni-stuttgart.de> References: <419362A8.5080002@mecha.uni-stuttgart.de> <41936BA3.8040204@mecha.uni-stuttgart.de> Message-ID: <4193711A.6080703@ucsd.edu> Nils Wagner wrote: > url="ftp://math.nist.gov/pub/MatrixMarket2/SPARSKIT/fidap/fidap005.mtx.gz" > fname = mktemp(".mtx.gz") > print "Downloading Matrix Market; this may take a minute..." > urlretrieve(url, fname) > A = io.mmread(gzip.open(fname)) > > Now it works fine. Thank you very much ! > > BTW, how can I remove the *.mtx.gz files in /tmp automatically ? 
os.unlink(fname) -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From nwagner at mecha.uni-stuttgart.de Thu Nov 11 09:12:34 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 11 Nov 2004 15:12:34 +0100 Subject: [SciPy-user] Visualizing Sparsity Pattern of matrices Message-ID: <41937352.7010104@mecha.uni-stuttgart.de> Hi all, Structure plots provide a quick visual check on the sparsity pattern of the matrix. A structure plot is a rectangular array of dots; a dot is black if the corresponding matrix element is nonzero otherwise it is white. Is it possible to generate such plots with scipy or should we switch over to matplotlib ? Nils Reference: http://math.nist.gov/MatrixMarket/structureplots.html From jdhunter at ace.bsd.uchicago.edu Thu Nov 11 10:05:04 2004 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Thu, 11 Nov 2004 09:05:04 -0600 Subject: [SciPy-user] Re: [Matplotlib-users] Visualizing Sparsity Pattern of matrices In-Reply-To: <41937352.7010104@mecha.uni-stuttgart.de> (Nils Wagner's message of "Thu, 11 Nov 2004 15:12:34 +0100") References: <41937352.7010104@mecha.uni-stuttgart.de> Message-ID: >>>>> "Nils" == Nils Wagner writes: Nils> Hi all, Structure plots provide a quick visual check on the Nils> sparsity pattern of the matrix. A structure plot is a Nils> rectangular array of dots; a dot is black if the Nils> corresponding matrix element is nonzero otherwise it is Nils> white. Nils> Is it possible to generate such plots with scipy or should Nils> we switch over to matplotlib ? A quick matplotlib implementation is below. In matlab this function is called "spy" and Alexander Schmolck requested this in an earlier post. The spy implementation uses plot markers which are fixed sizes (in points). For large matrices, you'll likely want to use a smaller markersize. Perhaps better would be to use a polygon collection setup so that the marker sizes filled the boundaries of the matrix cell. This would take a little more work, and would also have a different call signature that matlab's, since matlab also uses plots markers . If you have any thoughts on how you would like the implementation to work, please share them... JDH from matplotlib.matlab import * def get_xyz_where(Z, Cond): """ Z and Cond are MxN matrices. Z are data and Cond is a boolean matrix where some condition is satisfied. Return value is x,y,z where x and y are the indices into Z and z are the values of Z at those indices. x,y,z are 1D arrays This is a lot easier in numarray - is there a more elegant way to do this that works on both numeric and numarray? """ M,N = Z.shape z = ravel(Z) ind = nonzero( ravel(Cond) ) x = arange(M); x.shape = M,1 X = repeat(x, N, 1) x = ravel(X) y = arange(N); y.shape = 1,N Y = repeat(y, M) y = ravel(Y) x = take(x, ind) y = take(y, ind) z = take(z, ind) return x,y,z def spy(Z, marker='s', markersize=10, **kwargs): """ SPY(Z, **kwrags) plots the sparsity pattern of the matrix S. kwargs give the marker properties - see help(plot) for more information on marker properties """ x,y,z = get_xyz_where(Z, Z>0) plot(x+0.5,y+0.5, linestyle='None', marker=marker,markersize=markersize, **kwargs) M,N = 25,20 data = zeros((M,N))*0. 
data[:,12] = rand(M) data[5,:] = rand(N) spy(data) axis([0,M,0,N]) show() From jdhunter at ace.bsd.uchicago.edu Thu Nov 11 10:16:03 2004 From: jdhunter at ace.bsd.uchicago.edu (John Hunter) Date: Thu, 11 Nov 2004 09:16:03 -0600 Subject: [SciPy-user] Re: [Matplotlib-users] Visualizing Sparsity Pattern of matrices In-Reply-To: <41937352.7010104@mecha.uni-stuttgart.de> (Nils Wagner's message of "Thu, 11 Nov 2004 15:12:34 +0100") References: <41937352.7010104@mecha.uni-stuttgart.de> Message-ID: >>>>> "Nils" == Nils Wagner writes: Nils> Hi all, Structure plots provide a quick visual check on the Nils> sparsity pattern of the matrix. A structure plot is a Nils> rectangular array of dots; a dot is black if the Nils> corresponding matrix element is nonzero otherwise it is Nils> white. Nils> Is it possible to generate such plots with scipy or should Nils> we switch over to matplotlib ? Here's another implementation that uses images - likely to be much faster for very large matrices. import random from matplotlib.colors import LinearSegmentedColormap from matplotlib.matlab import * def spy2(Z): """ SPY(Z) plots the sparsity pattern of the matrix S as an image """ #binary colormap min white, max black cmapdata = { 'red' : ((0., 1., 1.), (1., 0., 0.)), 'green': ((0., 1., 1.), (1., 0., 0.)), 'blue' : ((0., 1., 1.), (1., 0., 0.)) } binary = LinearSegmentedColormap('binary', cmapdata, 2) Z = where(Z>0,1.,0.) imshow(transpose(Z), interpolation='nearest', cmap=binary) def get_sparse_matrix(M,N,frac=0.1): 'return a MxN sparse matrix with frac elements randomly filled' data = zeros((M,N))*0. for i in range(int(M*N*frac)): x = random.randint(0,M-1) y = random.randint(0,N-1) data[x,y] = rand() return data data = get_sparse_matrix(50,60) spy2(data) show() From maising at mrao.cam.ac.uk Thu Nov 11 12:50:00 2004 From: maising at mrao.cam.ac.uk (Klaus Maisinger) Date: Thu, 11 Nov 2004 17:50:00 +0000 (GMT) Subject: [SciPy-user] optimize.fmin_powell and extra arguments Message-ID: Dear SciPy experts, the following little test script causes errors on the last two lines for fmin_powell, but not for fmin. Did I get the calling convention wrong (SciPy 0.3.2, Numeric 23.6, Python 2.2.2, RH9)? Cheers, Klaus ################################################################ from scipy import * def fun(x,y): return x*x+y def fun2(x,y,z): return x*x+y[0]+z[0] a=array([1,2]) b=array([3,4]) print optimize.fmin(fun, [ 2.], (4.,)) print optimize.fmin(fun2, [ 2.], (a,b)) print optimize.fmin_powell(fun, [ 2.], (4.,)) print optimize.fmin_powell(fun2, [ 2.], (a,b)) ############################################### Output: Optimization terminated successfully. Current function value: 4.000000 Iterations: 18 Function evaluations: 36 [ -1.77635684e-15] Optimization terminated successfully. Current function value: 4.000000 Iterations: 18 Function evaluations: 36 [ -1.77635684e-15] Traceback (most recent call last): File "./test_fun.py", line 15, in ? 
print optimize.fmin_powell(fun, [ 2.], (4.,)) File "/usr/lib/python2.2/site-packages/scipy/optimize/optimize.py", line 1491, in fmin_powell fval, x, direc1 = _linesearch_powell(func, x, direc1, args=args, tol=xtol*100) File "/usr/lib/python2.2/site-packages/scipy/optimize/optimize.py", line 1420, in _linesearch_powell full_output=1, tol=tol) File "/usr/lib/python2.2/site-packages/scipy/optimize/optimize.py", line 1196, in brent xa,xb,xc,fa,fb,fc,funcalls = bracket(func, args=args) File "/usr/lib/python2.2/site-packages/scipy/optimize/optimize.py", line 1354, in bracket fa = apply(func, (xa,)+args) File "/usr/lib/python2.2/site-packages/scipy/optimize/optimize.py", line 1410, in _myfunc funcargs = (x0 + alpha * direc,)+args TypeError: can only concatenate tuple (not "float") to tuple or (if I comment out the penultimate line): Traceback (most recent call last): File "./test_fun.py", line 15, in ? print optimize.fmin_powell(fun2, [ 2.], (a,b)) File "/usr/lib/python2.2/site-packages/scipy/optimize/optimize.py", line 1491, in fmin_powell fval, x, direc1 = _linesearch_powell(func, x, direc1, args=args, tol=xtol*100) File "/usr/lib/python2.2/site-packages/scipy/optimize/optimize.py", line 1420, in _linesearch_powell full_output=1, tol=tol) File "/usr/lib/python2.2/site-packages/scipy/optimize/optimize.py", line 1196, in brent xa,xb,xc,fa,fb,fc,funcalls = bracket(func, args=args) File "/usr/lib/python2.2/site-packages/scipy/optimize/optimize.py", line 1354, in bracket fa = apply(func, (xa,)+args) TypeError: _myfunc() takes at most 5 arguments (6 given) From pearu at scipy.org Thu Nov 11 14:40:46 2004 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 11 Nov 2004 13:40:46 -0600 (CST) Subject: [SciPy-user] optimize.fmin_powell and extra arguments In-Reply-To: References: Message-ID: On Thu, 11 Nov 2004, Klaus Maisinger wrote: > Dear SciPy experts, > > the following little test script causes errors on the last two lines for > fmin_powell, but not for fmin. Did I get the calling convention wrong (SciPy > 0.3.2, Numeric 23.6, Python 2.2.2, RH9)? No, it is a bug in 0.3.2 and is fixed in Scipy CVS. Pearu From oliphant at ee.byu.edu Thu Nov 11 15:42:21 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 11 Nov 2004 13:42:21 -0700 Subject: [SciPy-user] Sparse identity matrix speye() In-Reply-To: <419356B0.8000507@mecha.uni-stuttgart.de> References: <419356B0.8000507@mecha.uni-stuttgart.de> Message-ID: <4193CEAD.80304@ee.byu.edu> Nils Wagner wrote: > Hi all, > > Is there a better way to generate a sparse identity matrix > > I have used > > A=sparse.csc_matrix(identity(3)) > Use coordinate format matrices: A = sparse.coo_matrix( ) -Travis From narnett at mccmedia.com Thu Nov 11 21:45:57 2004 From: narnett at mccmedia.com (Nick Arnett) Date: Thu, 11 Nov 2004 18:45:57 -0800 Subject: [SciPy-user] lingalg.svd performance? In-Reply-To: References: <41700D3E.2090703@mccmedia.com> Message-ID: <419423E5.9000500@mccmedia.com> Pearu Peterson wrote: > On a system with AMD Athlon(tm) 64 Processor 3000+ and 2GB memory: > > - calculating full svd of a 500x15000 matrix takes 5m55s consuming about > 1.8g memory > - calculating only singular values takes 22s consuming about 0.2g memory > > Note that the above results do not contain time spent for swaping when > memory is low. E.g. on a 1GB system calculating full svd would take > forever.. Thanks, belatedly. It'll help immensely if I actually get Atlas installed... didn't realize it wasn't there until now! 
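For reference, the two variants compared in the timings above look roughly like this. An untested sketch; RandomArray just builds a (smaller) stand-in for the real 500x15000 matrix:

    import RandomArray                    # Numeric's random module, only used to build test data
    from scipy import linalg
    a = RandomArray.random((500, 1500))   # stand-in for the real, much larger matrix
    u, s, vt = linalg.svd(a)              # full SVD: building u and vt dominates time and memory
    sigma = linalg.svdvals(a)             # singular values only -- the much cheaper variant
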
Nick From nwagner at mecha.uni-stuttgart.de Fri Nov 12 07:19:46 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 12 Nov 2004 13:19:46 +0100 Subject: [SciPy-user] Monitoring Functions for Iterative Solvers Message-ID: <4194AA62.1060603@mecha.uni-stuttgart.de> Hi all, I am going to solve large linear systems with current iterative solvers in scipy. Is it possible to get some control output during the iteration process, e.g iteration number versus actual residual How can I find a suitable starting guess x_0 ? Default is a zero vector, but is there any other rule of thumb ? Nils From nwagner at mecha.uni-stuttgart.de Fri Nov 12 07:35:30 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 12 Nov 2004 13:35:30 +0100 Subject: [SciPy-user] Inexact Newton methods in scipy Message-ID: <4194AE12.3040001@mecha.uni-stuttgart.de> Hi all, A standard method to solve nonlinear equations f(x) = 0 is Newton's method. Given a suitable initial guess one iterates f'(x_k) \Delta x_k = -f(x_k) x_{k+1} = x_k + \Delta x_k If the Jacobian is not available in a direct manner, we can apply f'(x_k) to a vector \Delta x_k by a finite difference formula (see my previous mail fdf package) BTW, most publications deal with real Jacobians. How can I extend finite difference formulas to complex Jacobians ? help (linalg.gmres) yields A -- An array or an object with a matvec(x) method to represent A * x A small example illustrating an object with a matvec(x) method would be appreciated. Nils From pearu at scipy.org Fri Nov 12 07:40:31 2004 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 12 Nov 2004 06:40:31 -0600 (CST) Subject: [SciPy-user] Inexact Newton methods in scipy In-Reply-To: <4194AE12.3040001@mecha.uni-stuttgart.de> References: <4194AE12.3040001@mecha.uni-stuttgart.de> Message-ID: On Fri, 12 Nov 2004, Nils Wagner wrote: > Hi all, > > A standard method to solve nonlinear equations > > f(x) = 0 > > is Newton's method. Given a suitable initial guess one iterates > > f'(x_k) \Delta x_k = -f(x_k) > x_{k+1} = x_k + \Delta x_k > > If the Jacobian is not available in a direct manner, we can apply f'(x_k) to > a vector \Delta x_k by a finite difference formula (see my previous mail fdf > package) > > BTW, most publications deal with real Jacobians. How can I extend finite > difference formulas to complex Jacobians ? Apply finite difference formula to real and imaginary part of f(x) to get an approximation for complex Jacobian. Pearu From val at vtek.com Fri Nov 12 09:40:43 2004 From: val at vtek.com (val) Date: Fri, 12 Nov 2004 09:40:43 -0500 Subject: [SciPy-user] parsing curves, numerical data.. Message-ID: <1aa901c4c8c5$96ca7660$c400a8c0@sony> Hi the List, I'm impressed with how parsers like pyparsing do their job to parse symbolic data (texts, etc), in particular, a nice way of specificfation of teplates. Is there smth like that in numerical world? E.g., if you need to parse, say, ECGs, EEGs, instrument output, etc to find patterns such as Q,R,S patterns in ECGs, or 'novelty' patterns, etc. It'd be nice to specify template(s) for what you are looking for and get back from the parser a 'segmentation' of a curve in terms of those patterns, sort of 'curve understanding' (like image understanding). I have no clear idea of the template(s), though. Linear, quadratic, etc segments is one option, probably not the smartest one. Looking for repeating patterns in a curve (within an eps-based precision) would be probably a smarter way. I'm sure knowledgeable people can give a better advice. 
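One concrete starting point for the "repeating patterns in a curve" case: slide a known template along the trace with a cross-correlation and look for peaks in the match score. This is only a rough sketch with made-up toy data -- real ECG segmentation needs normalisation and far more care:

    import Numeric
    from scipy import signal
    # toy data purely for illustration: a flat trace with one copy of the template in it
    template = Numeric.array([0., 1., 2., 1., 0.])
    trace = Numeric.zeros(200, Numeric.Float)
    trace[80:85] = template
    score = signal.correlate(trace, template, mode='valid')
    print Numeric.argmax(score)   # -> 80, the offset where the template lines up best
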
thanks, val -------------- next part -------------- An HTML attachment was scrubbed... URL: From nwagner at mecha.uni-stuttgart.de Sat Nov 13 05:52:14 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sat, 13 Nov 2004 11:52:14 +0100 Subject: [SciPy-user] Usage of callable matvec attributes Message-ID: Hi all, How do I use a callable matvec attribute in linalg.gmres ? A small test example would be appreciated. Thanks in advance. Nils From rkern at ucsd.edu Sat Nov 13 06:11:21 2004 From: rkern at ucsd.edu (Robert Kern) Date: Sat, 13 Nov 2004 03:11:21 -0800 Subject: [SciPy-user] Usage of callable matvec attributes In-Reply-To: References: Message-ID: <4195EBD9.1050609@ucsd.edu> Nils Wagner wrote: > Hi all, > > How do I use a callable matvec attribute > in linalg.gmres ? > > A small test example would be appreciated. > > Thanks in advance. Sparse matrices are examples of objects with matvec() methods. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From Giovanni.Samaey at cs.kuleuven.ac.be Mon Nov 15 08:28:40 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Mon, 15 Nov 2004 14:28:40 +0100 Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be><41924DF2.10102@cs.kuleuven.ac.be> <41925441.4050404@cs.kuleuven.ac.be> Message-ID: <4198AF08.4010505@cs.kuleuven.ac.be> Hi again, I am sorry I took so long; I tried a bit and then I got diverted. > Thanks. These locations look strange enough to be considered as > nonstandard locations (I tried google(/apps/prod/local64) and got no > results) and using site.cfg might be the proper way to go. See the header > of scipy_distutils/system_info.py file as well as of sample_site.cfg. I tried both options; the system_info.py file gives a lot of categories and I did not find which one to use. I tried setting library_dirs to what was there + what I need. librtary_dirs = /usr/local/lib/:/opt/lib:/usr/lib:/apps/prod/local64/lib64 (or /apps/prod/local64/lib64 in front). This didn't make any difference for the error that was reported earlier. I put this file in the scipy directory once, and also in scipy_distutils. > > Another approach would be to use build_ext -L flag, e.g try > > python setup.py build_ext -L/apps/prod/local64/lib64 build This approach gives an error earlier in the process: running build_ext Traceback (most recent call last): File "setup.py", line 112, in ? setup_package(ignore_packages) File "setup.py", line 99, in setup_package url = "http://www.scipy.org", File "scipy_core/scipy_distutils/core.py", line 73, in setup return old_setup(**new_attr) File "/data/home/giovanni/lib/python2.3/distutils/core.py", line 149, in setup dist.run_commands() File "/data/home/giovanni/lib/python2.3/distutils/dist.py", line 907, in run_commands self.run_command(cmd) File "/data/home/giovanni/lib/python2.3/distutils/dist.py", line 927, in run_command cmd_obj.run() File "scipy_core/scipy_distutils/command/build_ext.py", line 47, in run raise TypeError,'Extension "%s" sources contains unresolved'\ TypeError: Extension "scipy.xxx.spam" sources contains unresolved items (call build_src before build_ext). (Although I did run build_src before build_ext). > > to specify the location of libg2c.so. > > To resolve x11 libraries problem, you should probably use site.cfg file. I suppose here setting x11_dirs = /apps/prod/xlib64 will do the trick? 
Giovanni From cdavis at staffmail.ed.ac.uk Mon Nov 15 09:37:14 2004 From: cdavis at staffmail.ed.ac.uk (Cory Davis) Date: 15 Nov 2004 14:37:14 +0000 Subject: [SciPy-user] interpolate.interp1d Message-ID: <1100529433.30239.14.camel@fog> Hi all, It seems that interpolate.interp1d requires that the x argument is ascending. It seems odd that this should be the case. Either there is a bug or the ascending requirement should be stated in the documentation. Here is an example ... >>> from scipy.interpolate import * >>> zinterp=interp1d(x=-arange(10),y=arange(10)*2) >>> zinterp(-2.2) Traceback (most recent call last): File "", line 1, in ? File "/eosmls/local/linux/lib/python/scipy/interpolate/interpolate.py", line 165, in __call__ out_of_bounds = self._check_bounds(x_new) File "/eosmls/local/linux/lib/python/scipy/interpolate/interpolate.py", line 219, in _check_bounds raise ValueError, " A value in x_new is below the"\ ValueError: A value in x_new is below the interpolation range. Cheers, Cory. -- )))))))))))))))))))))))))))))))))))))))))))) Cory Davis Meteorology School of GeoSciences University of Edinburgh King's Buildings EDINBURGH EH9 3JZ ph: +44(0)131 6505092 fax +44(0)131 6505780 cdavis at staffmail.ed.ac.uk cory at met.ed.ac.uk http://www.geos.ed.ac.uk/contacts/homes/cdavis )))))))))))))))))))))))))))))))))))))))))))) From cdavis at staffmail.ed.ac.uk Mon Nov 15 09:41:58 2004 From: cdavis at staffmail.ed.ac.uk (Cory Davis) Date: 15 Nov 2004 14:41:58 +0000 Subject: [SciPy-user] interpolate.interp1d - please ignore previous post In-Reply-To: <1100529433.30239.14.camel@fog> References: <1100529433.30239.14.camel@fog> Message-ID: <1100529718.30239.21.camel@fog> Oops sorry guys. Inputs: x -- a 1d array of monotonically increasing real values. x cannot include duplicate values. (otherwise f is overspecified) Cory. On Mon, 2004-11-15 at 14:37, Cory Davis wrote: > Hi all, > > It seems that interpolate.interp1d requires that the x argument is > ascending. It seems odd that this should be the case. Either there is a > bug or the ascending requirement should be stated in the documentation. > Here is an example ... > > >>> from scipy.interpolate import * > >>> zinterp=interp1d(x=-arange(10),y=arange(10)*2) > >>> zinterp(-2.2) > Traceback (most recent call last): > File "", line 1, in ? > File > "/eosmls/local/linux/lib/python/scipy/interpolate/interpolate.py", line > 165, in __call__ > out_of_bounds = self._check_bounds(x_new) > File > "/eosmls/local/linux/lib/python/scipy/interpolate/interpolate.py", line > 219, in _check_bounds > raise ValueError, " A value in x_new is below the"\ > ValueError: A value in x_new is below the interpolation range. > > Cheers, > Cory. 
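A workaround sketch for the descending case (untested here), since the docstring quoted above only promises monotonically increasing x: hand interp1d reversed copies of both arrays.

    >>> from Numeric import arange
    >>> from scipy.interpolate import interp1d
    >>> x = -arange(10.); y = arange(10.)*2
    >>> zinterp = interp1d(x[::-1], y[::-1])   # reversed, so x is now increasing
    >>> zinterp(-2.2)                          # interpolates cleanly now (y = -2*x, so about 4.4)
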
-- )))))))))))))))))))))))))))))))))))))))))))) Cory Davis Meteorology School of GeoSciences University of Edinburgh King's Buildings EDINBURGH EH9 3JZ ph: +44(0)131 6505092 fax +44(0)131 6505780 cdavis at staffmail.ed.ac.uk cory at met.ed.ac.uk http://www.geos.ed.ac.uk/contacts/homes/cdavis )))))))))))))))))))))))))))))))))))))))))))) From pearu at scipy.org Mon Nov 15 10:00:46 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 15 Nov 2004 09:00:46 -0600 (CST) Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: <4198AF08.4010505@cs.kuleuven.ac.be> References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be><41924DF2.10102@cs.kuleuven.ac.be> <4198AF08.4010505@cs.kuleuven.ac.be> Message-ID: On Mon, 15 Nov 2004, Giovanni Samaey wrote: > Hi again, > > I am sorry I took so long; I tried a bit and then I got diverted. > >> Thanks. These locations look strange enough to be considered as nonstandard >> locations (I tried google(/apps/prod/local64) and got no results) and using >> site.cfg might be the proper way to go. See the header >> of scipy_distutils/system_info.py file as well as of sample_site.cfg. > > I tried both options; the system_info.py file gives a lot of categories and > I did not find which one to use. > I tried setting library_dirs to what was there + what I need. > librtary_dirs = /usr/local/lib/:/opt/lib:/usr/lib:/apps/prod/local64/lib64 > (or /apps/prod/local64/lib64 in front). > This didn't make any difference for the error that was reported earlier. > > I put this file in the scipy directory once, and also in scipy_distutils. After you'll update scipy_core from CVS, try the following site.cfg: [atlas] libraries = g2c library_dirs = /apps/prod/local64/lib64/ >> Another approach would be to use build_ext -L flag, e.g try >> >> python setup.py build_ext -L/apps/prod/local64/lib64 build > > This approach gives an error earlier in the process: > running build_ext > Traceback (most recent call last): > TypeError: Extension "scipy.xxx.spam" sources contains unresolved items (call > build_src before build_ext). > > (Although I did run build_src before build_ext). Hmm, may be build_clib needs similar -L option then. >> To resolve x11 libraries problem, you should probably use site.cfg file. > > I suppose here setting x11_dirs = /apps/prod/xlib64 will do the trick? [x11] library_dirs = /apps/prod/xlib64 should be sufficient. Note that site.cfg should be saved to the same location where system_info.py is loaded. So, if you install scipy_distutils then you must also copy site.cfg to installed directory. Finally, instead of hitting `python setup.py build` each time you'll modify site.cfg, run python system_info.py lapack_opt x11 until you are satisfied with the output. HTH, Pearu From Giovanni.Samaey at cs.kuleuven.ac.be Mon Nov 15 11:50:09 2004 From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Mon, 15 Nov 2004 17:50:09 +0100 Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be><41924DF2.10102@cs.kuleuven.ac.be> <4198AF08.4010505@cs.kuleuven.ac.be> Message-ID: <4198DE41.3020802@cs.kuleuven.ac.be> > > After you'll update scipy_core from CVS, try the following site.cfg: > > [atlas] > libraries = g2c > library_dirs = /apps/prod/local64/lib64/ I am terribly sorry, but this doesn't help... 
I was able to get the install script to find these directories, because I get Setting PTATLAS=ATLAS FOUND: libraries = ['ptf77blas', 'ptcblas', 'atlas', 'g2c'] library_dirs = ['/data/home/giovanni/ATLAS/Linux_HAMMER64SSE2_2/lib', '/apps/prod/local64/lib64/'] language = c include_dirs = ['/data/home/giovanni/ATLAS/Linux_HAMMER64SSE2_2/include'] which clearly shows my lib64 directory. However, the fatal command still reads: /usr/bin/g77 -shared build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o -Lbuild/temp.linux-x86_64-2.3 -lfitpack -lg2c -o build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so /usr/bin/ld: /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a(lread.o): relocation R_X86_64_32 can not be used when making a shared object; recompile with -fPIC /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a: could not read symbols: Bad value collect2: ld returned 1 exit status /usr/bin/ld: /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a(lread.o): relocation R_X86_64_32 can not be used when making a shared object; recompile with -fPIC /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a: could not read symbols: Bad value collect2: ld returned 1 exit status error: Command "/usr/bin/g77 -shared build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o -Lbuild/temp.linux-x86_64-2.3 -lfitpack -lg2c -o build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so" failed with exit status 1 You'll see that the -lg2c is still interpreted by ld as pointing to the "standard" directory. Changing -lg2c to an explicit -L/path/to/libg2c.so worked. (As was indicated earlier in this thread.) Giovanni From pearu at scipy.org Mon Nov 15 13:33:56 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 15 Nov 2004 12:33:56 -0600 (CST) Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: <4198DE41.3020802@cs.kuleuven.ac.be> References: <41909A7F.9090204@cs.kuleuven.ac.be> <4190A0D1.80502@cs.kuleuven.ac.be><41924DF2.10102@cs.kuleuven.ac.be> <4198AF08.4010505@cs.kuleuven.ac.be> <4198DE41.3020802@cs.kuleuven.ac.be> Message-ID: On Mon, 15 Nov 2004, Giovanni Samaey wrote: >> >> After you'll update scipy_core from CVS, try the following site.cfg: >> >> [atlas] >> libraries = g2c >> library_dirs = /apps/prod/local64/lib64/ > > I am terribly sorry, but this doesn't help... Hey, don't worry, we are close:) > I was able to get the install script to find these directories, because I get > > Setting PTATLAS=ATLAS > FOUND: > libraries = ['ptf77blas', 'ptcblas', 'atlas', 'g2c'] > library_dirs = ['/data/home/giovanni/ATLAS/Linux_HAMMER64SSE2_2/lib', > '/apps/prod/local64/lib64/'] > language = c > include_dirs = ['/data/home/giovanni/ATLAS/Linux_HAMMER64SSE2_2/include'] > > which clearly shows my lib64 directory. 
> > However, the fatal command still reads: > > /usr/bin/g77 -shared > build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o > -Lbuild/temp.linux-x86_64-2.3 -lfitpack -lg2c -o > build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so > /usr/bin/ld: /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a(lread.o): > relocation R_X86_64_32 can not be used when making a shared object; recompile > with -fPIC > /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a: could not read symbols: Bad > value > collect2: ld returned 1 exit status > /usr/bin/ld: /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a(lread.o): > relocation R_X86_64_32 can not be used when making a shared object; recompile > with -fPIC > /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a: could not read symbols: Bad > value > collect2: ld returned 1 exit status > error: Command "/usr/bin/g77 -shared > build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o > -Lbuild/temp.linux-x86_64-2.3 -lfitpack -lg2c -o > build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so" failed with exit > status 1 This error comes from linking extension module that does not use atlas. So, the atlas section in site.cfg is not effective here. However, cd Lib/linalg python setup_linalg.py build should work (provided that site.cfg is installed in right place). So, to fix the g2c library path also for _fitpack, use the following section in site.cfg (you can remove the corresponding bits in [atlas]): [numpy] libraries = g2c library_dirs = /apps/prod/local64/lib64/ I haven't check but all extension modules in scipy should use numeric. If not then linkige should fail at some point but lets deal with that later on.. Pearu From giovanni.samaey at cs.kuleuven.ac.be Mon Nov 15 14:37:48 2004 From: giovanni.samaey at cs.kuleuven.ac.be (Giovanni Samaey) Date: Mon, 15 Nov 2004 20:37:48 +0100 Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: Message-ID: <200411151937.iAFJbsU08350@iris.cs.kuleuven.ac.be> OK. We are really approaching the end now. Fitpack got compiled! The next error is due to x11 not being correctly pointed to. In site.cfg I put [x11] library_dirs = /apps/prod/xlib64 but this doesn't do enough. I avoided this so far by ignoring the packages xplt,gplt,plt and gui_thread. But this is not sufficient, since I get the following error. building 'scipy_base.display_test' extension compiling C sources gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' compile options: '-DHAVE_X11 -I/usr/X11R6/include -I/data/home/giovanni/include/python2.3 -c' gcc -pthread -shared build/temp.linux-x86_64-2.3/scipy_core/scipy_base/src/display_test.o -L/usr/X11R6/lib -Lbuild/temp.linux-x86_64-2.3 -lX11 -o build/lib.linux-x86_64-2.3/scipy_base/display_test.so /usr/bin/ld: skipping incompatible /usr/X11R6/lib/libX11.so when searching for -lX11 /usr/bin/ld: skipping incompatible /usr/X11R6/lib/libX11.a when searching for -lX11 /usr/bin/ld: cannot find -lX11 collect2: ld returned 1 exit status /usr/bin/ld: skipping incompatible /usr/X11R6/lib/libX11.so when searching for -lX11 /usr/bin/ld: skipping incompatible /usr/X11R6/lib/libX11.a when searching for -lX11 /usr/bin/ld: cannot find -lX11 collect2: ld returned 1 exit status Anyway, we are promised x-forwarding, so at some point this has to be fixed anyway, even if we can avoid building this now. All help so far has been great; now we just need to work through the last bite here... 
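Pulling this thread's configuration hints into one place: the sections suggested so far can live side by side in a single site.cfg next to the installed scipy_distutils/system_info.py. The paths below are the ones quoted for this particular 64-bit machine and are purely illustrative; the [x11] entry only helps if that directory actually contains a usable libX11, which is what the follow-up messages sort out.

    [numpy]
    libraries = g2c
    library_dirs = /apps/prod/local64/lib64/

    [x11]
    library_dirs = /apps/prod/xlib64

Running "python system_info.py lapack_opt x11", as suggested earlier in the thread, is a quick way to confirm that both sections are picked up before starting another full build.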
Giovanni > -----Original Message----- > From: scipy-user-bounces at scipy.net [mailto:scipy-user-bounces at scipy.net] > On Behalf Of Pearu Peterson > Sent: maandag 15 november 2004 19:34 > To: SciPy Users List > Subject: Re: [SciPy-user] error linking to libg2c on 64-bit machine > > > > On Mon, 15 Nov 2004, Giovanni Samaey wrote: > > >> > >> After you'll update scipy_core from CVS, try the following site.cfg: > >> > >> [atlas] > >> libraries = g2c > >> library_dirs = /apps/prod/local64/lib64/ > > > > I am terribly sorry, but this doesn't help... > > Hey, don't worry, we are close:) > > > I was able to get the install script to find these directories, because > I get > > > > Setting PTATLAS=ATLAS > > FOUND: > > libraries = ['ptf77blas', 'ptcblas', 'atlas', 'g2c'] > > library_dirs = ['/data/home/giovanni/ATLAS/Linux_HAMMER64SSE2_2/lib', > > '/apps/prod/local64/lib64/'] > > language = c > > include_dirs = > ['/data/home/giovanni/ATLAS/Linux_HAMMER64SSE2_2/include'] > > > > which clearly shows my lib64 directory. > > > > However, the fatal command still reads: > > > > /usr/bin/g77 -shared > > build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o > > -Lbuild/temp.linux-x86_64-2.3 -lfitpack -lg2c -o > > build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so > > /usr/bin/ld: /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a(lread.o): > > relocation R_X86_64_32 can not be used when making a shared object; > recompile > > with -fPIC > > /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a: could not read symbols: > Bad > > value > > collect2: ld returned 1 exit status > > /usr/bin/ld: /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a(lread.o): > > relocation R_X86_64_32 can not be used when making a shared object; > recompile > > with -fPIC > > /usr/lib/gcc-lib/i486-linux/3.3.2/64/libg2c.a: could not read symbols: > Bad > > value > > collect2: ld returned 1 exit status > > error: Command "/usr/bin/g77 -shared > > build/temp.linux-x86_64-2.3/Lib/interpolate/_fitpackmodule.o > > -Lbuild/temp.linux-x86_64-2.3 -lfitpack -lg2c -o > > build/lib.linux-x86_64-2.3/scipy/interpolate/_fitpack.so" failed with > exit > > status 1 > > This error comes from linking extension module that does not use atlas. > So, the atlas section in site.cfg is not effective here. However, > > cd Lib/linalg > python setup_linalg.py build > > should work (provided that site.cfg is installed in right place). > > So, to fix the g2c library path also for _fitpack, use the following > section in site.cfg (you can remove the corresponding bits in [atlas]): > > [numpy] > libraries = g2c > library_dirs = /apps/prod/local64/lib64/ > > I haven't check but all extension modules in scipy should use numeric. If > not then linkige should fail at some point but lets deal with that later > on.. > > Pearu > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From pearu at scipy.org Mon Nov 15 15:27:48 2004 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 15 Nov 2004 14:27:48 -0600 (CST) Subject: [SciPy-user] error linking to libg2c on 64-bit machine In-Reply-To: <200411151937.iAFJbsU08350@iris.cs.kuleuven.ac.be> References: <200411151937.iAFJbsU08350@iris.cs.kuleuven.ac.be> Message-ID: On Mon, 15 Nov 2004, Giovanni Samaey wrote: > OK. We are really approaching the end now. > Fitpack got compiled! > > The next error is due to x11 not being correctly pointed to. 
> In site.cfg I put > [x11] > library_dirs = /apps/prod/xlib64 > > but this doesn't do enough. According to your previous messages, libX11.a is not in /apps/prod/xlib64, just .so files are there. Btw, does libX11.so exist in /apps/prod/xlib64? Probably not as this would explain why /apps/prod/xlib64 was ignored. > Anyway, we are promised x-forwarding, so at some point this has to be fixed > anyway, even if we can avoid building this now. I agree. I hope that your sysadmin can install 64-bit X11 developers package that should contain libX11.a. If not, we can figure something out to use X11 .so files. But until then, you can force undetecting X11 libraries by setting x11_libs = noX11 in [x11] section of site.cfg file. That should give you a change to complete scipy build. Pearu From p.berkes at biologie.hu-berlin.de Tue Nov 16 04:09:32 2004 From: p.berkes at biologie.hu-berlin.de (p.berkes at biologie.hu-berlin.de) Date: Tue, 16 Nov 2004 10:09:32 +0100 (CET) Subject: [SciPy-user] MDP 1.0.0 and symeig 1.0 released Message-ID: Hi, Tiziano and I are pleased to announce a new release of the Modular toolkit for Data Processing (MDP) 1.0.0 (http://mdp-toolkit.sourceforge.net) MDP is a Python library based on SciPy to implement data processing elements (nodes) and combine them into data processing sequences (flows). Already implemented nodes include Principal Component Analysis (PCA), Independent Component Analysis (ICA), Slow Feature Analysis (SFA), and Growing Neural Gas. In the past there has been some interest on the scipy-user mailing list concerning the MDP function 'symeig' to solve the standard and generalized eigenvalue problems for symmetric (hermitian) positive definite matrices: http://www.scipy.org/mailinglists/mailman?fn=scipy-user/2004-September/003241.html http://www.scipy.org/mailinglists/mailman?fn=scipy-user/2004-October/003391.html http://www.scipy.org/mailinglists/mailman?fn=scipy-user/2004-October/003396.html This function has been improved in the new release. In particular, it now correctly handles complex matrices (there seems to be something wrong with the LAPACK documentation in the complex case). 'symeig' is now distributed also as an independent package: http://mdp-toolkit.sourceforge.net/symeig.html Since the LAPACK functions wrapped by 'symeig' seem to be of general interest, you might want to consider including it in SciPy. The distribution includes the .pyf file used to generate the wrappers, the symeig function itself, and a test suite. Tiziano and I are ready to help with the inclusion process if needed. 'symeig' is currently released under LGPL, but we might change the licence if it conflicts with that of SciPy. 
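For readers new to the terminology in the announcement: the generalized problem is A v = lambda B v with A symmetric (or Hermitian) and B symmetric positive definite, and the standard problem is the special case B = I. Purely as a present-day point of reference, and not the symeig API being announced here, the same class of problem can be set up and checked like this:

    import numpy as np
    from scipy.linalg import eigh   # generalized symmetric-definite solver in today's SciPy

    rng = np.random.default_rng(0)
    n = 5
    M = rng.standard_normal((n, n))
    A = (M + M.T) / 2               # symmetric
    B = M @ M.T + n * np.eye(n)     # symmetric positive definite

    w, V = eigh(A, B)               # solves A v = w B v
    # residual check: each column of V satisfies A v = w B v
    print(np.allclose(A @ V, (B @ V) * w))   # True

The eigenvalues come back sorted in ascending order and the eigenvectors are normalized against B, which matches what the LAPACK sygv/hegv family computes.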
Best Regards, Pietro Berkes and Tiziano Zito ---------------------------------------- {p.berkes, t.zito}@biologie.hu-berlin.de Institute for Theoretical Biology Humboldt University Invalidenstrasse 43 D-10115 Berlin, Germany ---------------------------------------- From nwagner at mecha.uni-stuttgart.de Tue Nov 16 05:24:47 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 16 Nov 2004 11:24:47 +0100 Subject: [SciPy-user] MDP 1.0.0 and symeig 1.0 released In-Reply-To: References: Message-ID: <4199D56F.7080604@mecha.uni-stuttgart.de> p.berkes at biologie.hu-berlin.de wrote: >Hi, > >Tiziano and I are pleased to announce a new release of the > > Modular toolkit for Data Processing (MDP) 1.0.0 > (http://mdp-toolkit.sourceforge.net) > >MDP is a Python library based on SciPy to implement data processing >elements (nodes) and combine them into data processing sequences >(flows). Already implemented nodes include Principal Component >Analysis (PCA), Independent Component Analysis (ICA), Slow Feature >Analysis (SFA), and Growing Neural Gas. > >In the past there has been some interest on the scipy-user mailing >list concerning the MDP function 'symeig' to solve the standard and >generalized eigenvalue problems for symmetric (hermitian) positive >definite matrices: > >http://www.scipy.org/mailinglists/mailman?fn=scipy-user/2004-September/003241.html >http://www.scipy.org/mailinglists/mailman?fn=scipy-user/2004-October/003391.html >http://www.scipy.org/mailinglists/mailman?fn=scipy-user/2004-October/003396.html > >This function has been improved in the new release. In particular, it >now correctly handles complex matrices (there seems to be something >wrong with the LAPACK documentation in the complex case). 'symeig' is >now distributed also as an independent package: > >http://mdp-toolkit.sourceforge.net/symeig.html > >Since the LAPACK functions wrapped by 'symeig' seem to be of general >interest, you might want to consider including it in SciPy. The >distribution includes the .pyf file used to generate the wrappers, the >symeig function itself, and a test suite. Tiziano and I are ready to >help with the inclusion process if needed. 'symeig' is currently released >under LGPL, but we might change the licence if it conflicts with >that of SciPy. > >Best Regards, >Pietro Berkes and Tiziano Zito > >---------------------------------------- >{p.berkes, t.zito}@biologie.hu-berlin.de >Institute for Theoretical Biology >Humboldt University >Invalidenstrasse 43 >D-10115 Berlin, Germany >---------------------------------------- > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > I tried to install symeig. python setup.py install failed build/src/froutinesmodule.c: In function `f2py_rout_froutines_ssyevr': build/src/froutinesmodule.c:325: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:325: error: (Each undeclared identifier is reported only once build/src/froutinesmodule.c:325: error: for each function it appears in.) 
build/src/froutinesmodule.c:331: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:354: error: `A' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_dsyevr': build/src/froutinesmodule.c:583: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:589: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:612: error: `A' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_cheevr': build/src/froutinesmodule.c:855: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:861: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:876: error: `A' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_zheevr': build/src/froutinesmodule.c:1134: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:1140: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:1155: error: `A' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_ssygv': build/src/froutinesmodule.c:1390: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:1394: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_dsygv': build/src/froutinesmodule.c:1575: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:1579: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_chegv': build/src/froutinesmodule.c:1765: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:1769: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_zhegv': build/src/froutinesmodule.c:1967: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:1971: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_ssygvd': build/src/froutinesmodule.c:2170: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:2174: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_dsygvd': build/src/froutinesmodule.c:2376: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:2380: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_chegvd': build/src/froutinesmodule.c:2579: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:2583: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_zhegvd': build/src/froutinesmodule.c:2806: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:2810: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_ssygvx': build/src/froutinesmodule.c:3057: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:3075: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:3079: error: `I' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_dsygvx': build/src/froutinesmodule.c:3342: error: `V' undeclared (first use in this function) 
build/src/froutinesmodule.c:3360: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:3364: error: `I' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_chegvx': build/src/froutinesmodule.c:3627: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:3636: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:3654: error: `I' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_zhegvx': build/src/froutinesmodule.c:3929: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:3938: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:3956: error: `I' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_ssyevr': build/src/froutinesmodule.c:325: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:325: error: (Each undeclared identifier is reported only once build/src/froutinesmodule.c:325: error: for each function it appears in.) build/src/froutinesmodule.c:331: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:354: error: `A' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_dsyevr': build/src/froutinesmodule.c:583: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:589: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:612: error: `A' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_cheevr': build/src/froutinesmodule.c:855: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:861: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:876: error: `A' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_zheevr': build/src/froutinesmodule.c:1134: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:1140: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:1155: error: `A' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_ssygv': build/src/froutinesmodule.c:1390: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:1394: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_dsygv': build/src/froutinesmodule.c:1575: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:1579: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_chegv': build/src/froutinesmodule.c:1765: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:1769: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_zhegv': build/src/froutinesmodule.c:1967: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:1971: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_ssygvd': build/src/froutinesmodule.c:2170: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:2174: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_dsygvd': 
build/src/froutinesmodule.c:2376: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:2380: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_chegvd': build/src/froutinesmodule.c:2579: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:2583: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_zhegvd': build/src/froutinesmodule.c:2806: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:2810: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_ssygvx': build/src/froutinesmodule.c:3057: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:3075: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:3079: error: `I' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_dsygvx': build/src/froutinesmodule.c:3342: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:3360: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:3364: error: `I' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_chegvx': build/src/froutinesmodule.c:3627: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:3636: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:3654: error: `I' undeclared (first use in this function) build/src/froutinesmodule.c: In function `f2py_rout_froutines_zhegvx': build/src/froutinesmodule.c:3929: error: `U' undeclared (first use in this function) build/src/froutinesmodule.c:3938: error: `V' undeclared (first use in this function) build/src/froutinesmodule.c:3956: error: `I' undeclared (first use in this function) error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -D_FILE_OFFSET_BITS=64 -DHAVE_LARGEFILE_SUPPORT -O2 -march=i586 -mcpu=i686 -fmessage-length=0 -Wall -fPIC -DATLAS_INFO="\"3.7.8\"" -I/usr/local/lib/atlas -Ibuild/src -I/usr/include/python2.3 -c build/src/froutinesmodule.c -o build/temp.linux-i686-2.3/build/src/froutinesmodule.o" failed with exit status 1 Nils From pearu at scipy.org Tue Nov 16 05:49:08 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 16 Nov 2004 04:49:08 -0600 (CST) Subject: [SciPy-user] MDP 1.0.0 and symeig 1.0 released In-Reply-To: <4199D56F.7080604@mecha.uni-stuttgart.de> References: <4199D56F.7080604@mecha.uni-stuttgart.de> Message-ID: On Tue, 16 Nov 2004, Nils Wagner wrote: > I tried to install symeig. > python setup.py install failed > > build/src/froutinesmodule.c: In function `f2py_rout_froutines_ssyevr': > build/src/froutinesmodule.c:325: error: `U' undeclared (first use in this > function) This is because you are using f2py from its CVS that had a bug which is now fixed. Do f2py cvs update and try again. Pearu From jh at oobleck.astro.cornell.edu Tue Nov 16 09:57:23 2004 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Tue, 16 Nov 2004 09:57:23 -0500 Subject: [SciPy-user] MDP 1.0.0 and symeig 1.0 released Message-ID: <200411161457.iAGEvN7K008095@oobleck.astro.cornell.edu> Hi, Your package sounds like a great addition to SciPy! 
To help people find it, the Accessible SciPy Project has started a wiki listing topical software (loosely defined as anything not included in the SciPy release): http://www.scipy.org/wikis/topical_software/ I hope you will make an entry for your package. The page is a wiki so that developers and users can all contribute to the listings. Please find a suitable topical page and make a brief entry with a link to your software's web page (or direct download if it doesn't have a page of its own). If you don't find a suitable topical page, you can either start one (but that entails maintaining it in the future) or list your package under the "Unindexed Links" section at the bottom. Further guidelines are at the bottom of the wiki front page. Thanks! --jh-- From Fernando.Perez at colorado.edu Tue Nov 16 11:10:24 2004 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 16 Nov 2004 09:10:24 -0700 Subject: [SciPy-user] MDP 1.0.0 and symeig 1.0 released In-Reply-To: <200411161457.iAGEvN7K008095@oobleck.astro.cornell.edu> References: <200411161457.iAGEvN7K008095@oobleck.astro.cornell.edu> Message-ID: <419A2670.9070805@colorado.edu> Joe Harrington wrote: > Hi, > > Your package sounds like a great addition to SciPy! To help people > find it, the Accessible SciPy Project has started a wiki listing > topical software (loosely defined as anything not included in the > SciPy release): > > http://www.scipy.org/wikis/topical_software/ > > I hope you will make an entry for your package. The page is a wiki so > that developers and users can all contribute to the listings. Please > find a suitable topical page and make a brief entry with a link to > your software's web page (or direct download if it doesn't have a page > of its own). If you don't find a suitable topical page, you can > either start one (but that entails maintaining it in the future) or > list your package under the "Unindexed Links" section at the bottom. > Further guidelines are at the bottom of the wiki front page. It's already there: http://amath.colorado.edu/faculty/fperez/python/scicomp/ in the misc section, there's already an entry. This is the page meant to become the wiki, which I promised I'd write. I guess I'll just dump it over the existing wiki, because the whole point was to avoid duplicating efforts. I was reluctant to do so because Janet had established the original wiki, but there's no point in having this info in non-editable form. I'll try to work on that tonight. Cheers, f From pearu at scipy.org Tue Nov 16 13:19:41 2004 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 16 Nov 2004 12:19:41 -0600 (CST) Subject: [SciPy-user] MDP 1.0.0 and symeig 1.0 released In-Reply-To: References: Message-ID: On Tue, 16 Nov 2004 p.berkes at biologie.hu-berlin.de wrote: > 'symeig' is now distributed also as an independent package: > > http://mdp-toolkit.sourceforge.net/symeig.html > > Since the LAPACK functions wrapped by 'symeig' seem to be of general > interest, you might want to consider including it in SciPy. The > distribution includes the .pyf file used to generate the wrappers, the > symeig function itself, and a test suite. Tiziano and I are ready to > help with the inclusion process if needed. 'symeig' is currently released > under LGPL, but we might change the licence if it conflicts with > that of SciPy. I took a quick look at symeig... its .pyf file uses a bit different "style" compared to what's in linalg/generic_flapack.pyf and so simple copy&paste may not be appropriate. 
I think I'll create signatures for syevr, heevr, sygv, hegv, sygvd, hegvd, sygvx, hegvx myself and copy the test suite from symeig if that's alright with you. Pearu From nwagner at mecha.uni-stuttgart.de Tue Nov 16 15:00:33 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 16 Nov 2004 21:00:33 +0100 Subject: [SciPy-user] MDP 1.0.0 and symeig 1.0 released In-Reply-To: References: Message-ID: On Tue, 16 Nov 2004 12:19:41 -0600 (CST) Pearu Peterson wrote: > > > On Tue, 16 Nov 2004 p.berkes at biologie.hu-berlin.de >wrote: > >> 'symeig' is now distributed also as an independent >>package: >> >> http://mdp-toolkit.sourceforge.net/symeig.html >> >> Since the LAPACK functions wrapped by 'symeig' seem to >>be of general >> interest, you might want to consider including it in >>SciPy. The >> distribution includes the .pyf file used to generate the >>wrappers, the >> symeig function itself, and a test suite. Tiziano and I >>are ready to >> help with the inclusion process if needed. 'symeig' is >>currently released >> under LGPL, but we might change the licence if it >>conflicts with >> that of SciPy. > > I took a quick look at symeig... its .pyf file uses a >bit different > "style" compared to what's in linalg/generic_flapack.pyf >and so simple > copy&paste may not be appropriate. I think I'll create >signatures > for syevr, heevr, sygv, hegv, sygvd, hegvd, sygvx, hegvx >myself > and copy the test suite from symeig if that's alright >with you. > > > Pearu symeig.test.benchmark() works fine, but symeig.test.test() failed with a segmantation fault. >>> scipy.__version__ '0.3.2_293.4427' f2py -v 2.44.240_1881 Python 2.3.2 Nils > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From narnett at liveworld.com Fri Nov 12 20:24:01 2004 From: narnett at liveworld.com (Nick Arnett) Date: Fri, 12 Nov 2004 17:24:01 -0800 Subject: [SciPy-user] eigenvalues of sparse matrix In-Reply-To: <417EDA59.8090308@dwavesys.com> References: <417EDA59.8090308@dwavesys.com> Message-ID: <41956231.2090504@liveworld.com> David Grant wrote: > Sorry, another sparse question: Is there anything is scipy which can > find the eigenvalues and eigenvectors of a sparse matrix? Having read all the followups, it seems to me that I must be asking a dumb question -- but won't linalg.svd do this? Begging for education, really... Nick From peter at designtheory.org Sun Nov 14 08:44:39 2004 From: peter at designtheory.org (Peter Dobcsanyi) Date: Sun, 14 Nov 2004 13:44:39 +0000 Subject: [SciPy-user] memory leak in numarray 1.1 Message-ID: <20041114134439.GA18659@designtheory.org> Hi, I have already started a thread on comp.lang.python about this problem but thought would post it here too. Calling the following function with a large enough 'n' causes memory leak. import numarray as N def loop(n, m=100): for i in xrange(n): a = N.zeros((m,m)) N.matrixmultiply(a, a) If the matrixmultiply line is commented out, there is no leak, the program has a stable memory size. With n a few hundreds (say n>=500), the continuous growth of the program's memory footprint is quite noticeable. I am working on Linux (Debian testing), I monitor the memory usage of the program by top. The allocated memory is never released. I have noticed this problem of memory leak in connection with numarray a few months ago in a program of mine. The program processes the incidence matrices of thousands of combinatorial structures read from a file one by one. 
As the number of combinatorial objects went up I started running out of memory even on a machine with 2 Gbyte memory and with small (m<100) matrices. It took me a while to pinpoint that it was caused by matrix multiplication. Following Robert Kern's suggestion I have also tried the development version from CVS and that does not leak. Peter From rkern at ucsd.edu Tue Nov 16 18:09:42 2004 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 16 Nov 2004 15:09:42 -0800 Subject: [SciPy-user] eigenvalues of sparse matrix In-Reply-To: <41956231.2090504@liveworld.com> References: <417EDA59.8090308@dwavesys.com> <41956231.2090504@liveworld.com> Message-ID: <419A88B6.7080002@ucsd.edu> Nick Arnett wrote: > David Grant wrote: > >> Sorry, another sparse question: Is there anything is scipy which can >> find the eigenvalues and eigenvectors of a sparse matrix? > > > Having read all the followups, it seems to me that I must be asking a > dumb question -- but won't linalg.svd do this? > > Begging for education, really... An SVD is not the same as an eigen decomposition. http://mathworld.wolfram.com/EigenDecomposition.html http://mathworld.wolfram.com/SingularValueDecomposition.html In any case, all of the linalg.* functions only operate on dense arrays, not sparse matrices. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From Fernando.Perez at colorado.edu Wed Nov 17 02:39:32 2004 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 17 Nov 2004 00:39:32 -0700 Subject: [SciPy-user] Topical wiki is up Message-ID: <419B0034.8030007@colorado.edu> Hi all, OK, I've put up the full wiki page by basically clobbering the previously existing one (sorry, Janet), as suggested by others. The location is still: http://www.scipy.org/wikis/topical_software/TopicalSoftware This is the previously promised page of links, which can hopefully become a well maintained (community effort, not me!) and updated page where we can refer prospective new users. Right now it's all in one big page, perhaps later it can be split into subpages if it grows too unwieldy. At this point, I hope that the website reorg is coming, as having this stuff hidden in the wiki is not exactly the most appealing for newcomers IMHO. This is as much time as I can commit to this little exercise (it took _way_ more time than I'd expected, as these things always end up doing). I hope the community can run with the ball from here. Best, f From ariciputi at pito.com Wed Nov 17 04:18:20 2004 From: ariciputi at pito.com (Andrea Riciputi) Date: Wed, 17 Nov 2004 10:18:20 +0100 Subject: [SciPy-user] Topical wiki is up In-Reply-To: <419B0034.8030007@colorado.edu> References: <419B0034.8030007@colorado.edu> Message-ID: <9FBA26B5-3879-11D9-8E7A-000A95C0BC68@pito.com> Fernando, it's a very good job! Nevertheless, I'd add to your list PyTables , but I'm not sure if I can edit your wiki... May I? Cheers, Andrea. On 17 Nov 2004, at 08:39, Fernando Perez wrote: > Hi all, > > OK, I've put up the full wiki page by basically clobbering the > previously existing one (sorry, Janet), as suggested by others. The > location is still: > > http://www.scipy.org/wikis/topical_software/TopicalSoftware > > This is the previously promised page of links, which can hopefully > become a well maintained (community effort, not me!) and updated page > where we can refer prospective new users. 
Right now it's all in one > big page, perhaps later it can be split into subpages if it grows too > unwieldy. > > At this point, I hope that the website reorg is coming, as having this > stuff hidden in the wiki is not exactly the most appealing for > newcomers IMHO. > > This is as much time as I can commit to this little exercise (it took > _way_ more time than I'd expected, as these things always end up > doing). I hope the community can run with the ball from here. > > Best, > > f > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > From p.berkes at biologie.hu-berlin.de Wed Nov 17 04:18:24 2004 From: p.berkes at biologie.hu-berlin.de (p.berkes at biologie.hu-berlin.de) Date: Wed, 17 Nov 2004 10:18:24 +0100 (CET) Subject: [SciPy-user] MDP 1.0.0 and symeig 1.0 released In-Reply-To: Message-ID: On Tue, 16 Nov 2004, Pearu Peterson wrote: > On Tue, 16 Nov 2004 p.berkes at biologie.hu-berlin.de wrote: > > > 'symeig' is now distributed also as an independent package: > > > > http://mdp-toolkit.sourceforge.net/symeig.html > > > > Since the LAPACK functions wrapped by 'symeig' seem to be of general > > interest, you might want to consider including it in SciPy. The > > distribution includes the .pyf file used to generate the wrappers, the > > symeig function itself, and a test suite. Tiziano and I are ready to > > help with the inclusion process if needed. 'symeig' is currently released > > under LGPL, but we might change the licence if it conflicts with > > that of SciPy. > > I took a quick look at symeig... its .pyf file uses a bit different > "style" compared to what's in linalg/generic_flapack.pyf and so simple > copy&paste may not be appropriate. I think I'll create signatures > for syevr, heevr, sygv, hegv, sygvd, hegvd, sygvx, hegvx myself > and copy the test suite from symeig if that's alright with you. You're of course welcome to generate a more appropriate .pyf file. You might also have some tips on how to make it more efficient. However, we strongly suggest to keep the default values we have chosen, since some of the values in the LAPACK documentation seem to be wrong. By the way, does anybody else has the same problem with the symeig test as Nils Wagner? thanks, pietro. From rkern at ucsd.edu Wed Nov 17 05:32:16 2004 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 17 Nov 2004 02:32:16 -0800 Subject: [SciPy-user] Topical wiki is up In-Reply-To: <9FBA26B5-3879-11D9-8E7A-000A95C0BC68@pito.com> References: <419B0034.8030007@colorado.edu> <9FBA26B5-3879-11D9-8E7A-000A95C0BC68@pito.com> Message-ID: <419B28B0.7080800@ucsd.edu> Andrea Riciputi wrote: > Fernando, > it's a very good job! > > Nevertheless, I'd add to your list PyTables > , but I'm not > sure if I can edit your wiki... May I? Yes, please. That's the point of putting it in a Wiki after all. Anyone can edit it (although you may have to register a username/password for the site; I forget). -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From ariciputi at pito.com Wed Nov 17 05:48:38 2004 From: ariciputi at pito.com (Andrea Riciputi) Date: Wed, 17 Nov 2004 11:48:38 +0100 Subject: [SciPy-user] Topical wiki is up In-Reply-To: <419B28B0.7080800@ucsd.edu> References: <419B0034.8030007@colorado.edu> <9FBA26B5-3879-11D9-8E7A-000A95C0BC68@pito.com> <419B28B0.7080800@ucsd.edu> Message-ID: <3D079326-3886-11D9-8E7A-000A95C0BC68@pito.com> Done. 
I was just afraid of breaking things or of not beeing allowed to modify it. Cheers, Andrea. On 17 Nov 2004, at 11:32, Robert Kern wrote: > Yes, please. That's the point of putting it in a Wiki after all. > Anyone can edit it (although you may have to register a > username/password for the site; I forget). From rkern at ucsd.edu Wed Nov 17 06:03:43 2004 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 17 Nov 2004 03:03:43 -0800 Subject: [SciPy-user] Topical wiki is up In-Reply-To: <3D079326-3886-11D9-8E7A-000A95C0BC68@pito.com> References: <419B0034.8030007@colorado.edu> <9FBA26B5-3879-11D9-8E7A-000A95C0BC68@pito.com> <419B28B0.7080800@ucsd.edu> <3D079326-3886-11D9-8E7A-000A95C0BC68@pito.com> Message-ID: <419B300F.6060102@ucsd.edu> Andrea Riciputi wrote: > Done. I was just afraid of breaking things or of not beeing allowed to > modify it. Don't worry. You probably can't break anything irretrievably. An edit history is stored, so we can roll back just about anything you do that happens to be wrong/broken/whatever. Thank you for your contribution! And while I'm at it, thank you, Fernando, for getting the ball rolling. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From jmiller at stsci.edu Wed Nov 17 09:04:34 2004 From: jmiller at stsci.edu (Todd Miller) Date: Wed, 17 Nov 2004 09:04:34 -0500 Subject: [SciPy-user] memory leak in numarray 1.1 In-Reply-To: <20041114134439.GA18659@designtheory.org> References: <20041114134439.GA18659@designtheory.org> Message-ID: <1100700274.3311.33.camel@jaytmiller.comcast.net> This post eventually found it's way to me via the help at stsci.edu path and spurred the release of numarray-1.1.1 yesterday. You can get numarray-1.1.1 (which addresses this problem and a number of others) here: http://sourceforge.net/project/showfiles.php?group_id=1369&package_id=32367 As a rule, the best place to post about numarray problems and questions is: numpy-discussion at lists.sf.net Regards, Todd On Sun, 2004-11-14 at 13:44 +0000, Peter Dobcsanyi wrote: > Hi, > > I have already started a thread on comp.lang.python about this problem > but thought would post it here too. > > Calling the following function with a large enough 'n' causes memory > leak. > > import numarray as N > > def loop(n, m=100): > for i in xrange(n): > a = N.zeros((m,m)) > N.matrixmultiply(a, a) > > If the matrixmultiply line is commented out, there is no leak, the > program has a stable memory size. With n a few hundreds (say n>=500), > the continuous growth of the program's memory footprint is quite > noticeable. I am working on Linux (Debian testing), I monitor the memory > usage of the program by top. The allocated memory is never released. > > I have noticed this problem of memory leak in connection with numarray a > few months ago in a program of mine. The program processes the incidence > matrices of thousands of combinatorial structures read from a file one > by one. As the number of combinatorial objects went up I started running > out of memory even on a machine with 2 Gbyte memory and with small > (m<100) matrices. It took me a while to pinpoint that it was caused by > matrix multiplication. > > Following Robert Kern's suggestion I have also tried the development > version from CVS and that does not leak. 
> > Peter > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user From jh at oobleck.astro.cornell.edu Wed Nov 17 10:34:42 2004 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Wed, 17 Nov 2004 10:34:42 -0500 Subject: [SciPy-user] the wikis Message-ID: <200411171534.iAHFYgCF015504@oobleck.astro.cornell.edu> I am cross-posting this to counter some misconceptions about the ASP wikis, including the Topical one. First, I am *thrilled* to see the new Topical page put up by Fernando! The amount and quality of the content is just amazing. The page-of-pages idea wasn't making anyone happy, so it's good to see it go. Second, several people have expressed hesitation in editing the wiki, since they didn't put it up. Folks, it's a *WIKI*! You're *supposed* to edit it! Add, add, add content! All you need is a scipy.org account, which you can make for yourself following the link in the title bar. There is history, so if someone makes a change that isn't popular, we can roll back the change and no harm done. If you want to make major changes to the structure or content posted by others, you can and probably should ask on these lists. But do ask! If it's a good idea, others will agree with you. So, if you are distributing a package and it's not listed, please go add it. Make a scipy.org account, login, go to the wiki, click on the "edit" tab at the top (it only appears if you are logged in), and go to it. There is a link at the top that has more info if you're not sure how to do it. If you're really confused, please suggest improvements to the help document on scipy-dev. And thanks again to Fernando for a lot of work! --jh-- From swisher at enthought.com Wed Nov 17 11:52:08 2004 From: swisher at enthought.com (Janet Swisher) Date: Wed, 17 Nov 2004 10:52:08 -0600 Subject: [SciPy-user] RE: the wikis In-Reply-To: <200411171534.iAHFYgCF015504@oobleck.astro.cornell.edu> Message-ID: <003801c4ccc5$c95bd980$ab01a8c0@SWISHER> > -----Original Message----- > From: scipy-user-bounces at scipy.net > [mailto:scipy-user-bounces at scipy.net] On Behalf Of Joe Harrington > Second, several people have expressed hesitation in editing > the wiki, since they didn't put it up. Folks, it's a *WIKI*! > You're *supposed* to edit it! Add, add, add content! All > you need is a scipy.org account, which you can make for > yourself following the link in the title bar. There is > history, so if someone makes a change that isn't popular, we > can roll back the change and no harm done. I second what Joe said. To edit the wiki, you just need to be logged in as a registered user of scipy.org, which requires only an email address. This is intended as a minor deterrent to wiki-spammers, since they could set up a login if they really wanted to, but will probably just move along to a completely open wiki. Please don't let it deter you, as community members, from contributing content. The spirit of wiki development is *community* ownership of content, rather than individual ownership. So please don't worry too much about trodding on anyone's toes (least of all, site managers like me). If you have content that you *do* want to personally control, you can create a page in your user folder on scipy.org (or on your own site), and then link to it from the wiki. Such pages are not part of the wiki, and therefore are editable only by their owners (and by site managers). If you have questions about any of this, please ask me. 
If I don't know the answer, I'll try to find it. And if I find it, I'll try to add it to the site for the benefit of others. Enjoy! --Janet From Fernando.Perez at colorado.edu Wed Nov 17 12:04:12 2004 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 17 Nov 2004 10:04:12 -0700 Subject: [SciPy-user] Topical wiki is up In-Reply-To: <9FBA26B5-3879-11D9-8E7A-000A95C0BC68@pito.com> References: <419B0034.8030007@colorado.edu> <9FBA26B5-3879-11D9-8E7A-000A95C0BC68@pito.com> Message-ID: <419B848C.9090305@colorado.edu> Andrea Riciputi wrote: > Fernando, > it's a very good job! > > Nevertheless, I'd add to your list PyTables > , but I'm not > sure if I can edit your wiki... May I? Oops, I forgot pytables! Yes, by all means go ahead and edit it, that's the point of a wiki. Note that you need a scipy.org login, which you can create yourself, before being allowed to edit (I think). Best, f From Fernando.Perez at colorado.edu Wed Nov 17 13:11:13 2004 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 17 Nov 2004 11:11:13 -0700 Subject: [SciPy-user] Re: [SciPy-dev] the wikis In-Reply-To: <200411171534.iAHFYgCF015504@oobleck.astro.cornell.edu> References: <200411171534.iAHFYgCF015504@oobleck.astro.cornell.edu> Message-ID: <419B9441.7030603@colorado.edu> Joe Harrington schrieb: > I am cross-posting this to counter some misconceptions about the ASP > wikis, including the Topical one. > > First, I am *thrilled* to see the new Topical page put up by Fernando! > The amount and quality of the content is just amazing. The > page-of-pages idea wasn't making anyone happy, so it's good to see it > go. Thanks for the kind comments, I appreciate it (likewise from others who expressed this). Most of that is cut/paste from all the original projects, but it did take quite a bit more time than I had originally planned. But since it seems to be triggering community activity, it was all worth it :) > Second, several people have expressed hesitation in editing the wiki, > since they didn't put it up. Folks, it's a *WIKI*! You're *supposed* > to edit it! Add, add, add content! All you need is a scipy.org > account, which you can make for yourself following the link in the > title bar. There is history, so if someone makes a change that isn't > popular, we can roll back the change and no harm done. If you want to > make major changes to the structure or content posted by others, you > can and probably should ask on these lists. But do ask! If it's a > good idea, others will agree with you. I was one of the guilty hesitant ones initially. I guess it's just a change of thinking mode, and old habits and reflexes die hard :) But we seem to be getting into the swing of it, so I'm sure momentum will pick up. Another topic discussed at the ASP BOF/mailings was a bit of a site 'refreshing', beyond the default Plone look. I don't really mind the visuals too much, and I certainly don't expect anyone to put a lot of time into this. But the current site, at least for me, has a real usability problem: the three column layout means that the center column, where the main contents lives, is always very small! And that has incredibly annoying effects. For example, when I browse the mailing list archives, messages are bunched up inside these little boxes and I have to scroll left and right on _every line_ to read them! This happens even if I maximize my browser to full screen. 
Note that I work on a 1600x1200 monitor, but I have X11 correctly calibrated to use quite a few pixels for font rendering, so in terms of characters, I can fit roughly 130 characters wide in the default fixed-width font I have for mozilla. After the left and right navbars have eaten up 2/3rds of my screen, there is simply not enough space left in the center to show much at all. If the above is not clear, I can provide a screenshot of the problem. This same issue makes it very unpleasant to edit the wiki, as the edit box, is also very small, in the middle of the screen. Hopefully at some point this can be addressed without it requiring too much time. Best, f From swisher at enthought.com Wed Nov 17 13:54:47 2004 From: swisher at enthought.com (Janet Swisher) Date: Wed, 17 Nov 2004 12:54:47 -0600 Subject: [SciPy-user] Scipy.org layout In-Reply-To: <419B9441.7030603@colorado.edu> Message-ID: <003901c4ccd6$e90d09f0$ab01a8c0@SWISHER> > -----Original Message----- > From: Fernando Perez > Another topic discussed at the ASP BOF/mailings was a bit of a site > 'refreshing', beyond the default Plone look. I don't really > mind the visuals too much, and I certainly don't expect anyone to > put a lot of time into this. > But the current site, at least for me, has a real usability > problem: the three column layout means that the center column, where the > main contents lives, is always very small! And that has incredibly > annoying effects. > > For example, when I browse the mailing list archives, > messages are bunched up inside these little boxes and I have to > scroll left and right on _every line_ to read them! This happens > even if I maximize my browser to full screen. > After the left and right navbars have eaten up 2/3rds of my screen, > there is simply not enough space left in the center to show > much at all. If the above is not clear, I can provide a screenshot > of the problem. This same issue makes it very unpleasant to edit > the wiki, as the edit box, is also very small, in the middle of the screen. > > Hopefully at some point this can be addressed without it > requiring too much time. Thanks for the feedback. I also have a list of suggestions from Russell Owen that I haven't taken the time to implement yet. I've removed the right column from the wiki area and the mailing list area. This should require a bit less scrolling. If it's still hard to use, please do send me a screen shot. If there are other areas that could benefit from this change, send me your suggestions. I don't think the right-hand slots (news, recent, and calendar) are really necessary below the top level of pages. --Janet From david at dwavesys.com Wed Nov 17 14:12:48 2004 From: david at dwavesys.com (David Grant) Date: Wed, 17 Nov 2004 11:12:48 -0800 Subject: [SciPy-user] the wikis In-Reply-To: <200411171534.iAHFYgCF015504@oobleck.astro.cornell.edu> References: <200411171534.iAHFYgCF015504@oobleck.astro.cornell.edu> Message-ID: <419BA2B0.2010007@dwavesys.com> I think this wiki should be transformed into wiki format. I'm new to the zope/zwiki format, but I tried converting the first section. What is the best reference out there which explains zwiki syntax? Joe Harrington wrote: >I am cross-posting this to counter some misconceptions about the ASP >wikis, including the Topical one. > >First, I am *thrilled* to see the new Topical page put up by Fernando! >The amount and quality of the content is just amazing. The >page-of-pages idea wasn't making anyone happy, so it's good to see it >go. 
> >Second, several people have expressed hesitation in editing the wiki, >since they didn't put it up. Folks, it's a *WIKI*! You're *supposed* >to edit it! Add, add, add content! All you need is a scipy.org >account, which you can make for yourself following the link in the >title bar. There is history, so if someone makes a change that isn't >popular, we can roll back the change and no harm done. If you want to >make major changes to the structure or content posted by others, you >can and probably should ask on these lists. But do ask! If it's a >good idea, others will agree with you. > >So, if you are distributing a package and it's not listed, please go >add it. Make a scipy.org account, login, go to the wiki, click on the >"edit" tab at the top (it only appears if you are logged in), and go >to it. There is a link at the top that has more info if you're not >sure how to do it. If you're really confused, please suggest >improvements to the help document on scipy-dev. > >And thanks again to Fernando for a lot of work! > >--jh-- > >_______________________________________________ >SciPy-user mailing list >SciPy-user at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-user > > > From Fernando.Perez at colorado.edu Wed Nov 17 14:17:57 2004 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 17 Nov 2004 12:17:57 -0700 Subject: [SciPy-user] the wikis In-Reply-To: <419BA2B0.2010007@dwavesys.com> References: <200411171534.iAHFYgCF015504@oobleck.astro.cornell.edu> <419BA2B0.2010007@dwavesys.com> Message-ID: <419BA3E5.2080107@colorado.edu> David Grant schrieb: > I think this wiki should be transformed into wiki format. I'm new to > the zope/zwiki format, but I tried converting the first section. What > is the best reference out there which explains zwiki syntax? I wrote it up in html because it was a large page of links, and using mozilla composer made the whole process *FAR* more efficient. I was extremely happy to see that I could just drop the page into the wiki by pasting the raw html and that the wiki just did the right thing. I guess I don't care about reformattings, but please be careful: I noticed that the table of contents was already all mangled up from what I put up yesterday (I don't know if this was the product of your changes or someone else). I am not going to babysit that page, but if anyone is going to do wholesale internal reformattings, at least make sure that the final result still works. Otherwise, "if it ain't broke, don't fix it"! If those internal 'fixes' are just going to break stuff, just leave the html alone, please. It's not exactly a hard format to write (and I say this as someone who has written about 3 webpages in his whole life, all of them plain text, so I'm not exactly a web design expert). Cheers, f ps. These comments are general, not necessarily directed to David, so please don't take any of it personally. From Fernando.Perez at colorado.edu Wed Nov 17 14:30:32 2004 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 17 Nov 2004 12:30:32 -0700 Subject: [SciPy-user] Scipy.org layout In-Reply-To: <003901c4ccd6$e90d09f0$ab01a8c0@SWISHER> References: <003901c4ccd6$e90d09f0$ab01a8c0@SWISHER> Message-ID: <419BA6D8.1030206@colorado.edu> Janet Swisher schrieb: > I've removed the right column from the wiki area and the mailing list area. > This should require a bit less scrolling. If it's still hard to use, please > do send me a screen shot. If there are other areas that could benefit from > this change, send me your suggestions. 
I don't think the right-hand slots > (news, recent, and calendar) are really necessary below the top level of > pages. Great improvement, Janet, many thanks. I think the documentation area could use the same change. There are also code snippets in there which suffer the same 'squished in a center narrow box' problem that emails and the wiki had. As I've said before, I'm not a website expert. But the default plone layout seems really problematic. I've put up a screenshot at: http://amath.colorado.edu/faculty/fperez/tmp/scipy_main.png so as not to clobber the list with a huge attachment. You can see that the central column is so narrow as to be next to useless. This is with a font layout and browser size which works for pretty much 99% of the websites I visit. Again, I don't know how easy this would be to improve, but at least I'll drop my comments in. At least removing the third column from most of the site (except perhaps the front page) would be already a big help, even if it still requires window maximization. Again, many thanks for your attention to this. Regards, f From narnett at mccmedia.com Wed Nov 17 14:41:13 2004 From: narnett at mccmedia.com (Nick Arnett) Date: Wed, 17 Nov 2004 11:41:13 -0800 Subject: [SciPy-user] Sparse v. dense matrix, SVD and LSI-like analysis In-Reply-To: <419A88B6.7080002@ucsd.edu> References: <417EDA59.8090308@dwavesys.com> <41956231.2090504@liveworld.com> <419A88B6.7080002@ucsd.edu> Message-ID: <419BA959.3080604@mccmedia.com> Robert Kern wrote: > An SVD is not the same as an eigen decomposition. > > http://mathworld.wolfram.com/EigenDecomposition.html > http://mathworld.wolfram.com/SingularValueDecomposition.html That dawned on me not too long after posting. Fuzzy brain this week, a birth and a death in the family a few days apart. > In any case, all of the linalg.* functions only operate on dense arrays, > not sparse matrices. Ummm, now I'm thinking I'm in over my head. SVD is a matrix operation isn't it? I was imagining that an array of mostly zeros would be the equivalent of a sparse matrix, so that SVD would apply to it for what I'm doing, which is related to latent semantic analysis. Am I way off-base here? My understanding of Scipy isn't great, compounded by a less-than-great grasp of the mathematics. Learning as go, in large part. Any help gratefully accepted. Nick From oliphant at ee.byu.edu Wed Nov 17 14:53:43 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 17 Nov 2004 12:53:43 -0700 Subject: [SciPy-user] Sparse v. dense matrix, SVD and LSI-like analysis In-Reply-To: <419BA959.3080604@mccmedia.com> References: <417EDA59.8090308@dwavesys.com> <41956231.2090504@liveworld.com> <419A88B6.7080002@ucsd.edu> <419BA959.3080604@mccmedia.com> Message-ID: <419BAC47.4020901@ee.byu.edu> Nick Arnett wrote: > Robert Kern wrote: > >> An SVD is not the same as an eigen decomposition. >> >> http://mathworld.wolfram.com/EigenDecomposition.html >> http://mathworld.wolfram.com/SingularValueDecomposition.html > > > That dawned on me not too long after posting. Fuzzy brain this week, > a birth and a death in the family a few days apart. > >> In any case, all of the linalg.* functions only operate on dense >> arrays, not sparse matrices. > > > Ummm, now I'm thinking I'm in over my head. SVD is a matrix operation > isn't it? I was imagining that an array of mostly zeros would be the > equivalent of a sparse matrix, so that SVD would apply to it for what > I'm doing, which is related to latent semantic analysis. 
The SVD of A is constructed using the eigen decompostion of A A^H and A^H A which have the same real-valued eigenvalues (though one may have some zero-valued eigenvalues the other doesn't) and unitary eigenvector matrices: The decompostion provides A = U D V^H where U U^H =I and V V^H = I so that AA^H = U D^2 U^H and A^H A = V D^2 V^H showing that U are the eigenvectors of A A^H corresponding to non-zero eigenvalues V are the eigenvectors of A^H A corresponding to non-zero eigenvalues D^2 are the non-zero eigenvalues of both So, while the SVD is not an eigenvector decomposition it is related to one. -Travis From david at dwavesys.com Wed Nov 17 15:01:04 2004 From: david at dwavesys.com (David Grant) Date: Wed, 17 Nov 2004 12:01:04 -0800 Subject: [SciPy-user] the wikis In-Reply-To: <419BA3E5.2080107@colorado.edu> References: <200411171534.iAHFYgCF015504@oobleck.astro.cornell.edu> <419BA2B0.2010007@dwavesys.com> <419BA3E5.2080107@colorado.edu> Message-ID: <419BAE00.6030808@dwavesys.com> Fernando Perez wrote: > David Grant schrieb: > >> I think this wiki should be transformed into wiki format. I'm new to >> the zope/zwiki format, but I tried converting the first section. >> What is the best reference out there which explains zwiki syntax? > > > I wrote it up in html because it was a large page of links, and using > mozilla composer made the whole process *FAR* more efficient. I was > extremely happy to see that I could just drop the page into the wiki > by pasting the raw html and that the wiki just did the right thing. > > I guess I don't care about reformattings, but please be careful: I > noticed that the table of contents was already all mangled up from > what I put up yesterday (I don't know if this was the product of your > changes or someone else). That was me, sorry. It looked way worse before I partially fixed my own reformattings. > I am not going to babysit that page, but if anyone is going to do > wholesale internal reformattings, at least make sure that the final > result still works. Otherwise, "if it ain't broke, don't fix it"! If > those internal 'fixes' are just going to break stuff, just leave the > html alone, please. It's not exactly a hard format to write (and I say > this as someone who has written about 3 webpages in his whole life, > all of them plain text, so I'm not exactly a web design expert). I personally prefer if everything is in wiki format as it is a hell of a lot easier to edit. Wiki formatting should be able to handle anything HTML can do, and if not, then HTML can be used for those specific parts. I'm surprised zwiki is so complicated. I've edited using Twiki and wikipedia many times and it is much more straightforward (for headings for example). There doesn't seem to be any link on the scipy page for "formatting rules" as there are in twiki and wikipedia. I managed to find some docs but they weren't very helpful. If anyone knows any links which explain the formatting plainly and clearly with examples (for headings for example). > > Cheers, > > f > > ps. These comments are general, not necessarily directed to David, so > please don't take any of it personally. > > _______________________________________________ > SciPy-user mailing list > SciPy-user at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-user > -- David J. Grant Scientific Officer Intellectual Property D-Wave Systems Inc. 
tel: 604.732.6604 fax: 604.732.6614 ************************** CONFIDENTIAL COMMUNICATION This electronic transmission, and any documents attached hereto, is confidential. The information is intended only for use by the recipient named above. If you have received this electronic message in error, please notify the sender and delete the electronic message. Any disclosure, copying, distribution, or use of the contents of information received in error is strictly prohibited. -------------- next part -------------- A non-text attachment was scrubbed... Name: david.vcf Type: text/x-vcard Size: 334 bytes Desc: not available URL: From swisher at enthought.com Wed Nov 17 14:59:52 2004 From: swisher at enthought.com (Janet Swisher) Date: Wed, 17 Nov 2004 13:59:52 -0600 Subject: [SciPy-user] the wikis In-Reply-To: <419BA2B0.2010007@dwavesys.com> Message-ID: <003a01c4cce0$03736010$ab01a8c0@SWISHER> > -----Original Message----- > From: David Grant > > I think this wiki should be transformed into wiki format. I'm new to > the zope/zwiki format, but I tried converting the first > section. What > is the best reference out there which explains zwiki syntax? The Topical Software wiki uses Zwiki Structured Text, explained at: http://www.zope.org/Members/jim/StructuredTextWiki/TextFormattingRules Note that we have disabled automatic linking of "CamelCase" words. One feature of this format is that you can intermingle ST formatting and raw HTML, which is why Fernando could just paste his HTML in, and it worked. However, as you can see, "intermingling" ST bullets and HTML
    's can cause some unexpected results. On my list of things to figure out is how we managed to create one wiki on scipy.org where users can change the page format (SciPy '04 Conference), but the rest of the wikis default to ST and can't be changed. In theory, Zwiki also supports reStructured Text, pure HTML, and plain text as page formats. For more detail, see: http://zwiki.org/TextFormattingRules --Janet From narnett at mccmedia.com Wed Nov 17 15:04:40 2004 From: narnett at mccmedia.com (Nick Arnett) Date: Wed, 17 Nov 2004 12:04:40 -0800 Subject: [SciPy-user] Sparse v. dense matrix, SVD and LSI-like analysis In-Reply-To: <419BAC47.4020901@ee.byu.edu> References: <417EDA59.8090308@dwavesys.com> <41956231.2090504@liveworld.com> <419A88B6.7080002@ucsd.edu> <419BA959.3080604@mccmedia.com> <419BAC47.4020901@ee.byu.edu> Message-ID: <419BAED8.2020709@mccmedia.com> Travis Oliphant wrote: > So, while the SVD is not an eigenvector decomposition it is related to one. Right... but I still don't understand the statement, "In any case, all of the linalg.* functions only operate on dense arrays, not sparse matrices." Why would linalg.svd not operate on a sparse matrix? I was working from the example (below) from your Scipy tutorial, in fact. Would the results not be meaningful if the matrix is sparse? >>>> A = mat('[1 3 2; 1 2 3]') >>>> M,N = A.shape >>>> U,s,Vh = linalg.svd(A) >>>> Sig = mat(diagsvd(s,M,N)) >>>> U, Vh = mat(U), mat(Vh) >>>> print U > Matrix([[-0.7071, -0.7071], > [-0.7071, 0.7071]]) >>>> print Sig > Matrix([[ 5.1962, 0. , 0. ], > [ 0. , 1. , 0. ]]) >>>> print Vh > Matrix([[-0.2722, -0.6804, -0.6804], > [-0. , -0.7071, 0.7071], > [-0.9623, 0.1925, 0.1925]]) >>>> print A > Matrix([[1, 3, 2], > [1, 2, 3]]) >>>> print U*Sig*Vh > Matrix([[ 1., 3., 2.], > [ 1., 2., 3.]]) Thanks! Nick From perry at stsci.edu Wed Nov 17 15:12:07 2004 From: perry at stsci.edu (Perry Greenfield) Date: Wed, 17 Nov 2004 15:12:07 -0500 Subject: [SciPy-user] the wikis In-Reply-To: <003a01c4cce0$03736010$ab01a8c0@SWISHER> References: <003a01c4cce0$03736010$ab01a8c0@SWISHER> Message-ID: While the topical wiki is being discussed, I thought some of the comments that Russell Owen made a while back are worth bringing up again. I don't recall any discussion about these. I thought he made some good points, particularly about the visibility of these pages and the names associated with them (something other than 'topical") >> >> Do you have suggestions about where else these items could be linked >> that would be more visible? > > Since you ask, here are my impressions and suggestions. > > First the impressions: > - - From the SciPy home page, there is no clue that this site has any > general "science with Python" information. It just seems to be a site > about the SciPy software project. > - - One has to first go to "Other Wikis" and then look way > down through > a page that is mostly about SciPy to find "Topical Software". > - - "Topical Software" is not evocative. > - - Burying it so deeply and not mentioning it on the home page give > the impression that it is very much an afterthoght and not integrated > into the site. > > Suggestions: > - - Rename "Topcial Software" to something like "Python Tools > for Science". > - - Explicitly mention "Python Tools for Science" in the text on the > home page. Try to briefly explain that it exists and perhaps WHY it > exists. Just a sentence or two should do it. *Talking* about it and > not just > > Further cleanups to consider: > - - Consolidate some of the items in Navigation. 
> - - Get rid of "Wikis" and those pages to other, more > obvious, sub-headings. > - - SciPy should be a recognized link on every scipy wiki (to > the SciPy > home page). > > The navigation bar might look like this (with changes noted > under each topic): > > Navigation: > About SciPy > - - no change (except as mentioned for the body text above) > Documentation: > - - includes "Wikis" links: "Troubleshooting", "SciPy User's Guide > Wiki" and "Help with SciPy.org Wikis" > this is touchy and may not fully work because I realize you have > non-wiki versions of the docs as well, > but I have seen success with offering users both versions > side-by-side (let them choose > if they want docs cluttered up with user comments or not) > Development: > - - include CVS and get rid of the CVS link (i.e. bury the > CVS page one > level deeper) > Bug Tracker: > - - include the standard navigation panel that we are > discussing Mailing Lists > - - no change > Science with Python > - - This is the "Topical Software" page, renamed and brought > up one level > > Anyway, there are my 2 bits, such as they are. Best of luck with the > site and the project. > > - -- Russell From pearu at scipy.org Wed Nov 17 15:31:11 2004 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 17 Nov 2004 14:31:11 -0600 (CST) Subject: [SciPy-user] Sparse v. dense matrix, SVD and LSI-like analysis In-Reply-To: <419BAED8.2020709@mccmedia.com> References: <417EDA59.8090308@dwavesys.com> <41956231.2090504@liveworld.com> <419A88B6.7080002@ucsd.edu> <419BA959.3080604@mccmedia.com> <419BAC47.4020901@ee.byu.edu> <419BAED8.2020709@mccmedia.com> Message-ID: On Wed, 17 Nov 2004, Nick Arnett wrote: > Travis Oliphant wrote: > >> So, while the SVD is not an eigenvector decomposition it is related to one. > > Right... but I still don't understand the statement, "In any case, all of the > linalg.* functions only operate on dense arrays, not sparse matrices." The statement means that the current *implementation* of linalg functions cannot operate on sparse matrices. Mathematically there is no difference between sparse or dense matrices. Sparse or dense are notions of matrix representation. > Why would linalg.svd not operate on a sparse matrix? I was working from the > example (below) from your Scipy tutorial, in fact. Would the results not be > meaningful if the matrix is sparse? Of course they are. Just with the current implementation one must transform a sparse matrix to a full matrix (that may contain lots of zeros) before applying linalg functions. I believe that in future linalg functions will support sparse matrices as well. Pearu From joneric at enthought.com Wed Nov 17 15:33:45 2004 From: joneric at enthought.com (Jon-Eric Steinbomer) Date: Wed, 17 Nov 2004 14:33:45 -0600 Subject: [SciPy-user] Re: Scipy.org layout In-Reply-To: <20041117200019.D6FCB3EB85@www.scipy.com> References: <20041117200019.D6FCB3EB85@www.scipy.com> Message-ID: <419BB5A9.2030204@enthought.com> Hi Fernando, There was a problem with the layout of the slots on the right of the page that was causing them to render larger than default. This has been fixed and should afford more room to the page content where these slots are displayed. Please let either Janet or myself know if you observe any other problems with the layout. Regards, Jon-Eric Steinbomer -----Original Message----- > From: Fernando Perez Great improvement, Janet, many thanks. I think the documentation area could use the same change. 
There are also code snippets in there which suffer the same 'squished in a center narrow box' problem that emails and the wiki had. As I've said before, I'm not a website expert. But the default plone layout seems really problematic. I've put up a screenshot at: http://amath.colorado.edu/faculty/fperez/tmp/scipy_main.png so as not to clobber the list with a huge attachment. You can see that the central column is so narrow as to be next to useless. This is with a font layout and browser size which works for pretty much 99% of the websites I visit. Again, I don't know how easy this would be to improve, but at least I'll drop my comments in. At least removing the third column from most of the site (except perhaps the front page) would be already a big help, even if it still requires window maximization. Again, many thanks for your attention to this. Regards, f From narnett at mccmedia.com Wed Nov 17 16:06:32 2004 From: narnett at mccmedia.com (Nick Arnett) Date: Wed, 17 Nov 2004 13:06:32 -0800 Subject: [SciPy-user] Sparse v. dense matrix, SVD and LSI-like analysis In-Reply-To: References: <417EDA59.8090308@dwavesys.com> <41956231.2090504@liveworld.com> <419A88B6.7080002@ucsd.edu> <419BA959.3080604@mccmedia.com> <419BAC47.4020901@ee.byu.edu> <419BAED8.2020709@mccmedia.com> Message-ID: <419BBD58.1060008@mccmedia.com> Pearu Peterson wrote: > Of course they are. Just with the current implementation one must > transform a sparse matrix to a full matrix (that may contain lots of > zeros) before applying linalg functions. Okay, now I get it -- I was imagining that there was some mathematical difference between dense and sparse matrices that I wasn't aware of... What we're talking about is the representation, if I understand correctly. And... whew. Nick From Fernando.Perez at colorado.edu Wed Nov 17 16:44:08 2004 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 17 Nov 2004 14:44:08 -0700 Subject: [SciPy-user] Re: Scipy.org layout In-Reply-To: <419BB5A9.2030204@enthought.com> References: <20041117200019.D6FCB3EB85@www.scipy.com> <419BB5A9.2030204@enthought.com> Message-ID: <419BC628.1090900@colorado.edu> Jon-Eric Steinbomer schrieb: > Hi Fernando, > > There was a problem with the layout of the slots on the right of the page > that was causing them to render larger than default. This has been fixed > and should afford more room to the page content where these slots are > displayed. Please let either Janet or myself know if you observe any other > problems with the layout. Much better, thanks. A minor nit: is it possible to have the blue code boxes line their text inside of them flush left by default? Look at how they open by default: http://amath.colorado.edu/faculty/fperez/tmp/scipy_plottut.png all the extra left space causes unnecessary scroll bars to appear. Perhaps a tiny bit of whitespace is good for aesthetic reasons, but one or two fixed-width spaces should be enough, I think. At any rate, many thanks for this. Even these small changes (without the need for a huge site overhaul) already make a big usability difference. Regards, f From Jaap.Leguijt at shell.com Wed Nov 17 17:33:09 2004 From: Jaap.Leguijt at shell.com (Leguijt, Jaap J SIEP-EPT-RES) Date: Wed, 17 Nov 2004 23:33:09 +0100 Subject: [SciPy-user] MatPy or SciPy Message-ID: <41290F4363795A4EA9EFE8EF7FB904DDE7E11A@Rijpat-s-340.europe.shell.com> Hello, Recently I started looking at python. As a Matlab user I wondered whether python is a good or better alternative to Matlab as a prototyping tool. 
On of the things I like about python is the fact that it is open and it has a very nice object model. I found a package called "MatPy" that seems to cover, in python, the functionality of Matlab. I tried to install MatPy and that was not easy. It needed additional packages and the most difficult to get properly installed was cephes. After some time I managed to install MatPy but then I found out that it failed many of its standard tests. The BioSimGrid organisation has a nice reference manual that list the packages needed to install MatPy. In this manual I found a reference to SciPy and to me it seems to offer more functionality than MatPy. I don't have a full overview yet of the functionality offered by the various packages but is it correct to assume that SciPy covers the functionality of Matpy. If this is true, I can give up my struggle with Matpy, which seems to be an inactive project anyhow. Regards, Jaap Leguijt -------------- next part -------------- An HTML attachment was scrubbed... URL: From gazzar at email.com Wed Nov 17 19:50:39 2004 From: gazzar at email.com (Gary Ruben) Date: Thu, 18 Nov 2004 10:50:39 +1000 Subject: [SciPy-user] MatPy or SciPy Message-ID: <20041118005039.D023B1CE306@ws1-6.us4.outblaze.com> Hi Jaap, It really depends on what you're doing with Matlab. It also sounds like the new wiki page is your best starting point: My suggested starting point would be to just install and read the documentation of both the Numeric and numarray packages. I'd start with Numeric but, as you learn more, you may find that numarray is a better fit for your applications. It would be better to get familiar with what you can do with Numeric first since it's higher quality documentation will get you familiar with Python quicker than struggling with working out SciPy. It sounds like you're mainly dealing with matrices and you may find all the functionality is already available in Numeric. Scipy extends this basic functionality enormously, but it goes beyond just dealing with matrices. Definitely have a browse through the SciPy documentation, but it might be better to do this after having browsed the Numeric documentation. Also, have a look at the other packages mentioned in the "Science---basic tools" section of the wiki page. I'd also suggest looking at the matlab interface to the matplotlib package, which may provide you with some familiar-looking functionality: Gary R. ----- Original Message ----- From: "Leguijt, Jaap J SIEP-EPT-RES" To: scipy-user at scipy.net Subject: [SciPy-user] MatPy or SciPy Date: Wed, 17 Nov 2004 23:33:09 +0100 > > Hello, > > Recently I started looking at python. As a Matlab user I wondered whether python is a good or better alternative to Matlab as a prototyping tool. On of the things I like about python is the fact that it is open and it has a very nice object model. I found a package called "MatPy" that seems to cover, in python, the functionality of Matlab. I tried to install MatPy and that was not easy. It needed additional packages and the most difficult to get properly installed was cephes. After some time I managed to install MatPy but then I found out that it failed many of its standard tests. The BioSimGrid organisation has a nice reference manual that list the packages needed to install MatPy. In this manual I found a reference to SciPy and to me it seems to offer more functionality than MatPy. 
I don't have a full overview yet of the functionality offered by the various packages but is it correct to assume that SciPy covers the functionality of Matpy. If this is true, I can give up my struggle with Matpy, which seems to be an inactive project anyhow. > > Regards, > > Jaap Leguijt > -- ___________________________________________________________ Sign-up for Ads Free at Mail.com http://promo.mail.com/adsfreejump.htm From oliphant at ee.byu.edu Wed Nov 17 20:06:04 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 17 Nov 2004 18:06:04 -0700 Subject: [SciPy-user] Sparse v. dense matrix, SVD and LSI-like analysis In-Reply-To: <419BAED8.2020709@mccmedia.com> References: <417EDA59.8090308@dwavesys.com> <41956231.2090504@liveworld.com> <419A88B6.7080002@ucsd.edu> <419BA959.3080604@mccmedia.com> <419BAC47.4020901@ee.byu.edu> <419BAED8.2020709@mccmedia.com> Message-ID: <419BF57C.8030706@ee.byu.edu> Nick Arnett wrote: > Travis Oliphant wrote: > >> So, while the SVD is not an eigenvector decomposition it is related >> to one. > > > Right... but I still don't understand the statement, "In any case, all > of the linalg.* functions only operate on dense arrays, not sparse > matrices." > > Why would linalg.svd not operate on a sparse matrix? I was working > from the example (below) from your Scipy tutorial, in fact. Would the > results not be meaningful if the matrix is sparse? Sparse matrices and dense matrices are very different objects and linalg.svd is a wrapper around LAPACK which only works on dense matrices. Getting all linear algebra operations working for sparse matrices is a very tall order and has not been done yet. -Travis From oliphant at ee.byu.edu Wed Nov 17 20:12:50 2004 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 17 Nov 2004 18:12:50 -0700 Subject: [SciPy-user] MatPy or SciPy In-Reply-To: <41290F4363795A4EA9EFE8EF7FB904DDE7E11A@Rijpat-s-340.europe.shell.com> References: <41290F4363795A4EA9EFE8EF7FB904DDE7E11A@Rijpat-s-340.europe.shell.com> Message-ID: <419BF712.4040300@ee.byu.edu> Leguijt, Jaap J SIEP-EPT-RES wrote: > Hello, > > Recently I started looking at python. As a Matlab user I wondered > whether python is a good or better alternative to Matlab as a > prototyping tool. On of the things I like about python is the fact > that it is open and it has a very nice object model. I found a package > called "MatPy" that seems to cover, in python, the functionality of > Matlab. I tried to install MatPy and that was not easy. It needed > additional packages and the most difficult to get properly installed > was cephes. After some time I managed to install MatPy but then I > found out that it failed many of its standard tests. The BioSimGrid > organisation has a nice reference manual that list the packages needed > to install MatPy. In this manual I found a reference to SciPy and to > me it seems to offer more functionality than MatPy. I don't have a > full overview yet of the functionality offered by the various packages > but is it correct to assume that SciPy covers the functionality of > Matpy. If this is true, I can give up my struggle with Matpy, > which seems to be an inactive project anyhow. > I recommend installing Numeric and start learning about it's array object and Matrix object. Then don't hesitate to install SciPy as it provides a large number of functions and classes for working with the Numeric array object. MatPy was an older project to add a Matrix Object that didn't seem to recognize that there was one already available in Numeric. 
If you install Numeric and then SciPy (needs Numeric anyway), then there are many on this list who can help you make the transistion... As an old Matlab user myself, I may be able to answer your transition questions. -Travis Oliphant From alexey.goldin at gmail.com Wed Nov 17 21:11:15 2004 From: alexey.goldin at gmail.com (Alexey Goldin) Date: Wed, 17 Nov 2004 18:11:15 -0800 Subject: [SciPy-user] SciPy cookbook Message-ID: I think at SciPy 2004 it was desided to have some kind of "cookbook" for scipy, something like ASPN cookbok at http://aspn.activestate.com/ASPN/Python/Cookbook/ , but with emphasis on scientific data processing. I have in mind mostly simple illustrative one page examples for people new to Python (not necessarily utilizing scipy), rather then complicated scripts. Is there some preferred place on SciPy site? Are SciPy wiki's pages friendly to Python source code? From nwagner at mecha.uni-stuttgart.de Thu Nov 18 03:48:54 2004 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 18 Nov 2004 09:48:54 +0100 Subject: [SciPy-user] Sparse matrices .todense(Complex) Message-ID: <419C61F6.3040006@mecha.uni-stuttgart.de> Hi all, Thanks a lot for the latest fixes with respect to handling of complex matrices by iterative solvers. One thing, which doesn't work is the change from sparse to dense matrices in case of complex matrices. >>> print A (0, 0) 0.809662640095 (0, 1) (0.880416512489+0.50495326519j) (1, 0) (0.880416512489-0.50495326519j) (1, 1) 0.733849585056 (1, 2) (0.907823622227+0.290321737528j) (2, 1) (0.907823622227-0.290321737528j) (2, 2) 0.300018787384 >>> A.todense(Complex) array([[ 0.80966264, 0.88041651, 0. ], [ 0.88041651, 0.73384959, 0.90782362], [ 0. , 0.90782362, 0.30001879]]) >>> Nils From travis at enthought.com Thu Nov 18 11:35:08 2004 From: travis at enthought.com (Travis N. Vaught) Date: Thu, 18 Nov 2004 10:35:08 -0600 Subject: [SciPy-user] Re: Scipy.org layout In-Reply-To: <419BC628.1090900@colorado.edu> References: <20041117200019.D6FCB3EB85@www.scipy.com> <419BB5A9.2030204@enthought.com> <419BC628.1090900@colorado.edu> Message-ID: <419CCF3C.50602@enthought.com> Fernando Perez wrote: > > Much better, thanks. A minor nit: is it possible to have the blue > code boxes line their text inside of them flush left by default? Look > at how they open by default: > > http://amath.colorado.edu/faculty/fperez/tmp/scipy_plottut.png > > all the extra left space causes unnecessary scroll bars to appear. > Perhaps a tiny bit of whitespace is good for aesthetic reasons, but > one or two fixed-width spaces should be enough, I think. Hey Fernando, I just looked into this and I'm running into a limitation of the Structured Text processor. From a docstring in ClassicStructuredText.py: - Sub-paragraphs of a paragraph that ends in the word 'example' or the word 'examples', or '::' is treated as example code and is output as is. The problem here is that, when using ST, in order to define a "sub-paragraph", you have to use white space indentation. This indentation, however, is preserved by the
    <pre> tag that gets put around 
    the sub-paragraph.  Note, the additional space is not a function of the 
    style sheet definition for <pre> -- that looks to be OK.
    
    Regards,
    
    Travis
    
    
    
    From oliphant at ee.byu.edu  Thu Nov 18 12:32:08 2004
    From: oliphant at ee.byu.edu (Travis Oliphant)
    Date: Thu, 18 Nov 2004 10:32:08 -0700
    Subject: [SciPy-user] Sparse matrices  .todense(Complex)
    In-Reply-To: <419C61F6.3040006@mecha.uni-stuttgart.de>
    References: <419C61F6.3040006@mecha.uni-stuttgart.de>
    Message-ID: <419CDC98.8030806@ee.byu.edu>
    
    Nils Wagner wrote:
    
    > Hi all,
    >
    > Thanks a lot for the latest fixes with respect to handling of complex 
    > matrices by iterative solvers.
    > One thing, which doesn't work is the change from sparse to dense 
    > matrices in case of complex matrices.
    >
    > >>> print A
    >  (0, 0)        0.809662640095
    >  (0, 1)        (0.880416512489+0.50495326519j)
    >  (1, 0)        (0.880416512489-0.50495326519j)
    >  (1, 1)        0.733849585056
    >  (1, 2)        (0.907823622227+0.290321737528j)
    >  (2, 1)        (0.907823622227-0.290321737528j)
    >  (2, 2)        0.300018787384
    > >>> A.todense(Complex)
    > array([[ 0.80966264,  0.88041651,  0.        ],
    >       [ 0.88041651,  0.73384959,  0.90782362],
    >       [ 0.        ,  0.90782362,  0.30001879]])
    > >>>
    
    This is now fixed in CVS (or will be in a few minutes).  But note that 
    the fix only applies to dok_matrices.  The other kinds of matrices have 
    not had the same problem, which stems from not being able to tell what 
    the type of a dok_matrix is until you actually convert it to a dense 
    matrix...
    
    -Travis
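    
    (Editorial note: the todense(Complex) call above is the 2004 sparse API.
    For readers hitting the same symptom today, the behaviour Nils was after --
    complex entries surviving the sparse-to-dense conversion -- can be checked
    with the current scipy.sparse interface; this is only an illustrative
    sketch, not the fix Travis committed:)
    
        from scipy import sparse
        import numpy as np
    
        # a small matrix with complex off-diagonal entries, mirroring Nils' example
        A = sparse.dok_matrix((3, 3), dtype=complex)
        A[0, 0] = 0.8096
        A[0, 1] = 0.8804 + 0.5050j
        A[1, 0] = 0.8804 - 0.5050j
        A[1, 1] = 0.7338
        A[2, 2] = 0.3000
    
        D = A.todense()            # the dense result keeps the complex dtype
        print(D.dtype)             # complex128
        print(np.iscomplexobj(D))  # True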
    
    
    
    From swisher at enthought.com  Thu Nov 18 12:36:52 2004
    From: swisher at enthought.com (Janet Swisher)
    Date: Thu, 18 Nov 2004 11:36:52 -0600
    Subject: [SciPy-user] SciPy cookbook
    In-Reply-To: 
    Message-ID: <002a01c4cd95$33ff7f60$ab01a8c0@SWISHER>
    
    
    
    > -----Original Message-----
    > From: Alexey Goldin
     
    > I think at SciPy 2004 it was decided to have some kind of 
    > "cookbook" for scipy, something like the ASPN cookbook at 
    > http://aspn.activestate.com/ASPN/Python/Cookbook/ , but with 
    > emphasis on scientific data processing.  I have  in mind 
    > mostly simple illustrative one page examples for people new  
    > to Python (not necessarily utilizing scipy), rather than 
    > complicated scripts. Is there some preferred place on SciPy 
    > site? 
    
    One possibility would be to put this under the Accessible SciPy wiki. For
    example, you could put a "Cookbook" link under "Documentation Issues" on the
    main page, and grow it from there. Even if the cookbook is not specific to
    SciPy, that's a reasonable place to start it. It can always be moved when it
    "grows up".
    
    > Are SciPy wiki's  pages friendly to Python source code?
    
    Not sure what you mean by "friendly", but I think so. For example, you can
    upload a .py file, and the site will recognize that it is a text file and
    display its contents in fixed-width font. You can also write a wiki page
    using ST that contains code examples. However, neither of these features is
    specific to Python, so it's not *more* friendly to Python than other
    languages. (Of course, the site is implemented with Plone and Zope, so it's
    *written* in Python.)
    
    To create a code example in an ST page, end the preceding paragraph with
    '::' and indent the code example more than the preceding paragraph. (If the
    '::' is attached to a word, it gets replaced by a single colon in the
    output. If there's whitespace in front of '::', the double-colon is hidden
    in the output.) For example::
    
      # This is a code example.
    
    HTH,
    Janet
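    
    (To make the '::' rule concrete, here is a small sketch of what an ST wiki
    page source could look like; the sentence and the Python inside it are made
    up purely for illustration:)
    
        The mean is just the sum divided by the count, for example::
    
            values = [1.0, 2.5, 4.0]
            print sum(values) / len(values)
    
    Because '::' is attached to the word "example", it renders as a single
    colon, and the indented block below it is shown verbatim as code.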
    
    
    
    
    
    From Fernando.Perez at colorado.edu  Thu Nov 18 13:54:25 2004
    From: Fernando.Perez at colorado.edu (Fernando Perez)
    Date: Thu, 18 Nov 2004 11:54:25 -0700
    Subject: [SciPy-user] Re: Scipy.org layout
    In-Reply-To: <419CCF3C.50602@enthought.com>
    References: <20041117200019.D6FCB3EB85@www.scipy.com>
    	<419BB5A9.2030204@enthought.com> <419BC628.1090900@colorado.edu>
    	<419CCF3C.50602@enthought.com>
    Message-ID: <419CEFE1.8030007@colorado.edu>
    
    Hey Travis,
    
    Travis N. Vaught schrieb:
    
    > I just looked into this and I'm running into a limitation of the 
    > Structured Text processor.  From a docstring in ClassicStructuredText.py:
    > 
    > - Sub-paragraphs of a paragraph that ends in the word 'example' or the
    >   word 'examples', or '::' is treated as example code and is output as is.
    > 
    > The problem here is that, when using ST, in order to define a 
    > "sub-paragraph", you have to use white space indentation.  This 
    > indentation, however, is preserved by the <pre> tag that gets put around 
    > the sub-paragraph.  Note, the additional space is not a function of the 
    > style sheet definition for <pre> -- that looks to be OK.
    
    Oh well.  No big deal, all of the work you guys put in yesterday did make big 
    improvements.  Further work is probably best left for later, when the site 
    overhaul project goes through.  This will avoid doing work now just to redo it 
    in a few weeks.
    
    Again, many thanks for spending yesterday on these changes.  Though apparently 
    minor, they do make a huge usability difference as it is.
    
    best,
    
    f
    
    
    
    From grante at visi.com  Thu Nov 18 17:33:35 2004
    From: grante at visi.com (Grant Edwards)
    Date: Thu, 18 Nov 2004 16:33:35 -0600
    Subject: [SciPy-user] Problem with titles/labels in gplt (and why is it so
    	sloooow?)
    Message-ID: <20041118223334.GA9234@grante.dsl.visi.com>
    
    
    I can't figure out how to get titles and labels to appear on
    gplt output without having the plot re-drawn multiple times.
    If I set the titles, etc. before the gplt.surf() command, they
    don't show up.  If I set them afterwards, the plot is re-drawn
    every time I set a label, title, key, etc.
    
    gplt.surf() is already an order of magnitude slower than using
    gnuplot-py, and when each plot has to be redrawn a half-dozen
    times it's excruciating: a plot that should take a few
    milliseconds ends up taking _seconds_ to complete.
    
    I'm tempted to give up and just use gnuplot-py, except that
    requires a lot more work to convert arrays into data objects.
    
    What am I doing wrong that makes gplt so slow?
    
    Why do I have to replot the surface 4 times to get the title
    and axes labels?
    
    Here's my test program:
    
    ------------------------------8<------------------------------
    import sys,time,Gnuplot
    from scipy import *
    
    iterations = range(10)
    
    x,y = mgrid[-1:1:20j,-1:1:20j]
    z = (x+y)*exp(-6.0*(x*x+y*y))
    
    gplt.current()._send('set mouse;')
    t1 = time.time()
    for i in iterations:
        gplt.surf(x,y,z)
        gplt.title("Sparsely sampled function -- iteration %d" % i)
        gplt.xtitle("x")
        gplt.ytitle("y")
        gplt.ztitle("z")
    t2 = time.time()
    
    print t2-t1
    
    gp = Gnuplot.Gnuplot()
    t1 = time.time()
    for i in iterations:
        gp.title("Sparsely sampled function -- iteration %d" % i)
        gp.set_label("xlabel","x")
        gp.set_label("ylabel","y")
        gp.set_label("zlabel","z")
        gp.splot(Gnuplot.GridData(z,x[:,0],y[0,:],with='lines'))
    t2 = time.time()
    
    print t2-t1
    
    sys.stdin.readline()
    ------------------------------8<------------------------------
    
    
    The output is:
    
    $ python testit.py
    27.0858080387
    0.360200166702
    
    scipy.gplt is almost 100 times slower than gnuplot-py.
    
    -- 
    Grant Edwards
    grante at visi.com
    
    
    
    From grante at visi.com  Thu Nov 18 18:34:36 2004
    From: grante at visi.com (Grant Edwards)
    Date: Thu, 18 Nov 2004 17:34:36 -0600
    Subject: [SciPy-user] Nit in SciPy tutorial?
    Message-ID: <20041118233435.GA10164@grante.dsl.visi.com>
    
    
    Sorry to be a pest, but I'm confused by something in the
    SciPy tutorial.  On p23, it says
    
      "[...],the function interpolate.bisplrep is available. This
       function takes as required inputs the 1-D arrays x, y, and z
       which represent points on the surface z = f (x, y).
    
    In the example code at the bottom of p23:
    
    >>> x,y = mgrid[-1:1:20j,-1:1:20j]
    [...]
    >>> tck = interpolate.bisplrep(x,y,z,s=0)
    
    According to the text x and y should be 1-D arrays, but I
    printed their shapes and x.shape and y.shape are both (20,20).
    That's 2-D, right?  [It does work.]
    
    A bit of experimenting seems to indicate that x,y,z can either
    be 1-D or 2-D, since the following code works as well:
    
    import sys,Gnuplot
    from scipy import *
    
    x,y = mgrid[-1:1:20j,-1:1:20j]
    x.shape,y.shape = (-1,),(-1,)
    
    z = (x+y)*exp(-6.0*(x*x+y*y))
    
    tck = interpolate.bisplrep(x,y,z,s=0)
    
    xn,yn = mgrid[-1:1:70j,-1:1:70j]
    zn = interpolate.bisplev(xn[:,0],yn[0,:],tck)
    
    gp = Gnuplot.Gnuplot()
    gp.set_label("xlabel","x")
    gp.set_label("ylabel","y")
    gp.set_label("zlabel","z")
    gp.splot(Gnuplot.Data(x,y,z,with='points'),
             Gnuplot.GridData(zn,xn[:,0],yn[0,:],with='lines'))
    
    
    -- 
    Grant Edwards
    grante at visi.com
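    
    (A quick way to check this, sketched against a current SciPy/NumPy rather
    than the 2004 release -- the assumption being that bisplrep simply flattens
    whatever arrays it is handed, so the 1-D and 2-D forms describe the same
    data and give the same spline:)
    
        from numpy import mgrid, exp, allclose
        from scipy import interpolate
    
        x2, y2 = mgrid[-1:1:20j, -1:1:20j]                # 2-D meshgrid form
        z2 = (x2 + y2) * exp(-6.0 * (x2 * x2 + y2 * y2))
    
        x1, y1, z1 = x2.ravel(), y2.ravel(), z2.ravel()   # explicit 1-D form
    
        tck2 = interpolate.bisplrep(x2, y2, z2, s=0)
        tck1 = interpolate.bisplrep(x1, y1, z1, s=0)
    
        print(allclose(tck1[2], tck2[2]))                 # same coefficients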
    
    
    
    From Giovanni.Samaey at cs.kuleuven.ac.be  Fri Nov 19 05:01:34 2004
    From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey)
    Date: Fri, 19 Nov 2004 11:01:34 +0100
    Subject: [SciPy-user] error linking to libg2c on 64-bit machine
    In-Reply-To: 
    References: <200411151937.iAFJbsU08350@iris.cs.kuleuven.ac.be>
    	
    Message-ID: <419DC47E.10604@cs.kuleuven.ac.be>
    
    Pearu,
    
    thank you so much for your help so far.  I believe I am getting in the 
    range of "standard" problems now ;-)
    
    >
    >
    > But until then, you can force undetecting X11 libraries by setting
    >
    >   x11_libs = noX11
    >
    > in [x11] section of site.cfg file. That should give you a chance to 
    > complete scipy build.
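    
    (For reference, the stanza being described above would sit in a site.cfg
    next to scipy's setup.py and amounts to these two lines; the section name
    and value come straight from the quoted text, only the file location is an
    assumption:)
    
        [x11]
        x11_libs = noX11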
    
    I got scipy built completely (without x11), but the scipy.test() gives 5 
    failures and 2 errors, listed below this mail.
    I suspect the ATLAS binary atlas3.7.8_Linux_HAMMER64SSE2_2.tgz 
    that I downloaded from scipy.org
    is the cause, since it is probably not compatible with my system.
    System info as required by the INSTALL file is attached in info.txt
    However, when I follow the instructions on the scipy webpage for 
    building scipy, I get the familiar error that I should
    recompile them with -fPIC.  How do I alter the building of ATLAS to take 
    care of this?
    (Or is there another cause for the error?)
    
    Then concerning x11: I indeed only have .so files.  I have libX11.so.6 
    and libX11.so.6.2; so if I could configure something such that they are 
    used, also this problem should be solvable. 
    
    Thanks,
    
    Giovanni
    
    
    
    ---
    scipy.test() errors:
    
    ======================================================================
    ERROR: check_simple_overdet (scipy.linalg.basic.test_basic.test_lstsq)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", 
    line 361, in check_simple_overdet
        x,res,r,s = lstsq(a,b)
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/basic.py", 
    line 353, in lstsq
        if rank==n: resids = sum(x[n:]**2)
    ArithmeticError: Integer overflow in power.
    
    ======================================================================
    ERROR: check_simple_sym (scipy.linalg.basic.test_basic.test_solve)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", 
    line 80, in check_simple_sym
        x = solve(a,b,sym_pos=1,lower=lower)
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/basic.py", 
    line 123, in solve
        raise LinAlgError, "singular matrix"
    LinAlgError: singular matrix
    
    ======================================================================
    FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_cholesky)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", 
    line 247, in check_simple
        assert_array_almost_equal(Numeric.dot(Numeric.transpose(c),c),a)
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", 
    line 688, in assert_array_almost_equal
        assert cond,\
    AssertionError:
    Arrays are not almost equal (mismatch 100.0%):
            Array 1: [[-1885965976648932823  7796773222197633577  
    -761725725645001531]
     [ 7796773222197633577 -6619736815450336083  789149371...
            Array 2: [[8 2 3]
     [2 9 3]
     [3 3 6]]
    
    
    ======================================================================
    FAIL: check_simple (scipy.linalg.basic.test_basic.test_det)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", 
    line 273, in check_simple
        assert_almost_equal(a_det,-2.0)
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", 
    line 610, in assert_almost_equal
        assert round(abs(desired - actual),decimal) == 0, msg
    AssertionError:
    Items are not equal:
    DESIRED: -2.0
    ACTUAL: -0.0
    
    ======================================================================
    FAIL: check_simple (scipy.linalg.basic.test_basic.test_inv)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", 
    line 202, in check_simple
        [[1,0],[0,1]])
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", 
    line 688, in assert_array_almost_equal
        assert cond,\
    AssertionError:
    Arrays are not almost equal (mismatch 100.0%):
            Array 1: [[9216616637413720064   -4503599627370496]
     [9207609438158979072   -9007199254740992]]
            Array 2: [[1 0]
     [0 1]]
    
    
    ======================================================================
    FAIL: check_simple_exact (scipy.linalg.basic.test_basic.test_lstsq)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", 
    line 356, in check_simple_exact
        assert_array_almost_equal(Numeric.matrixmultiply(a,x),b)
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", 
    line 688, in assert_array_almost_equal
        assert cond,\
    AssertionError:
    Arrays are not almost equal (mismatch 50.0%):
            Array 1: [0 0]
            Array 2: [1 0]
    
    
    ======================================================================
    FAIL: check_simple (scipy.linalg.basic.test_basic.test_solve)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", 
    line 74, in check_simple
        assert_array_almost_equal(Numeric.matrixmultiply(a,x),b)
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", 
    line 688, in assert_array_almost_equal
        assert cond,\
    AssertionError:
    Arrays are not almost equal (mismatch 100.0%):
            Array 1: [[             nan              nan]
     [             nan              nan]]
            Array 2: [[1 0]
     [0 1]]
    
    
    ----------------------------------------------------------------------
    Ran 984 tests in 1.679s
    
    -------------- next part --------------
    An embedded and charset-unspecified text was scrubbed...
    Name: info.txt
    URL: 
    
    From grante at visi.com  Fri Nov 19 10:42:02 2004
    From: grante at visi.com (Grant Edwards)
    Date: Fri, 19 Nov 2004 09:42:02 -0600
    Subject: [SciPy-user] Fitpack: SystemError: error return without exception
    	set
    Message-ID: <20041119154201.GA15711@grante.dsl.visi.com>
    
    
    I'm trying to fit a b-spline surface to a set of 64 points, and
    I'm getting the following error:
    
        $ python testit.py
         iopt,kx,ky,m= 0 3 3 64
         nxest,nyest,nmax= 9 9 9
         lwrk1,lwrk2,kwrk= 1682 571 68
         xb,xe,yb,ye=  3.  21.  5695.27  175898.35
         eps,s  1.E-16  52.6862915
        Traceback (most recent call last):
          File "testit.py", line 9, in ?
            tck = interpolate.bisplrep(x,y,z)
          File
        "/usr/lib/python2.3/site-packages/scipy/interpolate/fitpack.py",
        line 611, in bisplrep
            tx,ty,nxest,nyest,wrk,lwrk1,lwrk2)
        SystemError: error return without exception set
    
    Can anybody shed any light on the possible source of this
    error?  I've ordered a copy of Dierckx's book, but it won't be
    here for a week or two.
    
    Here's my source code:
    
    ------------------------------8<------------------------------
    import sys,Gnuplot
    from scipy import *
    
    d = array(map(float,file('foo.data','r').read().split()))
    d.shape = (-1,3)
    
    x,y,z = d[:,0],d[:,1],d[:,2]
    
    tck = interpolate.bisplrep(x,y,z)
    
    xn,yn = mgrid[min(x):max(x):50j,min(y):max(y):50j]
    zn = interpolate.bisplev(xn[:,0],yn[0,:],tck)
    
    gp = Gnuplot.Gnuplot()
    gp.set_label("xlabel","x")
    gp.set_label("ylabel","y")
    gp.set_label("zlabel","z")
    gp.splot(Gnuplot.Data(x,y,z,with='points'),
             Gnuplot.GridData(zn,xn[:,0],yn[0,:],with='lines'))
    sys.stdin.readline()         
    ------------------------------8<------------------------------
    
    foo.data looks like this:
    
        3.00   5695.270    0.390
        3.00   9073.820    0.400
        3.00  17216.820    0.430
        3.00  23477.480    0.450
        3.00  34757.700    0.470
        3.00  62282.540    0.490
        3.00 111602.470    0.520
        3.00 174409.030    0.540
        4.50   6115.870    0.520
        4.50   9646.110    0.530
        4.50  16044.670    0.540
        4.50  23380.950    0.550
        4.50  32013.490    0.560
        4.50  63247.840    0.600
        4.50 113898.510    0.620
        4.50 175822.500    0.630
        6.00   6171.030    0.600
        6.00  10149.440    0.600
        6.00  18292.440    0.620
        6.00  25897.620    0.630
        6.00  34288.840    0.650
        6.00  64688.890    0.680
        6.00 113843.350    0.700
        6.00 177980.640    0.700
        9.00   4895.450    0.640
        9.00   9742.640    0.660
        9.00  15906.770    0.680
        9.00  23422.320    0.700
        9.00  32082.440    0.710
        9.00  62475.600    0.730
        9.00 109375.390    0.740
        9.00 171147.690    0.750
       12.00   4998.880    0.700
       12.00   9080.720    0.720
       12.00  15920.560    0.730
       12.00  22312.220    0.750
       12.00  31048.190    0.750
       12.00  63633.960    0.770
       12.00 112036.860    0.770
       12.00 185647.880    0.760
       15.00   5164.360    0.710
       15.00   9832.270    0.740
       15.00  16685.900    0.760
       15.00  23711.910    0.760
       15.00  34840.440    0.770
       15.00  62489.390    0.780
       15.00 110044.200    0.790
       15.00 177546.250    0.790
       18.00   5364.310    0.720
       18.00   9797.800    0.750
       18.00  15934.350    0.760
       18.00  22725.920    0.770
       18.00  31820.430    0.780
       18.00  61241.390    0.780
       18.00 110382.060    0.790
       18.00 173788.480    0.800
       21.00   5198.830    0.730
       21.00   9728.850    0.770
       21.00  15844.710    0.780
       21.00  23560.220    0.780
       21.00  32275.500    0.790
       21.00  63151.310    0.800
       21.00 110630.280    0.800
       21.00 175898.350    0.810
    
    
    -- 
    Grant Edwards
    grante at visi.com
    
    
    
    From swangken at yahoo.com  Sun Nov 21 11:18:30 2004
    From: swangken at yahoo.com (Xiaowen Wang)
    Date: Sun, 21 Nov 2004 08:18:30 -0800 (PST)
    Subject: [SciPy-user] PyFPE_jbuf unresolved when loading gistC.so
    Message-ID: <20041121161830.4698.qmail@web20922.mail.yahoo.com>
    
    Hi:
    I'm new to scipy. When I tried out xplt, it complained
    that gistC.so failed to resolve PyFPE_jbuf. A search
    on Google seems to suggest that this is a static
    variable defined in python 2.2. I'm using python2.3.3
    on FC2. 
    Any suggestion on how to resolve this without
    recompiling?
    
    Thanks
    XW
    
    >>> from scipy import *
    >>> xplt.demo5.demo5(1)
    Traceback (most recent call last):
      File "", line 1, in ?
      File
    "/usr/lib/python2.3/site-packages/scipy_base/ppimport.py",
    line 303, in __getattr__
        module = self._ppimport_importer()
      File
    "/usr/lib/python2.3/site-packages/scipy_base/ppimport.py",
    line 262, in _ppimport_importer
        raise PPImportError,\
    scipy_base.ppimport.PPImportError: Traceback (most
    recent call last):
      File
    "/usr/lib/python2.3/site-packages/scipy_base/ppimport.py",
    line 273, in _ppimport_importer
        module = __import__(name,None,None,['*'])
      File
    "/usr/lib/python2.3/site-packages/scipy/xplt/__init__.py",
    line 8, in ?
        from gist import *
      File
    "/usr/lib/python2.3/site-packages/scipy/xplt/gist.py",
    line 62, in ?
        from gistC import *
    ImportError:
    /usr/lib/python2.3/site-packages/scipy/xplt/gistC.so:
    undefined symbol: PyFPE_jbuf
    
    
    
    		
    __________________________________ 
    Do you Yahoo!? 
    Meet the all-new My Yahoo! - Try it today! 
    http://my.yahoo.com 
     
    
    
    
    From gerard.vermeulen at grenoble.cnrs.fr  Sun Nov 21 12:25:52 2004
    From: gerard.vermeulen at grenoble.cnrs.fr (Gerard Vermeulen)
    Date: Sun, 21 Nov 2004 18:25:52 +0100
    Subject: [SciPy-user] ANNOUNCE: PyQwt-4.1
    Message-ID: <20041121182552.501c8840.gerard.vermeulen@grenoble.cnrs.fr>
    
    I am proud to announce PyQwt-4.1.
    
    What is PyQwt?
    
    - it is a set of Python bindings for the Qwt C++ class library which
      extends the Qt framework with widgets for scientific, engineering
      and financial applications.   It provides a widget to plot
      2-dimensional data and various widgets to display and control
      bounded or unbounded floating point values.
    
    - it requires and extends PyQt, a set of Python bindings for Qt.
    
    - it really shines with either Numeric, numarray or both. Numeric
      and/or numarray extend the Python language with new data types which
      turn Python into an ideal system for numerical computing and
      experimentation, better than MatLab and IDL.
    
    - it supports the use of PyQt, Qt, Qwt, the Numerical Python extensions
      (either Numeric, or numarray or both) and optionally SciPy in a GUI
      Python application or in an interactive Python session.
    
    - it runs on POSIX, MacOS/X and Windows (any operating system supported
      by Qt and Python).
    
    The home page of PyQwt is http://pyqwt.sourceforge.net.
    
    
    Main changes in PyQwt-4.1:
    1. supports PyQt-3.13, -3.12, PyQt-3.11, and PyQt-3.10.
    2. supports sip-4.1.1, -4.1, -4.0, sip-3.11, and sip-3.10.
    3. supports Qt-3.3.3 downto -2.3.0.
    4. based on Qwt-4.2.0.
    5. either links with a shared (dll) Qwt-4.2.0 library or links
       statically with a slightly patched internal version of Qwt-4.2.0.
    
    
    Have fun -- Gerard Vermeulen
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Mon Nov 22 09:38:53 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Mon, 22 Nov 2004 15:38:53 +0100
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK 
    Message-ID: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    
    Hi all,
    
    Assuming that libblas.a and liblapack.a are located in /usr/lib
    how do I build scipy from cvs without using ATLAS?
    What are the necessary steps? (modify setup.py?)
    
    Nils
    
    
     
    
    
    
    
    From pearu at scipy.org  Mon Nov 22 10:12:58 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Mon, 22 Nov 2004 09:12:58 -0600 (CST)
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK 
    In-Reply-To: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    Message-ID: 
    
    
    
    On Mon, 22 Nov 2004, Nils Wagner wrote:
    
    > Hi all,
    >
    > Assuming that libblas.a and liblapack.a are located in /usr/lib
    > how do I build scipy from cvs without usage of ATLAS ?
    > What are the neccessary steps ? (modify setup.py ?)
    
    Please read 'Linear Algebra libraries' section in
    
       http://www.scipy.org/documentation/buildscipy.txt
    
    All you need is to specify BLAS and LAPACK environment variables and set 
    ATLAS=None. No need to modify setup.py. To see if proper libraries are 
    found by scipy setup.py script, study the output of
    
       python system_info.py lapack_opt blas_opt
    
    Also note that on some systems /usr/lib/lib{blas,lapack}.a
    (i) may use ATLAS libraries (e.g. Debian)
    (ii) or these BLAS/LAPACK libraries are not complete (e.g. RedHat).
    
    So, in general, using /usr/lib/lib{blas,lapack}.a for Fortran BLAS/LAPACK 
    libraries is not always reliable but getting BLAS/LAPACK from netlib and 
    building them yourself is.
    
    Pearu
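    
    (Pulling the recipe above into one sketch, for a bash-like shell: the
    library paths are placeholders for wherever your own BLAS/LAPACK builds
    ended up, and ATLAS=None is the part that disables the ATLAS probing.)
    
        export BLAS=/usr/lib/libblas.a
        export LAPACK=/usr/lib/liblapack.a
        export ATLAS=None
        python system_info.py lapack_opt blas_opt   # check which libraries are picked up
        python setup.py build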
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Mon Nov 22 13:56:16 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Mon, 22 Nov 2004 19:56:16 +0100
    Subject: [SciPy-user] How do I build scipy using standard
     BLAS/LAPACK 
    In-Reply-To: 
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    	
    Message-ID: 
    
    On Mon, 22 Nov 2004 09:12:58 -0600 (CST)
      Pearu Peterson  wrote:
    > 
    > 
    > On Mon, 22 Nov 2004, Nils Wagner wrote:
    > 
    >> Hi all,
    >>
    >> Assuming that libblas.a and liblapack.a are located in 
    >>/usr/lib
    >> how do I build scipy from cvs without usage of ATLAS ?
    >> What are the neccessary steps ? (modify setup.py ?)
    > 
    > Please read 'Linear Algebra libraries' section in
    > 
    >   http://www.scipy.org/documentation/buildscipy.txt
    > 
    > All you need is to specify BLAS and LAPACK environment 
    >variables and set ATLAS=None. No need to modify setup.py. 
    >To see if proper libraries are found by scipy setup.py 
    >script, study the output of
    > 
    >   python system_info.py lapack_opt blas_opt
    > 
    > Also note that on some systems 
    >/usr/lib/lib{blas,lapack}.a
    > (i) may use ATLAS libraries (e.g. Debian)
    > (ii) or these BLAS/LAPACK libraries are not complete 
    >(e.g. RedHat).
    > 
    > So, in general, using /usr/lib/lib{blas,lapack}.a for 
    >Fortran BLAS/LAPACK libraries is not always reliable but 
    >getting BLAS/LAPACK from netlib and building them 
    >yourself is.
    > 
    > Pearu
    > 
    
      
    Pearu,
    
Following the notes in http://www.scipy.org/documentation/buildscipy.txt
(the part "Or get LAPACK/BLAS sources as follows:") gives
    
Traceback (most recent call last):
  File "setup.py", line 112, in ?
    setup_package(ignore_packages)
  File "setup.py", line 99, in setup_package
    url = "http://www.scipy.org",
  File "scipy_core/scipy_distutils/core.py", line 73, in setup
    return old_setup(**new_attr)
  File "/usr/local/lib/python2.3/distutils/core.py", line 149, in setup
    dist.run_commands()
  File "/usr/local/lib/python2.3/distutils/dist.py", line 907, in run_commands
    self.run_command(cmd)
  File "/usr/local/lib/python2.3/distutils/dist.py", line 927, in run_command
    cmd_obj.run()
  File "/usr/local/lib/python2.3/distutils/command/build.py", line 107, in run
    self.run_command(cmd_name)
  File "/usr/local/lib/python2.3/distutils/cmd.py", line 333, in run_command
    self.distribution.run_command(command)
  File "/usr/local/lib/python2.3/distutils/dist.py", line 927, in run_command
    cmd_obj.run()
  File "scipy_core/scipy_distutils/command/build_src.py", line 81, in run
    self.build_sources()
  File "scipy_core/scipy_distutils/command/build_src.py", line 88, in build_sources
    self.build_extension_sources(ext)
  File "scipy_core/scipy_distutils/command/build_src.py", line 122, in build_extension_sources
    sources = self.generate_sources(sources, ext)
  File "scipy_core/scipy_distutils/command/build_src.py", line 164, in generate_sources
    source = func(extension, build_dir)
  File "Lib/lib/lapack/setup_lapack.py", line 87, in get_clapack_source
    target = join(build_dir,target_dir,'clapack.pyf')
NameError: global name 'join' is not defined
    nwagner at linux:~/cvs/scipy> echo $LAPACK_SRC
    /home/nwagner/src/lapack/LAPACK/SRC
    nwagner at linux:~/cvs/scipy> echo $BLAS_SRC
    /home/nwagner/src/blas
    nwagner at linux:~/cvs/scipy> echo $ATLAS
    None
    
    Any suggestion how to continue ?
    
    Nils
    
    
    
    From pearu at scipy.org  Mon Nov 22 14:03:14 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Mon, 22 Nov 2004 13:03:14 -0600 (CST)
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK 
    In-Reply-To: 
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    	
    	
    Message-ID: 
    
    
    
    On Mon, 22 Nov 2004, Nils Wagner wrote:
    
    > Following the notes
    > http://www.scipy.org/documentation/buildscipy.txt
    > Or get LAPACK/BLAS sources as follows:
    >
    > gives
    >
    > Traceback (most recent call last):
    >  File "setup.py", line 112, in ?
    >    setup_package(ignore_packages)
    >  File "setup.py", line 99, in setup_package
    >    url = "http://www.scipy.org",
    >  File "scipy_core/scipy_distutils/core.py", line 73, in setup
    >    return old_setup(**new_attr)
    >  File "/usr/local/lib/python2.3/distutils/core.py", line 149, in setup
    >    dist.run_commands()
    >  File "/usr/local/lib/python2.3/distutils/dist.py", line 907, in 
    > run_commands
    >    self.run_command(cmd)
    >  File "/usr/local/lib/python2.3/distutils/dist.py", line 927, in run_command
    >    cmd_obj.run()
    >  File "/usr/local/lib/python2.3/distutils/command/build.py", line 107, in 
    > run
    >    self.run_command(cmd_name)
    >  File "/usr/local/lib/python2.3/distutils/cmd.py", line 333, in run_command
    >    self.distribution.run_command(command)
    >  File "/usr/local/lib/python2.3/distutils/dist.py", line 927, in run_command
    >    cmd_obj.run()
    >  File "scipy_core/scipy_distutils/command/build_src.py", line 81, in run
    >    self.build_sources()
    >  File "scipy_core/scipy_distutils/command/build_src.py", line 88, in 
    > build_sour
    > ces
    >    self.build_extension_sources(ext)
    >  File "scipy_core/scipy_distutils/command/build_src.py", line 122, in 
    > build_ext
    > ension_sources
    >    sources = self.generate_sources(sources, ext)
    >  File "scipy_core/scipy_distutils/command/build_src.py", line 164, in 
    > generate_
    > sources
    >    source = func(extension, build_dir)
    >  File "Lib/lib/lapack/setup_lapack.py", line 87, in get_clapack_source
    >    target = join(build_dir,target_dir,'clapack.pyf')
    > NameError: global name 'join' is not defined
    > nwagner at linux:~/cvs/scipy> echo $LAPACK_SRC
    > /home/nwagner/src/lapack/LAPACK/SRC
    > nwagner at linux:~/cvs/scipy> echo $BLAS_SRC
    > /home/nwagner/src/blas
    > nwagner at linux:~/cvs/scipy> echo $ATLAS
    > None
    >
    > Any suggestion how to continue ?
    
    This is now fixed in scipy CVS. Update from cvs and try again.
    
    Pearu
    
    
    
    From reikhboerger at gmx.de  Mon Nov 22 14:57:21 2004
From: reikhboerger at gmx.de (Reik Börger)
    Date: Mon, 22 Nov 2004 20:57:21 +0100
    Subject: [SciPy-user] fmin
    Message-ID: 
    
    Hi everybody,
    
    I am new to Python and new to this list, so I will probably have lots 
    of questions and hopefully I can get good advice from you :-)
    
My problem at the moment is the following: I need to minimize a
function of 4 variables. I use optimize.fmin and everything works fine,
except for the computation time.
Now I additionally provide the gradient of the objective function and
use optimize.fmin_bfgs. But when I run the program, the optimizer
always stops after 2 or 3 iterations and returns a value close to my
starting value as the optimal solution - no matter where I start, the
result is always very close to the starting value.
I always get a warning that the optimization did not terminate
successfully DUE TO PRECISION LOSS. What does that mean and how can I
fix it? I tried several things, but I have no idea what is going
wrong...
    
    Can anyone help me?
    
    Thanks and regards,
    Reik
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Tue Nov 23 03:03:27 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Tue, 23 Nov 2004 09:03:27 +0100
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: 
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    		
    	
    Message-ID: <41A2EECF.5080806@mecha.uni-stuttgart.de>
    
    Pearu Peterson wrote:
    
    >
    >
    > On Mon, 22 Nov 2004, Nils Wagner wrote:
    >
    >> Following the notes
    >> http://www.scipy.org/documentation/buildscipy.txt
    >> Or get LAPACK/BLAS sources as follows:
    >>
    >> gives
    >>
    >> Traceback (most recent call last):
    >>  File "setup.py", line 112, in ?
    >>    setup_package(ignore_packages)
    >>  File "setup.py", line 99, in setup_package
    >>    url = "http://www.scipy.org",
    >>  File "scipy_core/scipy_distutils/core.py", line 73, in setup
    >>    return old_setup(**new_attr)
    >>  File "/usr/local/lib/python2.3/distutils/core.py", line 149, in setup
    >>    dist.run_commands()
    >>  File "/usr/local/lib/python2.3/distutils/dist.py", line 907, in 
    >> run_commands
    >>    self.run_command(cmd)
    >>  File "/usr/local/lib/python2.3/distutils/dist.py", line 927, in 
    >> run_command
    >>    cmd_obj.run()
    >>  File "/usr/local/lib/python2.3/distutils/command/build.py", line 
    >> 107, in run
    >>    self.run_command(cmd_name)
    >>  File "/usr/local/lib/python2.3/distutils/cmd.py", line 333, in 
    >> run_command
    >>    self.distribution.run_command(command)
    >>  File "/usr/local/lib/python2.3/distutils/dist.py", line 927, in 
    >> run_command
    >>    cmd_obj.run()
    >>  File "scipy_core/scipy_distutils/command/build_src.py", line 81, in run
    >>    self.build_sources()
    >>  File "scipy_core/scipy_distutils/command/build_src.py", line 88, in 
    >> build_sour
    >> ces
    >>    self.build_extension_sources(ext)
    >>  File "scipy_core/scipy_distutils/command/build_src.py", line 122, in 
    >> build_ext
    >> ension_sources
    >>    sources = self.generate_sources(sources, ext)
    >>  File "scipy_core/scipy_distutils/command/build_src.py", line 164, in 
    >> generate_
    >> sources
    >>    source = func(extension, build_dir)
    >>  File "Lib/lib/lapack/setup_lapack.py", line 87, in get_clapack_source
    >>    target = join(build_dir,target_dir,'clapack.pyf')
    >> NameError: global name 'join' is not defined
    >> nwagner at linux:~/cvs/scipy> echo $LAPACK_SRC
    >> /home/nwagner/src/lapack/LAPACK/SRC
    >> nwagner at linux:~/cvs/scipy> echo $BLAS_SRC
    >> /home/nwagner/src/blas
    >> nwagner at linux:~/cvs/scipy> echo $ATLAS
    >> None
    >>
    >> Any suggestion how to continue ?
    >
    >
    > This is now fixed in scipy CVS. Update from cvs and try again.
    >
    > Pearu
    >
    
    
    Now the installation works fine, but scipy.test() failed
    
    snip
    
    ****************************************************************
    WARNING: clapack module is empty
    -----------
    See scipy/INSTALL.txt for troubleshooting.
    Notes:
    * If atlas library is not found by scipy/system_info.py,
      then scipy uses flapack instead of clapack.
    ****************************************************************
    
    snip
    ======================================================================
    ERROR: check_sh_legendre (scipy.special.basic.test_basic.test_sh_legendre)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/usr/lib/python2.3/site-packages/scipy/special/tests/test_basic.py", 
    line 1806, in check_sh_legendre
        Ps1 = sh_legendre(1)
      File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", 
    line 593, in sh_legendre
        x,w,mu0 = ps_roots(n,mu=1)
      File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", 
    line 584, in ps_roots
        return js_roots(n,1.0,1.0,mu=mu)
      File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", 
    line 205, in js_roots
        val = gen_roots_and_weights(n,an_Js,sbn_Js,mu0)
      File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", 
    line 121, in gen_roots_and_weights
        eig = get_eig_func()
      File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", 
    line 91, in get_eig_func
        eig = scipy.linalg.eig
    AttributeError: 'module' object has no attribute 'eig'
    
    ----------------------------------------------------------------------
    Ran 855 tests in 2.121s
    
    FAILED (errors=15)
    
    
    Nils
    
    
     
    
    
    
    
    From pearu at scipy.org  Tue Nov 23 03:55:02 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Tue, 23 Nov 2004 02:55:02 -0600 (CST)
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: <41A2EECF.5080806@mecha.uni-stuttgart.de>
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    	
    	<41A2EECF.5080806@mecha.uni-stuttgart.de>
    Message-ID: 
    
    
    
    On Tue, 23 Nov 2004, Nils Wagner wrote:
    
    > Now the installation works fine, but scipy.test() failed
    >
    > snip
    >
    > ****************************************************************
    > WARNING: clapack module is empty
    > -----------
    > See scipy/INSTALL.txt for troubleshooting.
    > Notes:
    > * If atlas library is not found by scipy/system_info.py,
    > then scipy uses flapack instead of clapack.
    > ****************************************************************
    >
    > snip
    > ======================================================================
    > ERROR: check_sh_legendre (scipy.special.basic.test_basic.test_sh_legendre)
    > ----------------------------------------------------------------------
    > Traceback (most recent call last):
    > File "/usr/lib/python2.3/site-packages/scipy/special/tests/test_basic.py", 
    > line 1806, in check_sh_legendre
    >   Ps1 = sh_legendre(1)
    > File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", line 
    > 593, in sh_legendre
    >   x,w,mu0 = ps_roots(n,mu=1)
    > File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", line 
    > 584, in ps_roots
    >   return js_roots(n,1.0,1.0,mu=mu)
    > File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", line 
    > 205, in js_roots
    >   val = gen_roots_and_weights(n,an_Js,sbn_Js,mu0)
    > File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", line 
    > 121, in gen_roots_and_weights
    >   eig = get_eig_func()
    > File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", line 
    > 91, in get_eig_func
    >   eig = scipy.linalg.eig
    > AttributeError: 'module' object has no attribute 'eig'
    >
    > ----------------------------------------------------------------------
    > Ran 855 tests in 2.121s
    >
    > FAILED (errors=15)
    > 
    
    Are you sure that you included all of the stderr messages here?
    
I just checked: all CVS scipy tests pass OK when using Fortran BLAS/LAPACK
libraries.

What is the output of `python system_info.py blas_opt lapack_opt`? Is
`import scipy.linalg.flapack` successful?
    
    Pearu
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Tue Nov 23 04:02:53 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Tue, 23 Nov 2004 10:02:53 +0100
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: 
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    		<41A2EECF.5080806@mecha.uni-stuttgart.de>
    	
    Message-ID: <41A2FCBD.8080706@mecha.uni-stuttgart.de>
    
    Pearu Peterson wrote:
    
    >
    >
    > On Tue, 23 Nov 2004, Nils Wagner wrote:
    >
    >> Now the installation works fine, but scipy.test() failed
    >>
    >> snip
    >>
    >> ****************************************************************
    >> WARNING: clapack module is empty
    >> -----------
    >> See scipy/INSTALL.txt for troubleshooting.
    >> Notes:
    >> * If atlas library is not found by scipy/system_info.py,
    >> then scipy uses flapack instead of clapack.
    >> ****************************************************************
    >>
    >> snip
    >> ======================================================================
    >> ERROR: check_sh_legendre 
    >> (scipy.special.basic.test_basic.test_sh_legendre)
    >> ----------------------------------------------------------------------
    >> Traceback (most recent call last):
    >> File 
    >> "/usr/lib/python2.3/site-packages/scipy/special/tests/test_basic.py", 
    >> line 1806, in check_sh_legendre
    >>   Ps1 = sh_legendre(1)
    >> File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", 
    >> line 593, in sh_legendre
    >>   x,w,mu0 = ps_roots(n,mu=1)
    >> File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", 
    >> line 584, in ps_roots
    >>   return js_roots(n,1.0,1.0,mu=mu)
    >> File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", 
    >> line 205, in js_roots
    >>   val = gen_roots_and_weights(n,an_Js,sbn_Js,mu0)
    >> File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", 
    >> line 121, in gen_roots_and_weights
    >>   eig = get_eig_func()
    >> File "/usr/lib/python2.3/site-packages/scipy/special/orthogonal.py", 
    >> line 91, in get_eig_func
    >>   eig = scipy.linalg.eig
    >> AttributeError: 'module' object has no attribute 'eig'
    >>
    >> ----------------------------------------------------------------------
    >> Ran 855 tests in 2.121s
    >>
    >> FAILED (errors=15)
    >> 
    >
    >
    > Are you sure that you included all of the stderr messages here?
    >
    > I just checked, all CVS scipy tests pass ok when using Fortran 
    > BLAS/LAPACK libraries.
    >
    > What is the output of `python system_info.py blas_opt lapack_opt`?
    
    
    blas_opt_info:
    atlas_blas_threads_info:
    Setting PTATLAS=ATLAS
    Disabled atlas_blas_threads_info (['PTATLAS', 'ATLAS'] is None)
      NOT AVAILABLE
    
    atlas_blas_info:
    Disabled atlas_blas_info (ATLAS is None)
      NOT AVAILABLE
    
    /var/tmp/cvs/scipy/scipy_core/scipy_distutils/system_info.py:1036: 
    UserWarning:
        Atlas (http://math-atlas.sourceforge.net/) libraries not found.
        Directories to search for the libraries can be specified in the
        scipy_distutils/site.cfg file (section [atlas]) or by setting
        the ATLAS environment variable.
      warnings.warn(AtlasNotFoundError.__doc__)
    blas_info:
      FOUND:
        libraries = ['blas']
        library_dirs = ['/usr/lib']
        language = f77
    
    ( library_dirs = /usr/local/lib:/usr/lib )
      FOUND:
        libraries = ['blas']
        library_dirs = ['/usr/lib']
        define_macros = [('NO_ATLAS_INFO', 1)]
        language = f77
    
    lapack_opt_info:
    atlas_threads_info:
    Setting PTATLAS=ATLAS
    Disabled atlas_threads_info (['PTATLAS', 'ATLAS'] is None)
    system_info.atlas_threads_info
      NOT AVAILABLE
    
    atlas_info:
    Disabled atlas_info (ATLAS is None)
    system_info.atlas_info
      NOT AVAILABLE
    
    /var/tmp/cvs/scipy/scipy_core/scipy_distutils/system_info.py:964: 
    UserWarning:
        Atlas (http://math-atlas.sourceforge.net/) libraries not found.
        Directories to search for the libraries can be specified in the
        scipy_distutils/site.cfg file (section [atlas]) or by setting
        the ATLAS environment variable.
      warnings.warn(AtlasNotFoundError.__doc__)
    lapack_info:
      FOUND:
        libraries = ['lapack']
        library_dirs = ['/usr/lib']
        language = f77
    
    ( library_dirs = /usr/local/lib:/usr/lib )
      FOUND:
        libraries = ['lapack', 'blas']
        library_dirs = ['/usr/lib']
        define_macros = [('NO_ATLAS_INFO', 1)]
        language = f77
    
    scipy_core/scipy_distutils>
    
    > Is `import scipy.linalg.flapack` succesful?
    
    No
    
    Python 2.3.3 (#1, Apr  6 2004, 01:47:39)
    [GCC 3.3.3 (SuSE Linux)] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
     >>> import scipy.linalg.flapack
    Traceback (most recent call last):
      File "", line 1, in ?
    ImportError: No module named flapack
    
    Nils
    
    > Pearu
    >
    
    
    
     
    
    
    
    From pearu at scipy.org  Tue Nov 23 04:31:33 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Tue, 23 Nov 2004 03:31:33 -0600 (CST)
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: <41A2FCBD.8080706@mecha.uni-stuttgart.de>
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    	
    	
    	<41A2FCBD.8080706@mecha.uni-stuttgart.de>
    Message-ID: 
    
    
    
    On Tue, 23 Nov 2004, Nils Wagner wrote:
    
    >> What is the output of `python system_info.py blas_opt lapack_opt`?
    >
    >
    > blas_opt_info:
    > FOUND:
    >   libraries = ['blas']
    >   library_dirs = ['/usr/lib']
    >   define_macros = [('NO_ATLAS_INFO', 1)]
    >   language = f77
    
So, scipy was not built against the blas/lapack libraries in
/home/nwagner/src/{blas,lapack} but against the ones from /usr/lib
(which probably require the atlas libraries). Please read my previous
messages again and carefully apply the given instructions (soon I'll
stop responding if there is no progress when there should be).
    
    >> Is `import scipy.linalg.flapack` succesful?
    >
    > No
    >
    > Python 2.3.3 (#1, Apr  6 2004, 01:47:39)
    > [GCC 3.3.3 (SuSE Linux)] on linux2
    > Type "help", "copyright", "credits" or "license" for more information.
    >>>> import scipy.linalg.flapack
    > Traceback (most recent call last):
    > File "", line 1, in ?
    > ImportError: No module named flapack
    
So, there must have been some failures when building the flapack extension
module. Check the messages from the scipy build process.
    
    Btw, why do you want to build scipy against Fortran BLAS/LAPACK in the 
    first place? It is not needed if you only want to build symeig against 
    Fortran BLAS/LAPACK.
    
    Pearu
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Tue Nov 23 04:44:35 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Tue, 23 Nov 2004 10:44:35 +0100
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: 
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    		
    	<41A2FCBD.8080706@mecha.uni-stuttgart.de>
    	
    Message-ID: <41A30683.8000501@mecha.uni-stuttgart.de>
    
    Pearu Peterson wrote:
    
    >
    >
    > On Tue, 23 Nov 2004, Nils Wagner wrote:
    >
    >>> What is the output of `python system_info.py blas_opt lapack_opt`?
    >>
    >>
    >>
    >> blas_opt_info:
    >> FOUND:
    >>   libraries = ['blas']
    >>   library_dirs = ['/usr/lib']
    >>   define_macros = [('NO_ATLAS_INFO', 1)]
    >>   language = f77
    >
    >
    > So, scipy was not built against blas/lapack libraries in
    > /home/nwagner/src/{blas,lapack} but was built against the ones from 
    > /usr/lib (and that probably require atlas libraries). Please, read again
    > my previous messages and carefully apply the given instructions (soon 
    > I'll stop responding if there is no progress when there should be).
    >
    >>> Is `import scipy.linalg.flapack` succesful?
    >>
    >>
    >> No
    >>
    >> Python 2.3.3 (#1, Apr  6 2004, 01:47:39)
    >> [GCC 3.3.3 (SuSE Linux)] on linux2
    >> Type "help", "copyright", "credits" or "license" for more information.
    >>
    >>>>> import scipy.linalg.flapack
    >>>>
    >> Traceback (most recent call last):
    >> File "", line 1, in ?
    >> ImportError: No module named flapack
    >
    >
    > So, there must have some failures when building flapack extension module.
    > Check the messages from scipy build process.
    >
    > Btw, why do you want to build scipy against Fortran BLAS/LAPACK in the 
    > first place? It is not needed if you only want to build symeig against 
    > Fortran BLAS/LAPACK.
    >
    Sorry, I wasn't aware of this fact.
    
    > Pearu
    >
    
    
    Pearu,
    
    I have set the environment variables according to your last e-mail.
    
    /home/nwagner> echo $BLAS_SRC
    /home/nwagner/src/blas
    /home/nwagner> echo $LAPACK_SRC
    /home/nwagner/src/lapack
    /home/nwagner> echo $ATLAS
    None
    
However, the libraries libblas.a and liblapack.a already available in /usr/lib
are used instead of the Fortran BLAS/LAPACK sources. How can I circumvent this?
    
    Nils
     
    
    
    
    
    From pearu at scipy.org  Tue Nov 23 04:51:04 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Tue, 23 Nov 2004 03:51:04 -0600 (CST)
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: <41A30683.8000501@mecha.uni-stuttgart.de>
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    	
    	<41A2FCBD.8080706@mecha.uni-stuttgart.de>
    	<41A30683.8000501@mecha.uni-stuttgart.de>
    Message-ID: 
    
    
    
    On Tue, 23 Nov 2004, Nils Wagner wrote:
    
    > I have set the environment variables according to your last e-mail.
    >
    > /home/nwagner> echo $BLAS_SRC
    > /home/nwagner/src/blas
    > /home/nwagner> echo $LAPACK_SRC
    > /home/nwagner/src/lapack
    > /home/nwagner> echo $ATLAS
    > None
    >
    > However, the libraries libblas.a and liblapack.a already available in 
    > /usr/lib are used instead of FORTRAN BLAS/LAPACK. How can I circumvent this ?
    
    Set also
    
    BLAS=None
    LAPACK=None
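
Put together, a minimal sketch of the environment for this source-build
route (reusing the paths already quoted in this thread; the LAPACK_SRC
value is the one that worked earlier) would be:

   import os
   # Disable ATLAS and the /usr/lib libraries, and point *_SRC at the
   # unpacked netlib sources (paths taken from earlier in this thread).
   os.environ['ATLAS'] = 'None'
   os.environ['BLAS'] = 'None'
   os.environ['LAPACK'] = 'None'
   os.environ['BLAS_SRC'] = '/home/nwagner/src/blas'
   os.environ['LAPACK_SRC'] = '/home/nwagner/src/lapack/LAPACK/SRC'

   from scipy_distutils.system_info import get_info
   # With the libraries disabled, lapack_opt should fall back to building
   # from the lapack_src sources instead of using /usr/lib.
   print get_info('lapack_opt')

The same variables of course have to be exported in the shell (e.g. with
setenv) before running `python setup.py build`.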
    
    Pearu
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Tue Nov 23 05:44:25 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Tue, 23 Nov 2004 11:44:25 +0100
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: 
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    		<41A2FCBD.8080706@mecha.uni-stuttgart.de>
    	<41A30683.8000501@mecha.uni-stuttgart.de>
    	
    Message-ID: <41A31489.8060809@mecha.uni-stuttgart.de>
    
    Pearu Peterson wrote:
    
    >
    >
    > On Tue, 23 Nov 2004, Nils Wagner wrote:
    >
    >> I have set the environment variables according to your last e-mail.
    >>
    >> /home/nwagner> echo $BLAS_SRC
    >> /home/nwagner/src/blas
    >> /home/nwagner> echo $LAPACK_SRC
    >> /home/nwagner/src/lapack
    >> /home/nwagner> echo $ATLAS
    >> None
    >>
    >> However, the libraries libblas.a and liblapack.a already available in 
    >> /usr/lib are used instead of FORTRAN BLAS/LAPACK. How can I 
    >> circumvent this ?
    >
    >
    > Set also
    >
    > BLAS=None
    > LAPACK=None
    >
    > Pearu
    >
    Pearu,
    
I have also set the additional environment variables BLAS=None and
LAPACK=None, but now the build fails as follows:
    
    numpy_info:
      FOUND:
        define_macros = [('NUMERIC_VERSION', '"\\"23.6\\""')]
        include_dirs = ['/usr/include/python2.3']
    
    lapack_opt_info:
    atlas_threads_info:
    Setting PTATLAS=ATLAS
    Disabled atlas_threads_info (['PTATLAS', 'ATLAS'] is None)
    scipy_distutils.system_info.atlas_threads_info
      NOT AVAILABLE
    
    atlas_info:
    Disabled atlas_info (ATLAS is None)
    scipy_distutils.system_info.atlas_info
      NOT AVAILABLE
    
    scipy_core/scipy_distutils/system_info.py:964: UserWarning:
        Atlas (http://math-atlas.sourceforge.net/) libraries not found.
        Directories to search for the libraries can be specified in the
        scipy_distutils/site.cfg file (section [atlas]) or by setting
        the ATLAS environment variable.
      warnings.warn(AtlasNotFoundError.__doc__)
    lapack_info:
    Disabled lapack_info (LAPACK is None)
      NOT AVAILABLE
    
    scipy_core/scipy_distutils/system_info.py:975: UserWarning:
        Lapack (http://www.netlib.org/lapack/) libraries not found.
        Directories to search for the libraries can be specified in the
        scipy_distutils/site.cfg file (section [lapack]) or by setting
        the LAPACK environment variable.
      warnings.warn(LapackNotFoundError.__doc__)
    lapack_src_info:
      NOT AVAILABLE
    
    scipy_core/scipy_distutils/system_info.py:978: UserWarning:
        Lapack (http://www.netlib.org/lapack/) sources not found.
        Directories to search for the sources can be specified in the
        scipy_distutils/site.cfg file (section [lapack_src]) or by setting
        the LAPACK_SRC environment variable.
      warnings.warn(LapackSrcNotFoundError.__doc__)
    Traceback (most recent call last):
      File "setup.py", line 112, in ?
        setup_package(ignore_packages)
      File "setup.py", line 86, in setup_package
        ignore_packages = ignore_packages)
      File "scipy_core/scipy_distutils/misc_util.py", line 475, in 
    get_subpackages
        config = setup_module.configuration(*args)
      File "/var/tmp/cvs/scipy/Lib/lib/setup_lib.py", line 14, in configuration
        parent_path=parent_path)
      File "scipy_core/scipy_distutils/misc_util.py", line 475, in 
    get_subpackages
        config = setup_module.configuration(*args)
      File "Lib/lib/lapack/setup_lapack.py", line 40, in configuration
        lapack_opt = get_info('lapack_opt',notfound_action=2)
      File "scipy_core/scipy_distutils/system_info.py", line 192, in get_info
        return cl().get_info(notfound_action)
      File "scipy_core/scipy_distutils/system_info.py", line 340, in get_info
        raise self.notfounderror,self.notfounderror.__doc__
    scipy_distutils.system_info.NotFoundError: Some third-party program or 
    library is not found.
    
    but
    
    lisa:/var/tmp/cvs/scipy # echo $LAPACK_SRC
    /home/nwagner/src/lapack
    lisa:/var/tmp/cvs/scipy # echo $BLAS_SRC
    /home/nwagner/src/blas
    lisa:/var/tmp/cvs/scipy # echo $BLAS
    None
    lisa:/var/tmp/cvs/scipy # echo $LAPACK
    None
    
    Nils
    
    
    
    
     
    
    
    
    From Giovanni.Samaey at cs.kuleuven.ac.be  Tue Nov 23 05:56:50 2004
    From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey)
    Date: Tue, 23 Nov 2004 11:56:50 +0100
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    Message-ID: <41A31772.3040201@cs.kuleuven.ac.be>
    
    Hi again,
    
after all my trouble (reported earlier) I finally got a completed scipy build.
However, I got errors in scipy.test(), which I presumed were due to
incorrect atlas libraries.
So I decided to compile those myself, following the instructions on
www.scipy.org.

I had to make several modifications, namely adding -fPIC and -m64 as
options to the makefiles for blas and lapack.  (Otherwise, linking
failed during the scipy build.)

Running scipy.test(), I get the errors that are shown in the attached
file_atlas.txt.  I also decided to give it a try without atlas (which in
my case worked without any difficulty), using the blas and lapack
routines that I compiled myself and that are used to build atlas.  As
you will notice, I get one error and one failure fewer in this case
(see the attached file_noatlas.txt).

I am using cvs scipy and atlas 3.6.0.  Everything is compiled using the
gnu compilers with the default options, to which -fPIC and -m64 are
added for lapack and blas.
I googled and looked through this mailing list, but did not find
anything directly applicable.
(If I overlooked it, I apologize in advance.)
    
    Best,  Giovanni
    
    -- 
    Giovanni Samaey		 	http://www.cs.kuleuven.ac.be/~giovanni/ 
    Katholieke Universiteit Leuven 	      email: giovanni at cs.kuleuven.ac.be 
    Departement Computerwetenschappen                  phone: +32-16-327081
    Celestijnenlaan 200A, B-3001 Heverlee, Belgium       fax: +32-16-327996
    Office: A04.36
    
    
    -------------- next part --------------
    An embedded and charset-unspecified text was scrubbed...
    Name: file_atlas.txt
    URL: 
    -------------- next part --------------
    An embedded and charset-unspecified text was scrubbed...
    Name: file_noatlas.txt
    URL: 
    
    From pearu at scipy.org  Tue Nov 23 06:00:35 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Tue, 23 Nov 2004 05:00:35 -0600 (CST)
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: <41A31489.8060809@mecha.uni-stuttgart.de>
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    	
    	<41A30683.8000501@mecha.uni-stuttgart.de>
    	<41A31489.8060809@mecha.uni-stuttgart.de>
    Message-ID: 
    
    
    
    On Tue, 23 Nov 2004, Nils Wagner wrote:
    
    > I have also set the additional environment variables BLAS=None LAPACK=None
    >
    > scipy_core/scipy_distutils/system_info.py:978: UserWarning:
    >   Lapack (http://www.netlib.org/lapack/) sources not found.
    >   Directories to search for the sources can be specified in the
    >   scipy_distutils/site.cfg file (section [lapack_src]) or by setting
    >   the LAPACK_SRC environment variable.
    > warnings.warn(LapackSrcNotFoundError.__doc__)
    > Traceback (most recent call last):
    > File "setup.py", line 112, in ?
    >   setup_package(ignore_packages)
    > File "setup.py", line 86, in setup_package
    >   ignore_packages = ignore_packages)
    > File "scipy_core/scipy_distutils/misc_util.py", line 475, in get_subpackages
    >   config = setup_module.configuration(*args)
    > File "/var/tmp/cvs/scipy/Lib/lib/setup_lib.py", line 14, in configuration
    >   parent_path=parent_path)
    > File "scipy_core/scipy_distutils/misc_util.py", line 475, in get_subpackages
    >   config = setup_module.configuration(*args)
    > File "Lib/lib/lapack/setup_lapack.py", line 40, in configuration
    >   lapack_opt = get_info('lapack_opt',notfound_action=2)
    > File "scipy_core/scipy_distutils/system_info.py", line 192, in get_info
    >   return cl().get_info(notfound_action)
    > File "scipy_core/scipy_distutils/system_info.py", line 340, in get_info
    >   raise self.notfounderror,self.notfounderror.__doc__
    > scipy_distutils.system_info.NotFoundError: Some third-party program or 
    > library is not found.
    >
    > but
    >
    > lisa:/var/tmp/cvs/scipy # echo $LAPACK_SRC
    > /home/nwagner/src/lapack
    
Do you have anything in /home/nwagner/src/lapack? Maybe you should have

   LAPACK_SRC=/home/nwagner/src/LAPACK

instead? If so, that should solve the problem. You should not follow the
docs literally when it is obvious that the docs are wrong.
    
    Pearu
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Tue Nov 23 06:28:10 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Tue, 23 Nov 2004 12:28:10 +0100
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: 
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    		<41A30683.8000501@mecha.uni-stuttgart.de>
    	<41A31489.8060809@mecha.uni-stuttgart.de>
    	
    Message-ID: <41A31ECA.6090407@mecha.uni-stuttgart.de>
    
    Pearu Peterson wrote:
    
    >
    >
    > On Tue, 23 Nov 2004, Nils Wagner wrote:
    >
    >> I have also set the additional environment variables BLAS=None 
    >> LAPACK=None
    >>
    >> scipy_core/scipy_distutils/system_info.py:978: UserWarning:
    >>   Lapack (http://www.netlib.org/lapack/) sources not found.
    >>   Directories to search for the sources can be specified in the
    >>   scipy_distutils/site.cfg file (section [lapack_src]) or by setting
    >>   the LAPACK_SRC environment variable.
    >> warnings.warn(LapackSrcNotFoundError.__doc__)
    >> Traceback (most recent call last):
    >> File "setup.py", line 112, in ?
    >>   setup_package(ignore_packages)
    >> File "setup.py", line 86, in setup_package
    >>   ignore_packages = ignore_packages)
    >> File "scipy_core/scipy_distutils/misc_util.py", line 475, in 
    >> get_subpackages
    >>   config = setup_module.configuration(*args)
    >> File "/var/tmp/cvs/scipy/Lib/lib/setup_lib.py", line 14, in 
    >> configuration
    >>   parent_path=parent_path)
    >> File "scipy_core/scipy_distutils/misc_util.py", line 475, in 
    >> get_subpackages
    >>   config = setup_module.configuration(*args)
    >> File "Lib/lib/lapack/setup_lapack.py", line 40, in configuration
    >>   lapack_opt = get_info('lapack_opt',notfound_action=2)
    >> File "scipy_core/scipy_distutils/system_info.py", line 192, in get_info
    >>   return cl().get_info(notfound_action)
    >> File "scipy_core/scipy_distutils/system_info.py", line 340, in get_info
    >>   raise self.notfounderror,self.notfounderror.__doc__
    >> scipy_distutils.system_info.NotFoundError: Some third-party program 
    >> or library is not found.
    >>
    >> but
    >>
    >> lisa:/var/tmp/cvs/scipy # echo $LAPACK_SRC
    >> /home/nwagner/src/lapack
    >
    >
    > Do you have anything in /home/nwagner/src/lapack? May be you should have
    >
    >   LAPACK_SRC=/home/nwagner/src/LAPACK
    >
    > ? If so then it should solve the problem. You should not follow docs 
    > literately when it is obvious that docs are wrong.
    >
    > Pearu
    >
    
The gzipped tarball lapack.tgz from netlib contains a subdirectory
LAPACK/SRC. The tarball is located in ~/src/lapack.

So I have used

   setenv LAPACK_SRC ~/src/lapack
    
    Nils
    
    
    
     
    
    
    
    
    From stephen.walton at csun.edu  Tue Nov 23 07:37:33 2004
    From: stephen.walton at csun.edu (Stephen Walton)
    Date: Tue, 23 Nov 2004 04:37:33 -0800
    Subject: [SciPy-user] fmin
    In-Reply-To: 
    References: 
    Message-ID: <1101213453.3370.2.camel@localhost.localdomain>
    
On Mon, 2004-11-22 at 20:57 +0100, Reik Börger wrote:
    
    > Now, I additionally provide the gradient of the objective function and 
    > use optimize.fmin_bfgs. But when I run the program, the optimizer 
    > always stops after 2 or 3 iterations and returns a value close to my 
    > starting value as the optimal solution - doesn't matter, where I start, 
    > it is always very close to the starting value.
    
    I am only asking this because I've made the same mistake myself many
    times:  how sure are you that your analytic gradient is correct?  I
    would write a test to compare the analytic gradient against a
    numerically computed one before doing anything else.
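A minimal sketch of such a test (f and grad below are just stand-ins for
your own objective function and analytic gradient):

   # Compare an analytic gradient against central finite differences.
   def f(x):
       # placeholder 4-variable objective
       return (x[0] - 1.0)**2 + (x[1] + 2.0)**2 + x[2]**2 + x[3]**2

   def grad(x):
       # placeholder analytic gradient of f
       return [2.0*(x[0] - 1.0), 2.0*(x[1] + 2.0), 2.0*x[2], 2.0*x[3]]

   def check_gradient(f, grad, x, h=1e-6):
       g = grad(x)
       for i in range(len(x)):
           xp = list(x); xp[i] += h
           xm = list(x); xm[i] -= h
           num = (f(xp) - f(xm)) / (2.0*h)   # central difference
           print i, g[i], num, abs(g[i] - num)

   check_gradient(f, grad, [0.3, -0.7, 1.5, 0.1])

If the analytic and numerical columns disagree by much more than the
finite-difference error, a wrong gradient is one common way to end up
with exactly that kind of precision-loss warning.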
    
    Moreover, if the function you're optimizing is itself a Fortran or C
    routine with a complicated or even non-analytic gradient, you might look
    at the automatic differentiators out there:
    http://www-unix.mcs.anl.gov/autodiff/AD_Tools/
    
    -- 
    Stephen Walton 
    Dept. of Physics & Astronomy, CSU Northridge
    
    From rkern at ucsd.edu  Tue Nov 23 10:37:01 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 07:37:01 -0800
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: <41A31ECA.6090407@mecha.uni-stuttgart.de>
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    		<41A30683.8000501@mecha.uni-stuttgart.de>
    	<41A31489.8060809@mecha.uni-stuttgart.de>
    	
    	<41A31ECA.6090407@mecha.uni-stuttgart.de>
    Message-ID: <41A3591D.1080305@ucsd.edu>
    
    Nils Wagner wrote:
    
    > The gzipped tarball lapack.tgz from netlib contains a subdirectory 
    > LAPACK/SRC
    > The tarball is located in ~/src/lapack
    > 
    > So I have used
    > setenv LAPACK_SRC ~/src/lapack
    
    setup.py won't unpack the tarball for you. The instructions ask that 
    LAPACK_SRC point to the directory that has the LAPACK source code, not 
    the directory that has the tarball of the source code.
    
    You should be able to unpack the tarball where it is, and things should 
    work.
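
For example, a quick sanity check along these lines (just a sketch; dgesv.f
is one representative file from the unpacked LAPACK/SRC directory):

   import os
   # LAPACK_SRC should point at the directory with the Fortran sources,
   # e.g. the .../lapack/LAPACK/SRC directory that worked earlier in this
   # thread, not at the directory that merely holds lapack.tgz.
   src = os.environ.get('LAPACK_SRC', '')
   print src
   print os.path.isfile(os.path.join(src, 'dgesv.f'))

If that prints False, the source detection is unlikely to find them either.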
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Tue Nov 23 10:46:29 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Tue, 23 Nov 2004 16:46:29 +0100
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: <41A3591D.1080305@ucsd.edu>
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    		<41A30683.8000501@mecha.uni-stuttgart.de>
    	<41A31489.8060809@mecha.uni-stuttgart.de>
    	
    	<41A31ECA.6090407@mecha.uni-stuttgart.de> <41A3591D.1080305@ucsd.edu>
    Message-ID: <41A35B55.3030403@mecha.uni-stuttgart.de>
    
    Robert Kern wrote:
    
    > Nils Wagner wrote:
    >
    >> The gzipped tarball lapack.tgz from netlib contains a subdirectory 
    >> LAPACK/SRC
    >> The tarball is located in ~/src/lapack
    >>
    >> So I have used
    >> setenv LAPACK_SRC ~/src/lapack
    >
    >
    > setup.py won't unpack the tarball for you. The instructions ask that 
    > LAPACK_SRC point to the directory that has the LAPACK source code, not 
    > the directory that has the tarball of the source code.
    >
Sorry for my misleading explanation. Of course I have unpacked the tarball!
    But it doesn't work.
    
    Nils
    
    > You should be able to unpack the tarball where it is, and things 
    > should work.
    >
     
    
    
    
    From rkern at ucsd.edu  Tue Nov 23 10:56:36 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 07:56:36 -0800
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: <41A35B55.3030403@mecha.uni-stuttgart.de>
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    		<41A30683.8000501@mecha.uni-stuttgart.de>
    	<41A31489.8060809@mecha.uni-stuttgart.de>
    	
    	<41A31ECA.6090407@mecha.uni-stuttgart.de> <41A3591D.1080305@ucsd.edu>
    	<41A35B55.3030403@mecha.uni-stuttgart.de>
    Message-ID: <41A35DB4.1030306@ucsd.edu>
    
    Nils Wagner wrote:
    
    > Sorry for my misleading explanation. Of course I have unpacked the 
    > tarball !
    > But it doesn't work.
    
    Okay. Look at the method lapack_src_info.calc_info() in the file 
    scipy_core/scipy_distutils/system_info.py and modify it to try to debug 
    why it can't find the files.
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From rkern at ucsd.edu  Tue Nov 23 12:39:40 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 09:39:40 -0800
    Subject: [SciPy-user] Enthon for the Mac
    Message-ID: <41A375DC.3090800@ucsd.edu>
    
    I'm gearing up to make binary packages for Scipy et al. for Mac OS X. I 
    hope that these might be considered collectively as Enthought-Edition 
    Python ("Enthon") for the Mac.
    
    So, Mac users, what do you want to see in Mac Enthon? My current plan is 
    to track CVS/SVN for some of the packages, certainly scipy. As I update 
    packages, Mac Enthon users will be able to download them individually 
    and install them. I can only support Panther right now; when I get 
    Tiger, I'll support both Panther and Tiger. I may also make a Python 2.4 
    version. The packages will be the standard Mac double-click-to-install 
    packages with an Enthon meta-package for convenient install of all 
    packages together.
    
    What packages would you like to see? My current list:
    
    Scipy - CVS
    Numeric - stable
    numarray - stable?
    ScientificPython - stable
    PIL - stable
    readline - stable
    IPython - stable
wxPython - latest development version
    PyObjC - SVN
    matplotlib - CVS?
    Tkinter - stable
    Pmw - stable
    Kiva    \
    Chaco   |- SVN versions?
    Traits  /
    PyOpenGL - stable
    f2py - CVS
    Mayavi - stable
    VTK - stable
    PyX - latest
    PySQLite - last stable, and CVS once the new version is usable
    SWIG - latest
    Pyrex - latest
    Twisted - stable
    PyTables - latest
    SCons - stable
    ReportLab - stable
    PyProtocols - latest
    PyChecker - stable
    mx.TextTools - stable
    mx.DateTime - stable
    odr - whenever I get around to updating the bloody thing
    PyMC - latest
    MDP - latest
    ZODB - stable
    PyXML - stable
    BioPython?
    
    I'll put this somewhere on the Wiki later today.
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From prabhu_r at users.sf.net  Tue Nov 23 13:07:16 2004
    From: prabhu_r at users.sf.net (Prabhu Ramachandran)
    Date: Tue, 23 Nov 2004 23:37:16 +0530
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A375DC.3090800@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    Message-ID: <16803.31828.363535.815026@monster.linux.in>
    
    >>>>> "RK" == Robert Kern  writes:
    
        RK> I'm gearing up to make binary packages for Scipy et al. for
        RK> Mac OS X. I hope that these might be considered collectively
        RK> as Enthought-Edition Python ("Enthon") for the Mac.
    
    I'm not a Mac user but here is a data point.
    
        RK> Mayavi - stable 
    
You should really use CVS.  There were a few bugs that were fixed in
the last few months.  If CVS is a pain, grab the last CVS snapshot:

 http://vvikram.com/~prabhu/download/mayavi/

It's old, but it fixes bad bugs in the volume mapper code.

The only reason there isn't a new MayaVi release is that I don't have
the time to make the Windows installer.  No one has stepped forward to
help with that, so until I make the time to upgrade my machines,
install Windows and so on, and build the installer, CVS is your best
bet.
    
    cheers,
    prabhu
    
    
    
    From jdhunter at ace.bsd.uchicago.edu  Tue Nov 23 13:29:38 2004
    From: jdhunter at ace.bsd.uchicago.edu (John Hunter)
    Date: Tue, 23 Nov 2004 12:29:38 -0600
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A375DC.3090800@ucsd.edu> (Robert Kern's message of "Tue, 23
    	Nov 2004 09:39:40 -0800")
    References: <41A375DC.3090800@ucsd.edu>
    Message-ID: 
    
    >>>>> "Robert" == Robert Kern  writes:
    
        Robert> I'm gearing up to make binary packages for Scipy et
        Robert> al. for Mac OS X. I hope that these might be considered
        Robert> collectively as Enthought-Edition Python ("Enthon") for
        Robert> the Mac.
    
    This will be great!  I'll be happy to test...
    
      VTK
    
    For VTK I'd like to see some of the optional flags turned on (I
    suspect from former communications Prabhu would concur on many of
    these)
    
      Patented : On
      Hybrid   : On
      GL2PS    : On
    
    I don't know how you feel about turning patented on - I think it's a
    good idea because you need it to do any iso-surface reconstruction
    with marching cubes, and lots of useful stuff like the decimate
    filters are in there.
    
    JDH
    
    
    
    From prabhu_r at users.sf.net  Tue Nov 23 14:00:53 2004
    From: prabhu_r at users.sf.net (Prabhu Ramachandran)
    Date: Wed, 24 Nov 2004 00:30:53 +0530
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: 
    References: <41A375DC.3090800@ucsd.edu>
    	
    Message-ID: <16803.35045.101333.635808@monster.linux.in>
    
    >>>>> "JH" == John Hunter  writes:
    
        JH>   VTK
        JH> For VTK I'd like to see some of the optional flags turned on
        JH> (I suspect from former communications Prabhu would concur on
        JH> many of these)
    
    JH>   Patented : On, Hybrid : On, GL2PS : On

Yes, Patented is up to Robert, but the other two are *very* useful.
    
    cheers,
    prabhu
    
    
    
    From falted at pytables.org  Tue Nov 23 14:25:41 2004
    From: falted at pytables.org (Francesc Altet)
    Date: Tue, 23 Nov 2004 20:25:41 +0100
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A375DC.3090800@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    Message-ID: <200411232025.46749.falted@pytables.org>
    
On Tuesday 23 November 2004 18:39, Robert Kern wrote:
    > I'm gearing up to make binary packages for Scipy et al. for Mac OS X. I 
    > hope that these might be considered collectively as Enthought-Edition 
    > Python ("Enthon") for the Mac.
    
Great. The complete list seems like quite a large job, but after checking
the good packages on it, I think it would be really cool.
    
    > PyTables - latest
    
Ok. I'm about to release 0.9.1, which fixes some little bugs in 0.9 and
adds a few improvements. I'll most probably do that this week or next,
depending on my workload. Just for your information.
    
Good luck!
    
    -- 
    Francesc Altet
    
    
    
    From joe at enthought.com  Tue Nov 23 15:16:27 2004
    From: joe at enthought.com (Joe Cooper)
    Date: Tue, 23 Nov 2004 14:16:27 -0600
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A375DC.3090800@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    Message-ID: <41A39A9B.3080507@enthought.com>
    
    Hey Robert,
    
    Perfect timing.  I'm in the midst of preparing a new Enthon for Windows 
    (and peripherally one for Fedora).  We should probably chat sometime 
    about making sure most things are roughly functionally 
    compatible/equivalent, so that folks writing tutorials and other docs 
    can just say "install Enthon for your OS".
    
    I had started in on the Enthon on OS X endeavor about six months ago, 
    but only a few packages ever came of it.  The packaging on Mac is quite 
abysmal, so there isn't much to be gained by handing off the packages I 
    did back then--you pretty much just have to build the software by hand, 
    and then push the pretty "Go" button and it all gets piled up together 
    into an archive.  I do have simple aap recipes for almost everything in 
    Enthon on Windows, which you might find useful in automating your 
    construction on Mac, since the concept is the same for the Wise 
    installer.  It's nicer to say "aap -vf scipy.aap" than to have to go 
    download all of the bits and pieces.
    
    On the Kiva, Chaco, Traits front, we'll want to rope Eric into a 
    conversation.  I think there are some outstanding issues with 
    integrating the SVN versions.
    
    One thing we're doing with the next version is to integrate IPython as 
    kind of the de facto interactive python.  There's a scipy config file 
    that works well and makes writing tutorials simpler, since it's possible 
    to preload all of the relevant modules.
    
    Robert Kern wrote:
    > I'm gearing up to make binary packages for Scipy et al. for Mac OS X. I 
    > hope that these might be considered collectively as Enthought-Edition 
    > Python ("Enthon") for the Mac.
    > 
    > So, Mac users, what do you want to see in Mac Enthon? My current plan is 
    > to track CVS/SVN for some of the packages, certainly scipy. As I update 
    > packages, Mac Enthon users will be able to download them individually 
    > and install them. I can only support Panther right now; when I get 
    > Tiger, I'll support both Panther and Tiger. I may also make a Python 2.4 
    > version. The packages will be the standard Mac double-click-to-install 
    > packages with an Enthon meta-package for convenient install of all 
    > packages together.
    > 
    > What packages would you like to see? My current list:
    > 
    > Scipy - CVS
    > Numeric - stable
    > numarray - stable?
    > ScientificPython - stable
    > PIL - stable
    > readline - stable
    > IPython - stable
    > wxPython - latest developement version
    > PyObjC - SVN
    > matplotlib - CVS?
    > Tkinter - stable
    > Pmw - stable
    > Kiva    \
    > Chaco   |- SVN versions?
    > Traits  /
    > PyOpenGL - stable
    > f2py - CVS
    > Mayavi - stable
    > VTK - stable
    > PyX - latest
    > PySQLite - last stable, and CVS once the new version is usable
    > SWIG - latest
    > Pyrex - latest
    > Twisted - stable
    > PyTables - latest
    > SCons - stable
    > ReportLab - stable
    > PyProtocols - latest
    > PyChecker - stable
    > mx.TextTools - stable
    > mx.DateTime - stable
    > odr - whenever I get around to updating the bloody thing
    > PyMC - latest
    > MDP - latest
    > ZODB - stable
    > PyXML - stable
    > BioPython?
    > 
    > I'll put this somewhere on the Wiki later today.
    > 
    
    
    
    From rkern at ucsd.edu  Tue Nov 23 14:38:12 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 11:38:12 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: 
    References: <41A375DC.3090800@ucsd.edu>
    	
    Message-ID: <41A391A4.5090500@ucsd.edu>
    
    John Hunter wrote:
    >>>>>>"Robert" == Robert Kern  writes:
    > 
    > 
    >     Robert> I'm gearing up to make binary packages for Scipy et
    >     Robert> al. for Mac OS X. I hope that these might be considered
    >     Robert> collectively as Enthought-Edition Python ("Enthon") for
    >     Robert> the Mac.
    > 
    > This will be great!  I'll be happy to test...
    > 
    >   VTK
    > 
    > For VTK I'd like to see some of the optional flags turned on (I
    > suspect from former communications Prabhu would concur on many of
    > these)
    > 
    >   Patented : On
    >   Hybrid   : On
    >   GL2PS    : On
    > 
    > I don't know how you feel about turning patented on - I think it's a
    > good idea because you need it to do any iso-surface reconstruction
    > with marching cubes, and lots of useful stuff like the decimate
    > filters are in there.
    
    I will not be turning Patented on. I posted instructions on how to build 
    VTK on the Mac to the PythonMac list. It's very straightforward and 
    hassle-free. I will point out the limitation, and include instructions 
    on how to build VTK with Patented for those individuals who know they 
    can legally use it.
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From Fernando.Perez at colorado.edu  Tue Nov 23 15:24:31 2004
    From: Fernando.Perez at colorado.edu (Fernando Perez)
    Date: Tue, 23 Nov 2004 13:24:31 -0700
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A39A9B.3080507@enthought.com>
    References: <41A375DC.3090800@ucsd.edu> <41A39A9B.3080507@enthought.com>
    Message-ID: <41A39C7F.9000602@colorado.edu>
    
Joe Cooper wrote:
    
    > One thing we're doing with the next version is to integrate IPython as 
    > kind of the de facto interactive python.  There's a scipy config file 
    > that works well and makes writing tutorials simpler, since it's possible 
    > to preload all of the relevant modules.
    
    I'm obviously very happy to see this in.  One comment: I was planning on using 
    this Thanksgiving weekend to push an ipython 0.6.5 out, including the 
    reactivation of the wx/gtk threading code thanks to Prabhu.  This is a fairly 
    significant enhancement, since apparently it allows users to get interactive 
    control of wx/gtk apps with a far lighter solution than gui_thread.  Is this 
    timing bad for you?  If it helps coordinate things, I can _promise_ to have it 
    out by Sunday night, or even Saturday if it makes a difference.
    
    As I _really_ expect this to be the 'end of the road' for ipython while I 
    rewrite it, it would be really nice to get this threading functionality 
    shipped out to all Enthon Win/OSX users.
    
    Cheers,
    
    f
    
    
    
    From Fernando.Perez at colorado.edu  Tue Nov 23 15:31:00 2004
    From: Fernando.Perez at colorado.edu (Fernando Perez)
    Date: Tue, 23 Nov 2004 13:31:00 -0700
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A391A4.5090500@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    	
    	<41A391A4.5090500@ucsd.edu>
    Message-ID: <41A39E04.7010107@colorado.edu>
    
Robert Kern wrote:
    
    > I will not be turning Patented on. I posted instructions on how to build 
    > VTK on the Mac to the PythonMac list. It's very straightforward and 
    > hassle-free. I will point out the limitation, and include instructions 
    > on how to build VTK with Patented for those individuals who know they 
    > can legally use it.
    
    I understand the reasons for this, and it's definitely not a good idea for 
enthought to distribute a full Enthon with patented stuff in.  But would it be
possible to put, next to Enthon (with Patented off), a download link to a single
VTK (with Patented on) package?  I ask this simply thinking of the "if it takes
more than 3 minutes and 2 clicks I won't install it" problem we've discussed
before.
    
    This would allow users who know they are OK with patented stuff to simply grab 
    both packages and then overwrite the default VTK with VTK_patented after 
    installing Enthon.
    
    There are two possible issues with this:
    
    1. it may be too much additional work for you.
    
    2. it may still be a source of legal concerns.  I would imagine that since 
    you'd only be distributing a compiled package, with a clear disclaimer that 
    its _usage_ must be done only with knowledge of the patent issues, it might be 
    OK.  But in the land of competition-by-lawyers, one can never know...
    
    Anyway, this is just an idea.  Ultimately if either of the above is a sticking 
    point, your solution of documenting the build process is a very reasonable 
    fallback.
    
    And for the record, I'm glad to see Enthon becoming available for both Win and 
    OSX, I think it will be a major accessibility gain!
    
    Cheers,
    
    f
    
    
    
    From rkern at ucsd.edu  Tue Nov 23 14:40:40 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 11:40:40 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <16803.31828.363535.815026@monster.linux.in>
    References: <41A375DC.3090800@ucsd.edu>
    	<16803.31828.363535.815026@monster.linux.in>
    Message-ID: <41A39238.5010607@ucsd.edu>
    
    Prabhu Ramachandran wrote:
    >>>>>>"RK" == Robert Kern  writes:
    > 
    > 
    >     RK> I'm gearing up to make binary packages for Scipy et al. for
    >     RK> Mac OS X. I hope that these might be considered collectively
    >     RK> as Enthought-Edition Python ("Enthon") for the Mac.
    > 
    > I'm not a Mac user but here is a data point.
    > 
    >     RK> Mayavi - stable 
    > 
    > You should really use CVS.  There were a few bugs that were fixed in
    > the last few months.  If CVS is a pain grab the last CVS snapshot 
    > 
    >  http://vvikram.com/~prabhu/download/mayavi/
    > 
> It's old, but it fixed bad bugs in the volume mapper code.
    > 
> The only reason there isn't a new MayaVi release is that I don't have
> the time to make the windows installer.  No one has stepped forward
> to help with that, so until I make the time to upgrade my machines,
> install Windows, blah blah, and build the installer, CVS is your best
> bet.
    
Okay. I track CVS in any case. I just assumed that, since MayaVi 
development was halting on that branch, the latest release would 
reflect the current state of CVS. Thanks for the heads-up.
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From joe at enthought.com  Tue Nov 23 15:50:15 2004
    From: joe at enthought.com (Joe Cooper)
    Date: Tue, 23 Nov 2004 14:50:15 -0600
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A39C7F.9000602@colorado.edu>
    References: <41A375DC.3090800@ucsd.edu> <41A39A9B.3080507@enthought.com>
    	<41A39C7F.9000602@colorado.edu>
    Message-ID: <41A3A287.9030309@enthought.com>
    
    Fernando Perez wrote:
    > Joe Cooper schrieb:
    > 
    >> One thing we're doing with the next version is to integrate IPython as 
    >> kind of the de facto interactive python.  There's a scipy config file 
    >> that works well and makes writing tutorials simpler, since it's 
    >> possible to preload all of the relevant modules.
    > 
    > 
    > I'm obviously very happy to see this in.  One comment: I was planning on 
    > using this Thanksgiving weekend to push an ipython 0.6.5 out, including 
    > the reactivation of the wx/gtk threading code thanks to Prabhu.  This is 
    > a fairly significant enhancement, since apparently it allows users to 
    > get interactive control of wx/gtk apps with a far lighter solution than 
    > gui_thread.  Is this timing bad for you?  If it helps coordinate things, 
    > I can _promise_ to have it out by Sunday night, or even Saturday if it 
    > makes a difference.
    
    Don't kill yourself to get it done, Fernando.
    
I plan to release the next Enthon around the end of next week, or 
    whenever it is ready.  The scary packages have been dealt with, so I 
    don't foresee any problems rolling a few late changes in, especially 
    with packages that are relatively painless, like IPython.
    
    > As I _really_ expect this to be the 'end of the road' for ipython while 
    > I rewrite it, it would be really nice to get this threading 
    > functionality shipped out to all Enthon Win/OSX users.
    
    I can wait until you've got a release of IPython you're happy with.
    
    
    
    From rkern at ucsd.edu  Tue Nov 23 15:53:43 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 12:53:43 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A39A9B.3080507@enthought.com>
    References: <41A375DC.3090800@ucsd.edu> <41A39A9B.3080507@enthought.com>
    Message-ID: <41A3A357.8050603@ucsd.edu>
    
    Joe Cooper wrote:
    > Hey Robert,
    > 
    > Perfect timing.  I'm in the midst of preparing a new Enthon for Windows 
    > (and peripherally one for Fedora).  We should probably chat sometime 
    > about making sure most things are roughly functionally 
    > compatible/equivalent, so that folks writing tutorials and other docs 
    > can just say "install Enthon for your OS".
    
    Okay. My needs diverge a bit from yours, I think. The push for me to do 
    this comes from the fact that people at my institution are beginning to 
    use Python and are heavily invested in the Mac platform. The people I'm 
    supporting with this will need recent CVS versions of some packages.
    
    For MacEnthon at least, it may be worth having Stable and Unstable 
    branches. I'm willing to cut a Stable MacEnthon that has versions which 
    match Enthon for Windows if such versions work on the Mac. wxPython may 
    be a big problem here; only the 2.5 series works acceptably well on the Mac.
    
    > I had started in on the Enthon on OS X endeavor about six months ago, 
    > but only a few packages ever came of it.  The packaging on Mac is quite 
> abysmal, so there isn't much to be gained by handing off the packages I 
    > did back then--you pretty much just have to build the software by hand, 
    > and then push the pretty "Go" button and it all gets piled up together 
    > into an archive.  I do have simple aap recipes for almost everything in 
    > Enthon on Windows, which you might find useful in automating your 
    > construction on Mac, since the concept is the same for the Wise 
    > installer.  It's nicer to say "aap -vf scipy.aap" than to have to go 
    > download all of the bits and pieces.
    
    aap as in www.a-a-p.org? I'll have to take a look at that.
    
    For distutils-using Python packages, Bob Ippolito has recently made a 
    bdist_mpkg target that puts everything into nice meta-packages[1]. 
    There's a bit of customization work that needs to be done to make 
    everything work as Enthon.mpkg.
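
For an individual package the invocation is about as simple as it gets -- 
a sketch, assuming bdist_mpkg is installed and the package has an ordinary 
distutils setup.py (the package name here is just a placeholder):

   cd SomePackage-1.0
   python setup.py bdist_mpkg     # should leave a .mpkg under dist/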
    
    There is currently a package manager for Python packages on the Mac 
    called PackMan[2]. It's not a particularly satisfying solution, and 
    bdist_mpkg is intended to be the base of a new system. I hope the 
    presence of MacEnthon will encourage development of the next generation 
    of PackMan.
    
    [1] http://svn.red-bean.com/bob/py2app/trunk
    [2] http://www.python.org/packman
    
    > On the Kiva, Chaco, Traits front, we'll want to rope Eric into a 
    > conversation.  I think there are some outstanding issues with 
    > integrating the SVN versions.
    
    Kiva at least is relatively stable, and the SVN now includes working Mac 
    support. The other two may be too unstable.
    
    > One thing we're doing with the next version is to integrate IPython as 
    > kind of the de facto interactive python.  There's a scipy config file 
    > that works well and makes writing tutorials simpler, since it's possible 
    > to preload all of the relevant modules.
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From Fernando.Perez at colorado.edu  Tue Nov 23 16:53:05 2004
    From: Fernando.Perez at colorado.edu (Fernando Perez)
    Date: Tue, 23 Nov 2004 14:53:05 -0700
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A3A287.9030309@enthought.com>
    References: <41A375DC.3090800@ucsd.edu> <41A39A9B.3080507@enthought.com>
    	<41A39C7F.9000602@colorado.edu> <41A3A287.9030309@enthought.com>
    Message-ID: <41A3B141.9090505@colorado.edu>
    
    Joe Cooper schrieb:
    
    >>I'm obviously very happy to see this in.  One comment: I was planning on 
    >>using this Thanksgiving weekend to push an ipython 0.6.5 out, including 
    >>the reactivation of the wx/gtk threading code thanks to Prabhu.  This is 
    >>a fairly significant enhancement, since apparently it allows users to 
    >>get interactive control of wx/gtk apps with a far lighter solution than 
    >>gui_thread.  Is this timing bad for you?  If it helps coordinate things, 
    >>I can _promise_ to have it out by Sunday night, or even Saturday if it 
    >>makes a difference.
    > 
    > 
    > Don't kill yourself to get it done, Fernando.
    
    OK.  At any rate, this weekend is about my only free window, so I'll try to 
    have it done by Monday.  I'll obviously post on the list, so you'll get notice 
    when it actually happens.
    
    Regards,
    
    f
    
    
    
    From bsder at mail.allcaps.org  Tue Nov 23 15:14:26 2004
    From: bsder at mail.allcaps.org (Andrew P. Lentvorski, Jr.)
    Date: Tue, 23 Nov 2004 12:14:26 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A375DC.3090800@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    Message-ID: <45CFAEE2-3D8C-11D9-84FD-000A95C874EE@mail.allcaps.org>
    
    
    On Nov 23, 2004, at 9:39 AM, Robert Kern wrote:
    
    > What packages would you like to see? My current list:
    >
> wxPython - latest development version
    
    If I remember correctly, there was enough API shift in wxPython that 
    programs which ran under wxPython 2.4.1.2 (included with the current 
    Enthought) broke when run under the latest development version.
    
    My memory is foggy on what it was, but I ran into it pretty quickly.  I 
    think it was something about event callbacks.
    
    > PyOpenGL - stable
    
    If you do this before PyOpenGL 2.0.2, you probably want the maint-2_0 
    branch in CVS.
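
Something along these lines should fetch it -- the cvs options are 
standard, but I'm quoting the SourceForge path and module name from 
memory, so double-check them on the PyOpenGL project page:

   cvs -d:pserver:anonymous@cvs.sourceforge.net:/cvsroot/pyopengl login
   cvs -d:pserver:anonymous@cvs.sourceforge.net:/cvsroot/pyopengl \
       checkout -r maint-2_0 PyOpenGL2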
    
    I'm probably going to get pilloried for this, but how about pygtk and 
    pygtkglext?  Is there a particular reason that these are excluded, or 
    is it just that nobody has stepped forward to do the work?
    
    -a
    
    
    
    From rkern at ucsd.edu  Tue Nov 23 17:31:48 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 14:31:48 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <45CFAEE2-3D8C-11D9-84FD-000A95C874EE@mail.allcaps.org>
    References: <41A375DC.3090800@ucsd.edu>
    	<45CFAEE2-3D8C-11D9-84FD-000A95C874EE@mail.allcaps.org>
    Message-ID: <41A3BA54.5000502@ucsd.edu>
    
    Andrew P. Lentvorski, Jr. wrote:
    > 
    > On Nov 23, 2004, at 9:39 AM, Robert Kern wrote:
    > 
    >> What packages would you like to see? My current list:
    >>
>> wxPython - latest development version
    > 
    > 
    > If I remember correctly, there was enough API shift in wxPython that 
    > programs which ran under wxPython 2.4.1.2 (included with the current 
    > Enthought) broke when run under the latest development version.
    > 
    > My memory is foggy on what it was, but I ran into it pretty quickly.  I 
    > think it was something about event callbacks.
    
    wxWidgets 2.4 is simply too broken on the Mac platform and all effort to 
    make wxWidgets viable on the Mac is being given to 2.5. FWIW, many of 
the packages that only work with wxWidgets 2.4 on other platforms won't 
    work at all on the Mac. I believe the latest wxPython 2.5 has added a 
    compatibility API, but I haven't tested it out, yet.
    
    >> PyOpenGL - stable
    > 
    > 
    > If you do this before PyOpenGL 2.0.2, you probably want the maint-2_0 
    > branch in CVS.
    
    Thanks!
    
    > I'm probably going to get pilloried for this, but how about pygtk and 
    > pygtkglext?  Is there a particular reason that these are excluded, or is 
    > it just that nobody has stepped forward to do the work?
    
    Has anyone compiled these for the Mac outside of 
    Fink/Gentoo/DarwinPorts? With wxPython and Tkinter, someone else has 
    done the work to make installation of the underlying libraries easy 
    enough that I can package them up. As far as I know, no one has done so 
    for GTK+ and I'm not willing to do so.
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From rkern at ucsd.edu  Tue Nov 23 17:54:19 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 14:54:19 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A39E04.7010107@colorado.edu>
    References: <41A375DC.3090800@ucsd.edu>
    		<41A391A4.5090500@ucsd.edu>
    	<41A39E04.7010107@colorado.edu>
    Message-ID: <41A3BF9B.2030702@ucsd.edu>
    
    Fernando Perez wrote:
    > Robert Kern schrieb:
    > 
    >> I will not be turning Patented on. I posted instructions on how to 
    >> build VTK on the Mac to the PythonMac list. It's very straightforward 
    >> and hassle-free. I will point out the limitation, and include 
    >> instructions on how to build VTK with Patented for those individuals 
    >> who know they can legally use it.
    > 
    > 
    > I understand the reasons for this, and it's definitely not a good idea 
    > for enthought to distribute a full enthon with patented stuff in.  But 
    > would it be possible to put a download link, besides enthon(with 
    > patented=off) to a single vtk(patented=on) package?  I ask this simply 
    > thinking of the "if it takes more than 3 minutes and 2 clicks I won't 
    > install it" problem we've discussed before.
    > 
    > This would allow users who know they are OK with patented stuff to 
    > simply grab both packages and then overwrite the default VTK with 
    > VTK_patented after installing Enthon.
    
    I'm definitely thinking that individual packages will be available to 
    install semi-independently of everything else. The issue with VTK 
    Patented is whether or not to provide an alternate VTK Patented package.
    
    > There are two possible issues with this:
    > 
    > 1. it may be too much additional work for you.
    > 
    > 2. it may still be a source of legal concerns.  I would imagine that 
    > since you'd only be distributing a compiled package, with a clear 
    > disclaimer that its _usage_ must be done only with knowledge of the 
    > patent issues, it might be OK.  But in the land of 
    > competition-by-lawyers, one can never know...
    
    I think that if someone won't install VTK Patented because "it takes 
    more than 3 minutes and 2 clicks", they don't need VTK Patented or won't 
    take the time to decide whether or not they have the right to use it.
    
    If they had a license statement for non-commercial users (and hopefully 
    some statement about whether or not academic or governmental research 
    counts as non-commercial application) that I could have users read 
    before installing, I'd be more amenable to including a VTK Patented 
    package. Unfortunately, the only statements they have are like the 
    following:
    
       """Application of this software for commercial purposes requires
          a license grant from GE."""
    
    My concern is not that myself or Enthought will be sued for 
    *distributing* VTK Patented; that's covered by copyright law, and the 
    copyright license permits redistribution.
    
    I'm much more concerned about people developing applications that depend 
    on the patented algorithms when they really shouldn't have without a 
    license. (Note: I don't really have any moral indignation about the 
    issue, I'm just not personally interested in enabling people to be 
    careless about IP.)
    
    > Anyway, this is just an idea.  Ultimately if either of the above is a 
    > sticking point, your solution of documenting the build process is a very 
    > reasonable fallback.
     >
    > And for the record, I'm glad to see Enthon becoming available for both 
    > Win and OSX, I think it will be a major accessibility gain!
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From jdhunter at ace.bsd.uchicago.edu  Tue Nov 23 17:54:11 2004
    From: jdhunter at ace.bsd.uchicago.edu (John Hunter)
    Date: Tue, 23 Nov 2004 16:54:11 -0600
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A3BA54.5000502@ucsd.edu> (Robert Kern's message of "Tue, 23
    	Nov 2004 14:31:48 -0800")
    References: <41A375DC.3090800@ucsd.edu>
    	<45CFAEE2-3D8C-11D9-84FD-000A95C874EE@mail.allcaps.org>
    	<41A3BA54.5000502@ucsd.edu>
    Message-ID: 
    
    >>>>> "Robert" == Robert Kern  writes:
    
        Robert> Has anyone compiled these for the Mac outside of
        Robert> Fink/Gentoo/DarwinPorts? With wxPython and Tkinter,
        Robert> someone else has done the work to make installation of the
        Robert> underlying libraries easy enough that I can package them
        Robert> up. As far as I know, no one has done so for GTK+ and I'm
        Robert> not willing to do so.
    
    I've done it from src for gtk-2, pygtk and gtkglext on 10.3.  It was a
pain, yes.  I'd be happy to help out here, time permitting, if you let 
    me know what you need, but I've not built darwin packages before so I
    may not be of much use.  I for one would like to see these in the OSX
    enthon -- I just didn't have the guts to ask :-).
    
    The software I develop for work relies heavily on gtkglext / pygtk /
    vtk.  For the record, these packages now work seamlessly together on
    win32.  I'm gearing up to release my EEG/MRI analysis tools; the
    target audience is primarily clinicians who use win32 almost
    exclusively -- many of whom will never have heard of python.  I was
    planning on figuring out how to build an all-in-one win32 installer
    with all the goodies included for these folks.
    
    JDH
    
    
    
    From Fernando.Perez at colorado.edu  Tue Nov 23 17:59:41 2004
    From: Fernando.Perez at colorado.edu (Fernando Perez)
    Date: Tue, 23 Nov 2004 15:59:41 -0700
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A3BF9B.2030702@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    		<41A391A4.5090500@ucsd.edu>
    	<41A39E04.7010107@colorado.edu> <41A3BF9B.2030702@ucsd.edu>
    Message-ID: <41A3C0DD.20106@colorado.edu>
    
    Robert Kern schrieb:
    
    > My concern is not that myself or Enthought will be sued for 
    > *distributing* VTK Patented; that's covered by copyright law, and the 
    > copyright license permits redistribution.
    > 
    > I'm much more concerned about people developing applications that depend 
    > on the patented algorithms when they really shouldn't have without a 
    > license. (Note: I don't really have any moral indignation about the 
    > issue, I'm just not personally interested in enabling people to be 
    > careless about IP.)
    
    That's probably a wise choice, actually.  Forcing users of the patented stuff 
    to jump through an extra hoop will help prevent accidental misuse of those 
    algorithms.
    
    Best,
    
    f
    
    
    
    From rkern at ucsd.edu  Tue Nov 23 18:57:35 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 15:57:35 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: 
    References: <41A375DC.3090800@ucsd.edu>
    	<45CFAEE2-3D8C-11D9-84FD-000A95C874EE@mail.allcaps.org>
    	<41A3BA54.5000502@ucsd.edu>
    	
    Message-ID: <41A3CE6F.80906@ucsd.edu>
    
    John Hunter wrote:
    >>>>>>"Robert" == Robert Kern  writes:
    > 
    > 
    >     Robert> Has anyone compiled these for the Mac outside of
    >     Robert> Fink/Gentoo/DarwinPorts? With wxPython and Tkinter,
    >     Robert> someone else has done the work to make installation of the
    >     Robert> underlying libraries easy enough that I can package them
    >     Robert> up. As far as I know, no one has done so for GTK+ and I'm
    >     Robert> not willing to do so.
    > 
    > I've done it from src for gtk-2, pygtk and gtkglext on 10.3.  It was a
    > pain, yes.  I'd be happy to help out here time permitting if you let
    > me know what you need, but I've not built darwin packages before so I
    > may not be of much use.  I for one would like to see these in the OSX
    > enthon -- I just didn't have the guts to ask :-).
    
    I, for one, want to avoid anything that is going to require the user or 
    the installer to add paths to DYLD_LIBRARY_PATH. If pygtk can be made 
reasonably stand-alone, I'll package it. For example, the wxPython 
installer puts all of the wxWidgets stuff under 
/usr/local/lib/wxPython-unicode-2.5.3.1 (i.e. wxWidgets was configured 
with "./configure --prefix=/usr/local/lib/wxPython-unicode-2.5.3.1"). 
I'm not sure what 
    they do to make sure wxPython can find the shared libraries at runtime. 
    If you can figure that out, I'll work on packaging it up.
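
(A first place to look would be the install names recorded in the 
extension modules and in the dylibs themselves -- a sketch, with purely 
illustrative paths:

   otool -L /path/to/site-packages/wx/_core_.so    # what it links against
   otool -D /usr/local/lib/wxPython-unicode-2.5.3.1/lib/*.dylib  # install names

If the dylibs carry absolute install names under that prefix, nothing 
needs DYLD_LIBRARY_PATH at runtime; if not, install_name_tool can rewrite 
the names after the fact.)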
    
    I make no guarantees as to priority, however.  :-)
    
    > The software I develop for work relies heavily on gtkglext / pygtk /
    > vtk.  For the record, these packages now work seamlessly together on
    > win32.  I'm gearing up to release my EEG/MRI analysis tools; the
    > target audience is primarily clinicians who use win32 almost
    > exclusively -- many of whom will never have heard of python.  I was
    > planning on figuring out how to build an all-in-one win32 installer
    > with all the goodies included for these folks.
    
    Perhaps Joe might be willing to share the techniques and automation 
    scripts he uses to build Enthon on Windows so other people could make 
    their own Everything-and-the-Kitchen-Sink distributions.
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From joe at enthought.com  Tue Nov 23 19:03:04 2004
    From: joe at enthought.com (Joe Cooper)
    Date: Tue, 23 Nov 2004 18:03:04 -0600
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A3CE6F.80906@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    	<45CFAEE2-3D8C-11D9-84FD-000A95C874EE@mail.allcaps.org>
    	<41A3BA54.5000502@ucsd.edu>	
    	<41A3CE6F.80906@ucsd.edu>
    Message-ID: <41A3CFB8.4010806@enthought.com>
    
    Robert Kern wrote:
    
    > Perhaps Joe might be willing to share the techniques and automation 
    > scripts he uses to build Enthon on Windows so other people could make 
    > their own Everything-and-the-Kitchen-Sink distributions.
    
    It's not as automated as one might wish.  ;-)
    
Every release is still a two-week project, including building, testing, 
troubleshooting, blaming Dave for all of the problems even if he's never 
seen the code, etc.  Then again, that's a lot less time than the first 
Enthon release took!
    
    
    
    From jdhunter at ace.bsd.uchicago.edu  Tue Nov 23 22:24:16 2004
    From: jdhunter at ace.bsd.uchicago.edu (John Hunter)
    Date: Tue, 23 Nov 2004 21:24:16 -0600
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A3C0DD.20106@colorado.edu> (Fernando Perez's message of
    	"Tue, 23 Nov 2004 15:59:41 -0700")
    References: <41A375DC.3090800@ucsd.edu>
    	
    	<41A391A4.5090500@ucsd.edu> <41A39E04.7010107@colorado.edu>
    	<41A3BF9B.2030702@ucsd.edu> <41A3C0DD.20106@colorado.edu>
    Message-ID: 
    
    >>>>> "Fernando" == Fernando Perez  writes:
    
        Fernando> That's probably a wise choice, actually.  Forcing users
        Fernando> of the patented stuff to jump through an extra hoop will
        Fernando> help prevent accidental misuse of those algorithms.
    
    I'm more interested in giving a potential user of my software a
    convenient install than I am in saving a potential developer from
    inadvertently writing some commercial software that requires a
    license.  My guess is that 99% of the people who will use enthon/OSX
    satisfy the constraint of not using the patented classes in a
    commercial app, so my preference would be to include it with a
    disclaimer.  
    
    If someone ignores a suitably positioned disclaimer at the download
    site *and* inadvertently develops an app that requires a license *and*
    wants to distribute it commercially -- we're talking about a
    vanishingly small subset here -- then they do so at their own peril.
    
    Of course I don't second guess Robert's decision since he is the one
    doing the hard work, but this is how I see it.
    
    JDH
    
    
    
    From rkern at ucsd.edu  Tue Nov 23 23:16:58 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 20:16:58 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: 
    References: <41A375DC.3090800@ucsd.edu>
    		<41A391A4.5090500@ucsd.edu>
    	<41A39E04.7010107@colorado.edu>	<41A3BF9B.2030702@ucsd.edu>
    	<41A3C0DD.20106@colorado.edu>
    	
    Message-ID: <41A40B3A.601@ucsd.edu>
    
    John Hunter wrote:
    >>>>>>"Fernando" == Fernando Perez  writes:
    > 
    > 
    >     Fernando> That's probably a wise choice, actually.  Forcing users
    >     Fernando> of the patented stuff to jump through an extra hoop will
    >     Fernando> help prevent accidental misuse of those algorithms.
    > 
    > I'm more interested in giving a potential user of my software a
    > convenient install than I am in saving a potential developer from
    > inadvertently writing some commercial software that requires a
    > license.  My guess is that 99% of the people who will use enthon/OSX
    > satisfy the constraint of not using the patented classes in a
    > commercial app, so my preference would be to include it with a
    > disclaimer.  
    
    The patent license isn't just needed for implementing marching cubes in 
    a program that you sell. It's also needed for applying the algorithm for 
    commercial purposes. If I use VTK(+Patented) to make pretty pictures in 
    the course of my job, I need to pay for a patent license. Research in an 
    academic or governmental context may or may not be considered 
    commercial. In the absence of a clear license for non-commercial and 
    research use, I'm not tempted to guess their intentions.
    
    My guess is that most people who will be using MacEnthon do not in fact 
    qualify as non-commercial users but think they do. Not packaging the 
    patented algorithms removes an attractive nuisance and assuages my 
    conscience.
    
    http://www.vtk.org/Wiki/ITK_Patent_Bazaar
    
    > If someone ignores a suitably positioned disclaimer at the download
    > site *and* inadvertently develops an app that requires a license *and*
    > wants to distribute it commercially -- we're talking about a
    > vanishingly small subset here -- then they do so at their own peril.
    > 
    > Of course I don't second guess Robert's decision since he is the one
    > doing the hard work, but this is how I see it.
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From jdhunter at ace.bsd.uchicago.edu  Tue Nov 23 23:47:56 2004
    From: jdhunter at ace.bsd.uchicago.edu (John Hunter)
    Date: Tue, 23 Nov 2004 22:47:56 -0600
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A40B3A.601@ucsd.edu> (Robert Kern's message of "Tue, 23 Nov
    	2004 20:16:58 -0800")
    References: <41A375DC.3090800@ucsd.edu>
    	
    	<41A391A4.5090500@ucsd.edu> <41A39E04.7010107@colorado.edu>
    	<41A3BF9B.2030702@ucsd.edu> <41A3C0DD.20106@colorado.edu>
    	 <41A40B3A.601@ucsd.edu>
    Message-ID: 
    
    >>>>> "Robert" == Robert Kern  writes:
    
        Robert> The patent license isn't just needed for implementing
        Robert> marching cubes in a program that you sell. It's also
        Robert> needed for applying the algorithm for commercial
        Robert> purposes. If I use VTK(+Patented) to make pretty pictures
        Robert> in the course of my job, I need to pay for a patent
        Robert> license. Research in an academic or governmental context
        Robert> may or may not be considered commercial. In the absence of
        Robert> a clear license for non-commercial and research use, I'm
        Robert> not tempted to guess their intentions.
    
    I'm not a lawyer (my wife is, does that count?), but I did read the
     VTK README regarding patents before posting.  It says
    
      VTK has a generous open-source copyright modelled after the BSD
      license. Yes, you can use VTK in commercial products. The only
      caveat is that if you use any classes (there are a small number) in
      the VTK/Patented directory in commercial application, you will want
      to contact the patent holder (listed in the class header) for a
      license. The complete text of the copyright follows.
    
    To me the key phrase is using classes "in commercial application".  I
    read (past tense) "application" to mean a computer program that you
    sell to people, eg, MS Word.  I (now) see there is an alternative
    meaning of "commercial application", the generic one, "the ability to
    use learned material in new and concrete situations".
    
    Do you agree that the reading of this license hinges on how you
    interpret the phrase "commercial application"?  If you read it in the
    sense of a program you want to sell, an "application" in geek
    parlance, it is fairly permissive in a commercial environment.  If you
    read it in the non-geek sense of using learned material in a new and
    concrete situation, it is fairly restrictive.
    
    Goddam lawyers (wife excluded...).
    
    
    JDH
     
    
    
    
    From Fernando.Perez at colorado.edu  Wed Nov 24 00:07:32 2004
    From: Fernando.Perez at colorado.edu (Fernando Perez)
    Date: Tue, 23 Nov 2004 22:07:32 -0700
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: 
    References: <41A375DC.3090800@ucsd.edu>
    		<41A391A4.5090500@ucsd.edu>
    	<41A39E04.7010107@colorado.edu>	<41A3BF9B.2030702@ucsd.edu>
    	<41A3C0DD.20106@colorado.edu>	
    	<41A40B3A.601@ucsd.edu> 
    Message-ID: <41A41714.9080300@colorado.edu>
    
    John Hunter wrote:
    
    >   VTK has a generous open-source copyright modelled after the BSD
    >   license. Yes, you can use VTK in commercial products. The only
    >   caveat is that if you use any classes (there are a small number) in
    >   the VTK/Patented directory in commercial application, you will want
    >   to contact the patent holder (listed in the class header) for a
    >   license. The complete text of the copyright follows.
    > 
    > To me the key phrase is using classes "in commercial application".  I
    > read (past tense) "application" to mean a computer program that you
    > sell to people, eg, MS Word.  I (now) see there is an alternative
    > meaning of "commercial application", the generic one, "the ability to
    > use learned material in new and concrete situations".
    > 
    > Do you agree that the reading of this license hinges on how you
    > interpret the phrase "commercial application"?  If you read it in the
    > sense of a program you want to sell, an "application" in geek
    > parlance, it is fairly permissive in a commercial environment.  If you
    > read it in the non-geek sense of using learned material in a new and
    > concrete situation, it is fairly restrictive.
    
    I'd really like to encourage all those interested in this discussion to go to 
    the link Robert posted:
    
    http://www.vtk.org/Wiki/ITK_Patent_Bazaar
    
    It is _very_ worrisome.  It's not 100% clear to me from that link exactly how 
    the Supreme Court ruled, since it says that it _did_ rule, but the Science 
    excerpt only mentions 'asking the court to review a lower court decision'. 
    I'm sure that someone used to searching case law could answer this in 2 minutes.
    
But from that page, it really appears to me (despite my wanting to share 
John's position) that Robert's prudence is well founded here.  Given that in 
    this country people will throw lawsuits into the wind just to see where they 
    land, I wouldn't be surprised to see someday down the road one of us sued for 
    writing a python app which exposes a patented algorithm.  Not because any of 
    us is rich individually, but because Enthought, the University of Chicago, U. 
    Colorado, or whatever, are potentially deep pockets to go after. 
    Unfortunately, that's how those vultures think...
    
    Cheers,
    
    f
    
    
    
    From rkern at ucsd.edu  Wed Nov 24 00:28:36 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 21:28:36 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A41714.9080300@colorado.edu>
    References: <41A375DC.3090800@ucsd.edu>
    		<41A391A4.5090500@ucsd.edu>
    	<41A39E04.7010107@colorado.edu>	<41A3BF9B.2030702@ucsd.edu>
    	<41A3C0DD.20106@colorado.edu>	
    	<41A40B3A.601@ucsd.edu> 
    	<41A41714.9080300@colorado.edu>
    Message-ID: <41A41C04.9060204@ucsd.edu>
    
    Fernando Perez wrote:
    > John Hunter wrote:
    > 
    >>   VTK has a generous open-source copyright modelled after the BSD
    >>   license. Yes, you can use VTK in commercial products. The only
    >>   caveat is that if you use any classes (there are a small number) in
    >>   the VTK/Patented directory in commercial application, you will want
    >>   to contact the patent holder (listed in the class header) for a
    >>   license. The complete text of the copyright follows.
    >>
    >> To me the key phrase is using classes "in commercial application".  I
    >> read (past tense) "application" to mean a computer program that you
    >> sell to people, eg, MS Word.  I (now) see there is an alternative
    >> meaning of "commercial application", the generic one, "the ability to
    >> use learned material in new and concrete situations".
    >>
    >> Do you agree that the reading of this license hinges on how you
    >> interpret the phrase "commercial application"?  If you read it in the
    >> sense of a program you want to sell, an "application" in geek
    >> parlance, it is fairly permissive in a commercial environment.  If you
    >> read it in the non-geek sense of using learned material in a new and
    >> concrete situation, it is fairly restrictive.
    > 
    > 
    > I'd really like to encourage all those interested in this discussion to 
    > go to the link Robert posted:
    > 
    > http://www.vtk.org/Wiki/ITK_Patent_Bazaar
    > 
    > It is _very_ worrisome.  It's not 100% clear to me from that link 
    > exactly how the Supreme Court ruled, since it says that it _did_ rule, 
    > but the Science excerpt only mentions 'asking the court to review a 
    > lower court decision'. I'm sure that someone used to searching case law 
    > could answer this in 2 minutes.
    
    Indeed. :-)
    
    http://caselaw.lp.findlaw.com/cgi-bin/getcase.pl?court=Fed&navby=case&no=011567
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From rkern at ucsd.edu  Wed Nov 24 00:38:17 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 21:38:17 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: 
    References: <41A375DC.3090800@ucsd.edu>
    		<41A391A4.5090500@ucsd.edu>
    	<41A39E04.7010107@colorado.edu>	<41A3BF9B.2030702@ucsd.edu>
    	<41A3C0DD.20106@colorado.edu>	
    	<41A40B3A.601@ucsd.edu> 
    Message-ID: <41A41E49.80704@ucsd.edu>
    
    John Hunter wrote:
    >>>>>>"Robert" == Robert Kern  writes:
    > 
    > 
    >     Robert> The patent license isn't just needed for implementing
    >     Robert> marching cubes in a program that you sell. It's also
    >     Robert> needed for applying the algorithm for commercial
    >     Robert> purposes. If I use VTK(+Patented) to make pretty pictures
    >     Robert> in the course of my job, I need to pay for a patent
    >     Robert> license. Research in an academic or governmental context
    >     Robert> may or may not be considered commercial. In the absence of
    >     Robert> a clear license for non-commercial and research use, I'm
    >     Robert> not tempted to guess their intentions.
    > 
    > I'm not a lawyer (my wife is, does that count?),
    
    No.  :-)
    
    And while we're at it, I am not a lawyer, either. This is not legal advice.
    
    > but I did read the
    >  VTK README regarding patents before posting.  It says
    > 
    >   VTK has a generous open-source copyright modelled after the BSD
    >   license. Yes, you can use VTK in commercial products. The only
    >   caveat is that if you use any classes (there are a small number) in
    >   the VTK/Patented directory in commercial application, you will want
    >   to contact the patent holder (listed in the class header) for a
    >   license. The complete text of the copyright follows.
    > 
    > To me the key phrase is using classes "in commercial application".  I
    > read (past tense) "application" to mean a computer program that you
    > sell to people, eg, MS Word.  I (now) see there is an alternative
    > meaning of "commercial application", the generic one, "the ability to
    > use learned material in new and concrete situations".
    > 
    > Do you agree that the reading of this license hinges on how you
    > interpret the phrase "commercial application"? 
    
    It's not a patent license. It is a copyright license, and does not 
    confer any license to use the patented algorithms, commercial or 
    non-commercial use aside. The patent owners of the algorithm are not the 
    copyright owners of VTK, and the VTK people have no ability to grant a 
    patent license for the code in question.
    
    In the US, true non-commercial use does not count as infringement of the 
    patent. I believe that the verbiage they use here is referring to that. 
    Unfortunately, it's incredibly misleading because the courts have 
    recently become extremely strict and narrow as to what they consider 
    non-infringing use. See my response to Fernando for a link to a relevant 
    Supreme Court case.
    
    > If you read it in the
    > sense of a program you want to sell, an "application" in geek
    > parlance, it is fairly permissive in a commercial environment.  If you
    > read it in the non-geek sense of using learned material in a new and
    > concrete situation, it is fairly restrictive.
    
    I do believe that the latter is the intended meaning. I believe that it 
    is improper grammar to say "in commercial application" when you mean the 
    former. The grammatically correct version, if the geek meaning is 
    intended, would be "in *a* commercial application."
    
    Nonetheless, I believe the issue is moot since the VTK developers have 
    no ability to grant a patent license for the algorithms.
    
    > Goddam lawyers (wife excluded...).
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From Fernando.Perez at colorado.edu  Wed Nov 24 00:45:25 2004
    From: Fernando.Perez at colorado.edu (Fernando Perez)
    Date: Tue, 23 Nov 2004 22:45:25 -0700
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A41C04.9060204@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    		<41A391A4.5090500@ucsd.edu>
    	<41A39E04.7010107@colorado.edu>	<41A3BF9B.2030702@ucsd.edu>
    	<41A3C0DD.20106@colorado.edu>	
    	<41A40B3A.601@ucsd.edu> 
    	<41A41714.9080300@colorado.edu> <41A41C04.9060204@ucsd.edu>
    Message-ID: <41A41FF5.6020606@colorado.edu>
    
    Robert Kern wrote:
    
    >>It is _very_ worrisome.  It's not 100% clear to me from that link 
    >>exactly how the Supreme Court ruled, since it says that it _did_ rule, 
    >>but the Science excerpt only mentions 'asking the court to review a 
    >>lower court decision'. I'm sure that someone used to searching case law 
    >>could answer this in 2 minutes.
    > 
    > 
    > Indeed. :-)
    > 
    > http://caselaw.lp.findlaw.com/cgi-bin/getcase.pl?court=Fed&navby=case&no=011567
    
    Well, if I read that correctly (and I _hate_ lawyerese), that's the Federal 
    Circuit court's reversal of the lower court's decision, but not the Supreme 
    Court's ruling.  Did the SC indeed hear this case?  The kitware link mentioned 
the SC being _asked_ to review this case, but they can decline to do so, 
in which case the ruling you linked would be the last word on this matter.
    
    Anyway, I really, really hate having to think, even for one stupid second, 
    about this kind of stuff.  And since this has already drifted off-topic and 
    over my legalese yearly allowance, I'll just go back to working on Prabhu's 
    threading ipython patch.  At this point, even the joys of mixing threads, 
    exceptions and 3 different GUI frameworks in python look very, very appealing :)
    
    Cheers,
    
    f
    
    
    
    From rkern at ucsd.edu  Wed Nov 24 01:00:36 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Tue, 23 Nov 2004 22:00:36 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A41FF5.6020606@colorado.edu>
    References: <41A375DC.3090800@ucsd.edu>
    		<41A391A4.5090500@ucsd.edu>
    	<41A39E04.7010107@colorado.edu>	<41A3BF9B.2030702@ucsd.edu>
    	<41A3C0DD.20106@colorado.edu>	
    	<41A40B3A.601@ucsd.edu> 
    	<41A41714.9080300@colorado.edu> <41A41C04.9060204@ucsd.edu>
    	<41A41FF5.6020606@colorado.edu>
    Message-ID: <41A42384.1020008@ucsd.edu>
    
    Fernando Perez wrote:
    > Robert Kern wrote:
    
    >> http://caselaw.lp.findlaw.com/cgi-bin/getcase.pl?court=Fed&navby=case&no=011567 
    >>
    > 
    > 
    > Well, if I read that correctly (and I _hate_ lawyerese), that's the 
    > Federal Circuit court's reversal of the lower court's decision, but not 
    > the Supreme Court's ruling.  Did the SC indeed hear this case?  The 
    > kitware link mentioned the SC being _asked_ to review this case, but 
    > they can decline to do so, case in which the case you linked would be 
    > the last word on this matter.
    
    The Supreme Court declined to review the case. The Circuit opinion is 
    the controlling law for the entire nation.
    
    http://www.imakenews.com/bakerbotts/e_article000166656.cfm
    
    > Anyway, I really, really hate having to think, even for one stupid 
    > second, about this kind of stuff.  And since this has already drifted 
    > off-topic and over my legalese yearly allowance, I'll just go back to 
    > working on Prabhu's threading ipython patch.  At this point, even the 
    > joys of mixing threads, exceptions and 3 different GUI frameworks in 
    > python look very, very appealing :)
    
    You are a very brave man.
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Wed Nov 24 02:30:26 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Wed, 24 Nov 2004 08:30:26 +0100
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: <41A35DB4.1030306@ucsd.edu>
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    		<41A30683.8000501@mecha.uni-stuttgart.de>
    	<41A31489.8060809@mecha.uni-stuttgart.de>
    	
    	<41A31ECA.6090407@mecha.uni-stuttgart.de> <41A3591D.1080305@ucsd.edu>
    	<41A35B55.3030403@mecha.uni-stuttgart.de> <41A35DB4.1030306@ucsd.edu>
    Message-ID: <41A43892.3070205@mecha.uni-stuttgart.de>
    
    Robert Kern wrote:
    
    > Nils Wagner wrote:
    >
    >> Sorry for my misleading explanation. Of course I have unpacked the 
    >> tarball !
    >> But it doesn't work.
    >
    >
    > Okay. Look at the method lapack_src_info.calc_info() in the file 
    > scipy_core/scipy_distutils/system_info.py and modify it to try to 
    > debug why it can't find the files.
    >
I guess it is connected with this entry in system_info.py:
    
       default_src_dirs = ['.','/usr/local/src', '/opt/src','/sw/src']
    
    class LapackSrcNotFoundError(LapackNotFoundError):
        """
        Lapack (http://www.netlib.org/lapack/) sources not found.
        Directories to search for the sources can be specified in the
        scipy_distutils/site.cfg file (section [lapack_src]) or by setting
        the LAPACK_SRC environment variable."""
    
In the file sample_site.cfg, which is a sample for site.cfg, I 
found (1):
    
    [lapack_src]
    # src_dirs = ..
    
    # Default values are defined in scipy_distutils/system_info.py
    # file and here shown in comments. Feel free to uncomment and modify
    # the corresponding values to your system needs.
    #
    
I will rename it (sample_site.cfg) to site.cfg and modify (1) as follows:
    
    [lapack_src]
    src_dirs=/home/nwagner/src/lapack
    
Setting the environment variable LAPACK_SRC seems to have no 
effect...
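
Concretely, what I am doing is along these lines (bash syntax; the path 
matches the src_dirs entry above):

   export LAPACK_SRC=/home/nwagner/src/lapack
   python scipy_core/scipy_distutils/system_info.py lapack_src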
    
    Nils
    
     
    
    
    
    
    From pearu at scipy.org  Wed Nov 24 04:02:08 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Wed, 24 Nov 2004 03:02:08 -0600 (CST)
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: <41A43892.3070205@mecha.uni-stuttgart.de>
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    	
    	<41A31489.8060809@mecha.uni-stuttgart.de>
    	<41A31ECA.6090407@mecha.uni-stuttgart.de>
    	<41A3591D.1080305@ucsd.edu> <41A35B55.3030403@mecha.uni-stuttgart.de>
    	<41A35DB4.1030306@ucsd.edu> <41A43892.3070205@mecha.uni-stuttgart.de>
    Message-ID: 
    
    
    
    On Wed, 24 Nov 2004, Nils Wagner wrote:
    
    > The setting of the environment variable LAPACK_SRC seems to be  without 
    > effect...
    
    Once again, could you send the listing of $LAPACK_SRC directory?
    Also, what is the output of
       python system_info.py lapack_src
    ?
    
I moved the LAPACK directory (from lapack.tgz) to various places and each 
time system_info.py could find it successfully using the value of the 
LAPACK_SRC environment variable. So I doubt that system_info.py would be 
buggy in this respect.
    
    Pearu
    
    
    
    From joe at enthought.com  Wed Nov 24 04:46:11 2004
    From: joe at enthought.com (Joe Cooper)
    Date: Wed, 24 Nov 2004 03:46:11 -0600
    Subject: [SciPy-user] Patented algorithms in VTK (was: Enthon for the Mac)
    In-Reply-To: <41A42384.1020008@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    		<41A391A4.5090500@ucsd.edu>
    	<41A39E04.7010107@colorado.edu>	<41A3BF9B.2030702@ucsd.edu>
    	<41A3C0DD.20106@colorado.edu>	
    	<41A40B3A.601@ucsd.edu> 
    	<41A41714.9080300@colorado.edu> <41A41C04.9060204@ucsd.edu>
    	<41A41FF5.6020606@colorado.edu> <41A42384.1020008@ucsd.edu>
    Message-ID: <41A45863.1040502@enthought.com>
    
    
    
    Enthon for Windows will not have the patented algorithms from VTK.
    
    Them's the breaks.
    
    
    
    From Giovanni.Samaey at cs.kuleuven.ac.be  Wed Nov 24 04:53:52 2004
    From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey)
    Date: Wed, 24 Nov 2004 10:53:52 +0100
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: <41A31772.3040201@cs.kuleuven.ac.be>
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    Message-ID: <41A45A30.6050202@cs.kuleuven.ac.be>
    
Since I don't use linalg myself, I do not have any problems using SciPy 
like this.  But I would appreciate this being solved before giving access 
to others.  Since we plan on encouraging new students to use this 
environment instead of matlab, any help would be welcome.
    
    Giovanni
    
    Giovanni Samaey wrote:
    
    > Hi again,
    >
    > after all my trouble (reported above) I finally got a completed scipy 
    > build.
    > I got however errors in scipy.test(), which I presumed to be due to 
    > incorrect atlas libraries.
    > So I decided to compile those myself, following the instructions on 
    > www.scipy.org.
    >
    > I had to make several modifications; namely adding -fPIC and -m64 as 
    > options to the make
    > files for blas and lapack.  (Otherwise, linking failed during scipy 
    > build.)
    >
    > Running scipy.test(), I get the errors that are shown in 
    > file_atlas.txt.   I decided to give it a try
    > without atlas (which in my case worked without any difficulty).  I 
    > used the blas and lapack
    > routines that were compiled by myself and that are used to build 
    > atlas.  As you will notice,
    > I get one error and one failure less in this case.
    >
    > I am using cvs scipy and atlas3.6.0.  Everything is compiled using the 
    > gnu compilers with the
    > default options, to which -fPIC and -m64 are added for lapack and blas.
    > I googled and looked through this mailing list, but did not find 
    > anything directly applicable.
    > (If I overlooked it, I already apologize beforehand.)
    >
    > Best,  Giovanni
    >
    >------------------------------------------------------------------------
    >
    >======================================================================
    >ERROR: check_simple_overdet (scipy.linalg.basic.test_basic.test_lstsq)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 361, in check_simple_overdet
    >    x,res,r,s = lstsq(a,b)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/basic.py", line 353, in lstsq
    >    if rank==n: resids = sum(x[n:]**2)
    >ArithmeticError: Integer overflow in power.
    >
    >======================================================================
    >ERROR: check_simple_sym (scipy.linalg.basic.test_basic.test_solve)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 80, in check_simple_sym
    >    x = solve(a,b,sym_pos=1,lower=lower)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/basic.py", line 123, in solve
    >    raise LinAlgError, "singular matrix"
    >LinAlgError: singular matrix
    >
    >======================================================================
    >FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_cholesky)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", line 260, in check_simple
    >    assert_array_almost_equal(dot(transpose(c),c),a)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727, in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 100.0%):
    >        Array 1: [[-1885965976648932823  7796773222197633577  -761725725645001531]
    > [ 7796773222197633577 -6619736815450336083  789149371...
    >        Array 2: [[8 2 3]
    > [2 9 3]
    > [3 3 6]]
    >
    >
    >======================================================================
    >FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_eig)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", line 88, in check_simple
    >    assert_array_almost_equal(w,exact_w)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727, in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 66.6666666667%):
    >        Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
    >        Array 2: [ 9.3218254  0.        -0.3218254]
    >
    >
    >======================================================================
    >FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_eigvals)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", line 36, in check_simple
    >    assert_array_almost_equal(w,exact_w)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727, in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 66.6666666667%):
    >        Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
    >        Array 2: [ 9.3218254  0.        -0.3218254]
    >
    >
    >======================================================================
    >FAIL: check_simple_tr (scipy.linalg.decomp.test_decomp.test_eigvals)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", line 44, in check_simple_tr
    >    assert_array_almost_equal(w,exact_w)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727, in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 66.6666666667%):
    >        Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
    >        Array 2: [ 9.3218254  0.        -0.3218254]
    >
    >
    >======================================================================
    >FAIL: check_simple (scipy.linalg.basic.test_basic.test_det)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 273, in check_simple
    >    assert_almost_equal(a_det,-2.0)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 649, in assert_almost_equal
    >    assert round(abs(desired - actual),decimal) == 0, msg
    >AssertionError:
    >Items are not equal:
    >DESIRED: -2.0
    >ACTUAL: -0.0
    >
    >======================================================================
    >FAIL: check_simple (scipy.linalg.basic.test_basic.test_inv)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 202, in check_simple
    >    [[1,0],[0,1]])
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727, in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 100.0%):
    >        Array 1: [[9216616637413720064   -4503599627370496]
    > [9207609438158979072   -9007199254740992]]
    >        Array 2: [[1 0]
    > [0 1]]
    >
    >
    >======================================================================
    >FAIL: check_simple_exact (scipy.linalg.basic.test_basic.test_lstsq)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 356, in check_simple_exact
    >    assert_array_almost_equal(Numeric.matrixmultiply(a,x),b)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727, in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 50.0%):
    >        Array 1: [0 0]
    >        Array 2: [1 0]
    >
    >
    >======================================================================
    >FAIL: check_simple (scipy.linalg.basic.test_basic.test_solve)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 74, in check_simple
    >    assert_array_almost_equal(Numeric.matrixmultiply(a,x),b)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727,
    >in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 100.0%):
    >        Array 1: [[             nan              nan]
    > [             nan              nan]]
    >        Array 2: [[1 0]
    > [0 1]]
    >
    >
    >======================================================================
    >FAIL: check_matvec (scipy.sparse.Sparse.test_Sparse.test_csc)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/sparse/tests/test_Sparse.py", line 66, in check_matvec
    >    assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense()))
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727,
    >in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 100.0%):
    >        Array 1: [  8.3991160e-323   6.9169190e-323   4.4465908e-323]
    >        Array 2: [ 17.  14.   9.]
    >
    >
    >======================================================================
    >FAIL: check_matvec (scipy.sparse.Sparse.test_Sparse.test_csr)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/sparse/tests/test_Sparse.py", line 66, in check_matvec
    >    assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense()))
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727,
    >in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 100.0%):
    >        Array 1: [  8.3991160e-323   6.9169190e-323   4.4465908e-323]
    >        Array 2: [ 17.  14.   9.]
    >
    >
    >======================================================================
    >FAIL: check_matvec (scipy.sparse.Sparse.test_Sparse.test_dok)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/sparse/tests/test_Sparse.py", line 66, in check_matvec
    >    assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense()))
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727,
    >in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 100.0%):
    >        Array 1: [  8.3991160e-323   6.9169190e-323   4.4465908e-323]
    >        Array 2: [ 17.  14.   9.]
    >
    >
    >----------------------------------------------------------------------
    >Ran 1080 tests in 1.265s
    >
    >FAILED (failures=11, errors=2)
    >
    >  
    >
    >------------------------------------------------------------------------
    >
    >======================================================================
    >ERROR: check_simple_overdet (scipy.linalg.basic.test_basic.test_lstsq)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 361, in check_simple_overdet
    >    x,res,r,s = lstsq(a,b)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/basic.py", line 353, in lstsq
    >    if rank==n: resids = sum(x[n:]**2)
    >ArithmeticError: Integer overflow in power.
    >
    >======================================================================
    >FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_eig)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", line 88, in check_simple
    >    assert_array_almost_equal(w,exact_w)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727, in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 66.6666666667%):
    >        Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
    >        Array 2: [ 9.3218254  0.        -0.3218254]
    >
    >
    >======================================================================
    >FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_eigvals)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", line 36, in check_simple
    >    assert_array_almost_equal(w,exact_w)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727, in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 66.6666666667%):
    >        Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
    >        Array 2: [ 9.3218254  0.        -0.3218254]
    >
    >
    >======================================================================
    >FAIL: check_simple_tr (scipy.linalg.decomp.test_decomp.test_eigvals)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", line 44, in check_simple_tr
    >    assert_array_almost_equal(w,exact_w)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727, in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 66.6666666667%):
    >        Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
    >        Array 2: [ 9.3218254  0.        -0.3218254]
    >
    >
    >======================================================================
    >FAIL: check_simple (scipy.linalg.basic.test_basic.test_det)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 273, in check_simple
    >    assert_almost_equal(a_det,-2.0)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 649, in assert_almost_equal
    >    assert round(abs(desired - actual),decimal) == 0, msg
    >AssertionError:
    >Items are not equal:
    >DESIRED: -2.0
    >ACTUAL: inf
    >
    >======================================================================
    >FAIL: check_simple_exact (scipy.linalg.basic.test_basic.test_lstsq)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 356, in check_simple_exact
    >    assert_array_almost_equal(Numeric.matrixmultiply(a,x),b)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727, in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 50.0%):
    >        Array 1: [0 0]
    >        Array 2: [1 0]
    >
    >
    >======================================================================
    >FAIL: check_simple (scipy.linalg.basic.test_basic.test_solve)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 74, in check_simple
    >    assert_array_almost_equal(Numeric.matrixmultiply(a,x),b)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727, in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 50.0%):
    >        Array 1: [0 0]
    >        Array 2: [1 0]
    >
    >
    >======================================================================
    >FAIL: check_simple_sym (scipy.linalg.basic.test_basic.test_solve)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_basic.py", line 81, in check_simple_sym
    >    assert_array_almost_equal(Numeric.matrixmultiply(a,x),b)
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727,
    >in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 100.0%):
    >        Array 1: [-9223372036854775784 -9223372036854775770]
    >        Array 2: [1 0]
    >
    >
    >======================================================================
    >FAIL: check_matvec (scipy.sparse.Sparse.test_Sparse.test_csc)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/sparse/tests/test_Sparse.py", line 66, in check_matvec
    >    assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense()))
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727,
    >in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 100.0%):
    >        Array 1: [  8.3991160e-323   6.9169190e-323   4.4465908e-323]
    >        Array 2: [ 17.  14.   9.]
    >
    >
    >======================================================================
    >FAIL: check_matvec (scipy.sparse.Sparse.test_Sparse.test_csr)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/sparse/tests/test_Sparse.py", line 66, in check_matvec
    >    assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense()))
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727,
    >in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 100.0%):
    >        Array 1: [  8.3991160e-323   6.9169190e-323   4.4465908e-323]
    >        Array 2: [ 17.  14.   9.]
    >
    >
    >======================================================================
    >FAIL: check_matvec (scipy.sparse.Sparse.test_Sparse.test_dok)
    >----------------------------------------------------------------------
    >Traceback (most recent call last):
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy/sparse/tests/test_Sparse.py", line 66, in check_matvec
    >    assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense()))
    >  File "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", line 727,
    >in assert_array_almost_equal
    >    assert cond,\
    >AssertionError:
    >Arrays are not almost equal (mismatch 100.0%):
    >        Array 1: [  8.3991160e-323   6.9169190e-323   4.4465908e-323]
    >        Array 2: [ 17.  14.   9.]
    >
    >
    >----------------------------------------------------------------------
    >Ran 1080 tests in 1.189s
    >
    >FAILED (failures=10, errors=1)
    >
    >  
    >
    >------------------------------------------------------------------------
    >
    >_______________________________________________
    >SciPy-user mailing list
    >SciPy-user at scipy.net
    >http://www.scipy.net/mailman/listinfo/scipy-user
    >  
    >
    
    
    -- 
    Giovanni Samaey		 	http://www.cs.kuleuven.ac.be/~giovanni/ 
    Katholieke Universiteit Leuven 	      email: giovanni at cs.kuleuven.ac.be 
    Departement Computerwetenschappen                  phone: +32-16-327081
    Celestijnenlaan 200A, B-3001 Heverlee, Belgium       fax: +32-16-327996
    Office: A04.36
    
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Wed Nov 24 05:24:18 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Wed, 24 Nov 2004 11:24:18 +0100
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: 
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    		<41A31489.8060809@mecha.uni-stuttgart.de>
    	<41A31ECA.6090407@mecha.uni-stuttgart.de>	<41A3591D.1080305@ucsd.edu>
    	<41A35B55.3030403@mecha.uni-stuttgart.de>	<41A35DB4.1030306@ucsd.edu>
    	<41A43892.3070205@mecha.uni-stuttgart.de>
    	
    Message-ID: <41A46152.9000300@mecha.uni-stuttgart.de>
    
    Pearu Peterson wrote:
    
    >
    >
    > On Wed, 24 Nov 2004, Nils Wagner wrote:
    >
    >> The setting of the environment variable LAPACK_SRC seems to be  
    >> without effect...
    >
    >
    > Once again, could you send the listing of $LAPACK_SRC directory?
    
    cvs/scipy> ls -l /var/tmp/src/lapack/
    total 35014
    drwxr-xr-x  7 nwagner users      264 2000-05-17 20:27 LAPACK
    -rw-r--r--  1 nwagner users 35817984 2004-11-24 08:57 lapack.tar
    
    > Also, what is the output of
    >   python system_info.py lapack_src
    > ?
    
    lapack_src_info:
    ( src_dirs = /var/tmp/src/lapack:.:/usr/local/src:/var/tmp/src )
    ( paths: /var/tmp/src/lapack/LAPACK/SRC )
    ( library_dirs = /usr/local/lib:/usr/lib )
      FOUND:
        sources = ['/var/tmp/src/lapack/LAPACK/SRC/sbdsdc.f', 
    '/var/tmp/src/la ...
    ... CK/SRC/izmax1.f', '/var/tmp/src/lapack/LAPACK/SRC/dzsum1.f']
        language = f77
    
    
    >
    > I moved the LAPACK directory (from lapack.tgz) to various places and 
    > each time system_info.py could find it succesfully using the value of 
    > LAPACK_SRC environment variable. So I doubt that system_info.py would 
    > be bugy in this respect..
    >
Meanwhile I have made so many modifications to the setting of environment
variables and site.cfg that I cannot tell you what is going wrong. So I
decided to use ATLAS again.
Actually, I was trying to resolve the segmentation fault in
symeig.test.test().
As you mentioned before, it is not necessary to build scipy against
FORTRAN LAPACK/BLAS.
So, do you plan to include the symeig package in scipy?
    
    Nils
    
    > Pearu
    >
    > _______________________________________________
    > SciPy-user mailing list
    > SciPy-user at scipy.net
    > http://www.scipy.net/mailman/listinfo/scipy-user
    
    
    
     
    
    
    
    From pearu at scipy.org  Wed Nov 24 05:44:23 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Wed, 24 Nov 2004 04:44:23 -0600 (CST)
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: <41A46152.9000300@mecha.uni-stuttgart.de>
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    	
    	<41A31ECA.6090407@mecha.uni-stuttgart.de>
    	<41A3591D.1080305@ucsd.edu> <41A35B55.3030403@mecha.uni-stuttgart.de>
    	<41A35DB4.1030306@ucsd.edu> <41A43892.3070205@mecha.uni-stuttgart.de>
    	<41A46152.9000300@mecha.uni-stuttgart.de>
    Message-ID: 
    
    
    
    On Wed, 24 Nov 2004, Nils Wagner wrote:
    
    > Pearu Peterson wrote:
    >
    >> 
    >> 
    >> On Wed, 24 Nov 2004, Nils Wagner wrote:
    >> 
    >>> The setting of the environment variable LAPACK_SRC seems to be  without 
    >>> effect...
    >> 
    >> 
    >> Once again, could you send the listing of $LAPACK_SRC directory?
    >
    > cvs/scipy> ls -l /var/tmp/src/lapack/
    > total 35014
    > drwxr-xr-x  7 nwagner users      264 2000-05-17 20:27 LAPACK
    > -rw-r--r--  1 nwagner users 35817984 2004-11-24 08:57 lapack.tar
    >
    >> Also, what is the output of
    >>   python system_info.py lapack_src
    >> ?
    >
    > lapack_src_info:
    > ( src_dirs = /var/tmp/src/lapack:.:/usr/local/src:/var/tmp/src )
    > ( paths: /var/tmp/src/lapack/LAPACK/SRC )
    > ( library_dirs = /usr/local/lib:/usr/lib )
    > FOUND:
    >   sources = ['/var/tmp/src/lapack/LAPACK/SRC/sbdsdc.f', '/var/tmp/src/la ...
    > ... CK/SRC/izmax1.f', '/var/tmp/src/lapack/LAPACK/SRC/dzsum1.f']
    >   language = f77
    
So, system_info.py found the LAPACK sources! And

ATLAS=None BLAS=None LAPACK=None BLAS_SRC=/path/to/src/blas \
LAPACK_SRC=/path/to/src/lapack python setup.py build

in the symeig source tree should work (with or without site.cfg).
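
As a quick cross-check, the same lookup can be done from Python before
running setup.py; this assumes scipy_distutils exposes get_info() the
same way the command-line "python system_info.py lapack_src" call above
does, so treat it as a sketch rather than a documented API:

# check_lapack_src.py -- print what system_info resolves for lapack_src
import os
os.environ['LAPACK_SRC'] = '/var/tmp/src/lapack'   # path used in this thread
from scipy_distutils.system_info import get_info
info = get_info('lapack_src')
print(info.get('language'))
print('%d Fortran source files found' % len(info.get('sources', [])))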
    
    >> I moved the LAPACK directory (from lapack.tgz) to various places and each 
    >> time system_info.py could find it succesfully using the value of LAPACK_SRC 
    >> environment variable. So I doubt that system_info.py would be bugy in this 
    >> respect..
    >> 
    > Meanwhile I have made so many modifications with respect to the setting of 
    > environment variables and site.cfg
    > that I cannot tell you what is going wrong. So I decided to use ATLAS again.
    > Actually, I was going to resolve the segmentation fault in 
    > symeig.test.test().
    > As you mentioned before it is not neccessary to build scipy against FORTRAN 
    > LAPACK/BLAS.
    > So, do you plan to include the symeig package in scipy ?
    
Yes, but not by the copy&paste method (see my earlier comment on that), and
only after the lapack and blas wrappers from Lib/linalg are moved to
Lib/lib/lapack and Lib/lib/blas, respectively. Lib/lib/lapack is done;
Lib/lib/blas still needs to be finished - it's not much work, but right now
I have to concentrate on other more urgent matters with deadlines...
    
    Pearu
    
    
    
    From pearu at scipy.org  Wed Nov 24 06:06:30 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Wed, 24 Nov 2004 05:06:30 -0600 (CST)
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: <41A45A30.6050202@cs.kuleuven.ac.be>
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>
    Message-ID: 
    
    
    
    On Wed, 24 Nov 2004, Giovanni Samaey wrote:
    
    > Since I don't use linalg myself, I do not have any problems using SciPy 
    > like this. But I would appreciate this being solved before giving access 
    > to others. Since we plan on encouraging new students to use this 
    > environment instead of matlab, any help would be welcome.
    
On a 64-bit platform you'll need F2PY version 2.43.239_1831, at least.
Upgrading Numeric would also be recommended (make sure that the new
Numeric header files are copied to the install directory as well).
    
Btw, all scipy tests pass ok on a dual Opteron box with a 64-bit gentoo
system. I am using f2py from cvs, scipy from cvs, Numeric 23.5, ATLAS
3.7.8, Python 2.3.4, and gcc 3.4.3 on this box.
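
For comparing setups like this, a small version report of the interpreter
and libraries helps; note that the __version__ attributes below are an
assumption about these packages, not something quoted in this thread:

import sys
import Numeric
import scipy
# Print the interpreter and library versions being compared.
print('Python  ' + sys.version.split()[0])
print('Numeric ' + Numeric.__version__)
print('scipy   ' + scipy.__version__)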
    
    Pearu
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Wed Nov 24 07:30:00 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Wed, 24 Nov 2004 13:30:00 +0100
    Subject: [SciPy-user] How do I build scipy using standard BLAS/LAPACK
    In-Reply-To: 
    References: <41A1F9FD.5030102@mecha.uni-stuttgart.de>
    		<41A31ECA.6090407@mecha.uni-stuttgart.de>
    	<41A3591D.1080305@ucsd.edu> <41A35B55.3030403@mecha.uni-stuttgart.de>
    	<41A35DB4.1030306@ucsd.edu> <41A43892.3070205@mecha.uni-stuttgart.de>
    	<41A46152.9000300@mecha.uni-stuttgart.de>
    	
    Message-ID: <41A47EC8.4000205@mecha.uni-stuttgart.de>
    
    Pearu Peterson wrote:
    
    >
    >
    > On Wed, 24 Nov 2004, Nils Wagner wrote:
    >
    >> Pearu Peterson wrote:
    >>
    >>>
    >>>
    >>> On Wed, 24 Nov 2004, Nils Wagner wrote:
    >>>
    >>>> The setting of the environment variable LAPACK_SRC seems to be  
    >>>> without effect...
    >>>
    >>>
    >>>
    >>> Once again, could you send the listing of $LAPACK_SRC directory?
    >>
    >>
    >> cvs/scipy> ls -l /var/tmp/src/lapack/
    >> total 35014
    >> drwxr-xr-x  7 nwagner users      264 2000-05-17 20:27 LAPACK
    >> -rw-r--r--  1 nwagner users 35817984 2004-11-24 08:57 lapack.tar
    >>
    >>> Also, what is the output of
    >>>   python system_info.py lapack_src
    >>> ?
    >>
    >>
    >> lapack_src_info:
    >> ( src_dirs = /var/tmp/src/lapack:.:/usr/local/src:/var/tmp/src )
    >> ( paths: /var/tmp/src/lapack/LAPACK/SRC )
    >> ( library_dirs = /usr/local/lib:/usr/lib )
    >> FOUND:
    >>   sources = ['/var/tmp/src/lapack/LAPACK/SRC/sbdsdc.f', 
    >> '/var/tmp/src/la ...
    >> ... CK/SRC/izmax1.f', '/var/tmp/src/lapack/LAPACK/SRC/dzsum1.f']
    >>   language = f77
    >
    >
    > So, system_info.py found LAPACK sources! And
    >
At the beginning I used ~/src/lapack. However, my home directory
"lives" on another machine.
Finally I used /var/tmp/src/lapack on my local host and it works fine.
    
    Nils
    
    > ATLAS=None BLAS=None LAPACK=None BLAS_SRC=/path/to/src/blas \
    > LAPACK_SRC=/path/to/src/lapack python setup.py build
    >
    > in symeig source tree should work (with or without site.cfg).
    >
    >>> I moved the LAPACK directory (from lapack.tgz) to various places and 
    >>> each time system_info.py could find it succesfully using the value 
    >>> of LAPACK_SRC environment variable. So I doubt that system_info.py 
    >>> would be bugy in this respect..
    >>>
    >> Meanwhile I have made so many modifications with respect to the 
    >> setting of environment variables and site.cfg
    >> that I cannot tell you what is going wrong. So I decided to use ATLAS 
    >> again.
    >> Actually, I was going to resolve the segmentation fault in 
    >> symeig.test.test().
    >> As you mentioned before it is not neccessary to build scipy against 
    >> FORTRAN LAPACK/BLAS.
    >> So, do you plan to include the symeig package in scipy ?
    >
    >
    > Yes, but not by copy&paste method (see my earlier comment on that) and
    > after lapack and blas wrappers from Lib/linalg is moved to Lib/lib/lapack
    > and Lib/lib/blas, respectively. Lib/lib/lapack is done, Lib/lib/blas 
    > needs to be finished - it's not much work but right now I have to 
    > concentrate
    > on other more urgent matters with deadlines...
    >
    > Pearu
    >
    > _______________________________________________
    > SciPy-user mailing list
    > SciPy-user at scipy.net
    > http://www.scipy.net/mailman/listinfo/scipy-user
    
    
    
     
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Wed Nov 24 07:49:55 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Wed, 24 Nov 2004 13:49:55 +0100
    Subject: [SciPy-user] Failures in symeig.test.test()
    Message-ID: <41A48373.6000106@mecha.uni-stuttgart.de>
    
    Hi all,
    
    Finally I have build scipy from cvs using ATLAS.
    
    For the external package symeig I have used FORTRAN LAPACK/BLAS
    (see http://mdp-toolkit.sourceforge.net/symeig.html)
    
    by setting the environment variables
    
    setenv ATLAS None
    setenv LAPACK None
    setenv BLAS None
    setenv LAPACK_SRC /var/tmp/src/lapack
    setenv BLAS_SRC /var/tmp/src/blas
    
     >>> import symeig
     >>> symeig.test.test()
    Random Seed:  (1268049219, 2102953867)
    testComplex (symeig.test.test_symeig.SymeigTestCase) ... ERROR
    testComplexGeneralized (symeig.test.test_symeig.SymeigTestCase) ... ERROR
    testIntegerMatrix (symeig.test.test_symeig.SymeigTestCase) ... ok
    testNonContiguousMatrix (symeig.test.test_symeig.SymeigTestCase) ... ok
    testOverwriteBug (symeig.test.test_symeig.SymeigTestCase) ... ok
    testReal (symeig.test.test_symeig.SymeigTestCase) ... ERROR
    testRealGeneralized (symeig.test.test_symeig.SymeigTestCase) ... ERROR
    testTypecodeConversion (symeig.test.test_symeig.SymeigTestCase) ... ok
    
    ======================================================================
    ERROR: testComplex (symeig.test.test_symeig.SymeigTestCase)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File "/usr/lib/python2.3/site-packages/symeig/test/test_symeig.py", 
    line 126, in testComplex
        self.eigenproblem(DIM,'D',0,'off',range)
      File "/usr/lib/python2.3/site-packages/symeig/test/test_symeig.py", 
    line 53, in eigenproblem
        assert_array_almost_equal(diag, w, DIGITS[typecode])
      File "/usr/lib/python2.3/site-packages/symeig/test/testing_tools.py", 
    line 17, in assert_array_almost_equal_diff
        maxdiff = max(scipy.ravel(abs(x-y)))/\
    ZeroDivisionError: float division
    
    ======================================================================
    ERROR: testComplexGeneralized (symeig.test.test_symeig.SymeigTestCase)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File "/usr/lib/python2.3/site-packages/symeig/test/test_symeig.py", 
    line 141, in testComplexGeneralized
        self.geneigenproblem(DIM,'D',0,'off',range)
      File "/usr/lib/python2.3/site-packages/symeig/test/test_symeig.py", 
    line 68, in geneigenproblem
        assert_array_almost_equal(diag1, w, DIGITS[typecode])
      File "/usr/lib/python2.3/site-packages/symeig/test/testing_tools.py", 
    line 17, in assert_array_almost_equal_diff
        maxdiff = max(scipy.ravel(abs(x-y)))/\
    ZeroDivisionError: float division
    
    ======================================================================
    ERROR: testReal (symeig.test.test_symeig.SymeigTestCase)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File "/usr/lib/python2.3/site-packages/symeig/test/test_symeig.py", 
    line 99, in testReal
        self.eigenproblem(DIM,'d',0,'off',range)
      File "/usr/lib/python2.3/site-packages/symeig/test/test_symeig.py", 
    line 53, in eigenproblem
        assert_array_almost_equal(diag, w, DIGITS[typecode])
      File "/usr/lib/python2.3/site-packages/symeig/test/testing_tools.py", 
    line 17, in assert_array_almost_equal_diff
        maxdiff = max(scipy.ravel(abs(x-y)))/\
    ZeroDivisionError: float division
    
    ======================================================================
    ERROR: testRealGeneralized (symeig.test.test_symeig.SymeigTestCase)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File "/usr/lib/python2.3/site-packages/symeig/test/test_symeig.py", 
    line 114, in testRealGeneralized
        self.geneigenproblem(DIM,'d',0,'off',range)
      File "/usr/lib/python2.3/site-packages/symeig/test/test_symeig.py", 
    line 68, in geneigenproblem
        assert_array_almost_equal(diag1, w, DIGITS[typecode])
      File "/usr/lib/python2.3/site-packages/symeig/test/testing_tools.py", 
    line 17, in assert_array_almost_equal_diff
        maxdiff = max(scipy.ravel(abs(x-y)))/\
    ZeroDivisionError: float division
    
    ----------------------------------------------------------------------
    Ran 8 tests in 1.127s
    
    FAILED (errors=4)
    
The former segmentation fault vanishes if I use FORTRAN LAPACK/BLAS
instead of ATLAS.

Can someone reproduce these failures?
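
For what it's worth, the ZeroDivisionError itself comes from the relative
check in testing_tools.py. A minimal sketch of a check of that shape (the
actual denominator in testing_tools.py is truncated in the tracebacks
above, so this is only an assumption about it):

import scipy

def relative_maxdiff(x, y):
    # x, y: Numeric arrays of the same shape.  Divides the largest absolute
    # difference by the largest magnitude of y; if y comes back all zeros
    # (e.g. an eigenvalue routine returning zeros together with a nonzero
    # info flag), the denominator is 0.0 and this raises the same
    # ZeroDivisionError seen above.
    return max(scipy.ravel(abs(x - y))) / max(scipy.ravel(abs(y)))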
    
    Nils
    
     
    
    
    
    From gpajer at rider.edu  Wed Nov 24 08:32:33 2004
    From: gpajer at rider.edu (Gary Pajer)
    Date: Wed, 24 Nov 2004 08:32:33 -0500
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A375DC.3090800@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    Message-ID: <41A48D71.1080509@rider.edu>
    
    Robert Kern wrote:
    
    > I'm gearing up to make binary packages for Scipy et al. for Mac OS X. 
    > I hope that these might be considered collectively as 
    > Enthought-Edition Python ("Enthon") for the Mac.
    >
    Maybe a dumb question from a sometimes (but never comfortable) Mac 
    user:  is this a fink package or a "framework" package?
    
    -gary
    
    
    
    From Giovanni.Samaey at cs.kuleuven.ac.be  Wed Nov 24 11:22:56 2004
    From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey)
    Date: Wed, 24 Nov 2004 17:22:56 +0100
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: 
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>
    	
    Message-ID: <41A4B560.8030609@cs.kuleuven.ac.be>
    
    Pearu Peterson wrote:
    
    >
    >
    > On Wed, 24 Nov 2004, Giovanni Samaey wrote:
    >
    >> Since I don't use linalg myself, I do not have any problems using 
    >> SciPy like this. But I would appreciate this being solved before 
    >> giving access to others. Since we plan on encouraging new students to 
    >> use this environment instead of matlab, any help would be welcome.
    >
    >
    > On a 64-bit platform you'll need F2PY version 2.43.239_1831, at least.
    > Also upgrading Numeric would be recommended (make sure that new
    > Numeric header files are copied to install directory as well).
    
OK -- I succeeded in recompiling atlas 3.7.8, Numeric 23.6 (I also tried
Numeric 23.5), and the latest f2py (2.43.239_1844).  The Python version is
2.3.4; however, I use gcc 3.3.2.
Now the failed scipy tests are down to only 3 (and it is three times the
same computation that fails).
Would the compiler version really make the difference, is something wrong
with the compile options, or could there be another problem?
I compiled atlas by following the instructions on the scipy homepage,
with a few modifications: I ran ./xconfig -b $BLAS (so I did not give
compiler options) and then manually changed FFLAGS, CCFLAGS and
MCCFLAGS to include -fPIC.

The lapack and blas libraries were compiled by adding -fPIC -m64 to their
options, to make the compiler options compatible with those of atlas.
    
    ======================================================================
    FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_eig)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", 
    line 88, in check_simple
        assert_array_almost_equal(w,exact_w)
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", 
    line 727,
    in assert_array_almost_equal
        assert cond,\
    AssertionError:
    Arrays are not almost equal (mismatch 66.6666666667%):
            Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
            Array 2: [ 9.3218254  0.        -0.3218254]
    
    
    ======================================================================
    FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_eigvals)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", 
    line 36, in check_simple
        assert_array_almost_equal(w,exact_w)
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", 
    line 727,
    in assert_array_almost_equal
        assert cond,\
    AssertionError:
    Arrays are not almost equal (mismatch 66.6666666667%):
            Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
            Array 2: [ 9.3218254  0.        -0.3218254]
    
    
    ======================================================================
    FAIL: check_simple_tr (scipy.linalg.decomp.test_decomp.test_eigvals)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", 
    line 44, in check_simple_tr
        assert_array_almost_equal(w,exact_w)
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", 
    line 727,
    in assert_array_almost_equal
        assert cond,\
    AssertionError:
    Arrays are not almost equal (mismatch 66.6666666667%):
            Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
            Array 2: [ 9.3218254  0.        -0.3218254]
    
    
    ----------------------------------------------------------------------
    
    >
    > Btw, all scipy tests pass ok on a dual Opteron box with 64-bit gentoo 
    > system. I am using f2py from cvs, scipy from cvs, Numeric 23.5, ATLAS 
    > 3.7.8, Python 2.3.4, gcc 3.4.3 in this box.
    >
    > Pearu
    >
    > _______________________________________________
    > SciPy-user mailing list
    > SciPy-user at scipy.net
    > http://www.scipy.net/mailman/listinfo/scipy-user
    
    
    
    -- 
    Giovanni Samaey		 	http://www.cs.kuleuven.ac.be/~giovanni/ 
    Katholieke Universiteit Leuven 	      email: giovanni at cs.kuleuven.ac.be 
    Departement Computerwetenschappen                  phone: +32-16-327081
    Celestijnenlaan 200A, B-3001 Heverlee, Belgium       fax: +32-16-327996
    Office: A04.36
    
    
    
    
    From rkern at ucsd.edu  Wed Nov 24 12:38:13 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Wed, 24 Nov 2004 09:38:13 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A48D71.1080509@rider.edu>
    References: <41A375DC.3090800@ucsd.edu> <41A48D71.1080509@rider.edu>
    Message-ID: <41A4C705.4050306@ucsd.edu>
    
    Gary Pajer wrote:
    > Robert Kern wrote:
    > 
    >> I'm gearing up to make binary packages for Scipy et al. for Mac OS X. 
    >> I hope that these might be considered collectively as 
    >> Enthought-Edition Python ("Enthon") for the Mac.
    >>
    > Maybe a dumb question from a sometimes (but never comfortable) Mac 
    > user:  is this a fink package or a "framework" package?
    
    Framework package. The goal is to make something that a user can install 
    and have running without installing anything else, so it is going to use 
    the pre-installed Python (or install its own in the case of Python 2.4).
    
The current plan, so far as I've worked it out, is now on the Wiki:
    
    http://www.scipy.org/wikis/featurerequests/MacEnthon
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From strawman at astraw.com  Wed Nov 24 13:55:33 2004
    From: strawman at astraw.com (Andrew Straw)
    Date: Wed, 24 Nov 2004 10:55:33 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A375DC.3090800@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    Message-ID: <6BA0F9E6-3E4A-11D9-BEB3-000D93476BAC@astraw.com>
    
    On Nov 23, 2004, at 9:39 AM, Robert Kern wrote:
    
    > I'm gearing up to make binary packages for Scipy et al. for Mac OS X. 
    > I hope that these might be considered collectively as 
    > Enthought-Edition Python ("Enthon") for the Mac.
    >
    > So, Mac users, what do you want to see in Mac Enthon?
    
    This is great news!
    
     From a purely selfish point of view (to help with installing my pet 
    project, the Vision Egg), would you be willing to include pygame (and 
    SDL, SDL_ttf2, libfreetype, SDL_image, SDL_mixer, SMPEG) if I built it? 
    I would happily do so Saturday or Sunday this weekend -- I just want to 
    make sure we get the same version of cross-dependencies.  If so, will 
    Numeric be the 23.6 release?  Are you going to include freetype 
    (2.1.9?), or should I build it? Should unix-like dependencies go in 
    /usr/local/lib and so on?
    
    
    
    From pearu at scipy.org  Wed Nov 24 15:14:01 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Wed, 24 Nov 2004 14:14:01 -0600 (CST)
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: <41A4B560.8030609@cs.kuleuven.ac.be>
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>
    	<41A4B560.8030609@cs.kuleuven.ac.be>
    Message-ID: 
    
    
    
    On Wed, 24 Nov 2004, Giovanni Samaey wrote:
    
    > Pearu Peterson wrote:
    >
    >> 
    >> 
    >> On Wed, 24 Nov 2004, Giovanni Samaey wrote:
    >> 
    >>> Since I don't use linalg myself, I do not have any problems using SciPy 
    >>> like this. But I would appreciate this being solved before giving access 
    >>> to others. Since we plan on encouraging new students to use this 
    >>> environment instead of matlab, any help would be welcome.
    >> 
    >> 
    >> On a 64-bit platform you'll need F2PY version 2.43.239_1831, at least.
    >> Also upgrading Numeric would be recommended (make sure that new
    >> Numeric header files are copied to install directory as well).
    >
    > OK -- I succeeded in recompiling atlas version 3.7.8, Numeric 23.6 (I also 
    > tried Numeric 23.5 ), and the latest
    > f2py (2.43.239_1844).  Python version is 2.3.4. However, I use gcc3.3.2.
    > Now the failed scipy tests reduce to only 3 (and 3 times the same computation 
    > that fails).
    
    > The lapack and blas were compiled by adding -fPIC -m64 to their options, to 
    > make the compiler options compatible with those of atlas.
    >
    > ======================================================================
    > FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_eig)
    > ----------------------------------------------------------------------
    > Arrays are not almost equal (mismatch 66.6666666667%):
    >       Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
    >       Array 2: [ 9.3218254  0.        -0.3218254]
    >
    >
    > ======================================================================
    > FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_eigvals)
    > ----------------------------------------------------------------------
    > Arrays are not almost equal (mismatch 66.6666666667%):
    >       Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
    >       Array 2: [ 9.3218254  0.        -0.3218254]
    >
    >
    > ======================================================================
    > FAIL: check_simple_tr (scipy.linalg.decomp.test_decomp.test_eigvals)
    > ----------------------------------------------------------------------
    > Arrays are not almost equal (mismatch 66.6666666667%):
    >       Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
    >       Array 2: [ 9.3218254  0.        -0.3218254]
    
    It is interesting that only simple eig tests fail while complex and random
    tests pass. And in these simple tests eigenvalues are only slightly 
    incorrect.
    
Could you try building scipy against the Fortran BLAS/LAPACK libraries to
see if your ATLAS libraries are ok? If ATLAS is OK, then you could try
f2py from cvs, though I can't recall whether it includes any 64-bit
specific fixes.
    
    Pearu
    
    
    
    From rkern at ucsd.edu  Wed Nov 24 15:28:25 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Wed, 24 Nov 2004 12:28:25 -0800
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <6BA0F9E6-3E4A-11D9-BEB3-000D93476BAC@astraw.com>
    References: <41A375DC.3090800@ucsd.edu>
    	<6BA0F9E6-3E4A-11D9-BEB3-000D93476BAC@astraw.com>
    Message-ID: <41A4EEE9.70901@ucsd.edu>
    
    Andrew Straw wrote:
    > On Nov 23, 2004, at 9:39 AM, Robert Kern wrote:
    > 
    >> I'm gearing up to make binary packages for Scipy et al. for Mac OS X. 
    >> I hope that these might be considered collectively as 
    >> Enthought-Edition Python ("Enthon") for the Mac.
    >>
    >> So, Mac users, what do you want to see in Mac Enthon?
    > 
    > 
    > This is great news!
    > 
    >  From a purely selfish point of view (to help with installing my pet 
    > project, the Vision Egg), would you be willing to include pygame (and 
    > SDL, SDL_ttf2, libfreetype, SDL_image, SDL_mixer, SMPEG) if I built it? 
    
    Sure. Why not?  :-)  I'm going to want to be able to build everything 
    myself, so all you have to do is figure out how to build it in an 
    acceptable way. pygame itself should be packaged up with Bob Ippolito's 
    bdist_mpkg:
    
    http://svn.red-bean.com/bob/py2app/trunk
    
    Do the standard 1.2.7 frameworks work? or do you have to compile more 
    things? I don't know if they include SMPEG in the frameworks.
    
    http://www.libsdl.org/release/SDL-1.2.7.pkg.tar.gz
    
    Packaging up pygame (and pygtk, too; you're not alone) is, I think, 
    going to be low on my list of priorities. Most of the packages on the 
    list are there because I have, sometimes extensive, experience with 
    building them and getting them to work; this way I can spend my efforts 
    on the packaging itself. I haven't built pygame in a while, and with its 
    dependencies, it is probably going to be more complicated than the other 
    packages.
    
    > I would happily do so Saturday or Sunday this weekend -- I just want to 
    > make sure we get the same version of cross-dependencies. 
    
    MacEnthon won't be released next week, so don't waste your weekend if 
    you don't have to. There will probably be a few alpha releases, so you 
    should have plenty of time.
    
    > If so, will 
    > Numeric be the 23.6 release? 
    
    I'm actually going to go with CVS since it has some post-23.6 changes 
    related to the Mac.
    
    > Are you going to include freetype 
    > (2.1.9?), or should I build it? Should unix-like dependencies go in 
    > /usr/local/lib and so on?
    
    Preferably, libraries should compile into frameworks. Hopefully, the 
    standard packages distributed by libsdl.org will suffice.
    
    See the Wiki page for my current thoughts on how this is all going to work.
    
    http://www.scipy.org/wikis/featurerequests/MacEnthon
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From Giovanni.Samaey at cs.kuleuven.ac.be  Thu Nov 25 05:16:53 2004
    From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey)
    Date: Thu, 25 Nov 2004 11:16:53 +0100
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: 
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>	<41A4B560.8030609@cs.kuleuven.ac.be>
    	
    Message-ID: <41A5B115.1080405@cs.kuleuven.ac.be>
    
I tried all your suggestions:
- I compiled scipy against my BLAS and LAPACK libraries
- I compiled scipy giving only the BLAS and LAPACK sources (which were then
compiled during the scipy build)
- I upgraded f2py and built scipy against atlas 3.7.8, blas/lapack
libraries, and blas/lapack sources.
There is no difference.
    
    >
    > It is interesting that only simple eig tests fail while complex and 
    > random
    > tests pass. And in these simple tests eigenvalues are only slightly 
    > incorrect.
    
What is remarkable is that the error is the same in all 3 cases. Moreover,
if you add up the first two components of array 1 you get the first
component of array 2.
    
    ======================================================================
    FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_eigvals)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy/linalg/tests/test_decomp.py", 
    line 36, in check_simple
        assert_array_almost_equal(w,exact_w)
      File 
    "/data/home/giovanni/lib/python2.3/site-packages/scipy_test/testing.py", 
    line 727,
    in assert_array_almost_equal
        assert cond,\
    AssertionError:
    Arrays are not almost equal (mismatch 66.6666666667%):
            Array 1: [ 9.4371906+0.j -0.1153653+0.j -0.3218254+0.j]
            Array 2: [ 9.3218254  0.        -0.3218254]
    
    
    Giovanni
    
    
    
    From pearu at scipy.org  Thu Nov 25 08:26:53 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Thu, 25 Nov 2004 07:26:53 -0600 (CST)
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: <41A5B115.1080405@cs.kuleuven.ac.be>
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>
    	
    	<41A5B115.1080405@cs.kuleuven.ac.be>
    Message-ID: 
    
    
    
    On Thu, 25 Nov 2004, Giovanni Samaey wrote:
    
    > What is remarkable is that, the error is the same in the 3 cases.  Moreover, 
    > if you add up the first two components of array1 you get the first 
    > component of array2.
    
It would look like an atlas bug, but since the same results are obtained
with Fortran blas/lapack, I don't know what is happening. I presume that
you did
   rm -rf build
each time before rebuilding scipy with different blas/lapack libraries.
    
    What happens when you call geev routine directly? Try for example:
    
    In [1]: import scipy
    
    In [2]: scipy.linalg.flapack.dgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    Out[2]:
    (array([  9.32182538e+00,  -6.03426271e-16,  -3.21825380e-01]),
      array([ 0.,  0.,  0.]),
      array([       [ 0.,  0.,  0.]]),
      array([       [ 0.,  0.,  0.]]),
      0)
    
    In [3]: scipy.linalg.flapack.sgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    Out[3]:
    (array([  9.32182503e+00,  -2.82419336e-07,  -3.21825117e-01],'f'),
      array([ 0.,  0.,  0.],'f'),
      array([       [ 0.,  0.,  0.]],'f'),
      array([       [ 0.,  0.,  0.]],'f'),
      0)
    
    In [4]: scipy.linalg.calc_lwork.geev('d',3)
    Out[4]: (12, 102)
    
    Is there any difference in your case?
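
Another cross-check that bypasses scipy's f2py wrappers entirely, assuming
Numeric's LinearAlgebra module (which normally uses Numeric's bundled
lapack_lite) is installed:

import Numeric
import LinearAlgebra
# Same test matrix as above; a healthy LAPACK gives roughly
# 9.3218, ~0, -0.3218.
a = Numeric.array([[1,2,3],[1,2,3],[2,5,6]], 'd')
print(LinearAlgebra.eigenvalues(a))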
    
    Pearu
    
    
    
    From Giovanni.Samaey at cs.kuleuven.ac.be  Thu Nov 25 08:40:12 2004
    From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey)
    Date: Thu, 25 Nov 2004 14:40:12 +0100
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: 
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>
    	
    	<41A5B115.1080405@cs.kuleuven.ac.be>
    	
    Message-ID: <41A5E0BC.5000900@cs.kuleuven.ac.be>
    
    Pearu Peterson wrote:
    
    >
    >
    > On Thu, 25 Nov 2004, Giovanni Samaey wrote:
    >
    >> What is remarkable is that, the error is the same in the 3 cases.  
    >> Moreover, if you add up the first two components of array1 you get 
    >> the first component of array2.
    >
    >
    > It would look like atlas bug but since the same results are obtained 
    > with Fortran blas/lapack, then I don't know what is happening. I 
    > presume that
    > you did
    >   rm -rf build
    
Of course. If I forgot, I would notice that nothing was being rebuilt
anyway...
    
    > What happens when you call geev routine directly? Try for example:
    >
    > In [1]: import scipy
    >
    > In [2]: scipy.linalg.flapack.dgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    > Out[2]:
    > (array([  9.32182538e+00,  -6.03426271e-16,  -3.21825380e-01]),
    >  array([ 0.,  0.,  0.]),
    >  array([       [ 0.,  0.,  0.]]),
    >  array([       [ 0.,  0.,  0.]]),
    >  0)
    >
    > In [3]: scipy.linalg.flapack.sgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    > Out[3]:
    > (array([  9.32182503e+00,  -2.82419336e-07,  -3.21825117e-01],'f'),
    >  array([ 0.,  0.,  0.],'f'),
    >  array([       [ 0.,  0.,  0.]],'f'),
    >  array([       [ 0.,  0.,  0.]],'f'),
    >  0)
    >
    > In [4]: scipy.linalg.calc_lwork.geev('d',3)
    > Out[4]: (12, 102)
    >
    > Is there any difference in your case?
    
    Yes there is; in fact there is a difference in all three.
    
    In[1]: import scipy
    In[2]: scipy.linalg.flapack.dgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    Out[2]: (array([ 9.43719064, -0.11536526, -0.32182538]),
    array([ 0.,  0.,  0.]),
    array([       [ 0.,  0.,  0.]]),
    array([       [ 0.,  0.,  0.]]),
    0)
    In[3]: scipy.linalg.flapack.sgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    Out[3]: (array([ 0.,  0.,  0.],'f'),
    array([ 0.,  0.,  0.],'f'),
    array([       [ 0.,  0.,  0.]],'f'),
    array([       [ 0.,  0.,  0.]],'f'), 3)
    In[4]: scipy.linalg.calc_lwork.geev('d',3)
    Out[4]: (12, 174)
    
    Giovanni
    
    
    >
    > Pearu
    >
    > _______________________________________________
    > SciPy-user mailing list
    > SciPy-user at scipy.net
    > http://www.scipy.net/mailman/listinfo/scipy-user
    
    
    
    -- 
    Giovanni Samaey		 	http://www.cs.kuleuven.ac.be/~giovanni/ 
    Katholieke Universiteit Leuven 	      email: giovanni at cs.kuleuven.ac.be 
    Departement Computerwetenschappen                  phone: +32-16-327081
    Celestijnenlaan 200A, B-3001 Heverlee, Belgium       fax: +32-16-327996
    Office: A04.36
    
    
    
    
    From pearu at scipy.org  Thu Nov 25 09:00:11 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Thu, 25 Nov 2004 08:00:11 -0600 (CST)
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: <41A5E0BC.5000900@cs.kuleuven.ac.be>
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>
    	<41A5B115.1080405@cs.kuleuven.ac.be>
    	<41A5E0BC.5000900@cs.kuleuven.ac.be>
    Message-ID: 
    
    
    
    On Thu, 25 Nov 2004, Giovanni Samaey wrote:
    
    > Pearu Peterson wrote:
    >
    >> 
    >> 
    >> On Thu, 25 Nov 2004, Giovanni Samaey wrote:
    >> 
    >>> What is remarkable is that, the error is the same in the 3 cases. 
    >>> Moreover, if you add up the first two components of array1 you get the 
    >>> first component of array2.
    >> 
    >> 
    >> It would look like atlas bug but since the same results are obtained with 
    >> Fortran blas/lapack, then I don't know what is happening. I presume that
    >> you did
    >>   rm -rf build
    >
    > Of course. If I would forget, I would notice that nothing would be built, 
    > anyway...
    >
    >> What happens when you call geev routine directly? Try for example:
    >> 
    >> In [1]: import scipy
    >> 
    >> In [2]: scipy.linalg.flapack.dgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    >> Out[2]:
    >> (array([  9.32182538e+00,  -6.03426271e-16,  -3.21825380e-01]),
    >>  array([ 0.,  0.,  0.]),
    >>  array([       [ 0.,  0.,  0.]]),
    >>  array([       [ 0.,  0.,  0.]]),
    >>  0)
    >> 
    >> In [3]: scipy.linalg.flapack.sgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    >> Out[3]:
    >> (array([  9.32182503e+00,  -2.82419336e-07,  -3.21825117e-01],'f'),
    >>  array([ 0.,  0.,  0.],'f'),
    >>  array([       [ 0.,  0.,  0.]],'f'),
    >>  array([       [ 0.,  0.,  0.]],'f'),
    >>  0)
    >> 
    >> In [4]: scipy.linalg.calc_lwork.geev('d',3)
    >> Out[4]: (12, 102)
    
    This was obtained on a 32-bit system.
    
    >> Is there any difference in your case?
    >
    > Yes there is; in fact there is a difference in all three.
    >
    > In[1]: import scipy
    > In[2]: scipy.linalg.flapack.dgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    > Out[2]: (array([ 9.43719064, -0.11536526, -0.32182538]),
    > array([ 0.,  0.,  0.]),
    > array([       [ 0.,  0.,  0.]]),
    > array([       [ 0.,  0.,  0.]]),
    > 0)
    
    Hmm, try also
    
    scipy.linalg.flapack.zgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    scipy.linalg.flapack.dgeev(scipy.array([[1,2,3],[1,2,3],[2,5,6]],'d'),0,0)
    scipy.linalg.flapack.dgeev(Numeric.array([[1,2,3],[1,2,3],[2,5,6]],'d'),0,0)
    
    If those fail as well then try a C program that uses dgeev to solve the 
    same eigenvalue problem:
    
/* main.c */
#include <stdio.h>

/* LAPACK dgeev: jobvl, jobvr, n, a, lda, wr, wi, vl, ldvl, vr, ldvr,
   work, lwork, info */
extern void
dgeev_(char*,char*,int*,double*,int*,double*,double*,double*,int*,double*,int*,double*,int*,int*);

int main() {
   int n=3,lwork=12,info=0;
   /* column-major storage of [[1,2,3],[1,2,3],[2,5,6]] */
   double a[] = {1,1,2,2,2,5,3,3,6};
   double wr[]={0,0,0},wi[]={0,0,0},*vl=0,*vr=0,work[12];
   dgeev_("N","N",&n,a,&n,wr,wi,vl,&n,vr,&n,work,&lwork,&info);
   printf("wr=%g,%g,%g,wi=%g,%g,%g\n",wr[0],wr[1],wr[2],wi[0],wi[1],wi[2]);
   return 0;
}
/*eof*/
    
    $ gcc main.c -llapack
    $ ./a.out
    wr=9.32183,-6.20979e-16,-0.321825,wi=0,0,0
    
    
    > In[3]: scipy.linalg.flapack.sgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    > Out[3]: (array([ 0.,  0.,  0.],'f'),
    > array([ 0.,  0.,  0.],'f'),
    > array([       [ 0.,  0.,  0.]],'f'),
    > array([       [ 0.,  0.,  0.]],'f'), 3)
    
Here info==3, which means that the QR algorithm failed to compute all
the eigenvalues and that eigenvalues 0:info are the ones that converged.
But that does not make sense, because info==n...
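
A minimal guard that would make this failure loud instead of silently
returning zeros (just a sketch around the wrapper call quoted above, not a
proposed patch to scipy.linalg):

import scipy
wr, wi, vl, vr, info = scipy.linalg.flapack.sgeev([[1,2,3],[1,2,3],[2,5,6]], 0, 0)
if info < 0:
    raise ValueError('sgeev: illegal value in argument %d' % -info)
elif info > 0:
    raise RuntimeError('sgeev: QR iteration did not converge, info=%d' % info)
print(wr)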
    
    > In[4]: scipy.linalg.calc_lwork.geev('d',3)
    > Out[4]: (12, 174)
    
    I get the same result on an Opteron box.
    
    Pearu
    
    
    
    From nwagner at mecha.uni-stuttgart.de  Thu Nov 25 09:17:49 2004
    From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
    Date: Thu, 25 Nov 2004 15:17:49 +0100
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: 
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>	<41A5B115.1080405@cs.kuleuven.ac.be>
    	<41A5E0BC.5000900@cs.kuleuven.ac.be>
    	
    Message-ID: <41A5E98D.10501@mecha.uni-stuttgart.de>
    
    Pearu Peterson wrote:
    
    >
    >
    > On Thu, 25 Nov 2004, Giovanni Samaey wrote:
    >
    >> Pearu Peterson wrote:
    >>
    >>>
    >>>
    >>> On Thu, 25 Nov 2004, Giovanni Samaey wrote:
    >>>
    >>>> What is remarkable is that, the error is the same in the 3 cases. 
    >>>> Moreover, if you add up the first two components of array1 you get 
    >>>> the first component of array2.
    >>>
    >>>
    >>>
    >>> It would look like atlas bug but since the same results are obtained 
    >>> with Fortran blas/lapack, then I don't know what is happening. I 
    >>> presume that
    >>> you did
    >>>   rm -rf build
    >>
    >>
    >> Of course. If I would forget, I would notice that nothing would be 
    >> built, anyway...
    >>
    >>> What happens when you call geev routine directly? Try for example:
    >>>
    >>> In [1]: import scipy
    >>>
    >>> In [2]: scipy.linalg.flapack.dgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    >>> Out[2]:
    >>> (array([  9.32182538e+00,  -6.03426271e-16,  -3.21825380e-01]),
    >>>  array([ 0.,  0.,  0.]),
    >>>  array([       [ 0.,  0.,  0.]]),
    >>>  array([       [ 0.,  0.,  0.]]),
    >>>  0)
    >>>
    >>> In [3]: scipy.linalg.flapack.sgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    >>> Out[3]:
    >>> (array([  9.32182503e+00,  -2.82419336e-07,  -3.21825117e-01],'f'),
    >>>  array([ 0.,  0.,  0.],'f'),
    >>>  array([       [ 0.,  0.,  0.]],'f'),
    >>>  array([       [ 0.,  0.,  0.]],'f'),
    >>>  0)
    >>>
    >>> In [4]: scipy.linalg.calc_lwork.geev('d',3)
    >>> Out[4]: (12, 102)
    >>
    >
    > This was obtained on a 32-bit system.
    >
    >>> Is there any difference in your case?
    >>
    >>
    >> Yes there is; in fact there is a difference in all three.
    >>
    >> In[1]: import scipy
    >> In[2]: scipy.linalg.flapack.dgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    >> Out[2]: (array([ 9.43719064, -0.11536526, -0.32182538]),
    >> array([ 0.,  0.,  0.]),
    >> array([       [ 0.,  0.,  0.]]),
    >> array([       [ 0.,  0.,  0.]]),
    >> 0)
    >
    >
    > Hmm, try also
    >
    > scipy.linalg.flapack.zgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    > scipy.linalg.flapack.dgeev(scipy.array([[1,2,3],[1,2,3],[2,5,6]],'d'),0,0) 
    >
    > scipy.linalg.flapack.dgeev(Numeric.array([[1,2,3],[1,2,3],[2,5,6]],'d'),0,0) 
    >
    >
    > If those fail as well then try a C program that uses dgeev to solve 
    > the same eigenvalue problem:
    >
    > /* main.c */
    > #include <stdio.h>
    > extern void
    > dgeev_(char*,char*,int*,double*,int*,double*,double*,double*,int*,double*,int*,double*,int*,int*); 
    >
    > int main(void) {
    >   int n=3,lwork=12,info=0;
    >   double a[] = {1,1,2,2,2,5,3,3,6};
    >   double wr[]={0,0,0},wi[]={0,0,0},*vl=0,*vr=0,work[lwork];
    >   dgeev_("N","N",&n,a,&n,wr,wi,vl,&n,vr,&n,work,&lwork,&info);
    >   printf("wr=%g,%g,%g,wi=%g,%g,%g\n",wr[0],wr[1],wr[2],wi[0],wi[1],wi[2]);
    >   return 0;
    > }
    > /*eof*/
    >
    > $ gcc main.c -llapack
    > $ ./a.out
    > wr=9.32183,-6.20979e-16,-0.321825,wi=0,0,0
    >
    >
    How can I force linking against the static library (*.a) instead of the shared one (*.so)?
    
    /home/nwagner> gcc main.c -L/usr/local/lib -llapack
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `e_wsfe'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `z_abs'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `c_sqrt'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `s_cmp'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `z_exp'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `c_exp'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `etime_'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `do_fio'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `z_sqrt'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `s_cat'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `s_stop'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `c_abs'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `s_wsfe'
    /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    reference to `s_copy'
    collect2: ld returned 1 exit status
    
    Nils
    
    >> In[3]: scipy.linalg.flapack.sgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    >> Out[3]: (array([ 0.,  0.,  0.],'f'),
    >> array([ 0.,  0.,  0.],'f'),
    >> array([       [ 0.,  0.,  0.]],'f'),
    >> array([       [ 0.,  0.,  0.]],'f'), 3)
    >
    >
    > Here info==3, which means that the QR algorithm failed to compute all 
    > the eigenvalues and that eigenvalues 0:info have converged. But 
    > that does not make sense, because info==n...
    >
    >> In[4]: scipy.linalg.calc_lwork.geev('d',3)
    >> Out[4]: (12, 174)
    >
    >
    > I get the same result on an Opteron box.
    >
    > Pearu
    >
    > _______________________________________________
    > SciPy-user mailing list
    > SciPy-user at scipy.net
    > http://www.scipy.net/mailman/listinfo/scipy-user
    
    
    
     
    
    
    
    From pearu at scipy.org  Thu Nov 25 09:20:23 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Thu, 25 Nov 2004 08:20:23 -0600 (CST)
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: <41A5E98D.10501@mecha.uni-stuttgart.de>
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>
    	<41A5E0BC.5000900@cs.kuleuven.ac.be>
    	<41A5E98D.10501@mecha.uni-stuttgart.de>
    Message-ID: 
    
    
    
    On Thu, 25 Nov 2004, Nils Wagner wrote:
    
    > /home/nwagner> gcc main.c -L/usr/local/lib -llapack
    > /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../liblapack.so: undefined 
    > reference to `e_wsfe'
    > collect2: ld returned 1 exit status
    >
    > Nils
    
    You'll also need -lg2c.
    
    Pearu
    
    
    
    From Giovanni.Samaey at cs.kuleuven.ac.be  Thu Nov 25 10:04:27 2004
    From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey)
    Date: Thu, 25 Nov 2004 16:04:27 +0100
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: 
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>	<41A5B115.1080405@cs.kuleuven.ac.be>
    	<41A5E0BC.5000900@cs.kuleuven.ac.be>
    	
    Message-ID: <41A5F47B.6030004@cs.kuleuven.ac.be>
    
    
    >
    > Hmm, try also
    >
    > scipy.linalg.flapack.zgeev([[1,2,3],[1,2,3],[2,5,6]],0,0)
    > scipy.linalg.flapack.dgeev(scipy.array([[1,2,3],[1,2,3],[2,5,6]],'d'),0,0) 
    >
    > scipy.linalg.flapack.dgeev(Numeric.array([[1,2,3],[1,2,3],[2,5,6]],'d'),0,0) 
    >
    
    The first one gives the correct result.  The second and third give the 
    wrong result.
    
    >
    > If those fail as well then try a C program that uses dgeev to solve 
    > the same eigenvalue problem:
    >
    > $ gcc main.c -llapack
    > $ ./a.out
    > wr=9.32183,-6.20979e-16,-0.321825,wi=0,0,0
    
    I get (after adding some -lblas -lm and paths etc...)
    ./a.out
    wr=9.43719,-0.115365,-0.321825,wi=0,0,0
    
    which probably shows that the error comes from lapack.
    (This is with the lapack I compiled myself as an intermediate step in 
    building atlas for scipy.)
    Just to repeat how I compiled lapack: I took the Makefile and added 
    -fPIC and -m64 there, to give it the same compiler options as the 
    atlas makefile.
    
    Giovanni
    
    
    
    From pearu at scipy.org  Thu Nov 25 10:14:23 2004
    From: pearu at scipy.org (Pearu Peterson)
    Date: Thu, 25 Nov 2004 09:14:23 -0600 (CST)
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: <41A5F47B.6030004@cs.kuleuven.ac.be>
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>
    	<41A5E0BC.5000900@cs.kuleuven.ac.be>
    	<41A5F47B.6030004@cs.kuleuven.ac.be>
    Message-ID: 
    
    
    
    On Thu, 25 Nov 2004, Giovanni Samaey wrote:
    
    >> If those fail as well then try a C program that uses dgeev to solve the 
    >> same eigenvalue problem:
    >> 
    >> $ gcc main.c -llapack
    >> $ ./a.out
    >> wr=9.32183,-6.20979e-16,-0.321825,wi=0,0,0
    >
    > I get (after adding some -lblas -lm and paths etc...)
    > ./a.out
    > wr=9.43719,-0.115365,-0.321825,wi=0,0,0
    >
    > which probably shows that the error comes from lapack.
    
    Yes.
    
    > (This is with the lapack I compiled myself in an intermediate step to build 
    > atlas for scipy.)
    > Just to repeat how I compiled lapack: I took the Makefile and added -fPIC and 
    > -m64 there to give it the same compiler options as that atlas makefile.
    
    Now try building Fortran lapack without optimization flags and see if you 
    still get incorrect results.
    
    I think it is also reasonable to check whether the incorrect results are 
    related to the gcc 3.2 that you are using. I might be wrong, but gcc 3.2 
    is probably older than the Opteron...
    
    Pearu
    
    
    
    From Giovanni.Samaey at cs.kuleuven.ac.be  Thu Nov 25 10:43:01 2004
    From: Giovanni.Samaey at cs.kuleuven.ac.be (Giovanni Samaey)
    Date: Thu, 25 Nov 2004 16:43:01 +0100
    Subject: [SciPy-user] problems with scipy.test() on Opteron 64-bit
    In-Reply-To: 
    References: <41A31772.3040201@cs.kuleuven.ac.be>
    	<41A45A30.6050202@cs.kuleuven.ac.be>	<41A5E0BC.5000900@cs.kuleuven.ac.be>
    	<41A5F47B.6030004@cs.kuleuven.ac.be>
    	
    Message-ID: <41A5FD85.5070709@cs.kuleuven.ac.be>
    
    Hi Pearu,
    
    >
    > Now try building Fortran lapack without optimization flags and see if 
    > you still get incorrect results.
    
    I get correct results now!  To finish up, I will gradually increase the 
    optimization flags until I find where it starts failing; meanwhile I will 
    try to persuade the system administrators to upgrade the compilers. 
    Maybe the newer compilers handle the optimization flags better.
    
    Anyway, the computational part of SciPy is now completely functional.  
    The only thing that needs a little more work is the X11 support.  I 
    turned that off completely last time by including x11_libs = noX11 in 
    site.cfg.  But I do have a libX11.so file available, if I could link 
    against it.
    
    Pearu, I have noticed that you will be in Florida in February.  So am I, 
    so I will definitely buy you beers!
    
    Best,
    
    Giovanni
    
    
    
    From Jonathan.Peirce at nottingham.ac.uk  Thu Nov 25 11:20:22 2004
    From: Jonathan.Peirce at nottingham.ac.uk (Jon Peirce)
    Date: Thu, 25 Nov 2004 16:20:22 +0000
    Subject: [SciPy-user] Re: Enthon for the Mac
    Message-ID: <41A60646.6090603@psychology.nottingham.ac.uk>
    
    If you're installing PyOpenGL, I'd love to see pygame in there too, if it 
    can be managed. At the moment Bob Ippolito's package manager has it built 
    for OS X.
    
    Jon
    
    -- 
    Jon Peirce
    Nottingham University
    +44 (0)115 8467176 (tel)
    +44 (0)115 9515324 (fax)
    
    http://www.psychology.nottingham.ac.uk/staff/jwp/
    
    
    This message has been scanned but we cannot guarantee that it and any
    attachments are free from viruses or other damaging content: you are
    advised to perform your own checks.  Email communications with the
    University of Nottingham may be monitored as permitted by UK legislation.
    
    
    
    From aisaac at american.edu  Fri Nov 26 11:22:52 2004
    From: aisaac at american.edu (Alan G Isaac)
    Date: Fri, 26 Nov 2004 11:22:52 -0500 (Eastern Standard Time)
    Subject: [SciPy-user] test matrix equality
    Message-ID: 
    
    Newbie question:
    
    What is the right/efficient way to test
    for equality between two matrices?
    While alltrue(ravel(A==B)) works,
    I am guessing alltrue(alltrue(A==B))
    is better.
    
    If I import MA it seems I can use
    allequal, but this does not seem
    to be available otherwise.  (Why??)
    Also, although this behaves like
    I want, it is oddly inconsistent
    with the other 'all' functions.
    
    I'd like to be able to just look at
    A.eq(B) for any matrices A and B.
    
    Thank you,
    Alan Isaac
    
    PS Suggestion: allow setting index=None
    for alltrue to test all individual
    elements.
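    
    For what it is worth, here is a sketch of the options discussed above,
    written against modern NumPy (an assumption; the question itself is
    about Numeric/scipy_base, where alltrue and ravel are spelled the same
    way):
    
    import numpy as np
    
    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    B = A.copy()
    
    # Elementwise comparison reduced over every element:
    print(np.all(np.ravel(A == B)))    # the alltrue(ravel(A==B)) idiom
    print((A == B).all())              # same thing, method form
    print(np.array_equal(A, B))        # also False if the shapes differ
    
    # For floating-point results, exact equality is usually too strict:
    print(np.allclose(A, B))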
    
    
    
    
    
    From aisaac at american.edu  Fri Nov 26 15:42:15 2004
    From: aisaac at american.edu (Alan G Isaac)
    Date: Fri, 26 Nov 2004 15:42:15 -0500 (Eastern Standard Time)
    Subject: [SciPy-user] switch columns
    Message-ID: 
    
    Suppose in Scipy I have a 2x2 array A and I
    want to switch the first two columns.
    I incorrectly expected the following to work:
    A[:,0],A[:,1]=A[:,1],A[:,0]
    
    Then I discovered this works:
    A=mat(A)
    A[:,0],A[:,1]=A[:,1],A[:,0]
    
    Question: Why the difference?
    And: how to anticipate this difference?
    
    Thank you,
    Alan Isaac
    
    
    
    
    From rkern at ucsd.edu  Fri Nov 26 17:09:51 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Fri, 26 Nov 2004 14:09:51 -0800
    Subject: [SciPy-user] switch columns
    In-Reply-To: 
    References: 
    Message-ID: <41A7A9AF.9020802@ucsd.edu>
    
    Alan G Isaac wrote:
    > Suppose in Scipy I have a 2x2 array A and I
    > want to switch the first two columns.
    > I incorrectly expected the following to work:
    > A[:,0],A[:,1]=A[:,1],A[:,0]
    > 
    > Then I discovered this works:
    > A=mat(A)
    > A[:,0],A[:,1]=A[:,1],A[:,0]
    > 
    > Question: Why the difference?
    > And: how to anticipate this difference?
    
    When A is a Numeric array, slicing it produces a view on that slice of 
    the array rather than producing a copy. When you change the underlying 
    array, the values in the view change as well. So when the tuple gets 
    unpacked, A[:,0] gets assigned the values originally in A[:,1], then 
    A[:,1] gets assigned the values currently in A[:,0] which are the values 
    that were originally in A[:,1].
    
    Apparently (I don't use the mat object, so I'm no authority here), 
    slicing a mat object yields a copy, not a view.
    
    If you explicitly want a copy of a Numeric array, use the copy() method.
    
       A[:,0], A[:,1] = A[:,1].copy(), A[:,0].copy()
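    
    A small demonstration of this behaviour, sketched with modern NumPy
    (an assumption; Numeric slices behave the same way):
    
    import numpy as np
    
    A = np.array([[1, 2], [3, 4]])
    
    # Naive swap: the slices are views, so by the time A[:,1] is assigned,
    # column 0 already holds the new values and both columns end up equal.
    B = A.copy()
    B[:, 0], B[:, 1] = B[:, 1], B[:, 0]
    print(B)                      # [[2 2], [4 4]] -- not a swap
    
    # Copying the right-hand sides first gives the intended swap.
    C = A.copy()
    C[:, 0], C[:, 1] = C[:, 1].copy(), C[:, 0].copy()
    print(C)                      # [[2 1], [4 3]]
    
    # Fancy indexing returns a copy as well, so this also swaps correctly.
    print(A[:, [1, 0]])           # [[2 1], [4 3]]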
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From aisaac at american.edu  Fri Nov 26 19:26:00 2004
    From: aisaac at american.edu (Alan G Isaac)
    Date: Fri, 26 Nov 2004 19:26:00 -0500 (Eastern Standard Time)
    Subject: [SciPy-user] switch columns
    In-Reply-To: <41A7A9AF.9020802@ucsd.edu>
    References: 
    	<41A7A9AF.9020802@ucsd.edu>
    Message-ID: 
    
    On Fri, 26 Nov 2004, Robert Kern apparently wrote:
    > When A is a Numeric array, slicing it produces a view on that slice of
    > the array rather than producing a copy.
    
    ...
    
    > Apparently (I don't use the mat object, so I'm no authority here),
    > slicing a mat object yields a copy, not a view.
    
    From chapter 11 of the Numeric documentation:
    
        Matrix.py
    
        The Matrix.py python module defines a class Matrix which
        is a subclass of UserArray. The only differences between
        Matrix instances and UserArray instances is that the *
        operator on Matrix performs a matrix multiplication, as
        opposed to element-wise multiplication, and that the
        power operator ** is disallowed for Matrix instances.
    
    So, this documentation is radically wrong, right?
    
    Thanks,
    Alan Isaac
    
    
    
    
    From bob at redivi.com  Fri Nov 26 19:43:29 2004
    From: bob at redivi.com (Bob Ippolito)
    Date: Fri, 26 Nov 2004 19:43:29 -0500
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A39A9B.3080507@enthought.com>
    References: <41A375DC.3090800@ucsd.edu> <41A39A9B.3080507@enthought.com>
    Message-ID: <5B2854F8-400D-11D9-87C6-000A9567635C@redivi.com>
    
    
    On Nov 23, 2004, at 3:16 PM, Joe Cooper wrote:
    
    > I had started in on the Enthon on OS X endeavor about six months ago, 
    > but only a few packages ever came of it.  The packaging on Mac is 
    > quite abysmal, so there isn't much to be gained by handing off the 
    > packages I did back then--you pretty much just have to build the 
    > software by hand, and then push the pretty "Go" button and it all gets 
    > piled up together into an archive.  I do have simple aap recipes for 
    > almost everything in Enthon on Windows, which you might find useful in 
    > automating your construction on Mac, since the concept is the same for 
    > the Wise installer.  It's nicer to say "aap -vf scipy.aap" than to 
    > have to go download all of the bits and pieces.
    
    You might want to consider looking at bdist_mpkg, which is shipped in 
    py2app 0.1.5.  You can use this to build an OS X metapackage from an 
    existing setup.py script (typically with no modifications!).  You can 
    then just make an Enthought metapackage that references all of these 
    and allows for them all to be installed in one fell swoop.
    
    I'll probably be releasing 0.2 somewhat soon, but I think most of the 
    user-visible changes will be that modulegraph is going to be its own 
    cross-platform package, since it is useful to other projects such as 
    py2exe, and a bunch of new recipes for py2app (PyOpenGL, PyQt, ...).
    
    -bob
    
    
    
    From bob at redivi.com  Fri Nov 26 19:47:38 2004
    From: bob at redivi.com (Bob Ippolito)
    Date: Fri, 26 Nov 2004 19:47:38 -0500
    Subject: [SciPy-user] Enthon for the Mac
    In-Reply-To: <41A4EEE9.70901@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    	<6BA0F9E6-3E4A-11D9-BEB3-000D93476BAC@astraw.com>
    	<41A4EEE9.70901@ucsd.edu>
    Message-ID: 
    
    
    On Nov 24, 2004, at 3:28 PM, Robert Kern wrote:
    
    > Andrew Straw wrote:
    >> On Nov 23, 2004, at 9:39 AM, Robert Kern wrote:
    >>> I'm gearing up to make binary packages for Scipy et al. for Mac OS 
    >>> X. I hope that these might be considered collectively as 
    >>> Enthought-Edition Python ("Enthon") for the Mac.
    >>>
    >>> So, Mac users, what do you want to see in Mac Enthon?
    >> This is great news!
    >>  From a purely selfish point of view (to help with installing my pet 
    >> project, the Vision Egg), would you be willing to include pygame (and 
    >> SDL, SDL_ttf2, libfreetype, SDL_image, SDL_mixer, SMPEG) if I built 
    >> it?
    >
    > Sure. Why not?  :-)  I'm going to want to be able to build everything 
    > myself, so all you have to do is figure out how to build it in an 
    > acceptable way. pygame itself should be packaged up with Bob 
    > Ippolito's bdist_mpkg:
    >
    > http://svn.red-bean.com/bob/py2app/trunk
    
    Don't use trunk, use 0.1.5, I'm refactoring..
    
    -bob
    
    
    
    From strawman at astraw.com  Sat Nov 27 19:15:28 2004
    From: strawman at astraw.com (Andrew Straw)
    Date: Sat, 27 Nov 2004 16:15:28 -0800
    Subject: [SciPy-user] Enthon for the Mac - pygame notes
    In-Reply-To: <41A4EEE9.70901@ucsd.edu>
    References: <41A375DC.3090800@ucsd.edu>
    	<6BA0F9E6-3E4A-11D9-BEB3-000D93476BAC@astraw.com>
    	<41A4EEE9.70901@ucsd.edu>
    Message-ID: <9C14E108-40D2-11D9-90AD-000D93476BAC@astraw.com>
    
    On Nov 24, 2004, at 12:28 PM, Robert Kern wrote:
    
    > Andrew Straw wrote:
    >> would you be willing to include pygame (and SDL, SDL_ttf2,  
    >> libfreetype, SDL_image, SDL_mixer, SMPEG) if I built it?
    >
    > Sure. Why not?  :-)
    
    Are LGPL packages going to be allowed in Enthon? pygame, SDL, and SDL_*  
    are all LGPL; we may want to keep them out of Enthon if it is meant to be  
    BSD-licensed code only...
    
    > Do the standard 1.2.7 frameworks work? or do you have to compile more  
    > things? I don't know if they include SMPEG in the frameworks.
    >
    > http://www.libsdl.org/release/SDL-1.2.7.pkg.tar.gz
    
    This standard framework appears to work.  In addition to the above  
    .pkg, you'd probably also want to distribute:
    
    http://www.libsdl.org/projects/SDL_image/release/SDL_image-1.2.3.pkg.tar.gz
    http://www.libsdl.org/projects/SDL_mixer/release/SDL_mixer-1.2.5.pkg.tar.gz
    http://www.libsdl.org/projects/SDL_ttf/release/SDL_ttf-2.0.6.pkg.tar.gz
    
    To build pygame, however, you will need the following *-devel .pkgs,  
    which include the header files.  Note that the default installation  
    location for these is ~/Library/Frameworks, whereas pygame's  
    config.py looks first in /Library/Frameworks.  So, you will either  
    have to clean out /Library/Frameworks/SDL* before building pygame or  
    manually edit pygame/config.py to check ~/Library/Frameworks first.
    
    http://www.libsdl.org/release/SDL-devel-1.2.7.pkg.tar.gz
    http://www.libsdl.org/projects/SDL_image/release/SDL_image-devel-1.2.3.pkg.tar.gz
    http://www.libsdl.org/projects/SDL_mixer/release/SDL_mixer-devel-1.2.5.pkg.tar.gz
    http://www.libsdl.org/projects/SDL_ttf/release/SDL_ttf-devel-2.0.6.pkg.tar.gz
    
    I couldn't find a framework smpeg, so that'll have to wait...
    
    > I'm going to want to be able to build everything myself, so all you  
    > have to do is figure out how to build it in an acceptable way. pygame  
    > itself should be packaged up with Bob Ippolito's bdist_mpkg:
    
    I tested this after building pygame with the above frameworks, and  
    bdist_mpkg seems to work fine for pygame.  This was my first go using  
    bdist_mpkg (I've recently returned from an OS X development hiatus...)  
    and bdist_mpkg looks awesome.  Thanks again, Bob!
    
    Note: I didn't dig into the details of how the SDL_* packages handled  
    libfreetype, libjpeg, and libpng, but I cleared out all relevant  
    .dylibs from my /usr/local/lib directory before doing this, and  
    everything appears to work, so I think it's all OK.
    
    > Packaging up pygame (and pygtk, too; you're not alone) is, I think,  
    > going to be low on my list of priorities.
    
    Understood.  But it does work. :)
    
    Cheers!
    Andrew
    
    
    
    From reikhboerger at gmx.de  Mon Nov 29 07:49:27 2004
    From: reikhboerger at gmx.de (Reik H. Börger)
    Date: Mon, 29 Nov 2004 13:49:27 +0100
    Subject: [SciPy-user] linalg.eigenvalues
    Message-ID: <1B014707-4205-11D9-BEB9-000A959A0E34@gmx.de>
    
    Hi,
    
    I encountered the following problem: when giving a symmetric matrix (so 
    only real entries) to the function linalg.eig, it can happen that the 
    eigenvalues and eigenvectors come out complex.
    
    I admit that the matrix has almost collinear columns and that the imaginary 
    part of the eigenvalues/eigenvectors is almost zero (10^-15). But this 
    shouldn't happen, since everything in this problem is only real.
    
    I came to that problem when generating multivariate normal random 
    variables with stats.rv.multivariate_normal.
    By the way, I typed the matrix into Maple, and it delivers only real 
    results. So there should be a stable algorithm.
    
    thanks for reading
    Reik
    
    
    
    From t.zito at biologie.hu-berlin.de  Mon Nov 29 07:55:55 2004
    From: t.zito at biologie.hu-berlin.de (Tiziano Zito)
    Date: Mon, 29 Nov 2004 13:55:55 +0100
    Subject: [SciPy-user] linalg.eigenvalues
    In-Reply-To: <1B014707-4205-11D9-BEB9-000A959A0E34@gmx.de>
    References: <1B014707-4205-11D9-BEB9-000A959A0E34@gmx.de>
    Message-ID: <20041129125555.GA10919@itb.biologie.hu-berlin.de>
    
    > Hi,
    > 
    > I encountered the following problem: when giving a symmetric matrix (so 
    > only real entries) to the function linalg.eig, it can happen that the 
    > eigenvalues and eigenvectors come out complex.
    
    You may want to have a look at:
    http://mdp-toolkit.sourceforge.net/symeig.html
    
    The symeig module contains a Python wrapper for the LAPACK functions
    to solve the standard and generalized eigenvalue problems for
    symmetric (hermitian) positive definite matrices. Those specialized
    algorithms give an important speed-up with respect to the generic
    LAPACK eigenvalue problem solver used by SciPy (scipy.linalg.eig).
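    
    In current SciPy the same idea is available as scipy.linalg.eigh (an
    assumed modern spelling, not necessarily present in the 2004 release);
    it returns real eigenvalues for real symmetric input, e.g.:
    
    import numpy as np
    from scipy import linalg
    
    a = np.array([[2.0, 1.0],
                  [1.0, 2.0]])          # small symmetric test matrix
    
    w, v = linalg.eigh(a)               # symmetric/Hermitian solver
    print(w)                            # [1. 3.] -- real dtype
    
    wg = linalg.eig(a)[0]               # generic solver
    print(wg)                           # [1.+0.j 3.+0.j] -- complex dtype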
    
    hth, 
    tiziano
    
    
    
    From rkern at ucsd.edu  Mon Nov 29 15:32:36 2004
    From: rkern at ucsd.edu (Robert Kern)
    Date: Mon, 29 Nov 2004 12:32:36 -0800
    Subject: [SciPy-user] linalg.eigenvalues
    In-Reply-To: <1B014707-4205-11D9-BEB9-000A959A0E34@gmx.de>
    References: <1B014707-4205-11D9-BEB9-000A959A0E34@gmx.de>
    Message-ID: <41AB8764.9050007@ucsd.edu>
    
    Reik H. Börger wrote:
    > Hi,
    > 
    > I encountered the following problem: when giving a symmetric matrix (so 
    > only real entries) to the function linalg.eig, it can happen that the 
    > eigenvalues and eigenvectors come out complex.
    > 
    > I admit that the matrix has almost collinear columns and that the imaginary 
    > part of the eigenvalues/eigenvectors is almost zero (10^-15). But this 
    > shouldn't happen, since everything in this problem is only real.
    
    You are running into the problems of finite-precision floating point 
    arithmetic. Values around 1e-15 *are* zero for practical purposes. Since 
    you know that the output values should be real (up to numerical 
    precision), you can just take the real part. That is more or less what 
    Maple is doing.
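    
    A sketch of that advice (assuming a current NumPy/SciPy; the exact
    versions are an assumption, but the idea is the same for Numeric):
    
    import numpy as np
    from scipy import linalg
    
    a = np.array([[1.0, 1.0 - 1e-12],
                  [1.0 - 1e-12, 1.0]])      # nearly collinear columns
    
    w = linalg.eig(a)[0]
    print(w)                    # complex dtype; imaginary parts are ~0
    
    # Once you know the answer must be real, drop the imaginary part:
    print(w.real)
    print(np.real_if_close(w, tol=1000))    # real array if Im(w) is tiny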
    
    I also recommend reading "What Every Computer Scientist Should Know 
    About Floating-Point Arithmetic":
    
    http://docs.sun.com/source/806-3568/ncg_goldberg.html
    
    -- 
    Robert Kern
    rkern at ucsd.edu
    
    "In the fields of hell where the grass grows high
      Are the graves of dreams allowed to die."
       -- Richard Harter
    
    
    
    From Fernando.Perez at colorado.edu  Tue Nov 30 19:35:29 2004
    From: Fernando.Perez at colorado.edu (Fernando Perez)
    Date: Tue, 30 Nov 2004 17:35:29 -0700
    Subject: [SciPy-user] ANN: IPython 0.6.5 is out
    Message-ID: <41AD11D1.2070401@colorado.edu>
    
    Hi all,
    
    [sorry for those who'll get this twice, but I figure that enough scipy users 
    are also ipython users that at this point the cross-post is forgiven]
    
    [Note better emacs support at the end, which I forgot in the ipython list msg]
    
    
    I'm glad to announce that IPython 0.6.5 is finally out. IPython's homepage is at:
    
    http://ipython.scipy.org
    
    and downloads are at:
    
    http://ipython.scipy.org/dist
    
    I've provided RPMs for Python 2.2 and 2.3, plus source downloads (.tar.gz and
    .zip).
    
    Debian, Fink and BSD packages for this version should be coming soon, once the
    respective maintainers (many thanks to Jack Moffit, Andrea Riciputi and Dryice
    Liu) have the time to follow their packaging procedures.
    
    Many thanks to Enthought for their continued hosting support for IPython, and
    to all the users who contributed ideas, fixes and reports.
    
    I'd promised that 0.6.4 would be the last version before the cleanup, but
    Prabhu Ramachandran managed to resuscitate the GUI threading support which I'd
    worked on recently, but disabled after thinking it could not work.  It turns
    out we were very close, and Prabhu did fix the remaining problems.  Since this
    is a fairly significant improvement, I decided to make a release for it.  In
    the process I added a few minor other things.
    
    
    *** WHAT is IPython? IPython tries to:
    
    1. Provide an interactive shell superior to Python's default. IPython has many
    features for object introspection, system shell access, and its own special
    command system for adding functionality when working interactively.
    
    2. Serve as an embeddable, ready to use interpreter for your own programs.
    IPython can be started with a single call from inside another program,
    providing access to the current namespace.
    
    3. Offer a flexible framework which can be used as the base environment for
    other systems with Python as the underlying language.
    
    
    *** NEW for this release:
    
    As always, the complete NEWS file can be found at
    http://ipython.scipy.org/NEWS, and the whole ChangeLog at
    http://ipython.scipy.org/ChangeLog.
    
    * Threading support for WXPython and pyGTK.  It is now possible (with the
    -wthread and -gthread options) to control wx/gtk graphical interfaces from
    within an interactive ipython shell.  Note that your wx/gtk libs need to be
    compiled with threading support for this to work.  There is also experimental
    (but brittle) support for running Tkinter graphical interfaces alongside
    wx or gtk ones.
    
    * New -d option to %run, for executing whole scripts with the interactive pdb
    debugger.  This allows you to step, watch variables, set breakpoints, etc,
    without having to modify your scripts in any way (see the short example
    after this list).
    
    * Added filtering support for variable types to %who and %whos. You can now
    say 'whos function str' and whos will only list functions and strings, instead
    of all variables.  Useful when working with crowded namespaces. (For some
    reason I forgot to document this in the ChangeLog).
    
    * Added ipython.el to the end-user distribution, for (X)Emacs support, since
    now the official python-mode.el from
    
    http://sourceforge.net/projects/python-mode
    
    has all the necessary fixes for ipython to work correctly (in CVS at this 
    moment, it will go into the next release I suppose).
    
    * Other minor fixes and cleanups, both to code and documentation.
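    
    For illustration, a short session exercising the %run -d and %whos
    filtering features described above (myscript.py is a made-up name):
    
    In [1]: %run -d myscript.py     # run the whole script under pdb:
                                    # step, set breakpoints, inspect, etc.
    
    In [2]: %whos function str      # list only functions and strings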
    
    
    Enjoy, and as usual please report any problems.
    
    Regards,
    
    Fernando.