From oliphant at ee.byu.edu Sat Oct 1 23:56:58 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 01 Oct 2005 21:56:58 -0600 Subject: [SciPy-dev] Always including lapack_lite and blas_lite in sdist Message-ID: <433F5A8A.3090508@ee.byu.edu> Eric Firing noticed a problem with the newcore setup.py script in that the blas_lite and lapack_lite files are missing. This is because, on my system, I used blas and so didn't compile those. I'm wondering how we can detect in the setup.py file that an sdist command has been issued and so include the blas_lite and lapack_lite files even if somebody has atlas installed. Does somebody know how to tell what command was run when we are inside a configuration? -Travis From Fernando.Perez at colorado.edu Sun Oct 2 02:14:16 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Sun, 02 Oct 2005 00:14:16 -0600 Subject: [SciPy-dev] Always including lapack_lite and blas_lite in sdist In-Reply-To: <433F5A8A.3090508@ee.byu.edu> References: <433F5A8A.3090508@ee.byu.edu> Message-ID: <433F7AB8.6040002@colorado.edu> Travis Oliphant wrote: > Eric Firing noticed a problem with the newcore setup.py script in that > the blas_lite and lapack_lite files are missing. This is because, on > my system, I used blas and so didn't compile those. > > I'm wondering how we can detect in the setup.py file that an sdist > command has been issued and so include the blas_lite and lapack_lite > files even if somebody has atlas installed. > > Does somebody know how to tell what command was run when we are inside a > configuration? In most cases, a test like if 'sdist' in sys.argv: ... should work, I would think. Unless something is actively rewriting sys.argv, even other scripts called by setup.py should see this. Or am I missing something more subtle? 
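Fernando's suggested check, fleshed out into a minimal runnable sketch; the helper name `building_sdist` and the source-file list are illustrative, not taken from the actual newcore setup.py:

```python
import sys

def building_sdist(argv=None):
    """Return True when this setup.py invocation includes the sdist command."""
    argv = sys.argv if argv is None else argv
    return 'sdist' in argv

# When cutting a source distribution, always ship the fallback sources,
# even if an optimized BLAS/LAPACK was detected on the build machine.
sources = ['wrapper.c']  # hypothetical base source list
if building_sdist(['setup.py', 'sdist']):  # simulated argv for illustration
    sources += ['blas_lite.c', 'lapack_lite.c']
print(sources)
```

In a real setup.py one would call `building_sdist()` with no argument so that it inspects the actual sys.argv of the running command.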
Cheers, f From oliphant at ee.byu.edu Sun Oct 2 02:20:55 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 02 Oct 2005 00:20:55 -0600 Subject: [SciPy-dev] Always including lapack_lite and blas_lite in sdist In-Reply-To: <433F7AB8.6040002@colorado.edu> References: <433F5A8A.3090508@ee.byu.edu> <433F7AB8.6040002@colorado.edu> Message-ID: <433F7C47.3050106@ee.byu.edu> Fernando Perez wrote: > Travis Oliphant wrote: > >> Eric Firing noticed a problem with the newcore setup.py script in >> that the blas_lite and lapack_lite files are missing. This is >> because, on my system, I used blas and so didn't compile those. >> >> I'm wondering how we can detect in the setup.py file that an sdist >> command has been issued and so include the blas_lite and lapack_lite >> files even if somebody has atlas installed. >> >> Does somebody know how to tell what command was run when we are >> inside a configuration? > > > In most cases, a test like > > if 'sdist' in sys.argv: > ... > > should work, I would think. Unless something is actively rewriting > sys.argv, even other scripts called by setup.py should see this. Or > am I missing something more subtle? > No, I think you got it. Some things just escape me :-) Thanks for the help. -Travis From oliphant at ee.byu.edu Sun Oct 2 02:31:23 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 02 Oct 2005 00:31:23 -0600 Subject: [SciPy-dev] New SciPy Branch created Message-ID: <433F7EBB.7030104@ee.byu.edu> I just started work on getting scipy ported to the new core.
I made a new branch called newscipy. It is at http://svn.scipy.org/svn/scipy/branches/newscipy; get it with: svn co http://svn.scipy.org/svn/scipy/branches/newscipy newscipy From stephen.walton at csun.edu Sun Oct 2 02:44:15 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Sat, 01 Oct 2005 23:44:15 -0700 Subject: [SciPy-dev] New SciPy Branch created In-Reply-To: <433F7EBB.7030104@ee.byu.edu> References: <433F7EBB.7030104@ee.byu.edu> Message-ID: <433F81BF.2030502@csun.edu> Travis Oliphant wrote: > > I just started work on getting scipy ported to the new core. > > I made a new branch called newscipy I actually erased the previous install of "old scipy" on my laptop and replaced it with newcore. With this installation, the setup.py lines in newscipy which begin with "import scipy_distutils" need to be replaced with "import scipy.distutils", do they not? Or is newcore going to move scipy_distutils back up into its old location in the hierarchy? From rkern at ucsd.edu Sun Oct 2 02:46:57 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sat, 01 Oct 2005 23:46:57 -0700 Subject: [SciPy-dev] New SciPy Branch created In-Reply-To: <433F81BF.2030502@csun.edu> References: <433F7EBB.7030104@ee.byu.edu> <433F81BF.2030502@csun.edu> Message-ID: <433F8261.4080102@ucsd.edu> Stephen Walton wrote: > Travis Oliphant wrote: > >> I just started work on getting scipy ported to the new core. >> >> I made a new branch called newscipy > > I actually erased the previous install of "old scipy" on my laptop and > replaced it with newcore. With this installation, the setup.py lines in > newscipy which begin with "import scipy_distutils" need to be replaced > with "import scipy.distutils", do they not? Yes, among many other things. Work is just beginning on this. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die."
-- Richard Harter From oliphant at ee.byu.edu Sun Oct 2 02:47:14 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 02 Oct 2005 00:47:14 -0600 Subject: [SciPy-dev] New SciPy Branch created In-Reply-To: <433F81BF.2030502@csun.edu> References: <433F7EBB.7030104@ee.byu.edu> <433F81BF.2030502@csun.edu> Message-ID: <433F8272.9040804@ee.byu.edu> Stephen Walton wrote: > Travis Oliphant wrote: > >> >> I just started work on getting scipy ported to the new core. >> >> I made a new branch called newscipy > > > I actually erased the previous install of "old scipy" on my laptop and > replaced it with newcore. With this installation, the setup.py lines > in newscipy which begin with "import scipy_distutils" need to be > replaced with "import scipy.distutils", do they not? Or is newcore > going to move scipy_distutils back up into its old location in the > hierarchy? Yes, one of the needed changes is to replace scipy_distutils with scipy.distutils and scipy_base with scipy.base. I will place a file called NEEDED_CHANGES.txt in the main directory and, as people think of things that need to be done, they can add them there. -Travis From pearu at scipy.org Sun Oct 2 13:42:12 2005 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 2 Oct 2005 12:42:12 -0500 (CDT) Subject: [SciPy-dev] Always including lapack_lite and blas_lite in sdist In-Reply-To: <433F7C47.3050106@ee.byu.edu> References: <433F5A8A.3090508@ee.byu.edu> <433F7AB8.6040002@colorado.edu> <433F7C47.3050106@ee.byu.edu> Message-ID: On Sun, 2 Oct 2005, Travis Oliphant wrote: > Fernando Perez wrote: > >> Travis Oliphant wrote: >> >>> Eric Firing noticed a problem with the newcore setup.py script in that the >>> blas_lite and lapack_lite files are missing. This is because, on my >>> system, I used blas and so didn't compile those. >>> >>> I'm wondering how we can detect in the setup.py file that an sdist command >>> has been issued and so include the blas_lite and lapack_lite files even if >>> somebody has atlas installed.
>>> >>> Does somebody know how to tell what command was run when we are inside a >>> configuration? >> >> >> In most cases, a test like >> >> if 'sdist' in sys.argv: >> ... >> >> should work, I would think. Unless something is actively rewriting >> sys.argv, even other scripts called by setup.py should see this. Or am I >> missing something more subtle? >> > No, I think you got it. Some things just escape me :-) Using `sdist in sys.argv` works, but not in general. scipy.distutils supports a more general solution that, in addition to the above problem, can handle cases where the configuration also depends on the platform (e.g. creating distributions on a different platform from the one where they will be used). So, I wouldn't use `sdist in sys.argv` but rather the methods provided by scipy.distutils: for example, the depends keyword argument to the Configuration.add_extension method, or functions that can filter out unnecessary sources, etc. If you have already used `sdist in sys.argv`, that's OK; I may just replace this approach with another one. Regards, Pearu From stephen.walton at csun.edu Sun Oct 2 16:09:53 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Sun, 02 Oct 2005 13:09:53 -0700 Subject: [SciPy-dev] New SciPy Branch created In-Reply-To: <433F8272.9040804@ee.byu.edu> References: <433F7EBB.7030104@ee.byu.edu> <433F81BF.2030502@csun.edu> <433F8272.9040804@ee.byu.edu> Message-ID: <43403E91.5090309@csun.edu> Travis Oliphant wrote: > I will place a file called NEEDED_CHANGES.txt in the main directory > and as people think of things that need to be done place it there. Well, I'd love to help with this, but I don't know enough about how the new scipy/distutils works. Do we need some input from Pearu? In particular, what should default_config_dict and merge_config_dicts be replaced with?
From oliphant at ee.byu.edu Sun Oct 2 18:32:53 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 02 Oct 2005 16:32:53 -0600 Subject: [SciPy-dev] Always including lapack_lite and blas_lite in sdist In-Reply-To: References: <433F5A8A.3090508@ee.byu.edu> <433F7AB8.6040002@colorado.edu> <433F7C47.3050106@ee.byu.edu> Message-ID: <43406015.4000402@ee.byu.edu> Pearu Peterson wrote: > > Using `sdist in sys.argv` works but not in general. scipy.distutils > supports a general solution that in addition to the above problem, it > can handle cases where configuration depends also on a platform (e.g. > creating distributions on different platform from where it will be used). > So, I wouldn't use `sdist in sys.argv` but methods provided by > scipy.distutils. For example, using depends keyword argument to > Configuration.add_extension method, or functions that can filter out > unnecessary sources, etc. If you have already used `sdist in > sys.argv`, that's ok, just I may replace this approach with another one. Great, I was hoping you would find a better solution. 'sdist' in sys.argv seems like a bit of a hack. -Travis From oliphant at ee.byu.edu Sun Oct 2 18:42:00 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 02 Oct 2005 16:42:00 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of SciPy Core 0.4 (Beta) In-Reply-To: <20051002213837.70eec02c.gerard.vermeulen@grenoble.cnrs.fr> References: <433CE24A.6040509@ee.byu.edu> <20051002213837.70eec02c.gerard.vermeulen@grenoble.cnrs.fr> Message-ID: <43406238.6050806@ee.byu.edu> Gerard Vermeulen wrote: >I have found two problems: > >(1) scipy_core-0.4.1 does not compile on this system without ATLAS, > because there are missing files: >... > > This should be fixed (with a better solution forthcoming). > >(2) when building scipy_core from SVN the header files do not get installed. > The install section from my RPM SPEC file reads: > > Now, I'm not sure what is meant here.
They install every time for me, but currently not to the main Python include directory. I'm not sure I like having the headers not installed to the Python include directory. It will be harder for people to use scipy_core as a replacement for Numeric if they have to re-write their setup.py scripts to find the headers. The Python include directory is included by default. Is it as easy for people to just use the setup from scipy.distutils? -Travis From oliphant at ee.byu.edu Sun Oct 2 18:44:28 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 02 Oct 2005 16:44:28 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Re: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: References: <433CE24A.6040509@ee.byu.edu> Message-ID: <434062CC.1060802@ee.byu.edu> Rob Managan wrote: > OK, I am giving this a go. > > I seem to be not setting a flag about some math functions availability. > There seems to be a problem with the configuration on Mac OS X. Anybody able to track down why we get a blank MATHLIB in the config.h file? -Travis From rkern at ucsd.edu Sun Oct 2 20:24:00 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sun, 02 Oct 2005 17:24:00 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43406238.6050806@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> <20051002213837.70eec02c.gerard.vermeulen@grenoble.cnrs.fr> <43406238.6050806@ee.byu.edu> Message-ID: <43407A20.2040303@ucsd.edu> Travis Oliphant wrote: > Gerard Vermeulen wrote: >> (2) when building scipy_core from SVN the header files do not get >> installed. >> The install section from my RPM SPEC file reads: >> > Now, I'm not sure what is meant here. They install every time for me, > but currently not to the main Python include directory. > > I'm not sure I like having the headers not installed to the Python > include directory.
It will be harder for people to use scipy_core as a > replacement to Numeric if they have to re-write their setup.py scripts > to find the headers. The Python include directory is included by default. > Is it as easy for people to just use the setup from scipy.distutils? Some people *can't* install to the Python include directory. It was never a reliable place to put package header files. No one should have to totally rewrite their setup.py scripts, just add two lines: from scipy.distutils.misc_util import get_scipy_include_dirs myext = Extension('myext', ... include_dirs=get_scipy_include_dirs()) -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rkern at ucsd.edu Sun Oct 2 20:40:53 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sun, 02 Oct 2005 17:40:53 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Re: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: <434062CC.1060802@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> <434062CC.1060802@ee.byu.edu> Message-ID: <43407E15.8090800@ucsd.edu> Travis Oliphant wrote: > Rob Managan wrote: > >> OK, I am giving this a go. >> >> I seem to be not setting a flag about some math functions availability. > > There seems to be a problem with the configuration on Mac OS X. Anybody > able to track down why we get a blank MATHLIB in the config.h file? libm is always implicitly linked. "-lm" is never necessary although it never hurts (nor helps!). However, it looks like 10.3's math.h doesn't include any definitions for the long double versions of math functions. floorl() is defined in the library, however. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die."
-- Richard Harter From oliphant at ee.byu.edu Sun Oct 2 21:37:48 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 02 Oct 2005 19:37:48 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43407A20.2040303@ucsd.edu> References: <433CE24A.6040509@ee.byu.edu> <20051002213837.70eec02c.gerard.vermeulen@grenoble.cnrs.fr> <43406238.6050806@ee.byu.edu> <43407A20.2040303@ucsd.edu> Message-ID: <43408B6C.2040006@ee.byu.edu> Robert Kern wrote: >No one should have to totally rewrite their setup.py scripts, just add >two lines: > > from scipy.distutils.misc_util import get_scipy_include_dirs > > myext = Extension('myext', ... > include_dirs=get_scipy_include_dirs()) > > What I'm saying is that the default setup command needs to include the directory where the headers are installed as well. Yes, this is only two lines, but I think it will cause issues for current packages that already use Numeric. They will have to re-write their setup.py scripts before they can use scipy_core. How standard is it to put headers all over the place? It seems much more natural to have one or two include root trees where all headers go. I also think the default place to install headers should be where Python installs them. This should only change if the user cannot write there for whatever reason. On my system, for example, it seems silly to install to /usr/lib/python2.4/site-packages/scipy/base/include/scipy when, because I can write to site-packages (and hence to /usr/lib), I could also write to /usr/include/python2.4/scipy. If the user can't write to the default include directory, then just place the headers into another "user-default" directory -- either specified by the user or some higher-level place: If scipy is installed here: /some/foo/directory/scipy then the headers should be here: /some/foo/directory/include/scipy and not where they currently are (clear down in the guts of scipy).
Preferably, I suppose you would just install to $PREFIX/include/pythonX.X/scipy where the user can fix the PREFIX. I'm really concerned about making it easy for people to build their packages against the new scipy_core system. This new include system seems strange to me. Are there other packages out there that offer a C-API? How do they package their include files? -Travis From rkern at ucsd.edu Sun Oct 2 22:26:46 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sun, 02 Oct 2005 19:26:46 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43408B6C.2040006@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> <20051002213837.70eec02c.gerard.vermeulen@grenoble.cnrs.fr> <43406238.6050806@ee.byu.edu> <43407A20.2040303@ucsd.edu> <43408B6C.2040006@ee.byu.edu> Message-ID: <434096E6.3030904@ucsd.edu> Travis Oliphant wrote: > Robert Kern wrote: > >> No one should have to totally rewrite their setup.py scripts, just add >> two lines: >> >> from scipy.distutils.misc_util import get_scipy_include_dirs >> >> myext = Extension('myext', ... >> include_dirs=get_scipy_include_dirs()) > > What I'm saying is that the default setup command needs to include the > directory where the headers are installed as well. > > Yes, this is only two lines, but I think it will cause issues for > current packages that already use Numeric. They will have to re-write > their setup.py scripts before they can use scipy_core. I think that overstates the issue. They don't have to rewrite their setup.py scripts from top to bottom. They just add to them. There are no other complications. Odds are that most setup.py scripts try to import Numeric to check for its existence; people will likely have to do some touch-ups to them regardless. > How standard is it to put headers all over the place. It seems much > more natural to have one or two include root trees where all headers go.
> > I also think the default place to install headers should be where Python > installs them. This should only change if the user cannot write there > for whatever reason. > On my system, for example, it seems silly to install to > > /usr/lib/python2.4/site-packages/scipy/base/include/scipy > when because I can write to site-packages I can write to /usr/lib, I > could also write to > > /usr/include/python2.4/scipy I also don't see the harm in it. > If the user can't write to the default include directory, then just > place the headers into another "user-default" directory -- either > specified by the user or some higher-level place: > > If scipy is installed here: > > /some/foo/directory/scipy > > then the headers should be here: > > /some/foo/directory/include/scipy > > and not where they currently are (clear down in the guts of scipy). > > Preferably, I suppose you would just install to > > $PREFIX/include/pythonX.X/scipy > > where the user can fix the PREFIX. $PREFIX isn't reliable. Some setups don't have the UNIX-like $prefix/lib, $prefix/include, etc. structure. For instance, non-root users on Macs would install packages to ~/Library/Python/2.4/site-packages/. There's no place for non-root users to install include files. Then those users will have to modify every scipy-using package's setup.py to point to the nonstandard include directory. I would much prefer there to be one standard way to get the include directory for scipy. That way, it could be done in the upstream package's setup.py *once* and it will work unchanged on *every* system. There was never a standard secondary directory for when you couldn't install to the Python include directory. People had to find a place to put the headers and manually modify the setup.py scripts to look there for headers, too. Putting the headers in the package also allows scipy_core to be packaged as a PythonEgg, which does not bundle things installed by install_header (nor can it be modified to do so reliably).
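The scheme being argued for here — shipping the headers inside the package and asking the package itself where they live — can be sketched as follows; the `fakepkg` module is a stand-in constructed only to demonstrate the lookup:

```python
import os
import types

def get_include(pkg):
    # Headers installed inside a package sit next to its __init__.py, so
    # this lookup works wherever the package lands: site-packages, a home
    # directory such as ~/Library/Python/2.4/site-packages/, or an Egg.
    return os.path.join(os.path.dirname(pkg.__file__), 'include')

# Stand-in for an installed package; with a real package you would pass
# the imported module (or call a helper the package provides) instead.
fakepkg = types.ModuleType('fakepkg')
fakepkg.__file__ = '/opt/site-packages/fakepkg/__init__.py'
print(get_include(fakepkg))
```

This is exactly the kind of lookup a helper like get_scipy_include_dirs() can perform internally, which is why it works unchanged on every system.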
Packaging the headers in the package itself induces a very small one-time cost and resolves problems that we've had for years. > I'm really concerned about making it easy for people to be able to build > their packages against the new scipy_core system easily. This new > include system seems strange to me. Are there other packages out there > that offer a C-API? How do they package their include file? AFAICT, we're the only community (Numeric, numarray, Konrad's Scientific) that installs header files into the Python include directory. numarray has a way to inquire from the package where its headers were installed just like I'm suggesting, although they install headers to the Python directory anyways. That mechanism fails when numarray is bundled as an Egg since it is expecting to locate the headers in a place external to the Egg. wxPython installs headers with the wxWidgets headers (also breaking Eggs). The egenix tools (mxBeeBase, mxStack, etc.) install headers into the package itself. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Mon Oct 3 05:08:56 2005 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 3 Oct 2005 04:08:56 -0500 (CDT) Subject: [SciPy-dev] New SciPy Branch created In-Reply-To: <43403E91.5090309@csun.edu> References: <433F7EBB.7030104@ee.byu.edu> <433F81BF.2030502@csun.edu> <433F8272.9040804@ee.byu.edu> <43403E91.5090309@csun.edu> Message-ID: On Sun, 2 Oct 2005, Stephen Walton wrote: > Travis Oliphant wrote: > >> I will place a file called NEEDED_CHANGES.txt in the main directory and as >> people think of things that need to be done place it there. > > Well, I'd love to help with this, but I don't know enough about how the new > scipy/distutils works. Do we need some input from Pearu? In particular, > what should default_config_dict and merge_config_dicts be replaced with? It's a sensible request.
I'll create a tutorial on scipy.distutils as soon as possible. Until then, here's a short overview of what setup.py files in scipy should look like: 1) First, it must define a function configuration(parent_package,top_path) that returns a dictionary or a Configuration instance. Here's a simple example: def configuration(parent_package='',top_path=None): config = Configuration('mypackage',parent_package,top_path) return config # or config.todict() that is suitable for a pure python package called 'mypackage'. 2) If a package contains subpackages, then they can be specified using the add_subpackage() method, for example, config.add_subpackage('mysubpackage') Of course, mysubpackage's setup.py should follow the same convention as any scipy setup.py file. 3) If a package contains extensions, then they can be specified using the add_extension() method, for example, config.add_extension('myextension',sources,..) 4) If a package contains data files, then they can be specified using the add_data_files() method, for example, config.add_data_files('datafile1','datafile2',..) or the add_data_dir() method, for example, config.add_data_dir('datadir') 5) If a package needs to install header files, then the header files should be specified using the add_headers method, for example, config.add_headers('headerfile1','headerfile2',..) 6) If a package's extension modules use local headers for compilation, then the location of the header files can be specified using the add_include_dirs method, for example, config.add_include_dirs('includepath1','includepath2',..) 7) If a package needs to install scripts, then the scripts can be specified using the add_scripts method, for example, config.add_scripts('script1','script2',..) 8) If a package has libraries, then they can be specified using the add_library method. The add_library method is not fully tested as of yet.
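Points 1) through 7) above, combined into one hypothetical setup.py sketch. The Configuration class below is a minimal stand-in that merely records the calls, so the sketch runs on its own; with the real package you would instead import Configuration from scipy.distutils.misc_util, and all the package, file, and directory names are made up for illustration:

```python
class Configuration:
    """Minimal stand-in for scipy.distutils' Configuration: records calls."""
    def __init__(self, name, parent_package='', top_path=None):
        self.name = name  # the stand-in ignores parent_package and top_path
        self.calls = []

    def __getattr__(self, method):
        # Any add_* method just records its own name for inspection.
        def record(*args, **kwargs):
            self.calls.append(method)
        return record

def configuration(parent_package='', top_path=None):
    config = Configuration('mypackage', parent_package, top_path)
    config.add_subpackage('mysubpackage')                           # point 2
    config.add_extension('myextension', sources=['myextension.c'])  # point 3
    config.add_data_dir('datadir')                                  # point 4
    config.add_headers('headerfile1')                               # point 5
    config.add_include_dirs('includepath1')                         # point 6
    config.add_scripts('script1')                                   # point 7
    return config  # or config.todict() with the real Configuration

print(configuration().calls)
```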
There are several helper methods on a Configuration instance, for example, paths, get_config_cmd, get_build_temp_dir, have_f77c, have_f90c, get_version, make_svn_version_py, get_subpackage, get_distribution, todict, etc, but I'll describe them later. Also, the usage of the already mentioned methods needs some comments to deal with some special cases that may occur in practice, e.g. how to define packages whose directory names don't match their package names, etc. A Configuration instance also has several useful attributes such as name, local_path, top_path, but in general it can have any attributes that the distutils.core.setup function can accept. Also, one of the coolest features in scipy.distutils is that files in sources or other similar arguments may be functions that are called during building steps to generate sources on the fly. I'll need to explain the usage of such functions as well. If I forgot to mention something important, it will appear in the scipy.distutils howto or tutorial document.. unless someone asks about it first.. Pearu From stephen.walton at csun.edu Mon Oct 3 11:23:03 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 03 Oct 2005 08:23:03 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of SciPy Core 0.4 (Beta) In-Reply-To: <4340C6FA.1050307@optusnet.com.au> References: <433CE24A.6040509@ee.byu.edu> <4340B872.8020102@optusnet.com.au> <4340BE90.6080302@pfdubois.com> <4340C6FA.1050307@optusnet.com.au> Message-ID: <43414CD7.8020507@csun.edu> Tim Churches wrote: >That would be a most welcome development. If you are in the mood for >some minor extensions to MA (you probably aren't, but just on the >off-chance...), then support for multiple masking values would be great. > > This sounds like an overly complicated addition to MA.
If I may be so bold, it seems to me that what you might want is your own object which maintains parallel arrays of values and "mask reasons," if you will, and generates a mask array accordingly. From stephen.walton at csun.edu Mon Oct 3 16:02:34 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 03 Oct 2005 13:02:34 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of SciPy Core 0.4 (Beta) In-Reply-To: <4340B872.8020102@optusnet.com.au> References: <433CE24A.6040509@ee.byu.edu> <4340B872.8020102@optusnet.com.au> Message-ID: <43418E5A.5040105@csun.edu> Tim Churches wrote: >However, can I ask if there are plans to add Masked Arrays to SciPy >Core? > Eric Firing pointed out on the matplotlib mailing list that they're already there: import scipy.base.ma as ma From oliphant at ee.byu.edu Mon Oct 3 17:02:12 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 03 Oct 2005 15:02:12 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of SciPy Core 0.4 (Beta) In-Reply-To: <43418E5A.5040105@csun.edu> References: <433CE24A.6040509@ee.byu.edu> <4340B872.8020102@optusnet.com.au> <43418E5A.5040105@csun.edu> Message-ID: <43419C54.6030003@ee.byu.edu> Stephen Walton wrote: > Tim Churches wrote: > >> However, can I ask if there are plans to add Masked Arrays to SciPy >> Core? >> > Eric Firing pointed out on the matplotlib mailing list that they're > already there: > > import scipy.base.ma as ma from scipy import ma also works (everything under scipy.base is also under scipy alone).
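The masked-array interface being pointed at here survives in modern NumPy as numpy.ma, so a minimal sketch of its use (assuming numpy is installed) looks like:

```python
import numpy.ma as ma

# A four-element array with the third value masked out.
x = ma.array([1.0, 2.0, 3.0, 4.0], mask=[0, 0, 1, 0])
print(x.count())  # number of unmasked entries: 3
print(x.mean())   # masked entries are excluded: (1 + 2 + 4) / 3
```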
-Travis From oliphant at ee.byu.edu Mon Oct 3 20:23:58 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 03 Oct 2005 18:23:58 -0600 Subject: [SciPy-dev] Purchasing Documentation In-Reply-To: <4341C1C8.3000506@optusnet.com.au> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> Message-ID: <4341CB9E.4050306@ee.byu.edu> Tim Churches wrote: >Eric Firing wrote: > > >>>OK, thanks. In the absence of documentation, I just looked for an MA >>>subdirectory, couldn't find one and assumed that it wasn't (yet) >>>supported. >>> >>> >>Tim, >> >>Documentation is coming along, but being made available in an unusual >>way: http://www.tramy.us/ >> >> > >copy of the documentation. Most likely, one copy of the documentation >will be purchased to be shared between several (or many) users in a >workgroup within an institution. > Note that it is expressly against the agreement for one copy to be shared between multiple users at the same institution. I hope this is clear.... Of course you can let somebody else look at it a couple of times, but if they will use it regularly, they need to get their own copy. Prices are always a matter of supply and demand. The whole point of the system is to allow the price system to help coordinate what people think is valuable and what developers spend time doing. What you see currently is the maximum price (and time) I could possibly set as per the agreement with Trelgol. These things can always come down, however, as time proceeds, and the market responds. Now, obviously the cost of the documentation includes something of the cost of producing the actual code. Of course, you may disagree, but I did choose the numbers based on a little bit of market research. 
I don't think that 7000 copies of the documentation or 7 years is all that ridiculous given that there have been over 12000 downloads of the Numeric 24.0b2 source code since April and Numeric has been in stable use for nearly 10 years. If scipy does its job correctly, then a user base of 7000 needing documentation is rather low-balling it, I would say. I want scipy to surpass the number of users of Numeric. I'm trying to make scipy core so that everybody can convert to it, eventually. The old Numeric manual still provides documentation, and the source is still available. I think you are still getting a great deal. Unless there is another major rewrite, the documentation will be updated as it goes (and you get the updates). >I would say that perhaps $30k or one year (after completion of the >documentation) would be more reasonable criteria for making the >documentation freely available (but then I am not writing it). > > Well, given the time I had to spend on this, that is quite a bit less than the market will bear for my services elsewhere. I suppose if I were rich, I could donate more of my time. But, I'm not.... I'm really not trying to make people upset. I'm really a huge fan of open information, and would love to see the documentation completely free. It's just that I cannot afford to create it for nothing. I have lots of demands on my time. Spending it doing scientific python has to be justified, somehow. I did not start the creation of a hybrid Numeric / numarray with the hope of making any money. I started it because there was a need. I thought I would get more help with its implementation. But, given the reality of people's scarce time (they need to make money...), nobody was able to help me. Out of this, the documentation idea was born to try and help fund development of scipy core. I hope people can understand that the reality of scarcity dictates that we coordinate efforts through some mechanism.
The price mechanism has been the most successful large-scale mechanism yet developed. I am interested in feedback. If you don't buy the book because you think I'm asking too much money, then let me know, as Tim has done. You can email me directly, as well. Best regards, -Travis From oliphant at ee.byu.edu Mon Oct 3 20:48:53 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 03 Oct 2005 18:48:53 -0600 Subject: [SciPy-dev] Some benchmarks of new SVN scipy Message-ID: <4341D175.4050700@ee.byu.edu> Chris did some early benchmarks and we were able to speed up some things in current SciPy. Attached is the code. Here are my results for various values of N, and Nl (using SVN scipy_core):

  loops   matrix size    time for numarray   time for numeric   time for scipy
     50   1000 x 1000    10.834883213        33.9484729767      9.82745909691
    500   100 x 100      0.684185028076      1.40390586853      0.416875839233
   5000   10 x 10        1.48704504967       0.419279098511     0.906091928482
   5000   1 x 1          1.45864701271       0.344217061996     0.856736898422

Conclusion: SciPy is as fast as (or faster than) numarray for large arrays (at least in this test), but for many small arrays can still be a bit slower (there is currently more setup time for the ufunc loop). -------------- next part -------------- A non-text attachment was scrubbed... Name: test_bigarray.py Type: application/x-python Size: 1045 bytes Desc: not available URL: From rkern at ucsd.edu Tue Oct 4 05:20:07 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 04 Oct 2005 02:20:07 -0700 Subject: [SciPy-dev] Proposal: scipy.sandbox Message-ID: <43424947.7090809@ucsd.edu> I would like to propose adding a sandbox subpackage to scipy.
Underneath scipy.sandbox would be other subsubpackages that are undergoing active development and whose APIs will possibly be changing rapidly or be broken at times. For actual scipy releases, the build of the sandbox would be turned off although the full source code should probably be included in the source distribution. The setup.py scripts should be written (hopefully) modular enough such that enabling individual subsubpackages is a matter of uncommenting a few lines. The sandbox will give us a place to put experimental code so we can share amongst ourselves and collaboratively improve new modules without committing to their placement in the scipy package hierarchy. In this new namespace, we can more easily develop potential replacements for current functions (e.g. the f2py-ification of certain older FORTRAN-based modules). Given such a sandbox, I would also propose moving to the sandbox certain scipy modules that I think have succumbed to bitrot over the years, namely cow/ and ga/. Also, a few functions here and there are in a similar state (e.g. scipy.stats.anova). Oh, and xxx/; it's not faulty, but it exists for documentation not execution. So, what do you think? -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From aisaac at american.edu Tue Oct 4 05:22:05 2005 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 4 Oct 2005 05:22:05 -0400 Subject: [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <4341CB9E.4050306@ee.byu.edu> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au><4341CB9E.4050306@ee.byu.edu> Message-ID: On Mon, 03 Oct 2005, Travis Oliphant apparently wrote: > I hope people can understand that the reality of scarcity > dictates that we coordinate efforts through some > mechanism. 
The price mechanism has been the most > successful large-scale mechanism yet developed. > I am interested in feedback. If you don't buy the book > because you think I'm asking too much money, then let me > know, as Tim has done.

I found this an interesting approach to supporting the project. I plan to buy the book when it is released. Hmm, why wait? I should put my money where my mouth is. Just a moment ... ok, done. I view the book as a *complement* to other documentation that will appear and as a way to support the project. I agree with Tim that freely accessible online documentation will and must become available as well. As Chris notes, some of this can happen on the Wiki.

I also plan to ask our library to purchase the book, but I am concerned that your statement that multiple users each need their own copy might mean a library purchase is forbidden. I assume it did not mean that, and that you just meant that making multiple copies is restricted. (Our library supports electronic book check-out.) Ruling out library purchases would, I think, be a costly mistake for many reasons, which I can list if you are interested.

Finally, I agree with Tim that seven years is too long, and at that price I'd hope for a paperback copy. I think a better strategy would be two years of copy protection, with an updated edition every two years. (But then I am not writing the code!)

The basic concept is really nice, as long as it does not make it harder for you to
- fully document your code,
- smile on the free documentation that emerges, and
- keep your sunny disposition.
Cheers, Alan Isaac

From jonas at cortical.mit.edu Tue Oct 4 07:48:25 2005
From: jonas at cortical.mit.edu (Eric Jonas)
Date: Tue, 4 Oct 2005 07:48:25 -0400
Subject: [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation
In-Reply-To:
References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu>
Message-ID: <20051004114825.GX5015@convolution.mit.edu>

I wanted to echo Isaac's point:

> I also plan to ask our library to purchase the book, but
> I am concerned that your statement that multiple users each
> need their own copy might mean a library purchase is
> forbidden. I assume it did not mean that, and that you
> just meant that making multiple copies is restricted. (Our
> library supports electronic book check out.) Ruling out
> library purchases would, I think, be a costly mistake for
> many reasons, which I can list if you are interested.

I couldn't agree more, although I'm not quite sure how a license could be worded such that it would allow a library copy and prevent a lab bench copy, which I think was Travis' intent. That said, have you considered selling "site licenses" of a sort? I know my lab would pay a few hundred to get a PDF that we could just stick on our fileserver and use in perpetuity. I know that right now there's nothing -preventing- us from buying just one copy and doing that (other than pesky copyright law), but we'd like to support the project.

...Eric

From rkern at ucsd.edu Tue Oct 4 09:01:51 2005
From: rkern at ucsd.edu (Robert Kern)
Date: Tue, 04 Oct 2005 06:01:51 -0700
Subject: [SciPy-dev] Bus error when calling put() method on object arrays
Message-ID: <43427D3F.4070206@ucsd.edu>

I've tracked it down to the first line of OBJECT_setitem() in arraytypes.inc.src, but that's where my debug-fu fails me:

static int
OBJECT_setitem(PyObject *op, char *ov, PyArrayObject *ap)
{
    Py_XDECREF(*(PyObject **)ov);
    Py_INCREF(op);
    *(PyObject **)ov = op;
    return PyErr_Occurred() ? -1 : 0;
}

Changing the Py_XDECREF to Py_DECREF doesn't help. The chain of function calls that leads to this is PyArray_Put -> PyArray_ContiguousFromObject (values, not indices) -> PyArray_ContiguousFromObject -> PyArray_FromAny -> array_fromobject -> Array_FromSequence -> Assign_Array -> PySequence_SetItem -> array_ass_item -> OBJECT_setitem

--
Robert Kern
rkern at ucsd.edu

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
  -- Richard Harter

From rkern at ucsd.edu Tue Oct 4 10:17:20 2005
From: rkern at ucsd.edu (Robert Kern)
Date: Tue, 04 Oct 2005 07:17:20 -0700
Subject: [SciPy-dev] scipy.base passing tests
Message-ID: <43428EF0.7010602@ucsd.edu>

In my working copy, all of the tests for scipy.base pass. Except for the ones that don't. Before checking my stuff in, I want to get some feedback on these:

* test_shape_base.test_apply_along_axis causes a segfault due to the put() on object arrays bug I posted about earlier. Personally, I've commented out this testcase so I could run the others.

* test_type_check.test_mintypecode has a few cases using savespace. That's obsolete now, but I'm not sure if there's new functionality we should test in its place.

* Various tests in test_type_check want to divide scalar arrays by 0 to make nans and infs. This fails with a ZeroDivisionError. E.g.

======================================================================
ERROR: check_complex1 (scipy.base.type_check.test_type_check.test_isfinite)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/kern/svk-projects/scipy_core/scipy/base/tests/test_type_check.py", line 162, in check_complex1
    assert_all(isfinite(array(1+1j)/0.) == 0)
ZeroDivisionError: complex division

I'm not sure whether this is desired behavior, or whether the scalar array types should be modified to treat division by 0 like other arrays.

I also occasionally get a bus error when exiting the process. I have no idea why.
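[Editor's note: the symptoms above (a segfault in put() on object arrays, a crash at the first Py_XDECREF in OBJECT_setitem) are consistent with decref-ing uninitialized memory: freshly allocated object-array storage must be zero-filled so that Py_XDECREF sees either NULL or a valid reference, never garbage. A toy Python sketch of that invariant, using an invented ObjectArray class purely for illustration, not scipy code:]

```python
class ObjectArray:
    """Toy model of an object array.

    The C analogue of ``[None] * n`` is zero-filling freshly allocated
    object-array memory, so that the "drop the old reference" step in
    OBJECT_setitem always has something well-defined to drop.
    """

    def __init__(self, n):
        self._data = [None] * n  # slots are never uninitialized

    def put(self, indices, values):
        # Mirrors put() -> setitem: the old occupant's reference is
        # dropped, and a reference to the new value is kept.
        for i, v in zip(indices, values):
            self._data[i] = v

    def __getitem__(self, i):
        return self._data[i]


arr = ObjectArray(3)
arr.put([0, 2], ["a", "b"])
print(arr[0], arr[1], arr[2])  # a None b
```

[If the C allocation is not zeroed, the first Py_XDECREF operates on whatever bytes happened to be in the buffer, which would match both the put() segfault and the intermittent bus error at exit.]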
-- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From Fernando.Perez at colorado.edu Tue Oct 4 12:59:25 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 04 Oct 2005 10:59:25 -0600 Subject: [SciPy-dev] Proposal: scipy.sandbox In-Reply-To: <43424947.7090809@ucsd.edu> References: <43424947.7090809@ucsd.edu> Message-ID: <4342B4ED.5050908@colorado.edu> Robert Kern wrote: > I would like to propose adding a sandbox subpackage to scipy. Underneath > scipy.sandbox would be other subsubpackages that are undergoing active > development and whose APIs will possibly be changing rapidly or be > broken at times. For actual scipy releases, the build of the sandbox > would be turned off although the full source code should probably be > included in the source distribution. The setup.py scripts should be > written (hopefully) modular enough such that enabling individual > subsubpackages is a matter of uncommenting a few lines. [...] > So, what do you think? +1 Note that xxx could probably use a bit of updating to reflect the current state of scipy. But even as it is, it's a useful starter for new package authors, and sandbox is probably the right place for it. Cheers, f From pearu at scipy.org Tue Oct 4 12:50:00 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 4 Oct 2005 11:50:00 -0500 (CDT) Subject: [SciPy-dev] Proposal: scipy.sandbox In-Reply-To: <4342B4ED.5050908@colorado.edu> References: <43424947.7090809@ucsd.edu> <4342B4ED.5050908@colorado.edu> Message-ID: On Tue, 4 Oct 2005, Fernando Perez wrote: > Robert Kern wrote: >> I would like to propose adding a sandbox subpackage to scipy. Underneath >> scipy.sandbox would be other subsubpackages that are undergoing active >> development and whose APIs will possibly be changing rapidly or be >> broken at times. 
For actual scipy releases, the build of the sandbox >> would be turned off although the full source code should probably be >> included in the source distribution. The setup.py scripts should be >> written (hopefully) modular enough such that enabling individual >> subsubpackages is a matter of uncommenting a few lines. > > [...] > >> So, what do you think? > > +1

me too. In addition to cow and ga, also gui_thread, gplt, plt, and xplt(?) should be moved under sandbox.

> Note that xxx could probably use a bit of updating to reflect the current
> state of scipy. But even as it is, it's a useful starter for new package
> authors, and sandbox is probably the right place for it.

I would rename xxx to example or expackage or smth like that. And it really needs updating to current scipy.distutils. I would not recommend it as a starter; instead, one should look at the newcore setup.py files when creating a new package.

Pearu

From jorgen.stenarson at bostream.nu Tue Oct 4 13:59:14 2005
From: jorgen.stenarson at bostream.nu (=?ISO-8859-1?Q?J=F6rgen_Stenarson?=)
Date: Tue, 04 Oct 2005 19:59:14 +0200
Subject: [SciPy-dev] Build of newcore on macosX and with mingw32
Message-ID: <4342C2F2.4080703@bostream.nu>

Hi, I'm trying to build newcore on macosx using python 2.4 from darwinports. I have checked out: http://svn.scipy.org/svn/scipy_core/branches/newcore but after a while the build fails while compiling the umath module. The first line after issuing the build command says:

Not loaded: Are you running from the source directory?

Is this expected? Any ideas what the problem can be?

Best regards, Jörgen

In the newcore top directory I run:

$ python setup.py build
Not loaded: Are you running from the source directory?
Assuming default configuration (scipy/distutils/command/{setup_command,setup}.py was not found) Appending scipy.distutils.command configuration to scipy.distutils Assuming default configuration (scipy/distutils/fcompiler/{setup_fcompiler,setup}.py was not found) Appending scipy.distutils.fcompiler configuration to scipy.distutils Appending scipy.distutils configuration to scipy Assuming default configuration (/Users/jorgenstenarson/programming/python/newcore/scipy/weave/tests/{setup_tests,setup}.py was not found) Appending scipy.weave.tests configuration to scipy.weave Appending scipy.weave configuration to scipy Assuming default configuration (scipy/test/{setup_test,setup}.py was not found) Appending scipy.test configuration to scipy No module named __svn_version__ F2PY Version No module named __svn_version__ 2_? Appending scipy.f2py configuration to scipy Appending scipy.base configuration to scipy blas_opt_info: FOUND: extra_link_args = ['-Wl,-framework', '-Wl,Accelerate'] define_macros = [('NO_ATLAS_INFO', 3)] extra_compile_args = ['-faltivec', '-I/System/Library/Frameworks/vecLib.framework/Headers'] lapack_opt_info: FOUND: extra_link_args = ['-Wl,-framework', '-Wl,Accelerate'] define_macros = [('NO_ATLAS_INFO', 3)] extra_compile_args = ['-faltivec'] Appending scipy.lib configuration to scipy Assuming default configuration (scipy/fftpack/{setup_fftpack,setup}.py was not found) Appending scipy.fftpack configuration to scipy Assuming default configuration (scipy/linalg/{setup_linalg,setup}.py was not found) Appending scipy.linalg configuration to scipy Assuming default configuration (scipy/stats/{setup_stats,setup}.py was not found) Appending scipy.stats configuration to scipy Appending scipy configuration to scipy_core version 0.4.2 Creating scipy/f2py2e/__svn_version__.py (version='1166') Creating scipy/__svn_version__.py (version='1166') running build running config_fc running build_src building extension "scipy.base.multiarray" sources creating build creating 
build/src creating build/src/scipy creating build/src/scipy/base Generating build/src/scipy/base/config.h customize NAGFCompiler customize AbsoftFCompiler customize IbmFCompiler Could not locate executable g77 Could not locate executable f77 customize GnuFCompiler customize G95FCompiler customize GnuFCompiler customize NAGFCompiler customize NAGFCompiler using config gcc options: '-fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes' compile options: '-I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -Iscipy/base/src -I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -I/opt/local/include/python2.4 -c' gcc: _configtest.c _configtest.c: In function 'main': _configtest.c:50: warning: format '%d' expects type 'int', but argument 3 has type 'long unsigned int' _configtest.c:57: warning: format '%d' expects type 'int', but argument 3 has type 'long unsigned int' _configtest.c:72: warning: format '%d' expects type 'int', but argument 3 has type 'long unsigned int' gcc _configtest.o -o _configtest _configtest success! removing: _configtest.c _configtest.o _configtest gcc options: '-fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes' compile options: '-Iscipy/base/src -I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -I/opt/local/include/python2.4 -c' gcc: _configtest.c gcc _configtest.o -o _configtest _configtest success! 
removing: _configtest.c _configtest.o _configtest gcc options: '-fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes' compile options: '-Iscipy/base/src -I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -I/opt/local/include/python2.4 -c' gcc: _configtest.c _configtest.c:3: warning: function declaration isn't a prototype _configtest.c: In function 'main': _configtest.c:4: warning: statement with no effect _configtest.c:5: warning: control reaches end of non-void function gcc _configtest.o -o _configtest success! removing: _configtest.c _configtest.o _configtest gcc options: '-fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes' compile options: '-Iscipy/base/src -I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -I/opt/local/include/python2.4 -c' gcc: _configtest.c _configtest.c:3: warning: function declaration isn't a prototype _configtest.c: In function 'main': _configtest.c:4: warning: statement with no effect _configtest.c:5: warning: control reaches end of non-void function gcc _configtest.o -o _configtest success! removing: _configtest.c _configtest.o _configtest gcc options: '-fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes' compile options: '-Iscipy/base/src -I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -I/opt/local/include/python2.4 -c' gcc: _configtest.c _configtest.c:3: warning: function declaration isn't a prototype _configtest.c: In function 'main': _configtest.c:4: warning: statement with no effect _configtest.c:5: warning: control reaches end of non-void function gcc _configtest.o -o _configtest success! 
removing: _configtest.c _configtest.o _configtest gcc options: '-fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes' compile options: '-Iscipy/base/src -I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -I/opt/local/include/python2.4 -c' gcc: _configtest.c _configtest.c:3: warning: function declaration isn't a prototype _configtest.c: In function 'main': _configtest.c:4: warning: statement with no effect _configtest.c:5: warning: control reaches end of non-void function gcc _configtest.o -o _configtest success! removing: _configtest.c _configtest.o _configtest gcc options: '-fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes' compile options: '-Iscipy/base/src -I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -I/opt/local/include/python2.4 -c' gcc: _configtest.c _configtest.c:3: warning: function declaration isn't a prototype _configtest.c: In function 'main': _configtest.c:4: error: 'isnan' undeclared (first use in this function) _configtest.c:4: error: (Each undeclared identifier is reported only once _configtest.c:4: error: for each function it appears in.) _configtest.c:3: warning: function declaration isn't a prototype _configtest.c: In function 'main': _configtest.c:4: error: 'isnan' undeclared (first use in this function) _configtest.c:4: error: (Each undeclared identifier is reported only once _configtest.c:4: error: for each function it appears in.) failure. removing: _configtest.c _configtest.o adding 'build/src/scipy/base/config.h' to sources. executing /Users/jorgenstenarson/programming/python/newcore/scipy/base/code_generators/generate_array_api.py adding 'build/src/scipy/base/__multiarray_api.h' to sources. 
creating build/src/scipy/base/src conv_template:> build/src/scipy/base/src/scalartypes.inc adding 'build/src/scipy/base/src' to include_dirs. conv_template:> build/src/scipy/base/src/arraytypes.inc scipy.base - nothing done with h_files= ['build/src/scipy/base/src/scalartypes.inc', 'build/src/scipy/base/src/arraytypes.inc', 'build/src/scipy/base/config.h', 'build/src/scipy/base/__multiarray_api.h'] building extension "scipy.base.umath" sources adding 'build/src/scipy/base/config.h' to sources. executing /Users/jorgenstenarson/programming/python/newcore/scipy/base/code_generators/generate_ufunc_api.py adding 'build/src/scipy/base/__ufunc_api.h' to sources. conv_template:> build/src/scipy/base/src/umathmodule.c adding 'build/src/scipy/base/src' to include_dirs. scipy.base - nothing done with h_files= ['build/src/scipy/base/src/scalartypes.inc', 'build/src/scipy/base/src/arraytypes.inc', 'build/src/scipy/base/config.h', 'build/src/scipy/base/__ufunc_api.h'] building extension "scipy.base._compiled_base" sources adding 'build/src/scipy/base/config.h' to sources. adding 'build/src/scipy/base/__multiarray_api.h' to sources. scipy.base - nothing done with h_files= ['build/src/scipy/base/config.h', 'build/src/scipy/base/__multiarray_api.h'] building extension "scipy.lib._dotblas" sources creating build/src/scipy/lib adding 'scipy/corelib/blasdot/_dotblas.c' to sources. building extension "scipy.lib.fftpack_lite" sources building extension "scipy.lib.mtrand" sources building extension "scipy.lib.lapack_lite" sources adding 'scipy/corelib/lapack_lite/lapack_litemodule.c' to sources. 
running build_py creating build/lib.darwin-8.2.0-Power_Macintosh-2.4 creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy copying scipy/__init__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy copying scipy/__svn_version__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy copying scipy/core_version.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy copying scipy/setup.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/__init__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/__version__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/ccompiler.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/conv_template.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/core.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/cpuinfo.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/exec_command.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/extension.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/from_template.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/line_endings.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/log.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/misc_util.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/setup.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/system_info.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils copying scipy/distutils/unixccompiler.py -> 
build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/__init__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/bdist_rpm.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/build.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/build_clib.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/build_ext.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/build_py.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/build_scripts.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/build_src.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/config.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/config_compiler.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/install.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/install_data.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/install_headers.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command copying scipy/distutils/command/sdist.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/command creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/__init__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/absoft.py -> 
build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/compaq.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/g95.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/gnu.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/hpux.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/ibm.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/intel.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/lahey.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/mips.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/nag.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/pg.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/sun.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler copying scipy/distutils/fcompiler/vast.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/distutils/fcompiler creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/__init__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/accelerate_tools.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/ast_tools.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying 
/Users/jorgenstenarson/programming/python/newcore/scipy/weave/base_info.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/base_spec.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/blitz_spec.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/blitz_tools.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/build_tools.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/bytecodecompiler.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/c_spec.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/catalog.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/common_info.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/converters.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/cpp_namespace_spec.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/dumb_shelve.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/dumbdbm_patched.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/ext_tools.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave 
copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/info_weave.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/inline_tools.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/platform_info.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/setup.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/size_check.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/slice_handler.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/standard_array_spec.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/swig2_spec.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/swigptr.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/swigptr2.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/vtk_spec.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/weave_version.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave copying /Users/jorgenstenarson/programming/python/newcore/scipy/weave/wx_spec.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/weave creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/test copying scipy/test/__init__.py -> 
build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/test
copying scipy/test/auto_test.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/test
copying scipy/test/info_scipy_test.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/test
copying scipy/test/logging.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/test
copying scipy/test/scipy_test_version.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/test
copying scipy/test/setup_scipy_test.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/test
copying scipy/test/testing.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/test
creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/__init__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/__svn_version__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/__version__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/auxfuncs.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/capi_maps.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/cb_rules.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/cfuncs.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/common_rules.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/crackfortran.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/diagnose.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/f2py2e.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/f2py_testing.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/f90mod_rules.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/func2subr.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/rules.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/setup.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
copying scipy/f2py2e/use_rules.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/f2py
creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/__init__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/_internal.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/arrayprint.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/convertcode.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/function_base.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/getlimits.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/index_tricks.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/info_scipy_base.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/ma.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/machar.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/matrix.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/numeric.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/numerictypes.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/oldnumeric.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/polynomial.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/records.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/scimath.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/setup.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/shape_base.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/twodim_base.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/type_check.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/ufunclike.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
copying scipy/base/code_generators/generate_array_api.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/lib
copying scipy/corelib/__init__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/lib
copying scipy/corelib/setup.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/lib
creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/fftpack
copying scipy/fftpack/__init__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/fftpack
copying scipy/fftpack/fft_lite.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/fftpack
creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/linalg
copying scipy/linalg/__init__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/linalg
copying scipy/linalg/basic_lite.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/linalg
creating build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/stats
copying scipy/stats/__init__.py -> build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/stats
running build_ext
customize UnixCCompiler
customize UnixCCompiler using build_ext
building 'scipy.base.multiarray' extension
compiling C sources
gcc options: '-fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes'
creating build/temp.darwin-8.2.0-Power_Macintosh-2.4
creating build/temp.darwin-8.2.0-Power_Macintosh-2.4/scipy
creating build/temp.darwin-8.2.0-Power_Macintosh-2.4/scipy/base
creating build/temp.darwin-8.2.0-Power_Macintosh-2.4/scipy/base/src
compile options: '-Ibuild/src/scipy/base/src -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src
-I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -I/opt/local/include/python2.4 -c'
gcc: scipy/base/src/multiarraymodule.c
scipy/base/include/scipy/__multiarray_api.h:131: warning: 'PyArray_GetBuffer' declared 'static' but never defined
scipy/base/include/scipy/__multiarray_api.h:223: warning: 'PyArray_Sign' declared 'static' but never defined
scipy/base/include/scipy/__multiarray_api.h:225: warning: 'PyArray_Round' declared 'static' but never defined
gcc -bundle -undefined dynamic_lookup build/temp.darwin-8.2.0-Power_Macintosh-2.4/scipy/base/src/multiarraymodule.o -o build/lib.darwin-8.2.0-Power_Macintosh-2.4/scipy/base/multiarray.so
building 'scipy.base.umath' extension
compiling C sources
gcc options: '-fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes'
creating build/temp.darwin-8.2.0-Power_Macintosh-2.4/build
creating build/temp.darwin-8.2.0-Power_Macintosh-2.4/build/src
creating build/temp.darwin-8.2.0-Power_Macintosh-2.4/build/src/scipy
creating build/temp.darwin-8.2.0-Power_Macintosh-2.4/build/src/scipy/base
creating build/temp.darwin-8.2.0-Power_Macintosh-2.4/build/src/scipy/base/src
compile options: '-Ibuild/src/scipy/base/src -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -I/opt/local/include/python2.4 -c'
gcc: build/src/scipy/base/src/umathmodule.c
build/src/scipy/base/src/umathmodule.c:56: error: static declaration of 'acoshf' follows non-static declaration
build/src/scipy/base/src/umathmodule.c:61: error: static declaration of 'asinhf' follows non-static declaration
build/src/scipy/base/src/umathmodule.c:66: error: static declaration of 'atanhf' follows non-static declaration
build/src/scipy/base/src/umathmodule.c:56: error: static declaration of 'acoshf' follows non-static declaration
build/src/scipy/base/src/umathmodule.c:61: error: static declaration of 'asinhf' follows non-static declaration
build/src/scipy/base/src/umathmodule.c:66: error: static declaration of 'atanhf' follows non-static declaration
removed scipy/__svn_version__.py
removed scipy/f2py2e/__svn_version__.py
error: Command "gcc -fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -Ibuild/src/scipy/base/src -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -I/opt/local/include/python2.4 -c build/src/scipy/base/src/umathmodule.c -o build/temp.darwin-8.2.0-Power_Macintosh-2.4/build/src/scipy/base/src/umathmodule.o" failed with exit status 1
From pearu at scipy.org Tue Oct 4 13:25:39 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 4 Oct 2005 12:25:39 -0500 (CDT) Subject: [SciPy-dev] Build of newcore on macosX and with mingw32 In-Reply-To: <4342C2F2.4080703@bostream.nu> References: <4342C2F2.4080703@bostream.nu> Message-ID: On Tue, 4 Oct 2005, Jörgen Stenarson wrote: > Hi, > > I'm trying to build newcore on macosx using python 2.4 from darwinports. > > I have checked out: http://svn.scipy.org/svn/scipy_core/branches/newcore > > but after a while the build fails while compiling the umath module. > > The first line after issuing the build command says: > Not loaded: Are you running from the source directory? > is this expected? Yes. Because you're running from the source directory:) > any ideas what the problem can be? ...
> gcc: build/src/scipy/base/src/umathmodule.c
> build/src/scipy/base/src/umathmodule.c:56: error: static declaration of 'acoshf' follows non-static declaration
> build/src/scipy/base/src/umathmodule.c:61: error: static declaration of 'asinhf' follows non-static declaration
> build/src/scipy/base/src/umathmodule.c:66: error: static declaration of 'atanhf' follows non-static declaration
> build/src/scipy/base/src/umathmodule.c:56: error: static declaration of 'acoshf' follows non-static declaration
> build/src/scipy/base/src/umathmodule.c:61: error: static declaration of 'asinhf' follows non-static declaration
> build/src/scipy/base/src/umathmodule.c:66: error: static declaration of 'atanhf' follows non-static declaration

Could you send the contents of build/src/scipy/base/config.h? What happens if you add

#define HAVE_INVERSE_HYPERBOLIC_FLOAT

to build/src/scipy/base/config.h? The code probably builds then. Then it's not clear to me why

kws_args = {'libraries':libs,'decl':0,'headers':['math.h']}
config_cmd.check_func('atanhf', **kws_args)

fails in your platform when atanhf seems to be available. Pearu From Fernando.Perez at colorado.edu Tue Oct 4 14:34:18 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 04 Oct 2005 12:34:18 -0600 Subject: [SciPy-dev] Proposal: scipy.sandbox In-Reply-To: References: <43424947.7090809@ucsd.edu> <4342B4ED.5050908@colorado.edu> Message-ID: <4342CB2A.4080804@colorado.edu> Pearu Peterson wrote: > On Tue, 4 Oct 2005, Fernando Perez wrote: >> Note that xxx could probably use a bit of updating to reflect the current state of scipy. But even as it is, it's a useful starter for new package authors, and sandbox is probably the right place for it. > I would rename xxx to example or expackage or smth like that. And it really needs updated to current scipy.distutils. I would not recommend it for starter, instead, one should look at newcore setup.py files when creating a new package.
+1 on the renaming. My point was that the basic idea of a subdirectory that new users can copy wholesale to get started when writing a new package is a good one. It lowers the barrier for new contributors, as they can just copy that and start renaming and deleting what they don't need, and be up and running quickly. It is harder (for a newcomer) to try to extract from the full tree the relevant bits, since they won't know what's critical and what isn't. That's why I see value in a standalone minimal example package, perhaps I wasn't clear enough originally. I suspected it was out of date, esp. with respect to the advances in scipy.distutils, but I think with a bit of cleanup/updating it can be a useful piece, especially as we try to make the whole of scipy more friendly towards modularization efforts. Cheers, f From pearu at scipy.org Tue Oct 4 13:41:03 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 4 Oct 2005 12:41:03 -0500 (CDT) Subject: [SciPy-dev] Proposal: scipy.sandbox In-Reply-To: <4342CB2A.4080804@colorado.edu> References: <43424947.7090809@ucsd.edu> <4342B4ED.5050908@colorado.edu> <4342CB2A.4080804@colorado.edu> Message-ID: On Tue, 4 Oct 2005, Fernando Perez wrote: > Pearu Peterson wrote: >> >> On Tue, 4 Oct 2005, Fernando Perez wrote: > >>> Note that xxx could probably use a bit of updating to reflect the current >>> state of scipy. But even as it is, it's a useful starter for new package >>> authors, and sandbox is probably the right place for it. >> >> >> I would rename xxx to example or expackage or smth like that. >> And it really needs updated to current scipy.distutils. I would not >> recommend it for starter, instead, one should look at newcore setup.py >> files when creating a new package. > > +1 on the renaming. My point was that the basic idea of a subdirectory that > new users can copy wholesale to get started when writing a new package is a > good one. 
It lowers the barrier for new contributors, as they can just copy > that and start renaming and deleting what they don't need, and be up and > running quickly. It is harder (for a newcomer) to try to extract from the > full tree the relevant bits, since they won't know what's critical and what > isn't. That's why I see value in a standalone minimal example package, > perhaps I wasn't clear enough originally. It was clear enough for me. I just wanted to make sure that new contributors (many people have offered help recently) would not look into xxx as it is now. > I suspected it was out of date, esp. with respect to the advances in > scipy.distutils, but I think with a bit of cleanup/updating it can be a > useful piece, especially as we try to make the whole of scipy more friendly > towards modularization efforts. Sure. Pearu From Fernando.Perez at colorado.edu Tue Oct 4 15:08:23 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 04 Oct 2005 13:08:23 -0600 Subject: [SciPy-dev] Proposal: scipy.sandbox In-Reply-To: References: <43424947.7090809@ucsd.edu> <4342B4ED.5050908@colorado.edu> <4342CB2A.4080804@colorado.edu> Message-ID: <4342D327.6030100@colorado.edu> Pearu Peterson wrote: > It was clear enough for me. I just wanted to make sure that new > contributors (many people have offered help recently) would not look into > xxx as it is now. One last thing: if renamed, let's use something that's not a normal word (not 'example', for one). This package will be run through a search&replace, and it's best if the name is easy to change without hitting false positives. expackage, scipyexpack, whatever, but not any common word. 
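[Editor's sketch] The copy-then-rename workflow Fernando describes can be written down in a couple of shell commands. The names `expackage` and `mypack` are placeholders invented for this illustration (not packages in the scipy tree), the dummy file only exists to make the sketch self-contained, and GNU `sed -i` is assumed:

```shell
# Hypothetical sketch: copy a minimal example package wholesale, then
# search-and-replace its (deliberately uncommon) name throughout the tree.
mkdir -p expackage
printf 'import expackage.core\n' > expackage/__init__.py  # stand-in content
cp -r expackage mypack
# rewrite every occurrence of the old name inside the copied tree
grep -rl expackage mypack | xargs sed -i 's/expackage/mypack/g'
grep -r mypack mypack
```

An uncommon name like `expackage` is exactly what makes the `sed` line safe: a common word such as `example` would hit false positives in docstrings and comments.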
Cheers, f From jorgen.stenarson at bostream.nu Tue Oct 4 16:01:43 2005 From: jorgen.stenarson at bostream.nu (Jörgen Stenarson) Date: Tue, 04 Oct 2005 22:01:43 +0200 Subject: [SciPy-dev] Build of newcore on macosX and with mingw32 In-Reply-To: References: <4342C2F2.4080703@bostream.nu> Message-ID: <4342DFA7.6060100@bostream.nu> Hi, thanks for looking into this. Pearu Peterson wrote: ...
>> gcc: build/src/scipy/base/src/umathmodule.c
>> build/src/scipy/base/src/umathmodule.c:56: error: static declaration of 'acoshf' follows non-static declaration
>> build/src/scipy/base/src/umathmodule.c:61: error: static declaration of 'asinhf' follows non-static declaration
>> build/src/scipy/base/src/umathmodule.c:66: error: static declaration of 'atanhf' follows non-static declaration
>> build/src/scipy/base/src/umathmodule.c:56: error: static declaration of 'acoshf' follows non-static declaration
>> build/src/scipy/base/src/umathmodule.c:61: error: static declaration of 'asinhf' follows non-static declaration
>> build/src/scipy/base/src/umathmodule.c:66: error: static declaration of 'atanhf' follows non-static declaration
> Could you send the contents of build/src/scipy/base/config.h?

/* #define SIZEOF_SHORT 2 */
/* #define SIZEOF_INT 4 */
/* #define SIZEOF_LONG 4 */
/* #define SIZEOF_FLOAT 4 */
/* #define SIZEOF_DOUBLE 8 */
#define SIZEOF_LONG_DOUBLE 16
#define SIZEOF_PY_INTPTR_T 4
/* #define SIZEOF_LONG_LONG 8 */
#define SIZEOF_PY_LONG_LONG 8
/* #define CHAR_BIT 8 */
#define MATHLIB
#define HAVE_LONGDOUBLE_FUNCS
#define HAVE_FLOAT_FUNCS
#define HAVE_INVERSE_HYPERBOLIC
#define HAVE_INVERSE_HYPERBOLIC_FLOAT

> What happens if you add
>
> #define HAVE_INVERSE_HYPERBOLIC_FLOAT
>
as you can see above it is already there.
> to build/src/scipy/base/config.h? The code probably builds then.
> Then it's not clear to me why
>
> kws_args = {'libraries':libs,'decl':0,'headers':['math.h']}
> config_cmd.check_func('atanhf', **kws_args)
>
> fails in your platform when atanhf seems to be available.
>
> Pearu

... do you have any other ideas? /Jörgen From pearu at scipy.org Tue Oct 4 15:08:21 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 4 Oct 2005 14:08:21 -0500 (CDT) Subject: [SciPy-dev] Build of newcore on macosX and with mingw32 In-Reply-To: <4342DFA7.6060100@bostream.nu> References: <4342C2F2.4080703@bostream.nu> <4342DFA7.6060100@bostream.nu> Message-ID: On Tue, 4 Oct 2005, Jörgen Stenarson wrote:
>>> gcc: build/src/scipy/base/src/umathmodule.c
>>> build/src/scipy/base/src/umathmodule.c:56: error: static declaration of 'acoshf' follows non-static declaration
>>> build/src/scipy/base/src/umathmodule.c:61: error: static declaration of 'asinhf' follows non-static declaration
>>> build/src/scipy/base/src/umathmodule.c:66: error: static declaration of 'atanhf' follows non-static declaration
>>> build/src/scipy/base/src/umathmodule.c:56: error: static declaration of 'acoshf' follows non-static declaration
>>> build/src/scipy/base/src/umathmodule.c:61: error: static declaration of 'asinhf' follows non-static declaration
>>> build/src/scipy/base/src/umathmodule.c:66: error: static declaration of 'atanhf' follows non-static declaration
>> Could you send the contents of build/src/scipy/base/config.h?
> /* #define SIZEOF_SHORT 2 */
> /* #define SIZEOF_INT 4 */
> /* #define SIZEOF_LONG 4 */
> /* #define SIZEOF_FLOAT 4 */
> /* #define SIZEOF_DOUBLE 8 */
> #define SIZEOF_LONG_DOUBLE 16
> #define SIZEOF_PY_INTPTR_T 4
> /* #define SIZEOF_LONG_LONG 8 */
> #define SIZEOF_PY_LONG_LONG 8
> /* #define CHAR_BIT 8 */
> #define MATHLIB
> #define HAVE_LONGDOUBLE_FUNCS
> #define HAVE_FLOAT_FUNCS
> #define HAVE_INVERSE_HYPERBOLIC
> #define HAVE_INVERSE_HYPERBOLIC_FLOAT
>
>> What happens if you add
>>
>> #define HAVE_INVERSE_HYPERBOLIC_FLOAT
>>
> as you can see above it is already there.

So, the code that gives errors should not be available to the compiler.

> do you have any other ideas?

Hmm, maybe the building process picks config.h up from some other place. Introduce some syntax error to build/src/scipy/base/config.h in order to verify this. Try to remove old scipy.core from your system, that includes also header files in include/python2.4/scipy directory, then do `rm -rf build` and try to rebuild. Pearu From oliphant at ee.byu.edu Tue Oct 4 18:23:55 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 04 Oct 2005 16:23:55 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> <4341CB9E.4050306@ee.byu.edu> Message-ID: <434300FB.7070606@ee.byu.edu> Alan G Isaac wrote: >I also plan to ask our library to purchase the book, but >I am concerned that your statement that multiple users each >need their own copy might mean a library purchase is >forbidden. I assume it did not mean that, and that you >just meant that making multiple copies is restricted. (Our >library supports electronic book check out.) Ruling out >library purchases would, I think, be a costly mistake for >many reasons, which I can list if you are interested. > > A library purchase is fine. If that's how a single copy is shared.
I'll make that more clear. But, really, if multiple users need to use it at the same time, then the library should purchase several copies. >Finally, I agree with Tim that seven years is too long and >at the price I'd hope for a paperback copy. I think >a better strategy would be two years copy protection, with >an updated edition every two years. (But then I am not >writing the code!) Thanks for the feedback. I'm experimenting with the right combination of total price and total time so feedback is welcomed. I want to encourage people who can really afford the book to spend the money on it. What is the "right" time/$$ combination that will encourage this? I'm willing to shorten the time and come down on the total price. They are set so I cannot increase them. But, there is no problem with lowering them. I could also support the idea of a cheaper total price 1st edition with a need to spend for the 2nd edition again. Thanks for the feedback. >The basic concept is really nice, as >long as it does not make it harder for you to >- fully document your code, >- smile on the free documentation that emerges, and >- keep your sunny disposition. > > Don't worry, I'm not banking my future on this little experiment, so I won't worry about what other people do. In fact, as John Hunter inferred, I would be thrilled by more contributions however they come. I just want to see more people use Python for scientific computing, and am trying some things out to help it along. My only question about writing "free" documentation is that it just seems rather wasteful to spend time writing free documentation when you can set free the documentation by spending a little money instead. If you think I'm charging too much (either in $$ or time-delay), then continue to give me feedback. I am interested in what works.
-Travis From rkern at ucsd.edu Tue Oct 4 18:29:32 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 04 Oct 2005 15:29:32 -0700 Subject: [SciPy-dev] Proposal: scipy.sandbox In-Reply-To: References: <43424947.7090809@ucsd.edu> <4342B4ED.5050908@colorado.edu> Message-ID: <4343024C.9090006@ucsd.edu> Pearu Peterson wrote: > me too. In addition to cow and ga, also gui_thread, gplt, plt, xplt(?) > should be moved under sandbox. IMO, gui_thread, gplt, and plt should simply be retired. They're not just succumbing to bitrot; they don't really have a reason to exist anymore. xplt still has some fans, though, and if they are going to step up to support it, then scipy.sandbox is a good a place as any until it finds a separate home. I *don't* want it to migrate back alongside linalg, special, etc. like I hope cow and ga will. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From aisaac at american.edu Tue Oct 4 18:41:32 2005 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 4 Oct 2005 18:41:32 -0400 Subject: [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <434300FB.7070606@ee.byu.edu> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au><4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> Message-ID: On Tue, 04 Oct 2005, Travis Oliphant apparently wrote: > A library purchase is fine. If that how a single copy is shared. I'll > make that more clear. But, really, if multiple users need to use it at > the same time, then the library should purchase several copies. Our library supports single copy checkout. I think this currently is a pretty standard library function these days. 
> My only question about writing "free" documentation is that it just > seems rather wasteful to spend time writing free documentation when you > can set free the documentation by spending a little money instead. I've done my part. ;-) Cheers, Alan Isaac From rkern at ucsd.edu Tue Oct 4 18:47:48 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 04 Oct 2005 15:47:48 -0700 Subject: [SciPy-dev] Build of newcore on macosX and with mingw32 In-Reply-To: <4342C2F2.4080703@bostream.nu> References: <4342C2F2.4080703@bostream.nu> Message-ID: <43430694.4000505@ucsd.edu> Jörgen Stenarson wrote: > Hi, > > I'm trying to build newcore on macosx using python 2.4 from darwinports. > > I have checked out: http://svn.scipy.org/svn/scipy_core/branches/newcore > > but after a while the build fails while compiling the umath module. > > The first line after issuing the build command says: > Not loaded: Are you running from the source directory? > is this expected? As Pearu noted, yes this is expected. I would like to take this opportunity, though, to request that we reduce the number of such warnings. I believe they tend to make newbies nervous more than they inform. Another example would be the warnings about not finding unit test files for tons of modules when running scipy.test().
> gcc options: '-fno-strict-aliasing -Wno-long-double -no-cpp-precomp -mno-fused-madd -fno-common -fno-common -dynamic -DNDEBUG -g -O3 -Wall -Wstrict-prototypes'
> compile options: '-Iscipy/base/src -I/opt/local/Library/Frameworks/Python.framework/Versions/2.4/include/python2.4 -I/opt/local/include/python2.4 -c'
> gcc: _configtest.c
> _configtest.c:3: warning: function declaration isn't a prototype
> _configtest.c: In function 'main':
> _configtest.c:4: error: 'isnan' undeclared (first use in this function)
> _configtest.c:4: error: (Each undeclared identifier is reported only once
> _configtest.c:3: warning: function declaration isn't a prototype
> _configtest.c: In function 'main':
> _configtest.c:4: error: 'isnan' undeclared (first use in this function)
> _configtest.c:4: error: (Each undeclared identifier is reported only once
> _configtest.c:4: error: for each function it appears in.)
> failure.
> removing: _configtest.c _configtest.o

That shouldn't happen. isnan() is a macro in math.h on Tiger. If you're picking up the right math.h (/usr/include/math.h, which includes /usr/include/architecture/ppc/math.h for the actual implementation), you shouldn't be running into this problem. What compiler are you using? -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From Fernando.Perez at colorado.edu Tue Oct 4 18:53:00 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 04 Oct 2005 16:53:00 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <434300FB.7070606@ee.byu.edu> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> <4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> Message-ID: <434307CC.7020303@colorado.edu> Travis Oliphant wrote: > Alan G Isaac wrote: >>I also plan to ask our library to purchase the book, but >>I am concerned that your statement that multiple users each >>need their own copy might mean a library purchase is >>forbidden. I assume it did not mean that, and that you >>just meant that making multiple copies is restricted. (Our >>library supports electronic book check out.) Ruling out >>library purchases would, I think, be a costly mistake for >>many reasons, which I can list if you are interested. > > A library purchase is fine. If that's how a single copy is shared. I'll > make that more clear.
But, really, if multiple users need to use it at > the same time, then the library should purchase several copies. Travis, I think that what confused some people (and which I believe was not your original intent) was the impression that you meant to have terms on the printed version of the book which were more restrictive than those of traditional paper books. With a single physical copy of a paper book, the rules are pretty simple and constrained by the laws of nature (one non-quantum object can't really be in more than one place at the same time). Lending, borrowing, library use, 'lab bench' use, etc, are all accepted practices because while one person is using the book, nobody else has access to it. Since your book is originally provided electronically, there is the technical possibility to make multiple physical copies, which I believe is what you wish to prevent (and something I'm not arguing with). So perhaps a clarification along the lines of the following could help (this is my wording, of course, so you should say what _you_ want, not what I get from trying to read your mind :) 'a single printed copy can be made from the electronic version, which is subject to the same restrictions imposed on paper books (lending is OK but not wholesale photocopying for redistribution, for example)' This would put at ease a lot of people who normally buy a book in a lab or research group with the natural assumption that anyone in that lab can go to the shelf and read it. Obviously if it's a book with very frequent use, traditional book purchasers buy multiple copies. With your book, the exact same thing would be expected: just because they have the PDF doesn't mean they can print 10 copies of it for the whole lab.
From oliphant at ee.byu.edu Tue Oct 4 19:08:18 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 04 Oct 2005 17:08:18 -0600 Subject: [SciPy-user] Re: [SciPy-dev] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <434307CC.7020303@colorado.edu> References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au><4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu> Message-ID: <43430B62.9090908@ee.byu.edu> Fernando Perez wrote: > Travis Oliphant wrote: > >> Alan G Isaac wrote: >> >> >>> I also plan to ask our library to purchase the book, but I am >>> concerned that your statement that multiple users each need their >>> own copy might mean a library purchase is forbidden. I assume it >>> did not mean that, and that you just meant that making multiple >>> copies is restricted. (Our library supports electronic book check >>> out.) Ruling out library purchases would, I think, be a costly >>> mistake for many reasons, which I can list if you are interested. >>> >> >> A library purchase is fine. If that how a single copy is shared. >> I'll make that more clear. But, really, if multiple users need to >> use it at the same time, then the library should purchase several >> copies. > > > 'a single printed copy can be made from the electronic version, which > is subject to the same restrictions imposed on paper books (lending is > OK but not wholesale photocopying for redistribution, for example)' > > This would put at ease a lot of people who normally buy a book in a > lab or research group with the natural assumption that anyone in that > lab can go to the shelf and read it. Obviously if it's a book with > very frequent use, traditional book purchasers buy multiple copies. > With your book, the exact same thing would be expected: just because > they have the PDF doesn't mean they can print 10 copies of it for the > whole lab. 
Or at least that's my understanding. > Thanks, I like this wording. It is exactly what I meant. I also think that, except for a library-checkout system (where only one digital copy is in circulation), somebody should not be able to buy an e-copy and then make electronic copies for everybody in their organization. That's really quite counterproductive. If you want to share your e-copy with someone for a while (or give it away) fine... I'm really just asking that you treat the e-copy something like a physical book. I'll make more details concerning the intent available. Thanks for everybody's help. -Travis From oliphant at ee.byu.edu Wed Oct 5 01:02:53 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 04 Oct 2005 23:02:53 -0600 Subject: [SciPy-dev] scipy.base passing tests In-Reply-To: <43428EF0.7010602@ucsd.edu> References: <43428EF0.7010602@ucsd.edu> Message-ID: <43435E7D.9090204@ee.byu.edu> Robert Kern wrote: >In my working copy, all of the tests for scipy.base pass. Except for the >ones that don't. Before checking my stuff in, I want to get some >feedback on these: > >* test_shape_base.test_apply_along_axis causes a segfault due to the >put() on object arrays bug I posted about earlier. Personally, I've >commented out this testcase so I could run the others. > > I'll look into this one. >* test_type_check.test_mintypecode has a few cases using savespace. >That's obsolete now, but I'm not sure if there's new functionality we >should test in its place. > >* Various tests in test_type_check want to divide scalar arrays by 0 to >make nans and infs. This fails with a ZeroDivisionError. E.g.
>======================================================================
>ERROR: check_complex1 (scipy.base.type_check.test_type_check.test_isfinite)
>----------------------------------------------------------------------
>Traceback (most recent call last):
>  File "/Users/kern/svk-projects/scipy_core/scipy/base/tests/test_type_check.py", line 162, in check_complex1
>    assert_all(isfinite(array(1+1j)/0.) == 0)
>ZeroDivisionError: complex division
>
>I'm not sure whether this is desired behavior, or the scalar array types
>should be modified to treat division by 0 like other arrays.

No, I don't think it's desired. I think the right solution is to check in code for as_number methods of the array scalars that are specific to the different types of arrays (integers, float, complex, string, unicode, etc...). Right now, they all inherit from the same gentype_as_number methods, which do different things depending on what is passed in. Right now, if a mixed array / normal Python object operation is done, I think the normal Python object gets to try its hand. Thus, the division by zero error. Now, we could try to fix the generic code to be exactly what we want. But, this seems unwieldy. We already have different as_number tables we could play with (all the array scalars don't need to point to one function table, which they do now). We have the opportunity to adjust the as_number tables to behave as we want, so we really should just do that. This will also speed up array scalar operations, because right now, they all go through 0 dimensional arrays for their math, which is slower.
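[Editor's sketch] The behavior Travis is arguing for — scalar (0-d) arrays handling division by zero through the ufunc machinery, yielding inf/nan just like ordinary arrays, instead of raising Python's ZeroDivisionError — can be illustrated with a short sketch. Present-day NumPy is used here as a stand-in for the 2005 scipy.base code, so the module name and the `errstate` API are assumptions relative to this thread:

```python
import numpy as np

# Dividing by zero goes through the ufunc machinery and produces
# inf / -inf / nan rather than raising ZeroDivisionError -- and the
# 0-d "scalar" array case behaves the same as the ordinary array case.
# (Modern NumPy stands in for the 2005 scipy.base discussed above.)
with np.errstate(divide="ignore", invalid="ignore"):
    arr = np.array([1.0, -1.0, 0.0]) / 0.0  # -> inf, -inf, nan
    scalar = np.array(1.0) / 0.0            # 0-d case: also inf

print(arr)
print(scalar)
```

The `errstate` context only silences the floating-point warnings; the point is that both divisions complete and agree, which is what unifying the as_number tables with the array behavior would guarantee.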
-Travis From oliphant at ee.byu.edu Wed Oct 5 03:01:43 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 05 Oct 2005 01:01:43 -0600 Subject: [SciPy-dev] Bus error when calling put() method on object arrays In-Reply-To: <43427D3F.4070206@ucsd.edu> References: <43427D3F.4070206@ucsd.edu> Message-ID: <43437A57.80800@ee.byu.edu> Robert Kern wrote: >I've tracked it down to the first line of OBJECT_setitem() in >arraytypes.inc.src, but that's where my debug-fu fails me: > > Got it and squashed it. This was a nasty typo in PyArray_Put. It should have been self->descr->type_num in the values. Basically, while PyArray_New works with typecharacters instead of type numbers, it did not set Object arrays created that way to NULL. So, the code was trying to DECREF an uninitialized OBJECT pointer. Python was having a great time with it, I'm sure... -Travis From nwagner at mecha.uni-stuttgart.de Wed Oct 5 04:28:50 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 05 Oct 2005 10:28:50 +0200 Subject: [SciPy-dev] Results of scipy.test(1,verbosity=10) in version 0.4.2 Message-ID: <43438EC2.3050701@mecha.uni-stuttgart.de> Hi all, scipy.test(1,verbosity=10) in version 0.4.2 results in 16 errors and 1 failure.
======================================================================
ERROR: check_simple (scipy.base.shape_base.test_shape_base.test_apply_along_axis)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_shape_base.py", line 14, in check_simple
    assert_array_equal(apply_along_axis(len,0,a),len(a)*ones(shape(a)[1]))
  File "/usr/local/lib/python2.4/site-packages/scipy/base/shape_base.py", line 31, in apply_along_axis
    res = func1d(arr[tuple(i)],*args)
IndexError: each subindex must be either a slice, an integer, Ellipsis, or newaxis
======================================================================
ERROR: check_complex1 (scipy.base.type_check.test_type_check.test_isfinite)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 162, in check_complex1
    assert_all(isfinite(array(1+1j)/0.) == 0)
ZeroDivisionError: complex division
======================================================================
ERROR: check_neginf_scalar (scipy.base.type_check.test_type_check.test_isinf)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 176, in check_neginf_scalar
    assert_all(isinf(array(-1.)/0.) == 1)
ZeroDivisionError: float division
======================================================================
ERROR: check_posinf_scalar (scipy.base.type_check.test_type_check.test_isinf)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 172, in check_posinf_scalar
    assert_all(isinf(array(1.,)/0.) == 1)
ZeroDivisionError: float division
======================================================================
ERROR: check_complex1 (scipy.base.type_check.test_type_check.test_isnan)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 142, in check_complex1
    assert_all(isnan(array(0+0j)/0.) == 1)
ZeroDivisionError: complex division
======================================================================
ERROR: check_generic (scipy.base.type_check.test_type_check.test_isneginf)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 192, in check_generic
    vals = isneginf(array((-1.,0,1))/0.)
  File "/usr/local/lib/python2.4/site-packages/scipy/base/ufunclike.py", line 40, in isneginf
    umath.logical_and(isinf(x), signbit(x), y)
NameError: global name 'umath' is not defined
======================================================================
ERROR: check_generic (scipy.base.type_check.test_type_check.test_isposinf)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 185, in check_generic
    vals = isposinf(array((-1.,0,1))/0.)
  File "/usr/local/lib/python2.4/site-packages/scipy/base/ufunclike.py", line 34, in isposinf
    umath.logical_and(isinf(x), ~signbit(x), y)
NameError: global name 'umath' is not defined
======================================================================
ERROR: check_default_1 (scipy.base.type_check.test_type_check.test_mintypecode)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 18, in check_default_1
    assert_equal(mintypecode(itype),'d')
  File "/usr/local/lib/python2.4/site-packages/scipy/base/type_check.py", line 32, in mintypecode
    typecodes = [(type(t) is type('') and t) or asarray(t).dtypechar\
UnboundLocalError: local variable 'typecodes' referenced before assignment
======================================================================
ERROR: check_default_2 (scipy.base.type_check.test_type_check.test_mintypecode)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 26, in check_default_2
    assert_equal(mintypecode(itype+'f'),'f')
  File "/usr/local/lib/python2.4/site-packages/scipy/base/type_check.py", line 32, in mintypecode
    typecodes = [(type(t) is type('') and t) or asarray(t).dtypechar\
UnboundLocalError: local variable 'typecodes' referenced before assignment
======================================================================
ERROR: check_default_3 (scipy.base.type_check.test_type_check.test_mintypecode)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 50, in check_default_3
    assert_equal(mintypecode('fdF'),'D')
  File "/usr/local/lib/python2.4/site-packages/scipy/base/type_check.py", line 32, in mintypecode
    typecodes = [(type(t) is type('') and t) or asarray(t).dtypechar\
UnboundLocalError: local variable 'typecodes' referenced before assignment
======================================================================
ERROR: check_complex_bad (scipy.base.type_check.test_type_check.test_nan_to_num)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 211, in check_complex_bad
    v += array(0+1.j)/0.
ZeroDivisionError: complex division
======================================================================
ERROR: check_complex_bad2 (scipy.base.type_check.test_type_check.test_nan_to_num)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 217, in check_complex_bad2
    v += array(-1+1.j)/0.
ZeroDivisionError: complex division
======================================================================
ERROR: check_complex_good (scipy.base.type_check.test_type_check.test_nan_to_num)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 207, in check_complex_good
    vals = nan_to_num(1+1j)
  File "/usr/local/lib/python2.4/site-packages/scipy/base/type_check.py", line 95, in nan_to_num
    y = nan_to_num(x.real) + 1j * nan_to_num(x.imag)
  File "/usr/local/lib/python2.4/site-packages/scipy/base/type_check.py", line 98, in nan_to_num
    are_inf = isposinf(y)
  File "/usr/local/lib/python2.4/site-packages/scipy/base/ufunclike.py", line 34, in isposinf
    umath.logical_and(isinf(x), ~signbit(x), y)
NameError: global name 'umath' is not defined
======================================================================
ERROR: check_generic (scipy.base.type_check.test_type_check.test_nan_to_num)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 199, in check_generic
    vals = nan_to_num(array((-1.,0,1))/0.)
  File "/usr/local/lib/python2.4/site-packages/scipy/base/type_check.py", line 98, in nan_to_num
    are_inf = isposinf(y)
  File "/usr/local/lib/python2.4/site-packages/scipy/base/ufunclike.py", line 34, in isposinf
    umath.logical_and(isinf(x), ~signbit(x), y)
NameError: global name 'umath' is not defined
======================================================================
ERROR: check_integer (scipy.base.type_check.test_type_check.test_nan_to_num)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 204, in check_integer
    vals = nan_to_num(1)
  File "/usr/local/lib/python2.4/site-packages/scipy/base/type_check.py", line 98, in nan_to_num
    are_inf = isposinf(y)
  File "/usr/local/lib/python2.4/site-packages/scipy/base/ufunclike.py", line 34, in isposinf
    umath.logical_and(isinf(x), ~signbit(x), y)
NameError: global name 'umath' is not defined
======================================================================
ERROR: check_basic (scipy.base.type_check.test_type_check.test_real_if_close)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 230, in check_basic
    b = real_if_close(a+1e-15j)
  File "/usr/local/lib/python2.4/site-packages/scipy/base/type_check.py", line 116, in real_if_close
    tol = f.epsilon * tol
AttributeError: 'finfo' object has no attribute 'epsilon'
======================================================================
FAIL: check_neginf (scipy.base.type_check.test_type_check.test_isinf)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 174, in check_neginf
    assert_all(isinf(array((-1.,))/0.) == 1)
  File "/usr/local/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 12, in assert_all
    assert(all(x)), x
AssertionError: [False]
----------------------------------------------------------------------
Ran 122 tests in 0.268s
FAILED (failures=1, errors=16)

From rkern at ucsd.edu Wed Oct 5 04:45:41 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 05 Oct 2005 01:45:41 -0700 Subject: [SciPy-dev] Results of scipy.test(1,verbosity=10) in version 0.4.2 In-Reply-To: <43438EC2.3050701@mecha.uni-stuttgart.de> References: <43438EC2.3050701@mecha.uni-stuttgart.de> Message-ID: <434392B5.3040609@ucsd.edu> Nils Wagner wrote: > Hi all, > > scipy.test(1,verbosity=10) in version 0.4.2 results in 16 errors and 1 > failure. etc. We're aware of these. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From arnd.baecker at web.de Wed Oct 5 05:10:40 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 5 Oct 2005 11:10:40 +0200 (CEST) Subject: [SciPy-dev] frange for scipy? Message-ID: Hi, I just did a quick check-out of the new scipy_core. It looks *very* nice (many thanks Travis!!) - what I like in particular is that `zeros` now takes an optional `fortran` parameter (this is to set up the array in fortran ordering in memory, right?). As quite a few of Fernando's routines from IPython.numutils seem to be incorporated, there is one which I'd love to see as well: IPython.numutils.frange(xini, xfin=None, delta=None, **kw) to generate an array of floats with a specified number of elements.
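For reference, the requested behavior can be sketched in a few lines of pure Python (a hypothetical, list-returning approximation; the real IPython.numutils.frange returns an array and also supports the xfin/delta calling conventions shown above):

```python
def frange(start, stop, npts):
    """Return npts evenly spaced floats from start to stop, inclusive."""
    if npts < 2:
        return [float(start)]
    step = (stop - start) / float(npts - 1)
    return [start + i * step for i in range(npts)]

# frange(0.0, 1.0, 5) -> [0.0, 0.25, 0.5, 0.75, 1.0]
```

As the replies below point out, linspace() in the new scipy core already covers this use case.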
Best, Arnd From pearu at scipy.org Wed Oct 5 04:28:23 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 5 Oct 2005 03:28:23 -0500 (CDT) Subject: [SciPy-dev] newscipy sandbox Message-ID: Hi, I have created sandbox in newscipy and adapted scipy setup.py file to new scipy.distutils. At the moment the sandbox contains only exmplpackage that is former xxx (and updated to scipy.distutils) and should serve as an example scipy package. So, currently installed scipy consists only of scipy.sandbox package. In order to add other packages in Lib to scipy installation, one must 1) add config.add_subpackage('packagename') line to Lib/setup.py 2) move Lib/packagename/setup_packagename.py to Lib/packagename/setup.py 3) adapt Lib/packagename/setup.py and tests files to scipy.distutils and scipy.test Pearu From rkern at ucsd.edu Wed Oct 5 05:35:58 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 05 Oct 2005 02:35:58 -0700 Subject: [SciPy-dev] frange for scipy? In-Reply-To: References: Message-ID: <43439E7E.2020001@ucsd.edu> Arnd Baecker wrote: > Hi, > > I just did a quick check-out of the new scipy_core. > It looks *very* nice (many thanks Travis!!) - > what I like in particular is that `zeros` now takes > an optional `fortran` parameter (this is to set up > the array in fortran ordering in memory, right?). > > As quite a few of Fernandos routines from IPython.numutils > seem to be incorporated, there is one which I'd love to see as well: > IPython.numutils.frange(xini, xfin=None, delta=None, **kw) > to generate an array of floats with a specified number of elements. Does linspace() not suit your needs? -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From arnd.baecker at web.de Wed Oct 5 05:52:49 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 5 Oct 2005 11:52:49 +0200 (CEST) Subject: [SciPy-dev] frange for scipy? 
In-Reply-To: <43439E7E.2020001@ucsd.edu> References: <43439E7E.2020001@ucsd.edu> Message-ID: On Wed, 5 Oct 2005, Robert Kern wrote: > Arnd Baecker wrote: [...] > > IPython.numutils.frange(xini, xfin=None, delta=None, **kw) > > to generate an array of floats with a specified number of elements. > > Does linspace() not suit your needs? It does - excellent! I just did not know about its existence - sorry. (note to self: need to buy manual ...) Many thanks, Arnd From nwagner at mecha.uni-stuttgart.de Wed Oct 5 06:47:08 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 05 Oct 2005 12:47:08 +0200 Subject: [SciPy-dev] Release of SciPy Core 0.4 (Beta) Message-ID: <4343AF2C.8050507@mecha.uni-stuttgart.de> Hi all, what has happened to the matfuncs like linalg.expm, linalg.sqrtm, ... to name a few? >>> dir (linalg) ['Heigenvalues', 'Heigenvectors', 'LinAlgError', 'MLab', 'Numeric', '__builtins__', '__doc__', '__file__', '__name__', '__path__', 'asarray', 'basic_lite', 'cholesky', 'cholesky_decomposition', 'det', 'determinant', 'eig', 'eigenvalues', 'eigenvectors', 'eigh', 'eigvals', 'eigvalsh', 'generalized_inverse', 'inv', 'inverse', 'lapack_lite', 'linear_least_squares', 'lstsq', 'math', 'multiply', 'pinv', 'singular_value_decomposition', 'solve', 'solve_linear_equations', 'svd'] >>> And what is the status of the former packages io, optimize, sparse, signal and integrate in the new scipy? Nils From rkern at ucsd.edu Wed Oct 5 06:58:20 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 05 Oct 2005 03:58:20 -0700 Subject: [SciPy-dev] Release of SciPy Core 0.4 (Beta) In-Reply-To: <4343AF2C.8050507@mecha.uni-stuttgart.de> References: <4343AF2C.8050507@mecha.uni-stuttgart.de> Message-ID: <4343B1CC.2010609@ucsd.edu> Nils Wagner wrote: > Hi all, > > what has happened to the matfuncs like linalg.expm, linalg.sqrtm, ... to > name a few?
> > >>>>dir (linalg) > > ['Heigenvalues', 'Heigenvectors', 'LinAlgError', 'MLab', 'Numeric', > '__builtins__', '__doc__', '__file__', '__name__', '__path__', > 'asarray', 'basic_lite', 'cholesky', 'cholesky_decomposition', 'det', > 'determinant', 'eig', 'eigenvalues', 'eigenvectors', 'eigh', 'eigvals', > 'eigvalsh', 'generalized_inverse', 'inv', 'inverse', 'lapack_lite', > 'linear_least_squares', 'lstsq', 'math', 'multiply', 'pinv', > 'singular_value_decomposition', 'solve', 'solve_linear_equations', 'svd'] > > > And the status of the former packages io, optimize, sparse, signal and > integrate in new scipy - how about that ? They're not in scipy_core. That's why it's called scipy_core; it's just the core functionality. The other functions and modules will be available in the complete scipy package; we're porting them over now. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From arnd.baecker at web.de Wed Oct 5 07:01:49 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 5 Oct 2005 13:01:49 +0200 (CEST) Subject: [SciPy-dev] newscipy sandbox In-Reply-To: References: Message-ID: Hi Pearu, On Wed, 5 Oct 2005, Pearu Peterson wrote: > > Hi, > > I have created sandbox in newscipy and adapted scipy setup.py file > to new scipy.distutils. At the moment the sandbox contains only > exmplpackage that is former xxx (and updated to scipy.distutils) and > should serve as an example scipy package. > So, currently installed scipy consists only of scipy.sandbox package. I just installed it and the import works, but `scipy` does not seem to contain `exmplpackage_foo_bar`: In [3]:import scipy.sandbox.exmplpackage In [4]:scipy.ex scipy.exp scipy.expand_dims scipy.extract According to the doc: Scipy example module for developers This is documentation of exmplpackage. 
Provides: exmplpackage_foo_bar - also available in scipy name space foo_gun yyy.fun So I would expect this to work - am I missing something? Best, Arnd From rkern at ucsd.edu Wed Oct 5 07:09:03 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 05 Oct 2005 04:09:03 -0700 Subject: [SciPy-dev] newscipy sandbox In-Reply-To: References: Message-ID: <4343B44F.7090005@ucsd.edu> Arnd Baecker wrote: > Hi Pearu, > > On Wed, 5 Oct 2005, Pearu Peterson wrote: > >>Hi, >> >>I have created sandbox in newscipy and adapted scipy setup.py file >>to new scipy.distutils. At the moment the sandbox contains only >>exmplpackage that is former xxx (and updated to scipy.distutils) and >>should serve as an example scipy package. >>So, currently installed scipy consists only of scipy.sandbox package. > > I just installed it and the import works, > but `scipy` does not seem to contain `exmplpackage_foo_bar`: It shouldn't.
scipy.sandbox.exmplpackage is the name of the package. It doesn't get loaded into the scipy.* namespace, nor should it. It shouldn't be built by default, either, but that's another issue. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Wed Oct 5 06:54:51 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 5 Oct 2005 05:54:51 -0500 (CDT) Subject: [SciPy-dev] newscipy sandbox In-Reply-To: <4343B44F.7090005@ucsd.edu> References: <4343B44F.7090005@ucsd.edu> Message-ID: On Wed, 5 Oct 2005, Robert Kern wrote: >> I just installed it and the import works, >> but `scipy` does not seem to contain `exmplpackage_foo_bar`: > > It shouldn't. scipy.sandbox.exmplpackage is the name of the package. It > doesn't get loaded into the scipy.* namespace, nor should it. It > shouldn't be built by default, either, but that's another issue. Indeed, though, it should be safe to build exmplpackage for testing purposes. Hmm, or did you mean that packages under sandbox should be installed under the scipy tree, not under the scipy.sandbox tree? Pearu From rkern at ucsd.edu Wed Oct 5 08:02:31 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 05 Oct 2005 05:02:31 -0700 Subject: [SciPy-dev] newscipy sandbox In-Reply-To: References: <4343B44F.7090005@ucsd.edu> Message-ID: <4343C0D7.1050009@ucsd.edu> Pearu Peterson wrote: > > On Wed, 5 Oct 2005, Robert Kern wrote: > >>> I just installed it and the import works, >>> but `scipy` does not seem to contain `exmplpackage_foo_bar`: >> >> It shouldn't. scipy.sandbox.exmplpackage is the name of the package. It >> doesn't get loaded into the scipy.* namespace, nor should it. It >> shouldn't be built by default, either, but that's another issue. > > Indeed, though, it should be safe to build exmplpackage for testing > purposes. True.
However, I would like it to be the rule that all of the sandbox subpackages are always commented out in the setup.py. These packages are likely to have fragile builds, and I don't want users (even SVN users) to have their builds break for no good reason. Since exmplpackage is not only an example scipy package but an example sandbox package, I'm leaning towards wanting it commented out, too, in the checked-in version of the sandbox setup.py. > Hmm, or did you mean that packages under sandbox > should be installed under scipy tree, not under scipy.sandbox tree? No, your first interpretation was correct. These packages should definitely live under scipy.sandbox until they're officially moved out. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Wed Oct 5 07:21:57 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 5 Oct 2005 06:21:57 -0500 (CDT) Subject: [SciPy-dev] newscipy sandbox In-Reply-To: <4343C0D7.1050009@ucsd.edu> References: <4343B44F.7090005@ucsd.edu> <4343C0D7.1050009@ucsd.edu> Message-ID: On Wed, 5 Oct 2005, Robert Kern wrote: > Pearu Peterson wrote: >> >> On Wed, 5 Oct 2005, Robert Kern wrote: >> >>>> I just installed it and the import works, >>>> but `scipy` does not seem to contain `exmplpackage_foo_bar`: >>> >>> It shouldn't. scipy.sandbox.exmplpackage is the name of the package. It >>> doesn't get loaded into the scipy.* namespace, nor should it. It >>> shouldn't be built by default, either, but that's another issue. >> >> Indeed, though, it should be safe to build exmplpackage for testing >> purposes. > > True. However, I would like it to be the rule that all of the sandbox > subpackages are always commented out in the setup.py. These packages are > likely to have fragile builds, and I don't want users (even SVN users) > to have their builds break for no good reason.
Since exmplpackage is not > only an example scipy package but an example sandbox package, I'm > leaning towards wanting it commented out, too, in the checked-in version > of the sandbox setup.py. > >> Hmm, or did you mean that packages under sandbox >> should be installed under scipy tree, not under scipy.sandbox tree? > > No, your first interpretation was correct. These packages should > definitely live under scipy.sandbox until they're officially moved out. Ok. So, in order to disable or enable sandbox packages, one only needs to comment out (or back in) the following line in the Lib/setup.py file:

config.add_subpackage('sandbox')

Pearu From arnd.baecker at web.de Wed Oct 5 09:11:02 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 5 Oct 2005 15:11:02 +0200 (CEST) Subject: [SciPy-dev] newscipy sandbox In-Reply-To: <4343B44F.7090005@ucsd.edu> References: <4343B44F.7090005@ucsd.edu> Message-ID: On Wed, 5 Oct 2005, Robert Kern wrote: > Arnd Baecker wrote: > > Hi Pearu, > > > > On Wed, 5 Oct 2005, Pearu Peterson wrote: > > > >>Hi, > >> > >>I have created sandbox in newscipy and adapted scipy setup.py file > >>to new scipy.distutils. At the moment the sandbox contains only > >>exmplpackage that is former xxx (and updated to scipy.distutils) and > >>should serve as an example scipy package. > >>So, currently installed scipy consists only of scipy.sandbox package. > > > > I just installed it and the import works, > > but `scipy` does not seem to contain `exmplpackage_foo_bar`: > > It shouldn't. scipy.sandbox.exmplpackage is the name of the package. It > doesn't get loaded into the scipy.* namespace, nor should it. It > shouldn't be built by default, either, but that's another issue. From the doc-string of the package `scipy.sandbox.exmplpackage` I thought that the *function* `exmplpackage_foo_bar` is available as `scipy.exmplpackage_foo_bar()`: In [3]:scipy.sandbox.exmplpackage?
Type: module Base Class: String Form: Namespace: Interactive File: /home/abaecker/NBB/SOFTWARE/scipy_new/lib/python2.3/site-packages/scipy/sandbox/exmplpackage/__init__.py Docstring: Scipy example module for developers This is documentation of exmplpackage. Provides: exmplpackage_foo_bar - also available in scipy name space foo_gun yyy.fun (I just checked with `config.add_subpackage('sandbox')` activated and there is no function `scipy.exmplpackage_foo_bar`, which, if I understood you correctly, should not be the case anyway - sorry, I was just confused by the wording in the above doc-string and assumed it should be there...) Best, Arnd From Fernando.Perez at colorado.edu Wed Oct 5 11:12:19 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 05 Oct 2005 09:12:19 -0600 Subject: [SciPy-dev] frange for scipy? In-Reply-To: References: <43439E7E.2020001@ucsd.edu> Message-ID: <4343ED53.5040201@colorado.edu> Arnd Baecker wrote: > On Wed, 5 Oct 2005, Robert Kern wrote: > > >>Arnd Baecker wrote: > > > [...] > >>> IPython.numutils.frange(xini, xfin=None, delta=None, **kw) >>>to generate an array of floats with a specified number of elements. >> >>Does linspace() not suit your needs? > > > It does - excellent! I just did not know about its existence - sorry. > (note to self: need to buy manual ...) No worries: a few weeks ago John Hunter, after graciously taking frange into pylab, shamed me into showing my ignorance of both linspace and logspace :) So frange can be quietly deprecated, I think (at least I've stopped using it since I learned about lin/logspace). 
Cheers, f From jorgen.stenarson at bostream.nu Wed Oct 5 13:45:10 2005 From: jorgen.stenarson at bostream.nu (Jörgen Stenarson) Date: Wed, 05 Oct 2005 19:45:10 +0200 Subject: [SciPy-dev] Build of newcore on macosX and with mingw32 In-Reply-To: References: <4342C2F2.4080703@bostream.nu> <4342DFA7.6060100@bostream.nu> Message-ID: <43441126.1050903@bostream.nu> Hi, It turned out I had an old version of scipy installed that I had forgotten about. After deleting the old version + rm -Rf build/ the compilation process works as it should. But when I try to import scipy I get a bus error. I have included a dump with python -v /Jörgen $ python -v ... Python 2.4.1 (#1, Aug 27 2005, 23:10:21) [GCC 4.0.0 20041026 (Apple Computer, Inc. build 4061)] on darwin Type "help", "copyright", "credits" or "license" for more information. import readline # dynamically loaded from /opt/local/lib/python2.4/lib-dynload/readline.so >>> import scipy import scipy # directory scipy # scipy/__init__.pyc matches scipy/__init__.py import scipy # precompiled from scipy/__init__.pyc import scipy.base # directory scipy/base # scipy/base/__init__.pyc matches scipy/base/__init__.py import scipy.base # precompiled from scipy/base/__init__.pyc # scipy/base/info_scipy_base.pyc matches scipy/base/info_scipy_base.py import scipy.base.info_scipy_base # precompiled from scipy/base/info_scipy_base.pyc # scipy/core_version.pyc matches scipy/core_version.py import scipy.core_version # precompiled from scipy/core_version.pyc import scipy.base.multiarray # dynamically loaded from scipy/base/multiarray.so import scipy.base.umath # dynamically loaded from scipy/base/umath.so # scipy/base/numerictypes.pyc matches scipy/base/numerictypes.py import scipy.base.numerictypes # precompiled from scipy/base/numerictypes.pyc # scipy/base/numeric.pyc matches scipy/base/numeric.py import scipy.base.numeric # precompiled from scipy/base/numeric.pyc import math # dynamically loaded from
/opt/local/lib/python2.4/lib-dynload/math.soimport scipy.base._compiled_base # dynamically loaded from scipy/base/_compiled_base.so import scipy.lib # directory scipy/lib # scipy/lib/__init__.pyc matches scipy/lib/__init__.py import scipy.lib # precompiled from scipy/lib/__init__.pyc import scipy.lib._dotblas # dynamically loaded from scipy/lib/_dotblas.so # scipy/base/arrayprint.pyc matches scipy/base/arrayprint.py import scipy.base.arrayprint # precompiled from scipy/base/arrayprint.pyc import cStringIO # dynamically loaded from /opt/local/lib/python2.4/lib-dynload/cStringIO.so import cPickle # dynamically loaded from /opt/local/lib/python2.4/lib-dynload/cPickle.so # scipy/base/oldnumeric.pyc matches scipy/base/oldnumeric.py import scipy.base.oldnumeric # precompiled from scipy/base/oldnumeric.pyc # scipy/base/function_base.pyc matches scipy/base/function_base.py import scipy.base.function_base # precompiled from scipy/base/function_base.pyc import operator # dynamically loaded from /opt/local/lib/python2.4/lib-dynload/operator.so # scipy/base/type_check.pyc matches scipy/base/type_check.py import scipy.base.type_check # precompiled from scipy/base/type_check.pyc # scipy/base/ufunclike.pyc matches scipy/base/ufunclike.py import scipy.base.ufunclike # precompiled from scipy/base/ufunclike.pyc # scipy/base/shape_base.pyc matches scipy/base/shape_base.py import scipy.base.shape_base # precompiled from scipy/base/shape_base.pyc # scipy/base/matrix.pyc matches scipy/base/matrix.py import scipy.base.matrix # precompiled from scipy/base/matrix.pyc # /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/string.pyc matches /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/string.py import string # precompiled from /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/string.pyc # /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/re.pyc matches 
/opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/re.py import re # precompiled from /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/re.pyc # /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre.pyc matches /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre.py import sre # precompiled from /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre.pyc # /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre_compile.pyc matches /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre_compile.py import sre_compile # precompiled from /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre_compile.pyc import _sre # builtin # /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre_constants.pyc matches /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre_constants.py import sre_constants # precompiled from /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre_constants.pyc # /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre_parse.pyc matches /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre_parse.py import sre_parse # precompiled from /opt/local/Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/sre_parse.pyc import strop # dynamically loaded from /opt/local/lib/python2.4/lib-dynload/strop.so # scipy/base/index_tricks.pyc matches scipy/base/index_tricks.py import scipy.base.index_tricks # precompiled from scipy/base/index_tricks.pyc # scipy/base/twodim_base.pyc matches scipy/base/twodim_base.py import scipy.base.twodim_base # precompiled from scipy/base/twodim_base.pyc # scipy/base/scimath.pyc matches scipy/base/scimath.py import scipy.base.scimath # precompiled from scipy/base/scimath.pyc Bus error Pearu Peterson wrote: > > > On Tue, 4 Oct 2005, 
Jörgen Stenarson wrote:
>>>> gcc: build/src/scipy/base/src/umathmodule.c
>>>> build/src/scipy/base/src/umathmodule.c:56: error: static declaration
>>>> of 'acoshf' follows non-static declaration
>>>> build/src/scipy/base/src/umathmodule.c:61: error: static declaration
>>>> of 'asinhf' follows non-static declaration
>>>> build/src/scipy/base/src/umathmodule.c:66: error: static declaration
>>>> of 'atanhf' follows non-static declaration
>>>> build/src/scipy/base/src/umathmodule.c:56: error: static declaration
>>>> of 'acoshf' follows non-static declaration
>>>> build/src/scipy/base/src/umathmodule.c:61: error: static declaration
>>>> of 'asinhf' follows non-static declaration
>>>> build/src/scipy/base/src/umathmodule.c:66: error: static declaration
>>>> of 'atanhf' follows non-static declaration
>>>
>>> Could you send the contents of build/src/scipy/base/config.h?
>>
>> /* #define SIZEOF_SHORT 2 */
>> /* #define SIZEOF_INT 4 */
>> /* #define SIZEOF_LONG 4 */
>> /* #define SIZEOF_FLOAT 4 */
>> /* #define SIZEOF_DOUBLE 8 */
>> #define SIZEOF_LONG_DOUBLE 16
>> #define SIZEOF_PY_INTPTR_T 4
>> /* #define SIZEOF_LONG_LONG 8 */
>> #define SIZEOF_PY_LONG_LONG 8
>> /* #define CHAR_BIT 8 */
>> #define MATHLIB
>> #define HAVE_LONGDOUBLE_FUNCS
>> #define HAVE_FLOAT_FUNCS
>> #define HAVE_INVERSE_HYPERBOLIC
>> #define HAVE_INVERSE_HYPERBOLIC_FLOAT
>>
>>> What happens if you add
>>>
>>> #define HAVE_INVERSE_HYPERBOLIC_FLOAT
>>
>> As you can see above, it is already there.
>
> So, the code that gives errors should not be available to the compiler.
>
>> Do you have any other ideas?
>
> Hmm, maybe the building process picks config.h up from some other place.
> Introduce some syntax error into build/src/scipy/base/config.h in order to
> verify this.
>
> Try to remove the old scipy.core from your system, including the header
> files in the include/python2.4/scipy directory, then do `rm -rf build` and
> try to rebuild.
>
> Pearu
>
> ------------------------------------------------------------------------
>
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev at scipy.net
> http://www.scipy.net/mailman/listinfo/scipy-dev

From oliphant at ee.byu.edu Wed Oct 5 17:05:10 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Wed, 05 Oct 2005 15:05:10 -0600
Subject: [SciPy-dev] More bug fixes in SVN
Message-ID: <43444006.6030208@ee.byu.edu>

I've fixed some bugs in SVN related to casting in the BUFFER section of the ufunc looping code. (Use setbufsize(40) and an array of size 100 to see it...)

I've also fixed scalar operations, so that most type_check.py tests pass (not the "correct" way -- just by modifying the logic a bit in the general-purpose array_scalar calculation function).

Keep the bug reports coming.... :-)

-Travis

From oliphant at ee.byu.edu Wed Oct 5 17:11:09 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Wed, 05 Oct 2005 15:11:09 -0600
Subject: [SciPy-dev] All tests in test_basic pass
Message-ID: <4344416D.3040106@ee.byu.edu>

With the new SVN, all tests in test_basic pass.

I think it's time for another beta release...

-Travis

From rkern at ucsd.edu Wed Oct 5 19:57:35 2005
From: rkern at ucsd.edu (Robert Kern)
Date: Wed, 05 Oct 2005 16:57:35 -0700
Subject: [SciPy-dev] All tests in test_basic pass
In-Reply-To: <4344416D.3040106@ee.byu.edu>
References: <4344416D.3040106@ee.byu.edu>
Message-ID: <4344686F.4040404@ucsd.edu>

Travis Oliphant wrote:
> With the new SVN, all tests in test_basic pass.
>
> I think it's time for another beta release...

I'm getting a segfault if I try to evaluate something after I run the test suite from the interpreter. E.g.

..........................................................................................................................
----------------------------------------------------------------------
Ran 122 tests in 0.909s

OK
>>> 1
zsh: segmentation fault  python

The stack trace:

Thread 0 Crashed:
0   org.python.python   0x100bf9e8 PyObject_GC_UnTrack + 28 (gcmodule.c:1234)
1   org.python.python   0x1004dfe4 subtype_dealloc + 196 (typeobject.c:643)
2   org.python.python   0x10037fd4 dict_dealloc + 192 (dictobject.c:728)
3   org.python.python   0x1004e1d8 subtype_dealloc + 696 (typeobject.c:692)
4   org.python.python   0x10036984 PyDict_SetItem + 276 (dictobject.c:398)
5   org.python.python   0x1003cbe8 PyObject_GenericSetAttr + 440 (object.c:1371)
6   org.python.python   0x1003c62c PyObject_SetAttr + 244 (object.c:1124)
7   org.python.python   0x1003dae4 PyObject_SetAttrString + 100 (object.c:1044)
8   org.python.python   0x100b6548 sys_displayhook + 140 (sysmodule.c:105)

Also, when I run the tests from the command line (python -c "import scipy; scipy.test(10,10)"), I get a segfault in the middle of the test run:

check_scalar (scipy.base.function_base.test_function_base.test_vectorize) ... ok
check_simple (scipy.base.function_base.test_function_base.test_vectorize) ... ok
check_matrix (scipy.base.twodim_base.test_twodim_base.test_diag) ... zsh: segmentation fault  python -c "import scipy; scipy.test(10,10)"

with this stack trace:

Thread 0 Crashed:
0   org.python.python   0x1003ee8c PyObject_Malloc + 88 (obmalloc.c:605)
1   multiarray.so       0x002077c0 dyld_stub_PyErr_SetString + 1848400
2   multiarray.so       0x0020b3fc dyld_stub_PyErr_SetString + 1863820
3   multiarray.so       0x0020b230 dyld_stub_PyErr_SetString + 1863360
4   umath.so            0x000c9adc initumath + 77344

Running test_twodim_base.py from the command line gives no error, nor does running test_basic.py.

--
Robert Kern
rkern at ucsd.edu

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
 -- Richard Harter

From oliphant at ee.byu.edu Wed Oct 5 20:49:54 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Wed, 05 Oct 2005 18:49:54 -0600
Subject: [SciPy-dev] All tests in test_basic pass
In-Reply-To: <4344686F.4040404@ucsd.edu>
References: <4344416D.3040106@ee.byu.edu> <4344686F.4040404@ucsd.edu>
Message-ID: <434474B2.8050902@ee.byu.edu>

Robert Kern wrote:
> Travis Oliphant wrote:
>> With the new SVN, all tests in test_basic pass.
>>
>> I think it's time for another beta release...
>
> I'm getting a segfault if I try to evaluate something after I run the
> test suite from the interpreter. E.g.

I can't reproduce this. Darn..

-Travis

From pearu at scipy.org Wed Oct 5 21:00:37 2005
From: pearu at scipy.org (Pearu Peterson)
Date: Wed, 5 Oct 2005 20:00:37 -0500 (CDT)
Subject: [SciPy-dev] All tests in test_basic pass
In-Reply-To: <4344686F.4040404@ucsd.edu>
References: <4344416D.3040106@ee.byu.edu> <4344686F.4040404@ucsd.edu>
Message-ID:

On Wed, 5 Oct 2005, Robert Kern wrote:
> Travis Oliphant wrote:
>> With the new SVN, all tests in test_basic pass.
>>
>> I think it's time for another beta release...
>
> I'm getting a segfault if I try to evaluate something after I run the
> test suite from the interpreter. E.g.
>
> ..........................................................................................................................
> ----------------------------------------------------------------------
> Ran 122 tests in 0.909s
>
> OK
>
>>>> 1
> zsh: segmentation fault python

I get a segfault after calling scipy.test() several times in a row. In most cases python crashes when scipy.test() has been called 12 times. Here's the libwadpy output:

...
----------------------------------------------------------------------
Ran 122 tests in 0.243s

OK
WAD: Collecting debugging information...
WAD: Heap overflow detected.
WAD: Segmentation fault.
#23 0x080b8695 in PyEval_EvalCode()
#22 0x080b8417 in PyEval_EvalCodeEx()
#21 0x080b6b64 in ?()
#20 0x080b8417 in PyEval_EvalCodeEx()
#19 0x080b6b64 in ?()
#18 0x080b8417 in PyEval_EvalCodeEx()
#17 0x080b6b64 in ?()
#16 0x080b8417 in PyEval_EvalCodeEx()
#15 0x080b6d7c in ?()
#14 0x080b6b64 in ?()
#13 0x080b8417 in PyEval_EvalCodeEx()
#12 0x080b6b64 in ?()
#11 0x080b8417 in PyEval_EvalCodeEx()
#10 0x080b7733 in ?()
#9  0x080b8417 in PyEval_EvalCodeEx()
#8  0x080b6b64 in ?()
#7  0x080b7fa6 in PyEval_EvalCodeEx()
#6  0x080f8b63 in ?()
#5  0x0808bb3e in ?()
#4  0x0807aa8d in ?()
#3  0x0806e0e0 in ?()
#2  0x0808bb3e in ?()
#1  0x0807aa15 in ?()
#0  0x080dfbb1 in PyObject_GC_UnTrack()
---------------------------------------------------------------------------
exceptions.SegFault        Traceback (most recent call last)

SegFault: [ C stack trace ]

#23 0x080b8695 in PyEval_EvalCode()
#22 0x080b8417 in PyEval_EvalCodeEx()
#21 0x080b6b64 in ?()
#20 0x080b8417 in PyEval_EvalCodeEx()
#19 0x080b6b64 in ?()
#18 0x080b8417 in PyEval_EvalCodeEx()
#17 0x080b6b64 in ?()
#16 0x080b8417 in PyEval_EvalCodeEx()
#15 0x080b6d7c in ?()
#14 0x080b6b64 in ?()
#13 0x080b8417 in PyEval_EvalCodeEx()
#12 0x080b6b64 in ?()
#11 0x080b8417 in PyEval_EvalCodeEx()
#10 0x080b7733 in ?()
#9  0x080b8417 in PyEval_EvalCodeEx()
#8  0x080b6b64 in ?()
#7  0x080b7fa6 in PyEval_EvalCodeEx()
#6  0x080f8b63 in ?()
#5  0x0808bb3e in ?()
#4  0x0807aa8d in ?()
#3  0x0806e0e0 in ?()
#2  0x0808bb3e in ?()
#1  0x0807aa15 in ?()
#0  0x080dfbb1 in PyObject_GC_UnTrack()

WAD: Collecting debugging information...
WAD: Segmentation fault.
#5 0x080549a1 in ?()
#4 0x4102e413 in ?()
#3 0x08054e41 in Py_Main()
#2 0x080d8bc6 in Py_Finalize()
#1 0x080e0764 in PyGC_Collect()
#0 0x080dfe50 in ?()

Pearu

From joe at enthought.com Wed Oct 5 22:08:12 2005
From: joe at enthought.com (Joe Cooper)
Date: Wed, 05 Oct 2005 21:08:12 -0500
Subject: [SciPy-dev] Testing
Message-ID: <4344870C.9020406@enthought.com>

Please ignore me. Just testing some updates...
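[Editor's note: the repeated-run approach above (rerunning scipy.test() until the interpreter crashes) is a standard way to surface reference-counting bugs. A minimal sketch of the same idea in modern, pure Python, using hypothetical stand-in functions rather than anything from scipy, watches refcounts directly with sys.getrefcount instead of waiting for a segfault:]

```python
# Illustration only: `leaky` and `clean` are made-up stand-ins, not scipy code.
# Repeatedly exercising a code path and comparing refcounts before and after
# exposes a leak (a missing DECREF analogue) long before memory corruption.
import sys

SENTINEL = object()

def leaky(obj, _cache=[]):
    # Bug: retains an extra reference on every call.
    _cache.append(obj)

def clean(obj):
    # Correct: takes no lasting reference.
    repr(obj)

def refcount_delta(func, obj, n=12):
    """Call func(obj) n times and report the net refcount change."""
    before = sys.getrefcount(obj)
    for _ in range(n):
        func(obj)
    return sys.getrefcount(obj) - before

print(refcount_delta(clean, SENTINEL))   # 0  -> no leak
print(refcount_delta(leaky, SENTINEL))   # 12 -> one leaked reference per call
```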
From pearu at scipy.org Wed Oct 5 21:21:18 2005
From: pearu at scipy.org (Pearu Peterson)
Date: Wed, 5 Oct 2005 20:21:18 -0500 (CDT)
Subject: [SciPy-dev] All tests in test_basic pass
In-Reply-To: <434474B2.8050902@ee.byu.edu>
References: <4344416D.3040106@ee.byu.edu> <4344686F.4040404@ucsd.edu> <434474B2.8050902@ee.byu.edu>
Message-ID:

On Wed, 5 Oct 2005, Travis Oliphant wrote:
> Robert Kern wrote:
>> Travis Oliphant wrote:
>>> With the new SVN, all tests in test_basic pass.
>>>
>>> I think it's time for another beta release...
>>
>> I'm getting a segfault if I try to evaluate something after I run the
>> test suite from the interpreter. E.g.
>
> I can't reproduce this. Darn..

This segfault seems to occur when running the function_base tests:

In [1]: from scipy.test.testing import ScipyTest
In [2]: ScipyTest('scipy.base.function_base').test()
Found 23 tests for scipy.base.function_base
Found 0 tests for __main__
.......................
----------------------------------------------------------------------
Ran 23 tests in 0.076s

OK
Segmentation fault

When enabling all checks except test_vectorize in test_function_base.py, no segfault occurs. And when disabling all checks except test_vectorize, the segfault occurs after calling test multiple times.

In conclusion, the vectorize function seems to be the cause of these segfaults.

Pearu

From oliphant at ee.byu.edu Wed Oct 5 22:54:59 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Wed, 05 Oct 2005 20:54:59 -0600
Subject: [SciPy-dev] All tests in test_basic pass
In-Reply-To:
References: <4344416D.3040106@ee.byu.edu> <4344686F.4040404@ucsd.edu> <434474B2.8050902@ee.byu.edu>
Message-ID: <43449203.2080005@ee.byu.edu>

Pearu Peterson wrote:
> In conclusion, the vectorize function seems to be the cause of these
> segfaults.

Thanks for tracking this down. No doubt it is an object_array issue, then. Probably another uninitialized object somewhere, or a refcounting problem.
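[Editor's note: the "uninitialized object" failure mode Travis suspects here — and later confirms in the ufunc_frompyfunc/userloops fix below in this thread — can be sketched with a hypothetical Python analogue. In C, releasing a never-initialized pointer member in a destructor walks garbage memory; in Python the equivalent mistake is memory-safe and fails loudly, but the shape of the bug and of the fix (initialize the member to a safe empty value up front, the way Py_XDECREF tolerates NULL) is the same. These classes are illustrative only, not scipy source:]

```python
# Hypothetical analogue of the crash: a "destructor-time" release of a member
# that __init__ never set.  Initializing it to None (like setting a C pointer
# to NULL) makes the release a safe no-op.
class BuggyUFunc:
    def __init__(self, name):
        self.name = name
        # BUG: self.userloops is never initialized.

    def release(self):
        return self.userloops        # analogous to XDECREF(self->userloops)

class FixedUFunc:
    def __init__(self, name):
        self.name = name
        self.userloops = None        # FIX: like initializing the pointer to NULL

    def release(self):
        return self.userloops        # safe no-op when None, like Py_XDECREF

try:
    BuggyUFunc("sin").release()
except AttributeError as exc:
    print("buggy:", exc)             # fails loudly on the missing attribute

print("fixed:", FixedUFunc("sin").release())   # fixed: None
```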
Just when you think you've figured out refcounting...

-Travis

From oliphant at ee.byu.edu Wed Oct 5 22:57:50 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Wed, 05 Oct 2005 20:57:50 -0600
Subject: [SciPy-dev] Getting SciPy ready for New Core
Message-ID: <434492AE.2090901@ee.byu.edu>

I've been cleaning up the setup scripts for the new scipy. I'm "done" with io and am on to special.

I'd like to test for the existence of certain functions and conditionally compile some of the code, very similar to what was done in the setup.py of scipy.base.

Now, there was a lot of effort there to find the math libraries first so they could be linked against on a trial run. Is that information stored anywhere so that something like scipy.distutils.misc_util.get_math_libs() can be used?

It would be really useful, if it were.

-Travis

From rkern at ucsd.edu Wed Oct 5 23:29:06 2005
From: rkern at ucsd.edu (Robert Kern)
Date: Wed, 05 Oct 2005 20:29:06 -0700
Subject: [SciPy-dev] Getting SciPy ready for New Core
In-Reply-To: <434492AE.2090901@ee.byu.edu>
References: <434492AE.2090901@ee.byu.edu>
Message-ID: <43449A02.4080508@ucsd.edu>

Travis Oliphant wrote:
> I've been cleaning up the setup scripts for new scipy.
>
> I'm "done" with io and am onto special.
>
> I'd like to test for the existence of certain functions and
> conditionally compile some of the code, very similar to what was done in
> setup.py of scipy.base
>
> Now, there was a lot of effort there to find the math libraries first so
> they could be linked against on a trial run. Is that information stored
> anywhere so that scipy.distutils.misc_util.get_math_libs() or something
> can be used?
>
> It would be really useful, if it were.

For that, you could parse config.h as is done in scipy/base/setup.py[76:86].

We really do need a general configuration mechanism that can detect things and save what was detected. PETSc's BuildSystem does this particularly well.
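[Editor's note: Robert's suggestion — recover what the trial-compile step recorded in config.h rather than redetecting it — can be sketched as follows. This is a rough illustration, not the actual scipy/base/setup.py code; the parser and the sample config.h contents (borrowed from the build report earlier in this thread) are assumptions:]

```python
# Sketch: extract the active #define macros from a generated config.h so a
# package's setup.py can conditionally compile against them.  Commented-out
# defines (/* ... */) mean the feature was NOT detected.
import re

CONFIG_H = """\
/* #define SIZEOF_SHORT 2 */
#define SIZEOF_LONG_DOUBLE 16
#define MATHLIB
#define HAVE_INVERSE_HYPERBOLIC
#define HAVE_INVERSE_HYPERBOLIC_FLOAT
"""

def parse_config_h(text):
    """Return {macro: value-or-None} for active (uncommented) #define lines."""
    defines = {}
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("/*"):          # commented out: feature not detected
            continue
        m = re.match(r"#define\s+(\w+)\s*(.*)", line)
        if m:
            defines[m.group(1)] = m.group(2) or None
    return defines

macros = parse_config_h(CONFIG_H)
print("HAVE_INVERSE_HYPERBOLIC_FLOAT" in macros)   # True
print("SIZEOF_SHORT" in macros)                    # False
```

In a real setup script the text would of course come from reading build/src/scipy/base/config.h off disk.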
I'll start looking into integrating its configuration component with distutils.

--
Robert Kern
rkern at ucsd.edu

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
 -- Richard Harter

From nwagner at mecha.uni-stuttgart.de Thu Oct 6 03:01:22 2005
From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
Date: Thu, 06 Oct 2005 09:01:22 +0200
Subject: [SciPy-dev] All tests in test_basic pass
In-Reply-To: <434474B2.8050902@ee.byu.edu>
References: <4344416D.3040106@ee.byu.edu> <4344686F.4040404@ucsd.edu> <434474B2.8050902@ee.byu.edu>
Message-ID: <4344CBC2.2050609@mecha.uni-stuttgart.de>

Travis Oliphant wrote:
> Robert Kern wrote:
>> Travis Oliphant wrote:
>>> With the new SVN, all tests in test_basic pass.
>>>
>>> I think it's time for another beta release...
>>
>> I'm getting a segfault if I try to evaluate something after I run the
>> test suite from the interpreter. E.g.
>
> I can't reproduce this. Darn..
>
> -Travis

If I run scipy.test(1,verbosity=10) several times I'm getting a segfault:

check_matrix (scipy.base.twodim_base.test_twodim_base.test_diag) ...
Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 1076102528 (LWP 9144)]
PyObject_Malloc (nbytes=44) at Objects/obmalloc.c:605
605             if ((pool->freeblock = *(block **)bp) != NULL) {
(gdb) bt
#0  PyObject_Malloc (nbytes=44) at Objects/obmalloc.c:605
#1  0x080a230b in PyType_GenericAlloc (type=0x4035fb40, nitems=0) at Objects/typeobject.c:455
#2  0x4031ac73 in PyArray_New (subtype=0x4035fb40, nd=2, dims=0x82a29a8, type_num=7, strides=0x0, data=0x82c06f0 "", itemsize=4, flags=1833, obj=0x406cddd0) at arrayobject.c:3147
#3  0x4031c698 in PyArray_Newshape (self=0x406cddd0, newdims=0x0) at scipy/base/src/multiarraymodule.c:272
#4  0x4031c746 in PyArray_Reshape (self=0x406cddd0, shape=0x4029252c) at scipy/base/src/multiarraymodule.c:201
#5  0x403763b8 in ufunc_outer (self=0x4029e4e0, args=Variable "args" is not available.) at ufuncobject.c:2657
#6  0x0811d436 in PyCFunction_Call (func=0x405b6c8c, arg=0x4029248c, kw=0x0) at Objects/methodobject.c:93
#7  0x080c66d9 in PyEval_EvalFrame (f=0x81d4c34) at Python/ceval.c:3547
#8  0x080c7312 in PyEval_EvalFrame (f=0x81c53c4) at Python/ceval.c:3629
#9  0x080c7312 in PyEval_EvalFrame (f=0x81c3cb4) at Python/ceval.c:3629
#10 0x080c7a54 in PyEval_EvalCodeEx (co=0x40408f60, globals=0x403dff0c, locals=0x0, args=0x4041b618, argcount=2, kws=0x0, kwcount=0, defs=0x4040fd98, defcount=1, closure=0x0) at Python/ceval.c:2730
#11 0x0811c682 in function_call (func=0x4040db8c, arg=0x4041b60c, kw=0x0) at Objects/funcobject.c:550
#12 0x0805928e in PyObject_Call (func=0x4040db8c, arg=0x4041b60c, kw=0x0) at Objects/abstract.c:1746
#13 0x08064b24 in instancemethod_call (func=0x55, arg=0x4061236c, kw=0x0) at Objects/classobject.c:2431
#14 0x0805928e in PyObject_Call (func=0x405b575c, arg=0x4061236c, kw=0x0) at Objects/abstract.c:1746
#15 0x08097557 in slot_tp_call (self=0x405f1e6c, args=0x4061236c, kwds=0x0) at Objects/typeobject.c:4526
#16 0x0805928e in PyObject_Call (func=0x405f1e6c, arg=0x4061236c, kw=0x0) at Objects/abstract.c:1746
#17 0x080c408d in PyEval_EvalFrame (f=0x81ba9cc) at Python/ceval.c:3755
#18 0x080c7a54 in PyEval_EvalCodeEx (co=0x4040c560, globals=0x403dff0c, locals=0x0, args=0x4041b638, argcount=2, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2730
#19 0x0811c682 in function_call (func=0x4040df7c, arg=0x4041b62c, kw=0x0) at Objects/funcobject.c:550
#20 0x0805928e in PyObject_Call (func=0x4040df7c, arg=0x4041b62c, kw=0x0) at Objects/abstract.c:1746
#21 0x08064b24 in instancemethod_call (func=0x55, arg=0x405ba46c, kw=0x0) at Objects/classobject.c:2431
#22 0x0805928e in PyObject_Call (func=0x403b80a4, arg=0x405ba46c, kw=0x0) at Objects/abstract.c:1746
#23 0x08097557 in slot_tp_call (self=0x4041874c, args=0x405ba46c, kwds=0x0) at Objects/typeobject.c:4526
#24 0x0805928e in PyObject_Call (func=0x4041874c, arg=0x405ba46c, kw=0x0) at Objects/abstract.c:1746
#25 0x080c408d in PyEval_EvalFrame (f=0x817c21c) at Python/ceval.c:3755
#26 0x080c7312 in PyEval_EvalFrame (f=0x81938ec) at Python/ceval.c:3629
#27 0x080c7a54 in PyEval_EvalCodeEx (co=0x403e52e0, globals=0x403df68c, locals=0x0, args=0x8196d48, argcount=2, kws=0x8196d50, kwcount=1, defs=0x40412558, defcount=2, closure=0x0) at Python/ceval.c:2730
#28 0x080c52a8 in PyEval_EvalFrame (f=0x8196bfc) at Python/ceval.c:3639
#29 0x080c7a54 in PyEval_EvalCodeEx (co=0x405f8aa0, globals=0x40259824, locals=0x40259824, args=0x0, argcount=0, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2730
#30 0x080c7c85 in PyEval_EvalCode (co=0x405f8aa0, globals=0x40259824, locals=0x40259824) at Python/ceval.c:484
#31 0x080f67e6 in PyRun_InteractiveOneFlags (fp=0x40235720, filename=0x8122a01 "", flags=0xbfffe924) at Python/pythonrun.c:1264
#32 0x080f6a49 in PyRun_InteractiveLoopFlags (fp=0x40235720, filename=0x8122a01 "", flags=0xbfffe924) at Python/pythonrun.c:694
#33 0x080f6b70 in PyRun_AnyFileExFlags (fp=0x40235720, filename=0x8122a01 "", closeit=0, flags=0xbfffe924) at Python/pythonrun.c:657
---Type <return> to continue, or q <return> to quit---
#34 0x08055857 in Py_Main
 (argc=0, argv=0xbfffe9e4) at Modules/main.c:484
#35 0x08054f07 in main (argc=1, argv=0xbfffe9e4) at Modules/python.c:23

Nils

From Fernando.Perez at colorado.edu Thu Oct 6 10:52:58 2005
From: Fernando.Perez at colorado.edu (Fernando Perez)
Date: Thu, 06 Oct 2005 08:52:58 -0600
Subject: [SciPy-dev] [SciPy-user] Re: Re: [Numpy-discussion] Purchasing Documentation
In-Reply-To: <1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com>
References: <43418AA8.2040809@hawaii.edu> <4341A410.5010504@optusnet.com.au> <4341B659.7070503@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> <4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu> <43430B62.9090908@ee.byu.edu> <1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com>
Message-ID: <43453A4A.30207@colorado.edu>

Matthew Brett wrote:
> So, just to clarify - does Fernando win his bet (was it dollars or
> doughnuts)? Are you happy for the community to write and release free
> basic documentation for scipy.base?

Minor correction: John's bet. I like bagels, not doughnuts :)

f

From oliphant at ee.byu.edu Thu Oct 6 15:38:16 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Thu, 06 Oct 2005 13:38:16 -0600
Subject: [SciPy-dev] [Fwd: Build Issues on Solaris]
Message-ID: <43457D28.1020000@ee.byu.edu>

-------------- next part --------------
An embedded message was scrubbed...
From: Christopher Hanley
Subject: Build Issues on Solaris
Date: Thu, 06 Oct 2005 14:27:48 -0400
Size: 12598
URL:

From stephen.walton at csun.edu Thu Oct 6 17:20:00 2005
From: stephen.walton at csun.edu (Stephen Walton)
Date: Thu, 06 Oct 2005 14:20:00 -0700
Subject: [SciPy-dev] newscipy sandbox
In-Reply-To:
References:
Message-ID: <43459500.30504@csun.edu>

Pearu Peterson wrote:
> I have created sandbox in newscipy and adapted scipy setup.py file
> to new scipy.distutils.

Thanks, Pearu, this is a good job. I am having a problem, though.
config.add_library in scipy.distutils creates names like libamos__OF__scipy.special.a instead of just libamos.a. So I guess my questions are:

(1) Should the add_library method actually do this?
(2) If yes to (1), does this mean that Lib/special/setup_special.py should be modified so its list of library names is of this form?
(3) Or, should the add_extension method of Configuration mangle the passed-in library names to be of this form?

If I get a vote, I would answer no to #1, but if there's a good reason for that answer to be yes, then I would say yes to #3 as well.

Steve

From oliphant at ee.byu.edu Thu Oct 6 18:20:19 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Thu, 06 Oct 2005 16:20:19 -0600
Subject: [SciPy-dev] Threaded support failing
Message-ID: <4345A323.7040504@ee.byu.edu>

My simple threaded support must not be right, as it is causing my multi-processor system to segfault on import scipy. I've commented out the defines for now, but would like to figure it out if anyone has any ideas.

I think the problem is that I have OBJECT function loops in the same loop as non-OBJECT function loops. Fundamentally, they need different threading support: OBJECT function loops really can't release the GIL, while other loops can. Thus, I think I'm going to have to special-case the OBJECT function loops in order to provide suitable threading support (essentially calling a different function for OBJECT array looping).

Can anybody else think of a better solution?

-Travis

From pearu at scipy.org Thu Oct 6 17:26:12 2005
From: pearu at scipy.org (Pearu Peterson)
Date: Thu, 6 Oct 2005 16:26:12 -0500 (CDT)
Subject: [SciPy-dev] newscipy sandbox
In-Reply-To: <43459500.30504@csun.edu>
References: <43459500.30504@csun.edu>
Message-ID:

On Thu, 6 Oct 2005, Stephen Walton wrote:
> Pearu Peterson wrote:
>> I have created sandbox in newscipy and adapted scipy setup.py file
>> to new scipy.distutils.
>
> Thanks, Pearu, this is a good job. I am having a problem, though.
> config.add_library in scipy.distutils creates names
> libamos__OF__scipy.special.a instead of just libamos.a. So I guess my
> questions are:
>
> (1) should the add_library method actually do this?

Not sure yet.

> (2) If yes to (1), does this mean that Lib/special/setup_special.py
> should be modified so its list of library names is of this form?

No.

> (3) Or, should the add_extension method of Configuration mangle the
> passed-in library names to be of this form?

Yes, but see (1).

> If I get a vote, I would answer no to #1, but if there's a good reason
> for that answer to be yes, then I would say yes to #3 as well.

I am not quite finished with implementing what's behind the add_library method, and so it may not work yet (I had a working version before the cvs->svn move, but that version is against the cvs repository and I still need to apply it to svn).

The idea behind this strange naming convention (which may change to a better one) is that one can build/install pure libraries that will be available to different scipy packages. A classical example is the blas/lapack libraries that are used by many extension modules from different packages. The problem is to find out how the setup scripts of packages can find the location of such libraries, which may be in a completely different branch of a directory tree; using this naming convention is one of the ways to specify library locations. I know it's ugly, and I don't like it myself, but that's just a side result of the process of looking for the right solution.

I'll try to find some time to review the add_library stuff as soon as possible.
Pearu

From oliphant at ee.byu.edu Thu Oct 6 19:07:52 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Thu, 06 Oct 2005 17:07:52 -0600
Subject: [SciPy-dev] newscipy sandbox
In-Reply-To:
References: <43459500.30504@csun.edu>
Message-ID: <4345AE48.1050809@ee.byu.edu>

Pearu Peterson wrote:
> I am not quite finished with implementing what's behind the add_library
> method, and so, it may not work yet (I had a working version before
> cvs->svn but this version is against the cvs repository that I need to
> apply to svn).

Well, it worked enough to get scipy.special working in the current newscipy SVN.

> The idea behind this strange naming convention (that may change to a
> better one) is that one can build/install pure libraries that will be
> available to different scipy packages. A classical example is blas/lapack
> libraries that are used by many extension modules from different packages.
> The problem is to find out how setup scripts of packages can find the
> location of such libraries that may be in completely different branch of
> a directory tree and using this naming convention is one of the ways to
> specify libraries locations. I know, it's ugly and I don't like it myself
> but that's just a side result of a process for looking for the right
> solution.
>
> I'll try to find some time to review the add_library stuff as soon as
> possible.

Thanks Pearu. I just wanted you to know that I was able to get scipy.special built against the new scipy with your add_library method of Configuration.

-Travis

From oliphant at ee.byu.edu Thu Oct 6 22:56:48 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Thu, 06 Oct 2005 20:56:48 -0600
Subject: [SciPy-dev] More bugs fixed
Message-ID: <4345E3F0.8010802@ee.byu.edu>

This is just to let people know that I found and fixed the problem that was causing segfaults in vectorize.
Basically, when I added the new feature to allow user-defined UFUNCS to be created, I did not set the needed (userloops) item to NULL in the ufunc_frompyfunc function (I didn't initialize it at all). Thus, when the ufunc was deallocated and the XDECREF encountered, some random segment of memory was being DECREF'd.

So, again, strange problems caused by uninitialized memory being XDECREF'd. Same problem, different reason. So at least for now, my understanding of reference counting is not at fault :-)

I also decided to add a check for object looping and act accordingly on threaded machines, and re-enabled the thread support for the ufunc loops. This seemed to work on one multiprocessor machine I have access to.

-Travis

From rkern at ucsd.edu Fri Oct 7 01:13:19 2005
From: rkern at ucsd.edu (Robert Kern)
Date: Thu, 06 Oct 2005 22:13:19 -0700
Subject: [SciPy-dev] More bugs fixed
In-Reply-To: <4345E3F0.8010802@ee.byu.edu>
References: <4345E3F0.8010802@ee.byu.edu>
Message-ID: <434603EF.5090209@ucsd.edu>

Travis Oliphant wrote:
> This is just to let people know that I found and fixed the problem that
> was causing segfaults in vectorize.

Excellent! I no longer get segfaults or bus errors on my machine. Thank you.

On an unrelated note, I would like to propose that we get rid of scipy.stats in scipy_core. At the moment, the only things there are a few aliases (which can be moved, and which I would like to deprecate; they only existed as separate entities in the RANLIB world to work around interface issues that no longer exist) and a private helper function to help pickling RandomState instances. However, scipy.stats (scipy_core) is now littered with non-uniform RNG functions (10 more with my last checkin!). These are names that we would like to use for full distribution objects in scipy.stats proper. codeconverter.py can change RandomArray to scipy.lib.mtrand instead.
--
Robert Kern
rkern at ucsd.edu

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
 -- Richard Harter

From stephen.walton at csun.edu Fri Oct 7 01:17:49 2005
From: stephen.walton at csun.edu (Stephen Walton)
Date: Thu, 06 Oct 2005 22:17:49 -0700
Subject: [SciPy-dev] newscipy sandbox
In-Reply-To: <4345AE48.1050809@ee.byu.edu>
References: <43459500.30504@csun.edu> <4345AE48.1050809@ee.byu.edu>
Message-ID: <434604FD.6030508@csun.edu>

Travis Oliphant wrote:
> Pearu Peterson wrote:
>> I am not quite finished with implementing what's behind the add_library
>> method,
>
> Well, it worked enough to get scipy.special working in current newscipy
> SVN.

With all due respect, and with knowledge I'm probably doing something wrong, I don't see how. The last few lines of the output of "setup.py build" in newscipy on my system (svn revision 1297) are:

gcc -pthread -shared build/temp.linux-i686-2.4/Lib/special/_cephesmodule.o build/temp.linux-i686-2.4/Lib/special/amos_wrappers.o build/temp.linux-i686-2.4/Lib/special/specfun_wrappers.o build/temp.linux-i686-2.4/Lib/special/toms_wrappers.o build/temp.linux-i686-2.4/Lib/special/cdf_wrappers.o build/temp.linux-i686-2.4/Lib/special/ufunc_extras.o -Lbuild/temp.linux-i686-2.4 -lamos -ltoms -lc_misc -lcephes -lmach -lcdf -lspecfun -o build/lib.linux-i686-2.4/scipy/special/_cephes.so
/usr/bin/ld: cannot find -lamos
collect2: ld returned 1 exit status
/usr/bin/ld: cannot find -lamos
collect2: ld returned 1 exit status
error: Command "gcc -pthread -shared build/temp.linux-i686-2.4/Lib/special/_cephesmodule.o build/temp.linux-i686-2.4/Lib/special/amos_wrappers.o build/temp.linux-i686-2.4/Lib/special/specfun_wrappers.o build/temp.linux-i686-2.4/Lib/special/toms_wrappers.o build/temp.linux-i686-2.4/Lib/special/cdf_wrappers.o build/temp.linux-i686-2.4/Lib/special/ufunc_extras.o -Lbuild/temp.linux-i686-2.4 -lamos -ltoms -lc_misc -lcephes -lmach -lcdf -lspecfun -o build/lib.linux-i686-2.4/scipy/special/_cephes.so" failed with exit status 1
removed Lib/__svn_version__.py

because there is no file called libamos.a, only libamos__OF__scipy.special.a.

From stephen.walton at csun.edu Fri Oct 7 01:24:08 2005
From: stephen.walton at csun.edu (Stephen Walton)
Date: Thu, 06 Oct 2005 22:24:08 -0700
Subject: [SciPy-dev] newscipy sandbox
In-Reply-To:
References: <43459500.30504@csun.edu>
Message-ID: <43460678.7030603@csun.edu>

Pearu Peterson wrote:
> The idea behind this strange naming convention (that may change to a
> better one) is that one can build/install pure libraries that will be
> available to different scipy packages.

Maybe I'm being naive here, but wouldn't it be the case that libamos.so, for example, is automatically put in the directory scipy/special, and that it is redundant for the library itself to be named libamos__OF__scipy.special.so?

From arnd.baecker at web.de Fri Oct 7 02:44:01 2005
From: arnd.baecker at web.de (Arnd Baecker)
Date: Fri, 7 Oct 2005 08:44:01 +0200 (CEST)
Subject: [SciPy-dev] `newcore` on opteron
Message-ID:

Hi,

just for the fun of it I wanted to test the thread aspect on our dual opteron, but don't get very far in the installation.

Procedure:

svn checkout http://svn.scipy.org/svn/scipy_core/branches/newcore/
(checked out revision is 1184)
cd newcore
python setup.py install --prefix=${HOME}/NEW_SCIPY/scipy_new

gives:

compiling C sources
gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC'
compile options: '-Ibuild/src/scipy/base/src -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/scr/python/include/python2.4 -c'
gcc: build/src/scipy/base/src/umathmodule.c
In file included from build/src/scipy/base/src/umathmodule.c:7477:
scipy/base/src/ufuncobject.c:95: error:
conflicting types for 'PyUFunc_FF_F_As_DD_D'
build/src/scipy/base/__ufunc_api.h:38: error: previous declaration of 'PyUFunc_FF_F_As_DD_D' was here
scipy/base/src/ufuncobject.c:95: error: conflicting types for 'PyUFunc_FF_F_As_DD_D'
build/src/scipy/base/__ufunc_api.h:38: error: previous declaration of 'PyUFunc_FF_F_As_DD_D' was here
scipy/base/src/ufuncobject.c:113: error: conflicting types for 'PyUFunc_DD_D'
[... many similar ones ...]
scipy/base/src/ufuncobject.c:2047: warning: int format, different type arg (arg 3)
scipy/base/src/ufuncobject.c:2131: warning: passing arg 2 of pointer to function from incompatible pointer type
build/src/scipy/base/src/umathmodule.c: At top level:
build/src/scipy/base/__ufunc_api.h:38: warning: 'PyUFunc_FF_F_As_DD_D' used but never defined
build/src/scipy/base/__ufunc_api.h:40: warning: 'PyUFunc_DD_D' used but never defined
build/src/scipy/base/__ufunc_api.h:42: warning: 'PyUFunc_FF_F' used but never defined
build/src/scipy/base/__ufunc_api.h:44: warning: 'PyUFunc_GG_G' used but never defined
build/src/scipy/base/__ufunc_api.h:46: warning: 'PyUFunc_OO_O' used but never defined
error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC -Ibuild/src/scipy/base/src -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/scr/python/include/python2.4 -c build/src/scipy/base/src/umathmodule.c -o build/temp.linux-x86_64-2.4/build/src/scipy/base/src/umathmodule.o" failed with exit status 1
removed scipy/__svn_version__.py
removed scipy/f2py2e/__svn_version__.py

This is with:

gcc -v
Reading specs from /scr/python/bin/../lib/gcc/x86_64-unknown-linux-gnu/3.4.4/specs
Configured with: ../gcc-3.4.4/configure --prefix=/scr/python/ --enable-shared --enable-threads=posix --enable-__cxa_atexit --enable-clocale=gnu --enable-languages=c,c++,f77,objc
Thread model: posix
gcc version 3.4.4

I don't need this at the moment (the normal scipy does work fine on that machine now), but if it is of any help for `newcore`, I will follow any advice to debug this ;-)...
Best, Arnd P.S.: for `scipy_new`, on a different machine I also get the -lamos error and have to comment out #config.add_subpackage('special') #config.add_subpackage('optimize') in Lib/setup.py From oliphant at ee.byu.edu Fri Oct 7 03:12:22 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 01:12:22 -0600 Subject: [SciPy-dev] `newcore` on opteron In-Reply-To: References: Message-ID: <43461FD6.1020703@ee.byu.edu> Arnd Baecker wrote: >Hi, > >just for the fun of it I wanted to test the thread aspect on >our dual opteron, but don't get very far in the installation: > > Thank you for these results. I'm not sure what is going on. Are you sure you're not picking up headers from somewhere else as well? I would be interested for example to see the file: scipy/base/src/__ufunc_api.h This file is autogenerated, and it could be that something is wrong there. Most likely it is a problem with the header. See below... >compiling C sources >gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall >-Wstrict-prototypes -fPIC' >compile options: '-Ibuild/src/scipy/base/src -Iscipy/base/include >-Ibuild/src/scipy/base -Iscipy/base/src -I/scr/python/include/python2.4 >-c' >gcc: build/src/scipy/base/src/umathmodule.c >In file included from build/src/scipy/base/src/umathmodule.c:7477: >scipy/base/src/ufuncobject.c:95: error: conflicting types for >'PyUFunc_FF_F_As_DD_D' >build/src/scipy/base/__ufunc_api.h:38: error: previous declaration of >'PyUFunc_FF_F_As_DD_D' was here >scipy/base/src/ufuncobject.c:95: error: conflicting types for >'PyUFunc_FF_F_As_DD_D' >build/src/scipy/base/__ufunc_api.h:38: error: previous declaration of >'PyUFunc_FF_F_As_DD_D' was here >scipy/base/src/ufuncobject.c:113: error: conflicting types for >'PyUFunc_DD_D' > >This is with >gcc -v >Reading specs from >/scr/python/bin/../lib/gcc/x86_64-unknown-linux-gnu/3.4.4/specs >Configured with: ../gcc-3.4.4/configure --prefix=/scr/python/ >--enable-shared --enable-threads=posix
--enable-__cxa_atexit >--enable-clocale=gnu --enable-languages=c,c++,f77,objc >Thread model: posix >gcc version 3.4.4 > > >I don't need this at the moment (the normal scipy does work >fine on that machine now), but if it is of any help >for `newcore`, I will follow any advice to debug this ;-)... > > It's also possible that something I did recently messed up the ufuncobject.h header (I was playing with some threading issues --- (non-object-array) ufuncs are supposed to release the GIL to allow other Python threads to proceed. I'll look into that. >Best, > >Arnd > >P.S.: for `scipy_new`, on a different machine I also get the -lamos error >and have to comment out > #config.add_subpackage('special') > #config.add_subpackage('optimize') >in Lib/setup.py > > Yes, I think I got this to work, because I played with some stuff in scipy.distutils and then built around libraries previously-compiled. Pearu is supposed to fix this very soon (I just commented out the addition of the __OF__ +self.name stuff in the add_library method of Configuration class in scipy.distutils.misc_util to get it to work). Thanks for your tests. -Travis > > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > > From oliphant at ee.byu.edu Fri Oct 7 03:15:28 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 01:15:28 -0600 Subject: [SciPy-dev] newscipy sandbox In-Reply-To: <434604FD.6030508@csun.edu> References: <43459500.30504@csun.edu> <4345AE48.1050809@ee.byu.edu> <434604FD.6030508@csun.edu> Message-ID: <43462090.1000108@ee.byu.edu> Stephen Walton wrote: >Travis Oliphant wrote: > > > >>Pearu Peterson wrote: >> >> >> >> >> >>>I am not quite finished with implementing what's behind in add_library >>>method, >>> >>> >>> >>> >>Well, it worked enough to get scipy.special working in current newscipy >>SVN. 
>> >> >> >> >With all due respect, and with knowledge I'm probably doing something >wrong, I don't see how. The last few lines of the output of "setup.py >build" in newscipy on my system (svn revision 1297) are: > > No, you're doing nothing wrong. The add_library method is broken. Basically, I removed the __OF__ stuff from add_library and then compiled (building the needed libraries just fine). When I added it back (it built the funny-named libraries too), the correctly-named ones were still there and could be found. So, it was an accident that I got things to build. The add_library method is still broken. I'll check in my removal to newcore until Pearu fixes things... -Travis From nwagner at mecha.uni-stuttgart.de Fri Oct 7 03:19:39 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 07 Oct 2005 09:19:39 +0200 Subject: [SciPy-dev] Problems with newscipy -lamos -lminpack Message-ID: <4346218B.4000001@mecha.uni-stuttgart.de> Hi all, python setup.py build in newscipy results in /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../../i586-suse-linux/bin/ld: cannot find -lminpack collect2: ld returned 1 exit status /usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../../i586-suse-linux/bin/ld: cannot find -lminpack collect2: ld returned 1 exit status error: Command "gcc -pthread -shared build/temp.linux-i686-2.4/Lib/optimize/_minpackmodule.o -Lbuild/temp.linux-i686-2.4 -lminpack -o build/lib.linux-i686-2.4/scipy/optimize/_minpack.so" failed with exit status 1 removed Lib/__svn_version__.py Nils From rkern at ucsd.edu Fri Oct 7 03:23:29 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 07 Oct 2005 00:23:29 -0700 Subject: [SciPy-dev] Problems with newscipy -lamos -lminpack In-Reply-To: <4346218B.4000001@mecha.uni-stuttgart.de> References: <4346218B.4000001@mecha.uni-stuttgart.de> Message-ID: <43462271.9010704@ucsd.edu> Nils Wagner wrote: > Hi all, > > python setup.py build in newscipy results in > >
/usr/lib/gcc-lib/i586-suse-linux/3.3.3/../../../../i586-suse-linux/bin/ld: > cannot find -lminpack We are aware of this. The current build process is in flux with respect to building libraries. There has been a spate of recent messages to this list about this issue. In general, before reporting a bug, please take the time to read the recent messages on the list. It's fairly likely that someone else has already reported it. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Fri Oct 7 03:40:27 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 01:40:27 -0600 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <434603EF.5090209@ucsd.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> Message-ID: <4346266B.6040405@ee.byu.edu> Robert Kern wrote: >Travis Oliphant wrote: > > >>This is just to let people know that I found and fixed the problem that >>was causing segfaults in vectorize. >> >> > >Excellent! I no longer get segfaults or bus errors on my machine. Thank you. > >On an unrelated note, I would like to propose that we get rid of >scipy.stats in scipy_core. > Hmm. I suppose we could do that. But, it seems wrong to relegate your nice mtrand package to a library only. It makes it seem much more vulgar and low-level than it is (It's a very, very nice contribution to scipy core). It's a minor issue, admittedly, but is there some way we could alias it to something else, so that scipy core only users won't have to say import scipy.lib.mtrand I'd like to have a name on the same level as fftpack, linalg, etc. Perhaps scipy.random?
-Travis From oliphant at ee.byu.edu Fri Oct 7 03:46:52 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 01:46:52 -0600 Subject: [SciPy-dev] Add library temporary fix in newcore In-Reply-To: <434604FD.6030508@csun.edu> References: <43459500.30504@csun.edu> <4345AE48.1050809@ee.byu.edu> <434604FD.6030508@csun.edu> Message-ID: <434627EC.2030207@ee.byu.edu> The svn version of new core contains a temporary fix to add_library that lets (what's converted) in newscipy build on newcore. -Travis From rkern at ucsd.edu Fri Oct 7 04:08:31 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 07 Oct 2005 01:08:31 -0700 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346266B.6040405@ee.byu.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <4346266B.6040405@ee.byu.edu> Message-ID: <43462CFF.6000702@ucsd.edu> Travis Oliphant wrote: > Robert Kern wrote: >>On an unrelated note, I would like to propose that we get rid of >>scipy.stats in scipy_core. > > Hmm. I suppose we could do that. But, it seems wrong to relegate your > nice mtrand package to a library only. It makes it seem much more > vulgar and > low-level than it is (It's a very, very nice contribution to scipy core). > > It's a minor issue, admitedly, but is there someway we could alias it to > something else, > so that scipy core only users won't have to say > > import scipy.lib.mtrand > > I'd like to have a name on the same level as fftpack, linalg, etc. > > Perhaps scipy.random? Works for me. We could, in fact, just move scipy/stats/__init__.py to scipy/random.py if we're allowing "nontrivial" .py files there. I'm really only concerned about freeing up the scipy.stats namespace for the real scipy.stats, not just the base random number stuff. Incidentally, what's the plan for merging the full scipy (and specifically scipy.linalg and scipy.fftpack) with scipy_core? 
-- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From nwagner at mecha.uni-stuttgart.de Fri Oct 7 04:19:05 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 07 Oct 2005 10:19:05 +0200 Subject: [SciPy-dev] Bug in setup_optimize.py Message-ID: <43462F79.7040408@mecha.uni-stuttgart.de> config.add_extension('_cobyla', sources=[join('cobyla',x) for x in ['cobyla.pyf', 'cobyla2.f', 'trstlp,f']]) It should be trstlp.f gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' error: unknown file type '' (from 'cobyla/trstlp,f') Nils From rkern at ucsd.edu Fri Oct 7 04:23:16 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 07 Oct 2005 01:23:16 -0700 Subject: [SciPy-dev] Bug in setup_optimize.py In-Reply-To: <43462F79.7040408@mecha.uni-stuttgart.de> References: <43462F79.7040408@mecha.uni-stuttgart.de> Message-ID: <43463074.20600@ucsd.edu> Nils Wagner wrote: > config.add_extension('_cobyla', > sources=[join('cobyla',x) for x in ['cobyla.pyf', > 'cobyla2.f', > 'trstlp,f']]) > > It should be trstlp.f > > > gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall > -Wstrict-prototypes -fPIC' > error: unknown file type '' (from 'cobyla/trstlp,f') The fix got committed a few minutes ago. :-) -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Fri Oct 7 04:24:29 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 02:24:29 -0600 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <43462CFF.6000702@ucsd.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <4346266B.6040405@ee.byu.edu> <43462CFF.6000702@ucsd.edu> Message-ID: <434630BD.5020808@ee.byu.edu> Robert Kern wrote: >Works for me.
We could, in fact, just move scipy/stats/__init__.py to >scipy/random.py if we're allowing "nontrivial" .py files there. > Good idea. >I'm >really only concerned about freeing up the scipy.stats namespace for the >real scipy.stats, not just the base random number stuff. > > Right, sounds good. >Incidentally, what's the plan for merging the full scipy (and >specifically scipy.linalg and scipy.fftpack) with scipy_core? > > My greatly-thought-out plan ;-) is to just over-write the old with the new. The trouble I suppose is that the interfaces do change (in keywords and stuff) for some of the function calls in scipy's version of these things. So, I'm not sure how to handle that. A better solution might be to have new names for the core sub packages. Perhaps we should call fftpack ffts and linalg linear in the core distribution. Then people would always know what they were calling, instead of my badly thought out current plan. -Travis From oliphant at ee.byu.edu Fri Oct 7 04:25:02 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 02:25:02 -0600 Subject: [SciPy-dev] Bug in setup_optimize.py In-Reply-To: <43462F79.7040408@mecha.uni-stuttgart.de> References: <43462F79.7040408@mecha.uni-stuttgart.de> Message-ID: <434630DE.8040700@ee.byu.edu> Nils Wagner wrote: > config.add_extension('_cobyla', > sources=[join('cobyla',x) for x in ['cobyla.pyf', > 'cobyla2.f', > 'trstlp,f']]) > >It should be trstlp.f > > > > Thanks, checked it in.
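[Editorial aside: the committed fix replaces the comma typo in the last entry ('trstlp,f') with a dot, which is what distutils was rejecting as "unknown file type ''". A sketch of the corrected file list, outside the surrounding Configuration object:]

```python
from os.path import join

# Corrected source list for the _cobyla extension: the last entry was
# 'trstlp,f' (comma instead of dot), so distutils could not infer a
# file type for it. With the dot restored, all three files are valid.
sources = [join('cobyla', x) for x in ['cobyla.pyf', 'cobyla2.f', 'trstlp.f']]
```

In context this list is what gets passed to config.add_extension('_cobyla', sources=...), as shown in the quoted setup script.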
-Travis From arnd.baecker at web.de Fri Oct 7 04:27:44 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 7 Oct 2005 10:27:44 +0200 (CEST) Subject: [SciPy-dev] `newcore` on opteron In-Reply-To: <43461FD6.1020703@ee.byu.edu> References: <43461FD6.1020703@ee.byu.edu> Message-ID: Hi Travis, On Fri, 7 Oct 2005, Travis Oliphant wrote: > Arnd Baecker wrote: > > >Hi, > > > >just for the fun of it I wanted to test the thread aspect on > >our dual opteron, but don't get very far in the installation: > > Thank you for these results. That's the least I can do ;-) > I'm not sure what is going on. Are you sure your not picking up headers > from somewhere else as well? Well, you never know... Is there a way to check this? To be sure I started again (removing all dirs and without any environment variables, which would point to my install of "old scipy", including the gcc 3.4.4). This means that now I am running with gcc -v Using built-in specs. Target: x86_64-suse-linux Configured with: ../configure --enable-threads=posix --prefix=/usr --with-local-prefix=/usr/local --infodir=/usr/share/info --mandir=/usr/share/man --libdir=/usr/lib64 --libexecdir=/usr/lib64 --enable-languages=c,c++,objc,f95,java,ada --enable-checking --with-gxx-include-dir=/usr/include/c++/4.0.2 --enable-java-awt=gtk --disable-libjava-multilib --with-slibdir=/lib64 --with-system-zlib --enable-shared --enable-__cxa_atexit --without-system-libunwind --host=x86_64-suse-linux Thread model: posix gcc version 4.0.2 20050826 (prerelease) (SUSE Linux) > I would be interested for example to see the file: > > scipy/base/src/__ufunc_api.h ./newcore/scipy/base/src/ does not contain this file, but ./newcore/build/src/scipy/base/__ufunc_api.h does [ I send you the file off-list ] > This file is autogenerated, and it could be that something is wrong there. > > Most likely it is a problem with the header. See below... [...] 
> It's also possible that something I did recently messed up the > ufuncobject.h header (I was playing with some threading issues --- > (non-object-array) ufuncs are supposed to release the GIL to allow other > Python threads to proceed. I'll look into that. OK, just let me know if you need any further information or test runs etc. Best, Arnd From nwagner at mecha.uni-stuttgart.de Fri Oct 7 04:35:27 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 07 Oct 2005 10:35:27 +0200 Subject: [SciPy-dev] ImportError: cannot import name Inf Message-ID: <4346334F.5030004@mecha.uni-stuttgart.de> Now python setup.py install works fine, but >>> from scipy import optimize Traceback (most recent call last): File "", line 1, in ? File "/usr/local/lib/python2.4/site-packages/scipy/optimize/__init__.py", line 7, in ? from optimize import * File "/usr/local/lib/python2.4/site-packages/scipy/optimize/optimize.py", line 58, in ? from scipy.base import atleast_1d, eye, mgrid, argmin, zeros, shape, \ ImportError: cannot import name Inf Nils From oliphant at ee.byu.edu Fri Oct 7 04:44:10 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 02:44:10 -0600 Subject: [SciPy-dev] ImportError: cannot import name Inf In-Reply-To: <4346334F.5030004@mecha.uni-stuttgart.de> References: <4346334F.5030004@mecha.uni-stuttgart.de> Message-ID: <4346355A.70306@ee.byu.edu> Nils Wagner wrote: >Now python setup.py install works fine, but > > > >>>>from scipy import optimize >>>> >>>> >Traceback (most recent call last): > File "", line 1, in ? > File >"/usr/local/lib/python2.4/site-packages/scipy/optimize/__init__.py", >line 7, in ? > from optimize import * > File >"/usr/local/lib/python2.4/site-packages/scipy/optimize/optimize.py", >line 58, in ? > from scipy.base import atleast_1d, eye, mgrid, argmin, zeros, shape, \ >ImportError: cannot import name Inf > > > > I haven't gone through and checked the imports yet. 
If you can summarize more problems, that would be great. There will probably be lots of them. (I need to make changes to the core as they are found, too.) Perhaps you should wait until things are more stable. -Travis From oliphant at ee.byu.edu Fri Oct 7 04:56:45 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 02:56:45 -0600 Subject: [SciPy-dev] `newcore` on opteron In-Reply-To: References: <43461FD6.1020703@ee.byu.edu> Message-ID: <4346384D.2080900@ee.byu.edu> Arnd Baecker wrote: >Hi Travis, > >On Fri, 7 Oct 2005, Travis Oliphant wrote: > > > >>Arnd Baecker wrote: >> >> >> >>>Hi, >>> >>>just for the fun of it I wanted to test the thread aspect on >>>our dual opteron, but don't get very far in the installation: >>> >>> So, just to be clear. I'm very interested in your compilation results. Please send me the logs and the config.h file of any build you do on a 64-bit system. There may be more bugs like the ones you've uncovered lurking unbeknownst to us who have intp == int. -Travis From arnd.baecker at web.de Fri Oct 7 04:58:33 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 7 Oct 2005 10:58:33 +0200 (CEST) Subject: [SciPy-dev] `newcore` on opteron In-Reply-To: <434635DC.6040108@ee.byu.edu> References: <434635DC.6040108@ee.byu.edu> Message-ID: On Fri, 7 Oct 2005, Travis Oliphant wrote: [...] > I should have also asked for the config.h > file. You can post it to the list. It's found in the same directory as > the other generated include files.
more config.h /* #define SIZEOF_SHORT 2 */ /* #define SIZEOF_INT 4 */ /* #define SIZEOF_LONG 8 */ /* #define SIZEOF_FLOAT 4 */ /* #define SIZEOF_DOUBLE 8 */ #define SIZEOF_LONG_DOUBLE 16 #define SIZEOF_PY_INTPTR_T 8 /* #define SIZEOF_LONG_LONG 8 */ #define SIZEOF_PY_LONG_LONG 8 /* #define CHAR_BIT 8 */ #define MATHLIB m #define HAVE_LONGDOUBLE_FUNCS #define HAVE_FLOAT_FUNCS #define HAVE_INVERSE_HYPERBOLIC #define HAVE_INVERSE_HYPERBOLIC_FLOAT #define HAVE_ISNAN > I think there is an unanticipated problem on this 64-bit system. So, > this is a good test... Excellent ;-) Best, Arnd From arnd.baecker at web.de Fri Oct 7 06:51:05 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 7 Oct 2005 12:51:05 +0200 (CEST) Subject: [SciPy-dev] `newcore` on opteron In-Reply-To: <4346384D.2080900@ee.byu.edu> References: <4346384D.2080900@ee.byu.edu> Message-ID: On Fri, 7 Oct 2005, Travis Oliphant wrote: [...] > There may be more bugs like the ones you've uncovered lurking > unbeknownest to us who have intp == int Looks already much better now: - I did an `svn update` - it installs without error - import scipy works - scipy.test(1) hangs: >>> import scipy >>> scipy.__version__ '0.4.2' >>> scipy.test(1,verbosity=10) !! No test file 'test_mtrand.py' found for !! No test file 'test_scipy.py' found for !! No test file 'test_scimath.py' found for !! No test file 'test_testing.py' found for !! No test file 'test_umath.py' found for Found 23 tests for scipy.base.function_base !! No test file 'test_machar.py' found for !! No test file 'test_test.py' found for !! No test file 'test_ma.py' found for !! No test file 'test_fftpack.py' found for !! No test file 'test_numerictypes.py' found for !! No test file 'test_random.py' found for !! No test file 'test_core_version.py' found for !! No test file 'test_getlimits.py' found for Found 9 tests for scipy.base.twodim_base !! No test file 'test__compiled_base.py' found for !! No test file 'test_info_scipy_base.py' found for !! 
No test file 'test_ufunclike.py' found for !! No test file 'test_info_scipy_test.py' found for !! No test file 'test_linalg.py' found for !! No test file 'test_scipy_test_version.py' found for !! No test file 'test_base.py' found for !! No test file 'test_convertcode.py' found for !! No test file 'test_helper.py' found for !! No test file 'test_lib.py' found for !! No test file 'test_multiarray.py' found for !! No test file 'test_matrix.py' found for !! No test file 'test_oldnumeric.py' found for Found 44 tests for scipy.base.shape_base !! No test file 'test_numeric.py' found for !! No test file 'test_fftpack_lite.py' found for !! No test file 'test_lapack_lite.py' found for !! No test file 'test_polynomial.py' found for Found 42 tests for scipy.base.type_check !! No test file 'test_arrayprint.py' found for !! No test file 'test_basic_lite.py' found for !! No test file 'test_fft_lite.py' found for Found 4 tests for scipy.base.index_tricks Found 0 tests for __main__ check_basic (scipy.base.function_base.test_function_base.test_all) ... ok check_nd (scipy.base.function_base.test_function_base.test_all) ... ok check_basic (scipy.base.function_base.test_function_base.test_amax) ... ok check_basic (scipy.base.function_base.test_function_base.test_amin) ... ok check_basic (scipy.base.function_base.test_function_base.test_angle) ... (I killed the python when it went beyond the 8GB memory consumption...) 
Umpf, I just did another `svn update` and now it does not build: In file included from scipy/base/src/multiarraymodule.c:44: scipy/base/src/arrayobject.c: In function array_frominterface: scipy/base/src/arrayobject.c:5178: warning: passing argument 3 of PyArray_New from incompatible pointer type scipy/base/src/arrayobject.c: In function iter_subscript_int: scipy/base/src/arrayobject.c:5746: warning: format %d expects type int, but argument 3 has type intp scipy/base/src/arrayobject.c:5746: warning: format %d expects type int, but argument 4 has type intp scipy/base/src/arrayobject.c: In function iter_ass_sub_int: scipy/base/src/arrayobject.c:5950: warning: format %d expects type int, but argument 3 has type intp scipy/base/src/arrayobject.c:5950: warning: format %d expects type int, but argument 4 has type intp scipy/base/src/arrayobject.c: In function PyArray_MapIterBind: scipy/base/src/arrayobject.c:6607: warning: format %d expects type int, but argument 3 has type long int scipy/base/src/arrayobject.c:6607: warning: format %d expects type int, but argument 4 has type intp error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -O2 -fmessage-length=0 -Wall -D_FORTIFY_SOURCE=2 -g -fPIC -Ibuild/src/scipy/base/src -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/usr/include/python2.4 -c scipy/base/src/multiarraymodule.c -o build/temp.linux-x86_64-2.4/scipy/base/src/multiarraymodule.o" failed with exit status 1 removed scipy/__svn_version__.py removed scipy/f2py2e/__svn_version__.py Am I catching you in the middle of changes and should I better wait? (or should I send you config files etc. ?) Best, Arnd From nwagner at mecha.uni-stuttgart.de Fri Oct 7 07:23:00 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 07 Oct 2005 13:23:00 +0200 Subject: [SciPy-dev] optimize.py Message-ID: <43465A94.9080405@mecha.uni-stuttgart.de> >>> from scipy import optimize Traceback (most recent call last): File "", line 1, in ? 
File "/usr/local/lib/python2.4/site-packages/scipy/optimize/__init__.py", line 7, in ? from optimize import * File "/usr/local/lib/python2.4/site-packages/scipy/optimize/optimize.py", line 66, in ? max = MLab.max AttributeError: 'module' object has no attribute 'max' Nils From rkern at ucsd.edu Fri Oct 7 08:11:47 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 07 Oct 2005 05:11:47 -0700 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <434630BD.5020808@ee.byu.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <4346266B.6040405@ee.byu.edu> <43462CFF.6000702@ucsd.edu> <434630BD.5020808@ee.byu.edu> Message-ID: <43466603.1060004@ucsd.edu> Travis Oliphant wrote: > Robert Kern wrote: >>Incidentally, what's the plan for merging the full scipy (and >>specifically scipy.linalg and scipy.fftpack) with scipy_core? > > My greatly-thought-out plan ;-) is to just over-write the old with the new. > > The trouble I suppose is that the interfaces do change (in keywords and > stuff) for some of the function calls in scipy's version of these things. > > So, I'm not sure how to handle that. A better solution might be to have > new names for the core sub packages. Perhaps we should call fftpack > ffts and linalg linear in the core distribution. > > Then people would always know what they were caling, instead of my badly > thought out current plan. I *would* prefer that the modules under scipy_core be separate from the modules in scipy proper. At the moment, I have scipy_core and scipy built as separate eggs with proper namespace package support. [site-packages]$ ls -d scipy*.egg scipy-0.4.2_1307-py2.4-macosx-10.4-ppc.egg scipy_complete-0.3.3_309.4626-py2.4-macosx-10.4-ppc.egg scipy_core-0.4.2-py2.4-macosx-10.4-ppc.egg (Ignore scipy_complete, that's the old scipy. Thanks to eggs, I can easily switch back and forth between old and new scipy.) Both scipy-0.4.2*.egg and scipy_core-0.4.2*.egg are in my PYTHONPATH. 
By installing the setuptools importhooks: try: import pkg_resources except ImportError: pass to the scipy/__init__.py files of both eggs, both eggs are recognized as providing parts of the package "scipy". However, the system won't handle multiple definitions of subpackages. I think it would be quite nice to be able to distribute scipy_linalg, scipy_optimize, scipy_stats, etc. as eggs that may depend on each other. As an aside, I've ported my bindings to ODRPACK to use scipy_core. It took about five minutes (most of which was spent simply modernizing the old code rather than, strictly speaking, porting it). It's now sitting in scipy.sandbox.odr and it works! Color me impressed. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From nwagner at mecha.uni-stuttgart.de Fri Oct 7 09:13:16 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 07 Oct 2005 15:13:16 +0200 Subject: [SciPy-dev] segmentation fault optimize.fmin_cg Message-ID: <4346746C.5070404@mecha.uni-stuttgart.de> Hi all, Running the attached test results in a segmentation fault (gdb) run test_optimize.py Starting program: /usr/local/bin/python test_optimize.py [Thread debugging using libthread_db enabled] [New Thread 1076102528 (LWP 4283)] Program received signal SIGSEGV, Segmentation fault. [Switching to Thread 1076102528 (LWP 4283)] 0x405b06ef in array_from_pyobj (type_num=12, dims=Variable "dims" is not available. ) at build/src/fortranobject.c:651 651 if (PyArray_Check(obj)) { /* here we have always intent(in) or (gdb) bt #0 0x405b06ef in array_from_pyobj (type_num=12, dims=Variable "dims" is not available. ) at build/src/fortranobject.c:651 #1 0x405af112 in f2py_rout_minpack2_dcsrch (capi_self=0x40242d10, capi_args=0x40420974, capi_keywds=0x0, f2py_func=0x405b1340 ) at build/src/Lib/optimize/minpack2/minpack2module.c:309 #2 0x405afa72 in fortran_call (fp=Variable "fp" is not available. 
) at build/src/fortranobject.c:277 #3 0x0805928e in PyObject_Call (func=0x40242d10, arg=0x40420974, kw=0x0) at Objects/abstract.c:1746 #4 0x080c408d in PyEval_EvalFrame (f=0x823f01c) at Python/ceval.c:3755 #5 0x080c7a54 in PyEval_EvalCodeEx (co=0x4041df20, globals=0x4042835c, locals=0x0, args=0x823efe4, argcount=7, kws=0x823f000, kwcount=1, defs=0x4041e790, defcount=4, closure=0x0) at Python/ceval.c:2730 #6 0x080c52a8 in PyEval_EvalFrame (f=0x823ee0c) at Python/ceval.c:3639 #7 0x080c7a54 in PyEval_EvalCodeEx (co=0x4041d7e0, globals=0x40416e84, locals=0x0, args=0x8196eb4, argcount=2, kws=0x8196ebc, kwcount=0, defs=0x404270b8, defcount=9, closure=0x0) at Python/ceval.c:2730 #8 0x080c52a8 in PyEval_EvalFrame (f=0x8196d64) at Python/ceval.c:3639 #9 0x080c7a54 in PyEval_EvalCodeEx (co=0x4028a5e0, globals=0x40259824, locals=0x40259824, args=0x0, argcount=0, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2730 #10 0x080c7c85 in PyEval_EvalCode (co=0x4028a5e0, globals=0x40259824, locals=0x40259824) at Python/ceval.c:484 #11 0x080f63b8 in PyRun_SimpleFileExFlags (fp=0x815e008, filename=0xbfffec12 "test_optimize.py", closeit=1, flags=0xbfffe904) at Python/pythonrun.c:1264 #12 0x08055857 in Py_Main (argc=1, argv=0xbfffe9c4) at Modules/main.c:484 #13 0x08054f07 in main (argc=2, argv=0xbfffe9c4) at Modules/python.c:23 (gdb) Nils -------------- next part -------------- A non-text attachment was scrubbed... 
Name: test_optimize.py Type: text/x-python Size: 859 bytes Desc: not available URL: From cimrman3 at ntc.zcu.cz Fri Oct 7 10:46:49 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Fri, 07 Oct 2005 16:46:49 +0200 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <433D690D.7060008@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> Message-ID: <43468A59.9000503@ntc.zcu.cz> Travis Oliphant wrote: > Robert Cimrman wrote: > >> Hi all, >> >> I would like to know what is the status of the sparse matrix support >> in SciPy. Searching the scipy- mailing lists did not reveal anything >> recent. Is there some code to work on? I would like to help - though I >> am a newbie to the Scipy world, I do have experience with sparse >> matrices. >> >> If there is nothing yet, a quick possibility would be to wrap a >> library like sparskit2 and give it a pythonic feel. What are your >> opinions? >> > > There is a sparse matrix in SciPy now: under Lib/sparse. I've > implemented about 3 or 4 sparse matrices over the years, using various > libraries as back-ends. Right now, the sparse matrix code builds on top > of superlu and some home-grown calculation tools. > I'd love some help with this. So, take a look at what's there and see > how you can help. > > -Travis I have tried to send some suggestions directly to Travis to know his opinion before posting it to the list, but some internet-born monster ate them all, so let me summarize it here: I am just redesigning a finite element code (some example results obtained by colleagues are at http://www-rocq.inria.fr/MACS/Coeur/index.html) written in matlab+c for python+c, so the proposals below scratch mainly my itches, but they might be of a more general interest. 1. add the umfpack sparse solver 2.
I have at hand several convenient and fast routines for CSR format in the context of assembling finite element matrices, namely - preallocate the exact storage of the global CSR matrix corresponding to a given finite element mesh (i.e. create a neighbour graph for the nodes in the mesh). - insert/add a full or sparse block (the element "stiffness" matrix) into a preallocated CSR matrix (the global matrix) - ... I will pythonize the code in all cases, but do you think it would be useful to have it in scipy? Well, and I am willing to do some other work too, but let me first reduce my level of scipy-newbieness... Robert Cimrman From stephen.walton at csun.edu Fri Oct 7 12:00:51 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Fri, 07 Oct 2005 09:00:51 -0700 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <43466603.1060004@ucsd.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <4346266B.6040405@ee.byu.edu> <43462CFF.6000702@ucsd.edu> <434630BD.5020808@ee.byu.edu> <43466603.1060004@ucsd.edu> Message-ID: <43469BB3.1090301@csun.edu> Robert Kern wrote: >I *would* prefer that the modules under scipy_core be separate from the >modules in scipy proper. > I am a bit muddled by this thread. Right now, scipy.stats is not in scipy_core, but in scipy proper (and newscipy). Robert had said earlier he wanted scipy.stats removed from scipy_core; does he mean that scipy.stats was formerly intended to move from scipy to scipy_core, and he no longer wants this to happen? At this point I want to put in a good word for scipy.stats.ppcc_plot, ppcc_max, and their cousins. What is the consensus about which packages from scipy move to the core? I take it that Perhaps the main cause of my muddle-ment is whether we're talking about scipy_core and scipy SVN trunks or the newscipy and newcore branches at any given moment. I see a bunch of new stuff in the newscipy sandbox now. 
>At the moment, I have scipy_core and scipy >built as separate eggs with proper namespace package support. > > Eggs sounds like a really nice technology. How do we feel about making it the default way to distribute prebuilt scipy binaries? From pwang at enthought.com Fri Oct 7 12:23:17 2005 From: pwang at enthought.com (Peter Wang) Date: Fri, 07 Oct 2005 11:23:17 -0500 Subject: [SciPy-dev] numarray weave commit? weave maintainer? In-Reply-To: <1126105689.2353.19.camel@halloween.stsci.edu> References: <1126105689.2353.19.camel@halloween.stsci.edu> Message-ID: <4346A0F5.6030206@enthought.com> Todd, About a month ago you checked in a patch which patched some of the files in weave/tests: > I have a patch for weave for numarray which passes all of the default > self-tests. In doing my final checkout, I noticed that > weave.test(level=10) has some errors and eventually segfaults... for > Numeric. This was on RHEL3 with Python-2.4.1. You changed test_ast_tools.py to import RandomArray from scipy_base.numeric but this module does not exist... was this just a typo, or is there something more subtle going on? (In test_blitz_tools you use scipy_base.numerix.) thanks, peter From rkern at ucsd.edu Fri Oct 7 12:33:17 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 07 Oct 2005 09:33:17 -0700 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <43469BB3.1090301@csun.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <4346266B.6040405@ee.byu.edu> <43462CFF.6000702@ucsd.edu> <434630BD.5020808@ee.byu.edu> <43466603.1060004@ucsd.edu> <43469BB3.1090301@csun.edu> Message-ID: <4346A34D.9010605@ucsd.edu> Stephen Walton wrote: > Robert Kern wrote: > >>I *would* prefer that the modules under scipy_core be separate from the >>modules in scipy proper. > > I am a bit muddled by this thread. Right now, scipy.stats is not in > scipy_core, but in scipy proper (and newscipy). 
Robert had said earlier > he wanted scipy.stats removed from scipy_core; does he mean that > scipy.stats was formerly intended to move from scipy to scipy_core, and > he no longer wants this to happen? At this point I want to put in a > good word for scipy.stats.ppcc_plot, ppcc_max, and their cousins.

There was a scipy.stats in scipy_core up until last night. Before I checked in mtrand, it contained a fair bit of Python code to expand the interface provided by the ranlib extension module. It did not have all of the distributions and statistics functions that were in the old scipy.stats. Travis's original plan was for the subpackages in scipy proper (which would be much-expanded from what is in scipy_core) to simply overwrite the ones in scipy_core when they get installed. For a number of reasons, I don't think this is going to work. Now that I've checked in mtrand, there really weren't any more interface issues, so scipy/stats/__init__.py in scipy_core became really tiny. Since having scipy.stats in both scipy_core and scipy proper (by which I mean "what's currently in the newscipy branch") is problematic, we've removed scipy.stats from scipy_core, and all of the random number generators are available from scipy.random. scipy.stats.ppcc_plot and all of the rest will be available with the complete scipy, worry not.

> What is the consensus about which packages from scipy move to the core? > I take it that

Probably no more packages. Perhaps a few functions may find their way over. That linalg, fftpack, and random are in scipy_core at all probably has more to do with the desire to provide just as much capability as Numeric than anything else.

> Perhaps the main cause of my muddle-ment is whether we're talking about > scipy_core and scipy SVN trunks or the newscipy and newcore branches at > any given moment. I see a bunch of new stuff in the newscipy sandbox now.

Yes, I can see where that would be confusing. What's in the trunks are currently dead ends.
They will be replaced by the branches reasonably soon. FWIW, when I talk about scipy_core and scipy proper, I mean what are currently in the newcore and newscipy branches, respectively. If I'm talking about the dead ends, I'll usually say "old scipy" or something like that. scipy is dead! Long live scipy! >>At the moment, I have scipy_core and scipy >>built as separate eggs with proper namespace package support. > > Eggs sounds like a really nice technology. How do we feel about making > it the default way to distribute prebuilt scipy binaries? I would love for that to be the case. They probably *will* be the way I'm distributing binaries for OS X. There are still issues with EasyInstall and non-root installs on UNIX-type systems with anal sysadmins/distros. The problem has been getting quite a bit of attention recently, so I'm no longer the only one badgering Phillip Eby about it. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From jmiller at stsci.edu Fri Oct 7 12:45:50 2005 From: jmiller at stsci.edu (Todd Miller) Date: Fri, 07 Oct 2005 12:45:50 -0400 Subject: [SciPy-dev] numarray weave commit? weave maintainer? In-Reply-To: <4346A0F5.6030206@enthought.com> References: <1126105689.2353.19.camel@halloween.stsci.edu> <4346A0F5.6030206@enthought.com> Message-ID: <4346A63E.3070608@stsci.edu> Peter Wang wrote: >Todd, > >About a month ago you checked in a patch which patched some of the files in >weave/tests: > > > >>I have a patch for weave for numarray which passes all of the default >>self-tests. In doing my final checkout, I noticed that >>weave.test(level=10) has some errors and eventually segfaults... for >>Numeric. This was on RHEL3 with Python-2.4.1. >> >> > >You changed test_ast_tools.py to import RandomArray from scipy_base.numeric >but this module does not exist... was this just a typo, or is there something >more subtle going on? 
(In test_blitz_tools you use scipy_base.numerix.)

That's a typo... please correct it to "scipy_base.numerix".

Thanks, Todd

From oliphant at ee.byu.edu Fri Oct 7 13:08:17 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 11:08:17 -0600 Subject: [SciPy-dev] `newcore` on opteron In-Reply-To: References: <4346384D.2080900@ee.byu.edu> Message-ID: <4346AB81.1060104@ee.byu.edu>

Arnd Baecker wrote:
>(I killed the python when it went beyond the 8GB memory consumption...)
>
>Umpf, I just did another `svn update` and now it does not build:
>
>In file included from scipy/base/src/multiarraymodule.c:44:
>scipy/base/src/arrayobject.c: In function array_frominterface:
>scipy/base/src/arrayobject.c:5178: warning: passing argument 3 of PyArray_New from incompatible pointer type
>scipy/base/src/arrayobject.c: In function iter_subscript_int:
>scipy/base/src/arrayobject.c:5746: warning: format %d expects type int, but argument 3 has type intp
>scipy/base/src/arrayobject.c:5746: warning: format %d expects type int, but argument 4 has type intp
>scipy/base/src/arrayobject.c: In function iter_ass_sub_int:
>scipy/base/src/arrayobject.c:5950: warning: format %d expects type int, but argument 3 has type intp
>scipy/base/src/arrayobject.c:5950: warning: format %d expects type int, but argument 4 has type intp
>scipy/base/src/arrayobject.c: In function PyArray_MapIterBind:
>scipy/base/src/arrayobject.c:6607: warning: format %d expects type int, but argument 3 has type long int
>scipy/base/src/arrayobject.c:6607: warning: format %d expects type int, but argument 4 has type intp
>error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -O2 -fmessage-length=0 -Wall -D_FORTIFY_SOURCE=2 -g -fPIC -Ibuild/src/scipy/base/src -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/usr/include/python2.4 -c scipy/base/src/multiarraymodule.c -o build/temp.linux-x86_64-2.4/scipy/base/src/multiarraymodule.o" failed with exit status 1
>removed scipy/__svn_version__.py
>removed scipy/f2py2e/__svn_version__.py
>
>Am I catching you in the middle of changes and should I better wait?
>(or should I send you config files etc. ?)

Yes, but your tests are invaluable so keep trying... I don't see the error, just these warnings.. What caused the problem? There was a problem on 5178...

>Best,
>
>Arnd
>
>_______________________________________________
>Scipy-dev mailing list
>Scipy-dev at scipy.net
>http://www.scipy.net/mailman/listinfo/scipy-dev

From oliphant at ee.byu.edu Fri Oct 7 13:13:28 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 11:13:28 -0600 Subject: [SciPy-dev] `newcore` on opteron In-Reply-To: References: <4346384D.2080900@ee.byu.edu> Message-ID: <4346ACB8.2030001@ee.byu.edu>

Arnd Baecker wrote:
>Am I catching you in the middle of changes and should I better wait?
>(or should I send you config files etc. ?)

Did I mention that you should keep trying. Your build error logs are the best thing...

-Travis

From Fernando.Perez at colorado.edu Fri Oct 7 13:54:37 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 07 Oct 2005 11:54:37 -0600 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <43468A59.9000503@ntc.zcu.cz> References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <43468A59.9000503@ntc.zcu.cz> Message-ID: <4346B65D.3070604@colorado.edu>

> Travis Oliphant wrote:
>>There is a sparse matrix in SciPy now: under Lib/sparse. I've
>>implemented about 3 or 4 sparse matrices over the years, using various
>>libraries as back-ends. Right now, the sparse matrix code builds on top
>>of superlu and some home-grown calculation tools.
>>I'd love some help with this. So, take a look at what's there and see
>>how you can help.

Just a note on this topic: an acquaintance at CU Boulder recently tried using scipy's Lib/sparse and ran some comparisons against pysparse (http://pysparse.sourceforge.net).
He found the pysparse performance to be a lot better, and ended up using that. Pysparse uses also SuperLU, so I'm not exactly sure what the root cause of the discrepancy may be, and he didn't track it down further (he just switched over to using pysparse and moved on). I simply mention this thinking that it might be worth, now that scipy is going through a cleanup phase for the long haul, to contact the pysparse team and offer them to merge their work into the core scipy(new). I haven't personally looked at both code bases, but I'd guess that a merge of the best from both would be possible and beneficial to everyone in the long run. Not being the one who is going to do this particular work, all I can do is drop the info here. Regards, f From Fernando.Perez at colorado.edu Fri Oct 7 14:33:53 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 07 Oct 2005 12:33:53 -0600 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <434630BD.5020808@ee.byu.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <4346266B.6040405@ee.byu.edu> <43462CFF.6000702@ucsd.edu> <434630BD.5020808@ee.byu.edu> Message-ID: <4346BF91.7080401@colorado.edu> Travis Oliphant wrote: > So, I'm not sure how to handle that. A better solution might be to have > new names for the core sub packages. Perhaps we should call fftpack > ffts and linalg linear in the core distribution. -1 on 'linear' for being too generic: linear algebra, linear filters, linear ODEs, linear... 
Cheers, f

From oliphant at ee.byu.edu Fri Oct 7 14:43:19 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 12:43:19 -0600 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <4346B65D.3070604@colorado.edu> References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <43468A59.9000503@ntc.zcu.cz> <4346B65D.3070604@colorado.edu> Message-ID: <4346C1C7.605@ee.byu.edu>

Fernando Perez wrote:
>>Travis Oliphant wrote:
>>>There is a sparse matrix in SciPy now: under Lib/sparse. I've
>>>implemented about 3 or 4 sparse matrices over the years, using various
>>>libraries as back-ends. Right now, the sparse matrix code builds on top
>>>of superlu and some home-grown calculation tools.
>>>I'd love some help with this. So, take a look at what's there and see
>>>how you can help.
>
>Just a note on this topic: an acquaintance at CU Boulder recently tried using
>scipy's Lib/sparse and ran some comparisons against pysparse
>(http://pysparse.sourceforge.net). He found the pysparse performance to be a
>lot better, and ended up using that. Pysparse uses also SuperLU, so I'm not
>exactly sure what the root cause of the discrepancy may be, and he didn't
>track it down further (he just switched over to using pysparse and moved on).

I know that pysparse defines some C-types. But there may be other issues as well. I'd love to merge pysparse stuff in. Robert Cimrman has shown interest in sparse matrices. Perhaps he could help.
-Travis From guyer at nist.gov Fri Oct 7 14:53:49 2005 From: guyer at nist.gov (Jonathan Guyer) Date: Fri, 7 Oct 2005 14:53:49 -0400 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <4346B65D.3070604@colorado.edu> References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <43468A59.9000503@ntc.zcu.cz> <4346B65D.3070604@colorado.edu> Message-ID: On Oct 7, 2005, at 1:54 PM, Fernando Perez wrote: > Pysparse uses also SuperLU, so I'm not > exactly sure what the root cause of the discrepancy may be, and he > didn't > track it down further (he just switched over to using pysparse and > moved on). pysparse supports a number of different solvers (CGS, PCG, GMRES, JOR), not just LU. Most of them are much faster than LU, but often not as accurate. Or was he comparing LU to LU? From Fernando.Perez at colorado.edu Fri Oct 7 15:01:44 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 07 Oct 2005 13:01:44 -0600 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <43468A59.9000503@ntc.zcu.cz> <4346B65D.3070604@colorado.edu> Message-ID: <4346C618.2040403@colorado.edu> Jonathan Guyer wrote: > On Oct 7, 2005, at 1:54 PM, Fernando Perez wrote: > > >>Pysparse uses also SuperLU, so I'm not >>exactly sure what the root cause of the discrepancy may be, and he >>didn't >>track it down further (he just switched over to using pysparse and >>moved on). > > > pysparse supports a number of different solvers (CGS, PCG, GMRES, JOR), > not just LU. Most of them are much faster than LU, but often not as > accurate. > > Or was he comparing LU to LU? Dunno. He did some quick tests, found better performance in pysparse and just moved on (numerics is not directly his field, so he just needed something that worked quickly, and once he found it he didn't look back). 
But since that is a normal attitude in most users (myself included), and it does seem that pysparse offers extra capabilities beyond what scipy.sparse has, it would be nice (I think) to fold that into the new shiny scipy. Cheers, f From rkern at ucsd.edu Fri Oct 7 15:09:56 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 07 Oct 2005 12:09:56 -0700 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346BF91.7080401@colorado.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <4346266B.6040405@ee.byu.edu> <43462CFF.6000702@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> Message-ID: <4346C804.2050104@ucsd.edu> Fernando Perez wrote: > Travis Oliphant wrote: > >>So, I'm not sure how to handle that. A better solution might be to have >>new names for the core sub packages. Perhaps we should call fftpack >>ffts and linalg linear in the core distribution. > > -1 on 'linear' for being too generic: linear algebra, linear filters, linear > ODEs, linear... corefft, corelinalg? So as to beat the user over the head with the fact that this stuff is just the basics; the good stuff is in the full scipy. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Fri Oct 7 14:14:28 2005 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 7 Oct 2005 13:14:28 -0500 (CDT) Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346C804.2050104@ucsd.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> Message-ID: On Fri, 7 Oct 2005, Robert Kern wrote: > Fernando Perez wrote: >> Travis Oliphant wrote: >> >>> So, I'm not sure how to handle that. A better solution might be to have >>> new names for the core sub packages. Perhaps we should call fftpack >>> ffts and linalg linear in the core distribution. 
>> >> -1 on 'linear' for being too generic: linear algebra, linear filters, linear >> ODEs, linear... > > corefft, corelinalg? So as to beat the user over the head with the fact > that this stuff is just the basics; the good stuff is in the full scipy. core.fft, core.linalg? So that scipy tree will be cleaner. Pearu From travis at enthought.com Fri Oct 7 15:16:43 2005 From: travis at enthought.com (Travis N. Vaught) Date: Fri, 07 Oct 2005 14:16:43 -0500 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346C804.2050104@ucsd.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <4346266B.6040405@ee.byu.edu> <43462CFF.6000702@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> Message-ID: <4346C99B.5050302@enthought.com> Robert Kern wrote: >Fernando Perez wrote: > > >>Travis Oliphant wrote: >> >> >> >>>So, I'm not sure how to handle that. A better solution might be to have >>>new names for the core sub packages. Perhaps we should call fftpack >>>ffts and linalg linear in the core distribution. >>> >>> >>-1 on 'linear' for being too generic: linear algebra, linear filters, linear >>ODEs, linear... >> >> > >corefft, corelinalg? So as to beat the user over the head with the fact >that this stuff is just the basics; the good stuff is in the full scipy. > > > Or perhaps simplefft, simplelinalg? -- ........................ Travis N. Vaught CEO Enthought, Inc. http://www.enthought.com ........................ From rkern at ucsd.edu Fri Oct 7 15:38:40 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 07 Oct 2005 12:38:40 -0700 Subject: [SciPy-dev] More bugs fixed In-Reply-To: References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> Message-ID: <4346CEC0.4030906@ucsd.edu> Pearu Peterson wrote: > > On Fri, 7 Oct 2005, Robert Kern wrote: >>corefft, corelinalg? 
So as to beat the user over the head with the fact >>that this stuff is just the basics; the good stuff is in the full scipy. > > core.fft, core.linalg? So that scipy tree will be cleaner. I think the presence of a scipy.core (which definitely isn't synonymous with scipy_core) might further confuse our already-belabored constellation of similar names floating around. But I like stuffing them into one subpackage. Since the extensions live in scipy.lib, you could just copy the Python files from scipy.fftpack and scipy.linalg over to the new directory, rename the __init__.py's to fft.py and linalg.py, and add an empty __init__.py. scipy.simple? scipy.compat? scipy.oh_go_ahead_just_install_the_full_package_already_you_know_you_want_to? -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Fri Oct 7 14:45:05 2005 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 7 Oct 2005 13:45:05 -0500 (CDT) Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346CEC0.4030906@ucsd.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> Message-ID: On Fri, 7 Oct 2005, Robert Kern wrote: > Pearu Peterson wrote: >> >> On Fri, 7 Oct 2005, Robert Kern wrote: > >>> corefft, corelinalg? So as to beat the user over the head with the fact >>> that this stuff is just the basics; the good stuff is in the full scipy. >> >> core.fft, core.linalg? So that scipy tree will be cleaner. > > I think the presence of a scipy.core (which definitely isn't synonymous > with scipy_core) might further confuse our already-belabored > constellation of similar names floating around. But I like stuffing them > into one subpackage. 
Since the extensions live in scipy.lib, you could > just copy the Python files from scipy.fftpack and scipy.linalg over to > the new directory, rename the __init__.py's to fft.py and linalg.py, and > add an empty __init__.py. > > scipy.simple?

Not sure that an adjective like `simple` is appropriate, it's a subjective word. But then again what do I know about English..

> scipy.compat?
>
> scipy.oh_go_ahead_just_install_the_full_package_already_you_know_you_want_to?

scipy.basic?

Pearu

From oliphant at ee.byu.edu Fri Oct 7 15:49:26 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 13:49:26 -0600 Subject: [SciPy-dev] More bugs fixed In-Reply-To: References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> Message-ID: <4346D146.6060400@ee.byu.edu>

Pearu Peterson wrote: >>scipy.oh_go_ahead_just_install_the_full_package_already_you_know_you_want_to? >> >scipy.basic?

I like scipy.basic. So, how about

scipy.basic.fft
scipy.basic.linalg
scipy.basic.random

as all python modules under the basic package. I have to admit that scipy.oh_go_ahead_just_install_the_full_package_already_you_know_you_want_to is a close second.

From guyer at nist.gov Fri Oct 7 15:54:41 2005 From: guyer at nist.gov (Jonathan Guyer) Date: Fri, 7 Oct 2005 15:54:41 -0400 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346D146.6060400@ee.byu.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> <4346D146.6060400@ee.byu.edu> Message-ID: <14e186d2e281bb0e623dcce8c7d52096@nist.gov>

On Oct 7, 2005, at 3:49 PM, Travis Oliphant wrote: > I have to admit that > > scipy.oh_go_ahead_just_install_the_full_package_already_you_know_you_want_to > > is a close second.

Can I put in a vote for camelCase?
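The copy/rename recipe Robert describes above can be sketched concretely. This is only an illustration: the directory trees below are tiny stand-ins created on the spot, not the actual newcore sources.

```python
import os
import shutil

# Stand-in source trees (illustrative only; the real ones are the
# newcore scipy/fftpack and scipy/linalg packages).
for pkg in ("scipy/fftpack", "scipy/linalg"):
    os.makedirs(pkg, exist_ok=True)
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write("# interface module for %s\n" % pkg)

# The recipe itself: copy each package's __init__.py into a single
# 'basic' subpackage as a plain module, plus an empty __init__.py so
# that 'basic' is importable as a package.
os.makedirs("scipy/basic", exist_ok=True)
open("scipy/basic/__init__.py", "w").close()
shutil.copy("scipy/fftpack/__init__.py", "scipy/basic/fft.py")
shutil.copy("scipy/linalg/__init__.py", "scipy/basic/linalg.py")

print(sorted(os.listdir("scipy/basic")))
# ['__init__.py', 'fft.py', 'linalg.py']
```

The same moves would of course be done once in the repository, not at install time; the sketch just makes the resulting layout explicit.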
From guyer at nist.gov Fri Oct 7 16:00:49 2005 From: guyer at nist.gov (Jonathan Guyer) Date: Fri, 7 Oct 2005 16:00:49 -0400 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <4346C618.2040403@colorado.edu> References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <43468A59.9000503@ntc.zcu.cz> <4346B65D.3070604@colorado.edu> <4346C618.2040403@colorado.edu> Message-ID: On Oct 7, 2005, at 3:01 PM, Fernando Perez wrote: > it does seem that > pysparse offers extra capabilities beyond what scipy.sparse has, it > would be > nice (I think) to fold that into the new shiny scipy. True enough. Dan Wheeler and I would be willing to look into this. We presently use pysparse in FiPy because that's what we could figure out, particularly for iterative solvers, but it'd simplify our lives considerably if we could reduce our installation instructions to "Get SciPy. Get FiPy." Roman Geus has indicated to us that he's not interested in merging pysparse into a larger suite like SciPy; he prefers individually maintained small packages to large monolithic systems. He may have changed his mind about that, though, and regardless, pysparse is BSD licensed, so it's perfectly legal to use his code to improve scipy.sparse (assuming that rigorous benchmarking determines that there are, in fact, improvements to be made). We'll do some tests and, if a merge is warranted, we'll run it by Roman out of courtesy. 
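Two threads meet here: Robert Cimrman's proposal to assemble a global CSR matrix from per-element "stiffness" blocks, and the direct-versus-iterative solver comparison behind the pysparse discussion. The sketch below illustrates both with plain numpy as a stand-in (neither scipy.sparse/SuperLU nor pysparse is used, and the dense array merely stands in for a real CSR structure):

```python
import numpy as np

# --- assembly: a 1D mesh of two-node elements -----------------------
n_nodes = 50
elements = [(i, i + 1) for i in range(n_nodes - 1)]
k_elem = np.array([[1.0, -1.0],
                   [-1.0, 1.0]])        # element "stiffness" block

A = np.zeros((n_nodes, n_nodes))        # dense stand-in for the global CSR matrix
for nodes in elements:
    for a, i in enumerate(nodes):
        for b, j in enumerate(nodes):
            A[i, j] += k_elem[a, b]     # add the block at the element's nodes

A += np.eye(n_nodes)                    # mass-like shift so the system is SPD

# --- solving: direct (LU, what SuperLU does) vs. iterative ----------
def cg(A, b, tol=1e-10, maxiter=1000):
    """Textbook conjugate gradient for a symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

b = np.ones(n_nodes)
x_direct = np.linalg.solve(A, b)        # direct solve (LU under the hood)
x_iter = cg(A, b)                       # iterative solve
print(np.allclose(x_direct, x_iter))    # True: both solvers agree
```

For a well-conditioned SPD system like this one the two agree to high accuracy; the performance and accuracy trade-offs Jonathan mentions only show up on large, harder problems.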
From oliphant at ee.byu.edu Fri Oct 7 16:06:42 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 14:06:42 -0600 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <14e186d2e281bb0e623dcce8c7d52096@nist.gov> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> <4346D146.6060400@ee.byu.edu> <14e186d2e281bb0e623dcce8c7d52096@nist.gov> Message-ID: <4346D552.4090207@ee.byu.edu> Jonathan Guyer wrote: >On Oct 7, 2005, at 3:49 PM, Travis Oliphant wrote: > > > >>I have to admit that >> >>scipy.oh_go_ahead_just_install_the_full_package_already_you_know_you_wa >>nt_to >> >>is a close second. >> >> > >Can I put in a vote for camelCase? > > Do you mean camelCase for package or module names? According to the usage pattern of scipy (as far as I understand it), camelCase is used only for class definitions. Packages are usually lower case. And if there is a need for an underscore we are moving towards subpackages in those cases. Thus scipy_base --> scipy.base scipy_distutils --> scipy.distutils scipy_test --> scipy.test -Travis >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > > From guyer at nist.gov Fri Oct 7 16:15:42 2005 From: guyer at nist.gov (Jonathan Guyer) Date: Fri, 7 Oct 2005 16:15:42 -0400 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346D552.4090207@ee.byu.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> <4346D146.6060400@ee.byu.edu> <14e186d2e281bb0e623dcce8c7d52096@nist.gov> <4346D552.4090207@ee.byu.edu> Message-ID: On Oct 7, 2005, at 4:06 PM, Travis Oliphant wrote: >> Can I put in a vote for camelCase? >> >> > Do you mean camelCase for package or module names? 
Wasn't actually a serious proposal. Just looking for a 19% reduction in oh_go_ahead_just_install_the_full_package_already_you_know_you_want_to. 13 __init__.py files probably isn't worth the trouble to go to submodules. 8^) From chanley at stsci.edu Fri Oct 7 16:17:50 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Fri, 07 Oct 2005 16:17:50 -0400 Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 Message-ID: <4346D7EE.9010604@stsci.edu> Greetings, I am currently trying to build scipy_core on a Solaris 8 system using the native Sun Compilers (Sun Workshop Release 6 update 2). I have always had difficulty with Numeric3 building on Solaris because of the C99 library functions. However, I was able to get around that by commenting out tests for those functions in the setup.py file (thank you Travis for that hint). Unfortunately, my standard bag of tricks is no longer working and I am unable to get scipy_core to build at all. I would appreciate any tips folks could give. I'm attaching the log file from my recent build attempt for reference. Thank you for your time and help, Chris -- Christopher Hanley Senior Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: solaris8_cc_build.log URL: From oliphant at ee.byu.edu Fri Oct 7 16:27:23 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 14:27:23 -0600 Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 In-Reply-To: <4346D7EE.9010604@stsci.edu> References: <4346D7EE.9010604@stsci.edu> Message-ID: <4346DA2B.9060708@ee.byu.edu> Christopher Hanley wrote: > Greetings, > > I am currently trying to build scipy_core on a Solaris 8 system using > the native Sun Compilers (Sun Workshop Release 6 update 2). I have > always had difficulty with Numeric3 building on Solaris because of the > C99 library functions.
However, I was able to get around that by > commenting out tests for those functions in the setup.py file > (thank you Travis for that hint). Unfortunately, my standard bag of > tricks is no longer working and I am unable to get scipy_core to build > at all. I would appreciate any tips folks could give. I'm attaching > the log file from my recent build attempt for reference. > > Thank you for your time and help, Could you send your config.h file (it should be in build/src/scipy/base). The defines in that file play an important role in determining the compilation. It may be incorrect. There are a couple of strategies. If we can jerry-rig the config.h file to work for your installation (I'm not sure exactly how to "turn off config.h" generation though), then we can narrow down the problem to creating a proper config.h file.
If we can't get a config.h file that > works for your system, then our problem is deeper and we need to add > more defines to the code. > > Thanks for your help. > > -Travis > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev -------------- next part -------------- A non-text attachment was scrubbed... Name: config.h Type: text/x-c-header Size: 359 bytes Desc: not available URL: From pearu at scipy.org Fri Oct 7 15:51:06 2005 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 7 Oct 2005 14:51:06 -0500 (CDT) Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 In-Reply-To: <4346DA2B.9060708@ee.byu.edu> References: <4346D7EE.9010604@stsci.edu> <4346DA2B.9060708@ee.byu.edu> Message-ID: On Fri, 7 Oct 2005, Travis Oliphant wrote: > There are a couple of strategies. If we can jerry-rig the config.h > file to work for your installation (I'm not sure exactly how to "turn > off config.h" generation though), mkdir -p build/src/scipy/base touch build/src/scipy/base/config.h should be enough. config.h is generated when it does not exist or it is older than scipy/base/setup.py. Pearu From Fernando.Perez at colorado.edu Fri Oct 7 16:56:08 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 07 Oct 2005 14:56:08 -0600 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346D146.6060400@ee.byu.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> <4346D146.6060400@ee.byu.edu> Message-ID: <4346E0E8.3030901@colorado.edu> Travis Oliphant wrote: > I like scipy.basic > > So, how about > > scipy.basic.fft > scipy.basic.linalg > scipy.basic.random > > as all python modules under the basic package. 
Just to clarify (sorry, but I'm a bit dizzy from the current naming blizzard): for those of us who install/use the full scipy, scipy.{fft,linalg,random,...} would continue to exist, correct? Or would they instead be found as scipy.lib.{fft,...}? > I have to admit that > > scipy.oh_go_ahead_just_install_the_full_package_already_you_know_you_want_to > > is a close second. It would certainly serve as encouragement for people to install the full scipy, so perhaps we shouldn't dismiss it so lightly ;) Cheers, f From stephen.walton at csun.edu Fri Oct 7 17:02:56 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Fri, 07 Oct 2005 14:02:56 -0700 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346E0E8.3030901@colorado.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> <4346D146.6060400@ee.byu.edu> <4346E0E8.3030901@colorado.edu> Message-ID: <4346E280.1090504@csun.edu> Fernando Perez wrote: >Just to clarify (sorry, but I'm a bit dizzy from the current naming blizzard): >for those of us who install/use the full scipy, scipy.{fft,linalg,random,...} >would continue to exist, correct? Or would they instead be found as >scipy.lib.{fft,...}? > > > I would vote for them still being found as scipy.{fft,linalg,...}. To me, scipy.lib.something implies that all that's there is a library, not a set of tools for doing "something." I personally think the primary goal here is to have basic stable tools which are always there, so that someone developing a third party package for scipy can depend on them. This may not be the case for the rest of scipy. 
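Pearu's mkdir/touch trick above works because of the timestamp rule he states: config.h is only regenerated when it is missing or older than scipy/base/setup.py. A minimal sketch of that logic (the function and file names here are illustrative, not actual scipy.distutils code):

```python
import os
import time

def config_needs_regeneration(config_h, setup_py):
    """Regenerate only when config.h is missing or older than setup.py."""
    if not os.path.exists(config_h):
        return True
    return os.path.getmtime(config_h) < os.path.getmtime(setup_py)

# Demonstration with throwaway files:
open("demo_setup.py", "w").close()
print(config_needs_regeneration("demo_config.h", "demo_setup.py"))  # True: missing

time.sleep(0.01)
open("demo_config.h", "w").close()  # the 'touch': now at least as new as setup.py
print(config_needs_regeneration("demo_config.h", "demo_setup.py"))  # False
```

Hence touching an empty (or hand-edited) config.h before building is enough to keep the build from overwriting it.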
From oliphant at ee.byu.edu Fri Oct 7 19:00:55 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 17:00:55 -0600 Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 In-Reply-To: <4346DC55.5000904@stsci.edu> References: <4346D7EE.9010604@stsci.edu> <4346DA2B.9060708@ee.byu.edu> <4346DC55.5000904@stsci.edu> Message-ID: <4346FE27.3020902@ee.byu.edu> Christopher Hanley wrote: > The config.h file is attached. Sorry I didn't get back to you earlier. Add the following line to your config.h file: #define HAVE_INVERSE_HYPERBOLIC_FLOAT If that doesn't work, then you may want to add #define HAVE_LONGDOUBLE_FUNCS #define HAVE_FLOAT_FUNCS as well (one at a time). Place this config.h file in the build directory where you got it from (you might also copy it to the installation directory --- or check to make sure one isn't there already). These headers are installed to site-packages/scipy/base/include, so look there to make sure you don't have something already (if you do, delete it). Also, remove any scipy subdirectory under the place where your python includes are. Mine are in /usr/include/python2.4, so I would execute rm -rf /usr/include/python2.4/scipy We don't want to be pulling out the wrong config file. If we can get a good config.h file for your platform, then we can see about how to make it generated automatically.
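[Editor's note: Pearu explained earlier in this thread when config.h is regenerated: when the file does not exist, or when it is older than scipy/base/setup.py. A minimal sketch of that freshness rule, with placeholder file names standing in for the real build paths:]

```python
import os
import tempfile

def needs_regeneration(target, source):
    """Return True when `target` must be (re)generated: it does not
    exist, or it is older than `source` (Pearu's rule for config.h
    versus scipy/base/setup.py)."""
    if not os.path.exists(target):
        return True
    return os.path.getmtime(target) < os.path.getmtime(source)

# Demonstration in a scratch directory; the file names stand in for
# scipy/base/setup.py and the generated config.h.
tmpdir = tempfile.mkdtemp()
setup_py = os.path.join(tmpdir, "setup.py")
config_h = os.path.join(tmpdir, "config.h")
open(setup_py, "w").close()

first = needs_regeneration(config_h, setup_py)   # True: config.h is missing
open(config_h, "w").close()                      # "touch" it, as Pearu suggests
t = os.path.getmtime(setup_py)
os.utime(config_h, (t + 10, t + 10))             # make it clearly newer
second = needs_regeneration(config_h, setup_py)  # False: up to date
print(first, second)
```

This is why `mkdir -p build/src/scipy/base` followed by `touch build/src/scipy/base/config.h` suppresses regeneration: the touched file is newer than setup.py.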
-Travis From oliphant at ee.byu.edu Fri Oct 7 19:07:13 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 07 Oct 2005 17:07:13 -0600 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346E280.1090504@csun.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> <4346D146.6060400@ee.byu.edu> <4346E0E8.3030901@colorado.edu> <4346E280.1090504@csun.edu> Message-ID: <4346FFA1.4060703@ee.byu.edu> Stephen Walton wrote: >Fernando Perez wrote: >>Just to clarify (sorry, but I'm a bit dizzy from the current naming blizzard): >>for those of us who install/use the full scipy, scipy.{fft,linalg,random,...} >>would continue to exist, correct? Or would they instead be found as >>scipy.lib.{fft,...}? >I would vote for them still being found as scipy.{fft,linalg,...}. To >me, scipy.lib.something implies that all that's there is a library, not >a set of tools for doing "something." I personally think the primary >goal here is to have basic stable tools which are always there, so that >someone developing a third party package for scipy can depend on them. Right now all the extras beyond the basic arrayobject, its functions, and the ufunc object are in a package called scipy.basic. There you can find modules scipy.basic.fft scipy.basic.linalg scipy.basic.random that contain the functionality (often by importing some heavy-lifting code from scipy.lib). Full scipy will still have scipy.fftpack scipy.linalg scipy.stats as it does now. And now installing newcore is more compatible with all of scipy. There's just the matter of the __init__ file. But, frankly, we can probably distribute an __init__ file with scipy core that is generic enough to get all the packages installed, and not distribute an __init__ file with scipy.
That way we achieve the modularity of scipy that is desired (and can perhaps start to persuade people like pysparse to get their package in a condition to work like scipy.sparse). And they can still keep their own release (because scipy core will be the basic foundation). And installing packages separately becomes no big deal at all. -Travis >This may not be the case for the rest of scipy. > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > > From Fernando.Perez at colorado.edu Fri Oct 7 19:11:58 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 07 Oct 2005 17:11:58 -0600 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <43468A59.9000503@ntc.zcu.cz> <4346B65D.3070604@colorado.edu> <4346C618.2040403@colorado.edu> Message-ID: <434700BE.6020308@colorado.edu> Jonathan Guyer wrote: > On Oct 7, 2005, at 3:01 PM, Fernando Perez wrote: > > >>it does seem that >>pysparse offers extra capabilities beyond what scipy.sparse has, it >>would be >>nice (I think) to fold that into the new shiny scipy. > > > True enough. Dan Wheeler and I would be willing to look into this. We > presently use pysparse in FiPy because that's what we could figure out, > particularly for iterative solvers, but it'd simplify our lives > considerably if we could reduce our installation instructions to "Get > SciPy. Get FiPy." > > Roman Geus has indicated to us that he's not interested in merging > pysparse into a larger suite like SciPy; he prefers individually > maintained small packages to large monolithic systems. He may have > changed his mind about that, though, and regardless, pysparse is BSD > licensed, so it's perfectly legal to use his code to improve > scipy.sparse (assuming that rigorous benchmarking determines that there > are, in fact, improvements to be made). 
We'll do some tests and, if a > merge is warranted, we'll run it by Roman out of courtesy. Well, as Travis just mentioned in another thread: """ But, frankly, we can probably distribute an __init__ file with scipy core that is generic enough to get all the packages installed, and not distribute an __init__ file with scipy. That way we achieve the modularity of scipy that is desired (and can perhaps start to persuade people like pysparse to get their package in a condition to work like scipy.sparse). And they can still keep their own release (because scipy core will be the basic foundation). And installing packages separately becomes no big deal at all. """ scipy is moving towards making it as easy as possible for externally maintained packages (possibly with their own release cycle) to integrate cleanly into scipy. So hopefully there will be room for common work here... Best regards, f From guyer at nist.gov Fri Oct 7 19:23:01 2005 From: guyer at nist.gov (Jonathan Guyer) Date: Fri, 7 Oct 2005 19:23:01 -0400 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <434700BE.6020308@colorado.edu> References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <43468A59.9000503@ntc.zcu.cz> <4346B65D.3070604@colorado.edu> <4346C618.2040403@colorado.edu> <434700BE.6020308@colorado.edu> Message-ID: On Oct 7, 2005, at 7:11 PM, Fernando Perez wrote: > scipy is moving towards making it as easy as possible for externally > maintained packages (possibly with their own release cycle) to > integrate > cleanly into scipy. So hopefully there will be room for common work > here... Sounds great. I'll post here when we've got some stats on how pysparse and scipy.sparse stack up. 
From stephen.walton at csun.edu Fri Oct 7 19:51:56 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Fri, 07 Oct 2005 16:51:56 -0700 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346FFA1.4060703@ee.byu.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> <4346D146.6060400@ee.byu.edu> <4346E0E8.3030901@colorado.edu> <4346E280.1090504@csun.edu> <4346FFA1.4060703@ee.byu.edu> Message-ID: <43470A1C.9040606@csun.edu> Travis Oliphant wrote, well, a good deal clarifying the structure of the new scipy core, which is much appreciated. I'll make sure to keep a copy handy as I work. Steve From stephen.walton at csun.edu Fri Oct 7 19:53:18 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Fri, 07 Oct 2005 16:53:18 -0700 Subject: [SciPy-dev] bdist_rpm problems Message-ID: <43470A6E.5030706@csun.edu> I'm seeing a couple of problems with "setup.py bdist_rpm". If this is still in progress, so be it, but here are the specific issues I'm hitting: 1. "setup.py config_fc --fcompiler=absoft bdist_rpm" in newcore still uses g77, even though "setup.py config_fc --fcompiler=absoft build" uses Absoft. 2. 
With today's SVN checkout, newscipy's bdist_rpm command isn't working at all, with the output: Assuming default configuration (Lib/utils/{setup_utils,setup}.py was not found) Appending scipy.utils configuration to scipy Appending scipy.io configuration to scipy Appending scipy.special configuration to scipy lapack_opt_info: atlas_threads_info: Setting PTATLAS=ATLAS scipy.distutils.system_info.atlas_threads_info NOT AVAILABLE atlas_info: scipy.distutils.system_info.atlas_info FOUND: libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/atlas'] language = f77 include_dirs = ['/usr/include/atlas'] running build_src building extension "atlas_version" sources adding 'build/src/atlas_version_-0x51ab4a5a.c' to sources. running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext customize GnuFCompiler customize GnuFCompiler customize GnuFCompiler using build_ext FOUND: libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/atlas'] language = f77 define_macros = [('ATLAS_INFO', '"\\"3.7.11\\""')] include_dirs = ['/usr/include/atlas'] Appending scipy.optimize configuration to scipy Appending scipy.stats configuration to scipy Appending scipy.interpolate configuration to scipy Appending scipy configuration to Creating Lib/__svn_version__.py (version='1318') running bdist_rpm creating build/bdist.linux-i686 creating build/bdist.linux-i686/rpm creating build/bdist.linux-i686/rpm/SOURCES creating build/bdist.linux-i686/rpm/SPECS creating build/bdist.linux-i686/rpm/BUILD creating build/bdist.linux-i686/rpm/RPMS creating build/bdist.linux-i686/rpm/SRPMS writing 'build/bdist.linux-i686/rpm/SPECS/scipy.spec' running sdist warning: sdist: standard file not found: should have one of README, README.txt Traceback (most recent call last): File "setup.py", line 35, in ? 
setup_package() File "setup.py", line 26, in setup_package url = "http://www.scipy.org", File "/usr/lib/python2.4/site-packages/scipy/distutils/core.py", line 80, in setup return old_setup(**new_attr) File "/usr/lib/python2.4/distutils/core.py", line 149, in setup dist.run_commands() File "/usr/lib/python2.4/distutils/dist.py", line 946, in run_commands self.run_command(cmd) File "/usr/lib/python2.4/distutils/dist.py", line 966, in run_command cmd_obj.run() File "/usr/lib/python2.4/distutils/command/bdist_rpm.py", line 305, in run self.run_command('sdist') File "/usr/lib/python2.4/distutils/cmd.py", line 333, in run_command self.distribution.run_command(command) File "/usr/lib/python2.4/distutils/dist.py", line 966, in run_command cmd_obj.run() File "/usr/lib/python2.4/distutils/command/sdist.py", line 143, in run self.get_file_list() File "/usr/lib/python2.4/distutils/command/sdist.py", line 240, in get_file_list self.add_defaults() File "/usr/lib/python2.4/site-packages/scipy/distutils/command/sdist.py", line 9, in add_defaults old_sdist.add_defaults(self) File "/usr/lib/python2.4/distutils/command/sdist.py", line 305, in add_defaults self.filelist.extend(build_clib.get_source_files()) File "/usr/lib/python2.4/site-packages/scipy/distutils/command/build_clib.py", line 95, in get_source_files filenames.extend(get_lib_source_files(lib)) File "/usr/lib/python2.4/site-packages/scipy/distutils/misc_util.py", line 227, in get_lib_source_files depends = build_info.get('depends',[]) 3. I *think*, although I cannot be sure, that both scipy_core and scipy are installing __svn_version__.py files in /usr/lib/python-x.y/site-packages/scipy. I don't think this is intended, and means that, once bdist_rpm is working, one won't be able to install both without using --replacefiles. 
From arnd.baecker at web.de Sat Oct 8 06:20:02 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sat, 8 Oct 2005 12:20:02 +0200 (CEST) Subject: [SciPy-dev] `newcore` on opteron In-Reply-To: <4346ACB8.2030001@ee.byu.edu> References: <4346ACB8.2030001@ee.byu.edu> Message-ID: Hi Travis, On Fri, 7 Oct 2005, Travis Oliphant wrote: > Did I mention that you should keep trying. Your build error logs are > the best thing... Glad that I can help! I have automated the testing a bit by using a script (in particular, because over the week-end I only have modem access). Currently it does not build for me: scipy/base/src/arrayobject.c:6632: warning: format %d expects type int, but argument 4 has type intp In file included from scipy/base/src/arrayobject.c:566, from scipy/base/src/multiarraymodule.c:44: build/src/scipy/base/src/scalartypes.inc: In function initialize_numeric_types: build/src/scipy/base/src/scalartypes.inc:4001: error: longlong_arrtype_hash undeclared (first use in this function) build/src/scipy/base/src/scalartypes.inc:4001: error: (Each undeclared identifier is reported only once build/src/scipy/base/src/scalartypes.inc:4001: error: for each function it appears in.) In file included from scipy/base/src/multiarraymodule.c:44: scipy/base/src/arrayobject.c: In function PyArray_RegisterDescrForType: Note that this is with gcc version 4.0.2 20050826 (prerelease) (SUSE Linux) So if we think that this is a bit unsafe (personally I am always a bit sceptical about rushing for the latest stuff as they do - I grew up in a debian world...;-) ((the special characters above are because of SUSEs gcc, I think)). I send you build_1_log.zip off-list. 
Best, Arnd From arnd.baecker at web.de Sat Oct 8 13:00:02 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sat, 8 Oct 2005 19:00:02 +0200 (CEST) Subject: [SciPy-dev] More bugs fixed In-Reply-To: <4346FFA1.4060703@ee.byu.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> <4346D146.6060400@ee.byu.edu> <4346E0E8.3030901@colorado.edu> <4346E280.1090504@csun.edu> <4346FFA1.4060703@ee.byu.edu> Message-ID: On Fri, 7 Oct 2005, Travis Oliphant wrote: > Stephen Walton wrote: > > >Fernando Perez wrote: > > > >>Just to clarify (sorry, but I'm a bit dizzy from the current naming blizzard): > >>for those of us who install/use the full scipy, scipy.{fft,linalg,random,...} > >>would continue to exist, correct? Or would they instead be found as > >>scipy.lib.{fft,...}? > >> > >I would vote for them still being found as scipy.{fft,linalg,...}. To +1 (FWIW) > >me, scipy.lib.something implies that all that's there is a library, not > >a set of tools for doing "something." I personally think the primary > >goal here is to have basic stable tools which are always there, so that > >someone developing a third party package for scipy can depend on them. > > > Right now all the extra's beyond the basic arrayobject, it's functions, > and the ufunc object > > are in a package called > > scipy.basic > > There you can find modules > > scipy.basic.fft > scipy.basic.linalg > scipy.basic.random > > that contain the functionality (often by importing some heavy-lifting > code from scipy.lib. > > Full scipy will still have > > scipy.fftpack > scipy.linalg > scipy.stats > > as it does now. Question: wouldn't it be possible to use `scipy.linalg` both for "scipy core" and for "scipy full"? When only "scipy core" is installed, scipy.linalg will just contain the basic routines, and use lapack_lite (and possibly no dotblas etc.). 
When also "scipy full" is installed, scipy.linalg will have the full glory of all the routines provided in present scipy. Technically the question is if there is a way to achieve this without "scipy full" overwriting files from "scipy core" (thinking of .deb/.rpm/... packages). Does this sound reasonable and feasible? Best, Arnd From rkern at ucsd.edu Sat Oct 8 23:25:25 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sat, 08 Oct 2005 20:25:25 -0700 Subject: [SciPy-dev] More bugs fixed In-Reply-To: References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> <4346D146.6060400@ee.byu.edu> <4346E0E8.3030901@colorado.edu> <4346E280.1090504@csun.edu> <4346FFA1.4060703@ee.byu.edu> Message-ID: <43488DA5.6040508@ucsd.edu> Arnd Baecker wrote: > Question: wouldn't it be possible to use > `scipy.linalg` both for "scipy core" and for "scipy full"? > > When only "scipy core" is installed, > scipy.linalg will just contain the basic routines, > and use lapack_lite (and possibly no dotblas etc.). > > When also "scipy full" is installed, scipy.linalg will > have the full glory of all the routines provided in present scipy. > > Technically the question is if there is > a way to achieve this without "scipy full" overwriting > files from "scipy core" (thinking of .deb/.rpm/... packages). > > Does this sound reasonable and feasible? Reasonable, yes. Feasible, not really, I don't think. At the moment, the scipy_core versions are "physically" under scipy.basic. In scipy/__init__.py, we have a block like this: import scipy.basic.fft as fftpack import scipy.basic.linalg as linalg import scipy.basic.random as random So "from scipy import linalg" should work even with scipy_core. 
I'm not entirely sure if this is going to continue to work in the various situations where you can't overwrite stuff, but I'm going to wait until the real scipy.linalg and scipy.fftpack are working to test that out. I'm willing to entertain specific suggestions to change my mind. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From arnd.baecker at web.de Sun Oct 9 14:09:21 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sun, 9 Oct 2005 20:09:21 +0200 (CEST) Subject: [SciPy-dev] More bugs fixed In-Reply-To: <43488DA5.6040508@ucsd.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346D146.6060400@ee.byu.edu> <4346E0E8.3030901@colorado.edu> <4346E280.1090504@csun.edu> <4346FFA1.4060703@ee.byu.edu> <43488DA5.6040508@ucsd.edu> Message-ID: On Sat, 8 Oct 2005, Robert Kern wrote: > Arnd Baecker wrote: > > > Question: wouldn't it be possible to use > > `scipy.linalg` both for "scipy core" and for "scipy full"? > > > > When only "scipy core" is installed, > > scipy.linalg will just contain the basic routines, > > and use lapack_lite (and possibly no dotblas etc.). > > > > When also "scipy full" is installed, scipy.linalg will > > have the full glory of all the routines provided in present scipy. > > > > Technically the question is if there is > > a way to achieve this without "scipy full" overwriting > > files from "scipy core" (thinking of .deb/.rpm/... packages). > > > > Does this sound reasonable and feasible? > > Reasonable, yes. Feasible, not really, I don't think. At the moment, the > scipy_core versions are "physically" under scipy.basic. In > scipy/__init__.py, we have a block like this: > > import scipy.basic.fft as fftpack > import scipy.basic.linalg as linalg > import scipy.basic.random as random > > So "from scipy import linalg" should work even with scipy_core. 
I'm not > entirely sure if this is going to continue to work in the various > situations where you can't overwrite stuff, Would something like try: import scipy.THE_FULL_ONE.fft as fftpack import scipy.THE_FULL_ONE.linalg as linalg import scipy.THE_FULL_ONE.random as random except ImportError: import scipy.basic.fft as fftpack import scipy.basic.linalg as linalg import scipy.basic.random as random work? This assumes that the routines from `THE_FULL_ONE` are compatible with the `basic` ones. [...] Best, Arnd From rkern at ucsd.edu Sun Oct 9 14:09:47 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sun, 09 Oct 2005 11:09:47 -0700 Subject: [SciPy-dev] Segfault initializing _cephes.so Message-ID: <43495CEB.6030709@ucsd.edu> It's on line 412 of _cephesmodule.c, the second ufunc to be created. f = PyUFunc_FromFuncAndData(cephes3a_functions, bdtr_data, cephes_4_types, 2, 3, 1, PyUFunc_None, "bdtr", bdtr_doc, 0); The error occurs before the function can be executed. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Sun Oct 9 13:31:23 2005 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 9 Oct 2005 12:31:23 -0500 (CDT) Subject: [SciPy-dev] More bugs fixed In-Reply-To: References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346D146.6060400@ee.byu.edu> <4346E0E8.3030901@colorado.edu> <43488DA5.6040508@ucsd.edu> Message-ID: On Sun, 9 Oct 2005, Arnd Baecker wrote: >>> When only "scipy core" is installed, >>> scipy.linalg will just contain the basic routines, >>> and use lapack_lite (and possibly no dotblas etc.). >>> >>> When also "scipy full" is installed, scipy.linalg will >>> have the full glory of all the routines provided in present scipy. 
>>> Technically the question is if there is >>> a way to achieve this without "scipy full" overwriting >>> files from "scipy core" (thinking of .deb/.rpm/... packages). >>> Does this sound reasonable and feasible? >> Reasonable, yes. Feasible, not really, I don't think. At the moment, the >> scipy_core versions are "physically" under scipy.basic. In >> scipy/__init__.py, we have a block like this: >> import scipy.basic.fft as fftpack >> import scipy.basic.linalg as linalg >> import scipy.basic.random as random >> So "from scipy import linalg" should work even with scipy_core. I'm not >> entirely sure if this is going to continue to work in the various >> situations where you can't overwrite stuff, > Would something like > try: > import scipy.THE_FULL_ONE.fft as fftpack > import scipy.THE_FULL_ONE.linalg as linalg > import scipy.THE_FULL_ONE.random as random > except ImportError: > import scipy.basic.fft as fftpack > import scipy.basic.linalg as linalg > import scipy.basic.random as random > work? > This assumes that the routines from > `THE_FULL_ONE` are compatible with the `basic` ones. It would probably work, but it is a bad idea as it assumes that packages are always bug-free, which is not true in the real world. IMO any try-except block should be implemented so that it always behaves according to the meaning when the try-except block was introduced. So, try-except blocks should be very minimal both in code as well as in action. Anyway, I would suggest implementing scipy/__init__.py such that it would not require modifications when a new SciPy package has been included. So, it means that the __init__.py files should recognize somehow which scipy subdirectories are SciPy packages and then import them + update documentation and test suites. This was implemented in the Scipy 0.3 Lib/__init__.py file, see the _import_packages() function; for Scipy 0.4 it may require some revision though.
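[Editor's note: Pearu's suggestion amounts to having scipy/__init__.py discover its subpackages at import time instead of listing them by hand. A rough sketch of the idea in modern Python — not the actual Scipy 0.3 _import_packages code, and the function name is ours:]

```python
import importlib
import os

def import_subpackages(package_name, package_dir):
    """Import every subdirectory of `package_dir` that contains an
    __init__.py, i.e. everything that looks like a subpackage.
    Returns a dict mapping subpackage name to the imported module."""
    imported = {}
    for entry in sorted(os.listdir(package_dir)):
        init_py = os.path.join(package_dir, entry, "__init__.py")
        if os.path.isfile(init_py):
            imported[entry] = importlib.import_module(package_name + "." + entry)
    return imported
```

A real implementation would additionally merge each subpackage's docstring and test suite into the parent, as the 0.3 version did, and would need care around partially installed or broken subpackages — Pearu's point about not hiding bugs behind broad try-except blocks applies here too.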
Pearu From stephen.walton at csun.edu Sun Oct 9 15:17:59 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Sun, 09 Oct 2005 12:17:59 -0700 Subject: [SciPy-dev] More bugs fixed In-Reply-To: <43488DA5.6040508@ucsd.edu> References: <4345E3F0.8010802@ee.byu.edu> <434603EF.5090209@ucsd.edu> <434630BD.5020808@ee.byu.edu> <4346BF91.7080401@colorado.edu> <4346C804.2050104@ucsd.edu> <4346CEC0.4030906@ucsd.edu> <4346D146.6060400@ee.byu.edu> <4346E0E8.3030901@colorado.edu> <4346E280.1090504@csun.edu> <4346FFA1.4060703@ee.byu.edu> <43488DA5.6040508@ucsd.edu> Message-ID: <43496CE7.3090008@csun.edu> Robert Kern wrote: >At the moment, the >scipy_core versions are "physically" under scipy.basic. In >scipy/__init__.py, we have a block like this: > > import scipy.basic.fft as fftpack > import scipy.basic.linalg as linalg > import scipy.basic.random as random > >So "from scipy import linalg" should work even with scipy_core. > Yes, but the equivalent usage import scipy.linalg as linalg does not work with this setup. I'm not sure how big a concern that is, but wanted to point it out. From oliphant at ee.byu.edu Sun Oct 9 20:10:06 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 09 Oct 2005 18:10:06 -0600 Subject: [SciPy-dev] [SciPy-user] Re: [Numpy-discussion] Purchasing Documentation In-Reply-To: <4349ADD4.7060701@optusnet.com.au> References: <43418AA8.2040809@hawaii.edu> <4341C1C8.3000506@optusnet.com.au> <4341CB9E.4050306@ee.byu.edu> <434300FB.7070606@ee.byu.edu> <434307CC.7020303@colorado.edu> <43430B62.9090908@ee.byu.edu> <1e2af89e0510060529s5319192ayea270ebe2949b6b8@mail.gmail.com> <43454135.503@ee.byu.edu> <4345A449.8040303@colorado.edu> <1e2af89e0510091306w5b435c32j3e59cc453a35be30@mail.gmail.com> <4349A524.40208@colorado.edu> <4349ADD4.7060701@optusnet.com.au> Message-ID: <4349B15E.1060801@ee.byu.edu> Tim Churches wrote: >Finally, to whom should we send docstrings? To Travis? 
> > I'll certainly take them, but anybody with commit privileges to the SciPy SVN tree can insert them as well. If you are a regular contributor and would like to commit them yourself, it is not difficult to get access to the tree, either. Just send me your desired user name and password and I'll set you up. -Travis From oliphant at ee.byu.edu Mon Oct 10 00:27:14 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 09 Oct 2005 22:27:14 -0600 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision Message-ID: <4349EDA2.2090802@ee.byu.edu> What is the opinion of people here regarding the casting of 64-bit integers to double precision? In scipy (as in Numeric), there is the concept of "casting safely" to a type. This concept is used when choosing a ufunc, for example. My understanding is that a 64-bit integer cannot be cast safely to a double-precision floating point number, because precision is lost in the conversion. However, at least a signed 64-bit integer can usually be cast safely to a long double precision floating point number. This is not too big a deal on 32-bit systems, where people rarely request 64-bit integers. However, on some 64-bit systems (where the C long is 64-bit), Python's default integer is 64-bit. Therefore, simple expressions like sqrt(2), which require conversion to floating point, will look for the first floating-point type to which a 64-bit integer can be converted safely. This can only be a long double. The result is that on 64-bit systems, the long double type gets used a lot more. Is this acceptable? Expected? What do those of you on 64-bit systems think?
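[Editor's note: the precision loss Travis describes is easy to see from plain Python, whose integers are exact. An IEEE double has a 53-bit significand, so not every 64-bit integer survives the conversion, while an 80-bit x86 long double (64-bit significand) can hold any signed 64-bit integer. A quick illustration:]

```python
# Python integers are exact, and comparisons between an int and a
# float are exact, so representability can be tested directly.
n = 2**63 - 1                # largest signed 64-bit integer
lost = float(n) != n         # True: a 53-bit significand cannot hold it
exact_small = float(2**52 + 1) == 2**52 + 1  # below 2**53: round trip is exact
print(lost, exact_small)
```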
-Travis From cimrman3 at ntc.zcu.cz Mon Oct 10 04:17:10 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 10 Oct 2005 10:17:10 +0200 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <43468A59.9000503@ntc.zcu.cz> <4346B65D.3070604@colorado.edu> <4346C618.2040403@colorado.edu> Message-ID: <434A2386.7040100@ntc.zcu.cz> Jonathan Guyer wrote: > On Oct 7, 2005, at 3:01 PM, Fernando Perez wrote: > >>it does seem that >>pysparse offers extra capabilities beyond what scipy.sparse has, it >>would be >>nice (I think) to fold that into the new shiny scipy. > > True enough. Dan Wheeler and I would be willing to look into this. We > presently use pysparse in FiPy because that's what we could figure out, > particularly for iterative solvers, but it'd simplify our lives > considerably if we could reduce our installation instructions to "Get > SciPy. Get FiPy." > > Roman Geus has indicated to us that he's not interested in merging > pysparse into a larger suite like SciPy; he prefers individually > maintained small packages to large monolithic systems. He may have > changed his mind about that, though, and regardless, pysparse is BSD > licensed, so it's perfectly legal to use his code to improve > scipy.sparse (assuming that rigorous benchmarking determines that there > are, in fact, improvements to be made). We'll do some tests and, if a > merge is warranted, we'll run it by Roman out of courtesy. As Travis Oliphant wrote, I would like to help, too. So let me know if you have a need for another pair of hands (like almost everybody, I wrote my own sparse matrix implementations in the past).
As a side-note, I would really appreciate seeing umfpack bindings in scipy (if the umfpack licence allows that, of course) - I have made some tests (admittedly a long time ago) showing that, at least for my class of problems, umfpack performed much better than superlu via pysparse. (Iterative solvers are out of the question for me.) cheers, r. From rkern at ucsd.edu Mon Oct 10 09:57:01 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 10 Oct 2005 06:57:01 -0700 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <434A2386.7040100@ntc.zcu.cz> References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <43468A59.9000503@ntc.zcu.cz> <4346B65D.3070604@colorado.edu> <4346C618.2040403@colorado.edu> <434A2386.7040100@ntc.zcu.cz> Message-ID: <434A732D.1080009@ucsd.edu> Robert Cimrman wrote: > As Travis Oliphant wrote, I would like to help, too. So let me know if > you have a need for another pair of hands (like almost everybody, I > wrote my own sparse matrix implementations in the past). As a side-note, > I would really appreciate seeing umfpack bindings in scipy (if the > umfpack licence allows that, of course) - I have made some tests > (admittedly a long time ago) showing that, at least for my class of > problems, umfpack performed much better than superlu via pysparse. > (Iterative solvers are out of the question for me.) UMFPACK 4.4 does indeed have a suitable license, and helping hands are always useful. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die."
-- Richard Harter From cimrman3 at ntc.zcu.cz Mon Oct 10 10:13:17 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 10 Oct 2005 16:13:17 +0200 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <434A732D.1080009@ucsd.edu> References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <43468A59.9000503@ntc.zcu.cz> <4346B65D.3070604@colorado.edu> <4346C618.2040403@colorado.edu> <434A2386.7040100@ntc.zcu.cz> <434A732D.1080009@ucsd.edu> Message-ID: <434A76FD.90906@ntc.zcu.cz> Robert Kern wrote: > UMFPACK 4.4 does indeed have a suitable license, and helping hands are > always useful. I am going to wrap it then. Travis, if you think that your prior implementation would be a good start for me, send me the sources, please. r. From cimrman3 at ntc.zcu.cz Mon Oct 10 10:18:40 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 10 Oct 2005 16:18:40 +0200 Subject: [SciPy-dev] newscipy build problem Message-ID: <434A7840.20900@ntc.zcu.cz> Hi all, 'newcore' installation works for me, but I get the traceback below for newscipy. I am running on linux/gentoo. (FYI, 'sci-libs/blas-atlas' ebuild is required instead of 'sci-libs/atlas' to have full lapack implementation.) Traceback (most recent call last): File "setup.py", line 35, in ? setup_package() File "setup.py", line 17, in setup_package config_dict = configuration(top_path='') File "setup.py", line 5, in configuration from scipy.distutils.misc_util import Configuration File "/usr/lib/python2.3/site-packages/scipy/__init__.py", line 17, in ? import scipy.basic.linalg as linalg File "/usr/lib/python2.3/site-packages/scipy/basic/linalg.py", line 1, in ? from basic_lite import * File "/usr/lib/python2.3/site-packages/scipy/basic/basic_lite.py", line 7, in ? import scipy.lib.lapack_lite as lapack_lite ImportError: /usr/lib/python2.3/site-packages/scipy/lib/lapack_lite.so: undefined symbol: s_wsfe Any help would be appreciated, r. 
From oliphant at ee.byu.edu Mon Oct 10 12:17:02 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 10 Oct 2005 10:17:02 -0600 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <434A76FD.90906@ntc.zcu.cz> References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <43468A59.9000503@ntc.zcu.cz> <4346B65D.3070604@colorado.edu> <4346C618.2040403@colorado.edu> <434A2386.7040100@ntc.zcu.cz> <434A732D.1080009@ucsd.edu> <434A76FD.90906@ntc.zcu.cz> Message-ID: <434A93FE.3040901@ee.byu.edu> Robert Cimrman wrote: >Robert Kern wrote: > > >>UMFPACK 4.4 does indeed have a suitable license, and helping hands are >>always useful. >> >> > >I am going to wrap it then. Travis, if you think that your prior >implementation would be a good start for me, send me the sources, please. > >r. > > Great, I'll look for them. They were based on UMFPACK 2.2, I believe. I really liked UMFPACK, so having them in SciPy will be great. -Travis From stephen.walton at csun.edu Mon Oct 10 12:18:27 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 10 Oct 2005 09:18:27 -0700 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <4349EDA2.2090802@ee.byu.edu> References: <4349EDA2.2090802@ee.byu.edu> Message-ID: <434A9453.2010705@csun.edu> Travis Oliphant wrote: >In scipy (as in Numeric), there is the concept of "Casting safely" to a >type. This concept is used when choosing a ufunc, for example. > >My understanding is that a 64-bit integer cannot be cast safely to a >double-precision floating point number, because precision is lost in the >conversion...The result is that on 64-bit systems, the long double type gets used a >lot more. Is this acceptable? expected? What do those of you on >64-bit systems think?
> > I am not on a 64 bit system but can give you the perspective of someone who's thought a lot about floating point precision in the context of both my research and of teaching classes on numerical analysis for physics majors. To take your example, and looking at it from an experimentalist's viewpoint, sqrt(2) where 2 is an integer has only one significant figure, and so casting it to a long double seems like extreme overkill. The numerical analysis community has probably had the greatest influence on the design of Fortran, and there sqrt(2) (2 integer) is simply not defined. The user must specify sqrt(2.0) to get a REAL result, sqrt(2.0d0) to get a DOUBLE PRECISION result. These usually map to IEEE 32 and 64 bit REALs today, respectively, on 32-bit hardware and to IEEE 64 and 128 bit (is there such a thing?) on 64-bit hardware. I imagine that if there were an integer square root function in Fortran, it would simply round to the nearest integer. In addition, the idea of "casting safely" would, it seems to me, also require sqrt(2) to return a double on a 32-bit machine. The question, I think, is part of the larger question: to what extent should the language leave precision issues under the user's control, and to what extent should it make decisions automatically? A lot of the behind-the-scenes stuff which goes on in all the Fortran routines from Netlib which are now part of Scipy involve using the machine precision to decide on step sizes and other algorithmic choices. These choices become wrong if the underlying language changes precision without telling the user, a la C's old habit of automatically casting all floats to doubles. With all that, my vote on Travis's specific question: if conversion of an N-bit integer in scipy_core is required, it gets converted to an N-bit float. 
The only case in which precision will be lost is if the integer is large enough to require more than (N-e) bits for its representation, where e is the number of bits in the exponent of the floating point representation. Those who really need to control precision should, in my view, create arrays of the appropriate type to begin with. I suppose these sorts of questions are why there are now special purpose libraries for fixed precision numbers. From stephen.walton at csun.edu Mon Oct 10 12:22:08 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 10 Oct 2005 09:22:08 -0700 Subject: [SciPy-dev] newscipy build problem In-Reply-To: <434A7840.20900@ntc.zcu.cz> References: <434A7840.20900@ntc.zcu.cz> Message-ID: <434A9530.1010003@csun.edu> Robert Cimrman wrote: >ImportError: /usr/lib/python2.3/site-packages/scipy/lib/lapack_lite.so: >undefined symbol: s_wsfe > > This is a symbol from the g2c library which is supposed to be part of g77. Rebuild newscipy, redirecting the output to a file, and post the line which shows how lapack_lite.so is linked. Grep'ping the output for the string "-o lapack_lite.so" should find it. What versions of gcc and g77 do you have? That is, what is the output of 'g77 -v' and 'gcc -v'? Do you have gfortran? From oliphant at ee.byu.edu Mon Oct 10 12:38:03 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 10 Oct 2005 10:38:03 -0600 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <434A9453.2010705@csun.edu> References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> Message-ID: <434A98EB.7030407@ee.byu.edu> Stephen Walton wrote: >Travis Oliphant wrote: > > > >>In scipy (as in Numeric), there is the concept of "Casting safely" to a >>type. This concept is used when choosing a ufunc, for example.
>> >>My understanding is that a 64-bit integer cannot be cast safely to a >>double-precision floating point number, because precision is lost in the >>conversion...The result is that on 64-bit systems, the long double type gets used a >>lot more. Is this acceptable? expected? What do those of you on >>64-bit systems think? >> >> >> >> >I am not on a 64 bit system but can give you the perspective of someone >who's thought a lot about floating point precision in the context of >both my research and of teaching classes on numerical analysis for >physics majors. To take your example, and looking at it from an >experimentalist's viewpoint, sqrt(2) where 2 is an integer has only one >significant figure, and so casting it to a long double seems like >extreme overkill. > I agree, which is why it concerned me when I saw it. But, it is consistent with the rest of the casting features. >With all that, my vote on Travis's specific question: if conversion of >an N-bit integer in scipy_core is required, it gets converted to an >N-bit float. The only case in which precision will be lost is if the >integer is large enough to require more than (N-e) bits for its >representation, where e is the number of bits in the exponent of the >floating point representation. > Yes, it is only for large integers that problems arise. I like this scheme and it would be very easy to implement, and it would provide a consistent interface. The only problem is that it would mean that on current 32-bit systems sqrt(2) would cast 2 to a "single-precision" float and return a single-precision result. If that is not a problem, then great... Otherwise, a more complicated (and less consistent) rule like

  integer    float
  =================
  8-bit      32-bit
  16-bit     32-bit
  32-bit     64-bit
  64-bit     64-bit

would be needed (this is also not too hard to do).
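The boundary Stephen identifies is easy to verify directly. A minimal sketch in plain Python (no scipy needed; modern Python used purely for illustration): a 64-bit IEEE float spends 11 bits on the exponent, leaving 64 - 11 = 53 significand bits (52 stored plus 1 implicit), so exactly the integers needing more than 53 bits fail to round-trip.

```python
# Integers up to 53 bits convert to a 64-bit float exactly;
# one bit more and the conversion must round.
exact = 2**53 - 1      # fits in 53 bits
lossy = 2**53 + 1      # needs 54 bits

assert int(float(exact)) == exact       # round-trips exactly
assert float(lossy) == float(2**53)     # rounds to the nearest even: lost
assert int(float(lossy)) != lossy       # precision is gone
```

This is the (N-e)-bit rule with N=64 and e=11, matching Stephen's description.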
-Travis From chanley at stsci.edu Mon Oct 10 12:38:33 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Mon, 10 Oct 2005 12:38:33 -0400 Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 In-Reply-To: <4346FE27.3020902@ee.byu.edu> References: <4346D7EE.9010604@stsci.edu> <4346DA2B.9060708@ee.byu.edu> <4346DC55.5000904@stsci.edu> <4346FE27.3020902@ee.byu.edu> Message-ID: <434A9909.6000402@stsci.edu> Travis, Unfortunately this has not corrected the problems. I am attaching the latest build log and config.h file for reference. Chris Travis Oliphant wrote: > > Sorry I didn't get back to you, earlier. > > Add the following line to your config.h file > > #define HAVE_INVERSE_HYPERBOLIC_FLOAT > > > If that doesn't work, then you may want to add > > #define HAVE_LONGDOUBLE_FUNCS > #define HAVE_FLOAT_FUNCS > > as well (one at a time). > > Place this config.h file in the build directory where you got it from > (you might also copy it to the installation directory --- or check to > make sure one isn't there already). These headers are installed to > site-packages/scipy/base/include > > so look there to make sure you don't have something already (if you do > delete it). > > Also, remove any scipy subdirectory under the place where your python > includes are > > Mine are in /usr/include/python2.4 so I would execute rm > /usr/include/python2.4/scipy > > We don't want to be pulling out the wrong config file. > > > If we can get a good config.h file for your platform, then we see about > how to make it generated automatically. > > -Travis > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev -------------- next part -------------- A non-text attachment was scrubbed... Name: config.h Type: text/x-c-header Size: 402 bytes Desc: not available URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... 
Name: solaris8_cc_new_build.log URL: From cimrman3 at ntc.zcu.cz Mon Oct 10 12:42:38 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 10 Oct 2005 18:42:38 +0200 Subject: [SciPy-dev] newscipy build problem In-Reply-To: <434A9530.1010003@csun.edu> References: <434A7840.20900@ntc.zcu.cz> <434A9530.1010003@csun.edu> Message-ID: <434A99FE.7060909@ntc.zcu.cz> Stephen Walton wrote: > Robert Cimrman wrote: > > >>ImportError: /usr/lib/python2.3/site-packages/scipy/lib/lapack_lite.so: >>undefined symbol: s_wsfe >> >> > > This is a symbol from the g2c library which is supposed to be part of > g77. Rebuild newscipy, redirectiing the output to a file, and post the > line which shows how lapack_lite.so is linked. Grep'ping the output for > the string "-o lapack_lite.so" should find it. What versions of gcc and > g77 do you have? That is, what is the output of 'g77 -v' and 'gcc -v'? > Do you have gfortran? I have gcc+g77, version 3.3.6 (Gentoo 3.3.6, ssp-3.3.6-1.0, pie-8.7.8), no gfortran. The output I posted is all I get after doing 'python setup.py build' in 'newscipy'. the '/usr/lib/python2.3/site-packages/scipy/lib/lapack_lite.so' file is created when installing 'newcore', where the relevant lines say -- building extension "scipy.lib.lapack_lite" sources adding 'scipy/corelib/lapack_lite/lapack_litemodule.c' to sources. -- r. From rkern at ucsd.edu Mon Oct 10 12:45:55 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 10 Oct 2005 09:45:55 -0700 Subject: [SciPy-dev] newscipy build problem In-Reply-To: <434A99FE.7060909@ntc.zcu.cz> References: <434A7840.20900@ntc.zcu.cz> <434A9530.1010003@csun.edu> <434A99FE.7060909@ntc.zcu.cz> Message-ID: <434A9AC3.9090601@ucsd.edu> Robert Cimrman wrote: > The output I posted is all I get after doing 'python setup.py build' in > 'newscipy'. 
the > '/usr/lib/python2.3/site-packages/scipy/lib/lapack_lite.so' file is > created when installing 'newcore', where the relevant lines say > -- > building extension "scipy.lib.lapack_lite" sources > adding 'scipy/corelib/lapack_lite/lapack_litemodule.c' to sources. > -- That's not the relevant line. We need the lines where lapack_lite.so is getting linked. This isn't an issue with newscipy; you should get the same error simply importing scipy after installing scipy_core. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From cimrman3 at ntc.zcu.cz Mon Oct 10 13:03:15 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 10 Oct 2005 19:03:15 +0200 Subject: [SciPy-dev] newscipy build problem In-Reply-To: <434A9AC3.9090601@ucsd.edu> References: <434A7840.20900@ntc.zcu.cz> <434A9530.1010003@csun.edu> <434A99FE.7060909@ntc.zcu.cz> <434A9AC3.9090601@ucsd.edu> Message-ID: <434A9ED3.7040805@ntc.zcu.cz> Robert Kern wrote: > Robert Cimrman wrote: > > >>The output I posted is all I get after doing 'python setup.py build' in >>'newscipy'. the >>'/usr/lib/python2.3/site-packages/scipy/lib/lapack_lite.so' file is >>created when installing 'newcore', where the relevant lines say >>-- >>building extension "scipy.lib.lapack_lite" sources >> adding 'scipy/corelib/lapack_lite/lapack_litemodule.c' to sources. >>-- > > > That's not the relevant line. We need the lines where lapack_lite.so is > getting linked. This isn't an issue with newscipy; you should get the > same error simply importing scipy after installing scipy_core. > whoops, you are right. but scipy_core used to work... I must have broken something. How do I get then the lines where lapack_lite.so is getting linked? 'python -v setup.py build' in 'newcore' does not show it. r. 
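As an aside, Stephen's grep can be scripted once the build output has been captured (e.g. with 'python setup.py build > build.log 2>&1'). A small illustrative helper, written in modern Python and not part of scipy.distutils; the log format assumed is the usual distutils-style compiler/linker echo:

```python
def find_link_lines(log_text, target="lapack_lite.so"):
    """Return the lines of a captured build log where *target* appears
    as the output of a '-o' flag, i.e. the linker invocations that
    produced it. Purely a sketch for this thread."""
    hits = []
    for line in log_text.splitlines():
        tokens = line.split()
        for i, tok in enumerate(tokens[:-1]):
            if tok == "-o" and tokens[i + 1].endswith(target):
                hits.append(line)
                break
    return hits
```

Running 'nm -u' on the resulting .so and finding s_wsfe among the undefined symbols, with no g2c library on the link line, would point at the missing -lg2c.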
From oliphant at ee.byu.edu Mon Oct 10 13:06:52 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 10 Oct 2005 11:06:52 -0600 Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 In-Reply-To: <434A9909.6000402@stsci.edu> References: <4346D7EE.9010604@stsci.edu> <4346DA2B.9060708@ee.byu.edu> <4346DC55.5000904@stsci.edu> <4346FE27.3020902@ee.byu.edu> <434A9909.6000402@stsci.edu> Message-ID: <434A9FAC.5080702@ee.byu.edu>

>compile options: '-Ibuild/src/scipy/base/src -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/usr/stsci/pyssgx/Python-2.4.1/include/python2.4 -c'
>cc: build/src/scipy/base/src/umathmodule.c
>"scipy/base/include/scipy/arrayobject.h", line 84: warning: typedef redeclared: ushort
>"scipy/base/include/scipy/arrayobject.h", line 85: warning: typedef redeclared: uint
>"scipy/base/include/scipy/arrayobject.h", line 86: warning: typedef redeclared: ulong
>"build/src/scipy/base/src/umathmodule.c", line 7010: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7034: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7070: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7082: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7106: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7156: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7182: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7221: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7234: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7237: cannot recover from previous errors
>cc: acomp failed for build/src/scipy/base/src/umathmodule.c
>"scipy/base/include/scipy/arrayobject.h", line 84: warning: typedef redeclared: ushort
>"scipy/base/include/scipy/arrayobject.h", line 85: warning: typedef redeclared: uint
>"scipy/base/include/scipy/arrayobject.h", line 86: warning: typedef redeclared: ulong
>"build/src/scipy/base/src/umathmodule.c", line 7010: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7034: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7070: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7082: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7106: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7156: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7182: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7221: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7234: syntax error before or at: )
>"build/src/scipy/base/src/umathmodule.c", line 7237: cannot recover from previous errors
>cc: acomp failed for build/src/scipy/base/src/umathmodule.c
>error: Command "cc -DNDEBUG -O -Ibuild/src/scipy/base/src -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/usr/stsci/pyssgx/Python-2.4.1/include/python2.4 -c build/src/scipy/base/src/umathmodule.c -o build/temp.solaris-2.8-sun4u-2.4/build/src/scipy/base/src/umathmodule.o" failed with exit status 2
>removed scipy/__svn_version__.py
>removed scipy/f2py2e/__svn_version__.py

O.K. It looks like we solved one problem, and found another. There was a typo in umathmodule.c.src. You are picking it up because you don't have isnan on your system. It should be fixed in SVN now.
-Travis From rkern at ucsd.edu Mon Oct 10 13:12:43 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 10 Oct 2005 10:12:43 -0700 Subject: [SciPy-dev] newscipy build problem In-Reply-To: <434A9ED3.7040805@ntc.zcu.cz> References: <434A7840.20900@ntc.zcu.cz> <434A9530.1010003@csun.edu> <434A99FE.7060909@ntc.zcu.cz> <434A9AC3.9090601@ucsd.edu> <434A9ED3.7040805@ntc.zcu.cz> Message-ID: <434AA10B.1020700@ucsd.edu> Robert Cimrman wrote: > whoops, you are right. but scipy_core used to work... I must have broken > something. How do I get then the lines where lapack_lite.so is > getting linked? 'python -v setup.py build' in 'newcore' does not show it. Try deleting the build directory and rebuilding. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From cimrman3 at ntc.zcu.cz Mon Oct 10 13:27:25 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 10 Oct 2005 19:27:25 +0200 Subject: [SciPy-dev] newscipy build problem In-Reply-To: <434AA10B.1020700@ucsd.edu> References: <434A7840.20900@ntc.zcu.cz> <434A9530.1010003@csun.edu> <434A99FE.7060909@ntc.zcu.cz> <434A9AC3.9090601@ucsd.edu> <434A9ED3.7040805@ntc.zcu.cz> <434AA10B.1020700@ucsd.edu> Message-ID: <434AA47D.7080907@ntc.zcu.cz> Robert Kern wrote: > Robert Cimrman wrote: > > >>whoops, you are right. but scipy_core used to work... I must have broken >>something. How do I get then the lines where lapack_lite.so is >>getting linked? 'python -v setup.py build' in 'newcore' does not show it. > > > Try deleting the build directory and rebuilding. stupid me, I thought that running 'python setup.py clean' does remove all build files... so thank you, problem solved, now build/install runs ok :-) (I have also upgraded to python 2.4.1 in the meantime...) r. 
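For reference, the fix Robert landed on can be captured in a tiny sketch: plain 'setup.py clean' (without --all) leaves the built libraries under build/ behind, so removing the whole tree by hand is what forces a genuinely fresh build.

```python
import os
import shutil

def force_clean(build_dir="build"):
    """Remove the entire distutils build tree (the effect of
    'setup.py clean --all'), so the next 'python setup.py build'
    starts from scratch with nothing stale cached."""
    if os.path.isdir(build_dir):
        shutil.rmtree(build_dir)
    return not os.path.exists(build_dir)
```

Equivalent to 'rm -rf build' before rebuilding.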
From chanley at stsci.edu Mon Oct 10 13:34:58 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Mon, 10 Oct 2005 13:34:58 -0400 Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 In-Reply-To: <434A9FAC.5080702@ee.byu.edu> References: <4346D7EE.9010604@stsci.edu> <4346DA2B.9060708@ee.byu.edu> <4346DC55.5000904@stsci.edu> <4346FE27.3020902@ee.byu.edu> <434A9909.6000402@stsci.edu> <434A9FAC.5080702@ee.byu.edu> Message-ID: <434AA642.2000404@stsci.edu> O.K. I have checked out the changes from SVN and made another attempt on Solaris. I have continued to use the modified config.h file. I once again seem to be getting complaints about some of the C99 functions. The build log is attached. Chris p.s. I did "touch" the config.h file prior to attempting the rebuild so that it wouldn't get rebuilt. Travis Oliphant wrote: > > O.K. It looks like we solved one problem, and found another. There was > a typo in umathmodule.c.src > > You are picking it up because you don't have isnan on your system. It > should be fixed in SVN now. > > -Travis > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: solaris_cc_isnanfix_build.log URL: From oliphant at ee.byu.edu Mon Oct 10 17:35:12 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 10 Oct 2005 15:35:12 -0600 Subject: [SciPy-dev] [SciPy-user] install location of the newcore header files In-Reply-To: <434ABFF9.3020209@ucsd.edu> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABFF9.3020209@ucsd.edu> Message-ID: <434ADE90.9090907@ee.byu.edu> Robert Kern wrote: >Gerard Vermeulen wrote: > > >>I finally discovered where the header files get installed when reading CAPI.txt; >>on the scip-dev list it is argued that the Python include directory is not a >>standard place for headers. 
>> >> > >No one is arguing that it's not standard. I'm arguing that it's a *bad* >place because frequently people can't install to it. > > > >>I understand that some people cannot write to the Python include directory, >>but distutils can take care of that in principle, see >>http://python.org/doc/2.4.2/inst/search-path.html >> >> > >Yes, that solves installing the headers to a new location. But people >then have to modify every setup.py script for modules that use scipy >headers to point to the new location. > > So, why not have the get_headers_dir() function in scipy.distutils, but just install to the "standard" place by default and only install to the non-standard place if needed. This seems like it would make everybody happy. -Travis From rkern at ucsd.edu Mon Oct 10 17:52:53 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 10 Oct 2005 14:52:53 -0700 Subject: [SciPy-dev] [SciPy-user] install location of the newcore header files In-Reply-To: <434ADE90.9090907@ee.byu.edu> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434ABFF9.3020209@ucsd.edu> <434ADE90.9090907@ee.byu.edu> Message-ID: <434AE2B5.1070307@ucsd.edu> Travis Oliphant wrote: > So, why not have the get_headers_dir() function in scipy.distutils, but > just install to the "standard" place by default and only install to the > non-standard place if needed. This seems like it would make everybody > happy. I foresee people neglecting to use the get_headers_dir() function. How does one test whether headers can be installed to the main Python include directory? The most robust strawman I can think of works as follows: The headers are always installed in the package. They are also added with config.add_headers(). get_headers_dir() checks for their existence in $prefix/include/pythonX.Y/scipy/ and uses the package headers otherwise. Personally, I think that's more confusing, but I'll tolerate it. 
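Robert's strawman could look something like the following sketch. All names and paths here are hypothetical illustrations, not the actual scipy.distutils API, and it is written in modern Python rather than the 2.3-era code under discussion:

```python
import os
import sys

def get_headers_dir(package_include=None):
    """Strawman resolution order: prefer headers installed system-wide
    under $prefix/include/pythonX.Y/scipy/; otherwise fall back to the
    copy always shipped inside the package (supplied by the caller,
    since the in-package location is itself an assumption here)."""
    system_dir = os.path.join(
        sys.prefix, "include",
        "python%d.%d" % sys.version_info[:2], "scipy")
    if os.path.isdir(system_dir):
        return system_dir
    return package_include
```

A third-party setup.py would then pass include_dirs=[get_headers_dir(package_include=...)], which is exactly the extra step Robert expects people to neglect.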
-- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Tue Oct 11 02:39:28 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 11 Oct 2005 01:39:28 -0500 (CDT) Subject: [SciPy-dev] install location of the newcore header files In-Reply-To: <434AE2B5.1070307@ucsd.edu> References: <20051010210259.029a6a1f.gerard.vermeulen@grenoble.cnrs.fr> <434AE2B5.1070307@ucsd.edu> Message-ID: I find this thread of where headers should be installed a bit strange. First, as a developer, I don't care where the headers are installed as long as it is well documented how to access them. Second, there are currently two choices: whether to install them to the headers directory or to the package directory. As far as I know now, the first choice would cause problems for certain setups and the second choice would work for all camps. So, for me it would be natural to go with the second choice. It is simple and works for all. There are arguments that no matter what, we should follow Python recommendations, even if the standard code is broken. For scipy it would mean adding fixes to scipy.distutils - that's additional code that we need to maintain, and we would also have to watch that it behaves the same as new versions of distutils that might contain the same fixes. Note that we cannot drop the corresponding patches in scipy.distutils because it should work with older Python versions (2.3 and up) as well. So, I think the trouble of strictly following Python conventions is not worth it, as we already have a working solution. Finally, about the suggestion of installing header files to both places, hmm, that sounds insane, IMHO. Again, this is even more code to maintain as well as a source of confusion when debugging.
Suggestion: leave the current behavior as it is and when platforms/setups/whatever that currently have problems with installing header files to standard places, are fixed to handle these places or Python folks work something out for them, then we switch over to installing headers to recommended locations. this is my 2 estonian cents, Pearu From cimrman3 at ntc.zcu.cz Tue Oct 11 04:49:06 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 11 Oct 2005 10:49:06 +0200 Subject: [SciPy-dev] update: sparse matrices + UMFPACK Message-ID: <434B7C82.2080403@ntc.zcu.cz> I have just got a message from Joachim Dahl (http://www.ee.ucla.edu/~vandenbe/cvxopt) about their implementation of (sparse) matrices and umfpack 4.4 bindings. He has provided me kindly with their prerelease code to get inspiration. r. From jh at oobleck.astro.cornell.edu Tue Oct 11 10:33:50 2005 From: jh at oobleck.astro.cornell.edu (Joe Harrington) Date: Tue, 11 Oct 2005 10:33:50 -0400 Subject: [SciPy-dev] ASP status, docs, and supporting development Message-ID: <200510111433.j9BEXo0p013993@oobleck.astro.cornell.edu> After SciPy '04, Perry, Janet, and a few others and I put together a roadmap and framework, called ASP, for people to contribute to all the parts of SciPy other than the software: docs, packaging, website, etc. There was much interest expressed, and a few people even signed on the electronic dotted line to be involved: http://www.scipy.org/wikis/accessible_scipy/AccessibleSciPy The roadmap is laid out in this thread, which is linked in the first paragraph of the page above: http://www.scipy.org/mailinglists/mailman?fn=scipy-dev/2004-October/002412.html The first thing we did was to gather all the external projects using SciPy that we could find, and make an index page. 
The community found them and Fernando Perez did the hard work (more than he bargained for) of collating everything into the index page: http://www.scipy.org/wikis/topical_software/TopicalSoftware That is now a live wiki page, so if your project isn't on there, please add it! We were gearing up for effort #2, a web-site overhaul, early this year, when Travis announced his intention to heal the rift between the small- and large-array communities. We held up our push on web development, which was a few days from kickoff, so that it wouldn't take volunteers from his more-crucial effort. We all know the story of last year: Travis worked like crazy, called for volunteers, got only a few, and finished the job anyway. Now he's publishing a book in hopes of supporting some of his work from the revenues. He's made it clear that he will not be offended by or opposed to free docs, and may even contribute to them. He's also still Head Nummie and is leading the hard work of testing and bug swatting. Of course, we all want him to continue, and most of us freely admit our getting more than we are giving, and our gratitude to Travis, Todd, Robert, and the other core developers. Meanwhile, we have the problem of needing some basic docs that are free, and there seems to be quite a bit of interest in the community for doing one or more free docs. This seems to have more community energy behind it now than a web overhaul, so let's do it. The wiki and procedures are all set up to do some docs, and have been for about a year now. 
If you're interested in doing some docs, either as the lead author of a doc, as a contributor, or as a reviewer, please sign up on http://www.scipy.org/wikis/accessible_scipy/AccessibleSciPy The goal of the signups page is to make it easy for people to find each other: for lead authors to find people who will help them, for the community to identify who is taking part in what efforts, for low-level-of-effort volunteers to become hooked up with bigger projects, etc. There should be dozens of names there, not just three! So: If you're interested in LEADING A DOC, please add your name to the page, make a page for your doc on the wiki, and hang it off the main page, as "Scipy Cookbook" has done (there's a help link at the top of the page with instructions). A project can be anything from writing a collaborative book from scratch, to writing a monograph, to editing and revising existing docs. Announce your project on scipy-dev. If you would like to do a little work but not take the lead on something, you can contribute to the Cookbook or sign up to be a reviewer or contributor, either on an existing doc or at large. Or, contact a doc lead directly and sign up under that project. Please read the roadmap for ideas of docs we thought the community needs. The roadmap document is meant to be amended. For example, is the idea of using the docstrings to make a full-blown reference manual a good idea? I think so, since it's a rather self-updating format, but it will require some substantial work to get them all fleshed out and up to par. Discuss plans, changes, and ongoing efforts for docs on the scipy-dev mailing list. It would be nice to have each project have a home on scipy.org, and Plone has excellent workflow-management tools. But, it's ok to home a project on your own site and just put a link on scipy.org. Finally, PLEASE everyone buy Travis's book if you can! Wait for the price to go up. Buy copies for everyone in your lab, if you can afford that.
Buy one for your grandmother. It looks like it will be a really nice book, but it's more than that. This is a way to support development, which everyone desperately needs but few have the time (and fewer the skill and experience) to do. If you're at a company that benefits from SciPy and that hasn't already contributed resources to the effort, please consider parting with a larger chunk of change, either by buying more copies or by hiring Travis or others directly to do ongoing maintenance and package development. --jh-- From oliphant at ee.byu.edu Tue Oct 11 15:32:03 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 11 Oct 2005 13:32:03 -0600 Subject: [SciPy-dev] Found 64-bit bug. New beta to come out tonight. Message-ID: <434C1333.3010906@ee.byu.edu> And there was cause for great rejoicing... Arnd Becker helped me track down a bug in the ufuncobject.c code that was only showing up on his 64-bit system (but would likely have shown up on other systems as well). The recent SVN of newcore passes most of the tests (there are a couple too-stringent tests that don't pass because of numerical noise). But it looks like we are ready for another beta release. Please make any commits as soon as you can. I'll make a release tonight (about 7 hours from now). -Travis From oliphant at ee.byu.edu Tue Oct 11 17:26:46 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 11 Oct 2005 15:26:46 -0600 Subject: [SciPy-dev] Thoughts on when to make a full release of SciPy Core Message-ID: <434C2E16.7090902@ee.byu.edu> I've been thinking about when to make a full release of SciPy Core. This is what I think needs to be done (in random order): * Get full scipy working on top of it. * Get records.py working. * Get ma.py (Masked Arrays) working well. * Write some tests for fft, linear algebra, and random code * Port memory-mapped module from numarray. 
Would like to have done (but not essential for first release): * Fix scalar_as_number implementations to not go through ufuncs * Optimizations From rkern at ucsd.edu Tue Oct 11 17:29:55 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 11 Oct 2005 14:29:55 -0700 Subject: [SciPy-dev] Thoughts on when to make a full release of SciPy Core In-Reply-To: <434C2E16.7090902@ee.byu.edu> References: <434C2E16.7090902@ee.byu.edu> Message-ID: <434C2ED3.3020701@ucsd.edu> Travis Oliphant wrote: > I've been thinking about when to make a full release of SciPy Core. > > This is what I think needs to be done (in random order): > > * Get full scipy working on top of it. > * Get records.py working. > * Get ma.py (Masked Arrays) working well. > * Write some tests for fft, linear algebra, and random code > * Port memory-mapped module from numarray. > > Would like to have done (but not essential for first release): > > * Fix scalar_as_number implementations to not go through ufuncs > * Optimizations * Port nd_image -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Tue Oct 11 17:55:44 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 11 Oct 2005 15:55:44 -0600 Subject: [SciPy-dev] Is scipy_core requiring a FORTRAN compiler? In-Reply-To: <434C30D5.3040705@stsci.edu> References: <434C30D5.3040705@stsci.edu> Message-ID: <434C34E0.2030000@ee.byu.edu> Christopher Hanley wrote: > Hi Travis, > > I was under the impression that there was no FORTRAN code in > scipy_core. However, I recently tried re-building on a system that > had an improperly defined F77 environment variable and the build > failed. Is this only occurring because I have ATLAS and BLAS on my > system or is there now FORTRAN code in SCIPY_CORE? No, no Fortran code. If anything, it is scipy.distutils choking. SciPy distutils supports Fortran compilers but should not need them for scipy core.
So, I'm not sure what the problem is. Show us the build log. What is the error? By the way, where do we stand on the Solaris build? I was distracted trying to resolve some issues on a 64-bit system. -Travis From chanley at stsci.edu Tue Oct 11 18:03:56 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Tue, 11 Oct 2005 18:03:56 -0400 Subject: [SciPy-dev] Is scipy_core requiring a FORTRAN compiler? In-Reply-To: <434C34E0.2030000@ee.byu.edu> References: <434C30D5.3040705@stsci.edu> <434C34E0.2030000@ee.byu.edu> Message-ID: <434C36CC.1050707@stsci.edu> > No, no fortran code, If anything it is scipy.distutils choking. SciPy > distutils supports Fortran compilers but should not need them for scipy > core. So, I'm not sure what the problem is. Show us the build log. I have attached the build log. Keep in mind that the F77 environment variable is currently pointed at an invalid FORTRAN compiler. I was just surprised that the setup.py would care about any FORTRAN compiler for scipy_core. Chris -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: build.log URL: From chanley at stsci.edu Tue Oct 11 18:06:24 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Tue, 11 Oct 2005 18:06:24 -0400 Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 In-Reply-To: <434AA642.2000404@stsci.edu> References: <4346D7EE.9010604@stsci.edu> <4346DA2B.9060708@ee.byu.edu> <4346DC55.5000904@stsci.edu> <4346FE27.3020902@ee.byu.edu> <434A9909.6000402@stsci.edu> <434A9FAC.5080702@ee.byu.edu> <434AA642.2000404@stsci.edu> Message-ID: <434C3760.3070705@stsci.edu> Travis, I still cannot build scipy_core on Solaris even with your latest round of fixes. I seem to be getting C99-related error messages again. I'm attaching the build log from my last attempt. Chris -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... 
Name: solaris_cc_isnanfix_build.log URL: From oliphant at ee.byu.edu Tue Oct 11 18:30:01 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 11 Oct 2005 16:30:01 -0600 Subject: [SciPy-dev] Is scipy_core requiring a FORTRAN compiler? In-Reply-To: <434C36CC.1050707@stsci.edu> References: <434C30D5.3040705@stsci.edu> <434C34E0.2030000@ee.byu.edu> <434C36CC.1050707@stsci.edu> Message-ID: <434C3CE9.9030604@ee.byu.edu> Christopher Hanley wrote: > > I have attached the build log. Keep in mind that the F77 environment > variable is currently pointed at an invalid FORTRAN compiler. I was > just surprised that the setup.py would care about any FORTRAN compiler > for scipy_core. > >Setting PTATLAS=ATLAS > NOT AVAILABLE > >atlas_blas_info: > NOT AVAILABLE > >blas_info: > FOUND: > libraries = ['blas'] > library_dirs = ['/usr/lib'] > language = f77 > > FOUND: > libraries = ['blas'] > library_dirs = ['/usr/lib'] > define_macros = [('NO_ATLAS_INFO', 1)] > language = f77 > >lapack_opt_info: >atlas_threads_info: >Setting PTATLAS=ATLAS >scipy.distutils.system_info.atlas_threads_info > NOT AVAILABLE > >atlas_info: >scipy.distutils.system_info.atlas_info > NOT AVAILABLE > >lapack_info: > FOUND: > libraries = ['lapack'] > library_dirs = ['/usr/lib'] > language = f77 > > FOUND: > libraries = ['lapack', 'blas'] > library_dirs = ['/usr/lib'] > define_macros = [('NO_ATLAS_INFO', 1)] > language = f77 > > I could be wrong, but I think it is the fact that blas_info and lapack_info are returning that the language of your BLAS and LAPACK libraries is f77 that is the issue. Presumably, if you have Fortran-compiled BLAS and LAPACK libraries, then the f77 linker will be called to link them to _dotblas.c So, I'm not sure this is a problem (except with Chris's environment variables). Is there anything we can do to make this easier for the user?? 
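One way to make Chris's failure mode friendlier — a stale F77 environment variable derailing a build that needs no Fortran at all — would be a pre-flight sanity check. This is a hypothetical sketch in modern Python (`shutil.which` postdates this thread, and scipy.distutils performed no such check):

```python
import os
import shutil

# Hypothetical pre-flight check (not part of scipy.distutils): if F77 is
# set but does not name a real executable, warn before the build machinery
# tries to run it and fails with a confusing error.
f77 = os.environ.get("F77")
if f77 and shutil.which(f77) is None:
    print("warning: F77=%r is not an executable on PATH; "
          "unset it or point it at a working Fortran compiler" % f77)
```

Running this with `F77` pointing at a nonexistent compiler would print the warning instead of letting the build fail later in an obscure place.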
-Travis From oliphant at ee.byu.edu Tue Oct 11 18:33:04 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 11 Oct 2005 16:33:04 -0600 Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 In-Reply-To: <434C3760.3070705@stsci.edu> References: <4346D7EE.9010604@stsci.edu> <4346DA2B.9060708@ee.byu.edu> <4346DC55.5000904@stsci.edu> <4346FE27.3020902@ee.byu.edu> <434A9909.6000402@stsci.edu> <434A9FAC.5080702@ee.byu.edu> <434AA642.2000404@stsci.edu> <434C3760.3070705@stsci.edu> Message-ID: <434C3DA0.9000802@ee.byu.edu> Christopher Hanley wrote: > Travis, > > I still cannot build scipy_core on Solaris even with your latest round > of fixes. I seem to be getting C99 related error messages again. I'm > attaching the build log from my last attempt. O.K. Your system does not have the longdouble and floatingpoint math functions (or at least we need different headers to get them). Let's go back and remove the HAVE_LONGDOUBLE_FUNCS and HAVE_FLOAT_FUNCS lines from your config.h file and try again. From rkern at ucsd.edu Tue Oct 11 20:31:07 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 11 Oct 2005 17:31:07 -0700 Subject: [SciPy-dev] matrix and ravel() Message-ID: <434C594B.40501@ucsd.edu> In [1]: A = array([[1,2],[3,4]]) In [2]: B = mat(A) In [3]: ravel(A) Out[3]: array([1, 2, 3, 4]) In [4]: ravel(B) Out[4]: matrix([[1, 2, 3, 4]]) This appears to be a side effect of the new __array_finalize__ magic. I'm not sure that it's desirable. I'm also not sure that it's desirable to change it. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From Fernando.Perez at colorado.edu Tue Oct 11 21:26:36 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Tue, 11 Oct 2005 19:26:36 -0600 Subject: [SciPy-dev] matrix and ravel() In-Reply-To: <434C594B.40501@ucsd.edu> References: <434C594B.40501@ucsd.edu> Message-ID: <434C664C.400@colorado.edu> Robert Kern wrote: > In [1]: A = array([[1,2],[3,4]]) > > In [2]: B = mat(A) > > In [3]: ravel(A) > Out[3]: array([1, 2, 3, 4]) > > In [4]: ravel(B) > Out[4]: matrix([[1, 2, 3, 4]]) > > This appears to be a side effect of the new __array_finalize__ magic. > I'm not sure that it's desirable. I'm also not sure that it's desirable > to change it. Well, at least I hope it gets changed, or a lot of code which expects ravel to produce a flat, one-d array will blow to pieces. Cheers, f From rkern at ucsd.edu Tue Oct 11 21:51:52 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 11 Oct 2005 18:51:52 -0700 Subject: [SciPy-dev] Modernizing code Message-ID: <434C6C38.7040701@ucsd.edu> For both scipy_core and the full scipy, I would like to make these two changes: * Make all classes new-style classes * Change malloc/free/etc. to their PyMem equivalents The former is, I think, uncontroversial. The latter *may* have an effect on performance, so I'm curious if it has already been considered. If there's no objection, then I'll get started on cleaning things up as soon as I can. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From chanley at stsci.edu Tue Oct 11 22:02:14 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Tue, 11 Oct 2005 22:02:14 -0400 Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 In-Reply-To: <434C3DA0.9000802@ee.byu.edu> Message-ID: <200510120202.COS19507@stsci.edu> > O.K. Your system does not have the longdouble and floatingpoint math > functions (or at least we need different headers to get them). 
> > Let's go back and remove the > > HAVE_LONGDOUBLE_FUNCS > > and > > HAVE_FLOAT_FUNCS > > lines > > from your config.h > > file and try again. > O.K. That did the trick. The config.h file now looks like this: /* #define SIZEOF_SHORT 2 */ /* #define SIZEOF_INT 4 */ /* #define SIZEOF_LONG 4 */ /* #define SIZEOF_FLOAT 4 */ /* #define SIZEOF_DOUBLE 8 */ #define SIZEOF_LONG_DOUBLE 16 #define SIZEOF_PY_INTPTR_T 4 /* #define SIZEOF_LONG_LONG 8 */ #define SIZEOF_PY_LONG_LONG 8 /* #define CHAR_BIT 8 */ #define MATHLIB m #define HAVE_INVERSE_HYPERBOLIC_FLOAT All we need now is a setup.py file that can generate this automatically. Thanks, Chris From cookedm at physics.mcmaster.ca Tue Oct 11 22:23:34 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Tue, 11 Oct 2005 22:23:34 -0400 Subject: [SciPy-dev] Modernizing code In-Reply-To: <434C6C38.7040701@ucsd.edu> (Robert Kern's message of "Tue, 11 Oct 2005 18:51:52 -0700") References: <434C6C38.7040701@ucsd.edu> Message-ID: Robert Kern writes: > For both scipy_core and the full scipy, I would like to make these two > changes: > > * Make all classes new-style classes > > * Change malloc/free/etc. to their PyMem equivalents > > The former is, I think, uncontroversial. The latter *may* have an effect > on performance, so I'm curious if it has already been considered. I doubt it. PyMem_Malloc, for instance, is a simple wrapper around calling PyMem_MALLOC, which (when PYMALLOC_DEBUG is not defined) is just a macro calling malloc. The PyMem_New, etc. macros would also clean things up. > If there's no objection, then I'll get started on cleaning things up as > soon as I can. I'd add: * use void * instead of char * for non-string pointers. This would allow getting rid of a lot of (char *) casts. I can help with that (and the others). -- |>|\/|< /--------------------------------------------------------------------------\ |David M. 
Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From rkern at ucsd.edu Tue Oct 11 22:40:37 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 11 Oct 2005 19:40:37 -0700 Subject: [SciPy-dev] matrix bug Message-ID: <434C77A5.70304@ucsd.edu> [~]$ cat test.py from scipy import * A = array([[1., 2.], [3., 4.]]) mA = matrix(A) print 3*mA print 3*A print mA*mA print 3*mA print 3*A [~]$ python test.py [[ 3. 6.] [ 9. 12.]] [[ 3. 6.] [ 9. 12.]] [[ 3. 6.] [ 24. 34.]] [[ 3. 6.] [ 9. 12.]] ---------------------------------------- I can't reproduce this at the prompt although it is showing up in the unit tests I'm writing. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rkern at ucsd.edu Tue Oct 11 22:47:42 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 11 Oct 2005 19:47:42 -0700 Subject: [SciPy-dev] matrix bug In-Reply-To: <434C77A5.70304@ucsd.edu> References: <434C77A5.70304@ucsd.edu> Message-ID: <434C794E.4080001@ucsd.edu> Even better: In [1]: A = array([[1., 2.], ...: [3., 4.]]) In [2]: mA = matrix(A) In [20]: dot(3, mA) Out[20]: matrix([[ 3., 6.], [ 9., 12.]]) In [21]: dot(3, mA) Out[21]: matrix([[ 3., 6.], [ 9., 12.]]) In [22]: dot(3, mA) Out[22]: matrix([[ 3.00000000e+00, 6.00000000e+00], [ 7.30927399e+92, 1.20000000e+01]]) In [23]: dot(3, A) Out[23]: array([[ 3.00000000e+00, 6.00000000e+00], [ 6.60362578e+91, 1.20000000e+01]]) In [24]: dot(3, A) Out[24]: array([[ 3., 6.], [ 9., 12.]]) -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From oliphant at ee.byu.edu Tue Oct 11 22:49:58 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 11 Oct 2005 20:49:58 -0600 Subject: [SciPy-dev] matrix and ravel() In-Reply-To: <434C594B.40501@ucsd.edu> References: <434C594B.40501@ucsd.edu> Message-ID: <434C79D6.4000002@ee.byu.edu> Robert Kern wrote: >In [1]: A = array([[1,2],[3,4]]) > >In [2]: B = mat(A) > >In [3]: ravel(A) >Out[3]: array([1, 2, 3, 4]) > >In [4]: ravel(B) >Out[4]: matrix([[1, 2, 3, 4]]) > >This appears to be a side effect of the new __array_finalize__ magic. >I'm not sure that it's desirable. I'm also not sure that it's desirable >to change it. > > Not really a side-effect. An intended feature. Matrices are supposed to be matrices (i.e. 2-d arrays). They cannot be 1-d arrays no matter how hard you try.... If you want to ravel a matrix and get a 1-d array out, you are not using matrices right.... Use a 1-d array (i.e. the .A attribute). It does bring up an issue, though, and may require some re-thinking. asarray(a) will return a if a is any sub-class of a (big)ndarray. Thus, asarray(a) will return a matrix if a is a matrix. Thus, a = asarray(a) return a*a will do matrix multiplication if a matrix is passed in, but element-by-element multiplication if another array is passed in. asndarray(a) will always return a base-class array (even 0-d arrays). One option, I suppose, is to change the meaning of asarray to asndarray (thus destroying the ability for matrices to persevere through many operations) --- and introduce another function like asanyarray or something to mean any sub-class is O.K. too. Comments welcome. 
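The `asanyarray` spelling floated above is what later NumPy releases adopted. A minimal sketch of the `asarray`/`asanyarray` distinction in modern NumPy, with a toy subclass (`MyMatrix` is a stand-in for illustration, not the real matrix class):

```python
import numpy as np

class MyMatrix(np.ndarray):
    """Toy stand-in for a matrix-like ndarray subclass."""
    pass

m = np.array([[1, 2], [3, 4]]).view(MyMatrix)

# asarray coerces any input down to the base ndarray class,
# so subclass-specific semantics are stripped...
assert type(np.asarray(m)) is np.ndarray

# ...while asanyarray lets ndarray subclasses pass through untouched,
# so matrix-like behaviour would survive a round trip.
assert np.asanyarray(m) is m
```

This is exactly the trade-off described in the email: library code that calls `asarray` forces base-class (elementwise) semantics, while `asanyarray` preserves whatever the subclass defines.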
-Travis From oliphant at ee.byu.edu Tue Oct 11 22:52:49 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 11 Oct 2005 20:52:49 -0600 Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 In-Reply-To: <200510120202.COS19507@stsci.edu> References: <200510120202.COS19507@stsci.edu> Message-ID: <434C7A81.7040701@ee.byu.edu> Christopher Hanley wrote: >>O.K. Your system does not have the longdouble and floatingpoint math >>functions (or at least we need different headers to get them). >> >>Let's go back and remove the >> >>HAVE_LONGDOUBLE_FUNCS >> >>and >> >>HAVE_FLOAT_FUNCS >> >>lines >> >>from your config.h >> >>file and try again. >> >> >> > >O.K. That did the trick. The config.h file now looks like this: > >/* #define SIZEOF_SHORT 2 */ >/* #define SIZEOF_INT 4 */ >/* #define SIZEOF_LONG 4 */ >/* #define SIZEOF_FLOAT 4 */ >/* #define SIZEOF_DOUBLE 8 */ >#define SIZEOF_LONG_DOUBLE 16 >#define SIZEOF_PY_INTPTR_T 4 >/* #define SIZEOF_LONG_LONG 8 */ >#define SIZEOF_PY_LONG_LONG 8 >/* #define CHAR_BIT 8 */ >#define MATHLIB m >#define HAVE_INVERSE_HYPERBOLIC_FLOAT > > > Remind us what does get produced automatically... -Travis From rkern at ucsd.edu Tue Oct 11 23:11:58 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 11 Oct 2005 20:11:58 -0700 Subject: [SciPy-dev] matrix and ravel() In-Reply-To: <434C79D6.4000002@ee.byu.edu> References: <434C594B.40501@ucsd.edu> <434C79D6.4000002@ee.byu.edu> Message-ID: <434C7EFE.3060308@ucsd.edu> Travis Oliphant wrote: > Robert Kern wrote: > >>In [1]: A = array([[1,2],[3,4]]) >> >>In [2]: B = mat(A) >> >>In [3]: ravel(A) >>Out[3]: array([1, 2, 3, 4]) >> >>In [4]: ravel(B) >>Out[4]: matrix([[1, 2, 3, 4]]) >> >>This appears to be a side effect of the new __array_finalize__ magic. >>I'm not sure that it's desirable. I'm also not sure that it's desirable >>to change it. > > Not really a side-effect. An intended feature. Matrices are supposed > to be matrices (i.e. 2-d arrays). 
They cannot be 1-d arrays no matter > how hard you try.... If you want to ravel a matrix and get a 1-d array > out, you are not using matrices right.... Use a 1-d array (i.e. the .A > attribute). It's kind of an Immovable Object meets Irresistible Force deal. I expect ravel() to always give me a 1-d array, and I also expect matrix objects to always be 2-d. I'm willing to concede the fact that all of my desires are mutually contradictory in this regard. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Tue Oct 11 22:51:52 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 11 Oct 2005 20:51:52 -0600 Subject: [SciPy-dev] Modernizing code In-Reply-To: <434C6C38.7040701@ucsd.edu> References: <434C6C38.7040701@ucsd.edu> Message-ID: <434C7A48.9000701@ee.byu.edu> Robert Kern wrote: >For both scipy_core and the full scipy, I would like to make these two >changes: > > * Make all classes new-style classes > > Fine with me... > * Change malloc/free/etc. to their PyMem equivalents > > Not a real problem. I'm interested in performance too. You'll notice the macros in arrayobject.h that let you change the underlying memory allocation call immediately (there are still a few places that use malloc alone). >The former is, I think, uncontroversial. The latter *may* have an effect >on performance, so I'm curious if it has already been considered. > >If there's no objection, then I'll get started on cleaning things up as >soon as I can. > > Great... 
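The subclass persistence being debated in the ravel() thread above hinges on `__array_finalize__`: every view or wrapped result re-runs it, carrying subclass identity and attributes along. A minimal modern-NumPy sketch with a toy subclass (unlike the matrix class, this toy does not force results back to 2-d):

```python
import numpy as np

class Tagged(np.ndarray):
    """Toy subclass: __array_finalize__ runs whenever a view or wrapped
    result is created, which is how subclass attributes propagate."""
    def __array_finalize__(self, obj):
        # obj is the parent array (None only for explicit construction)
        self.tag = getattr(obj, "tag", "untagged")

t = np.arange(6).reshape(2, 3).view(Tagged)
t.tag = "mine"

# ravel() goes through asanyarray, so the subclass (and its tag) survive:
r = np.ravel(t)
assert isinstance(r, Tagged)
assert r.tag == "mine"
assert r.ndim == 1
```

The matrix class layers one more step on top of this: its finalize/wrap logic reshapes results back to 2-d, which is why `ravel(B)` in the original report came back as a 1xN matrix rather than a flat array.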
-Travis From oliphant at ee.byu.edu Tue Oct 11 23:05:49 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 11 Oct 2005 21:05:49 -0600 Subject: [SciPy-dev] matrix bug In-Reply-To: <434C77A5.70304@ucsd.edu> References: <434C77A5.70304@ucsd.edu> Message-ID: <434C7D8D.2030809@ee.byu.edu> Robert Kern wrote: >[~]$ cat test.py >from scipy import * >A = array([[1., 2.], > [3., 4.]]) >mA = matrix(A) > >print 3*mA >print 3*A >print > > > I'm getting something weird too. It's in the _dotblas dot function. I'll look into it. -Travis From oliphant at ee.byu.edu Wed Oct 12 00:56:46 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 11 Oct 2005 22:56:46 -0600 Subject: [SciPy-dev] How do I get the doc directory packaged? Message-ID: <434C978E.9070506@ee.byu.edu> I don't know why it's so hard for me to figure this out. We have the doc directory. How can we get this packaged? I've tried config.add_data_files, config.add_data_dir. Is there anyway to place the files under scipy/doc in the installation directory. Or do we have to move the doc directory under scipy on the svn server to do it? Thanks for any help? -Travis From nwagner at mecha.uni-stuttgart.de Wed Oct 12 02:38:37 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 12 Oct 2005 08:38:37 +0200 Subject: [SciPy-dev] scipy.base.__version__ 0.4.3.1247 Message-ID: <434CAF6D.6050508@mecha.uni-stuttgart.de> >>> from scipy import stats Traceback (most recent call last): File "", line 1, in ? File "/usr/local/lib/python2.4/site-packages/scipy/stats/__init__.py", line 7, in ? from stats import * File "/usr/local/lib/python2.4/site-packages/scipy/stats/stats.py", line 1503, in ? import distributions File "/usr/local/lib/python2.4/site-packages/scipy/stats/distributions.py", line 845, in ? 
anglit = anglit_gen(a=-pi/4,b=pi/4,name='anglit', extradoc=""" NameError: name 'pi' is not defined From oliphant at ee.byu.edu Wed Oct 12 02:08:28 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 00:08:28 -0600 Subject: [SciPy-dev] New beta released Message-ID: <434CA85C.9080105@ee.byu.edu> I made a new beta release (I tagged the svn tree this time.. beta-0.4.2 under tags). I have not made a windows binary, because it's not working on my system. My system is giving me strange errors and I don't trust it. The last windows binary I made was hanging under PythonWin. Is there anybody who can compile under windows and try out the binary? Thanks, -Travis From arnd.baecker at web.de Wed Oct 12 03:58:14 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 12 Oct 2005 09:58:14 +0200 (CEST) Subject: [SciPy-dev] modulo operation and new scipy core Message-ID: Hi, one thing which I find irritating is the behaviour of the modulo operation for arrays: Compare: In [1]: from scipy import * In [2]: -0.4 % 1.0 Out[2]: 0.59999999999999998 In [3]: x=arange(-0.6,1.0,0.1) In [4]: x%1.0 Out[4]: array([ -6.00000000e-01, -5.00000000e-01, -4.00000000e-01, -3.00000000e-01, -2.00000000e-01, -1.00000000e-01, 1.11022302e-16, 1.00000000e-01, 2.00000000e-01, 3.00000000e-01, 4.00000000e-01, 5.00000000e-01, 6.00000000e-01, 7.00000000e-01, 8.00000000e-01, 9.00000000e-01]) Even worse (IMHO): take a scalar (I know it is still an array, but it does not look like one ;-) from the array In [5]: x[2] Out[5]: -0.39999999999999997 In [6]: x[2] % 1.0 Out[6]: -0.39999999999999997 It seems that for arrays % behaves like `fmod` and not like `mod`. I find this confusing as it is in contrast to the python 2.4 documentation: "5.6. Binary arithmetic operations" """The % (modulo) operator yields the remainder from the division of the first argument by the second. [...] The arguments may be floating point numbers, e.g., 3.14%0.7 equals 0.34 (since 3.14 equals 4*0.7 + 0.34.) 
The modulo operator always yields a result with the same sign as its second operand (or zero); the absolute value of the result is strictly smaller than the absolute value of the second operand.""" Would it be possible for the new scipy core that % behaves the same (standard python) way for scalars and for arrays? Best, Arnd From oliphant at ee.byu.edu Wed Oct 12 04:28:39 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 02:28:39 -0600 Subject: [SciPy-dev] lib2def? Message-ID: <434CC937.5080808@ee.byu.edu> What is the status of lib2def? It doesn't seem to be needed in Python2.4, but in Python 2.3 is it necessary? I noticed it is not in the new scipy.distutils. -Travis From oliphant at ee.byu.edu Wed Oct 12 04:41:56 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 02:41:56 -0600 Subject: [SciPy-dev] Windows problem solved Message-ID: <434CCC54.5090005@ee.byu.edu> I solved my windows problem by deleting cygwin and installing msys and mingw separately. Now, the binaries build and seem stable. I had to get lib2def out of scipy_distutils and run it on python23.lib to construct libpython23.a for Python2.3 on Windows. I did not have to do that for Python2.4 on Windows. Are we going to include lib2def.py with scipy.distutils? 
-Travis From oliphant at ee.byu.edu Wed Oct 12 04:46:11 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 02:46:11 -0600 Subject: [SciPy-dev] modulo operation and new scipy core In-Reply-To: References: Message-ID: <434CCD53.8080307@ee.byu.edu> Arnd Baecker wrote: >Hi, > >one thing which I find irritating is the behaviour of >the modulo operation for arrays: > >Compare: > >In [1]: from scipy import * >In [2]: -0.4 % 1.0 >Out[2]: 0.59999999999999998 >In [3]: x=arange(-0.6,1.0,0.1) >In [4]: x%1.0 >Out[4]: >array([ -6.00000000e-01, -5.00000000e-01, -4.00000000e-01, > -3.00000000e-01, -2.00000000e-01, -1.00000000e-01, > 1.11022302e-16, 1.00000000e-01, 2.00000000e-01, > 3.00000000e-01, 4.00000000e-01, 5.00000000e-01, > 6.00000000e-01, 7.00000000e-01, 8.00000000e-01, > 9.00000000e-01]) > >Even worse (IMHO): take a scalar (I know it is still an array, >but it does not look like one ;-) from the array > > Actually it is a real scalar (it's just using the array math right now). >In [5]: x[2] >Out[5]: -0.39999999999999997 > >In [6]: x[2] % 1.0 >Out[6]: -0.39999999999999997 > >It seems that for arrays % behaves like `fmod` and not like `mod`. > > > Yes, that has been the behavior of Numeric. There is the mod function for arrays. Should we switch that? It will cause a couple of incompatibilities if people relied on the old (arguably) non-standard behavior. >I find this confusing as it is in contrast to the >python 2.4 documentation: > >"5.6. Binary arithmetic operations" > > """The % (modulo) operator yields the remainder from the division > of the first argument by the second. [...] > The arguments may be floating point numbers, e.g., > 3.14%0.7 equals 0.34 (since 3.14 equals 4*0.7 + 0.34.) 
> The modulo operator always yields a result with the same sign as > its second operand (or zero); the absolute value of the result > is strictly smaller than the absolute value of the second > operand.""" > >Would it be possible for the new scipy core that % behaves >the same (standard python) way for scalars and for arrays? > > I would do this. What do others think? -Travis From pearu at scipy.org Wed Oct 12 03:47:44 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 12 Oct 2005 02:47:44 -0500 (CDT) Subject: [SciPy-dev] Windows problem solved In-Reply-To: <434CCC54.5090005@ee.byu.edu> References: <434CCC54.5090005@ee.byu.edu> Message-ID: On Wed, 12 Oct 2005, Travis Oliphant wrote: > > I solved my windows problem by deleting cygwin and installing msys and > mingw separately. > > Now, the binaries build and seem stable. > > I had to get lib2def out of scipy_distutils and run it on python23.lib > to construct libpython23.a for Python2.3 on Windows. I did not have to > do that for Python2.4 on Windows. > > Are we going to include lib2def.py with scipy.distutils? Only if necessary for Python 2.3. I'll try to install Python 2.3 and test it. If somebody can test it earlier, let me know. Pearu From pearu at scipy.org Wed Oct 12 03:51:20 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 12 Oct 2005 02:51:20 -0500 (CDT) Subject: [SciPy-dev] How do I get the doc directory packaged? In-Reply-To: <434C978E.9070506@ee.byu.edu> References: <434C978E.9070506@ee.byu.edu> Message-ID: On Tue, 11 Oct 2005, Travis Oliphant wrote: > > I don't know why it's so hard for me to figure this out. > > We have the doc directory. How can we get this packaged? > > I've tried config.add_data_files, config.add_data_dir. Is there > anyway to place the files under > scipy/doc in the installation directory. Or do we have to move the doc > directory under scipy on the svn server to do it? > > Thanks for any help? 
Moving doc under scipy would work immidiately (I see you have already done that). Otherwise, adding the following line to scipy/setup.py should have worked: config.add_data_files(('doc','../doc/*.txt')) Also, with the current svn scipy.distutils, config.add_data_dir(('doc','../doc')) works. However, the current solution is just fine. Pearu From oliphant at ee.byu.edu Wed Oct 12 05:04:43 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 03:04:43 -0600 Subject: [SciPy-dev] modulo operation and new scipy core In-Reply-To: References: Message-ID: <434CD1AB.8070003@ee.byu.edu> Arnd Baecker wrote: >Hi, > >one thing which I find irritating is the behaviour of >the modulo operation for arrays: > >Compare: > >In [1]: from scipy import * >In [2]: -0.4 % 1.0 >Out[2]: 0.59999999999999998 >In [3]: x=arange(-0.6,1.0,0.1) >In [4]: x%1.0 >Out[4]: >array([ -6.00000000e-01, -5.00000000e-01, -4.00000000e-01, > -3.00000000e-01, -2.00000000e-01, -1.00000000e-01, > 1.11022302e-16, 1.00000000e-01, 2.00000000e-01, > 3.00000000e-01, 4.00000000e-01, 5.00000000e-01, > 6.00000000e-01, 7.00000000e-01, 8.00000000e-01, > > > 9.00000000e-01]) > > By the way (get current SVN first). you can change the behavior of all the array operations using. from scipy.base.multiarray import set_numeric_ops For example (after from scipy import mod) set_numeric_ops(remainder=mod) will give you the behavior you seek. -Travis From pearu at scipy.org Wed Oct 12 05:11:09 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 12 Oct 2005 04:11:09 -0500 (CDT) Subject: [SciPy-dev] Windows problem solved In-Reply-To: References: <434CCC54.5090005@ee.byu.edu> Message-ID: On Wed, 12 Oct 2005, Pearu Peterson wrote: > > > On Wed, 12 Oct 2005, Travis Oliphant wrote: > >> >> I solved my windows problem by deleting cygwin and installing msys and >> mingw separately. >> >> Now, the binaries build and seem stable. 
>> >> I had to get lib2def out of scipy_distutils and run it on python23.lib >> to construct libpython23.a for Python2.3 on Windows. I did not have to >> do that for Python2.4 on Windows. >> >> Are we going to include lib2def.py with scipy.distutils? > > Only if necessary for Python 2.3. I'll try to install Python 2.3 and test > it. If somebody can test it earlier, let me know. Yes, we need lib2def.py for Python 2.3 to build libpython23.a. It was not needed for Python 2.4 because it ships already libpython24.a. I have also added mingw32ccompiler.py to svn that fixes few other things. Pearu From arnd.baecker at web.de Wed Oct 12 06:50:32 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 12 Oct 2005 12:50:32 +0200 (CEST) Subject: [SciPy-dev] modulo operation and new scipy core In-Reply-To: <434CD1AB.8070003@ee.byu.edu> References: <434CD1AB.8070003@ee.byu.edu> Message-ID: On Wed, 12 Oct 2005, Travis Oliphant wrote: > Arnd Baecker wrote: > > >Hi, > > > >one thing which I find irritating is the behaviour of > >the modulo operation for arrays: > > > >Compare: > > > >In [1]: from scipy import * > >In [2]: -0.4 % 1.0 > >Out[2]: 0.59999999999999998 > >In [3]: x=arange(-0.6,1.0,0.1) > >In [4]: x%1.0 > >Out[4]: > >array([ -6.00000000e-01, -5.00000000e-01, -4.00000000e-01, > > -3.00000000e-01, -2.00000000e-01, -1.00000000e-01, > > 1.11022302e-16, 1.00000000e-01, 2.00000000e-01, > > 3.00000000e-01, 4.00000000e-01, 5.00000000e-01, > > 6.00000000e-01, 7.00000000e-01, 8.00000000e-01, > > > > > > 9.00000000e-01]) > By the way (get current SVN first). > > you can change the behavior of all the array operations using. > > from scipy.base.multiarray import set_numeric_ops > > For example (after from scipy import mod) > > set_numeric_ops(remainder=mod) > > will give you the behavior you seek. Wow - that's cool! Still I am in faviour of % behaving the same way for arrays and scalars as default. 
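The default Arnd argues for is what modern NumPy settled on: `%` on arrays follows Python's floored-remainder sign convention, while `fmod` remains available for the C-style behaviour. A quick comparison:

```python
import math
import numpy as np

# Python's % takes the sign of the divisor (floored division)...
print(-0.4 % 1.0)            # 0.6
# ...while fmod takes the sign of the dividend (C semantics).
print(math.fmod(-0.4, 1.0))  # -0.4

# Modern NumPy arrays follow the Python convention for %:
x = np.array([-0.4])
print(x % 1.0)               # [0.6]
print(np.fmod(x, 1.0))       # [-0.4]
```

So the scalar/array discrepancy Arnd demonstrated no longer exists; `np.fmod` (or `set_numeric_ops` in the scipy core of this era) covers code that relied on the old behaviour.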
(And with `set_numeric_ops(remainder=fmod)` one can get the old behaviour, for those who rely on that). Arnd From schofield at ftw.at Wed Oct 12 09:08:57 2005 From: schofield at ftw.at (Ed Schofield) Date: Wed, 12 Oct 2005 15:08:57 +0200 Subject: [SciPy-dev] Introductions, sparse matrix support Message-ID: <434D0AE9.9070707@ftw.at> Hi all, First of all, let me congratulate Travis, Robert, Pearu, and the other core developers for their great work on SciPy and SciPy Core. I'd like to introduce myself and express my interest in getting involved. I'm the author of a toolkit for maximum entropy modelling (with the misleading Sourceforge project name 'textmodeller') that uses SciPy and PySparse. I've been reading the scipy-dev list for about a year with interest. Now I'd like to get involved in SciPy development, starting with improving the sparse matrix module and, when that's in shape, contributing my maxent module to SciPy (if you want it!) I've made small contributions to PySparse in the past and briefly discussed the idea of merging PySparse into SciPy with Roman Geus by email. His comment in April was: "So far I have not had contacts with the scipy people, and admittedly I have never even installed scipy. When creating pysparse (some years ago now) I liked the idea of smaller, well-defined packages more than a big collection of dependent packages." I broadly agree with this philosophy, and it has served PySparse development well until now, but in this case I think that there's value to the community in having a well-integrated core of matrix functionality (sparse and dense) and one set of prerequisites and interfaces rather than several. Jonathan mentioned that he'd collect some stats on how pysparse and scipy.sparse stack up. 
If we're cherry-picking among different sparse matrix implementations, here's a qualitative assessment: PySparse has several advantages, including a mature and apparently well-written code base, a simple interface, some good documentation, even a good set of unit tests. These were the reasons I adopted it for my maximum entropy module 1-2 years ago. It has some momentum too as a project, with a small but steady number of patches flowing in since Roman stopped active development a couple of years ago. I'd be happy to work with Jonathan, Dan, and Robert on sparse matrix support in SciPy, integrating PySparse code with Travis and Robert's UMFPACK wrapper. Perhaps we could even find support among other PySparse developers. PySparse currently depends on Numeric, and will need porting to SciPy Core eventually... From aisaac at american.edu Wed Oct 12 09:14:22 2005 From: aisaac at american.edu (Alan G Isaac) Date: Wed, 12 Oct 2005 09:14:22 -0400 Subject: [SciPy-dev] matrix and ravel() In-Reply-To: <434C7EFE.3060308@ucsd.edu> References: <434C594B.40501@ucsd.edu> <434C79D6.4000002@ee.byu.edu> <434C7EFE.3060308@ucsd.edu> Message-ID: > Travis Oliphant wrote: >> Not really a side-effect. An intended feature. Matrices are supposed >> to be matrices (i.e. 2-d arrays). They cannot be 1-d arrays no matter >> how hard you try.... If you want to ravel a matrix and get a 1-d array >> out, you are not using matrices right.... Use a 1-d array (i.e. the .A >> attribute). On Tue, 11 Oct 2005, Robert Kern apparently wrote: > It's kind of an Immovable Object meets Irresistable Force deal. I expect > ravel() to always give me a 1-d array, and I also expect matrix object > to always by 2-d. I'm willing to concede the fact that all of my desires > are mutually contradictory in this regard. User comments: 1. Better support for matrices is very welcome; I use them a lot. 2. I expect ravel to return a 1-d array, not a matrix. 
My background is a matrix programming language (GAUSS from before its N-dimensional array support), and yet the behavior of ravel never surprised me.
3. I expect row and column vectorization functions (named e.g. vecr and vecc) to do row and column vectorization. I do not have a strong opinion on whether what is really wanted is a vec function that takes an axis argument, although this is probably right. In particular, both row and column vectorization of matrices are common, and this is just a choice of axis.

So as a user I suggest that this new functionality of ravel creates unwanted surprises, in addition to breaking past code, and, as long as vec is available, implies no missing functionality. The same goes double for asarray: it will be a really unwanted surprise if it does not return an array. fwiw, Alan Isaac

From chanley at stsci.edu Wed Oct 12 11:15:58 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Wed, 12 Oct 2005 11:15:58 -0400 Subject: [SciPy-dev] Cannot build scipy_core (newcore) on Solaris 8 In-Reply-To: <434C7A81.7040701@ee.byu.edu> References: <200510120202.COS19507@stsci.edu> <434C7A81.7040701@ee.byu.edu> Message-ID: <434D28AE.2080505@stsci.edu>

Travis Oliphant wrote:
>> O.K. That did the trick. The config.h file now looks like this:
>>
>> /* #define SIZEOF_SHORT 2 */
>> /* #define SIZEOF_INT 4 */
>> /* #define SIZEOF_LONG 4 */
>> /* #define SIZEOF_FLOAT 4 */
>> /* #define SIZEOF_DOUBLE 8 */
>> #define SIZEOF_LONG_DOUBLE 16
>> #define SIZEOF_PY_INTPTR_T 4
>> /* #define SIZEOF_LONG_LONG 8 */
>> #define SIZEOF_PY_LONG_LONG 8
>> /* #define CHAR_BIT 8 */
>> #define MATHLIB m
>> #define HAVE_INVERSE_HYPERBOLIC_FLOAT
>
> Remind us what does get produced automatically...
> -Travis

The config.h file produced by the current version of scipy_core contains:

/* #define SIZEOF_SHORT 2 */
/* #define SIZEOF_INT 4 */
/* #define SIZEOF_LONG 4 */
/* #define SIZEOF_FLOAT 4 */
/* #define SIZEOF_DOUBLE 8 */
#define SIZEOF_LONG_DOUBLE 16
#define SIZEOF_PY_INTPTR_T 4
/* #define SIZEOF_LONG_LONG 8 */
#define SIZEOF_PY_LONG_LONG 8
/* #define CHAR_BIT 8 */
#define MATHLIB m
#define HAVE_INVERSE_HYPERBOLIC
#define HAVE_ISNAN

Chris

From Fernando.Perez at colorado.edu Wed Oct 12 11:54:06 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 12 Oct 2005 09:54:06 -0600 Subject: [SciPy-dev] modulo operation and new scipy core In-Reply-To: <434CCD53.8080307@ee.byu.edu> References: <434CCD53.8080307@ee.byu.edu> Message-ID: <434D319E.5020305@colorado.edu>

Travis Oliphant wrote:
> I would do this. What do others think?

+1 (principle of least surprise and all that) Cheers, f

From oliphant at ee.byu.edu Wed Oct 12 14:49:42 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 12:49:42 -0600 Subject: [SciPy-dev] modulo operation and new scipy core In-Reply-To: References: <434CD1AB.8070003@ee.byu.edu> Message-ID: <434D5AC6.10407@ee.byu.edu>

Arnd Baecker wrote:
>> set_numeric_ops(remainder=mod)
>>
>> will give you the behavior you seek.
>
> Wow - that's cool!
>
> Still I am in favour of % behaving the same way for arrays
> and scalars as default.

Done. People who need the old behavior can set their numeric ops as they wish. -Travis

From oliphant at ee.byu.edu Wed Oct 12 15:00:42 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 13:00:42 -0600 Subject: [SciPy-dev] Introductions, sparse matrix support In-Reply-To: <434D0AE9.9070707@ftw.at> References: <434D0AE9.9070707@ftw.at> Message-ID: <434D5D5A.8000106@ee.byu.edu>

Ed Schofield wrote:
> Hi all,
>
> First of all, let me congratulate Travis, Robert, Pearu, and the other
> core developers for their great work on SciPy and SciPy Core.
> I'd like to introduce myself and express my interest in getting involved.

We'd love to have you help.

> I'm the author of a toolkit for maximum entropy modelling (with the
> misleading Sourceforge project name 'textmodeller') that uses SciPy and
> PySparse. I've been reading the scipy-dev list for about a year with
> interest. Now I'd like to get involved in SciPy development, starting
> with improving the sparse matrix module and, when that's in shape,
> contributing my maxent module to SciPy (if you want it!)

This sounds very interesting to me personally. I suspect we would be glad to have it.

> I've made small contributions to PySparse in the past and briefly
> discussed the idea of merging PySparse into SciPy with Roman Geus by
> email. His comment in April was:
>
> "So far I have not had contacts to the scipy people, and admittedly I
> have never even installed scipy. When creating pysparse (some years ago
> now) I liked the idea of smaller well-defined packages more than a big
> collection of dependent packages."

We are trying to advertise more that scipy is really just a namespace. Packages do not have to depend on each other, but inevitably some will. In the (near) future it should be easier for developers to simply have their package be compatible with SciPy. If they want to make it part of the SciPy tree, then great. Or, they can still manage it separately but make it easily installable with the rest of scipy (and easily buildable by people who want to distribute a large install of many packages). Any package that used to depend on Numeric will find it much easier to integrate with scipy once it converts to scipy core.

> I broadly agree with this philosophy, and it has served PySparse
> development well until now, but in this case I think that there's value
> to the community in having a well-integrated core of matrix
> functionality (sparse and dense) and one set of prerequisites and
> interfaces rather than several.
Yes, I would like to see some standardization in the sparse stuff. With you, Robert, Dan, and Jonathan coming aboard, it looks like there is a critical mass of people who are able to make it happen.

> I'd be happy to work with Jonathan, Dan, and Robert on sparse matrix
> support in SciPy, integrating PySparse code with Travis and Robert's
> UMFPACK wrapper. Perhaps we could even find support among other
> PySparse developers. PySparse currently depends on Numeric, and will
> need porting to SciPy Core eventually...

Sounds great. I'm very excited about this. Just let me know when you need SVN access. -Travis

From chanley at stsci.edu Wed Oct 12 15:11:52 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Wed, 12 Oct 2005 15:11:52 -0400 Subject: [SciPy-dev] Thoughts on when to make a full release of SciPy Core In-Reply-To: <434C2ED3.3020701@ucsd.edu> References: <434C2E16.7090902@ee.byu.edu> <434C2ED3.3020701@ucsd.edu> Message-ID: <434D5FF8.2040409@stsci.edu>

Robert Kern wrote:
> Travis Oliphant wrote:
>> I've been thinking about when to make a full release of SciPy Core.
>>
>> This is what I think needs to be done (in random order):
>>
>> * Get full scipy working on top of it.
>> * Get records.py working.

We at STScI have started working on the records.py port. I don't want to make a promise as to when this will be completed. I should have a better idea within the next week or so.

>> * Get ma.py (Masked Arrays) working well.
>> * Write some tests for fft, linear algebra, and random code
>> * Port memory-mapped module from numarray.
>>
>> Would like to have done (but not essential for first release):
>>
>> * Fix scalar_as_number implementations to not go through ufuncs
>> * Optimizations
>
> * Port nd_image

If you see the porting of records and nd_image as bottlenecks in terms of getting a full release of SciPy Core done quickly, you may want to consider doing a two-stage release. Release 1.0 could be billed as being fully Numeric compatible.
The 2.0 release would be with the complete set of numarray features. Chris

-- Christopher Hanley Senior Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338

From rkern at ucsd.edu Wed Oct 12 15:21:22 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 12 Oct 2005 12:21:22 -0700 Subject: [SciPy-dev] Introductions, sparse matrix support In-Reply-To: <434D5D5A.8000106@ee.byu.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> Message-ID: <434D6232.30008@ucsd.edu>

Travis Oliphant wrote:
> We are trying to advertise more that scipy is really just a namespace.
> Packages do not have to depend on each other, but inevitably some will.
> In the (near) future it should be easier for developers to simply have
> their package be compatible with SciPy. If they want to make it part
> of the SciPy tree, then great. Or, they can still manage it separately
> but make it easily installable with the rest of scipy (and easily
> buildable by people who want to distribute a large install of many
> packages).
>
> Any package that used to depend on Numeric will find it much easier to
> integrate with scipy once they convert to scipy core.

Indeed. Except for signal, there are actually very few interdependencies. Here is my dependency analysis so far:

cluster:     scipy_core scipy_stats    # scipy_stats is only needed for std() and cov()
fftpack:     scipy_core
integrate:   scipy_core scipy_special
interpolate: scipy_core
io:          scipy_core
lib:         scipy_core
linalg:      scipy_core scipy_lib      # scipy_utils for limits
optimize:    scipy_core                # scipy_utils for limits
signal:      scipy_core scipy_special scipy_interpolate scipy_integrate scipy_optimize
sparse:      scipy_core
special:     scipy_core
stats:       scipy_core scipy_optimize
utils:       scipy_core scipy_special

I have more opinions on this, scipy package organization, and What Belongs In Scipy, but I think I've used up my opinion budget for the week.
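Robert's note that scipy_stats is needed only for std() and cov() can be checked against the core package alone; here is a sketch using today's numpy names (the 2005 scipy_core spellings may have differed):

```python
import numpy as np

# The only two statistics that cluster pulls from scipy_stats have
# counterparts in the core array package itself, so that dependency
# could in principle be dropped.
data = np.array([[1.0, 2.0, 3.0, 4.0],
                 [2.0, 4.0, 6.0, 8.0]])

stds = data.std(axis=1)   # per-row standard deviation, shape (2,)
cov = np.cov(data)        # covariance matrix of the two rows, shape (2, 2)

print(stds.shape, cov.shape)   # (2,) (2, 2)
```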
-- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter

From oliphant at ee.byu.edu Wed Oct 12 15:30:43 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 13:30:43 -0600 Subject: [SciPy-dev] Introductions, sparse matrix support In-Reply-To: <434D6232.30008@ucsd.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D6232.30008@ucsd.edu> Message-ID: <434D6463.4020107@ee.byu.edu>

Robert Kern wrote:
> I have more opinions on this, scipy package organization, and What
> Belongs In Scipy, but I think I've used up my opinion budget for the week.

I don't know, people who contribute code get much, much larger opinion budgets.... ;-)

We are all waiting for your insights :-) -Travis

From pearu at scipy.org Wed Oct 12 15:49:59 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 12 Oct 2005 14:49:59 -0500 (CDT) Subject: [SciPy-dev] Introductions, sparse matrix support In-Reply-To: <434D6463.4020107@ee.byu.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D6232.30008@ucsd.edu> <434D6463.4020107@ee.byu.edu> Message-ID:

On Wed, 12 Oct 2005, Travis Oliphant wrote:
> Robert Kern wrote:
>> I have more opinions on this, scipy package organization, and What
>> Belongs In Scipy, but I think I've used up my opinion budget for the week.
>
> I don't know, people who contribute code get much, much larger opinion
> budgets.... ;-)
>
> We are all waiting for your insights :-)

Indeed, I fully agree with Travis. What Belongs In Scipy could also be stated as Who Is Maintaining What In Scipy. If developers could state which packages they are interested in maintaining and developing in the future, that would ease the decision of moving certain parts of scipy either to the sandbox or dropping them altogether.
And maybe we should start a detailed review process for each Scipy package to define which parts of it are state-of-the-art, partly implemented, needs-tests, old-but-working, broken, obsolete, etc. That would also help people jump into the scipy development process more easily. Pearu

From oliphant at ee.byu.edu Wed Oct 12 16:59:07 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 14:59:07 -0600 Subject: [SciPy-dev] New 0.4.2 beta release of scipy core Message-ID: <434D791B.9060809@ee.byu.edu>

I made another beta release of scipy core last night. There are windows binaries for Python 2.4 and Python 2.3. If you are already a user of scipy, the new __init__ file installed for newcore will break your current installation of scipy (but the problem with linalg, fftpack, and stats is no longer there). There have been many improvements:

- bug fixes (including 64-bit fixes)
- threading support fixes
- optimizations
- more random numbers (thanks Robert Kern)
- more distutils fixes (thanks Pearu Peterson)

More tests are welcome. We are moving towards a release (but still need to get Masked Arrays working and all of scipy to build on top of the new scipy core before a full release). -Travis

From oliphant at ee.byu.edu Wed Oct 12 16:36:41 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 14:36:41 -0600 Subject: [SciPy-dev] matrix and ravel() In-Reply-To: References: <434C594B.40501@ucsd.edu> <434C79D6.4000002@ee.byu.edu> <434C7EFE.3060308@ucsd.edu> Message-ID: <434D73D9.5040802@ee.byu.edu>

Alan G Isaac wrote:
>> Travis Oliphant wrote:
>>> Not really a side-effect. An intended feature. Matrices are supposed
>>> to be matrices (i.e. 2-d arrays). They cannot be 1-d arrays no matter
>>> how hard you try.... If you want to ravel a matrix and get a 1-d array
>>> out, you are not using matrices right.... Use a 1-d array (i.e. the .A
>>> attribute).
>
> 1. Better support for matrices is very welcome; I use them a lot.
> 2.
I expect ravel to return a 1-d array, not a matrix. My
> background is a matrix programming language (GAUSS from
> before its N-dimensional array support), and yet the
> behavior of ravel never surprised me.

I'm less concerned about this issue because it is so easy to get an array out of a matrix.

> 3. I expect row and column vectorization functions (named
> e.g. vecr and vecc) to do row and column vectorization.
> I do not have a strong opinion on whether what is really
> wanted is a vec function that takes an axis argument,
> although this is probably right. In particular, both row
> and column vectorization of matrices are common, and this
> is just a choice of axis.

These are m.T.ravel() and m.ravel() right now. To get a 1-d array it is m.A.ravel().

> The same goes double for asarray: it will be a really
> unwanted surprise if it does not return an array.

I'm much more concerned about this issue. I would like to hear more people weigh in on what should be the behavior here. In particular, here are two proposals.

1) current behavior
   array (and asarray) --- returns the object itself if it is a sub-class of the array (or an array scalar).
   asndarray --- returns an actual ndarray object

2) new behavior
   array (and asarray) --- always return an ndarray base-class object (or a big-nd array if the object is already one).
   asanyarray --- returns an ndarray or a sub-class if the object is already a sub-class.

Please weigh in...
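For reference, the behaviour that eventually shipped in numpy (the descendant of this code) matches proposal 2; a sketch using today's names and the now-discouraged np.matrix class:

```python
import numpy as np

m = np.matrix([[1, 2], [3, 4]])

# ravel() preserves the matrix subclass, so the result stays 2-d...
print(m.ravel().shape)      # (1, 4)
# ...while going through the .A attribute yields a true 1-d ndarray:
print(m.A.ravel().shape)    # (4,)

# Proposal 2 is essentially what was adopted: asarray() strips the
# subclass, asanyarray() passes it through.
print(type(np.asarray(m)).__name__)     # ndarray
print(type(np.asanyarray(m)).__name__)  # matrix
```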
--Travis

> fwiw,
> Alan Isaac
>
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev at scipy.net
> http://www.scipy.net/mailman/listinfo/scipy-dev

From Fernando.Perez at colorado.edu Wed Oct 12 18:33:11 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 12 Oct 2005 16:33:11 -0600 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <434A98EB.7030407@ee.byu.edu> References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> Message-ID: <434D8F27.100@colorado.edu>

Travis Oliphant wrote:
>> With all that, my vote on Travis's specific question: if conversion of
>> an N-bit integer in scipy_core is required, it gets converted to an
>> N-bit float. The only cases in which precision will be lost is if the
>> integer is large enough to require more than (N-e) bits for its
>> representation, where e is the number of bits in the exponent of the
>> floating point representation.
>
> Yes, it is only for large integers that problems arise. I like this
> scheme and it would be very easy to implement, and it would provide a
> consistent interface.
>
> The only problem is that it would mean that on current 32-bit systems
> sqrt(2) would cast 2 to a "single-precision" float and return a
> single-precision result.
>
> If that is not a problem, then great...
>
> Otherwise, a more complicated (and less consistent) rule like
>
>     integer    float
>     ================
>     8-bit      32-bit
>     16-bit     32-bit
>     32-bit     64-bit
>     64-bit     64-bit
>
> would be needed (this is also not too hard to do).

Here's a different way to think about this issue: instead of thinking in terms of bit-width, let's look at it in terms of exact vs inexact numbers. Integers are exact, and their bit size only impacts the range that is representable.
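The trade-off being tabulated here can be observed in the numpy that grew out of scipy_core; a sketch for reference (modern numpy behaviour, shown only as an illustration of where the rule landed):

```python
import numpy as np

# 32-bit and 64-bit integer inputs to a floating-point ufunc are both
# upcast to 64-bit doubles in modern numpy:
for itype in (np.int32, np.int64):
    print(itype.__name__, '->', np.sqrt(np.array([2], dtype=itype)).dtype)

# And precision loss really does bite only for large integers: a 64-bit
# double has 53 mantissa bits, so the largest int64 cannot be
# represented exactly.
big = 2**63 - 1
print(int(np.float64(big)) == big)   # False: rounded up to 2**63
```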
If we look at it this way, then seems to me justifiable to suggest that sqrt(2) would upcast to the highest-available precision floating point format. Obviously this can have an enormous memory impact if we're talking about a big array of numbers instead of sqrt(2), so I'm not 100% sure it's the right solution. However, I think that the rule 'if you apply "floating point" operations to integer inputs, the system will upcast the integers to give you as much precision as possible' is a reasonable one. Users needing tight memory control could always first convert their small integers to the smallest existing floats, and then operate on that. Just my 1e-2 Cheers, f From Fernando.Perez at colorado.edu Wed Oct 12 18:36:12 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 12 Oct 2005 16:36:12 -0600 Subject: [SciPy-dev] New 0.4.2 beta release of scipy core In-Reply-To: <434D791B.9060809@ee.byu.edu> References: <434D791B.9060809@ee.byu.edu> Message-ID: <434D8FDC.2050306@colorado.edu> Travis Oliphant wrote: > I made another beta release of scipy core last night. There are > windows binaries for Python 2.4 and Python 2.3. If you are already a Sorry if I missed the blindingly obvious, but a URL for the tarballs (do they exist?) would be most appreciated at this point (I have a friend looking for it, and I'm embarrassed to say that I can't seem to find it). As a hook: this friend may test the build on Itanium boxes, so I'd like to respond to him soonish :) If it's just an SVN pull that's fine, just let me know and I'll pass the info along. 
Cheers, f From charles.harris at sdl.usu.edu Wed Oct 12 18:46:53 2005 From: charles.harris at sdl.usu.edu (Charles R Harris) Date: Wed, 12 Oct 2005 16:46:53 -0600 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <434D8F27.100@colorado.edu> References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> Message-ID: <1129157213.4839.5.camel@E011704> On Wed, 2005-10-12 at 16:33 -0600, Fernando Perez wrote: > Travis Oliphant wrote: > > >>With all that, my vote on Travis's specific question: if conversion of > >>an N-bit integer in scipy_core is required, it gets converted to an > >>N-bit float. The only cases in which precision will be lost is if the > >>integer is large enough to require more than (N-e) bits for its > >>representation, where e is the number of bits in the exponent of the > >>floating point representation. > >> > > > > > > Yes, it is only for large integers that problems arise. I like this > > scheme and it would be very easy to implement, and it would provide a > > consistent interface. > > > > The only problem is that it would mean that on current 32-bit systems > > > > sqrt(2) would cast 2 to a "single-precision" float and return a > > single-precision result. > > > > If that is not a problem, then great... > > > > Otherwise, a more complicated (and less consistent) rule like > > > > integer float > > ============== > > 8-bit 32-bit > > 16-bit 32-bit > > 32-bit 64-bit > > 64-bit 64-bit > > > > would be needed (this is also not too hard to do). > > Here's a different way to think about this issue: instead of thinking in terms > of bit-width, let's look at it in terms of exact vs inexact numbers. Integers > are exact, and their bit size only impacts the range of them which is > representable. 
> If we look at it this way, then seems to me justifiable to suggest that
> sqrt(2) would upcast to the highest-available precision floating point format.
> Obviously this can have an enormous memory impact if we're talking about a
> big array of numbers instead of sqrt(2), so I'm not 100% sure it's the right
> solution. However, I think that the rule 'if you apply "floating point"
> operations to integer inputs, the system will upcast the integers to give you
> as much precision as possible' is a reasonable one. Users needing tight
> memory control could always first convert their small integers to the smallest
> existing floats, and then operate on that.

I think it is a good idea to keep double as the default, if only because Python expects it. If someone needs more control over the precision of arrays, why not do as C does and add functions sqrtf and sqrtl? Chuck

From rkern at ucsd.edu Wed Oct 12 19:15:02 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 12 Oct 2005 16:15:02 -0700 Subject: [SciPy-dev] Package organization In-Reply-To: <434D6463.4020107@ee.byu.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D6232.30008@ucsd.edu> <434D6463.4020107@ee.byu.edu> Message-ID: <434D98F6.5080301@ucsd.edu>

Travis Oliphant wrote:
> Robert Kern wrote:
>> I have more opinions on this, scipy package organization, and What
>> Belongs In Scipy, but I think I've used up my opinion budget for the week.
>
> I don't know, people who contribute code get much, much larger opinion
> budgets.... ;-)

Okay. I warned you.

I would like to see scipy's package organization become flatter and more oriented towards easy, lightweight, modular packaging rather than subject matter. For example, some people want bindings to SUNDIALS for ODEs. They could go into scipy.integrate, but that introduces a large, needless dependency for those who just want to compute integrals. So I would suggest that SUNDIALS bindings would be in their own scipy.sundials package.
As another example, I might also suggest moving the simulated annealing module out into scipy.globalopt along with diffev.py and pso.py that are currently in my sandbox. They're all optimizers, but functionally they are unrelated to the extension-heavy optimizers that make up the remainder of scipy.optimize.

The wavelets library that Fernando is working on would go in as scipy.wavelets rather than being stuffed into scipy.signal. You get the idea.

This is also why I suggested making the scipy_core versions of fftpack and linalg be named scipy.corefft and scipy.corelinalg and not be aliased to scipy.fftpack and scipy.linalg. The "core" names reflect their packaging and their limited functionality. For one thing, this naming allows us to try to import the possibly optimized versions in the full scipy:

    # scipy/corelinalg/__init__.py
    import basic_lite
    svd = basic_lite.singular_value_decomposition
    ...
    try:
        from scipy import linalg
        svd = linalg.svd
        ...
    except ImportError:
        pass

The explicit "core" names help us and others keep control over dependencies. Lots of the scipy subpackages need some linear algebra functions, but AFAICT, none actually require anything beyond what's in scipy_core. With the "core" names, we won't accidentally add a dependency on the full scipy.linalg without due deliberation.

Okay, What Belongs In Scipy. It's somewhat difficult to answer the question, "Does this package belong in scipy?" without having a common answer to, "What is scipy?" I won't pretend to have the single answer to that last question, but I will start the dialogue based on the rationalizations I've come up with to defend my gut feelings.

Things scipy is not:

* A framework. You shouldn't have to restructure your programs to use the algorithms implemented in scipy. Sometimes the algorithms themselves may require it (e.g. reverse communication solvers), but that's not imposed by scipy.

* Everything a scientist will need to do computing.
For a variety of reasons, it's just not an achievable goal and, more importantly, it's not a good standard for making decisions. A lot of scientists need a good RDBMS, but there's no reason to put pysqlite into scipy. Enthon, package repositories, and specialized LiveCDs are better places to collect "everything."

* A plotting library. (Sorry, had to throw that in.)

Things scipy is:

* A loose collection of slightly interdependent modules for numerical computing.

* A common build environment that handles much of the annoying work for numerical extension modules. Does your module rely on a library that needs LAPACK or BLAS? If you put it in scipy, your users can configure the location of their optimized libraries *once*, and all of the scipy modules they build can use that information.

* A good place to put numerical modules that don't otherwise have a good home.

Things scipy *could* be:

* An *excellent* build environment for library-heavy extension modules. To realize this, we would need to integrate the configuration portion of PETSc's BuildSystem or something equivalent. The automatic discovery/download/build works quite well. If this were to be realized, some packages might make more sense as subpackages of scipy. For example, matplotlib and pytables don't have much reason to be part of scipy right now, but if the libraries they depend on could be automatically detected/downloaded/built and shared with other scipy subpackages, then I think it might make sense for them to live in scipy, too.

As Pearu suggested, as we port scipy packages to the new scipy_core we should audit and label them. To that end:

* gui_thread, gplt, and plt are dead, I think.

* xplt shambles along, but shouldn't return as scipy.xplt. It can never be *the* plotting library for scipy, and leaving it as scipy.xplt gives people that impression.

* scipy.cluster is sorta broken and definitely incomplete. We should port over what's in Bio.Cluster.
For that matter, there's quite a bit in biopython we should stea^Wport (Bio.GA, Bio.HMM, Bio.KDTree, Bio.NeuralNetwork, Bio.NaiveBayes, Bio.MaxEntropy, Bio.MarkovModel, Bio.LogisticRegression, Bio.Statistics.lowess).

* The bindings to ODEPACK, QUADPACK, FITPACK, and MINPACK are handwritten. Should we mark them as "f2py when you get the chance"? Otherwise, they probably count as "state-of-the-art", although we could always expand our offerings, like exposing some of the other functions in ODEPACK.

* scipy.optimize: I think I recently ran into a regression in the old scipy. fmin() wasn't finding the minimum of the Rosenbrock function in the tutorial. I'll have to check that again. The simulated annealing code could use some review.

* scipy.special: cephes.round() seems to be buggy depending on the platform, and I think we got a bug report about one of the other functions.

* I will maintain much of scipy.stats. Of course, that will probably mean "throwing anova() into the sandbox never to return." Many of the other functions in stats.py need vetting.

Now I'm sure I've used up my opinion budget.

-- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter

From oliphant at ee.byu.edu Wed Oct 12 21:13:11 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 19:13:11 -0600 Subject: [SciPy-dev] New 0.4.2 beta release of scipy core In-Reply-To: <434D8FDC.2050306@colorado.edu> References: <434D791B.9060809@ee.byu.edu> <434D8FDC.2050306@colorado.edu> Message-ID: <434DB4A7.3@ee.byu.edu>

Fernando Perez wrote:
> Travis Oliphant wrote:
>> I made another beta release of scipy core last night. There are
>> windows binaries for Python 2.4 and Python 2.3. If you are already a
>
> Sorry if I missed the blindingly obvious, but a URL for the tarballs (do they
> exist?)
> would be most appreciated at this point (I have a friend looking for
> it, and I'm embarrassed to say that I can't seem to find it). As a hook: this
> friend may test the build on Itanium boxes, so I'd like to respond to him
> soonish :) If it's just an SVN pull that's fine, just let me know and I'll

Sorry, links to the sourceforge download site are at http://numeric.scipy.org -Travis

From oliphant at ee.byu.edu Wed Oct 12 21:32:44 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 12 Oct 2005 19:32:44 -0600 Subject: [SciPy-dev] [Numpy-discussion] Re: matrix and ravel() In-Reply-To: <434DB553.9040303@noaa.gov> References: <434C594B.40501@ucsd.edu> <434C79D6.4000002@ee.byu.edu> <434C7EFE.3060308@ucsd.edu> <434D73D9.5040802@ee.byu.edu> <434DB553.9040303@noaa.gov> Message-ID: <434DB93C.9010303@ee.byu.edu>

Chris Barker wrote:
> Travis Oliphant wrote:
>> In particular, here are two proposals.
>> 1) current behavior
>> array (and asarray) --- returns the object itself if it is a
>> sub-class of the array (or an array scalar).
>> asndarray --- returns an actual nd array object
>
> -1. I expect asarray to return an array. Period. That's why I use it.
>
>> 2) new behavior
>>
>> array (and asarray) always return an ndarray base-class object (or a
>> big-nd array if the object is already one).
>
> +1 ...I think. What is a big-nd array?

A big-nd array is the parent class of the ndarray object. It doesn't have the sequence protocols or the buffer protocols and should not suffer from the 32-bit limitations of those protocols. An ndarray inherits from the bigarray. Eventually, when Python cleans up its 32-bit limitations, the bigarray will disappear.

>> asanyarray --- returns an nd-array or a sub-class if the object is
>> already a sub-class.
>
> What about having each subclass define its own function -- asmatrix,
> etc? Maybe that's not general enough, but I know I use asarray because
> I know I want a NumPy Array, and nothing else.
I guess the problem is that we are not used to coding to interfaces. I'm going to make the change suggested by the second point, just because I think it's more explicit and will make porting scipy a lot easier. The fact that multiplication could be redefined by a matrix that still passes as an array means that lots of code can choke on matrices. Of course, this will have negative consequences. It will make matrices much less pervasive through function calls "automatically", but it will be safer. People who believe their code is safe for matrices can use asanyarray.

Now, what to do at the C level... Right now PyArray_FromAny(op) returns op if op is any array subclass. I'll probably add a flag to the requires argument that forces the result to be an ndarray. -Travis

From lee.j.joon at gmail.com Thu Oct 13 02:18:00 2005 From: lee.j.joon at gmail.com (Jae-Joon Lee) Date: Thu, 13 Oct 2005 15:18:00 +0900 Subject: [SciPy-dev] New beta released In-Reply-To: <434CA85C.9080105@ee.byu.edu> References: <434CA85C.9080105@ee.byu.edu> Message-ID: <6e8d907b0510122318i450677fap8695666f3df4a4ce@mail.gmail.com>

On 10/12/05, Travis Oliphant wrote:
> I made a new beta release (I tagged the svn tree this time.. beta-0.4.2
> under tags).
>
> I have not made a windows binary, because it's not working on my
> system. My system is giving me strange errors and I don't trust it.
> The last windows binary I made was hanging under PythonWin.
>
> Is there anybody who can compile under windows and try out the binary?
>
> Thanks,
>
> -Travis

-------------- next part -------------- An HTML attachment was scrubbed...
URL: From Fernando.Perez at colorado.edu Thu Oct 13 02:33:53 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 13 Oct 2005 00:33:53 -0600 Subject: [SciPy-dev] Package organization In-Reply-To: <434D98F6.5080301@ucsd.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D6232.30008@ucsd.edu> <434D6463.4020107@ee.byu.edu> <434D98F6.5080301@ucsd.edu> Message-ID: <434DFFD1.9040303@colorado.edu> Robert Kern wrote: > I would like to see scipy's package organization become flatter and more > oriented towards easy, lightweight, modular packaging rather than > subject matter. For example, some people want bindings to SUNDIALS for > ODEs. They could go into scipy.integrate, but that introduces a large, > needless dependency for those who just want to compute integrals. So I > would suggest that SUNDIALS bindings would be in their own > scipy.sundials package. > > As another example, I might also suggest moving the simulated annealing > module out into scipy.globalopt along with diffev.py and pso.py that are > currently in my sandbox. They're all optimizers, but functionally they > are unrelated to the extension-heavy optimizers that make up the > remainder of scipy.optimize. > > The wavelets library that Fernando is working on would go in as > scipy.wavelets rather than being stuffed into scipy.signal. You get the > idea. > > This is also why I suggested making the scipy_core versions of fftpack > and linalg be named scipy.corefft and scipy.corelinalg and not be > aliased to scipy.fftpack and scipy.linalg. The "core" names reflect > their packaging and their limited functionality. For one thing, this > naming allows us to try to import the possibly optimized versions in the > full scipy: [snip excellent analysis of scipy's organizational status ] Let me add a minor twist to your plan, which perhaps may help a little. How about making a two-level distinction between 'scipy, the core package' and 'scipy, the collection of tools'? 
Here's how it could be organized, in terms of namespaces and release policy: whatever is defined as the core is released by the scipy package proper, and can be safely considered a dependency for the rest. Note that this can still be split between scipy_core and 'full scipy', where scipy_core is Travis' new Numeric/numarray and 'full scipy' contains much more functionality. But for packages written by third-party authors, which can live under the scipy namespace as an umbrella and benefit from scipy's build facilities and core libraries, how about putting them all into a 'toolkits' namespace? The actual name, for typing convenience, could be scipy.kits or scipy.tools (something short). This would then give us the following structure:

1. Scipy_core: the new Numeric/numarray package, which includes basic FFT, linear algebra, random numbers and perhaps basic i/o (at least save/load abilities), and whatever else I'm missing right now (I don't have it yet installed on this laptop).

2. Scipy 'full': depends on (1), and exposes all the other scipy names: scipy.{linalg,optimize,integrate,...}. These are libraries considered officially part of scipy, so that even if they are maintained by others (much like python's stdlib), there is a commitment to a common release cycle. These can, if need be, have inter-dependencies, as they will always be released as a whole.

1 and 2 both use the top-level scipy namespace. Then we have:

3. The scipy.{kits|tools} namespace (or whatever the chosen name). This is where third parties can drop their own packages, which can depend on either only (1) or the full (2) system (their level of dependency should be explicitly stated).

The kits namespace may ship empty by default, or it could be populated with a few things from current scipy if it is decided they are best moved there. 
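[Editor's note: the thread never settles on a mechanism for letting independently distributed toolkits share one umbrella namespace. One standard approach is a pkgutil-style namespace package; the sketch below uses hypothetical names (scipy_kits, dist_a, dist_b, foo, bar) and is not part of the proposal itself.]

```python
# Sketch: two independently installed distributions both contribute
# submodules to one umbrella package via pkgutil.extend_path.
# All names here (scipy_kits, dist_a, dist_b, foo, bar) are hypothetical.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
for dist, mod in [("dist_a", "foo"), ("dist_b", "bar")]:
    pkg = os.path.join(root, dist, "scipy_kits")
    os.makedirs(pkg)
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        # Each copy of the umbrella package declares itself extendable.
        f.write("from pkgutil import extend_path\n"
                "__path__ = extend_path(__path__, __name__)\n")
    with open(os.path.join(pkg, mod + ".py"), "w") as f:
        f.write("NAME = {!r}\n".format(mod))
    sys.path.insert(0, os.path.join(root, dist))

# Both toolkits resolve, even though they live in different directories
# and could have been installed by different authors.
from scipy_kits import foo, bar
print(foo.NAME, bar.NAME)
```

Avoiding top-level name collisions inside the shared namespace is exactly the coordination problem the informal "check with scipy-dev first" policy below addresses.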
The only thing required for projects to live in the .kits namespace is really to avoid top-level name collisions, so it would perhaps be worth having an informal policy of people checking with scipy-dev for a name before using it. This layout would allow the core team to work with relative freedom at the top-level namespace, without worrying about toolkits taking names they may need in the future. Similarly, toolkit authors will have a well-defined API to build upon. The criterion for deciding what goes in (2) should be one of generality: tools likely to be of very wide need for most things in scientific work, and which provide a foundation for toolkit authors. If this is combined with a CPAN-like system (eggs, PyPi, whatever), it should be very easy for users, once they have the basic layers in place, to grab a toolkit by issuing a single command or going to a website. I'd suggest, if this were adopted, keeping a simple page at scipy with brief descriptions for each toolkit, even if they are developed/distributed externally. The current 'example package' (the ex-xxx package) could be the prototype for a toolkit, used by new toolkit authors to get off the ground quickly, and by scipy to establish coding and documentation policy for .kits members. If we establish a few conventions to be followed by toolkits, we can ensure that the top-level documentation/info facilities automatically register them (dynamically). Anyway, I've certainly far exceeded my opinion budget on this one, so I should shut up now :) Cheers, f From nwagner at mecha.uni-stuttgart.de Thu Oct 13 02:45:18 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 13 Oct 2005 08:45:18 +0200 Subject: [SciPy-dev] Cannot build newcore --> AttributeError: 'module' object has no attribute 'typeinfo' Message-ID: <434E027E.4050407@mecha.uni-stuttgart.de>

Running from source directory.

Traceback (most recent call last):
  File "setup.py", line 35, in ?
    setup_package()
  File "setup.py", line 9, in setup_package
    from scipy.core_version import version
  File "/var/tmp/svn/newcore/scipy/core_version.py", line 4, in ?
    import base.__svn_version__ as svn
  File "/var/tmp/svn/newcore/scipy/base/__init__.py", line 7, in ?
    import numerictypes as nt
  File "/var/tmp/svn/newcore/scipy/base/numerictypes.py", line 81, in ?
    typeinfo = multiarray.typeinfo
AttributeError: 'module' object has no attribute 'typeinfo'

From arnd.baecker at web.de Thu Oct 13 02:52:09 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 13 Oct 2005 08:52:09 +0200 (CEST) Subject: [SciPy-dev] Package organization In-Reply-To: <434DFFD1.9040303@colorado.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> Message-ID: On Thu, 13 Oct 2005, Fernando Perez wrote: [...] > 3. The scipy.{kits|tools} namespace (or whatever the chosen name). This is > where third parties can drop their own packages, which can depend either only > (1) or on the full (2) system (their level of dependency should be explicitly > stated). > > The kits namespace may ship empty by default, or it could be populated with a > few things from current scipy if it is decided they are best moved there. > > The only thing required for projects to live in the .kits namespace is really > to avoid top-level name collisions, so it would perhaps be worth having an > informal policy of people checking with scipy-dev for a name before using it. > > This layout would allow the core team to work with relative freedom at the > top-level namespace, without worrying about toolkits taking names they may > need in the future. Similarly, toolkit authors will have a well-defined API > to build upon. 
I still like this for two reasons a) it would bring scientific python projects under one umbrella b) the (hopefully) simple user installation: [...] > If this is combined with a CPAN-like system (eggs, PyPi, whatever), it should > be very easy for users, once they have the basic layers in place, to grab a > toolkit by issuing a single command or going to a website. I'd suggest, if > this were adopted, keeping a simple page at scipy with brief descriptions for > each toolkit, even if they are developed/distributed externally. Sounds very nice! I have no experience with eggs: where do these go when a user cannot install them as root? (Of course, a PYTHONPATH modification should do the job.) [...] > Anyway, I've certainly far exceeded my opinion budget on this one, so I should > shut up now :) (If I disagree with this point, would this decrease my-already-in-the-negative-I-presume opinion budget? ;-) Best, Arnd From Fernando.Perez at colorado.edu Thu Oct 13 03:05:58 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 13 Oct 2005 01:05:58 -0600 Subject: [SciPy-dev] Package organization In-Reply-To: References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> Message-ID: <434E0756.8070103@colorado.edu> Arnd Baecker wrote: > On Thu, 13 Oct 2005, Fernando Perez wrote: > >>This layout would allow the core team to work with relative freedom at the >>top-level namespace, without worrying about toolkits taking names they may >>need in the future. Similarly, toolkit authors will have a well-defined API >>to build upon. > > > To me this looks as if there is essentially no difference > to people distributing their projects independently of scipy > (and outside of its name-space). > I still like this for two reasons > a) it would bring scientific python projects under one umbrella > b) the (hopefully) simple user installation: You are correct, and that's precisely the point. 
Much as python's stdlib provides a base for functionality which all other python packages can rely on, scipy would do the same for scientific work. However, I deeply dislike the fact that python blends all namespaces into one via $PYTHONPATH: I would much rather have the stdlib live under something like 'std':

    from std import sys, os, time

so that what is coming from the stdlib would be explicitly separated from what is a third-party package (in site-packages, typically). I'm trying to make such a distinction explicit, by making sure that anything that says

    from scipy import foo

is the 'scipy stdlib', while

    from scipy.kits import bar

explicitly indicates it's a toolkit. In addition, since most likely all packages of serious scientific interest will need at the very least the core of scipy (for array facilities), having them under the scipy umbrella makes this explicit. So yes, people could distribute fully outside of the scipy namespace (they currently do). But by using the scipy infrastructure, they can benefit from a set of low-level facilities (distutils, testing, f2py, BuildSystem if ported, ...), and they also provide a more unified interface for their users. Their packages can register with the top-level docstring facilities, so that interactive help/indexing systems show a coherent set of tools for scientific work; hopefully we'll have standardized conventions for code examples and docstrings with latex support, etc. Could people continue to work outside of the umbrella? Absolutely. But hopefully the benefits will be significant, with no major drawbacks for third-party package authors. All I'm proposing is really to mimic in scipy/scipy.kits the existing conventions of the Python world with the stdlib/site-packages, with the minor addition of an explicit namespace to disambiguate (which I wish the core language had). > [...] 
> > >>If this is combined with a CPAN-like system (eggs, PyPi, whatever), it should >>be very easy for users, once they have the basic layers in place, to grab a >>toolkit by issuing a single command or going to a website. I'd suggest, if >>this were adopted, keeping a simple page at scipy with brief descriptions for >>each toolkit, even if they are developed/distributed externally. > > > Sounds very nice! I have no experience with eggs: > where do these go when a user cannot install them as root? > (Of course, a PYTHONPATH modification should do the job.) Robert is our resident expert on those little beasts, so I'll leave this one to him. > [...] > > >>Anyway, I've certainly far exceeded my opinion budget on this one, so I should >>shut up now :) > > > (If I disagree with this point, would this > decrease my-already-in-the-negative-I-presume opinion budget? ;-) And I'm headed to bankruptcy court now :) Cheers, f From rkern at ucsd.edu Thu Oct 13 03:24:06 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 13 Oct 2005 00:24:06 -0700 Subject: [SciPy-dev] Package organization In-Reply-To: References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> Message-ID: <434E0B96.4050005@ucsd.edu> Arnd Baecker wrote: > On Thu, 13 Oct 2005, Fernando Perez wrote: > > [...] > >>3. The scipy.{kits|tools} namespace (or whatever the chosen name). This is >>where third parties can drop their own packages, which can depend either only >>(1) or on the full (2) system (their level of dependency should be explicitly >>stated). >> >>The kits namespace may ship empty by default, or it could be populated with a >>few things from current scipy if it is decided they are best moved there. >> >>The only thing required for projects to live in the .kits namespace is really >>to avoid top-level name collisions, so it would perhaps be worth having an >>informal policy of people checking with scipy-dev for a name before using it. 
>> >>This layout would allow the core team to work with relative freedom at the >>top-level namespace, without worrying about toolkits taking names they may >>need in the future. Similarly, toolkit authors will have a well-defined API >>to build upon. > > To me this looks as if there is essentially no difference > to people distributing their projects independently of scipy > (and outside of its name-space). > I still like this for two reasons > a) it would bring scientific python projects under one umbrella > b) the (hopefully) simple user installation: I'm not entirely convinced that everything needs to live in the scipy namespace, though. I think I overstated the build-time benefits of being in scipy itself. I think we can make most of those benefits accessible to packages that aren't living in scipy.* . I haven't been keeping up with the AstroPy discussions, but I doubt they'd really want to do from scipy.kits.astropy import wcs etc. "Flat is better than nested," and all that. I *think* a package can get all the benefits of "being in scipy" simply by depending on parts of scipy and using the tools appropriately. But to really figure this out, we need to come down to cases. Let's get our house in order first, and then we can talk about how we deal with guests. >>If this is combined with a CPAN-like system (eggs, PyPi, whatever), it should >>be very easy for users, once they have the basic layers in place, to grab a >>toolkit by issuing a single command or going to a website. I'd suggest, if >>this were adopted, keeping a simple page at scipy with brief descriptions for >>each toolkit, even if they are developed/distributed externally. > > Sounds very nice! I have no experience with eggs: > where do these go when a user cannot install them as root? > (Of course, a PYTHONPATH modification should do the job.) Primarily, you just add the egg itself to sys.path in any of the various ways one might do that. 
If the egg isn't "zip-safe" because it needs to access real files-on-the-filesystem, then you need to unpack the egg first. There's a tool built on top of eggs called easy_install which manages a .pth file to control which packages are enabled. This scheme will be changing to something more flexible and controllable shortly, I'm told. Yes, Fernando, this change should solve all of the problems you and Prabhu have been having with eggs. I think. I hope.

http://peak.telecommunity.com/DevCenter/PythonEggs
http://peak.telecommunity.com/DevCenter/EasyInstall
http://peak.telecommunity.com/DevCenter/setuptools

-- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Thu Oct 13 02:26:56 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 13 Oct 2005 01:26:56 -0500 (CDT) Subject: [SciPy-dev] Package organization In-Reply-To: <434DFFD1.9040303@colorado.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> Message-ID: On Thu, 13 Oct 2005, Fernando Perez wrote: > Robert Kern wrote: > >> I would like to see scipy's package organization become flatter and more >> oriented towards easy, lightweight, modular packaging rather than >> subject matter. For example, some people want bindings to SUNDIALS for >> ODEs. They could go into scipy.integrate, but that introduces a large, >> needless dependency for those who just want to compute integrals. So I >> would suggest that SUNDIALS bindings would be in their own >> scipy.sundials package.

This is what the scipy.lib namespace was created for: to collect packages containing wrappers to various libraries. So, how does scipy.lib.sundials sound to you, in addition to

    scipy.lib.blas
    scipy.lib.lapack
    scipy.lib.minpack
    scipy.lib.fftpack
    scipy.lib.odepack
    scipy.lib.quadpack

etc.? 
Among other things this may reduce the time spent on importing large extension modules that users might not use in their programs. > [snip excellent analysis of scipy's organizational status ] > > Let me add a minor twist to your plan, which perhaps may help a little. How > about making a two-level distinction between 'scipy, the core package' and > 'scipy, the collection of tools'? Here's how it could be organized, in terms > of namespaces and release policy: whatever is defined as the core is released > by the scipy package proper, and can be safely considered a dependency for the > rest. Note that this can still be split between scipy_core and 'full scipy', > where scipy_core is Travis' new Numeric/numarray and 'full scipy' contains > much more functionality. > > But as far as packages written by third-party authors, which can live under > the scipy namespace as an umbrella, benefit from scipy's build facilities and > core libraries, how about putting them all into a 'toolkits' namespace? The > actual name, for typing convenience, could be scipy.kits or scipy.tools > (something short). > > This would then give us the following structure: > > 1. Scipy_core: the new Numeric/numarray package, which includes basic FFT, > linear algebra, random numbers and perhaps basic i/o (at least save/load > abilities), and whatever else I'm missing right now (I don't have it yet > installed on this laptop). > > 2. Scipy 'full': depends on (1), and exposes all the other scipy names: > scipy.{linalg,optimize,integrate,...}. These are libraries considered > officially part of scipy, so that even if they are maintained by others (much > like python's stdlib), there is a committment to a common release cycle. > These can, if need be, have inter-dependencies, as they will always be > released as a whole. This is pretty much the current structure of newcore and newscipy, even of old scipy and scipy_core. > 1 and 2 all use the top-level scipy namespace. Then we have: > > 3. 
The scipy.{kits|tools} namespace (or whatever the chosen name). This is > where third parties can drop their own packages, which can depend either only > (1) or on the full (2) system (their level of dependency should be explicitly > stated).

Ok, that's a new bit, though it reminds me of the scipy.sandbox package in newscipy. Maybe it's a matter of naming convention.

Pearu

From pearu at scipy.org Thu Oct 13 02:46:04 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 13 Oct 2005 01:46:04 -0500 (CDT) Subject: [SciPy-dev] Cannot build newcore --> AttributeError: 'module' object has no attribute 'typeinfo' In-Reply-To: <434E027E.4050407@mecha.uni-stuttgart.de> References: <434E027E.4050407@mecha.uni-stuttgart.de> Message-ID: On Thu, 13 Oct 2005, Nils Wagner wrote:

> Running from source directory.
> Traceback (most recent call last):
>   File "setup.py", line 35, in ?
>     setup_package()
>   File "setup.py", line 9, in setup_package
>     from scipy.core_version import version
>   File "/var/tmp/svn/newcore/scipy/core_version.py", line 4, in ?
>     import base.__svn_version__ as svn
>   File "/var/tmp/svn/newcore/scipy/base/__init__.py", line 7, in ?
>     import numerictypes as nt
>   File "/var/tmp/svn/newcore/scipy/base/numerictypes.py", line 81, in ?
>     typeinfo = multiarray.typeinfo
> AttributeError: 'module' object has no attribute 'typeinfo'

Fixed in SVN.

Pearu

From pearu at scipy.org Thu Oct 13 04:06:54 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 13 Oct 2005 03:06:54 -0500 (CDT) Subject: [SciPy-dev] Package organization In-Reply-To: <434D98F6.5080301@ucsd.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> Message-ID: On Wed, 12 Oct 2005, Robert Kern wrote: > I would like to see scipy's package organization become flatter and more > oriented towards easy, lightweight, modular packaging rather than > subject matter. For example, some people want bindings to SUNDIALS for > ODEs. 
They could go into scipy.integrate, but that introduces a large, > needless dependency for those who just want to compute integrals. So I > would suggest that SUNDIALS bindings would be in their own > scipy.sundials package.

I like this approach, though with scipy.lib.sundials. This is to separate plain wrappers (there can be different tools/libraries for the same task, e.g. there are various sparse matrix libraries) from general tools (tools for specific tasks, like linalg) that should live in the scipy namespace.

> The wavelets library that Fernando is working on would go in as > scipy.wavelets rather than being stuffed into scipy.signal. You get the > idea.

I would also consider wavelets a more general tool than just a signal processing tool. Just like FFT is.

> This is also why I suggested making the scipy_core versions of fftpack > and linalg be named scipy.corefft and scipy.corelinalg and not be > aliased to scipy.fftpack and scipy.linalg. The "core" names reflect > their packaging and their limited functionality. For one thing, this > naming allows us to try to import the possibly optimized versions in the > full scipy:
>
> # scipy/corelinalg/__init__.py
> import basic_lite
> svd = basic_lite.singular_value_decomposition
> ...
>
> try:
>     from scipy import linalg
>     svd = linalg.svd
>     ...
> except ImportError:
>     pass

OT, I'd like to consider the above codelet just as an example of desired behaviour, rather than real code. I have had too much bad experience with importing extension modules whose failures are hidden by such try-except blocks, which makes debugging really painful. We should have a better method of detecting the unavailability of a package, as the above would behave the same for both broken and missing packages.

> Okay, What Belongs In Scipy. It's somewhat difficult to answer the > question, "Does this package belong in scipy?" without having a common > answer to, "What is scipy?" 
> I won't pretend to have the single answer to that last question, but I will start the dialogue based on the rationalizations I've come up with to defend my gut feelings.
>
> Things scipy is not:
>
> * A framework. You shouldn't have to restructure your programs to use the algorithms implemented in scipy. Sometimes the algorithms themselves may require it (e.g. reverse communication solvers), but that's not imposed by scipy.
>
> * Everything a scientist will need to do computing. For a variety of reasons, it's just not an achievable goal and, more importantly, it's not a good standard for making decisions. A lot of scientists need a good RDBMS, but there's no reason to put pysqlite into scipy. Enthon, package repositories, and specialized LiveCDs are better places to collect "everything."
>
> * A plotting library. (Sorry, had to throw that in.)
>
> Things scipy is:
>
> * A loose collection of slightly interdependent modules for numerical computing.
>
> * A common build environment that handles much of the annoying work for numerical extension modules. Does your module rely on a library that needs LAPACK or BLAS? If you put it in scipy, your users can configure the location of their optimized libraries *once*, and all of the scipy modules they build can use that information.
>
> * A good place to put numerical modules that don't otherwise have a good home.
>
> Things scipy *could* be:
>
> * An *excellent* build environment for library-heavy extension modules. To realize this, we would need to integrate the configuration portion of PETSc's BuildSystem or something equivalent. The automatic discovery/download/build works quite well. If this were to be realized, some packages might make more sense as subpackages of scipy. 
> For example, matplotlib and pytables don't have much reason to be part of scipy right now, but if the libraries they depend on could be automatically detected/downloaded/built and shared with other scipy subpackages, then I think it might make sense for them to live in scipy, too.

I think it should be a separate project. Some years ago I was working on a package (taskman) that could download/build/install packages/libraries like gcc, ATLAS, Python, Numeric, etc to detect problems that might come up with certain combinations of software versions. So, I consider myself aware of most of the issues that one must deal with in such a project.

> As Pearu suggested, as we port scipy packages to the new scipy_core we should audit and label them. To that end:
>
> * gui_thread, gplt, and plt are dead, I think.

+1

> * xplt shambles along, but shouldn't return as scipy.xplt. It can never be *the* plotting library for scipy, and leaving it as scipy.xplt gives people that impression.

+0.5, I think scipy.xplt is a nice piece of work, where should it be moved?

> * scipy.cluster is sorta broken and definitely incomplete. We should port over what's in Bio.Cluster. For that matter, there's quite a bit in biopython we should stea^Wport (Bio.GA, Bio.HMM, Bio.KDTree, Bio.NeuralNetwork, Bio.NaiveBayes, Bio.MaxEntropy, Bio.MarkovModel, Bio.LogisticRegression, Bio.Statistics.lowess).
>
> * The bindings to ODEPACK, QUADPACK, FITPACK, and MINPACK are handwritten. Should we mark them as, "f2py when you get the chance"? Otherwise, they probably count as "state-of-the-art" although we could always expand our offerings like exposing some of the other functions in ODEPACK.

Yes. I am willing to take this task.

> * scipy.optimize: I think I recently ran into a regression in the old scipy. fmin() wasn't finding the minimum of the Rosenbrock function in the tutorial. I'll have to check that again. The simulated annealing code could use some review. 
> * scipy.special: cephes.round() seems to be buggy depending on the platform, and I think we got a bug report about one of the other functions.
>
> * I will maintain much of scipy.stats. Of course, that will probably mean, "throwing anova() into the sandbox never to return." Many of the other functions in stats.py need vetting.
>
> Now I'm sure I've used up my opinion budget.

Could you give me the account number of your opinion bank? I'd like to make some opinion transfer from my credit account ;-)

I can maintain/review various f2py based wrappers (blas, lapack, etc, anything that should go into scipy.lib) as well as the fftpack and linalg packages. I am also interested in the integrate, optimize, interpolate packages, but mainly as a code contributor through scipy.lib. Another task that I have taken on within the distutils framework is to make scipy Fortran compiler independent. It means that in the absence of a Fortran compiler, f2c based C libraries will be used when building scipy.

Pearu

From cimrman3 at ntc.zcu.cz Thu Oct 13 05:17:04 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 13 Oct 2005 11:17:04 +0200 Subject: [SciPy-dev] Introductions, sparse matrix support In-Reply-To: <434D5D5A.8000106@ee.byu.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> Message-ID: <434E2610.7050200@ntc.zcu.cz> Travis Oliphant wrote: > Ed Schofield wrote: >>I've made small contributions to PySparse in the past and briefly >>discussed the idea of merging PySparse into SciPy with Roman Geus by >>email. His comment in April was: >> >>"So far I have not had contacts to the scipy people, and admittedly I >>have never even installed scipy. When creating pysparse (some years ago >>now) I liked the idea of smaller well-definied packages more, than a big >>collection dependent packages. 
" >> >>I broadly agree with this philosophy, and it has served PySparse >>development well until now, but in this case I think that there's value >>to the community in having a well-integrated core of matrix >>functionality (sparse and dense) and one set of prerequisites and >> >> >>interfaces rather than several. >> >> > > Yes, I would like to see some standardization in the sparse stuff. With > you, Robert, Dan, and Jonathan coming aboard, it looks like there is a > critical mass of people who are able to make it happen. I am amazed how much activity generated my e-mail two weeks ago! Obviously the time for the sparse module has ripen :-) >>I'd be happy to work with Jonathan, Dan, and Robert on sparse matrix >>support in SciPy, integrating PySparse code with Travis and Robert's >>UMFPACK wrapper. Perhaps we could even find support among other >>PySparse developers. PySparse currently depends on Numeric, and will >>need porting to SciPy Core eventually... PySparse contains umfpack v4.1, right? Then if it is going to go into scipy tree, having me wrapping v4.4 separately would be useless - we (me if you like) could just update what already is in PySparse... We should settle first on the sparse matrix implementation to use and then care about wrapping related solvers (who almost always use the CSR/CSC format). I am keenly waiting on the benchmark results of Jonathan and Dan... (hoping that some work remains for me on Monday (a trip to Berlin tomorrow)). cheers, r. From arnd.baecker at web.de Thu Oct 13 05:34:06 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 13 Oct 2005 11:34:06 +0200 (CEST) Subject: [SciPy-dev] Package organization In-Reply-To: <434E0B96.4050005@ucsd.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> <434E0B96.4050005@ucsd.edu> Message-ID: On Thu, 13 Oct 2005, Robert Kern wrote: [...] 
> Arnd Baecker wrote: > > On Thu, 13 Oct 2005, Fernando Perez wrote: > > > > [...] > > > >>3. The scipy.{kits|tools} namespace (or whatever the chosen name). This is > >>where third parties can drop their own packages, which can depend either only > >>(1) or on the full (2) system (their level of dependency should be explicitly > >>stated). > >> > >>The kits namespace may ship empty by default, or it could be populated with a > >>few things from current scipy if it is decided they are best moved there. > >> > >>The only thing required for projects to live in the .kits namespace is really > >>to avoid top-level name collisions, so it would perhaps be worth having an > >>informal policy of people checking with scipy-dev for a name before using it. > >> > >>This layout would allow the core team to work with relative freedom at the > >>top-level namespace, without worrying about toolkits taking names they may > >>need in the future. Similarly, toolkit authors will have a well-defined API > >>to build upon. > > > > To me this looks as if there is essentially no difference > > to people distributing their projects independently of scipy > > (and outside of its name-space). > > I still like this for two reasons > > a) it would bring scientific python projects under one umbrella > > b) the (hopefully) simple user installation: > > I'm not entirely convinced that everything needs to live in the scipy > namespace, though. I think I overstated the build-time benefits of being > in scipy itself. I think we can make most of those benefits accessible > to packages that aren't living in scipy.* . I haven't been keeping up > with the AstroPy discussions, but I doubt they'd really want to do > > from scipy.kits.astropy import wcs > > etc. "Flat is better than nested," and all that. > > I *think* a package can get all the benefits of "being in scipy" simply > by depending on parts of scipy and using the tools appropriately. 
But to > really figure this out, we need to come down to cases. Let's get our > house in order first, and then we can talk about how we deal with guests. Once everything works it would be important to promote something like the "matlab toolboxes" for scipy ("scipy kits","scipy tools","scipy toolboxes", ...) to get as many people as possible to contribute their code (Presently "matlab toolbox" gives 1.580.000 hits on google vs. "scipy toolbox" 649 ;-) (Just a -2cent from my opinion budget, I hope) Best, Arnd P.S.: the egg-stuff looks cool - thanx for the links! From pearu at scipy.org Thu Oct 13 05:55:09 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 13 Oct 2005 04:55:09 -0500 (CDT) Subject: [SciPy-dev] scipy.minimum on complex numbers returns junk Message-ID: Hi, I'd like to report an odd behaviour when finding scipy.minimum on complex numbers: In [2]: scipy.minimum(1,2j) # should return 2j Out[2]: 6.6869516415167017e-316j In [3]: scipy.minimum(-1,2j) # should return -1 Out[3]: (-1+7.7025147686483706j) Pearu From nwagner at mecha.uni-stuttgart.de Thu Oct 13 06:59:20 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 13 Oct 2005 12:59:20 +0200 Subject: [SciPy-dev] scipy.minimum on complex numbers returns junk In-Reply-To: References: Message-ID: <434E3E08.3080901@mecha.uni-stuttgart.de> Pearu Peterson wrote: >Hi, > >I'd like to report an odd behaviour when finding scipy.minimum on complex >numbers: >In [2]: scipy.minimum(1,2j) # should return 2j >Out[2]: 6.6869516415167017e-316j > >In [3]: scipy.minimum(-1,2j) # should return -1 >Out[3]: (-1+7.7025147686483706j) > >Pearu > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > Indeed, a very strange behaviour >>> scipy.minimum(1,2j) 36.003296861206081j >>> scipy.minimum(-1,2j) (-1+36.003296861206081j) >>> scipy.base.__version__ '0.4.3.1262' >>> scipy.maximum(1,2j) (1+1.585534972086793e+29j) >>> 
scipy.maximum(-1,2j) 1.585534972086793e+29j >>> From schofield at ftw.at Thu Oct 13 07:04:29 2005 From: schofield at ftw.at (Ed Schofield) Date: Thu, 13 Oct 2005 13:04:29 +0200 Subject: [SciPy-dev] Introductions, sparse matrix support In-Reply-To: <434E2610.7050200@ntc.zcu.cz> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434E2610.7050200@ntc.zcu.cz> Message-ID: <434E3F3D.7000707@ftw.at> Robert Cimrman wrote: >>>I'd be happy to work with Jonathan, Dan, and Robert on sparse matrix >>>support in SciPy, integrating PySparse code with Travis and Robert's >>>UMFPACK wrapper. Perhaps we could even find support among other >>>PySparse developers. PySparse currently depends on Numeric, and will >>>need porting to SciPy Core eventually... >>> >>> >PySparse contains umfpack v4.1, right? Then if it is going to go into >scipy tree, having me wrapping v4.4 separately would be useless - we (me >if you like) could just update what already is in PySparse... > >We should settle first on the sparse matrix implementation to use and >then care about wrapping related solvers (who almost always use the >CSR/CSC format). I am keenly waiting on the benchmark results of >Jonathan and Dan... (hoping that some work remains for me on Monday (a >trip to Berlin tomorrow)). > > Yes, you're right, PySparse does contain UMFPACK 4.1. It seems from the UMFPACK 4.4 change log that its API is backwardly compatible with 4.1, so perhaps updating PySparse won't require much more than merging the new UMFPACK source tree. I suggest we adopt all three of PySparse's implementations: LL, CSR, and SSS. LL (a simple linked list format) is particularly useful for its flexibility, and PySparse's UMFPACK wrapper uses LL matrices currently as its input format. There might be an argument that SSS support isn't necessary, but it doesn't really get in the way, and removing it from PySparse would be more work than leaving it in. 
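An aside on the scipy.minimum/scipy.maximum reports above: the junk imaginary parts look like uninitialized memory. For reference, here is a pure-Python model of the results Pearu expects, assuming the Numeric convention that complex values order lexicographically by (real, imag); this is only an illustration of the intended semantics, not the actual ufunc code:

```python
# Model of elementwise complex minimum, assuming lexicographic
# (real part first, then imaginary part) ordering of complex numbers.
def cmin(a, b):
    a, b = complex(a), complex(b)
    return a if (a.real, a.imag) <= (b.real, b.imag) else b

print(cmin(1, 2j))   # 2j      -- matches "should return 2j" above
print(cmin(-1, 2j))  # (-1+0j) -- matches "should return -1" above
```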
I like the idea of keeping the patch set from upstream PySparse simple, limited to improving integration with SciPy (e.g. making dense and sparse matrices expose a similar interface for operations like matrix products and as iterables.) And folding our patches back into PySparse will simplify the task. From arnd.baecker at web.de Thu Oct 13 09:24:06 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 13 Oct 2005 15:24:06 +0200 (CEST) Subject: [SciPy-dev] [Numpy-discussion] New 0.4.2 beta release of scipy core In-Reply-To: <434D791B.9060809@ee.byu.edu> References: <434D791B.9060809@ee.byu.edu> Message-ID: On Wed, 12 Oct 2005, Travis Oliphant wrote: > I made another beta release of scipy core last night. There are > windows binaries for Python 2.4 and Python 2.3. If you are already a > user of scipy, the new __init__ file installed for newcore will break > your current installation of scipy (but the problem with linalg, > fftpack, and stats is no longer there). > > There have been many improvements: > > - bug fixes (including 64-bit fixes) > - threading support fixes > - optimizations > - more random numbers (thanks Robert Kern). > - more distutils fixes (thanks Pearu Peterson). > > More tests are welcome. We are moving towards a release (but still > need to get Masked Arrays working and all of scipy to build on top of > the new scipy core before a full release). Alright, on the Opteron with its 64 Bit all problems are gone. A few observations: - I installed it separately from the normal python (and old scipy) installation by `python setup.py install --prefix=$DESTDIR` This time I used gcc 3.4.4 and the installed packages end up in $DESTDIR/lib64/python2.4/site-packages/:$PYTHONPATH With gcc version 4.0.2 20050826 (prerelease) (SUSE Linux) they ended up in $DESTDIR/lib/python2.4/site-packages/:$PYTHONPATH This is not a problem for me, but it might be worth knowing. 
- _configtest.c: Several times one gets: gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' compile options: '-Iscipy/base/src -I/scr/python/include/python2.4 -c' gcc: _configtest.c _configtest.c:3: warning: function declaration isn't a prototype _configtest.c: In function `main': _configtest.c:4: warning: statement with no effect _configtest.c:5: warning: control reaches end of non-void function gcc -pthread _configtest.o -lm -o _configtest success! - scipy/corelib/mtrand/mtrand.c: Several warnings of the type: In function `__pyx_f_6mtrand_cont0_array': scipy/corelib/mtrand/mtrand.c:221: warning: assignment from incompatible pointer type (should I post the full list?) - I can either use a non-optimized LAPACK or an optimized LAPACK (via ATLAS). Question: is there a Python way to figure out which one is used by lapack_lite? (Imagine the situation that I am dumped on a machine, installed by someone else, and to my surprise I find scipy installed, and now would like to know if it uses the fast libraries, like ATLAS, fftw, ...?) ((scipy_distutils can only tell me which libraries are available at the time of running it, right?)) - Installing the new scipy on top of scipy_core works as well. However, it seems that none of the unit tests (e.g. for special) get copied to the installation directory. They are there in the source directory, but not in the build directory, nor in the installation directory. OK, so much for now, Arnd From pearu at scipy.org Thu Oct 13 10:13:30 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 13 Oct 2005 09:13:30 -0500 (CDT) Subject: [SciPy-dev] mtrand.c compiler warnings Message-ID: Hi Robert, Compiling mtrand.c throws lots of compiler warnings. Is it safe to hand-edit mtrand.c to fix these warnings? Or should Pyrex be fixed to get rid of these warnings?
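On Arnd's question about telling at run time which LAPACK was linked in: one approach is for the build to record its library information in a generated module (NumPy's distutils eventually did exactly this with `numpy.__config__.show()`). A toy, library-independent sketch of such a check follows; the dict layout imitates scipy_distutils/system_info-style entries but is an entirely made-up example, not a real generated file:

```python
# "config" stands in for a build-time record of which libraries were
# linked. Section and key names here are illustrative assumptions.
def uses_atlas(config):
    """True if any recorded section was linked against an ATLAS library."""
    return any(
        "atlas" in lib.lower()
        for info in config.values()
        for lib in info.get("libraries", [])
    )

example = {
    "lapack_opt": {"libraries": ["lapack", "f77blas", "cblas", "atlas"]},
    "fftw": {"libraries": ["fftw3"]},
}
print(uses_atlas(example))  # True
```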
Pearu From rkern at ucsd.edu Thu Oct 13 11:17:31 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 13 Oct 2005 08:17:31 -0700 Subject: [SciPy-dev] Package organization In-Reply-To: References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> Message-ID: <434E7A8B.1060105@ucsd.edu> Pearu Peterson wrote: > > On Thu, 13 Oct 2005, Fernando Perez wrote: > >>Robert Kern wrote: >> >>>I would like to see scipy's package organization become flatter and more >>>oriented towards easy, lightweight, modular packaging rather than >>>subject matter. For example, some people want bindings to SUNDIALS for >>>ODEs. They could go into scipy.integrate, but that introduces a large, >>>needless dependency for those who just want to compute integrals. So I >>>would suggest that SUNDIALS bindings would be in their own >>>scipy.sundials package. > > This is for what scipy.lib namespace was created, to collect packages > containing wrappers to various libraries. So, how > scipy.lib.sundials > sounds for you in addition to > scipy.lib.blas > scipy.lib.lapack > scipy.lib.minpack > scipy.lib.fftpack > scipy.lib.odepack > scipy.lib.quadpack > etc etc > ? > > Among other things this may reduce the time spent on importing large > extension modules that users might not use in their programs. I'm proposing just the opposite. scipy.lib puts a lot of things which are unrelated together. That makes packaging harder. On top of that, you're still importing scipy.lib.sundials, scipy.lib.odepack, and scipy.lib.quadpack from scipy.integrate. That means you can't really install that functionality separately. I want the SUNDIALS raw bindings and the Python convenience code that wraps around the raw bindings to all be in scipy.sundials. That makes packaging dead simple and understandable. The things that are functionally, necessarily related to each other are combined in this one controllable bundle. 
Code should be organized to make coding and distribution easier. It's documentation that should be organized around concepts. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rkern at ucsd.edu Thu Oct 13 11:21:42 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 13 Oct 2005 08:21:42 -0700 Subject: [SciPy-dev] mtrand.c compiler warnings In-Reply-To: References: Message-ID: <434E7B86.5040309@ucsd.edu> Pearu Peterson wrote: > Hi Robert, > > Compiling mtrand.c throws lots of compiler warnings. Is it safe to > handedit mtrand.c to fix these warnings? Or should pyrex fixed to get rid > of these warnings? No, it's not safe to hand-edit mtrand.c . It really shouldn't even be in the repository, but I didn't want to make Pyrex an explicit build-dependency. Pyrex *ought* to be fixed to get rid of these warnings, but no one has really done the leg work to do so. None of the warnings actually matter, though. Does the distutils compiler abstraction easily allow silencing warnings? With gcc, you can just add -w, but people may not be using gcc. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Thu Oct 13 10:30:06 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 13 Oct 2005 09:30:06 -0500 (CDT) Subject: [SciPy-dev] Package organization In-Reply-To: <434E7A8B.1060105@ucsd.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> <434E7A8B.1060105@ucsd.edu> Message-ID: On Thu, 13 Oct 2005, Robert Kern wrote: > Pearu Peterson wrote: >> >> On Thu, 13 Oct 2005, Fernando Perez wrote: >> >>> Robert Kern wrote: >>> >>>> I would like to see scipy's package organization become flatter and more >>>> oriented towards easy, lightweight, modular packaging rather than >>>> subject matter. 
For example, some people want bindings to SUNDIALS for >>>> ODEs. They could go into scipy.integrate, but that introduces a large, >>>> needless dependency for those who just want to compute integrals. So I >>>> would suggest that SUNDIALS bindings would be in their own >>>> scipy.sundials package. >> >> This is for what scipy.lib namespace was created, to collect packages >> containing wrappers to various libraries. So, how >> scipy.lib.sundials >> sounds for you in addition to >> scipy.lib.blas >> scipy.lib.lapack >> scipy.lib.minpack >> scipy.lib.fftpack >> scipy.lib.odepack >> scipy.lib.quadpack >> etc etc >> ? >> >> Among other things this may reduce the time spent on importing large >> extension modules that users might not use in their programs. > > I'm proposing just the opposite. Ok, I see. > scipy.lib puts a lot of things which are unrelated together. That makes > packaging harder. Hmm, how does it make packaging harder? > On top of that, you're still importing scipy.lib.sundials, > scipy.lib.odepack, and scipy.lib.quadpack from scipy.integrate. That > means you can't really install that functionality separately. May be I am not quite following you here but I don't understand the "can't" part. Could give an example how would you organize sundials,odepack,quadpack for scipy.integrate? > I want the SUNDIALS raw bindings and the Python convenience code that > wraps around the raw bindings to all be in scipy.sundials. That makes > packaging dead simple and understandable. The things that are > functionally, necessarily related to each other are combined in this one > controllable bundle. > > Code should be organized to make coding and distribution easier. It's > documentation that should be organized around concepts. I can agree on that. 
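The "install that functionality separately" point can be made concrete with the usual optional-import pattern. The module names below are hypothetical (no scipy.sundials exists yet); this only illustrates how scipy.integrate could prefer a separately installed back-end and degrade gracefully:

```python
# Hypothetical sketch: prefer an optional, separately packaged back-end,
# falling back to the bundled wrappers when it is not installed.
def choose_ode_backend():
    try:
        from scipy import sundials  # separate, optional distribution (assumed name)
        return "sundials"
    except ImportError:
        return "odepack"  # shipped with scipy.integrate itself
```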
Pearu From pearu at scipy.org Thu Oct 13 10:43:07 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 13 Oct 2005 09:43:07 -0500 (CDT) Subject: [SciPy-dev] mtrand.c compiler warnings In-Reply-To: <434E7B86.5040309@ucsd.edu> References: <434E7B86.5040309@ucsd.edu> Message-ID: On Thu, 13 Oct 2005, Robert Kern wrote: > Pearu Peterson wrote: >> Hi Robert, >> >> Compiling mtrand.c throws lots of compiler warnings. Is it safe to >> handedit mtrand.c to fix these warnings? Or should pyrex fixed to get rid >> of these warnings? > > No, it's not safe to hand-edit mtrand.c . It really shouldn't even be in > the repository, but I didn't want to make Pyrex an explicit > build-dependency. Ok, that's what I also thought. > Pyrex *ought* to be fixed to get rid of these warnings, but no one has > really done the leg work to do so. None of the warnings actually matter, > though. Does the distutils compiler abstraction easily allow silencing > warnings? With gcc, you can just add -w, but people may not be using gcc. I don't believe that silencing warnings is a good practice, they should be fixed, in general. If we were using all the same compiler, then it might be safe to ignore warnings but considering on how many different platforms and with different compilers scipy should be buildable, it is safer to fix these warnings rather than hide them. Warnings like 'warning: assignment from incompatible pointer type' are particularly dangerous and such warnings often indicate the location of bugs, some compilers may handle them 'well' while others may not. Even warnings like 'unused variables' may indicate typos. Sure, mtrand.c is a generated code and then warnings on 'unused variables' are often safe to ignore. However, shutting down all warnings for one extension is dangerous to others, especially for handwritten ones. Pearu From cookedm at physics.mcmaster.ca Thu Oct 13 11:46:40 2005 From: cookedm at physics.mcmaster.ca (David M. 
Cooke) Date: Thu, 13 Oct 2005 11:46:40 -0400 Subject: [SciPy-dev] mtrand.c compiler warnings In-Reply-To: <434E7B86.5040309@ucsd.edu> (Robert Kern's message of "Thu, 13 Oct 2005 08:21:42 -0700") References: <434E7B86.5040309@ucsd.edu> Message-ID: Robert Kern writes: > Pearu Peterson wrote: >> Hi Robert, >> >> Compiling mtrand.c throws lots of compiler warnings. Is it safe to >> handedit mtrand.c to fix these warnings? Or should pyrex fixed to get rid >> of these warnings? > > No, it's not safe to hand-edit mtrand.c . It really shouldn't even be in > the repository, but I didn't want to make Pyrex an explicit > build-dependency. > > Pyrex *ought* to be fixed to get rid of these warnings, but no one has > really done the leg work to do so. None of the warnings actually matter, > though. Does the distutils compiler abstraction easily allow silencing > warnings? With gcc, you can just add -w, but people may not be using > gcc. I've got a patch to Pyrex that fixes a lot of the unused label warnings; I'll post it to the Pyrex list, and redo the mtrand.c creation. -- |>|\/|< /--------------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From Fernando.Perez at colorado.edu Thu Oct 13 13:28:53 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 13 Oct 2005 11:28:53 -0600 Subject: [SciPy-dev] Package organization In-Reply-To: References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> Message-ID: <434E9955.70901@colorado.edu> Pearu Peterson wrote: > > On Thu, 13 Oct 2005, Fernando Perez wrote: >>This would then give us the following structure: >> >>1. Scipy_core: the new Numeric/numarray package, which includes basic FFT, ... >>2. Scipy 'full': depends on (1), and exposes all the other scipy names: ... 
> This is pretty much the current structure of newcore and newscipy, even of > old scipy and scipy_core. Yes, I know. I should have said so, I just wanted it to be explicit for the sake of completeness. The only minor difference is that I was considering the possibility of moving some of today's 'full scipy' into toolkits, but that would be a decision for later. > Ok, that's a new bit, though it remainds me the scipy.sandbox package in > newscipy. May be it's a matter of naming convention. Well, in my eye sandbox carries a connotation of 'experimental, staging area'. That's not what I think of the toolkits as. I see those as standalone, fully released packages which build upon scipy's core functionality for specialized tasks. I think the sandbox can remain there, but it serves a different purpose than the toolkits, at least in my mind. Cheers, f From oliphant at ee.byu.edu Thu Oct 13 13:53:09 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 13 Oct 2005 11:53:09 -0600 Subject: [SciPy-dev] Introductions, sparse matrix support In-Reply-To: <434E3F3D.7000707@ftw.at> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434E2610.7050200@ntc.zcu.cz> <434E3F3D.7000707@ftw.at> Message-ID: <434E9F05.3070804@ee.byu.edu> Ed Schofield wrote: >Robert Cimrman wrote: > > > >>>>I'd be happy to work with Jonathan, Dan, and Robert on sparse matrix >>>>support in SciPy, integrating PySparse code with Travis and Robert's >>>>UMFPACK wrapper. Perhaps we could even find support among other >>>>PySparse developers. PySparse currently depends on Numeric, and will >>>>need porting to SciPy Core eventually... >>>> >>>> >>>> >>>> >>PySparse contains umfpack v4.1, right? Then if it is going to go into >>scipy tree, having me wrapping v4.4 separately would be useless - we (me >>if you like) could just update what already is in PySparse... 
>> >>We should settle first on the sparse matrix implementation to use and >>then care about wrapping related solvers (who almost always use the >>CSR/CSC format). I am keenly waiting on the benchmark results of >>Jonathan and Dan... (hoping that some work remains for me on Monday (a >>trip to Berlin tomorrow)). >> >> >> >> >Yes, you're right, PySparse does contain UMFPACK 4.1. It seems from the >UMFPACK 4.4 change log that its API is backwardly compatible with 4.1, >so perhaps updating PySparse won't require much more than merging the >new UMFPACK source tree. > >I suggest we adopt all three of PySparse's implementations: LL, CSR, and >SSS. > Don't forget about CSC.... In fact, I think the sparse matrix should allow a number of different formats. Obviously a few will be used internally more than others. But we shouldn't enforce a certain structure too early. The LL format is *much* better for building sparse matrices, but CSR or CSC is much better for "solving" them. I suspect that most of the speed gain over scipy.sparse is the LL extension type, which makes it faster to build sparse matrices. I'd happily welcome those extension types, but let's see if we can't make them all subclasses of one base class. Look at how the scipy sparse Python classes are laid out. Basically, by defining a tocsc and fromcsc, all of them can be converted to each other and used with solvers. I did put some effort into the structure of scipy.sparse. I did not put any effort into optimizations, though.
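The conversion-based design Travis describes — every format implementing only tocsc()/fromcsc(), with everything else written once against CSC — can be sketched in a few lines. Class names, the (row, col) -> value dict, and the (col, row, value) tuple list are illustrative stand-ins, not the actual scipy.sparse API:

```python
# Minimal sketch of "all formats are subclasses of one base class,
# interconvertible via CSC as the common exchange format".
class spmatrix:
    def tocsc(self):
        raise NotImplementedError
    @classmethod
    def fromcsc(cls, csc):
        raise NotImplementedError
    def convert(self, other_cls):
        # any format -> any other format, going through CSC
        return other_cls.fromcsc(self.tocsc())

class ll_matrix(spmatrix):   # linked-list-like: cheap to build incrementally
    def __init__(self, entries=None):
        self.entries = dict(entries or {})          # {(row, col): value}
    def tocsc(self):
        # column-major (col, row, value) triples
        return sorted((j, i, v) for (i, j), v in self.entries.items())
    @classmethod
    def fromcsc(cls, csc):
        return cls({(i, j): v for j, i, v in csc})

class csc_matrix(spmatrix):  # column-ordered: what the solvers want
    def __init__(self, csc):
        self.csc = list(csc)
    def tocsc(self):
        return list(self.csc)
    @classmethod
    def fromcsc(cls, csc):
        return cls(csc)

m = ll_matrix({(0, 0): 1.0, (1, 1): 2.0})
c = m.convert(csc_matrix)
print(c.csc)  # [(0, 0, 1.0), (1, 1, 2.0)]
```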
-Travis From Fernando.Perez at colorado.edu Thu Oct 13 13:59:48 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 13 Oct 2005 11:59:48 -0600 Subject: [SciPy-dev] Package organization In-Reply-To: <434E0B96.4050005@ucsd.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> <434E0B96.4050005@ucsd.edu> Message-ID: <434EA094.9030002@colorado.edu> Robert Kern wrote: > I'm not entirely convinced that everything needs to live in the scipy > namespace, though. I think I overstated the build-time benefits of being > in scipy itself. I think we can make most of those benefits accessible > to packages that aren't living in scipy.* . I haven't been keeping up > with the AstroPy discussions, but I doubt they'd really want to do > > from scipy.kits.astropy import wcs > > etc. "Flat is better than nested," and all that. Yes, but there's a dark side to that motto, rarely mentioned: 'too flat leads to name clashes and silent shadowings'. That's exactly what happens with python: the search space is a single, flat list searched in order (as specified in some internal build-time paths extended with $PYTHONPATH, resulting in sys.path). This means that at any time, the appearance of a new package with the same name as another in a directory which is listed earlier in sys.path can shadow the older package with no warning. I think that scipy is a large enough collection of tools that it's worth creating at least one level of namespace protection for us against the above. Indeed, it is shorter to type from astropy import wcs and perhaps the astropy community (and others) would find the scipy.kits prefix annoying, I don't know. But I do see merit to having a namespace umbrella that may justify that price to be paid: - scipy.kits? or help(scipy.kits) will show you all the toolkits you have installed. I think this can help new users a lot, and even experienced ones, as scipy grows. 
Think of how much people love the integrated help indices in Mathematica or matlab... - automatic documentation facilities can collect and generate html/pdf docs of the scipy core/full AND the existing toolkits, including code examples. - with the addition of a single environment variable (say SCIPY_KIT_PATH), we can even allow users to manage local collections of toolkits outside of the sysadmin-controlled filesystem (critical for non-root usage). Obviously PYTHONPATH already provides this for import, but explicitly designating a toolkit search path will tell scipy where to look at runtime for toolkits, for the purposes of documentation/help/indexing (much like MayaVi allows for user-level local extensions for data sources, filters and modules). If the overall idea seems to anyone to have merit but the issue is with too much nesting, would a simple renaming help? from scikit.astro import wcs for example. It does bring the nesting cost down to a single lookup level (one dot instead of two, which means one less dictionary lookup) and 7 characters total. > I *think* a package can get all the benefits of "being in scipy" simply > by depending on parts of scipy and using the tools appropriately. But to > really figure this out, we need to come down to cases. Let's get our > house in order first, and then we can talk about how we deal with guests. I just want to make sure that we at least build a guest room in the house, instead of asking the guests to sleep in the garage :) We can paint it later, but we should at least plan for its existence now. Anyway, if people don't like the idea I'll just drop the discussion so that those coding can get back to their job, and I can get back to the long-abandoned ipython. I'll live with the occasional PYTHONPATH-induced name clash, but I do think it would be worth taking this opportunity now to future-proof scipy a little in this regard. 
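Fernando's SCIPY_KIT_PATH idea could look something like the sketch below. The variable name and the directory layout are the email's proposal, not an implemented feature; the helpers only show how scipy might discover toolkits for help/indexing purposes, independently of PYTHONPATH:

```python
import os

def kit_dirs(environ):
    """Split the proposed SCIPY_KIT_PATH variable into directory names."""
    return [d for d in environ.get("SCIPY_KIT_PATH", "").split(os.pathsep) if d]

def find_toolkits(search_dirs):
    """Names of packages (subdirectories with an __init__.py) on the path."""
    kits = []
    for d in search_dirs:
        if os.path.isdir(d):
            for name in sorted(os.listdir(d)):
                if os.path.isfile(os.path.join(d, name, "__init__.py")):
                    kits.append(name)
    return kits

print(kit_dirs({"SCIPY_KIT_PATH": os.pathsep.join(["/opt/kits", "/home/me/kits"])}))
# ['/opt/kits', '/home/me/kits']
```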
Cheers, f From Fernando.Perez at colorado.edu Thu Oct 13 14:08:50 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 13 Oct 2005 12:08:50 -0600 Subject: [SciPy-dev] Package organization In-Reply-To: <434EA094.9030002@colorado.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> <434E0B96.4050005@ucsd.edu> <434EA094.9030002@colorado.edu> Message-ID: <434EA2B2.30503@colorado.edu> Fernando Perez wrote: > Robert Kern wrote: > > >>I'm not entirely convinced that everything needs to live in the scipy >>namespace, though. I think I overstated the build-time benefits of being >>in scipy itself. I think we can make most of those benefits accessible >>to packages that aren't living in scipy.* . I haven't been keeping up >>with the AstroPy discussions, but I doubt they'd really want to do [...] > Anyway, if people don't like the idea I'll just drop the discussion so that > those coding can get back to their job, and I can get back to the > long-abandoned ipython. I'll live with the occasional PYTHONPATH-induced name > clash, but I do think it would be worth taking this opportunity now to > future-proof scipy a little in this regard. A clarification: I don't propose that every package out there which has an 'import scipy' in it should be put into the toolkits, certainly not. I think of the toolkits as a convenient way to collect small to medium-sized libraries which serve well-defined purposes. And finally, this would obviously be entirely optional and up to package authors. Anyone can put up their 2-bit lemonade stand on sourceforge and distribute PyTimeTravel as a separate package which uses scipy, and nobody will ever force them to change. OK, now back to real work. I _really_ should shut up, and let you guys just make a decision you like.
Cheers, f From rkern at ucsd.edu Thu Oct 13 16:35:35 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 13 Oct 2005 13:35:35 -0700 Subject: [SciPy-dev] Package organization In-Reply-To: References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> <434E7A8B.1060105@ucsd.edu> Message-ID: <434EC517.2020007@ucsd.edu> Pearu Peterson wrote: > > On Thu, 13 Oct 2005, Robert Kern wrote: >>scipy.lib puts a lot of things which are unrelated together. That makes >>packaging harder. > > Hmm, how does it make packaging harder? If all of the SUNDIALS-related code is under scipy/sundials, then I can build eggs that look like this: scipy_sundials-.egg/ scipy_sundials-.egg/scipy/ scipy_sundials-.egg/scipy/sundials/ scipy_sundials-.egg/scipy/sundials/__init__.py scipy_sundials-.egg/scipy/sundials/_sundials.so scipy_sundials-.egg/scipy/sundials/sundials.py scipy_integrate-.egg/ scipy_integrate-.egg/scipy/ scipy_integrate-.egg/scipy/integrate/ scipy_integrate-.egg/scipy/integrate/__init__.py scipy_integrate-.egg/scipy/integrate/_quadpack.so scipy_integrate-.egg/scipy/integrate/quadpack.py scipy_integrate-.egg/scipy/integrate/quadrature.py rather than scipy_sundials-.egg/ scipy_sundials-.egg/scipy/ scipy_sundials-.egg/scipy/lib/ scipy_sundials-.egg/scipy/lib/sundials/ scipy_sundials-.egg/scipy/lib/sundials/_sundials.so scipy_sundials-.egg/scipy/integrate/sundials.py scipy_integrate-.egg/ scipy_integrate-.egg/scipy/ scipy_integrate-.egg/scipy/lib/ scipy_integrate-.egg/scipy/lib/quadpack/ scipy_integrate-.egg/scipy/lib/quadpack/_quadpack.so scipy_integrate-.egg/scipy/integrate/ scipy_integrate-.egg/scipy/integrate/__init__.py scipy_integrate-.egg/scipy/integrate/quadpack.py scipy_integrate-.egg/scipy/integrate/quadrature.py Now, with the former I have two, well-defined subpackages that never change. They're either present or they're not. 
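Mechanically, what lets two such eggs share the single top-level scipy package is namespace-package machinery. The demo below simulates two separately installed distributions and merges their scipy/ directories with the stdlib pkgutil.extend_path (setuptools eggs achieve the same effect with pkg_resources.declare_namespace); the distribution names are taken from the listing above, everything else is scratch data:

```python
import os
import sys
import tempfile
from pkgutil import extend_path

# Inside a real scipy/__init__.py the idiom would be:
#     from pkgutil import extend_path
#     __path__ = extend_path(__path__, __name__)
# Here we simulate it: two distributions each ship a scipy/ directory.
base = tempfile.mkdtemp()
for dist in ("scipy_integrate", "scipy_sundials"):
    pkg = os.path.join(base, dist, "scipy")
    os.makedirs(pkg)
    open(os.path.join(pkg, "__init__.py"), "w").close()

sys.path[:0] = [os.path.join(base, d) for d in ("scipy_integrate", "scipy_sundials")]

# Start from one distribution's scipy/ dir; extend_path pulls in the other.
merged = extend_path([os.path.join(base, "scipy_integrate", "scipy")], "scipy")
print(os.path.join(base, "scipy_sundials", "scipy") in merged)  # True
```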
With the latter, the contents of scipy.integrate change depending on whether or not scipy_sundials is installed. On top of that, eggs can handle namespace packages, but only one level down, so to speak. I can have several eggs exposing the scipy.* namespace and have them all treated as a single package, but I can't have several eggs exposing the scipy.lib.* namespace and have them all treated uniformly. Adding that functionality to the egg runtime would really complicate matters. Perhaps you could explain to me how we benefit putting all of the raw wrappers into scipy.lib.*. I don't really see any. >>On top of that, you're still importing scipy.lib.sundials, >>scipy.lib.odepack, and scipy.lib.quadpack from scipy.integrate. That >>means you can't really install that functionality separately. > > May be I am not quite following you here but I don't understand the > "can't" part. Could give an example how would you organize > sundials,odepack,quadpack for scipy.integrate? I would make scipy.sundials completely separate from scipy.integrate. If I were starting scipy all over again, I would also make scipy.ode for ODEPACK and VODE and leave scipy.integrate for QUADPACK and the other "integrals of functions" routines. I'm not sure it's worth doing the latter at this point. (Oh, and thank you for reducing the default chattiness of ScipyTest. I appreciate it.) -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From pearu at scipy.org Thu Oct 13 16:17:50 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 13 Oct 2005 15:17:50 -0500 (CDT) Subject: [SciPy-dev] Package organization In-Reply-To: <434EC517.2020007@ucsd.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> <434E7A8B.1060105@ucsd.edu> <434EC517.2020007@ucsd.edu> Message-ID: On Thu, 13 Oct 2005, Robert Kern wrote: > Pearu Peterson wrote: >> >> On Thu, 13 Oct 2005, Robert Kern wrote: > >>> scipy.lib puts a lot of things which are unrelated together. That makes >>> packaging harder. >> >> Hmm, how does it make packaging harder? > > If all of the SUNDIALS-related code is under scipy/sundials, then I can > build eggs that look like this: > > scipy_sundials-.egg/ > scipy_sundials-.egg/scipy/ > scipy_sundials-.egg/scipy/sundials/ > scipy_sundials-.egg/scipy/sundials/__init__.py > scipy_sundials-.egg/scipy/sundials/_sundials.so > scipy_sundials-.egg/scipy/sundials/sundials.py > > scipy_integrate-.egg/ > scipy_integrate-.egg/scipy/ > scipy_integrate-.egg/scipy/integrate/ > scipy_integrate-.egg/scipy/integrate/__init__.py > scipy_integrate-.egg/scipy/integrate/_quadpack.so > scipy_integrate-.egg/scipy/integrate/quadpack.py > scipy_integrate-.egg/scipy/integrate/quadrature.py > > rather than > > scipy_sundials-.egg/ > scipy_sundials-.egg/scipy/ > scipy_sundials-.egg/scipy/lib/ > scipy_sundials-.egg/scipy/lib/sundials/ > scipy_sundials-.egg/scipy/lib/sundials/_sundials.so > scipy_sundials-.egg/scipy/integrate/sundials.py > > scipy_integrate-.egg/ > scipy_integrate-.egg/scipy/ > scipy_integrate-.egg/scipy/lib/ > scipy_integrate-.egg/scipy/lib/quadpack/ > scipy_integrate-.egg/scipy/lib/quadpack/_quadpack.so > scipy_integrate-.egg/scipy/integrate/ > scipy_integrate-.egg/scipy/integrate/__init__.py > scipy_integrate-.egg/scipy/integrate/quadpack.py > scipy_integrate-.egg/scipy/integrate/quadrature.py Sure, it's a bad 
packaging. But why sundials.py should be under integrate? That is, wouldn't scipy_sundials-.egg/ scipy_sundials-.egg/scipy/ scipy_sundials-.egg/scipy/lib/sundials/ scipy_sundials-.egg/scipy/lib/sundials/__init__.py scipy_sundials-.egg/scipy/lib/sundials/_sundials.so scipy_sundials-.egg/scipy/lib/sundials/sundials.py scipy_integrate-.egg/ scipy_integrate-.egg/scipy/ scipy_integrate-.egg/scipy/integrate/ scipy_integrate-.egg/scipy/integrate/__init__.py scipy_integrate-.egg/scipy/integrate/_quadpack.so scipy_integrate-.egg/scipy/integrate/quadpack.py scipy_integrate-.egg/scipy/integrate/quadrature.py be equivalent to what desired? > On top of that, eggs can handle namespace packages, but only one level > down, so to speak. I can have several eggs exposing the scipy.* > namespace and have them all treated as a single package, but I can't > have several eggs exposing the scipy.lib.* namespace and have them all > treated uniformly. Adding that functionality to the egg runtime would > really complicate matters. Ah, ok, it's then all about the limitations of eggs. But what about scipy_lib_sundials-.egg/ scipy_lib_sundials-.egg/scipy/ scipy_lib_sundials-.egg/scipy/lib/sundials/ scipy_lib_sundials-.egg/scipy/lib/sundials/__init__.py scipy_lib_sundials-.egg/scipy/lib/sundials/_sundials.so scipy_lib_sundials-.egg/scipy/lib/sundials/sundials.py ? (I have never used eggs myself, so, ignore my ignorance on this matter) > Perhaps you could explain to me how we benefit putting all of the raw > wrappers into scipy.lib.*. I don't really see any. The scipy namespace will be cleaner, all packages that scipy will contain, are more or less in the same level. Things that are in scipy.lib would be raw wrappers with pythonic interfaces to various libraries that packages in scipy level can use. Packages in scipy.lib would depend only on scipy_core, not on each other. 
>>> On top of that, you're still importing scipy.lib.sundials, >>> scipy.lib.odepack, and scipy.lib.quadpack from scipy.integrate. That >>> means you can't really install that functionality separately. >> >> May be I am not quite following you here but I don't understand the >> "can't" part. Could give an example how would you organize >> sundials,odepack,quadpack for scipy.integrate? > > I would make scipy.sundials completely separate from scipy.integrate. That's what I would also expect and would make myself. > If I were starting scipy all over again, I would also make scipy.ode for > ODEPACK and VODE and leave scipy.integrate for QUADPACK and the other > "integrals of functions" routines. I'm not sure it's worth doing the > latter at this point. I think in scipy 0.1 or so we had all packages, both wrapper packages as well as toolkits kind of packages, in scipy namespace. I think it was a mess. Moving wrapper packages to scipy.lib would reduce this mess a bit. I think that odepack/vode as well as quadpack should be separated from scipy.integrate. My orginal idea was to move the wrappers to scipy.lib and scipy.integrate would just use them from there. The main advantage is that when some other scipy package might use, say, odepack, but not quadpack, then does not need to install the whole scipy.integrate. With this separation, the scipy will be more modular. Pearu From guyer at nist.gov Thu Oct 13 22:43:52 2005 From: guyer at nist.gov (Jonathan Guyer) Date: Thu, 13 Oct 2005 22:43:52 -0400 Subject: [SciPy-dev] Introductions, sparse matrix support In-Reply-To: <434E9F05.3070804@ee.byu.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434E2610.7050200@ntc.zcu.cz> <434E3F3D.7000707@ftw.at> <434E9F05.3070804@ee.byu.edu> Message-ID: <3d502007996f0f09808ed0b07bf3a7b7@nist.gov> On Oct 13, 2005, at 1:53 PM, Travis Oliphant wrote: > I'd happily welcome those extension types, but let's see if we can't > make them all subclasses of one base-class. 
> Look at how the scipy sparse Python classes are laid out. Basically,
> by defining a tocsc and fromcsc, all of them can be converted to each
> other and used as solvers.
>
> I did put some effort into the structure of scipy.sparse. I did not
> put any effort into optimizations, though.

I think this is important. PySparse is neither very object oriented nor
very "Pythonic". I think the API can be done much better [*] (and
scipy.sparse may be it; I haven't had time yet to do anything practical
with it). I don't mean to denigrate PySparse; we happily use it, and
Roman has been very open to our suggestions. I just think that we can
learn from it and do better.

[*] For that matter, the C code's not very pretty either, but that's
less important to me than the design of the Python API.

-- 
Jonathan E. Guyer, PhD
Metallurgy Division
National Institute of Standards and Technology

From pearu at scipy.org Fri Oct 14 03:14:49 2005
From: pearu at scipy.org (Pearu Peterson)
Date: Fri, 14 Oct 2005 02:14:49 -0500 (CDT)
Subject: [SciPy-dev] BUG: dot segfaults on float/complex arrays
Message-ID: 

Bug report: dot segfaults when applied to float or complex arrays.
Integer arrays seem to be ok.

In [1]: from scipy import *

In [2]: dot(array([1.]),array([2]))
Segmentation fault

Pearu

From pearu at scipy.org Fri Oct 14 03:18:06 2005
From: pearu at scipy.org (Pearu Peterson)
Date: Fri, 14 Oct 2005 02:18:06 -0500 (CDT)
Subject: [SciPy-dev] BUG: dot segfaults on float/complex arrays
In-Reply-To: 
References: 
Message-ID: 

Please ignore this message:

  rm -rf build
  python setup.py install

resolved the problem!

Sorry for the noise,
Pearu

On Fri, 14 Oct 2005, Pearu Peterson wrote:

>
> Bug report: dot segfaults when applied to float or complex arrays. Integer
> arrays seem to be ok.
> > In [1]: from scipy import * > > In [2]: dot(array([1.]),array([2])) > Segmentation fault > > Pearu > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From oliphant at ee.byu.edu Fri Oct 14 04:43:48 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 14 Oct 2005 02:43:48 -0600 Subject: [SciPy-dev] BUG: dot segfaults on float/complex arrays In-Reply-To: References: Message-ID: <434F6FC4.7090003@ee.byu.edu> Pearu Peterson wrote: >Please ignore this message: > rm -rf build > python setup.py install >resolved the problem! > >Sorry for the noise, >Pearu > >On Fri, 14 Oct 2005, Pearu Peterson wrote: > > > >>Bug report: dot segfaults when applied to float or complex arrays. Integer >>arrays seem to be ok. >> >> I added a new function to the array c-api. When that happens, everything needs to be rebuilt.... I'm hoping by the time scipy is ported, the array API will stabilize. -Travis From nwagner at mecha.uni-stuttgart.de Fri Oct 14 04:48:32 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 14 Oct 2005 10:48:32 +0200 Subject: [SciPy-dev] BUG: dot segfaults on float/complex arrays In-Reply-To: References: Message-ID: <434F70E0.8040607@mecha.uni-stuttgart.de> Pearu Peterson wrote: >Bug report: dot segfaults when applied to float or complex arrays. Integer >arrays seem to be ok. 
>
> In [1]: from scipy import *
>
> In [2]: dot(array([1.]),array([2]))
> Segmentation fault
>
> Pearu
>

I cannot reproduce the segfault

>>> from scipy import *
>>> dot(array([1.]),array([2]))
2.0
>>> scipy.base.__version__
'0.4.3.1277'
>>>

From arnd.baecker at web.de Fri Oct 14 05:48:14 2005
From: arnd.baecker at web.de (Arnd Baecker)
Date: Fri, 14 Oct 2005 11:48:14 +0200 (CEST)
Subject: [SciPy-dev] scipy new core, build remarks
Message-ID: 

Hi,

another morning, a new build ;-) Looks fine!
There is only one warning (apart from those mentioned yesterday)
concerning dotblas:

creating build/temp.linux-x86_64-2.4/scipy/corelib/blasdot
compile options: '-DATLAS_INFO="\"3.7.11\"" -Iscipy/corelib/blasdot
-I/scr/python/include -Iscipy/base/include -Ibuild/src/scipy/base
-Iscipy/base/src -I/scr/python/include/python2.4 -c'
gcc: scipy/corelib/blasdot/_dotblas.c
scipy/corelib/blasdot/_dotblas.c: In function `dotblas_vdot':
scipy/corelib/blasdot/_dotblas.c:691: warning: passing arg 2 of pointer
to function from incompatible pointer type
gcc -pthread -shared
build/temp.linux-x86_64-2.4/scipy/corelib/blasdot/_dotblas.o
-L/scr/python/lib64 -lptf77blas -lptcblas -latlas
-o build/lib.linux-x86_64-2.4/scipy/lib/_dotblas.so

The reason for this is that in scipy/base/src/arrayobject.c the
declaration is

  static PyObject *
  PyArray_FromDims(int nd, int *d, int type)

whereas in _dotblas.c, routine dotblas_vdot:

  ret = (PyArrayObject *)PyArray_FromDims(0, dimensions, typenum);

is used with declaration

  intp dimensions[MAX_DIMS];

So either "intp"->"int" or "int"->"intp", I'd guess ;-)...
All tests run without error:

  scipy.test(10)
  Found 2 tests for scipy.base.umath
  Found 23 tests for scipy.base.function_base
  Found 2 tests for scipy.base.getlimits
  Found 9 tests for scipy.base.twodim_base
  Found 3 tests for scipy.base.matrix
  Found 44 tests for scipy.base.shape_base
  Found 3 tests for scipy.basic.helper
  Found 42 tests for scipy.base.type_check
  Found 4 tests for scipy.base.index_tricks
  Found 0 tests for __main__
  Ran 132 tests in 0.362s

Installing new scipy works fine as well.

Concerning imports:

"Old" scipy:

In [1]: import scipy
In [2]: scipy.special.j0(10)
Out[2]: -0.24593576445134832

New scipy:

In [1]: import scipy
In [2]: scipy.special.j0(10)
---------------------------------------------------------------------------
exceptions.AttributeError        Traceback (most recent call last)
AttributeError: 'module' object has no attribute 'special'

Instead one has to do:

In [3]: import scipy.special
In [4]: scipy.special.j0(10)
Out[4]: -0.24593576445134829

Is it planned (would it be possible?) that the "old" way works as well?

Best,

Arnd

From pearu at scipy.org Fri Oct 14 04:52:32 2005
From: pearu at scipy.org (Pearu Peterson)
Date: Fri, 14 Oct 2005 03:52:32 -0500 (CDT)
Subject: [SciPy-dev] scipy new core, build remarks
In-Reply-To: 
References: 
Message-ID: 

On Fri, 14 Oct 2005, Arnd Baecker wrote:

> "Old" scipy:
>
> In [1]: import scipy
> In [2]: scipy.special.j0(10)
> Out[2]: -0.24593576445134832
>
> New scipy:
>
> In [1]: import scipy
> In [2]: scipy.special.j0(10)
> ---------------------------------------------------------------------------
> exceptions.AttributeError        Traceback (most recent call last)
> AttributeError: 'module' object has no attribute 'special'
>
> Instead one has to do:
>
> In [3]: import scipy.special
> In [4]: scipy.special.j0(10)
> Out[4]: -0.24593576445134829
>
> Is it planned (would it be possible?) that the "old" way works as well?

Yes! I am working on it at the moment.
Pearu

From nwagner at mecha.uni-stuttgart.de Fri Oct 14 10:30:00 2005
From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
Date: Fri, 14 Oct 2005 16:30:00 +0200
Subject: [SciPy-dev] 0.4.3.1284
Message-ID: <434FC0E8.8060804@mecha.uni-stuttgart.de>

Traceback (most recent call last):
  File "", line 1, in ?
  File "/usr/local/lib/python2.4/site-packages/scipy/stats/__init__.py", line 7, in ?
    from stats import *
  File "/usr/local/lib/python2.4/site-packages/scipy/stats/stats.py", line 199, in ?
    import scipy.linalg as linalg
  File "/usr/local/lib/python2.4/site-packages/scipy/linalg/__init__.py", line 8, in ?
    from basic import *
  File "/usr/local/lib/python2.4/site-packages/scipy/linalg/basic.py", line 223, in ?
    import decomp
  File "/usr/local/lib/python2.4/site-packages/scipy/linalg/decomp.py", line 452, in ?
    eps = scipy.utils.limits.double_epsilon
AttributeError: 'module' object has no attribute 'utils'

>>> scipy.base.__version__
'0.4.3.1284'

From nwagner at mecha.uni-stuttgart.de Fri Oct 14 10:33:43 2005
From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
Date: Fri, 14 Oct 2005 16:33:43 +0200
Subject: [SciPy-dev] NameError: name 'pi' is not defined
Message-ID: <434FC1C7.4090203@mecha.uni-stuttgart.de>

>>> import scipy.stats
Traceback (most recent call last):
  File "", line 1, in ?
  File "/usr/local/lib/python2.4/site-packages/scipy/stats/__init__.py", line 7, in ?
    from stats import *
  File "/usr/local/lib/python2.4/site-packages/scipy/stats/stats.py", line 1503, in ?
    import distributions
  File "/usr/local/lib/python2.4/site-packages/scipy/stats/distributions.py", line 845, in ?
    anglit = anglit_gen(a=-pi/4,b=pi/4,name='anglit', extradoc="""
NameError: name 'pi' is not defined

From aisaac at american.edu Fri Oct 14 12:03:50 2005
From: aisaac at american.edu (Alan Isaac)
Date: Fri, 14 Oct 2005 12:03:50 -0400 (EDT)
Subject: [SciPy-dev] BUG: dot segfaults on float/complex arrays
In-Reply-To: 
References: 
Message-ID: 

On Fri, 14 Oct 2005, Pearu Peterson wrote:

> Bug report: dot segfaults when applied to float or complex arrays.
> Integer arrays seem to be ok.

> In [1]: from scipy import *

> In [2]: dot(array([1.]),array([2]))
> Segmentation fault

Using scipy_core Windows binary, I don't see this.

Alan Isaac

Python 2.4.1 (#65, Mar 30 2005, 09:13:57) [MSC v.1310 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import scipy as S
>>> S.dot(S.array([1.]),S.array([2]))
2.0
>>> S.__version__
Traceback (most recent call last):
  File "", line 1, in ?
AttributeError: 'module' object has no attribute '__version__'
>>>

From pearu at scipy.org Fri Oct 14 11:09:30 2005
From: pearu at scipy.org (Pearu Peterson)
Date: Fri, 14 Oct 2005 10:09:30 -0500 (CDT)
Subject: [SciPy-dev] BUG: dot segfaults on float/complex arrays
In-Reply-To: 
References: 
Message-ID: 

On Fri, 14 Oct 2005, Alan Isaac wrote:

> On Fri, 14 Oct 2005, Pearu Peterson wrote:
>
>> Bug report: dot segfaults when applied to float or complex arrays.
>> Integer arrays seem to be ok.
>
>> In [1]: from scipy import *
>
>> In [2]: dot(array([1.]),array([2]))
>> Segmentation fault
>
> Using scipy_core Windows binary, I don't see this.

And you shouldn't. My error report was premature, as I already mentioned
in my previous message.

Pearu

From oliphant at ee.byu.edu Fri Oct 14 12:13:32 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Fri, 14 Oct 2005 10:13:32 -0600
Subject: [SciPy-dev] Python: package scipy
In-Reply-To: <434FD386.9030704@sympatico.ca>
References: <434FD386.9030704@sympatico.ca>
Message-ID: <434FD92C.5090600@ee.byu.edu>

Colin J.
Williams wrote: > You asked for a pydoc of scipy. I hope that this helps. > > Colin W. > > http://www3.sympatico.ca/cjw/SciPy/ That's a great start. If this gets fleshed out so all the links work, then we should place a link to it so it is visible to everybody. -Travis From aisaac at american.edu Fri Oct 14 12:22:14 2005 From: aisaac at american.edu (Alan Isaac) Date: Fri, 14 Oct 2005 12:22:14 -0400 (EDT) Subject: [SciPy-dev] BUG: dot segfaults on float/complex arrays In-Reply-To: References: Message-ID: On Fri, 14 Oct 2005, Pearu Peterson wrote: > And you shouldn't. My error report was premature as I > already mentioned in previous message. Yes. I overlooked your response when I wrote. Sorry for the noise. But I meant my report to add a question: should scipy report a version number? >>> import scipy as S >>> S.__version__ Traceback (most recent call last): File "", line 1, in ? AttributeError: 'module' object has no attribute '__version__' Cheers, Alan From arnd.baecker at web.de Fri Oct 14 12:49:38 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 14 Oct 2005 18:49:38 +0200 (CEST) Subject: [SciPy-dev] BUG: dot segfaults on float/complex arrays In-Reply-To: References: Message-ID: On Fri, 14 Oct 2005, Alan Isaac wrote: > But I meant my report to add a question: > should scipy report a version number? > > >>> import scipy as S > >>> S.__version__ > Traceback (most recent call last): > File "", line 1, in ? > AttributeError: 'module' object has no attribute '__version__' S.base.__version__ shows the version of scipy core. 
Best,

Arnd

From Fernando.Perez at colorado.edu Fri Oct 14 17:22:10 2005
From: Fernando.Perez at colorado.edu (Fernando Perez)
Date: Fri, 14 Oct 2005 15:22:10 -0600
Subject: [SciPy-dev] Itianum feedback
In-Reply-To: <43501F2F.1080901@colorado.edu>
References: <434D913D.203@colorado.edu> <434D92FC.6030609@colorado.edu>
 <434D932D.80309@colorado.edu> <43501F2F.1080901@colorado.edu>
Message-ID: <43502182.8010501@colorado.edu>

Hi Andrew, welcome to scipy-dev :) You may want to subscribe to continue
this discussion on-list.

Scipy-dev: I'm forwarding here info from a colleague, regarding build
information on Itanium machines. A bit of info about the hardware:

~>uname -a
Linux XXXXXXXX.colorado.edu 2.4.21-32.0.1.EL #1 SMP Tue May 17 18:02:44
EDT 2005 ia64 ia64 ia64 GNU/Linux

~>cat /proc/cpuinfo
processor  : 0
vendor     : GenuineIntel
arch       : IA-64
family     : Itanium 2
model      : 1
revision   : 5
archrev    : 0
features   : branchlong
cpu number : 0
cpu regs   : 4
cpu MHz    : 1300.000000
itc MHz    : 1300.000000
BogoMIPS   : 1946.15

[repeated 4 times, it's a 4-cpu box]

~>free
             total       used       free     shared    buffers     cached
Mem:      24779536   13069312   11710224          0     808784   10497376
-/+ buffers/cache:   1763152   23016384
Swap:     51477632        960   51476672

~>cat /etc/issue
Red Hat Enterprise Linux AS release 3 (Taroon Update 6)

~>gcc -v
Reading specs from /usr/lib/gcc-lib/ia64-redhat-linux/3.2.3/specs
Configured with: ../configure --prefix=/usr --mandir=/usr/share/man
--infodir=/usr/share/info --enable-shared --enable-threads=posix
--disable-checking --with-system-zlib --enable-__cxa_atexit
--host=ia64-redhat-linux
Thread model: posix
gcc version 3.2.3 20030502 (Red Hat Linux 3.2.3-53)

~>python
Python 2.2.3 (#1, Feb 2 2005, 12:38:05)
[GCC 3.2.3 20030502 (Red Hat Linux 3.2.3-49)] on linux2

Travis et al: I have access to these boxes and I'm upstairs from Andrew,
so I'll be happy to help with further testing of this.
Along with Arnd's work on the opterons, it would be great if we could
confirm that upon release, scipycore/new both build and pass all tests
on both Opteron and Itanium.

Andrew's original message follows unmodified.

Cheers,

f

Andrew Docherty wrote:

> Hi Fernando,
>
> You might be interested to know that I managed to get scipy_core to
> build and import on the itanium.
>
> The only problem was that it tries to link files with the fortran
> compiler, making it necessary to change "F77" to "gcc -shared" manually
> for each linking. This is rather odd, and I couldn't figure out why.
>
> I didn't manage to get it to work with the intel compiler either, just
> the gnu c compiler so far; the intel compiler needs some more options to
> link the shared libraries, I think.
>
> I got newscipy on svn to compile, after a few changes, but unfortunately
> fftpack in newscipy doesn't seem to be working at all, so no fftw as
> yet. And as all my code depends on ffts I can't test anything yet...

From cookedm at physics.mcmaster.ca Fri Oct 14 17:37:05 2005
From: cookedm at physics.mcmaster.ca (David M. Cooke)
Date: Fri, 14 Oct 2005 17:37:05 -0400
Subject: [SciPy-dev] mtrand.c compiler warnings
In-Reply-To: (David M. Cooke's message of "Thu, 13 Oct 2005 11:46:40 -0400")
References: <434E7B86.5040309@ucsd.edu>
Message-ID: 

cookedm at physics.mcmaster.ca (David M. Cooke) writes:

> Robert Kern writes:
>
>> Pearu Peterson wrote:
>>> Hi Robert,
>>>
>>> Compiling mtrand.c throws lots of compiler warnings. Is it safe to
>>> hand-edit mtrand.c to fix these warnings? Or should pyrex be fixed to
>>> get rid of these warnings?
>>
>> No, it's not safe to hand-edit mtrand.c . It really shouldn't even be in
>> the repository, but I didn't want to make Pyrex an explicit
>> build-dependency.
>>
>> Pyrex *ought* to be fixed to get rid of these warnings, but no one has
>> really done the leg work to do so. None of the warnings actually matter,
>> though.
>> Does the distutils compiler abstraction easily allow silencing
>> warnings? With gcc, you can just add -w, but people may not be using
>> gcc.
>
> I've got a patch to Pyrex that fixes a lot of the unused label
> warnings; I'll post it to the Pyrex list, and redo the mtrand.c
> creation.

... and done. I did have to make a postprocessor to clean up the last
few 'assignment from incompatible pointer type' warnings, as fixing that
in Pyrex looks messy. generate_mtrand_c.py runs mtrand.pyx through
pyrexc, and postprocesses it.

-- 
|>|\/|<
/--------------------------------------------------------------------------\
|David M. Cooke                      http://arbutus.physics.mcmaster.ca/dmc/
|cookedm at physics.mcmaster.ca

From oliphant at ee.byu.edu Fri Oct 14 17:45:51 2005
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Fri, 14 Oct 2005 15:45:51 -0600
Subject: [SciPy-dev] Package organization
Message-ID: <4350270F.8090308@ee.byu.edu>

I really like the discussion that is occurring. Part of the problem with
the current system is that there was no discussion, and so it just
evolved. I'm not against major surgery on the organization as long as
there is some mechanism for helping old users move forward. That's why
scipy has not reached 1.0 yet -- because of code organization issues.

First. I like what Fernando is trying to do with the scipy.kits concept.
The most valuable thing scipy could provide is a mechanism for
auto-indexing your scipy-related packages for a help browser. I don't
think packages would need to live under scipy to accomplish this (but it
would make it easier). There are other mechanisms, like the environment
variable concept, that could also collect package information. In fact,
I wonder if scipy could detect that some module imported it and add that
to some internal index (or only add it if the module defined some
__scipy__ name or something...

Second. Regarding the debate over the .lib directory.
I could accept the fact that an extremely flat packaging structure has
advantages. After all, MATLAB "gets away" with an enormously flat
system. I also see the dependency problem with a hierarchical approach.
I think Robert raises a valid point, that a nested hierarchy is largely
just a documentation/indexing issue. It could be handled using some
standard form of keywords/GAMS classification numbers in the
sub-packages themselves.

For example, in optimize.py:

  __gamskeys__ = {'fmin': 'G1a1a'}

Then scipy could provide an indexing system that would pick all of these
names up and produce a GAMS hierarchy of the code available.

I think one of the purposes of the .lib directory is to extract those
elements of the current full scipy that cause interdependencies among
the modules into a 'library' directory that holds the actual
dependencies. It could still serve that purpose. If we just start moving
things to the library on a case-by-case basis, it might help us decide
what should really go there.

While I don't really have enough data to form a strong opinion, I like
the idea of a flat structure under scipy with perhaps a few
sub-directories like lib and sandbox.

Third: I have no problem with moving the ode solvers out of integrate
into another sub-package.

-Travis

From Fernando.Perez at colorado.edu Fri Oct 14 18:13:40 2005
From: Fernando.Perez at colorado.edu (Fernando Perez)
Date: Fri, 14 Oct 2005 16:13:40 -0600
Subject: [SciPy-dev] Package organization
In-Reply-To: <4350270F.8090308@ee.byu.edu>
References: <4350270F.8090308@ee.byu.edu>
Message-ID: <43502D94.90906@colorado.edu>

Travis Oliphant wrote:

> I really like the discussion that is occurring. Part of the problem
> with the current system is that there was no discussion and so it just
> evolved. I'm not against major surgery on the organization as long as
> there is some mechanism for helping old users move forward. That's why
> scipy has not reached 1.0 yet -- because of code organization issues.
>
> First.
> I like what Fernando is trying to do with the scipy.kits concept. The
> most valuable thing scipy could provide is a mechanism for
> auto-indexing your scipy-related packages for a help browser. I don't
> think packages would need to live under scipy to accomplish this (but
> it would make it easier). There are other mechanisms like the
> environment variable concept that could also collect package
> information. In fact, I wonder if scipy could detect that some module
> imported it and add that to some internal index (or only add it if the
> module defined some __scipy__ name or something...

There is no way that I can think of for scipy to know who has imported
it. This is mainly because on _second_ import, python will fully
short-circuit the 'import scipy' call out from sys.modules, so
regardless of how many nasty stack tricks you want to play inside
scipy.__init__, they'll only work for the first import.

You could define a scipy.register_kit(...) function, which modules that
want to register themselves can call in their __init__ file. I can see
this being useful for certain things, but it still won't solve the
problem of

  import scipy
  scipy.kits?

or help(scipy.kits) or scipy.kits etc. (docs generation, graphical
browsers...). For that, there has to be some mechanism to tell scipy
itself to look for toolkits, regardless of whether they have been
imported yet or not.

Obviously things which have been installed directly in the scipy/kits/
directory on the filesystem can be trivially found, but this still
doesn't address the issue of allowing users to maintain local
collections of toolkits beyond the boundaries of write-only (or
root-only) filesystems. There are two reasonably clean and well-accepted
ways to do this:

1. An environment variable for a scipy search path
2. A ~/.scipyrc file, with a suitable definition of ~ for Win32.
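(To make option 1 concrete, a sketch of the lookup a hypothetical scipy/kits/__init__.py could perform. The variable name SCIPY_KITS_PATH and the helper function are invented for illustration, not an agreed interface.)

```python
# Sketch of option 1: an os.pathsep-separated search path taken from an
# environment variable. SCIPY_KITS_PATH is an invented name.
import os

def find_kit_dirs(environ=None):
    """Return the existing directories named in SCIPY_KITS_PATH, in order."""
    environ = os.environ if environ is None else environ
    raw = environ.get("SCIPY_KITS_PATH", "")
    dirs = []
    for d in raw.split(os.pathsep):
        d = d.strip()
        if d and os.path.isdir(d) and d not in dirs:
            dirs.append(d)
    return dirs

# A kits/__init__.py could then extend its own search path with:
#     __path__.extend(find_kit_dirs())
print(find_kit_dirs({"SCIPY_KITS_PATH": os.curdir}))  # -> ['.']
```

Nonexistent and duplicate entries are silently dropped, so a stale path in the variable degrades gracefully instead of breaking imports.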
Since scipy is a library and not an end-user standalone program, I
personally tend to dislike #2, though it is technically just as feasible
as #1 (and offers room for future tweaks).

Cheers,

f

From pearu at scipy.org Fri Oct 14 17:29:20 2005
From: pearu at scipy.org (Pearu Peterson)
Date: Fri, 14 Oct 2005 16:29:20 -0500 (CDT)
Subject: [SciPy-dev] Package organization
In-Reply-To: <4350270F.8090308@ee.byu.edu>
References: <4350270F.8090308@ee.byu.edu>
Message-ID: 

On Fri, 14 Oct 2005, Travis Oliphant wrote:

> First. I like what Fernando is trying to do with the scipy.kits
> concept. The most valuable thing scipy could provide is a mechanism
> for auto-indexing your scipy-related packages for a help browser. I
> don't think packages would need to live under scipy to accomplish this
> (but it would make it easier). There are other mechanisms like the
> environment variable concept that could also collect package
> information. In fact, I wonder if scipy could detect that some module
> imported it and add that to some internal index (or only add it if the
> module defined some __scipy__ name or something...

That's a difficult problem, and I think with some nasty hacks it could
be solved, e.g. by overwriting the builtin __import__ function and
adding some callback mechanism to it.

> Second. Regarding the debate over the .lib directory. I could accept
> the fact that an extremely flat packaging structure has advantages.
> After all MATLAB "gets away" with an enormously flat system. I also see
> the dependency problem with a hierarchical approach. I think Robert
> raises a valid point, that a nested hierarchy is largely just a
> documentation/indexing issue.

Then we have been talking about different things. I agree that a too
deep hierarchy can cause problems, but I have never thought about it
from the documentation point of view. And I never meant that scipy
packages should be organized with a deep directory structure; a depth of
2 or 3 is more than enough.
> It could be handled using some standard
> form of keywords/GAMS classification numbers in the sub-packages
> themselves.
>
> For example in optimize.py
>
>   __gamskeys__ = {'fmin': 'G1a1a'}
>
> Then scipy could provide an indexing system that would pick all of these
> names up and produce a GAMS hierarchy of code available.

That's a nice idea.

> I think one of the purposes of the .lib directory is to extract those
> elements of the current full scipy that cause interdependencies among
> the modules to a 'library' directory that holds the actual dependencies.
> It could still serve that purpose. If we just start moving things
> to the library on a case-by-case basis, it might help us decide what
> should really go there.
>
> While I don't really have enough data to form a strong opinion, I like
> the idea of a flat structure under scipy with perhaps a few
> sub-directories like lib and sandbox.

I think that putting certain scipy packages under the subdirectories
lib, sandbox, and maybe kits is all that I would wish for. I have
designed scipy.distutils to handle scipy package directories of any
depth, but I hope that the scipy.distutils design did not give the
impression that we would need more than these subdirectories in scipy.
Anyone who has access to fixing this? = /fperez = And by the way, python2.3 is installed under /usr/local/python on Phillips and I used this version of python. From Fernando.Perez at colorado.edu Fri Oct 14 18:48:11 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 14 Oct 2005 16:48:11 -0600 Subject: [SciPy-dev] Package organization In-Reply-To: References: <4350270F.8090308@ee.byu.edu> Message-ID: <435035AB.3050601@colorado.edu> Pearu Peterson wrote: > > On Fri, 14 Oct 2005, Travis Oliphant wrote: > > >>First. I like what Fernando is trying to do with the scipy.kits >>concept. The most valuable thing scipy could provide is a mechanism >>for auto-indexing your scipy-related packages for a help browser. I >>don't think packages would need to live under scipy to accomplish this >>(but it would make it easier). There are other mechanism like the >>environment variable concept that could also collect package >>information. In fact, I wonder if scipy could detect that some module >>imported it and add that to some internal index (or only add it if the >>module defined some __scipy__ name or something... > > > That's a difficult problem and I think with some nasty hacks it could be > solved, e.g. overwriting builtin __import__ function by adding to it some > callback mechanism. Ah, I'd forgotten about __import__, sorry. So yes, using this one, a hack could be written to trap 'import scipy' calls, which would work on every call (not just the first). Always forgetting something... Cheers, f From rkern at ucsd.edu Fri Oct 14 21:03:33 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 14 Oct 2005 18:03:33 -0700 Subject: [SciPy-dev] Package organization In-Reply-To: <4350270F.8090308@ee.byu.edu> References: <4350270F.8090308@ee.byu.edu> Message-ID: <43505565.9010103@ucsd.edu> Travis Oliphant wrote: > I really like the discussion that is occurring. Part of the problem > with the current system is that there was no discussion and so it just > evolved. 
I'm not against major surgery on the organization as long as > there is some mechanism for helping old users move forward. That's why > scipy has not reached 1.0 yet -- because of code organization issues. > > First. I like what Fernando is trying to do with the scipy.kits > concept. The most valuable thing scipy could provide is a mechanism > for auto-indexing your scipy-related packages for a help browser. I > don't think packages would need to live under scipy to accomplish this > (but it would make it easier). There are other mechanism like the > environment variable concept that could also collect package > information. In fact, I wonder if scipy could detect that some module > imported it and add that to some internal index (or only add it if the > module defined some __scipy__ name or something... I'm pretty sure we can implement these mechanisms using pkg_resources and eggs. http://peak.telecommunity.com/DevCenter/PkgResources Specifically, look at the section on "Entry Points". -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From stephen.walton at csun.edu Sat Oct 15 01:49:42 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Fri, 14 Oct 2005 22:49:42 -0700 Subject: [SciPy-dev] Is scipy_core requiring a FORTRAN compiler? In-Reply-To: <434C34E0.2030000@ee.byu.edu> References: <434C30D5.3040705@stsci.edu> <434C34E0.2030000@ee.byu.edu> Message-ID: <43509876.1030701@csun.edu> Travis Oliphant wrote: >No, no fortran code, If anything it is scipy.distutils choking. > I'm a bit behind on my mail, but have another datum here. I have ATLAS and LAPACK built with Absoft. Even though newcore doesn't contain any Fortran code per se, it does link to libf77blas, and if I build newcore without a config_fc command specifying Absoft, I get an undefined Absoft symbol at lapack_lite load time. 
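(An aside to make the "Entry Points" idea Robert points to concrete. The sketch below uses importlib.metadata, the stdlib descendant of pkg_resources, so it runs without setuptools; the 'scipy.kits' group name is hypothetical.)

```python
# Sketch of entry-point based kit discovery. A kit's setup.py would
# declare something like
#     entry_points={'scipy.kits': ['sundials = scipy_sundials:info']}
# and scipy could then enumerate installed kits WITHOUT importing them.
# importlib.metadata is the stdlib descendant of pkg_resources, and the
# 'scipy.kits' group name is invented for this example.
from importlib.metadata import entry_points

def list_kits(group="scipy.kits"):
    """Return the advertised names in the given entry-point group."""
    eps = entry_points()
    if hasattr(eps, "select"):            # Python 3.10+
        found = eps.select(group=group)
    else:                                 # 3.8/3.9: a dict of group -> list
        found = eps.get(group, [])
    return sorted(ep.name for ep in found)

print(list_kits())  # no installed package registers this invented group -> []
```

The point of the mechanism is the same one Fernando raised for toolkits: discovery happens from installation metadata, so nothing has to be imported (or even importable) for the index to be built.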
From nwagner at mecha.uni-stuttgart.de Sat Oct 15 02:44:32 2005
From: nwagner at mecha.uni-stuttgart.de (Nils Wagner)
Date: Sat, 15 Oct 2005 08:44:32 +0200
Subject: [SciPy-dev] Segmentation fault
Message-ID: 

Found 3 tests for scipy.basic.helper
!! FAILURE importing tests for
/usr/local/lib/python2.4/site-packages/scipy/interpolate/tests/test_fitpack.py:16:
ImportError: No module named scipy_test.testing (in ?)
Found 4 tests for scipy.base.index_tricks
Found 0 tests for __main__

****************************************************************
WARNING: cblas module is empty
-----------
See scipy/INSTALL.txt for troubleshooting.
Notes:
* If atlas library is not found by scipy/system_info.py,
  then scipy uses fblas instead of cblas.
****************************************************************

...

Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 1076102528 (LWP 15907)]
0x40b40c5f in array_from_pyobj (type_num=11, dims=Variable "dims" is not available.
) at build/src/fortranobject.c:651
651         if (PyArray_Check(obj)) { /* here we have always intent(in) or

From joe at enthought.com Sat Oct 15 02:45:40 2005
From: joe at enthought.com (Joe Cooper)
Date: Sat, 15 Oct 2005 01:45:40 -0500
Subject: [SciPy-dev] [Fwd: Re: Itianum feedback]
In-Reply-To: <43503326.5080102@colorado.edu>
References: <43503326.5080102@colorado.edu>
Message-ID: <4350A594.8030900@enthought.com>

I'm not sure what's changed--I haven't touched the plone on scipy.org in
months, possibly years. The actual mailman pages are up:

http://www.scipy.net/mailman/listinfo/scipy-user
http://www.scipy.net/mailman/listinfo/scipy-dev

I'm sure Travis is on the list, but I've cc'ed him anyway, as he
probably has a better chance of knowing what's going on than anyone.
I'm looking into it, but I have a significant Plone barrier that prevents my brain from understanding anything it does or why it does it or how to fix it when it doesn't do what it's supposed to do--I wouldn't give good odds on me being able to fix it. Fernando Perez wrote: > [ Joe, I'm cc-ing you on this one in case this is something under your > umbrella of coverage, please ignore if that's not the case] > > Problems with the mailing lists: > > -------- Original Message -------- > Subject: Re: Itianum feedback > > Hey Fernando, > > Sorry to bother you again .. I did try to sign up to the dev list, but > there's just a black page when you click on "sign-up".. > > > = fperez = > > I just confirmed this. Go here: > > http://www.scipy.org/mailinglists/ > > and click on 'sign-up' for scipy-{user,dev}. Empty page. > > Anyone who has access to fixing this? > > = /fperez = > > > And by the way, python2.3 is installed under /usr/local/python on > Phillips and I used this version of python. From Fernando.Perez at colorado.edu Sat Oct 15 03:19:18 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Sat, 15 Oct 2005 01:19:18 -0600 Subject: [SciPy-dev] [Fwd: Re: Itianum feedback] In-Reply-To: <4350A594.8030900@enthought.com> References: <43503326.5080102@colorado.edu> <4350A594.8030900@enthought.com> Message-ID: <4350AD76.7050103@colorado.edu> Joe Cooper wrote: > I'm not sure what's changed--I haven't touched the plone on scipy.org in > months, possibly years. The actual mailman pages are up: > > http://www.scipy.net/mailman/listinfo/scipy-user > http://www.scipy.net/mailman/listinfo/scipy-dev > > I'm sure Travis is on the list, but I've cc'ed him anyway, as he > probably has a better chance of knowing what's going on than anyone. 
> I'm looking into it, but I have a significant Plone barrier that > prevents my brain from understanding anything it does or why it does it > or how to fix it when it doesn't do what it's supposed to do--I wouldn't > give good odds on me being able to fix it. Cool, thanks: those links work fine. If nothing else, the links on http://www.scipy.org/mailinglists could be fixed to point to the ones you provided above. While on the subject, this would have the added benefit of taking people directly to the non-plone archives pages, which are a LOT easier to read (since the emails are not squeezed inside a box requiring horizontal scrolling on many screens), and load instantly (the plone pages quite often are agonizingly slow to load). It would be nice (IMHO) if the search function gave results on the non-plone version, though that may be too much to ask :) Anyway, many thanks for providing a workaround. Regards, f From joe at enthought.com Sat Oct 15 03:38:55 2005 From: joe at enthought.com (Joe Cooper) Date: Sat, 15 Oct 2005 02:38:55 -0500 Subject: [SciPy-dev] [Fwd: Re: Itianum feedback] In-Reply-To: <4350AD76.7050103@colorado.edu> References: <43503326.5080102@colorado.edu> <4350A594.8030900@enthought.com> <4350AD76.7050103@colorado.edu> Message-ID: <4350B20F.2060200@enthought.com> Fernando Perez wrote: > Joe Cooper wrote: > >> I'm not sure what's changed--I haven't touched the plone on scipy.org >> in months, possibly years. The actual mailman pages are up: >> >> http://www.scipy.net/mailman/listinfo/scipy-user >> http://www.scipy.net/mailman/listinfo/scipy-dev >> >> I'm sure Travis is on the list, but I've cc'ed him anyway, as he >> probably has a better chance of knowing what's going on than anyone. >> I'm looking into it, but I have a significant Plone barrier that >> prevents my brain from understanding anything it does or why it does >> it or how to fix it when it doesn't do what it's supposed to do--I >> wouldn't give good odds on me being able to fix it. 
> > > Cool, thanks: those links work fine. > > If nothing else, the links on > > http://www.scipy.org/mailinglists > > could be fixed to point to the ones you provided above. While on the > subject, this would have the added benefit of taking people directly to > the non-plone archives pages, which are a LOT easier to read (since the > emails are not squeezed inside a box requiring horizontal scrolling on > many screens), and load instantly (the plone pages quite often are > agonizingly slow to load). It would be nice (IMHO) if the search > function gave results on the non-plone version, though that may be too > much to ask :) > > Anyway, many thanks for providing a workaround. Plone is about to get a little bit faster when I finally pull the trigger and complete the migration to the new server (it's still sluggish on the new 3.0 GHz and 2GB RAM box that is new.scipy.org--I think in addition to the system upgrade, it's time to introduce SciPy.org to Squid, I think they'd be good friends). But it won't fix the squeezed in problem. It might be nicer for readers for us to bring the SciPy style over to the Mailman pages rather than forcing mailman to display within SciPy.org...I'll spend some time on it sometime soon. 
From Fernando.Perez at colorado.edu Sat Oct 15 03:47:22 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Sat, 15 Oct 2005 01:47:22 -0600 Subject: [SciPy-dev] [Fwd: Re: Itianum feedback] In-Reply-To: <4350B20F.2060200@enthought.com> References: <43503326.5080102@colorado.edu> <4350A594.8030900@enthought.com> <4350AD76.7050103@colorado.edu> <4350B20F.2060200@enthought.com> Message-ID: <4350B40A.1030000@colorado.edu> Joe Cooper wrote: > Plone is about to get a little bit faster when I finally pull the > trigger and complete the migration to the new server (it's still > sluggish on the new 3.0 GHz and 2GB RAM box that is new.scipy.org--I > think in addition to the system upgrade, it's time to introduce > SciPy.org to Squid, I think they'd be good friends). But it won't fix > the squeezed in problem. It might be nicer for readers for us to bring > the SciPy style over to the Mailman pages rather than forcing mailman to > display within SciPy.org...I'll spend some time on it sometime soon. Wow. You really are right on Plone's sluggishness/resource appetite. The scipy site seems to be relatively static (even if you need a lot of bandwidth for downloads, I can't imagine the dynamic load on the site being all that great). That Plone can bring, with that kind of load, a 3GHz/2GB box to its knees so often is certainly a bad sign... Anyway, thanks for your help on this, quick as ever. Best, f From arnd.baecker at web.de Sat Oct 15 05:18:43 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sat, 15 Oct 2005 11:18:43 +0200 (CEST) Subject: [SciPy-dev] Segmentation fault In-Reply-To: References: Message-ID: Not sure if it is related to what Nils observes (in my case ATLAS is used, i.e.
everything is as before on the opteron) With In [1]: import scipy Importing io to scipy Importing special to scipy Importing utils to scipy Importing interpolate to scipy Importing optimize to scipy Importing linalg to scipy In [2]: scipy.base.__version__ Out[2]: '0.4.3.1292' I get for `scipy.test(10,verbosity=10)` check_zero (scipy.linalg.matfuncs.test_matfuncs.test_expm) ... ERROR check_nils (scipy.linalg.matfuncs.test_matfuncs.test_logm) ... ERROR check_defective1 (scipy.linalg.matfuncs.test_matfuncs.test_signm) ... ERROR check_defective2 (scipy.linalg.matfuncs.test_matfuncs.test_signm) ... ERROR check_defective3 (scipy.linalg.matfuncs.test_matfuncs.test_signm) ... ERROR check_nils (scipy.linalg.matfuncs.test_matfuncs.test_signm) ... ERROR check_bad (scipy.linalg.matfuncs.test_matfuncs.test_sqrtm) ... ERROR check_cblas (scipy.linalg.blas.test_blas.test_blas) ... ok check_fblas (scipy.linalg.blas.test_blas.test_blas) ... ok check_axpy (scipy.linalg.blas.test_blas.test_cblas1_simple)Segmentation fault If any of you needs the full build log, just let me know. Best, Arnd From nwagner at mecha.uni-stuttgart.de Sat Oct 15 05:27:29 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sat, 15 Oct 2005 11:27:29 +0200 Subject: [SciPy-dev] Segmentation fault In-Reply-To: References: Message-ID: On Sat, 15 Oct 2005 11:18:43 +0200 (CEST) Arnd Baecker wrote: > Not sure if it is related to what Nils observes > (in my case ATLAS is used, i.e. everything is as before >on the opteron) > > With > In [1]: import scipy > Importing io to scipy > Importing special to scipy > Importing utils to scipy > Importing interpolate to scipy > Importing optimize to scipy > Importing linalg to scipy > In [2]: scipy.base.__version__ > Out[2]: '0.4.3.1292' > > I get for `scipy.test(10,verbosity=10)` > > check_zero >(scipy.linalg.matfuncs.test_matfuncs.test_expm) ... ERROR > check_nils >(scipy.linalg.matfuncs.test_matfuncs.test_logm) ...
ERROR > check_defective1 >(scipy.linalg.matfuncs.test_matfuncs.test_signm) ... > ERROR > check_defective2 >(scipy.linalg.matfuncs.test_matfuncs.test_signm) ... > ERROR > check_defective3 >(scipy.linalg.matfuncs.test_matfuncs.test_signm) ... > ERROR > check_nils >(scipy.linalg.matfuncs.test_matfuncs.test_signm) ... >ERROR > check_bad >(scipy.linalg.matfuncs.test_matfuncs.test_sqrtm) ... >ERROR > check_cblas (scipy.linalg.blas.test_blas.test_blas) ... >ok > check_fblas (scipy.linalg.blas.test_blas.test_blas) ... >ok > check_axpy >(scipy.linalg.blas.test_blas.test_cblas1_simple)Segmentation > fault > > If any of you needs the full build log, just le me know. > > Best, > > Arnd > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev Please try to import scipy.stats >>> import scipy Importing io to scipy Importing special to scipy Importing utils to scipy Importing interpolate to scipy Importing optimize to scipy Importing linalg to scipy >>> import scipy.stats Segmentation fault From nwagner at mecha.uni-stuttgart.de Sat Oct 15 05:31:09 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sat, 15 Oct 2005 11:31:09 +0200 Subject: [SciPy-dev] Segmentation fault Message-ID: Here comes a full debug report >>> scipy.base.__version__ '0.4.3.1292' >>> import scipy.stats Importing io to scipy Importing special to scipy Importing utils to scipy Importing interpolate to scipy Importing optimize to scipy Importing linalg to scipy Program received signal SIGSEGV, Segmentation fault. 
[Switching to Thread 1076102528 (LWP 16250)] 0x40ca152b in PyFortranObject_New (defs=0x0, init=0) at build/src/fortranobject.c:41 41 v = PyArray_FromDimsAndData(fp->defs[i].rank, (gdb) bt #0 0x40ca152b in PyFortranObject_New (defs=0x0, init=0) at build/src/fortranobject.c:41 #1 0x40c9f95a in initmvn () at build/src/Lib/stats/mvnmodule.c:677 #2 0x080ee58d in _PyImport_LoadDynamicModule (name=0xbfffc240 "scipy.stats.mvn", pathname=0xbfffbdb0 "/usr/local/lib/python2.4/site-packages/scipy/stats/mvn.so", fp=0x8204ce8) at Python/importdl.c:53 #3 0x080ebf89 in load_module (name=0xbfffc240 "scipy.stats.mvn", fp=Variable "fp" is not available. ) at Python/import.c:1665 #4 0x080eca51 in import_submodule (mod=0x403a938c, subname=0xbfffc24c "mvn", fullname=0xbfffc240 "scipy.stats.mvn") at Python/import.c:2250 #5 0x080ecc99 in load_next (mod=0x403a938c, altmod=0x81387a0, p_name=Variable "p_name" is not available. ) at Python/import.c:2070 #6 0x080ed0d7 in import_module_ex (name=Variable "name" is not available. 
) at Python/import.c:1905 #7 0x080ed48d in PyImport_ImportModuleEx (name=0x40bc1354 "mvn", globals=0x40c5adfc, locals=0x40c5adfc, fromlist=0x81387a0) at Python/import.c:1946 #8 0x080bb9c7 in builtin___import__ (self=0x0, args=0x40be1d74) at Python/bltinmodule.c:45 #9 0x0811d436 in PyCFunction_Call (func=0x4024ad6c, arg=0x40be1d74, kw=0x0) at Objects/methodobject.c:93 #10 0x0805928e in PyObject_Call (func=0x4024ad6c, arg=0x40be1d74, kw=0x0) at Objects/abstract.c:1746 #11 0x080c1b99 in PyEval_EvalFrame (f=0x816c1a4) at Python/ceval.c:3419 #12 0x080c7a54 in PyEval_EvalCodeEx (co=0x40be0220, globals=0x40c5adfc, locals=0x40c5adfc, args=0x0, argcount=0, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2730 #13 0x080c7c85 in PyEval_EvalCode (co=0x40be0220, globals=0x40c5adfc, locals=0x40c5adfc) at Python/ceval.c:484 #14 0x080eab75 in PyImport_ExecCodeModuleEx (name=0xbfffd2c0 "scipy.stats.kde", co=0x40be0220, pathname=0xbfffc9b0 "/usr/local/lib/python2.4/site-packages/scipy/stats/kde.pyc") at Python/import.c:619 #15 0x080eb0ab in load_source_module (name=0xbfffd2c0 "scipy.stats.kde", pathname=Variable "pathname" is not available. ) at Python/import.c:893 #16 0x080ebfa9 in load_module (name=0xbfffd2c0 "scipy.stats.kde", fp=Variable "fp" is not available. ) at Python/import.c:1656 #17 0x080eca51 in import_submodule (mod=0x403a938c, subname=0xbfffd2cc "kde", fullname=0xbfffd2c0 "scipy.stats.kde") at Python/import.c:2250 #18 0x080ecc99 in load_next (mod=0x403a938c, altmod=0x81387a0, p_name=Variable "p_name" is not available. ) ---Type to continue, or q to quit--- at Python/import.c:2070 #19 0x080ed0d7 in import_module_ex (name=Variable "name" is not available. 
) at Python/import.c:1905 #20 0x080ed48d in PyImport_ImportModuleEx (name=0x40291254 "kde", globals=0x40428b54, locals=0x40428b54, fromlist=0x402912ac) at Python/import.c:1946 #21 0x080bb9c7 in builtin___import__ (self=0x0, args=0x40424a54) at Python/bltinmodule.c:45 #22 0x0811d436 in PyCFunction_Call (func=0x4024ad6c, arg=0x40424a54, kw=0x0) at Objects/methodobject.c:93 #23 0x0805928e in PyObject_Call (func=0x4024ad6c, arg=0x40424a54, kw=0x0) at Objects/abstract.c:1746 #24 0x080c1b99 in PyEval_EvalFrame (f=0x819259c) at Python/ceval.c:3419 #25 0x080c7a54 in PyEval_EvalCodeEx (co=0x40288ee0, globals=0x40428b54, locals=0x40428b54, args=0x0, argcount=0, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2730 #26 0x080c7c85 in PyEval_EvalCode (co=0x40288ee0, globals=0x40428b54, locals=0x40428b54) at Python/ceval.c:484 #27 0x080eab75 in PyImport_ExecCodeModuleEx (name=0xbfffe7c0 "scipy.stats", co=0x40288ee0, pathname=0xbfffda30 "/usr/local/lib/python2.4/site-packages/scipy/stats/__init__.pyc") at Python/import.c:619 #28 0x080eb0ab in load_source_module (name=0xbfffe7c0 "scipy.stats", pathname=Variable "pathname" is not available. ) at Python/import.c:893 #29 0x080ebfa9 in load_module (name=0xbfffe7c0 "scipy.stats", fp=Variable "fp" is not available. ) at Python/import.c:1656 #30 0x080ec335 in load_package (name=0xbfffe7c0 "scipy.stats", pathname=0xbfffe330 "/usr/local/lib/python2.4/site-packages/scipy/stats") at Python/import.c:949 #31 0x080ebf7b in load_module (name=0xbfffe7c0 "scipy.stats", fp=Variable "fp" is not available. ) at Python/import.c:1670 #32 0x080eca51 in import_submodule (mod=0x4024238c, subname=0xbfffe7c6 "stats", fullname=0xbfffe7c0 "scipy.stats") at Python/import.c:2250 #33 0x080ecc99 in load_next (mod=0x4024238c, altmod=0x4024238c, p_name=Variable "p_name" is not available. ) at Python/import.c:2070 #34 0x080ed1c0 in import_module_ex (name=Variable "name" is not available. 
) at Python/import.c:1912 #35 0x080ed48d in PyImport_ImportModuleEx (name=0x402903cc "scipy.stats", globals=0x40259824, locals=0x40259824, fromlist=0x81387a0) at Python/import.c:1946 #36 0x080bb9c7 in builtin___import__ (self=0x0, args=0x40290374) at Python/bltinmodule.c:45 #37 0x0811d436 in PyCFunction_Call (func=0x4024ad6c, arg=0x40290374, kw=0x0) ---Type to continue, or q to quit--- at Objects/methodobject.c:93 #38 0x0805928e in PyObject_Call (func=0x4024ad6c, arg=0x40290374, kw=0x0) at Objects/abstract.c:1746 #39 0x080c1b99 in PyEval_EvalFrame (f=0x8195724) at Python/ceval.c:3419 #40 0x080c7a54 in PyEval_EvalCodeEx (co=0x4027bf20, globals=0x40259824, locals=0x40259824, args=0x0, argcount=0, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2730 #41 0x080c7c85 in PyEval_EvalCode (co=0x4027bf20, globals=0x40259824, locals=0x40259824) at Python/ceval.c:484 #42 0x080f67e6 in PyRun_InteractiveOneFlags (fp=0x40235720, filename=0x8122a01 "", flags=0xbfffefd4) at Python/pythonrun.c:1264 #43 0x080f6a49 in PyRun_InteractiveLoopFlags (fp=0x40235720, filename=0x8122a01 "", flags=0xbfffefd4) at Python/pythonrun.c:694 #44 0x080f6b70 in PyRun_AnyFileExFlags (fp=0x40235720, filename=0x8122a01 "", closeit=0, flags=0xbfffefd4) at Python/pythonrun.c:657 #45 0x08055857 in Py_Main (argc=0, argv=0xbffff094) at Modules/main.c:484 #46 0x08054f07 in main (argc=1, argv=0xbffff094) at Modules/python.c:23 Cheers, Nils From nwagner at mecha.uni-stuttgart.de Sat Oct 15 05:44:43 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sat, 15 Oct 2005 11:44:43 +0200 Subject: [SciPy-dev] AttributeError: 'scipy.ndarray' object has no attribute 'typecode' Message-ID: File "/usr/local/lib/python2.4/site-packages/scipy/linalg/decomp.py", line 163, in eigvals return eig(a,b=b,left=0,right=0,overwrite_a=overwrite_a) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/decomp.py", line 112, in eig geev, = get_lapack_funcs(('geev',),(a1,)) File 
"/usr/local/lib/python2.4/site-packages/scipy/linalg/lapack.py", line 41, in get_lapack_funcs t = arrays[i].typecode() AttributeError: 'scipy.ndarray' object has no attribute 'typecode' From pearu at scipy.org Sat Oct 15 04:45:24 2005 From: pearu at scipy.org (Pearu Peterson) Date: Sat, 15 Oct 2005 03:45:24 -0500 (CDT) Subject: [SciPy-dev] Segmentation fault In-Reply-To: References: Message-ID: Thanks for the bug reports. I just enabled building newscipy against newcore as well as importing scipy packages from scipy/__init__.py. After that I also observed these segfaults. It seems that they occur while importing or using scipy.f2py built extension modules. I don't know yet what exactly is causing these segfaults, few weeks ago scipy.f2py seemed to work fine with new scipy.base.. Pearu On Sat, 15 Oct 2005, Nils Wagner wrote: > Here comes a full debug report >>>> scipy.base.__version__ > '0.4.3.1292' > >>>> import scipy.stats > Importing io to scipy > Importing special to scipy > Importing utils to scipy > Importing interpolate to scipy > Importing optimize to scipy > Importing linalg to scipy > > Program received signal SIGSEGV, Segmentation fault. > [Switching to Thread 1076102528 (LWP 16250)] > 0x40ca152b in PyFortranObject_New (defs=0x0, init=0) at > build/src/fortranobject.c:41 > 41 v = > PyArray_FromDimsAndData(fp->defs[i].rank, > (gdb) bt > #0 0x40ca152b in PyFortranObject_New (defs=0x0, init=0) > at build/src/fortranobject.c:41 From arnd.baecker at web.de Sat Oct 15 07:03:43 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sat, 15 Oct 2005 13:03:43 +0200 (CEST) Subject: [SciPy-dev] Segmentation fault In-Reply-To: References: Message-ID: On Sat, 15 Oct 2005, Pearu Peterson wrote: > Thanks for the bug reports. I just enabled building newscipy against > newcore as well as importing scipy packages from scipy/__init__.py. After > that I also observed these segfaults. It seems that they occur while > importing or using scipy.f2py built extension modules. 
I don't know yet > what exactly is causing these segfaults, few weeks ago scipy.f2py seemed > to work fine with new scipy.base.. I am sending you my build log on the opteron off-list. Maybe it helps. Best, Arnd From stephen.walton at csun.edu Sun Oct 16 01:22:07 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Sat, 15 Oct 2005 22:22:07 -0700 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <1129157213.4839.5.camel@E011704> References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> Message-ID: <4351E37F.9050207@csun.edu> Charles R Harris wrote: >I think it is a good idea to keep double as the default, if only because >Python expects it. If someone needs more control over the precision of >arrays, why not do as c does and add functions sqrtf and sqrtl? > > Usages like sqrtf() and sqrtl() begin to look like pre-1975 Fortran, before generic functions were introduced. I can't change Python's basic behavior, but would rather that sqrt(scipy_integer_array) be simply disallowed in favor of requiring the user to explicitly change the type of the array to float, double, or long double. From dd55 at cornell.edu Sun Oct 16 14:33:13 2005 From: dd55 at cornell.edu (Darren Dale) Date: Sun, 16 Oct 2005 14:33:13 -0400 Subject: [SciPy-dev] signbit and -inf Message-ID: <200510161433.13556.dd55@cornell.edu> I tried testing the svn new_core this morning, and get three error messages, each ultimately caused by signbit(-inf) returning False instead of True. Has anyone else observed this? 
Darren test(level=1, verbosity=0) ====================================================================== FAIL: check_generic (scipy.base.type_check.test_type_check.test_isneginf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 193, in check_generic assert(vals[0] == 1) AssertionError ====================================================================== FAIL: check_generic (scipy.base.type_check.test_type_check.test_isposinf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 186, in check_generic assert(vals[0] == 0) AssertionError ====================================================================== FAIL: check_generic (scipy.base.type_check.test_type_check.test_nan_to_num) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 200, in check_generic assert_all(vals[0] < -1e10) and assert_all(isfinite(vals[0])) File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 12, in assert_all assert(all(x)), x AssertionError: False ---------------------------------------------------------------------- Ran 135 tests in 0.211s From arnd.baecker at web.de Sun Oct 16 16:06:16 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sun, 16 Oct 2005 22:06:16 +0200 (CEST) Subject: [SciPy-dev] signbit and -inf In-Reply-To: <200510161433.13556.dd55@cornell.edu> References: <200510161433.13556.dd55@cornell.edu> Message-ID: On Sun, 16 Oct 2005, Darren Dale wrote: > I tried testing the svn new_core this morning, and get three error messages, > each ultimately caused by signbit(-inf) returning False instead of True. Has > anyone else observed this? 
> > Darren > > test(level=1, verbosity=0) > ====================================================================== > FAIL: check_generic (scipy.base.type_check.test_type_check.test_isneginf) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", > line 193, in check_generic > assert(vals[0] == 1) > AssertionError > > ====================================================================== > FAIL: check_generic (scipy.base.type_check.test_type_check.test_isposinf) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", > line 186, in check_generic > assert(vals[0] == 0) > AssertionError > > ====================================================================== > FAIL: check_generic (scipy.base.type_check.test_type_check.test_nan_to_num) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", > line 200, in check_generic > assert_all(vals[0] < -1e10) and assert_all(isfinite(vals[0])) > File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", > line 12, in assert_all > assert(all(x)), x > AssertionError: False > > ---------------------------------------------------------------------- > Ran 135 tests in 0.211s I don't see those (they are marked as ok), but: ====================================================================== ERROR: test_basic (scipy.base.matrix.test_matrix.test_algebra) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_27/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/tests/test_matrix.py", line 60, in test_basic Ainv = linalg.inv(A) File 
"/home/abaecker/BUILDS2/Build_27//inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/basic.py", line 189, in inv getrf,getri = get_lapack_funcs(('getrf','getri'),(a1,)) File "/home/abaecker/BUILDS2/Build_27//inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/lapack.py", line 41, in get_lapack_funcs t = arrays[i].typecode() AttributeError: 'scipy.ndarray' object has no attribute 'typecode' ====================================================================== ERROR: test_basic (scipy.base.matrix.test_matrix.test_properties) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_27/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/tests/test_matrix.py", line 36, in test_basic assert allclose(linalg.inv(A), mA.I) File "/home/abaecker/BUILDS2/Build_27//inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/basic.py", line 189, in inv getrf,getri = get_lapack_funcs(('getrf','getri'),(a1,)) File "/home/abaecker/BUILDS2/Build_27//inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/lapack.py", line 41, in get_lapack_funcs t = arrays[i].typecode() AttributeError: 'scipy.ndarray' object has no attribute 'typecode' ---------------------------------------------------------------------- Ran 129 tests in 0.071s FAILED (errors=2) Out[4]: For `scipy.base.test(1)` with In [5]: scipy.base.__version__ Out[5]: '0.4.3.1293' (For the full test, scipy.test(1,verbosity=10) I get, as mentioned before a segfault). 
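The AttributeError in the tracebacks above comes from scipy.linalg still calling the old Numeric-style typecode() method, which the new scipy.ndarray no longer provides; the equivalent information now lives on the array's dtype. A minimal sketch of the distinction (get_typecode is a hypothetical illustrative helper, not the actual fix that went into get_lapack_funcs; modern numpy stands in for the new scipy core):

```python
import numpy as np  # stands in here for the new scipy core ("newcore")

def get_typecode(arr):
    """Return the single-character type code of an array.

    Old Numeric arrays exposed a .typecode() method; the new ndarray
    exposes a .dtype object instead, whose .char attribute carries the
    same one-letter code ('d' for double, 'D' for complex double, ...).
    """
    try:
        return arr.typecode()   # old Numeric API
    except AttributeError:
        return arr.dtype.char   # new-core (ndarray) API

a = np.zeros(3, dtype=np.float64)
print(get_typecode(a))  # -> 'd'
```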
Best, Arnd From dd55 at cornell.edu Sun Oct 16 16:56:45 2005 From: dd55 at cornell.edu (Darren Dale) Date: Sun, 16 Oct 2005 16:56:45 -0400 Subject: [SciPy-dev] signbit and -inf In-Reply-To: References: <200510161433.13556.dd55@cornell.edu> Message-ID: <200510161656.45947.dd55@cornell.edu> On Sunday 16 October 2005 4:06 pm, Arnd Baecker wrote: > On Sun, 16 Oct 2005, Darren Dale wrote: > > I tried testing the svn new_core this morning, and get three error > > messages, each ultimately caused by signbit(-inf) returning False instead > > of True. Has anyone else observed this? [...] > > I don't see those (they are marked as ok), but: [...] > > FAILED (errors=2) > Out[4]: > > For `scipy.base.test(1)` with > In [5]: scipy.base.__version__ > Out[5]: '0.4.3.1293' > > (For the full test, scipy.test(1,verbosity=10) I get, > as mentioned before a segfault). I can run the full test without a segfault, although I am only using the newcore (I get a segfault when I install newscipy and try to import scipy). I am using Gentoo on a Pentium4, with Python-2.4.2 and gcc-3.4.4. I built scipy against BLAS/LAPACK/ATLAS using the same site.cfg as I used to build scipy-0.3.2. It turns out that on my machine, signbit returns False for any negative number, not just -inf. 
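For reference, the IEEE 754 behavior that the failing tests expect can be checked with a short snippet; this uses the modern numpy API as a stand-in for scipy.base's signbit, and the isneginf relationship described is inferred from Darren's report that the isneginf/isposinf/nan_to_num failures all trace back to signbit:

```python
import numpy as np

# signbit reads the IEEE 754 sign bit directly, so it is True for any
# value carrying that bit: -inf, ordinary negatives, and even -0.0
# (where a plain `x < 0` comparison would say False).
for x in [-np.inf, -1.5, -0.0]:
    assert np.signbit(x)
for x in [np.inf, 1.5, 0.0]:
    assert not np.signbit(x)

# isneginf is effectively isinf(x) AND signbit(x), which is why a broken
# signbit cascades into the isneginf/isposinf/nan_to_num test failures.
assert np.isneginf(-np.inf) and not np.isneginf(np.inf)
print("signbit behaves as expected")
```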
Here is the output of system_info.py: _pkg_config_info: NOT AVAILABLE agg2_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE atlas_blas_info: ( library_dirs = /usr/lib ) ( paths: /usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/include/atlas ) ( paths: /usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) ( library_dirs = /usr/lib ) FOUND: libraries = ['lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = c include_dirs = ['/usr/include/atlas'] atlas_blas_threads_info: Setting PTATLAS=ATLAS ( library_dirs = /usr/lib ) ( paths: /usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) Setting 
PTATLAS=ATLAS ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/include/atlas ) ( paths: /usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) Setting PTATLAS=ATLAS ( library_dirs = /usr/lib ) FOUND: libraries = ['lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = c include_dirs = ['/usr/include/atlas'] atlas_info: ( library_dirs = /usr/lib ) ( paths: /usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) ( paths: /usr/lib/liblapack.so ) system_info.atlas_info ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/include/atlas ) ( paths: 
/usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) ( library_dirs = /usr/lib ) FOUND: libraries = ['lapack', 'lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = f77 include_dirs = ['/usr/include/atlas'] atlas_threads_info: Setting PTATLAS=ATLAS ( library_dirs = /usr/lib ) ( paths: /usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) ( paths: /usr/lib/liblapack.so ) system_info.atlas_threads_info Setting PTATLAS=ATLAS ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/include/atlas ) ( paths: 
/usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) Setting PTATLAS=ATLAS ( library_dirs = /usr/lib ) FOUND: libraries = ['lapack', 'lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = f77 include_dirs = ['/usr/include/atlas'] blas_info: ( library_dirs = /usr/lib ) ( paths: /usr/lib/libblas.so ) ( library_dirs = /usr/lib ) FOUND: libraries = ['blas'] library_dirs = ['/usr/lib'] language = f77 blas_opt_info: running build_src building extension "atlas_version" sources creating build creating build/src adding 'build/src/atlas_version_-0x54967df6.c' to sources. 
running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext building 'atlas_version' extension compiling C sources i686-pc-linux-gnu-gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -fPIC' creating build/temp.linux-i686-2.4 creating build/temp.linux-i686-2.4/build creating build/temp.linux-i686-2.4/build/src compile options: '-I/usr/include/atlas -I/usr/lib/python2.4/site-packages/scipy/base/include -I/usr/include/python2.4 -c' i686-pc-linux-gnu-gcc: build/src/atlas_version_-0x54967df6.c i686-pc-linux-gnu-gcc -pthread -shared build/temp.linux-i686-2.4/build/src/atlas_version_-0x54967df6.o -L/usr/lib -llapack -lblas -lcblas -latlas -o build/temp.linux-i686-2.4/atlas_version.so ( library_dirs = /usr/lib ) FOUND: libraries = ['lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = c define_macros = [('ATLAS_INFO', '"\\"3.6.0\\""')] include_dirs = ['/usr/include/atlas'] blas_src_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE boost_python_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE dfftw_info: ( library_dirs = /usr/lib ) ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/lib/libdfftw.so ) ( paths: /usr/lib/libdrfftw.so ) ( paths: /usr/lib/libsfftw.so ) ( paths: /usr/lib/libsrfftw.so ) ( paths: /usr/include/dfftw.h,/usr/include/drfftw.h ) ( library_dirs = /usr/lib ) FOUND: libraries = ['dfftw', 'drfftw', 'sfftw', 'srfftw'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_DFFTW_H', None)] include_dirs = ['/usr/include'] dfftw_threads_info: ( library_dirs = /usr/lib ) ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/lib/libdfftw.so ) ( paths: /usr/lib/libdrfftw.so ) ( paths: /usr/lib/libsfftw.so ) ( paths: /usr/lib/libsrfftw.so ) ( paths: /usr/include/dfftw_threads.h,/usr/include/drfftw_threads.h ) ( library_dirs = /usr/lib ) FOUND: libraries = ['dfftw', 'drfftw', 'sfftw', 'srfftw'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_DFFTW_THREADS_H', None)] include_dirs = 
['/usr/include'] djbfft_info: ( library_dirs = /usr/lib ) ( include_dirs = /usr/include:/usr/include/atlas ) NOT AVAILABLE fftw_info: ( library_dirs = /usr/lib ) ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/lib/libdfftw.so ) ( paths: /usr/lib/libdrfftw.so ) ( paths: /usr/lib/libsfftw.so ) ( paths: /usr/lib/libsrfftw.so ) ( paths: /usr/include/fftw.h,/usr/include/rfftw.h ) ( library_dirs = /usr/lib ) FOUND: libraries = ['dfftw', 'drfftw', 'sfftw', 'srfftw'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_FFTW_H', None)] include_dirs = ['/usr/include'] fftw_threads_info: ( library_dirs = /usr/lib ) ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/lib/libdfftw.so ) ( paths: /usr/lib/libdrfftw.so ) ( paths: /usr/lib/libsfftw.so ) ( paths: /usr/lib/libsrfftw.so ) ( paths: /usr/include/fftw_threads.h,/usr/include/rfftw_threads.h ) ( library_dirs = /usr/lib ) FOUND: libraries = ['dfftw', 'drfftw', 'sfftw', 'srfftw'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_FFTW_THREADS_H', None)] include_dirs = ['/usr/include'] freetype2_info: ( library_dirs = /usr/lib ) FOUND: libraries = ['freetype', 'z'] define_macros = [('FREETYPE2_INFO', '"\\"9.8.3\\""'), ('FREETYPE2_VERSION_9_8_3', None)] include_dirs = ['/usr/include/freetype2'] gdk_2_info: ( library_dirs = /usr/lib ) FOUND: libraries = ['gdk-x11-2.0', 'gdk_pixbuf-2.0', 'm', 'pangocairo-1.0', 'pango-1.0', 'cairo', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] define_macros = [('GDK_2_INFO', '"\\"2.8.6\\""'), ('GDK_VERSION_2_8_6', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/lib/gtk-2.0/include', '/usr/include/pango-1.0', '/usr/include/cairo', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] gdk_info: ( library_dirs = /usr/lib ) FOUND: libraries = ['gdk', 'Xi', 'Xext', 'X11', 'm', 'glib'] library_dirs = ['/usr/X11R6/lib'] define_macros = [('GDK_INFO', '"\\"1.2.10\\""'), ('GDK_VERSION_1_2_10', None)] include_dirs = ['/usr/include/gtk-1.2', '/usr/X11R6/include', 
'/usr/include/glib-1.2', '/usr/lib/glib/include'] gdk_pixbuf_2_info: ( library_dirs = /usr/lib ) FOUND: libraries = ['gdk_pixbuf-2.0', 'm', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] define_macros = [('GDK_PIXBUF_2_INFO', '"\\"2.8.6\\""'), ('GDK_PIXBUF_VERSION_2_8_6', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] gdk_pixbuf_xlib_2_info: ( library_dirs = /usr/lib ) FOUND: libraries = ['gdk_pixbuf_xlib-2.0', 'gdk_pixbuf-2.0', 'm', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] extra_link_args = ['-Wl,--export-dynamic'] define_macros = [('GDK_PIXBUF_XLIB_2_INFO', '"\\"2.8.6\\""'), ('GDK_PIXBUF_XLIB_VERSION_2_8_6', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] gdk_x11_2_info: ( library_dirs = /usr/lib ) FOUND: libraries = ['gdk-x11-2.0', 'gdk_pixbuf-2.0', 'm', 'pangocairo-1.0', 'pango-1.0', 'cairo', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] define_macros = [('GDK_X11_2_INFO', '"\\"2.8.6\\""'), ('GDK_X11_VERSION_2_8_6', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/lib/gtk-2.0/include', '/usr/include/pango-1.0', '/usr/include/cairo', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] gtkp_2_info: ( library_dirs = /usr/lib ) FOUND: libraries = ['gtk-x11-2.0', 'gdk-x11-2.0', 'atk-1.0', 'gdk_pixbuf-2.0', 'm', 'pangocairo-1.0', 'pango-1.0', 'cairo', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] define_macros = [('GTKP_2_INFO', '"\\"2.8.6\\""'), ('GTK_VERSION_2_8_6', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/lib/gtk-2.0/include', '/usr/include/atk-1.0', '/usr/include/cairo', '/usr/include/pango-1.0', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] gtkp_x11_2_info: ( library_dirs = /usr/lib ) FOUND: libraries = ['gtk-x11-2.0', 'gdk-x11-2.0', 'atk-1.0', 'gdk_pixbuf-2.0', 'm', 'pangocairo-1.0', 'pango-1.0', 'cairo', 'gobject-2.0', 'gmodule-2.0', 'dl', 'glib-2.0'] define_macros = [('GTKP_X11_2_INFO', '"\\"2.8.6\\""'), 
('GTK_X11_VERSION_2_8_6', None)] include_dirs = ['/usr/include/gtk-2.0', '/usr/lib/gtk-2.0/include', '/usr/include/atk-1.0', '/usr/include/cairo', '/usr/include/pango-1.0', '/usr/include/glib-2.0', '/usr/lib/glib-2.0/include'] lapack_atlas_info: ( library_dirs = /usr/lib ) ( paths: /usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) ( paths: /usr/lib/liblapack.so ) system_info.lapack_atlas_info ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/include/atlas ) ( paths: /usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) ( library_dirs = /usr/lib ) FOUND: libraries = ['lapack', 'lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = f77 include_dirs = ['/usr/include/atlas'] lapack_atlas_threads_info: Setting 
PTATLAS=ATLAS ( library_dirs = /usr/lib ) ( paths: /usr/lib/liblapack.so ) ( paths: /usr/lib/libblas.so ) ( paths: /usr/lib/libcblas.so ) ( paths: /usr/lib/libatlas.so ) ( paths: /usr/lib/liblapack.so ) system_info.lapack_atlas_threads_info Setting PTATLAS=ATLAS ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/include/atlas ) ( paths: /usr/include/atlas/atlas_cNCmm.h,/usr/include/atlas/atlas_smvN.h,/usr/include/atlas/atlas_smvS.h,/usr/include/atlas/atlas_smvT.h,/usr/include/atlas/atlas_dNCmm.h,/usr/include/atlas/atlas_type.h,/usr/include/atlas/atlas_zmvN.h,/usr/include/atlas/atlas_zmvS.h,/usr/include/atlas/atlas_zmvT.h,/usr/include/atlas/atlas_sNCmm.h,/usr/include/atlas/atlas_zNCmm.h,/usr/include/atlas/atlas_zsysinfo.h,/usr/include/atlas/atlas_ctrsmXover.h,/usr/include/atlas/atlas_ztrsmXover.h,/usr/include/atlas/atlas_csysinfo.h,/usr/include/atlas/atlas_dtrsmXover.h,/usr/include/atlas/atlas_ssysinfo.h,/usr/include/atlas/atlas_cr1.h,/usr/include/atlas/atlas_cmv.h,/usr/include/atlas/atlas_dr1.h,/usr/include/atlas/atlas_dmv.h,/usr/include/atlas/atlas_sr1.h,/usr/include/atlas/atlas_smv.h,/usr/include/atlas/atlas_zr1.h,/usr/include/atlas/atlas_zmv.h,/usr/include/atlas/atlas_strsmXover.h,/usr/include/atlas/atlas_dsysinfo.h,/usr/include/atlas/atlas_cacheedge.h,/usr/include/atlas/atlas_buildinfo.h,/usr/include/atlas/atlas_cmvN.h,/usr/include/atlas/atlas_cmvS.h,/usr/include/atlas/atlas_cmvT.h,/usr/include/atlas/atlas_dmvN.h,/usr/include/atlas/atlas_dmvS.h,/usr/include/atlas/atlas_dmvT.h ) ( paths: /usr/include/atlas/cblas.h,/usr/include/atlas/cblas.h ) Setting PTATLAS=ATLAS ( library_dirs = /usr/lib ) FOUND: libraries = ['lapack', 'lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = f77 include_dirs = ['/usr/include/atlas'] lapack_info: ( library_dirs = /usr/lib ) ( paths: /usr/lib/liblapack.so ) ( library_dirs = /usr/lib ) FOUND: libraries = ['lapack'] library_dirs = ['/usr/lib'] language = f77 lapack_opt_info: running build_src 
building extension "atlas_version" sources adding 'build/src/atlas_version_0x26de9279.c' to sources. running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext customize GnuFCompiler customize GnuFCompiler customize GnuFCompiler using build_ext building 'atlas_version' extension compiling C sources i686-pc-linux-gnu-gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -fPIC' compile options: '-I/usr/include/atlas -I/usr/lib/python2.4/site-packages/scipy/base/include -I/usr/include/python2.4 -c' i686-pc-linux-gnu-gcc: build/src/atlas_version_0x26de9279.c /usr/bin/g77 -shared build/temp.linux-i686-2.4/build/src/atlas_version_0x26de9279.o -L/usr/lib -llapack -llapack -lblas -lcblas -latlas -lg2c -o build/temp.linux-i686-2.4/atlas_version.so ( library_dirs = /usr/lib ) FOUND: libraries = ['lapack', 'lapack', 'blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib'] language = f77 define_macros = [('ATLAS_INFO', '"\\"3.6.0\\""')] include_dirs = ['/usr/include/atlas'] lapack_src_info: ( src_dirs = .:/usr/local/src ) NOT AVAILABLE numarray_info: ( include_dirs = /usr/include:/usr/include/atlas ) ( library_dirs = /usr/lib ) FOUND: define_macros = [('NUMARRAY_VERSION', '"\\"1.3.1\\""'), ('NUMARRAY', None)] numpy_info: NOT AVAILABLE scipy_info: ( include_dirs = /usr/include:/usr/include/atlas ) ( library_dirs = /usr/lib ) FOUND: define_macros = [] sfftw_info: ( library_dirs = /usr/lib ) ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/lib/libdfftw.so ) ( paths: /usr/lib/libdrfftw.so ) ( paths: /usr/lib/libsfftw.so ) ( paths: /usr/lib/libsrfftw.so ) ( paths: /usr/include/sfftw.h,/usr/include/srfftw.h ) ( library_dirs = /usr/lib ) FOUND: libraries = ['dfftw', 'drfftw', 'sfftw', 'srfftw'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_SFFTW_H', None)] include_dirs = ['/usr/include'] sfftw_threads_info: ( library_dirs = /usr/lib ) ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/lib/libdfftw.so ) ( paths: 
/usr/lib/libdrfftw.so ) ( paths: /usr/lib/libsfftw.so ) ( paths: /usr/lib/libsrfftw.so ) ( paths: /usr/include/sfftw_threads.h,/usr/include/srfftw_threads.h ) ( library_dirs = /usr/lib ) FOUND: libraries = ['dfftw', 'drfftw', 'sfftw', 'srfftw'] library_dirs = ['/usr/lib'] define_macros = [('SCIPY_SFFTW_THREADS_H', None)] include_dirs = ['/usr/include'] wx_info: ( library_dirs = /usr/lib ) FOUND: libraries = ['wx_gtk2u_xrc-2.6', 'wx_gtk2u_html-2.6', 'wx_gtk2u_adv-2.6', 'wx_gtk2u_core-2.6', 'wx_baseu_xml-2.6', 'wx_baseu_net-2.6', 'wx_baseu-2.6'] extra_link_args = ['-pthread'] library_dirs = ['/usr/X11R6/lib'] define_macros = [('WX_INFO', '"\\"2.6.2\\""'), ('WX_VERSION_2_6_2', None), ('WX_RELEASE_2_6', None), ('GTK_NO_CHECK_CASTS', None), ('__WXGTK__', None), ('_FILE_OFFSET_BITS', '64'), ('_LARGE_FILES', None), ('_LARGEFILE_SOURCE', '1'), ('NO_GCC_PRAGMA', None)] include_dirs = ['/usr/lib/wx/include/gtk2-unicode-release-2.6', '/usr/include/wx-2.6'] x11_info: ( library_dirs = /usr/lib ) ( include_dirs = /usr/include:/usr/include/atlas ) ( paths: /usr/lib/libX11.so ) ( paths: /usr/include/X11/X.h ) ( library_dirs = /usr/lib ) FOUND: libraries = ['X11'] library_dirs = ['/usr/lib'] include_dirs = ['/usr/include'] xft_info: ( library_dirs = /usr/lib ) FOUND: libraries = ['Xft', 'X11', 'freetype', 'Xrender', 'fontconfig'] define_macros = [('XFT_INFO', '"\\"2.1.2.2\\""'), ('XFT_VERSION_2_1_2_2', None)] include_dirs = ['/usr/include/freetype2', '/usr/include/freetype2/config'] -- Dr. Darren S. 
Dale dd55 at cornell.edu From stephen.walton at csun.edu Sun Oct 16 20:05:21 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Sun, 16 Oct 2005 17:05:21 -0700 Subject: [SciPy-dev] AttributeError: 'scipy.ndarray' object has no attribute 'typecode' In-Reply-To: References: Message-ID: <4352EAC1.9070908@csun.edu> Nils Wagner wrote: > t = arrays[i].typecode() >AttributeError: 'scipy.ndarray' object has no attribute >'typecode' > > I think the right thing to do is to replace this line in lapack.py with t = arrays[i].dtypechar and delete the following line. From stephen.walton at csun.edu Sun Oct 16 20:12:39 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Sun, 16 Oct 2005 17:12:39 -0700 Subject: [SciPy-dev] Segmentation fault In-Reply-To: References: Message-ID: <4352EC77.7080305@csun.edu> Arnd Baecker wrote: >check_axpy (scipy.linalg.blas.test_blas.test_cblas1_simple)Segmentation >fault > > Minimal test case: In [1]: from scipy.linalg import cblas Importing io to scipy Importing special to scipy Importing utils to scipy Importing interpolate to scipy Importing optimize to scipy Importing linalg to scipy In [2]: cblas.saxpy(5,[1,2,3],[2,-1,3]) Segmentation fault This is with newscipy svn version 1336 From rkern at ucsd.edu Sun Oct 16 23:41:59 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sun, 16 Oct 2005 20:41:59 -0700 Subject: [SciPy-dev] Package organization In-Reply-To: References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> <434E7A8B.1060105@ucsd.edu> <434EC517.2020007@ucsd.edu> Message-ID: <43531D87.9080208@ucsd.edu> Pearu Peterson wrote: > > On Thu, 13 Oct 2005, Robert Kern wrote: >>On top of that, eggs can handle namespace packages, but only one level >>down, so to speak. 
I can have several eggs exposing the scipy.* >>namespace and have them all treated as a single package, but I can't >>have several eggs exposing the scipy.lib.* namespace and have them all >>treated uniformly. Adding that functionality to the egg runtime would >>really complicate matters. > > Ah, ok, it's then all about the limitations of eggs. But what about > > scipy_lib_sundials-.egg/ > scipy_lib_sundials-.egg/scipy/ > scipy_lib_sundials-.egg/scipy/lib/sundials/ > scipy_lib_sundials-.egg/scipy/lib/sundials/__init__.py > scipy_lib_sundials-.egg/scipy/lib/sundials/_sundials.so > scipy_lib_sundials-.egg/scipy/lib/sundials/sundials.py > > ? (I have never used eggs myself, so, ignore my ignorance on this matter) I didn't think so, but by plowing through the distutils-SIG mails and some experimentation, I've found that this will in fact work. So the only technical objection is null. >>Perhaps you could explain to me how we benefit putting all of the raw >>wrappers into scipy.lib.*. I don't really see any. > > The scipy namespace will be cleaner, all packages that scipy will contain, > are more or less in the same level. Things that are in scipy.lib would be > raw wrappers with pythonic interfaces to various libraries that packages > in scipy level can use. Packages in scipy.lib would depend only on > scipy_core, not on each other. I would only point out that most of the time, it's those Python interfaces (what would go into sundials.py) that we want to expose to the user. When SUNDIALS gets wrapped by f2py, we then write a bit of Python to make the interface convenient and robust. That's pretty much it. The functions we write in that layer are what we want users to be using. I don't want that functionality hidden away in scipy.lib.sundials; I want it in scipy.sundials. >>If I were starting scipy all over again, I would also make scipy.ode for >>ODEPACK and VODE and leave scipy.integrate for QUADPACK and the other >>"integrals of functions" routines. 
I'm not sure it's worth doing the >>latter at this point. > > I think in scipy 0.1 or so we had all packages, both wrapper packages as > well as toolkits kind of packages, in scipy namespace. I think it was a > mess. Moving wrapper packages to scipy.lib would reduce this mess a bit. That's where we differ, I think. I don't see the distinction between wrapper packages and toolkits. Most of the time, the wrapper is the toolkit. It's just that we tended to throw together a bunch of wrappers into one module and called it a toolkit. If we put all of the wrappers into scipy.lib.*, we're going to end up with several scipy.* packages that are just aliases to the scipy.lib.* packages. > I think that odepack/vode as well as quadpack should be separated > from scipy.integrate. My orginal idea was to move the wrappers to > scipy.lib and scipy.integrate would just use them from there. > The main advantage is that when some other scipy package might use, say, > odepack, but not quadpack, then does not need to install the whole > scipy.integrate. With this separation, the scipy will be more modular. We're shooting for the same goal. I just think we should split them into scipy.ode (ODEPACK and VODE) and scipy.integrate (QUADPACK and quadrature.py) and be done with it. You get the same level of modularity. Splitting the wrappers into separate scipy.lib.* modules doesn't assist modularity unless the scipy.* packages that depend on them are also appropriately split up as well. At that point, most of our scipy.* packages are just aliases to scipy.lib.* packages. But like I said, my technical objection is now null, so I'm mostly arguing from aesthetics. de gustibus non est disputandum, and all that. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From bgoli at sun.ac.za Mon Oct 17 03:39:22 2005 From: bgoli at sun.ac.za (Brett Olivier) Date: Mon, 17 Oct 2005 09:39:22 +0200 Subject: [SciPy-dev] scipy_distutuils build problem on Intel P4 running x86_64 Linux Message-ID: <200510170939.23136.bgoli@sun.ac.za> Hi, I've been trying to build SciPy/newSciPy SVN versions on an Intel P4 (model 640) using Mandrake Linux x86_64 with GCC 3.4.4, and get the following error messages: building 'mach' library compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=pentium4 -mmmx -msse2 -msse -malign-double -fomit-frame-pointer' creating build/temp.linux-x86_64-2.4/Lib/special/mach compile options: '-c' g77:f77: Lib/special/mach/i1mach.f Lib/special/mach/i1mach.f:0: error: CPU you selected does not support x86-64 instruction set Lib/special/mach/i1mach.f:0: error: CPU you selected does not support x86-64 instruction set Lib/special/mach/i1mach.f:0: error: -malign-double makes no sense in the 64bit mode Lib/special/mach/i1mach.f:0: error: CPU you selected does not support x86-64 instruction set Lib/special/mach/i1mach.f:0: error: CPU you selected does not support x86-64 instruction set Lib/special/mach/i1mach.f:0: error: -malign-double makes no sense in the 64bit mode removed Lib/__svn_version__.py My workaround so far has been to modify gnufcompiler.py/gnu.py so that: line 164: elif cpu.is_PentiumIV(): opt.append('-march=nocona') #opt.append('-march=pentium4') line 201: if cpu.is_Intel(): opt.extend(['-fomit-frame-pointer']) #opt.extend(['-malign-double','-fomit-frame-pointer']) /proc/cpuinfo is included at the end of this message, and I'll be happy to provide more information/testing. Thanks.
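The workaround amounts to checking the target architecture before emitting 32-bit-only flags. A minimal sketch of that logic follows; the function name and the `platform.machine()` test are my own illustration, not what scipy.distutils' cpuinfo module actually does:

```python
import platform

def pentium4_fortran_flags(machine=None):
    """Choose g77 optimization flags for a Pentium 4, skipping the
    32-bit-only options (-march=pentium4, -malign-double) that gcc
    rejects when targeting x86_64."""
    machine = machine or platform.machine()
    if machine == 'x86_64':
        # nocona is GCC's Pentium 4-with-EM64T target (GCC >= 3.4)
        return ['-march=nocona', '-fomit-frame-pointer']
    return ['-march=pentium4', '-malign-double', '-fomit-frame-pointer']
```

On Brett's box this would pick the nocona branch, matching his hand edit of gnu.py.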
Brett /proc/cpuinfo for Intel 640 P4 =================== processor : 0 vendor_id : GenuineIntel cpu family : 15 model : 4 model name : Intel(R) Pentium(R) 4 CPU 3.20GHz stepping : 3 cpu MHz : 3211.536 cache size : 2048 KB physical id : 0 siblings : 2 fpu : yes fpu_exception : yes cpuid level : 5 wp : yes flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm syscall nx lm pni monitor ds_cpl est cid cx16 xtpr bogomips : 6340.60 clflush size : 64 cache_alignment : 128 address sizes : 36 bits physical, 48 bits virtual power management: -- Brett G. Olivier Postdoctoral Fellow Triple-J Group for Molecular Cell Physiology Stellenbosch University bgoli at sun dot ac dot za http://glue.jjj.sun.ac.za/~bgoli Tel +27-21-8082704 Fax +27-21-8085863 Mobile +27-82-7329306 From pearu at scipy.org Mon Oct 17 03:19:12 2005 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 17 Oct 2005 02:19:12 -0500 (CDT) Subject: [SciPy-dev] scipy_distutuils build problem on Intel P4 running x86_64 Linux In-Reply-To: <200510170939.23136.bgoli@sun.ac.za> References: <200510170939.23136.bgoli@sun.ac.za> Message-ID: On Mon, 17 Oct 2005, Brett Olivier wrote: > I've been trying to build SciPy/newSciPy SVN versions on an Intel P4 (model > 640) using Mandrake Linux x86_64 with GCC 3.4.4. And get the following error > messages: > My workaround so far has been to modify gnufcompiler.py/gnu.py so that: > > line 164: > elif cpu.is_PentiumIV(): > opt.append('-march=nocona') > #opt.append('-march=pentium4') > > line 201: > if cpu.is_Intel(): > opt.extend(['-fomit-frame-pointer']) > #opt.extend(['-malign-double','-fomit-frame-pointer']) > > I've included /proc/cpuinfo is included at the end of this message and I'll be > happy to provide more information/testing. Thanks for the patch. I have modified newcore scipy.distutils accordingly. 
Pearu From pearu at scipy.org Mon Oct 17 04:34:29 2005 From: pearu at scipy.org (Pearu Peterson) Date: Mon, 17 Oct 2005 03:34:29 -0500 (CDT) Subject: [SciPy-dev] Package organization In-Reply-To: <43531D87.9080208@ucsd.edu> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> <434E7A8B.1060105@ucsd.edu> <434EC517.2020007@ucsd.edu> <43531D87.9080208@ucsd.edu> Message-ID: On Sun, 16 Oct 2005, Robert Kern wrote: > Pearu Peterson wrote: >> >>> Perhaps you could explain to me how we benefit putting all of the raw >>> wrappers into scipy.lib.*. I don't really see any. >> >> The scipy namespace will be cleaner, all packages that scipy will contain, >> are more or less in the same level. Things that are in scipy.lib would be >> raw wrappers with pythonic interfaces to various libraries that packages >> in scipy level can use. Packages in scipy.lib would depend only on >> scipy_core, not on each other. > > I would only point out that most of the time, it's those Python > interfaces (what would go into sundials.py) that we want to expose to > the user. When SUNDIALS gets wrapped by f2py, we then write a bit of > Python to make the interface convenient and robust. That's pretty much > it. The functions we write in that layer are what we want users to be > using. I don't want that functionality hidden away in > scipy.lib.sundials; I want it in scipy.sundials. If odepack/vode will be under scipy.ode and quadpack under scipy.integrate, then couldn't sundials be under scipy.dae? It's more informative for those who don't know about sundials but need to solve DAEs. That's also related to my preference for putting wrappers under scipy.lib and letting higher level packages such as scipy.ode, scipy.integrate, etc. import the tools they need from scipy.lib.
After all, higher level packages may use different library wrappers for the same task, just using different numerical methods that may be more suitable for a particular problem. >>> If I were starting scipy all over again, I would also make scipy.ode for >>> ODEPACK and VODE and leave scipy.integrate for QUADPACK and the other >>> "integrals of functions" routines. I'm not sure it's worth doing the >>> latter at this point. >> >> I think in scipy 0.1 or so we had all packages, both wrapper packages as >> well as toolkits kind of packages, in scipy namespace. I think it was a >> mess. Moving wrapper packages to scipy.lib would reduce this mess a bit. > > That's where we differ, I think. I don't see the distinction between > wrapper packages and toolkits. Most of the time, the wrapper is the > toolkit. It's just that we tended to throw together a bunch of wrappers > into one module and called it a toolkit. If we put all of the wrappers > into scipy.lib.*, we're going to end up with several scipy.* packages > that are just aliases to the scipy.lib.* packages. I guess we have different experiences. For example, I found that it's easier to maintain blas/lapack wrappers under scipy.lib rather than under scipy.linalg. The building time for these wrappers is relatively long so that developing/debugging process of these packages is easier if the modularity is higher. On the other hand, scipy.fftpack contains wrappers to Fortran fftpack, fftw, and djbfft and in this case it would not be practical to separate wrappers according to underlying libraries as it was possible to unify the interface to various FFT programs. >> I think that odepack/vode as well as quadpack should be separated >> from scipy.integrate. My orginal idea was to move the wrappers to >> scipy.lib and scipy.integrate would just use them from there. 
>> The main advantage is that when some other scipy package might use, say, >> odepack, but not quadpack, then it does not need to install the whole >> scipy.integrate. With this separation, scipy will be more modular. > We're shooting for the same goal. I just think we should split them into > scipy.ode (ODEPACK and VODE) and scipy.integrate (QUADPACK and > quadrature.py) and be done with it. You get the same level of modularity. See the scipy.dae suggestion above. > Splitting the wrappers into separate scipy.lib.* modules doesn't assist > modularity unless the scipy.* packages that depend on them are also > appropriately split up as well. At that point, most of our scipy.* > packages are just aliases to scipy.lib.* packages. > > But like I said, my technical objection is now null, so I'm mostly > arguing from aesthetics. de gustibus non est disputandum, and all that. I think we should look further into the future in this respect. If scipy were not going to grow considerably, I would completely agree with you. But as I see it, the current scipy contains a rather small set of tools compared to what it could contain. I just have a premonition that some extra modularity today will pay off in the future. I have learned the hard way to prefer modularity, if only from the maintenance point of view. When tracking down bugs, modularity gives great advantages. And considering how many scipy packages currently lack even basic unittests, there are lots of bugs to be discovered yet. And I don't think that some internal modularity of scipy would confuse end users; they would still use only the higher level scipy packages that live directly under scipy. I would sum up the implications of this thread as follows. We should view each scipy package separately. Some of them can be completely standalone (which is always a good property of a package); they contain both toolkits and wrappers to domain-specific libraries.
But there are also scipy packages that would use the same library wrappers, and so such libraries should be made directly available to all such scipy packages. And by directly I mean that one scipy package should not import another scipy package in order to access some particular wrapper library. So, in the first instance, when developing a scipy package, library wrappers can be placed under this particular scipy package. But once the library wrappers can be used by another scipy package, there should be an easy way to move them under scipy.lib. Would such a compromise be acceptable? Pearu From schofield at ftw.at Mon Oct 17 05:56:39 2005 From: schofield at ftw.at (Ed Schofield) Date: Mon, 17 Oct 2005 11:56:39 +0200 Subject: [SciPy-dev] Package organization In-Reply-To: References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> <434E7A8B.1060105@ucsd.edu> <434EC517.2020007@ucsd.edu> <43531D87.9080208@ucsd.edu> Message-ID: <43537557.6080101@ftw.at> Pearu Peterson wrote: >If odepack/vode will be under scipy.ode and quadpack under scipy.integrate, >then couldn't sundials be under scipy.dae? It's more informative for those >who don't know about sundials but need to solve DAEs. > >That's also related to my preference for putting wrappers under scipy.lib >and letting higher level packages such as scipy.ode, scipy.integrate, etc. >import the tools they need from scipy.lib. After all, higher level packages >may use different library wrappers for the same task, just using different >numerical methods that may be more suitable for a particular problem. > > +1. I think users will appreciate a task-oriented over a wrapper-oriented hierarchy. >I have learned the hard way to prefer modularity, if only from the >maintenance point of view. When tracking down bugs, modularity gives
And considering how many scipy packages currently lack >even basic unittests, there are lots of bugs to be discovered yet. > >And I don't think that some internal modularity of scipy would confuse >end users, they would still use only higher level scipy packages that >live directly under scipy. > >I would sum up the implications of this thread as follows. We should view >each scipy package separately. Some of them can be completely standalone >(which is always a good property of a package), they contain both toolkits >and wrappers to domain specific libraries. But there are also scipy >packages that would use the same library wrappers and so such libraries >should be made directly available for all such scipy packages. And by >directly I mean that one scipy package should not import another scipy >package in order to access some particular wrapper library. > >So, as a first instance, when developing a scipy package, library wrappers >can be placed under this pacticular scipy package. But once the library >wrappers could be used by another scipy package, there should be an easy >way to move library wrappers under scipy.lib. > >Would such a compromise be acceptable? > > +1 From schofield at ftw.at Mon Oct 17 06:18:10 2005 From: schofield at ftw.at (Ed Schofield) Date: Mon, 17 Oct 2005 12:18:10 +0200 Subject: [SciPy-dev] Introductions, sparse matrix support In-Reply-To: <3d502007996f0f09808ed0b07bf3a7b7@nist.gov> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434E2610.7050200@ntc.zcu.cz> <434E3F3D.7000707@ftw.at> <434E9F05.3070804@ee.byu.edu> <3d502007996f0f09808ed0b07bf3a7b7@nist.gov> Message-ID: <43537A62.9030400@ftw.at> Jonathan Guyer wrote: >On Oct 13, 2005, at 1:53 PM, Travis Oliphant wrote: > > >>I'd happily welcome those extension types, but let's see if we can't >>make them all subclasses of one base-class. Look at how the scipy >>sparse Python classes are layed out. 
Basically, by defining a tocsc >>and >>fromcsc, all of them can be converted to each other and used as >>solvers. >> >> >>I did put some effort into the structure of scipy.sparse. I did not >>put any effort into optimizations, though. >> >> > >I think this is important. PySparse is neither very object oriented nor >very "Pythonic". I think the API can be done much better [*] (and >scipy.sparse may be it; I haven't had time yet to do anything practical >with it). I don't mean to denigrate PySparse; we happily use it and >Roman has been very open to our suggestions; I just think that we can >learn from it and do better. > > Okay then, let's use the existing sparse module's structure as a base. I'd like to ask for clarification about the versions in SVN. The most complete version seems to be newscipy/newscipy/Lib/sparse/ but what are the branches numeric_subpackage_branch{,2}/Lib/sparse? Are these dead and buried? From cimrman3 at ntc.zcu.cz Mon Oct 17 06:27:24 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 17 Oct 2005 12:27:24 +0200 Subject: [SciPy-dev] Introductions, sparse matrix support In-Reply-To: <3d502007996f0f09808ed0b07bf3a7b7@nist.gov> References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434E2610.7050200@ntc.zcu.cz> <434E3F3D.7000707@ftw.at> <434E9F05.3070804@ee.byu.edu> <3d502007996f0f09808ed0b07bf3a7b7@nist.gov> Message-ID: <43537C8C.3010503@ntc.zcu.cz> Jonathan Guyer wrote: > On Oct 13, 2005, at 1:53 PM, Travis Oliphant wrote: > > >>I'd happily welcome those extension types, but let's see if we can't >>make them all subclasses of one base-class. Look at how the scipy >>sparse Python classes are laid out. Basically, by defining a tocsc >>and >>fromcsc, all of them can be converted to each other and used as >>solvers. > > >>I did put some effort into the structure of scipy.sparse. I did not >>put any effort into optimizations, though. > > > I think this is important.
PySparse is neither very object oriented nor > very "Pythonic". I think the API can be done much better [*] (and > scipy.sparse may be it; I haven't had time yet to do anything practical > with it). I don't mean to denigrate PySparse; we happily use it and > Roman has been very open to our suggestions; I just think that we can > learn from it and do better. Then newscipy/Lib/sparse seems to be a good candidate to begin with. I think that first it is important to get it to build & install (and work ;), then we can care about its performance. I have modified 'Lib/sparse/setup_sparse.py' so that it compiles. Is it ok to post it here? (don't have svn write access yet...) I am now only mildly familiar with the scipy variant of distutils, but it could be a starting point. cheers, r. From schofield at ftw.at Mon Oct 17 08:10:23 2005 From: schofield at ftw.at (Ed Schofield) Date: Mon, 17 Oct 2005 14:10:23 +0200 Subject: [SciPy-dev] newcore build problem on Ubuntu / Debian Message-ID: <435394AF.7050606@ftw.at> Hi all, SciPy core fails to find ATLAS or LAPACK libraries with Ubuntu's (and Debian's?) default installation. The reason is that the Ubuntu lapack3 and atlas3 packages contain files named liblapack.so.3.0, with a symlink as liblapack.so.3, but no symlink as liblapack.so. Ditto for libatlas.so.3. Should this be filed as a bug against the Ubuntu packages? Whether or not it should, I think Scipy's distutils should look for libwhatever.so.? if it can't find libwhatever.so. This would improve its robustness against such packaging schemes and make life easier for users. By the way, the current newcore doesn't seem to have any parallel to SciPy 0.3.2's file ./scipy_core/scipy_distutils/sample_site.cfg. I think it should, to give users confidence that the syntax they're using to specify their libraries is correct. We also need some documentation updates in newcore/scipy/distutils/system_info.py to refer to scipy/distutils/site.cfg rather than scipy_distutils/site.cfg.
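The `libwhatever.so.?` fallback Ed suggests could be sketched like this — a hypothetical helper for illustration only, not the actual scipy.distutils code (the function name and signature are made up):

```python
import glob
import os

def find_library(name, dirs):
    """Look for lib<name>.so in dirs, falling back to versioned names
    such as lib<name>.so.3 or lib<name>.so.3.0 (hypothetical helper)."""
    for d in dirs:
        plain = os.path.join(d, 'lib%s.so' % name)
        if os.path.isfile(plain):
            return plain
        # Fall back to any versioned shared library in this directory,
        # preferring the lexicographically highest version found.
        versioned = sorted(glob.glob(plain + '.*'))
        if versioned:
            return versioned[-1]
    return None
```

With Ubuntu's layout this would return /usr/lib/liblapack.so.3.0 even though no liblapack.so symlink exists.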
(I can do this once I have write access.) -- Ed From perry at stsci.edu Mon Oct 17 16:20:12 2005 From: perry at stsci.edu (Perry Greenfield) Date: Mon, 17 Oct 2005 16:20:12 -0400 Subject: [SciPy-dev] Package organization In-Reply-To: References: <434D0AE9.9070707@ftw.at> <434D5D5A.8000106@ee.byu.edu> <434D98F6.5080301@ucsd.edu> <434DFFD1.9040303@colorado.edu> <434E7A8B.1060105@ucsd.edu> <434EC517.2020007@ucsd.edu> <43531D87.9080208@ucsd.edu> Message-ID: I've finally had a chance to read over the messages on this topic and thought I would make a few comments. It's certainly possible that I misunderstand some of the points already made so I'll summarize some (not all) of the issues raised, at least those I have an interest in, as well as asking for clarification for those issues that I didn't see clearly spelled out. 1) flatter is better than deeper. I'm inclined to agree with this. The objection is that flat will lead to many name collisions within scipy. I guess this depends on the next issue. 2) are all the items within the scipy namespace controlled? Some of the discussions suggested that mechanisms be provided to extend the search path for scipy libraries and such. I guess the question I have is why such libraries should be loaded into the scipy namespace. Nothing prevents them from having dependencies on scipy, but do they actually need to be loaded into its namespace? I can't think of any reason why. If not, then that means the scipy project is the master of the namespace and should detect when someone tries to add a duplicate name. 3) This doesn't mean that naming problems won't arise. Using the example given regarding astropy, I'm not sure that an astropy wcs package should just go right into the scipy namespace. That acronym might be used by other specialties in entirely different ways. In this case there is something to be said for qualifying the name somehow. That might be with the name itself, i.e., astrowcs. 
or that domain specific knowledge does justify some nesting. Perhaps we should consider a policy where mathematical algorithms are treated as general. After all, they may be used in many specialties of science and engineering. So it is important not to group them by kind of algorithm, or by those that might use it. On the other hand, there are many domain-specific areas where it is unlikely that the package or module will be used in another. I doubt that many other areas (but in fact, a few do) will use the FITS format. A library like that could reasonably be confined to its own namespace (like astrolib). I don't know if there is any simple approach to this. It seems to me some judgment is required. One could argue that something like astronomical-specific libraries shouldn't fall under scipy at all but in its own package. 4) So why not start flat with algorithms, and leave domains to define their own name space as a starting point? If it is necessary to move things into scipy because they have usefulness over more than one domain, then do that when the need arises. Being conservative on this can probably put off name collision issue for a good while. 4a) But as pointed out, there are occasions where one wants a generic module or package (e.g., fft) for which there may be many implementations. With the flat approach, the generic name should be reserved for some sort of redirection mechanism to use a more specific implementation. 5) One of the points raised is that lib can be used to indicate items for which many other packages have dependencies. Perhaps. I suspect that things won't be so clear. Maybe the right thing to do is just to provide tools to give packages an easy way of ensuring that all needed dependent scipy elements are already installed and informing the user which are missing (ideally, automatically installing them as needed, but I don't know if we need to be that ambitious at the start). 
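The "redirection mechanism" of point 4a could be as little as a generic module that forwards to the first available implementation. A minimal sketch, where the helper name and the candidate module names are purely illustrative:

```python
def load_first(candidates):
    """Return the first importable module from candidates.

    A generic package (say, a reserved scipy.fft name) could use this
    to forward to whichever specific implementation is installed."""
    for name in candidates:
        try:
            return __import__(name, fromlist=['*'])
        except ImportError:
            continue
    raise ImportError('none of %r could be imported' % (candidates,))

# Hypothetical usage inside a generic fft module:
#   _impl = load_first(['scipy.fftw', 'scipy.fftpack'])
#   globals().update(vars(_impl))
```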
Perhaps a sandbox is useful to the extent it doesn't force a final name choice. 6) Then again, I could be entirely confused about the real issues :-) Perry From dalcinl at gmail.com Mon Oct 17 16:40:03 2005 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Mon, 17 Oct 2005 17:40:03 -0300 Subject: [SciPy-dev] undefined symbols in scipy_core-0.4.2 Message-ID: If this has been reported, please ignore it... I've got undefined symbols in scipy/lib/lapack_lite.so: [dalcinl at trantor lib.linux-i686-2.4]$ python Python 2.4.2 (#1, Sep 28 2005, 15:28:55) [GCC 3.3.3 20040412 (Red Hat Linux 3.3.3-7)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> from scipy.base import * Traceback (most recent call last): File "<stdin>", line 1, in ? File "scipy/__init__.py", line 17, in ? import scipy.basic.linalg as linalg File "scipy/basic/linalg.py", line 1, in ? from basic_lite import * File "scipy/basic/basic_lite.py", line 7, in ? import scipy.lib.lapack_lite as lapack_lite ImportError: scipy/lib/lapack_lite.so: undefined symbol: s_cat Googling, I've found this reference about 'libg2c': http://gcc.gnu.org/ml/gcc-help/2000-12/msg00010.html I added the '-lg2c' flag (manually, on the command line), and it worked! Sorry if I can't help a bit more right now! I am a recent user (coming from numarray) and I have not found the time to look at scipy.distutils. BTW, I got the following warnings...
compile options: '-Ibuild/src/scipy/base/src -Iscipy/base/include -Ibuild/src/scipy/base -Iscipy/base/src -I/usr/local/include/python2.4 -c' gcc: scipy/base/src/multiarraymodule.c In file included from scipy/base/src/multiarraymodule.c:26: scipy/base/include/scipy/arrayobject.h:84: warning: redefinition of `ushort' /usr/include/sys/types.h:152: warning: `ushort' previously declared here scipy/base/include/scipy/arrayobject.h:85: warning: redefinition of `uint' /usr/include/sys/types.h:153: warning: `uint' previously declared here scipy/base/include/scipy/arrayobject.h:86: warning: redefinition of `ulong' /usr/include/sys/types.h:151: warning: `ulong' previously declared here gcc -pthread -shared build/temp.linux-i686-2.4/scipy/base/src/multiarraymodule.o -lm -lm -o build/lib.linux-i686-2.4/scipy/base/multiarray.so -- Lisandro Dalcín --------------- Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC) Instituto de Desarrollo Tecnológico para la Industria Química (INTEC) Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET) PTLC - Güemes 3450, (3000) Santa Fe, Argentina Tel/Fax: +54-(0)342-451.1594 From stephen.walton at csun.edu Mon Oct 17 17:57:12 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 17 Oct 2005 14:57:12 -0700 Subject: [SciPy-dev] newcore build problem on Ubuntu / Debian In-Reply-To: <435394AF.7050606@ftw.at> References: <435394AF.7050606@ftw.at> Message-ID: <43541E38.5090709@csun.edu> Ed Schofield wrote: >Hi all, > >SciPy core fails to find ATLAS or LAPACK libraries with Ubuntu's (and >Debian's?) default installation. > > I have the following packages installed on my Athlon laptop (just updated to Ubuntu 5.10): atlas3-base, atlas3-base-dev, atlas3-headers, atlas3-sse, atlas3-sse-dev The lapack packages do not seem to be needed with this setup. However, the *-dev packages are needed so that you have the *.a libraries in addition to the *.so so that linking works. Having said that, there may be other issues.
Attached to the end of this is the first part of the output I get when I build newcore. Notice that atlas_blas_info finds the versions in /usr/lib/sse, but scipy.distutils.system_info.atlas_info seems to find the versions in /usr/lib. In addition, it seems to think the version of ATLAS I have is 3.5.19 although "dpkg -p atlas3-base" and so on say that it is version 3.6.0. Further down in the file (not shown here) shows that _dotblas.so is linked using -L/usr/lib/sse on the command line, while lapack_lite.so is linked with -L/usr/lib/atlas. If I remove atlas3-base and atlas3-base-dev, lapack_lite.so is not linked with ATLAS at all, because it reports that we have ATLAS but not LAPACK libraries. atlas_blas_info: FOUND: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/sse'] language = c running build_src building extension "atlas_version" sources creating build creating build/src adding 'build/src/atlas_version_-0x7408d8b6.c' to sources. running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext building 'atlas_version' extension compiling C sources gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' creating build/temp.linux-i686-2.4 creating build/temp.linux-i686-2.4/build creating build/temp.linux-i686-2.4/build/src compile options: '-Iscipy/base/include -I/usr/include/python2.4 -c' gcc: build/src/atlas_version_-0x7408d8b6.c gcc -pthread -shared build/temp.linux-i686-2.4/build/src/atlas_version_-0x7408d8b6.o -L/usr/lib/sse -lf77blas -lcblas -latlas -o build/temp.linux-i686-2.4/atlas_version.so FOUND: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/sse'] language = c define_macros = [('ATLAS_INFO', '"\\"3.5.19\\""')] lapack_opt_info: atlas_threads_info: Setting PTATLAS=ATLAS scipy.distutils.system_info.atlas_threads_info NOT AVAILABLE atlas_info: scipy.distutils.system_info.atlas_info FOUND: libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] library_dirs = 
['/usr/lib/atlas', '/usr/lib'] language = f77 running build_src building extension "atlas_version" sources adding 'build/src/atlas_version_0x3bb31a54.c' to sources. running build_ext customize UnixCCompiler customize UnixCCompiler using build_ext customize GnuFCompiler customize GnuFCompiler customize GnuFCompiler using build_ext FOUND: libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/atlas', '/usr/lib'] language = f77 define_macros = [('ATLAS_INFO', '"\\"3.5.19\\""')] From pearu at scipy.org Tue Oct 18 04:26:26 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 18 Oct 2005 03:26:26 -0500 (CDT) Subject: [SciPy-dev] BUG: PyArray_FromDims|PyArray_SimpleNew causes segfault Message-ID: Hi Travis, I was investigating the reasons why newscipy extension modules crash with segfaults and found that this could be related to PyArray_FromDims (I got the same result with PyArray_SimpleNew) bug. Here's a simple test program demonstrating the bug: /* File t.c */ #include <Python.h> #include <scipy/arrayobject.h> int main() { PyArrayObject *arr = NULL; int dims[] = {10}; printf("Hey\n"); //arr = (PyArrayObject *)PyArray_FromDims(1,dims,PyArray_DOUBLE); arr = (PyArrayObject *)PyArray_SimpleNew(1,dims,PyArray_DOUBLE); printf("Hi\n"); } /* eof */ gcc -I/home/pearu/svn/t/scipy/base/include -I/usr/include/python2.3 t.c -lpython2.3 ./a.out Hey Segmentation fault Pearu From pearu at scipy.org Tue Oct 18 05:46:15 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 18 Oct 2005 04:46:15 -0500 (CDT) Subject: [SciPy-dev] BUG: PyArray_FromDims|PyArray_SimpleNew causes segfault In-Reply-To: References: Message-ID: On Tue, 18 Oct 2005, Pearu Peterson wrote: > > Hi Travis, > > I was investigating the reasons why newscipy extension modules crash with > segfaults and found that this could be related to PyArray_FromDims (I got > the same result with PyArray_SimpleNew) bug.
Here's a simple test program > demonstrating the bug: > > /* File t.c */ > > #include <Python.h> > #include <scipy/arrayobject.h> > > int main() { > PyArrayObject *arr = NULL; > int dims[] = {10}; > printf("Hey\n"); > //arr = (PyArrayObject *)PyArray_FromDims(1,dims,PyArray_DOUBLE); > arr = (PyArrayObject *)PyArray_SimpleNew(1,dims,PyArray_DOUBLE); > printf("Hi\n"); > } > /* eof */ Actually, this is not a good example. Adding Py_Initialize(); fixes the segfault. However, f2py generated modules segfault in array_from_pyobj when calling PyArray_FromDims. Right now I'm not sure how to debug this further. Pearu From pearu at scipy.org Tue Oct 18 06:07:20 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 18 Oct 2005 05:07:20 -0500 (CDT) Subject: [SciPy-dev] BUG: PyArray_FromDims|PyArray_SimpleNew causes segfault In-Reply-To: References: Message-ID: On Tue, 18 Oct 2005, Pearu Peterson wrote: > However, f2py generated modules segfault in array_from_pyobj > when calling PyArray_FromDims. Right now I'm not sure how to debug this > further. After adding import_array(); at the beginning of array_from_pyobj(..) in fortranobject.c, the segfaults went away. But I am not sure if this is a proper place to put import_array(); it looks like awful overhead to call import_array() on every function argument. Pearu From pearu at scipy.org Tue Oct 18 06:48:41 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 18 Oct 2005 05:48:41 -0500 (CDT) Subject: [SciPy-dev] newscipy segfaults fixed! Message-ID: Hi, The segfaults in newcore now have a fix in SVN. Currently scipy.test() gives more than 200 errors, but they should go away as soon as newscipy packages have been ported to newcore.
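For reference, a corrected version of the standalone test program, combining Pearu's two observations (Py_Initialize() plus a call to import_array() to fill in the static PyArray_API), might look like the sketch below. This is an illustration only, untested against the 2005 headers; note that import_array() is a macro intended for module init functions, so a real embedding program may need a small adjustment:

```c
/* t_fixed.c -- sketch of the corrected test program; assumes the
 * newcore headers are on the include path, as in Pearu's gcc line. */
#include <Python.h>
#include <scipy/arrayobject.h>

int main(void) {
    PyArrayObject *arr = NULL;
    int dims[] = {10};

    Py_Initialize();   /* start the interpreter before any API call ... */
    import_array();    /* ... and fill in PyArray_API before any array call */

    printf("Hey\n");
    arr = (PyArrayObject *)PyArray_SimpleNew(1, dims, PyArray_DOUBLE);
    printf("Hi\n");

    Py_XDECREF(arr);
    Py_Finalize();
    return 0;
}
```

Without the Py_Initialize()/import_array() pair, both PyArray_FromDims and PyArray_SimpleNew dereference an unset API table, which matches the segfault shown above.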
Pearu From dd55 at cornell.edu Tue Oct 18 11:39:06 2005 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 18 Oct 2005 11:39:06 -0400 Subject: [SciPy-dev] signbit and -inf In-Reply-To: <200510161656.45947.dd55@cornell.edu> References: <200510161433.13556.dd55@cornell.edu> <200510161656.45947.dd55@cornell.edu> Message-ID: <200510181139.07062.dd55@cornell.edu> > I can run the full test without a segfault, although I am only using the > newcore (I get a segfault when I install newscipy and try to import scipy). > I am using Gentoo on a Pentium4, with Python-2.4.2 and gcc-3.4.4. I built > scipy against BLAS/LAPACK/ATLAS using the same site.cfg as I used to build > scipy-0.3.2. It turns out that on my machine, signbit returns False for any > negative number, not just -inf. Does anyone have any suggestions as to how I can pursue this? Darren From arnd.baecker at web.de Tue Oct 18 12:04:50 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Tue, 18 Oct 2005 18:04:50 +0200 (CEST) Subject: [SciPy-dev] newscipy segfaults fixed! In-Reply-To: References: Message-ID: On Tue, 18 Oct 2005, Pearu Peterson wrote: > Hi, > The segfaults in newcore now have a fix in SVN. Currently scipy.test() > gives more than 200 errors, but they should go away as soon as newscipy > packages have been ported to newcore. Sorry, Pearu, but I still get one on the opteron: from scipy.linalg import fblas fblas.sger(1,[1,2],[3,4]) segfaults ( check_ger from test_blas.py). This is with In [1]: import scipy In [2]: scipy.base.__version__ Out[2]: '0.4.3.1300' (and new scipy at revision 1336). Additional remark: After commenting out all tests in check_ger() all tests run through. Now, removing the commented out part and re-running the test in a new session gives no segfault - very weird! Anyway, the above two lines do reproduce the segfault systematically.
Best, Arnd From nwagner at mecha.uni-stuttgart.de Tue Oct 18 12:20:57 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 18 Oct 2005 18:20:57 +0200 Subject: [SciPy-dev] newscipy segfaults fixed! In-Reply-To: References: Message-ID: On Tue, 18 Oct 2005 18:04:50 +0200 (CEST) Arnd Baecker wrote: > On Tue, 18 Oct 2005, Pearu Peterson wrote: > >> Hi, >> The segfaults in newcore now have a fix in SVN. >>Currently scipy.test() >> gives more than 200 errors, but they should go away as >>soon as newscipy >> packages have been ported to newcore. > > Sorry, Pearu, but I still get one on the opteron: > > from scipy.linalg import fblas > fblas.sger(1,[1,2],[3,4]) > > segfaults ( check_ger from test_blas.py). > > This is with > In [1]: import scipy > In [2]: scipy.base.__version__ > Out[2]: '0.4.3.1300' > (and new scipy at revision 1336). > > Additional remark: > After commenting out all tests in check_ger() > all tests run through.
> Now, removing the commented out part > and re-running the test in a new session > gives no segfault - very weird! > Anyway, the above two lines do reproduce the segfault > systematically. > > Best, Arnd > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev I cannot reproduce this segfault >>> from scipy.linalg import fblas Importing io to scipy Importing special to scipy Importing utils to scipy Importing interpolate to scipy Importing optimize to scipy Importing linalg to scipy >>> fblas.sger(1,[1,2],[3,4]) array([[ 3., 4.], [ 6., 8.]], dtype=float32) Nils From dd55 at cornell.edu Tue Oct 18 13:41:12 2005 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 18 Oct 2005 13:41:12 -0400 Subject: [SciPy-dev] newcore atlas info on Gentoo Message-ID: <200510181341.12351.dd55@cornell.edu> In order to get scipy-0.3.2 to build on Gentoo Linux, I use site.cfg in the distutils module. This is necessary because Gentoo has renamed the libf77blas.* libraries to libblas.* in order to allow users to easily switch between different libraries, ATLAS and Intel's MKL for example. When I compile newcore using the old site.cfg, my atlas info is properly reported. However, when I compile newscipy, the atlas info is reportedly unavailable. I looked into system_info.atlas_info and friends, and discovered that _lib_names is not informed by site.cfg; it is hardcoded with ['f77blas', 'cblas'], or ['ptf77blas','ptcblas'], etc. If I remove the 'f77', my atlas info is correctly reported. Darren From dd55 at cornell.edu Tue Oct 18 13:42:06 2005 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 18 Oct 2005 13:42:06 -0400 Subject: [SciPy-dev] missing Lib/__init__.py in newscipy?
Message-ID: <200510181342.06980.dd55@cornell.edu> When I tried to test newscipy from svn, I got a segfault: from scipy import * Importing io to scipy Importing special to scipy Segmentation fault which I think is caused by an absent Lib/__init__.py file. Lib/old__init__.py does exist, which I renamed. The resulting build gave me the following error when I tried to import scipy, although it didn't segfault: from scipy import * --------------------------------------------------------------------------- exceptions.ImportError Traceback (most recent call last) /home/darren/ /usr/lib/python2.4/site-packages/scipy/__init__.py 10 from scipy_version import scipy_version as __version__ 11 from scipy.base import * ---> 12 from helpmod import * 13 14 _pkg_func_docs = "" ImportError: No module named helpmod Maybe, since I am new to the svn repository, I stumbled upon a file that accidentally got deleted from the branch? Just a guess. Darren From pearu at scipy.org Tue Oct 18 14:53:18 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 18 Oct 2005 13:53:18 -0500 (CDT) Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? In-Reply-To: <200510181342.06980.dd55@cornell.edu> References: <200510181342.06980.dd55@cornell.edu> Message-ID: On Tue, 18 Oct 2005, Darren Dale wrote: > When I tried to test newscipy from svn, I got a segfault: > > from scipy import * > Importing io to scipy > Importing special to scipy > Segmentation fault > > which I think is caused by an absent Lib/__init__.py file. Hmm, the absence of a file should not cause a segfault. > Lib/old__init__.py does exist, which I renamed. There should be no Lib/__init__.py, the installed scipy/__init__.py file comes from newcore. If you see `Importing io to scipy` then it means you must have already installed newcore with __init__.py. Try updating both newcore and newscipy from SVN, remove old installation of scipy, reinstall newcore and newscipy.
Pearu From oliphant at ee.byu.edu Tue Oct 18 15:57:43 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 18 Oct 2005 13:57:43 -0600 Subject: [SciPy-dev] newscipy segfaults fixed! In-Reply-To: References: Message-ID: <435553B7.20700@ee.byu.edu> Pearu Peterson wrote: >Hi, >The segfaults in newcore now have a fix in SVN. Currently scipy.test() >gives more than 200 errors, but they should go away as soon as newscipy >packages have been ported to newcore. > > Pearu, Did you find a fix for the PyArray_FromDims problem? I have not had time to track this down, but can later. -Travis From pearu at scipy.org Tue Oct 18 15:03:50 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 18 Oct 2005 14:03:50 -0500 (CDT) Subject: [SciPy-dev] newscipy segfaults fixed! In-Reply-To: <435553B7.20700@ee.byu.edu> References: <435553B7.20700@ee.byu.edu> Message-ID: On Tue, 18 Oct 2005, Travis Oliphant wrote: > Pearu Peterson wrote: > >> Hi, >> The segfaults in newcore now have a fix in SVN. Currently scipy.test() >> gives more than 200 errors, but they should go away as soon as newscipy >> packages have been ported to newcore. > > Did you find a fix for the PyArray_FromDims problem? I have not had > time to track this down, but can later. Yes, it turns out that if a source file like fortranobject.c uses newcore, some function must call import_array() to set PyArray_API. I did that by introducing an init_fortranobject() function that only calls import_array(), and the extension module calls init_fortranobject() from the module init function. This was not needed in Numeric. Hmm, maybe PyArray_API should not be defined as static. Is there a reason that PyArray_API must be defined static? Pearu From dd55 at cornell.edu Tue Oct 18 16:19:10 2005 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 18 Oct 2005 16:19:10 -0400 Subject: [SciPy-dev] missing Lib/__init__.py in newscipy?
In-Reply-To: References: <200510181342.06980.dd55@cornell.edu> Message-ID: <200510181619.11283.dd55@cornell.edu> On Tuesday 18 October 2005 2:53 pm, Pearu Peterson wrote: > On Tue, 18 Oct 2005, Darren Dale wrote: > > When I tried to test newscipy from svn, I got a segfault: > > > > from scipy import * > > Importing io to scipy > > Importing special to scipy > > Segmentation fault > > > > which I think is caused by an absent Lib/__init__.py file. > > Hmm, the absence of a file should not cause a segfault. > > > Lib/old__init__.py does exist, which I renamed. > > There should be no Lib/__init__.py, the installed scipy/__init__.py file > comes from newcore. > > If you see `Importing io to scipy` then it means you must have > already installed newcore with __init__.py. Try updating both newcore and > newscipy from SVN, remove old installation of scipy, reinstall newcore > and newscipy. Hi Pearu, I must be doing something wrong here. I deleted my old scipy and Numeric directories from site-packages, then I checked out from svn: svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ Then I copied site.cfg to newcore/scipy/distutils, built and installed newcore. At this point, I run test(1,0) and get the three errors related to signbit which I reported earlier. Next I built and installed newscipy, and when I try to import scipy: In [1]: from scipy import * Importing io to scipy Importing special to scipy Segmentation fault Darren From ryanlists at gmail.com Tue Oct 18 16:24:13 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Tue, 18 Oct 2005 16:24:13 -0400 Subject: [SciPy-dev] weave/blitz 0.9 for gcc/g++4.0 Message-ID: I ran into a problem with using weave/blitz on ubuntu 5.10. It seems that the blitz header files were outdated and not compatible with gcc/g++ 4.0. I was able to get around this by downloading blitz 0.9 and copying the header files to the directory weave is looking in.
I was told I should post to this list so that this was taken care of. Ryan From arnd.baecker at web.de Tue Oct 18 18:44:51 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 19 Oct 2005 00:44:51 +0200 (CEST) Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? In-Reply-To: References: <200510181342.06980.dd55@cornell.edu> Message-ID: On Tue, 18 Oct 2005, Pearu Peterson wrote: > On Tue, 18 Oct 2005, Darren Dale wrote: > > > When I tried to test newscipy from svn, I got a segfault: > > > > from scipy import * > > Importing io to scipy > > Importing special to scipy > > Segmentation fault I get the same error (debian sarge, sse2 ATLAS from ls -l /usr/lib/sse2 with manually added links libcblas.so -> libcblas.so.3.0, libf77blas.so -> libf77blas.so.3.0, ...) newcore $ svn update At revision 1303. newscipy $ svn update At revision 1337. (This is for a PIV, 2.8GHz) For the opteron: compile works, import works, scipy.test(10,10) segfaults as mailed earlier this day. Found 0 tests for __main__ check_zero (scipy.linalg.matfuncs.test_matfuncs.test_expm) ... ERROR check_nils (scipy.linalg.matfuncs.test_matfuncs.test_logm) ... ERROR check_defective1 (scipy.linalg.matfuncs.test_matfuncs.test_signm) ... ERROR check_defective2 (scipy.linalg.matfuncs.test_matfuncs.test_signm) ... ERROR check_defective3 (scipy.linalg.matfuncs.test_matfuncs.test_signm) ... ERROR check_nils (scipy.linalg.matfuncs.test_matfuncs.test_signm) ... ERROR check_bad (scipy.linalg.matfuncs.test_matfuncs.test_sqrtm) ... ERROR check_cblas (scipy.linalg.blas.test_blas.test_blas) ... ok check_fblas (scipy.linalg.blas.test_blas.test_blas) ... ok check_axpy (scipy.linalg.blas.test_blas.test_cblas1_simple) ... ok check_amax (scipy.linalg.blas.test_blas.test_fblas1_simple) ... ok check_asum (scipy.linalg.blas.test_blas.test_fblas1_simple) ... ok check_axpy (scipy.linalg.blas.test_blas.test_fblas1_simple) ... ok check_copy (scipy.linalg.blas.test_blas.test_fblas1_simple) ... 
ok check_dot (scipy.linalg.blas.test_blas.test_fblas1_simple) ... ok check_nrm2 (scipy.linalg.blas.test_blas.test_fblas1_simple) ... ok check_scal (scipy.linalg.blas.test_blas.test_fblas1_simple) ... ok check_swap (scipy.linalg.blas.test_blas.test_fblas1_simple) ... ok check_gemv (scipy.linalg.blas.test_blas.test_fblas2_simple) ... ok check_ger (scipy.linalg.blas.test_blas.test_fblas2_simple)Segmentation fault Best, Arnd From oliphant at ee.byu.edu Tue Oct 18 19:25:37 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 18 Oct 2005 17:25:37 -0600 Subject: [SciPy-dev] newscipy segfaults fixed! In-Reply-To: References: <435553B7.20700@ee.byu.edu> Message-ID: <43558471.707@ee.byu.edu> Pearu Peterson wrote: >On Tue, 18 Oct 2005, Travis Oliphant wrote: > > > >>Pearu Peterson wrote: >> >> >> >>>Hi, >>>The segfaults in newcore now have a fix in SVN. Currently scipy.test() >>>gives more than 200 errors, but they should go away as soon as newscipy >>>packages have been ported to newcore. >>> >>> >>Did you find a fix for the PyArray_FromDims problem? I have not had >>time to track this down, but can later. >> >> > >Yes, it turns out that if a source file like fortranobject.c uses newcore, >some function must call import_array() to set PyArray_API. I did that by >introducing an init_fortranobject() function that only calls import_array() >and the extension module calls init_fortranobject() from the module init >function. This was not needed in Numeric. > > That's because Numeric had PyArray_API not static, as you pointed out. >Hmm, maybe PyArray_API should not be defined as static. Is there a reason >that PyArray_API must be defined static? > > I'm not sure. There may be a better way to do this. Using static PyArray_API is the way the Python manual suggests. However, it does mean you can't have a "library" source file that gets called from an extension module. I guess that's what fortranobject.c is doing? I'm not against changing it.
-Travis From oliphant at ee.byu.edu Tue Oct 18 19:36:50 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 18 Oct 2005 17:36:50 -0600 Subject: [SciPy-dev] newscipy segfaults fixed! In-Reply-To: References: <435553B7.20700@ee.byu.edu> Message-ID: <43558712.1000607@ee.byu.edu> Pearu Peterson wrote: >On Tue, 18 Oct 2005, Travis Oliphant wrote: > > > >>Pearu Peterson wrote: >> >> >> >>>Hi, >>>The segfaults in newcore hava now a fix in SVN. Currently scipy.test() >>>gives more than 200 errors, but they should go away as soon as newscipy >>>packages have been ported to newcore. >>> >>> >>Did you find a fix for the PyArray_FromDims problem? I have not had >>time to track this down, but can later. >> >> > >Yes, it turns out that if a source file like fortranobject.c uses newcore, >some function must call import_array() to set PyArray_API. I did that by >introducing init_fortranobject() function that only calls import_array() >and the extension module calls init_fortranobject() from the module init >function. This was not needed in Numeric. > >Hmm, may be PyArray_API should not be defined as static. Is there a reason >that PyArray_API must be defined static? > > Wait a minute.... PyArray_API is defined exactly the same way the Numeric defined it. It's static (and filled by import_array) unless PY_ARRAY_UNIQUE_SYMBOL is defined or NO_IMPORT or NO_IMPORT_ARRAY is defined. So, the thing to do is compile fortranobject.c with NO_IMPORT_ARRAY defined (that way PyArray_API is declared extern), and define PY_ARRAY_UNIQUE_SYMBOL to be , and then compile the extension modules that want to use fortranobject.c code by defining PY_ARRAY_UNIQUE_SYMBOL to be . This was the same way you had to do it in Numeric. So, I don't see what's changed. I did, however, recently add to newcore NO_IMPORT to be equivalent to NO_IMPORT_ARRAY. Perhaps f2py was defining NO_IMPORT instead of NO_IMPORT_ARRAY. This would have worked with Numeric, but not newcore (until recent SVN checkin). 
-Travis From arnd.baecker at web.de Tue Oct 18 20:17:36 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 19 Oct 2005 02:17:36 +0200 (CEST) Subject: [SciPy-dev] newscipy segfaults fixed! In-Reply-To: <43558712.1000607@ee.byu.edu> References: <435553B7.20700@ee.byu.edu><43558712.1000607@ee.byu.edu> Message-ID: Hi, on the opteron I get a few warnings during the build: gcc: scipy/corelib/blasdot/_dotblas.c scipy/corelib/blasdot/_dotblas.c: In function `dotblas_vdot': scipy/corelib/blasdot/_dotblas.c:691: warning: passing arg 2 of pointer to function from incompatible pointer type gcc -pthread -shared build/temp.linux-x86_64-2.4/scipy/corelib/blasdot/_dotblas.o -L/scr/python/lib64 -lptf77blas -lptcblas -latlas -o build/lib.linux-x86_64-2.4/scipy/lib/_dotblas.so ---- /home/abaecker/BUILDS2/Build_33/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: build/src/fortranobject.c build/src/fortranobject.c: In function `fortran_doc': build/src/fortranobject.c:123: warning: int format, different type arg (arg 3) build/src/fortranobject.c: In function `fortran_setattr': build/src/fortranobject.c:221: warning: passing arg 2 of pointer to function from incompatible pointer type build/src/fortranobject.c:235: warning: passing arg 1 of pointer to function from incompatible pointer type build/src/fortranobject.c: In function `swap_arrays': build/src/fortranobject.c:522: warning: assignment from incompatible pointer type build/src/fortranobject.c:522: warning: assignment from incompatible pointer type build/src/fortranobject.c:523: warning: assignment from incompatible pointer type build/src/fortranobject.c:523: warning: assignment from incompatible pointer type build/src/fortranobject.c: In function `array_from_pyobj': build/src/fortranobject.c:623: warning: passing arg 2 of pointer to function from incompatible pointer type build/src/fortranobject.c: In function `check_and_fix_dimensions': 
build/src/fortranobject.c:740: warning: int format, different type arg (arg 5) build/src/fortranobject.c:805: warning: int format, different type arg (arg 3) gcc: build/src/build/src/scipy/linalg/fblasmodule.c build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_sger': build/src/build/src/scipy/linalg/fblasmodule.c:8045: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_dger': build/src/build/src/scipy/linalg/fblasmodule.c:8237: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_cgeru': build/src/build/src/scipy/linalg/fblasmodule.c:8429: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_zgeru': build/src/build/src/scipy/linalg/fblasmodule.c:8621: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_cgerc': build/src/build/src/scipy/linalg/fblasmodule.c:8813: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_zgerc': build/src/build/src/scipy/linalg/fblasmodule.c:9005: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_as_column_major_storage': build/src/build/src/scipy/linalg/fblasmodule.c:11705: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=opteron -mmmx -msse2 -msse -m3dnow' ---- gcc: build/src/build/src/scipy/linalg/cblasmodule.c build/src/build/src/scipy/linalg/cblasmodule.c: In function `f2py_as_column_major_storage': 
build/src/build/src/scipy/linalg/cblasmodule.c:897: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type /scr/python/bin/g77 -shared build/temp.linux-x86_64-2.4/build/src/build/src/scipy/linalg/cblasmodule.o build/temp.linux-x86_64-2.4/build/src/fortranobject.o -L/scr/python/lib64 -Lbuild/temp.linux-x86_64-2.4 -llapack -lptf77blas -lptcblas -latlas -lg2c -o build/lib.linux-x86_64-2.4/scipy/linalg/cblas.so --- /home/abaecker/BUILDS2/Build_33/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: build/src/build/src/scipy/linalg/flapackmodule.c build/src/build/src/scipy/linalg/flapackmodule.c: In function `f2py_as_column_major_storage': build/src/build/src/scipy/linalg/flapackmodule.c:15793: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type /scr/python/bin/g77 -shared build/temp.linux-x86_64-2.4/build/src/build/src/scipy/linalg/flapackmodule.o build/temp.linux-x86_64-2.4/build/src/fortranobject.o -L/scr/python/lib64 -Lbuild/temp.linux-x86_64-2.4 -llapack -lptf77blas -lptcblas -latlas -lg2c -o build/lib.linux-x86_64-2.4/scipy/linalg/flapack.so building 'scipy.linalg.clapack' extension compiling C sources --- gcc: build/src/build/src/scipy/linalg/clapackmodule.c build/src/build/src/scipy/linalg/clapackmodule.c: In function `f2py_as_column_major_storage': build/src/build/src/scipy/linalg/clapackmodule.c:5845: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type /scr/python/bin/g77 -shared build/temp.linux-x86_64-2.4/build/src/build/src/scipy/linalg/clapackmodule.o build/temp.linux-x86_64-2.4/build/src/fortranobject.o -L/scr/python/lib64 -Lbuild/temp.linux-x86_64-2.4 -llapack -lptf77blas -lptcblas -latlas -lg2c -o build/lib.linux-x86_64-2.4/scipy/linalg/clapack.so building 'scipy.linalg._flinalg' extension compiling C sources ---- gcc: build/src/scipy/linalg/_flinalgmodule.c build/src/scipy/linalg/_flinalgmodule.c: In function 
`f2py_as_column_major_storage': build/src/scipy/linalg/_flinalgmodule.c:1913: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=opteron -mmmx -msse2 -msse -m3dnow' ---- gcc: build/src/scipy/linalg/calc_lworkmodule.c build/src/scipy/linalg/calc_lworkmodule.c: In function `f2py_as_column_major_storage': build/src/scipy/linalg/calc_lworkmodule.c:1276: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=opteron -mmmx -msse2 -msse -m3dnow' compile options: '-I/scr/python/include -Ibuild/src -I/home/abaecker/BUILDS2/Build_33/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' g77:f77: Lib/linalg/src/calc_lwork.f --- gcc: build/src/build/src/Lib/linalg/iterative/_iterativemodule.c build/src/build/src/Lib/linalg/iterative/_iterativemodule.c: In function `f2py_as_column_major_storage': build/src/build/src/Lib/linalg/iterative/_iterativemodule.c:5662: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=opteron -mmmx -msse2 -msse -m3dnow' --- And loads of warnings of the type build/src/Lib/linalg/iterative/BiCGREVCOM.f: In subroutine `sbicgrevcom': build/src/Lib/linalg/iterative/BiCGREVCOM.f:135: warning: IF (RLBL .eq. 
2) GOTO 2 1 build/src/Lib/linalg/iterative/BiCGREVCOM.f:239: (continued): 2 CONTINUE 2 Reference to label at (1) is outside block containing definition at (2) ---- gcc: Lib/special/_cephesmodule.c Lib/special/_cephesmodule.c: In function `Cephes_InitOperators': Lib/special/_cephesmodule.c:348: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:349: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:352: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:353: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:354: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:355: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:356: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:357: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:358: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:359: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:360: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:361: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:362: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:363: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:364: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:365: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:366: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:367: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:368: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:369: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:372: warning: assignment from 
incompatible pointer type Lib/special/_cephesmodule.c:373: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:374: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:375: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:378: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:379: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:380: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:381: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:382: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:383: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:384: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:385: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:386: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:387: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:388: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:389: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:390: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:391: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:392: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:393: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:394: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:395: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:396: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:397: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:398: warning: 
assignment from incompatible pointer type Lib/special/_cephesmodule.c:399: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:400: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:401: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:402: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:403: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:404: warning: assignment from incompatible pointer type Lib/special/_cephesmodule.c:405: warning: assignment from incompatible pointer type gcc: Lib/special/specfun_wrappers.c gcc: Lib/special/cdf_wrappers.c /scr/python/bin/g77 -shared build/temp.linux-x86_64-2.4/Lib/special/_cephesmodule.o build/te --- gcc: build/src/Lib/special/specfunmodule.c build/src/Lib/special/specfunmodule.c: In function `f2py_as_column_major_storage': build/src/Lib/special/specfunmodule.c:5604: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type /scr/python/bin/g77 -shared build/temp.linux-x86_64-2.4/build/src/Lib/special/specfunmodule.o build/temp.linux-x86_64-2.4/build/src/fortranobject.o -Lbuild/temp.linux-x86_64-2.4 -lspecfun -lg2c -o build/lib.linux-x86_64-2.4/scipy/special/specfun.so building 'scipy.optimize._minpack' extension ---- compile options: '-DATLAS_INFO="\"3.7.11\"" -I/scr/python/include -Ibuild/src -I/home/abaecker/BUILDS2/Build_33/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: build/src/Lib/optimize/lbfgsb-0.9/_lbfgsbmodule.c build/src/Lib/optimize/lbfgsb-0.9/_lbfgsbmodule.c: In function `f2py_as_column_major_storage': build/src/Lib/optimize/lbfgsb-0.9/_lbfgsbmodule.c:774: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=opteron -mmmx -msse2 -msse -m3dnow' creating 
build/temp.linux-x86_64-2.4/Lib/optimize/lbfgsb-0.9 ---- compile options: '-Ibuild/src -I/home/abaecker/BUILDS2/Build_33/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: build/src/Lib/optimize/cobyla/_cobylamodule.c build/src/Lib/optimize/cobyla/_cobylamodule.c: In function `cb_calcfc_in__cobyla__user__routines': build/src/Lib/optimize/cobyla/_cobylamodule.c:311: warning: unused variable `f' build/src/Lib/optimize/cobyla/_cobylamodule.c: In function `f2py_as_column_major_storage': build/src/Lib/optimize/cobyla/_cobylamodule.c:672: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=opteron -mmmx -msse2 -msse -m3dnow' creating build/temp.linux-x86_64-2.4/Lib/optimize/cobyla ---- gcc: build/src/Lib/optimize/minpack2/minpack2module.c build/src/Lib/optimize/minpack2/minpack2module.c: In function `f2py_as_column_major_storage': build/src/Lib/optimize/minpack2/minpack2module.c:613: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=opteron -mmmx -msse2 -msse -m3dnow' ----- compile options: '-Ibuild/src -I/home/abaecker/BUILDS2/Build_33/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: build/src/Lib/stats/statlibmodule.c build/src/Lib/stats/statlibmodule.c: In function `f2py_as_column_major_storage': build/src/Lib/stats/statlibmodule.c:708: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type /scr/python/bin/g77 -shared build/temp.linux-x86_64-2.4/build/src/Lib/stats/statlibmodule.o build/temp.linux-x86_64-2.4/build/src/fortranobject.o -Lbuild/temp.linux-x86_64-2.4 -lstatlib -lg2c -o build/lib.linux-x86_64-2.4/scipy/stats/statlib.so building 'scipy.stats.futil' extension ---- 
compile options: '-Ibuild/src -I/home/abaecker/BUILDS2/Build_33/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: build/src/scipy/stats/futilmodule.c build/src/scipy/stats/futilmodule.c: In function `f2py_as_column_major_storage': build/src/scipy/stats/futilmodule.c:370: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=opteron -mmm ---- compile options: '-Ibuild/src -I/home/abaecker/BUILDS2/Build_33/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: build/src/Lib/stats/mvnmodule.c build/src/Lib/stats/mvnmodule.c: In function `f2py_as_column_major_storage': build/src/Lib/stats/mvnmodule.c:615: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=opteron -mmmx -msse2 -msse -m3dnow' ----- gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' creating build/temp.linux-x86_64-2.4/build/src/Lib/interpolate compile options: '-Ibuild/src -I/home/abaecker/BUILDS2/Build_33/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: build/src/Lib/interpolate/dfitpackmodule.c build/src/Lib/interpolate/dfitpackmodule.c: In function `f2py_rout_dfitpack_fpcurf0': build/src/Lib/interpolate/dfitpackmodule.c:1304: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/Lib/interpolate/dfitpackmodule.c: In function `f2py_rout_dfitpack_fpcurfm1': build/src/Lib/interpolate/dfitpackmodule.c:1998: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/Lib/interpolate/dfitpackmodule.c: In function `f2py_rout_dfitpack_surfit_smth': build/src/Lib/interpolate/dfitpackmodule.c:2613: 
warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/Lib/interpolate/dfitpackmodule.c: In function `f2py_rout_dfitpack_surfit_lsq': build/src/Lib/interpolate/dfitpackmodule.c:3084: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/Lib/interpolate/dfitpackmodule.c: In function `f2py_as_column_major_storage': build/src/Lib/interpolate/dfitpackmodule.c:3328: warning: passing arg 2 of `array_from_pyobj' from incompatible pointer type compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=opteron -mmm Sorry, this got longer than anticipated.... Anyway, maybe it is useful for you - I will be off from broad-band for a week, but can run builds from time to time if needed. Best, Arnd From arnd.baecker at web.de Tue Oct 18 20:39:09 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 19 Oct 2005 02:39:09 +0200 (CEST) Subject: [SciPy-dev] newscipy segfaults fixed! In-Reply-To: References: <435553B7.20700@ee.byu.edu><43558712.1000607@ee.byu.edu> Message-ID: just did one more update on the opteron. build ok, import works as well. scipy.test(1,10) leads to: !! No test file 'test__zeros.py' found for Found 0 tests for __main__ check_zero (scipy.linalg.matfuncs.test_matfuncs.test_expm)*** glibc detected *** free(): invalid next size (fast): 0x0000000000b15700 *** Aborted In [2]: scipy.base.__version__ Out[2]: '0.4.3.1305' newscipy> svn update At revision 1338. Arnd From pearu at scipy.org Wed Oct 19 01:00:27 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 19 Oct 2005 00:00:27 -0500 (CDT) Subject: [SciPy-dev] newscipy segfaults fixed! In-Reply-To: <43558712.1000607@ee.byu.edu> References: <435553B7.20700@ee.byu.edu><43558712.1000607@ee.byu.edu> Message-ID: On Tue, 18 Oct 2005, Travis Oliphant wrote: >> Hmm, may be PyArray_API should not be defined as static. Is there a reason >> that PyArray_API must be defined static? > > Wait a minute.... 
PyArray_API is defined exactly the same way the > Numeric defined it. It's static (and filled by import_array) unless > PY_ARRAY_UNIQUE_SYMBOL is defined or NO_IMPORT or NO_IMPORT_ARRAY is > defined. > > So, the thing to do is compile fortranobject.c with NO_IMPORT_ARRAY > defined (that way PyArray_API is declared extern), and define > PY_ARRAY_UNIQUE_SYMBOL to be , and then compile the extension > modules that want to use fortranobject.c code by defining > PY_ARRAY_UNIQUE_SYMBOL to be . > > This was the same way you had to do it in Numeric. So, I don't see > what's changed. Yes, I noticed that too and yesterday I copied Numeric PyArray_API handling also to newcore. So, now everything is fine. Pearu From cimrman3 at ntc.zcu.cz Wed Oct 19 03:26:42 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Wed, 19 Oct 2005 09:26:42 +0200 Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? In-Reply-To: <200510181619.11283.dd55@cornell.edu> References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> Message-ID: <4355F532.20102@ntc.zcu.cz> > Hi Pearu, > > I must be doing something wrong here. I deleted my old scipy and Numeric > directories from site-packages, then I checked out from svn: > > svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ > svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ > > Then I copied site.cfg to newcore/scipy/distutils, built and installed > newcore. At this point, I run test(1,0) and get the three errors related to > signbit which I reported earlier. 
Next I built and installed newscipy, and > when I try to import scipy: > > In [1]: from scipy import * > Importing io to scipy > Importing special to scipy > Segmentation fault > same for me on Gentoo Linux after a complete reinstall: svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ cd newcore python setup.py install --root=/home/share/software svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ cd newscipy python setup.py install --root=/home/share/software In [1]:import scipy Importing io to scipy Importing special to scipy Segmentation fault (SIGSEGV) r. From nwagner at mecha.uni-stuttgart.de Wed Oct 19 03:53:02 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 19 Oct 2005 09:53:02 +0200 Subject: [SciPy-dev] newscipy segfaults fixed! In-Reply-To: References: <435553B7.20700@ee.byu.edu><43558712.1000607@ee.byu.edu> Message-ID: <4355FB5E.9010408@mecha.uni-stuttgart.de> Arnd Baecker wrote: >just did one more update on the opteron. >build ok, import works as well. > >scipy.test(1,10) leads to: > > !! No test file 'test__zeros.py' found for 'scipy.optimize._zeros' from '...kages/scipy/optimize/_zeros.so'> > Found 0 tests for __main__ >check_zero (scipy.linalg.matfuncs.test_matfuncs.test_expm)*** glibc >detected *** free(): invalid next size (fast): 0x0000000000b15700 *** >Aborted > >In [2]: scipy.base.__version__ >Out[2]: '0.4.3.1305' > >newscipy> svn update >At revision 1338. > >Arnd > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > Ran 388 tests in 1.247s FAILED (failures=28, errors=175) >>> scipy.base.__version__ '0.4.3.1305' Nils From cimrman3 at ntc.zcu.cz Wed Oct 19 04:22:16 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Wed, 19 Oct 2005 10:22:16 +0200 Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? 
In-Reply-To: <4355F532.20102@ntc.zcu.cz> References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> <4355F532.20102@ntc.zcu.cz> Message-ID: <43560238.80706@ntc.zcu.cz> Robert Cimrman wrote: >>Hi Pearu, >> >>I must be doing something wrong here. I deleted my old scipy and Numeric >>directories from site-packages, then I checked out from svn: >> >>svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ >>svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ >> >>Then I copied site.cfg to newcore/scipy/distutils, built and installed >>newcore. At this point, I run test(1,0) and get the three errors related to >>signbit which I reported earlier. Next I built and installed newscipy, and >>when I try to import scipy: >> >>In [1]: from scipy import * >>Importing io to scipy >>Importing special to scipy >>Segmentation fault >> > > > same for me on Gentoo Linux after a complete reinstall: > > svn co http://svn.scipy.org/svn/scipy_core/branches/newcore/ > cd newcore > python setup.py install --root=/home/share/software > > svn co http://svn.scipy.org/svn/scipy/branches/newscipy/ > cd newscipy > python setup.py install --root=/home/share/software > > > In [1]:import scipy > Importing io to scipy > Importing special to scipy > Segmentation fault (SIGSEGV) > > r. update: I have commented out 'config.add_subpackage('special')' in newscipy/Lib/setup.py, reinstalled, and voila: In [1]:import scipy Importing utils to scipy Importing io to scipy Importing optimize to scipy Importing interpolate to scipy Importing linalg to scipy scipy.test then fails, but that's another thing. what is so offending in the 'special' package? namely, what has changed since beginning-of-last-week's revisions of newscipy? 'import scipy' worked for me then... r. From pearu at scipy.org Wed Oct 19 06:31:34 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 19 Oct 2005 05:31:34 -0500 (CDT) Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? 
In-Reply-To: <4355F532.20102@ntc.zcu.cz> References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> <4355F532.20102@ntc.zcu.cz> Message-ID: On Wed, 19 Oct 2005, Robert Cimrman wrote: > In [1]:import scipy > Importing io to scipy > Importing special to scipy > Segmentation fault (SIGSEGV) This issue is now fixed in SVN (segfaults occurred probably due to a bug in _cephesmodule.c). special now imports successfully on my debian x86_64 box. However, there are probably more places where we need to make int->intp changes for 64-bit platforms. Pearu From cimrman3 at ntc.zcu.cz Wed Oct 19 08:09:03 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Wed, 19 Oct 2005 14:09:03 +0200 Subject: [SciPy-dev] newscipy/Lib/sparse Message-ID: <4356375F.7040501@ntc.zcu.cz> Import of scipy.sparse fails with ... /usr/lib/python2.4/site-packages/scipy/sparse/Sparse.py ImportError: No module named fastumath ... - is the fastumath module going to be included in newscipy? what is it for, btw.? Then when I comment out the 'from scipy.base.fastumath import *' line in Sparse.py, I get ... /usr/lib/python2.4/site-packages/scipy/sparse/Sparse.py /usr/lib/python2.4/site-packages/scipy/sparse/_superlu.py ImportError: /home/share/software/usr/lib/python2.4/site-packages/scipy/sparse/_zsuperlu.so: undefined symbol: strsm_ ... but this is probably caused by my modified setup_sparse.py, where I might have introduced new bugs... As a side-note, my installation of scipy really is in '/home/share/software/usr/lib/python2.4/site-packages'. ipython (and the regular python shell as well) nevertheless prints '/usr/lib/python2.4/site-packages' in the tracebacks. Does anybody know how to fix it? cheers, r. From cimrman3 at ntc.zcu.cz Wed Oct 19 08:17:32 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Wed, 19 Oct 2005 14:17:32 +0200 Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? 
In-Reply-To: References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> <4355F532.20102@ntc.zcu.cz> Message-ID: <4356395C.2020205@ntc.zcu.cz> Pearu Peterson wrote: > > On Wed, 19 Oct 2005, Robert Cimrman wrote: > > >>In [1]:import scipy >>Importing io to scipy >>Importing special to scipy >>Segmentation fault (SIGSEGV) > > > This issue is now fixed in SVN (segfaults occured probably due to a bug in > _cephesmodule.c). special now imports succesfully on my debian x86_64 box. Great, it works for me too, thanks! r. From schofield at ftw.at Wed Oct 19 08:58:17 2005 From: schofield at ftw.at (Ed Schofield) Date: Wed, 19 Oct 2005 14:58:17 +0200 Subject: [SciPy-dev] newscipy/Lib/sparse In-Reply-To: <4356375F.7040501@ntc.zcu.cz> References: <4356375F.7040501@ntc.zcu.cz> Message-ID: <435642E9.4020100@ftw.at> Robert Cimrman wrote: >Then when I comment out the 'from scipy.base.fastumath import *' line in >Sparse.py, I get > >... >/usr/lib/python2.4/site-packages/scipy/sparse/Sparse.py >/usr/lib/python2.4/site-packages/scipy/sparse/_superlu.py >ImportError: >/home/share/software/usr/lib/python2.4/site-packages/scipy/sparse/_zsuperlu.so: >undefined symbol: strsm_ > > I suggest we should strip away the SuperLU bindings until we've got the basic sparse data types working to our satisfaction. >As a side-note, my installation of scipy really is in >'/home/share/software/usr/lib/python2.4/site-packages'. ipython (and the >regular python shell as well) nevertheless prints >'/usr/lib/python2.4/site-packages' in the tracebacks. Does anybody know >how to fix it? > > Try changing your PYTHONPATH environment variable so it only points to site-packages/ where the new scipy is installed. 
-- Ed From cimrman3 at ntc.zcu.cz Wed Oct 19 09:26:14 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Wed, 19 Oct 2005 15:26:14 +0200 Subject: [SciPy-dev] newscipy/Lib/sparse In-Reply-To: <435642E9.4020100@ftw.at> References: <4356375F.7040501@ntc.zcu.cz> <435642E9.4020100@ftw.at> Message-ID: <43564976.7070001@ntc.zcu.cz> Ed Schofield wrote: > Robert Cimrman wrote: > > >>Then when I comment out the 'from scipy.base.fastumath import *' line in >>Sparse.py, I get >> >>... >>/usr/lib/python2.4/site-packages/scipy/sparse/Sparse.py >>/usr/lib/python2.4/site-packages/scipy/sparse/_superlu.py >>ImportError: >>/home/share/software/usr/lib/python2.4/site-packages/scipy/sparse/_zsuperlu.so: >>undefined symbol: strsm_ >> >> > > > I suggest we should strip away the SuperLU bindings until we've got the > basic sparse data types working to our satisfaction. Good suggestion :-). Now after commenting out also 'import _superlu' the import scipy.sparse succeeds. I am now converting Sparse.py for the newcore-type arrays. >>As a side-note, my installation of scipy really is in >>'/home/share/software/usr/lib/python2.4/site-packages'. ipython (and the >>regular python shell as well) nevertheless prints >>'/usr/lib/python2.4/site-packages' in the tracebacks. Does anybody know >>how to fix it? >> > Try changing your PYTHONPATH environment variable so it only points to > site-packages/ where the new scipy is installed. That is what I have :( r. From stephen.walton at csun.edu Wed Oct 19 11:15:28 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 19 Oct 2005 08:15:28 -0700 Subject: [SciPy-dev] newcore atlas info on Gentoo In-Reply-To: <200510181341.12351.dd55@cornell.edu> References: <200510181341.12351.dd55@cornell.edu> Message-ID: <43566310.8070409@csun.edu> Darren Dale wrote: >In order to get scipy-0.3.2 to build on Gentoo Linux, I use site.cfg in the >distutils module. 
> Although I don't use Gentoo, I tried to use site.cfg on my Ubuntu laptop and, like Darren, found it didn't seem to work. Where does it need to be located? It seems to me that distribution of distro-specific site.cfg files might be preferable to trying to get system_info.py to sniff out every possible one. From dd55 at cornell.edu Wed Oct 19 11:36:31 2005 From: dd55 at cornell.edu (Darren Dale) Date: Wed, 19 Oct 2005 11:36:31 -0400 Subject: [SciPy-dev] newcore atlas info on Gentoo In-Reply-To: <43566310.8070409@csun.edu> References: <200510181341.12351.dd55@cornell.edu> <43566310.8070409@csun.edu> Message-ID: <200510191136.31418.dd55@cornell.edu> On Wednesday 19 October 2005 11:15 am, you wrote: > Darren Dale wrote: > >In order to get scipy-0.3.2 to build on Gentoo Linux, I use site.cfg in > > the distutils module. > > Although I don't use Gentoo, I tried to use site.cfg on my Ubuntu laptop > and, like Darren, found it didn't seem to work. Where does it need to > be located? It seems to me that distribution of distro-specific > site.cfg files might be preferable to trying to get system_info.py to > sniff out every possible one. I was thinking that it would be good for scipy to use a site.cfg by default. Maybe distro-specific site.cfg files could be posted on the wiki and ultimately included in scipy. Darren From arnd.baecker at web.de Wed Oct 19 11:39:03 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 19 Oct 2005 17:39:03 +0200 (CEST) Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? In-Reply-To: References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> <4355F532.20102@ntc.zcu.cz> Message-ID: On Wed, 19 Oct 2005, Pearu Peterson wrote: > On Wed, 19 Oct 2005, Robert Cimrman wrote: > > > In [1]:import scipy > > Importing io to scipy > > Importing special to scipy > > Segmentation fault (SIGSEGV) > > This issue is now fixed in SVN (segfaults occured probably due to a bug in > _cephesmodule.c). 
special now imports succesfully on my debian x86_64 box. > > However, there are probably more places where we need to make int->intp > changes for 64-bit platforms. I fear so - the In [1]: from scipy.linalg import fblas Importing io to scipy Importing special to scipy Importing utils to scipy Importing interpolate to scipy Importing optimize to scipy Importing linalg to scipy In [2]: fblas.sger(1,[1,2],[3,4]) Segmentation fault on the opteron persists. (If you would like give me some guidance what to do next to track this down, I am all ears). On my PIV: scipy.test(10,10) gives [...] check_zero (scipy.linalg.matfuncs.test_matfuncs.test_expm) ... ERROR check_nils (scipy.linalg.matfuncs.test_matfuncs.test_logm) ... ERROR check_defective1 (scipy.linalg.matfuncs.test_matfuncs.test_signm) ... ERROR check_defective2 (scipy.linalg.matfuncs.test_matfuncs.test_signm)zsh: 3052 floating point exception (Revisions 1315 and 1345 for newcore and newscipy on both machines) Best, Arnd From stephen.walton at csun.edu Wed Oct 19 11:50:06 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 19 Oct 2005 08:50:06 -0700 Subject: [SciPy-dev] signbit and -inf In-Reply-To: <200510181139.07062.dd55@cornell.edu> References: <200510161433.13556.dd55@cornell.edu> <200510161656.45947.dd55@cornell.edu> <200510181139.07062.dd55@cornell.edu> Message-ID: <43566B2E.7090105@csun.edu> Darren Dale wrote: >>It turns out that on my machine, signbit returns False for any >>negative number, not just -inf. >> >> > >Does anyone have any suggestions as to how I can pursue this? > > All I can say is that it "Works Here" on Fedora Core 4: home built ATLAS and LAPACK per instructions at Scipy.org site, and "python setup.py config_fc --fcompiler=absoft build". 
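The call Arnd reports segfaulting has a well-defined reference answer: BLAS sger performs the rank-1 update alpha * x * y^T, and with no initial matrix supplied that is just an outer product. A sketch with a modern NumPy, using np.outer to stand in for the Fortran routine:

```python
import numpy as np

# sger computes A + alpha * outer(x, y); with no A supplied the wrapper
# starts from zeros, so the result is the outer product itself.  This
# matches the value the unit test expects.
alpha = 1.0
x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

result = alpha * np.outer(x, y)
assert result.tolist() == [[3.0, 4.0], [6.0, 8.0]]
```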
From stephen.walton at csun.edu Wed Oct 19 12:09:23 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 19 Oct 2005 09:09:23 -0700 Subject: [SciPy-dev] newcore atlas info on Gentoo In-Reply-To: <200510191136.31418.dd55@cornell.edu> References: <200510181341.12351.dd55@cornell.edu> <43566310.8070409@csun.edu> <200510191136.31418.dd55@cornell.edu> Message-ID: <43566FB3.4090108@csun.edu> Darren Dale wrote: >I was thinking that it would be good for scipy to use a site.cfg by default. > > site.cfg does work. I created a site.cfg in newcore/scipy/distutils containing [lapack_src] src_dirs=/home/swalton/src/LAPACK/SRC which is where, in fact, my LAPACK sources are. "python system_info.py" in this directory then correctly finds the sources. From stephen.walton at csun.edu Wed Oct 19 12:21:27 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 19 Oct 2005 09:21:27 -0700 Subject: [SciPy-dev] Arrays and matrices share memory? Message-ID: <43567287.2030402@csun.edu> I began tracking down the failures in scipy.test (full new install) this morning on the bus and hit two problems right off, both in expm3. The first is easy to fix: line 84 needs to change from eA = eye(*A.shape,**{'typecode':t}) to eA = eye(*A.shape,**{'dtype':t}) The second one is more subtle and here's a brief demonstration of the problem: In [1]:import scipy as S In [2]:A=S.array([[1.,1],[1,1]]) In [3]:trm = S.mat(A) In [4]:trm Out[4]: matrix([[ 1., 1.], [ 1., 1.]]) In [5]:A Out[5]: array([[ 1., 1.], [ 1., 1.]]) In [6]:trm is A Out[6]:False In [7]:trm *= 0 In [8]:trm Out[8]: matrix([[ 0., 0.], [ 0., 0.]]) In [9]:A Out[9]: array([[ 0., 0.], [ 0., 0.]]) Notice that A has been set to zeros even though "trm is A" returns False. My guess, without delving deeply into the code, is that in newcore A and trm are sharing the same array object. expm3 (and probably many other routines in scipy) depend on B=mat(A) creating a completely new copy of A. Should this be changed? 
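Stephen's demonstration comes down to view-versus-copy semantics. With a modern NumPy the same distinction can be made explicit via ndarray.view, ndarray.copy, and np.shares_memory; this is a sketch of the concept, not the 2005 API:

```python
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 1.0]])

view = A.view()   # shares A's buffer, like mat(A) in the report
copy = A.copy()   # independent buffer, the old Numeric Matrix default

view *= 0
assert A[0, 0] == 0.0       # A was zeroed through the view
assert copy[0, 0] == 1.0    # the copy is untouched

# np.shares_memory makes the relationship explicit even though
# 'view is A' is False, just as 'trm is A' was False above.
assert np.shares_memory(view, A)
assert not np.shares_memory(copy, A)
```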
From oliphant at ee.byu.edu Wed Oct 19 13:46:35 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 11:46:35 -0600 Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? In-Reply-To: References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> <4355F532.20102@ntc.zcu.cz> Message-ID: <4356867B.6090106@ee.byu.edu> Arnd Baecker wrote: >On Wed, 19 Oct 2005, Pearu Peterson wrote: > > > >>On Wed, 19 Oct 2005, Robert Cimrman wrote: >> >> >> >>>In [1]:import scipy >>>Importing io to scipy >>>Importing special to scipy >>>Segmentation fault (SIGSEGV) >>> >>> >>This issue is now fixed in SVN (segfaults occured probably due to a bug in >>_cephesmodule.c). special now imports succesfully on my debian x86_64 box. >> >>However, there are probably more places where we need to make int->intp >>changes for 64-bit platforms. >> >> > >I fear so - the >In [1]: from scipy.linalg import fblas >Importing io to scipy >Importing special to scipy >Importing utils to scipy >Importing interpolate to scipy >Importing optimize to scipy >Importing linalg to scipy > >In [2]: fblas.sger(1,[1,2],[3,4]) >Segmentation fault > > The first thing to do is to look for warnings in the compilation of fblasmodule.c that might lead to int->intp problems. I did not see anything significant in the build log you last sent, but I might have missed something. The next thing to do is to track down where the segfault is happening. You could try running gdb gdb file /// run from scipy.linalg import fblas fblas.sger(1,[1,2],[3,4]) gdb will catch the segfault and show you where it died. As a last resort, I insert print statements to guide me where the code is dying and carefully inspect that section of code. 
-Travis From oliphant at ee.byu.edu Wed Oct 19 13:53:03 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 11:53:03 -0600 Subject: [SciPy-dev] signbit and -inf In-Reply-To: <200510181139.07062.dd55@cornell.edu> References: <200510161433.13556.dd55@cornell.edu> <200510161656.45947.dd55@cornell.edu> <200510181139.07062.dd55@cornell.edu> Message-ID: <435687FF.6000507@ee.byu.edu> Darren Dale wrote: >>I can run the full test without a segfault, although I am only using the >>newcore (I get a segfault when I install newscipy and try to import scipy). >>I am using Gentoo on a Pentium4, with Python-2.4.2 and gcc-3.4.4. I built >>scipy against BLAS/LAPACK/ATLAS using the same site.cfg as I used to build >>scipy-0.3.2. It turns out that on my machine, signbit returns False for any >>negative number, not just -inf. >> >> > >Does anyone have any suggestions as to how I can pursue this? > > > Look at the config.h file in the scipy include directory (under site-packages/scipy/base/include) The code for signbit is from cephes and is in scipy/base/src/_signbit.c It is pretty simple but depends on certain platform characteristics. The important questions are is WORDS_BIGENDIAN defined in pyconfig.h (in include/pythonX.X) and what is the SIZEOF integer on your platform. For your system, I would expect SIZEOF_INT to be 4 (look in config.h) and WORDS_BIGENDIAN to not be defined. 
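The test Darren should be seeing can be sketched in pure Python with the struct module. This mirrors the intent of _signbit.c, which is to inspect the IEEE-754 sign bit directly; it is not the C source itself, and fixing the byte order with '<' sidesteps the WORDS_BIGENDIAN/SIZEOF_INT questions the C version has to answer:

```python
import struct

def signbit(x):
    # Pack x as a little-endian IEEE-754 double and test the top bit.
    (bits,) = struct.unpack("<Q", struct.pack("<d", x))
    return int(bits >> 63)

assert signbit(-1.0) == 1          # any negative number, not just -inf
assert signbit(1.0) == 0
assert signbit(-0.0) == 1          # negative zero keeps its sign bit
assert signbit(float("-inf")) == 1
```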
-Travis From dd55 at cornell.edu Wed Oct 19 14:28:58 2005 From: dd55 at cornell.edu (Darren Dale) Date: Wed, 19 Oct 2005 14:28:58 -0400 Subject: [SciPy-dev] signbit and -inf In-Reply-To: <435687FF.6000507@ee.byu.edu> References: <200510161433.13556.dd55@cornell.edu> <200510181139.07062.dd55@cornell.edu> <435687FF.6000507@ee.byu.edu> Message-ID: <200510191428.59052.dd55@cornell.edu> Hi Travis, On Wednesday 19 October 2005 1:53 pm, Travis Oliphant wrote: > Darren Dale wrote: > >>I can run the full test without a segfault, although I am only using the > >>newcore (I get a segfault when I install newscipy and try to import > >> scipy). I am using Gentoo on a Pentium4, with Python-2.4.2 and > >> gcc-3.4.4. I built scipy against BLAS/LAPACK/ATLAS using the same > >> site.cfg as I used to build scipy-0.3.2. It turns out that on my > >> machine, signbit returns False for any negative number, not just -inf. > > > >Does anyone have any suggestions as to how I can pursue this? > > Look at the config.h file in the scipy include directory (under > site-packages/scipy/base/include) My config.h is located in site-packages/scipy/base/include/scipy: /* #define SIZEOF_SHORT 2 */ /* #define SIZEOF_INT 4 */ /* #define SIZEOF_LONG 4 */ /* #define SIZEOF_FLOAT 4 */ /* #define SIZEOF_DOUBLE 8 */ #define SIZEOF_LONG_DOUBLE 12 #define SIZEOF_PY_INTPTR_T 4 /* #define SIZEOF_LONG_LONG 8 */ #define SIZEOF_PY_LONG_LONG 8 /* #define CHAR_BIT 8 */ #define MATHLIB m #define HAVE_LONGDOUBLE_FUNCS #define HAVE_FLOAT_FUNCS #define HAVE_INVERSE_HYPERBOLIC #define HAVE_INVERSE_HYPERBOLIC_FLOAT #define HAVE_INVERSE_HYPERBOLIC_LONGDOUBLE #define HAVE_ISNAN > The important questions are > > is WORDS_BIGENDIAN defined in pyconfig.h (in include/pythonX.X) and what > is the SIZEOF integer on your platform. > > For your system, I would expect SIZEOF_INT to be 4 (look in config.h) > and WORDS_BIGENDIAN to not be defined. 
Here is the only mention of WORDS_BIGENDIAN in /usr/include/python2.4/pyconfig.h /* Define to 1 if your processor stores words with the most significant byte first (like Motorola and SPARC, unlike Intel and VAX). */ /* #undef WORDS_BIGENDIAN */ Darren From arnd.baecker at web.de Wed Oct 19 15:17:03 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 19 Oct 2005 21:17:03 +0200 (CEST) Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? In-Reply-To: <4356867B.6090106@ee.byu.edu> References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> <4355F532.20102@ntc.zcu.cz> <4356867B.6090106@ee.byu.edu> Message-ID: Hi Travis, On Wed, 19 Oct 2005, Travis Oliphant wrote: > Arnd Baecker wrote: > >In [2]: fblas.sger(1,[1,2],[3,4]) > >Segmentation fault > > > The first thing to do is to look for warnings in the compilation of > fblasmodule.c that might lead to int->intp problems. > > I did not see anything significant in the build log you last sent, but I > might have missed something. > > The next thing to do is to track down where the segfault is happening. > You could try running gdb > > gdb > file /// > run > from scipy.linalg import fblas > fblas.sger(1,[1,2],[3,4]) > > gdb will catch the segfault and show you where it died. > > As a last resort, I insert print statements to guide me where the code > is dying and carefully inspect that section of code. 
> > -Travis Alright, there are warnings: Build ===== gcc: build/src/fortranobject.c gcc: build/src/build/src/scipy/linalg/fblasmodule.c build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_sger': build/src/build/src/scipy/linalg/fblasmodule.c:8045: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_dger': build/src/build/src/scipy/linalg/fblasmodule.c:8237: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_cgeru': build/src/build/src/scipy/linalg/fblasmodule.c:8429: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_zgeru': build/src/build/src/scipy/linalg/fblasmodule.c:8621: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_cgerc': build/src/build/src/scipy/linalg/fblasmodule.c:8813: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_zgerc': build/src/build/src/scipy/linalg/fblasmodule.c:9005: warning: passing arg 1 of `initforcomb' from incompatible pointer type compiling Fortran sources g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=opteron -mmmx -msse2 -msse -m3dnow' creating build/temp.linux-x86_64-2.4/Lib/linalg Test ==== scipy.test(10,10) check_ger (scipy.linalg.blas.test_blas.test_fblas2_simple) Segmentation fault gdb file /scr/python/bin/python run from scipy.linalg import fblas fblas.sger(1,[1,2],[3,4]) Program received signal SIGSEGV, Segmentation fault. 
[Switching to Thread 46912507335168 (LWP 1278)] 0x00002aaab298ce1b in f2py_rout_fblas_sger (capi_self=0x8a9ae0, capi_args=0x8a9ae0, capi_keywds=0x0, f2py_func=0x2aaab2997e10 ) at fblasmodule.c:8047 8047 a[capi_i++] = 0; /* fortran way */ (gdb) bt #0 0x00002aaab298ce1b in f2py_rout_fblas_sger (capi_self=0x8a9ae0, capi_args=0x8a9ae0, capi_keywds=0x0, f2py_func=0x2aaab2997e10 ) at fblasmodule.c:8047 #1 0x0000000000417700 in PyObject_Call (func=0x8a9ae0, arg=0x0, kw=0x1) at abstract.c:1756 #2 0x00000000004777d9 in PyEval_EvalFrame (f=0x685990) at ceval.c:3766 #3 0x000000000047ad2f in PyEval_EvalCodeEx (co=0x2aaaaab20030, globals=0x0, locals=0x1, args=0x685990, argcount=0, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at ceval.c:2736 #4 0x000000000047af72 in PyEval_EvalCode (co=0x8a9ae0, globals=0x0, locals=0x1) at ceval.c:484 #5 0x00000000004a1c72 in PyRun_InteractiveOneFlags (fp=0x2aaaaab136b0, filename=0x4cbf24 "", flags=0x7fffffa574dc) at pythonrun.c:1265 #6 0x00000000004a1e04 in PyRun_InteractiveLoopFlags (fp=0x2aaaab556b00, filename=0x4cbf24 "", flags=0x7fffffa574dc) at pythonrun.c:695 #7 0x00000000004a2350 in PyRun_AnyFileExFlags (fp=0x2aaaab556b00, filename=0x0, closeit=0, flags=0x7fffffa574dc) at pythonrun.c:658 #8 0x0000000000410788 in Py_Main (argc=0, argv=0x7fffffa58a6c) at main.c:484 #9 0x00002aaaab34d5aa in __libc_start_main () from /lib64/tls/libc.so.6 #10 0x000000000040fdfa in _start () at start.S:113 #11 0x00007fffffa575d8 in ?? () #12 0x00002aaaaabc19c0 in rtld_errno () from /lib64/ld-linux-x86-64.so.2 #13 0x0000000000000001 in ?? () fblasmodule.c is generated via f2py, so applying changes to that file is presumably not the way to go, right? Oh - looking at the header: /* Do not edit this file directly unless you know what you are doing!!! */ This definitively means that *I* should not edit that file ;-) Of course I could try do the changes, but maybe you see the problem straight away (if not I will give it a try ...) 
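The crash is consistent with the int-versus-intp warnings above: on an LP64 platform like the Opteron, dimension values are 64-bit, and reading them through a 32-bit int layout scrambles every entry after the first. A struct-module sketch of the mismatch, illustrative only, with little-endian byte order assumed:

```python
import struct

# Two pointer-sized (64-bit) dimensions, e.g. a shape of (2, 3),
# as newcore's intp stores them on an Opteron:
dims64 = struct.pack("<2q", 2, 3)

# Correct 64-bit read:
assert struct.unpack("<2q", dims64) == (2, 3)

# The old 32-bit int convention sees each dimension split in two, so
# "dims[1]" is the zero high half of the first value rather than 3 --
# exactly the kind of garbage indexing that ends in a segfault.
as32 = struct.unpack("<4i", dims64)
assert as32 == (2, 0, 3, 0)
assert as32[1] != 3
```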
Best, Arnd From oliphant at ee.byu.edu Wed Oct 19 16:21:56 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 14:21:56 -0600 Subject: [SciPy-dev] [SciPy-user] operations on int8 arrays In-Reply-To: <43566C59.6090508@psychology.nottingham.ac.uk> References: <43566C59.6090508@psychology.nottingham.ac.uk> Message-ID: <4356AAE4.7020009@ee.byu.edu> Jon Peirce wrote: >>>>Scipy arrays with dtype=uint8 or int8 seem to be >>>>mathematically-challenged on my machine (AMD64 WinXP running python >>>>2.4.2, scipy core 0.4.1). Simple int (and various others) appear fine. >>>> >>> >>> >>>>>>> >>>import scipy >>>>>>> >>>xx=scipy.array([100,100,100],scipy.int8) >>>>>>> >>>print xx.sum() >>>>>> >>>>>> >>>> 44 >>> >>> >>>>>>> >>>xx=scipy.array([100,100,100],scipy.int) >>>>>>> >>>print xx.sum() >>>>>> >>>>>> >>>> 300 >>>> >>>> >>> >>> >>This is not a bug. In the first line, you are telling the computer to >>add up 8-bit integers. The result does not fit in an 8-bit integer --- >>thus you are computing modulo 256. >> >>I suspect you wanted for the first case. >> >>xx.sum(rtype=int) -- this will "reduce" using the long integer type on >>your platform. >> >>-Travis >> > > Right, yes. I find it a bit unintuitive though that the resulting > array isn't automatically converted to a suitable type where > necessary. Even less intuitive is this: > >>>import scipy > >>>xx=scipy.array([100,100,100],scipy.int8) > >>>print xx.mean() > 14.666666666666666 Hmm. This is true. But, it is consistent with the behavior of true_divide which mean uses. It would be possible to make the default reduce type for integers 32-bit on 32-bit platforms and 64-bit on 64-bit platforms. the long integer type. Do people think that would be a good idea? These kinds of questions do come up. Or, this could simply be the default when calling the .sum method (which is add.reduce under the covers). The reduce method could stay with the default of the integer type. 
Obviously, it's going to give "unexpected" results to somebody. Automatic upcasting can have its downsides. But, perhaps in this case (integer reductions), it is better to do the upcasting. What do people think? -Travis From oliphant at ee.byu.edu Wed Oct 19 16:27:28 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 14:27:28 -0600 Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? In-Reply-To: References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> <4355F532.20102@ntc.zcu.cz> <4356867B.6090106@ee.byu.edu> Message-ID: <4356AC30.3020402@ee.byu.edu> Arnd Baecker wrote: >Hi Travis, > >On Wed, 19 Oct 2005, Travis Oliphant wrote: > > > >>Arnd Baecker wrote: >> >> > > > >Build >===== > >gcc: build/src/fortranobject.c >gcc: build/src/build/src/scipy/linalg/fblasmodule.c >build/src/build/src/scipy/linalg/fblasmodule.c: In function >`f2py_rout_fblas_sger': >build/src/build/src/scipy/linalg/fblasmodule.c:8045: warning: passing arg >1 of `initforcomb' from incompatible pointer type >build/src/build/src/scipy/linalg/fblasmodule.c: In function >`f2py_rout_fblas_dger': >build/src/build/src/scipy/linalg/fblasmodule.c:8237: warning: passing arg >1 of `initforcomb' from incompatible pointer type >build/src/build/src/scipy/linalg/fblasmodule.c: In function >`f2py_rout_fblas_cgeru': >build/src/build/src/scipy/linalg/fblasmodule.c:8429: warning: passing arg >1 of `initforcomb' from incompatible pointer type >build/src/build/src/scipy/linalg/fblasmodule.c: In function >`f2py_rout_fblas_zgeru': >build/src/build/src/scipy/linalg/fblasmodule.c:8621: warning: passing arg >1 of `initforcomb' from incompatible pointer type >build/src/build/src/scipy/linalg/fblasmodule.c: In function >`f2py_rout_fblas_cgerc': >build/src/build/src/scipy/linalg/fblasmodule.c:8813: warning: passing arg >1 of `initforcomb' from incompatible pointer type >build/src/build/src/scipy/linalg/fblasmodule.c: In function >`f2py_rout_fblas_zgerc': 
>build/src/build/src/scipy/linalg/fblasmodule.c:9005: warning: passing arg >1 of `initforcomb' from incompatible pointer type >compiling Fortran sources >g77(f77) options: '-Wall -fno-second-underscore -fPIC -O3 -funroll-loops >-march=opteron -mmmx -msse2 -msse -m3dnow' >creating build/temp.linux-x86_64-2.4/Lib/linalg > >Test >==== > >scipy.test(10,10) > >check_ger (scipy.linalg.blas.test_blas.test_fblas2_simple) >Segmentation fault > > >gdb >file /scr/python/bin/python >run >from scipy.linalg import fblas >fblas.sger(1,[1,2],[3,4]) > >Program received signal SIGSEGV, Segmentation fault. >[Switching to Thread 46912507335168 (LWP 1278)] >0x00002aaab298ce1b in f2py_rout_fblas_sger (capi_self=0x8a9ae0, > capi_args=0x8a9ae0, capi_keywds=0x0, f2py_func=0x2aaab2997e10 ) > at fblasmodule.c:8047 >8047 a[capi_i++] = 0; /* fortran way */ > >(gdb) bt >#0 0x00002aaab298ce1b in f2py_rout_fblas_sger (capi_self=0x8a9ae0, > capi_args=0x8a9ae0, capi_keywds=0x0, f2py_func=0x2aaab2997e10 ) > at fblasmodule.c:8047 >#1 0x0000000000417700 in PyObject_Call (func=0x8a9ae0, arg=0x0, kw=0x1) > at abstract.c:1756 >#2 0x00000000004777d9 in PyEval_EvalFrame (f=0x685990) at ceval.c:3766 >#3 0x000000000047ad2f in PyEval_EvalCodeEx (co=0x2aaaaab20030, >globals=0x0, > locals=0x1, args=0x685990, argcount=0, kws=0x0, kwcount=0, defs=0x0, > defcount=0, closure=0x0) at ceval.c:2736 >#4 0x000000000047af72 in PyEval_EvalCode (co=0x8a9ae0, globals=0x0, > locals=0x1) at ceval.c:484 >#5 0x00000000004a1c72 in PyRun_InteractiveOneFlags (fp=0x2aaaaab136b0, > filename=0x4cbf24 "", flags=0x7fffffa574dc) at pythonrun.c:1265 >#6 0x00000000004a1e04 in PyRun_InteractiveLoopFlags (fp=0x2aaaab556b00, > filename=0x4cbf24 "", flags=0x7fffffa574dc) at pythonrun.c:695 >#7 0x00000000004a2350 in PyRun_AnyFileExFlags (fp=0x2aaaab556b00, > filename=0x0, closeit=0, flags=0x7fffffa574dc) at pythonrun.c:658 >#8 0x0000000000410788 in Py_Main (argc=0, argv=0x7fffffa58a6c) at >main.c:484 >#9 0x00002aaaab34d5aa in __libc_start_main 
() from /lib64/tls/libc.so.6 >#10 0x000000000040fdfa in _start () at start.S:113 >#11 0x00007fffffa575d8 in ?? () >#12 0x00002aaaaabc19c0 in rtld_errno () from /lib64/ld-linux-x86-64.so.2 >#13 0x0000000000000001 in ?? () > > >fblasmodule.c is generated via f2py, so applying changes to >that file is presumably not the way to go, right? > > Well, you can make changes to it. It's just that these changes need to be placed in the appropriate code-generation area of f2py. If you can make changes to fblasmodule.c that work, then we can back those changes into f2py. Or if you are adventurous you can scout around in the f2py code to see where the code gets generated, and make the changes there. I'm looking at the f2py-generated code now. There are several changes that need to be made to better support new scipy. -Travis From arnd.baecker at web.de Wed Oct 19 16:38:04 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Wed, 19 Oct 2005 22:38:04 +0200 (CEST) Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? 
In-Reply-To: References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> <4355F532.20102@ntc.zcu.cz> Message-ID: Hi, I *think* that in fblasmodule.c `initforcomb` and `nextforcomb` should be changed to static struct { intp nd, *d; intp *i,*i_tr,tr; } forcombcache; static int initforcomb(intp *dims,int nd,int tr) { int k; if (dims==NULL) return 0; if (nd<0) return 0; forcombcache.nd = nd; forcombcache.d = dims; forcombcache.tr = tr; if ((forcombcache.i = (intp *)malloc(sizeof(intp)*nd))==NULL) return 0; if ((forcombcache.i_tr = (intp *)malloc(sizeof(intp)*nd))==NULL) return 0; for (k=1;k (all the errors I can scroll back are of the type: ====================================================================== ERROR: check_simple_transpose_conj (scipy.linalg.fblas.test_fblas.test_cgemv) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_38/inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/tests/test_fblas.py", line 327, in check_simple_transpose_conj alpha,beta,a,x,y = self.get_data() File "/home/abaecker/BUILDS2/Build_38/inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/tests/test_fblas.py", line 301, in get_data mult = array(1, typecode = self.typecode) TypeError: 'typecode' is an invalid keyword argument for this function Most importantly: In [4]: from scipy.linalg import fblas In [5]: fblas.sger(1,[1,2],[3,4]) Out[5]: array([[ 3., 4.], [ 6., 8.]], dtype=float32) which is in accordance with the unit test. 
I have no idea how to transfer that fix back to the automatic generation via f2py, so I will leave this for the experts ;-) Good night, Arnd From Fernando.Perez at colorado.edu Wed Oct 19 16:55:47 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 19 Oct 2005 14:55:47 -0600 Subject: [SciPy-dev] [SciPy-user] operations on int8 arrays In-Reply-To: <4356AAE4.7020009@ee.byu.edu> References: <43566C59.6090508@psychology.nottingham.ac.uk> <4356AAE4.7020009@ee.byu.edu> Message-ID: <4356B2D3.6040200@colorado.edu> Travis Oliphant wrote: > Or, this could simply be the default when calling the .sum method (which > is add.reduce under the covers). The reduce method could stay with the > default of the integer type. > > Obviously, it's going to give "unexpected" results to somebody. > Automatic upcasting can have its downsides. But, perhaps in this case > (integer reductions), it is better to do the upcasting. What do people > think? I've personally always been of the opinion that accumulator methods are one case where automatic upcasting is justified. Since not doing it is almost guaranteed to produce incorrect results in most cases (esp. for small bit-size types), I'm +1 on upcasting on this one. I agree that in general we shouldn't upcast silently, but I think this is a case of 'practicality beats purity'. Think of it this way: almost anyone writing the equivalent to .sum() manually in C would write this with a wide enough accumulator, so I think it's OK for scipy to do the same. Just my 1e-2. Cheers, f From pearu at scipy.org Wed Oct 19 15:59:03 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 19 Oct 2005 14:59:03 -0500 (CDT) Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? 
In-Reply-To: References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> <4355F532.20102@ntc.zcu.cz> Message-ID: On Wed, 19 Oct 2005, Arnd Baecker wrote: > Hi, > > I *think* that in fblasmodule.c > `initforcomb` and `nextforcomb` should be changed to > > static struct { intp nd, *d; intp *i,*i_tr,tr; } forcombcache; static struct { int nd; intp *d; int *i,*i_tr,tr; } forcombcache; etc would be enough. This is in SVN now. And >>> from scipy.linalg import fblas Importing io to scipy Importing special to scipy Importing sparse to scipy Importing utils to scipy Importing interpolate to scipy Importing optimize to scipy Importing linalg to scipy >>> fblas.sger(1,[1,2],[3,4]) array([[ 3., 4.], [ 6., 8.]], dtype=float32) >>> on an opteron. Pearu From oliphant at ee.byu.edu Wed Oct 19 17:57:17 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 15:57:17 -0600 Subject: [SciPy-dev] Arrays and matrices share memory? In-Reply-To: <43567287.2030402@csun.edu> References: <43567287.2030402@csun.edu> Message-ID: <4356C13D.8070400@ee.byu.edu> Stephen Walton wrote: >I began tracking down the failures in scipy.test (full new install) this >morning on the bus and hit two problems right off, both in expm3. The >first is easy to fix: line 84 needs to change from > > eA = eye(*A.shape,**{'typecode':t}) > >to > eA = eye(*A.shape,**{'dtype':t}) > >The second one is more subtle and here's a brief demonstration of the >problem: > >In [1]:import scipy as S > >In [2]:A=S.array([[1.,1],[1,1]]) > >In [3]:trm = S.mat(A) > >In [4]:trm >Out[4]: >matrix([[ 1., 1.], > [ 1., 1.]]) > >In [5]:A >Out[5]: >array([[ 1., 1.], > [ 1., 1.]]) > >In [6]:trm is A >Out[6]:False > >In [7]:trm *= 0 > >In [8]:trm >Out[8]: >matrix([[ 0., 0.], > [ 0., 0.]]) > >In [9]:A >Out[9]: >array([[ 0., 0.], > [ 0., 0.]]) > >Notice that A has been set to zeros even though "trm is A" returns >False. 
My guess, without delving deeply into the code, is that in >newcore A and trm are sharing the same array object. expm3 (and >probably many other routines in scipy) depend on B=mat(A) creating a >completely new copy of A. Should this be changed? > > > Ah, good catch. Yes, I believe that mat(A) shares the same data with A. If this was not the default previously, then it needs to change. -Travis From schofield at ftw.at Wed Oct 19 18:24:21 2005 From: schofield at ftw.at (Ed Schofield) Date: Thu, 20 Oct 2005 00:24:21 +0200 (CEST) Subject: [SciPy-dev] operations on int8 arrays In-Reply-To: <4356AAE4.7020009@ee.byu.edu> References: <43566C59.6090508@psychology.nottingham.ac.uk> <4356AAE4.7020009@ee.byu.edu> Message-ID: On Wed, 19 Oct 2005, Travis Oliphant wrote: > Jon Peirce wrote: > > >>Scipy arrays with dtype=uint8 or int8 seem to be > >>mathematically-challenged on my machine (AMD64 WinXP running python > >>2.4.2, scipy core 0.4.1). Simple int (and various others) appear fine. > >>>>> >>>import scipy > >>>>> >>>xx=scipy.array([100,100,100],scipy.int8) > >>>>> >>>print xx.sum() > >> 44 > >>>>> >>>xx=scipy.array([100,100,100],scipy.int) > >>>>> >>>print xx.sum() > >> 300 > > This is not a bug. In the first line, you are telling the computer to > add up 8-bit integers. The result does not fit in an 8-bit integer --- > thus you are computing modulo 256. I was bitten by this back in April: http://www.scipy.org/mailinglists/mailman?fn=scipy-dev/2005-April/002937.html I wasted several hours then trying to hunt down bugs in my code, before I finally realized that my sum() call was responsible. I strongly believe that the default should be changed here to upcast by default. My reasons are: 1. Python would do the same: it 'just works', upcasting where necessary from int to big integer and, in the future, making division with two int arguments return a float. 
We also want to avoid differences between Python's sum() and scipy's sum(): >>> a = scipy.array([100,100, 100], scipy.int8) >>> sum(a) 300 >>> scipy.sum(a) 44 2. the result of sum() or mean() without any modulo arithmetic would be a python int or float, and it seems reasonable that the result is accurate to the width of the output type. 3. the advantage in space efficiency of using a smaller type for accumulated operations is minimal (perhaps unlike an operation whose output is an array). > It would be possible to make the default reduce type for integers 32-bit > on 32-bit platforms and 64-bit on 64-bit platforms. the long integer type. As far as I understand, a Python int is always a C long, but a C long isn't always the platform word length (e.g. is sometimes 32 bit on 64 bit machines). So perhaps it'd be better to make the default reduce type for integers a C long? > Or, this could simply be the default when calling the .sum method (which > is add.reduce under the covers). The reduce method could stay with the > default of the integer type. I think reduce should upcast too. -- Ed From stephen.walton at csun.edu Wed Oct 19 18:31:27 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 19 Oct 2005 15:31:27 -0700 Subject: [SciPy-dev] Arrays and matrices share memory? In-Reply-To: <4356C13D.8070400@ee.byu.edu> References: <43567287.2030402@csun.edu> <4356C13D.8070400@ee.byu.edu> Message-ID: <4356C93F.2020408@csun.edu> Travis Oliphant wrote: >Ah, good catch. Yes, I believe that mat(A) shares the same data with >A. If this was not the default previously, then it needs to change. > > Copy was the default previously. If you look at Matrix.py in Numeric 23.8, you'll see the default value of copy in the __init__ method of the Matrix class is 1, while it is presently 0 in the __new__ method of the new matrix class. Changing this allows expm3 and friends to work properly again. 
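For reference, the upcast-by-default behavior argued for in the int8 thread above is what NumPy ultimately adopted. A sketch against a modern NumPy, where the dtype= keyword plays the role of rtype=:

```python
import numpy as np

xx = np.array([100, 100, 100], np.int8)

# Integer sums accumulate in at least the platform default integer
# width, so there is no modulo-256 surprise:
assert xx.sum() == 300

# The old wrap-around result is still reachable by forcing the
# accumulator dtype:
assert xx.sum(dtype=np.int8) == 44

# mean() accumulates in float64, so it is exact here as well:
assert xx.mean() == 100.0
```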
From stephen.walton at csun.edu Wed Oct 19 19:10:36 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 19 Oct 2005 16:10:36 -0700 Subject: [SciPy-dev] numpyio.io problem Message-ID: <4356D26C.4010005@csun.edu> My last Bug of the Day, which is also triggered by the first test in test_array_import.py, shows that numpyio.fwrite will not convert a real array to shorts before writing them out. Note the last line succeeds in writing ints. Is this new/old/desired behavior? In [1]:import scipy Importing io to scipy Importing special to scipy Importing utils to scipy Importing interpolate to scipy Importing optimize to scipy Importing linalg to scipy In [2]:import scipy.io.numpyio as numpyio In [3]:a=255*scipy.rand(20) In [4]:fid=open('foo','wb+') In [5]:numpyio.fwrite(fid,20,a,'s') --------------------------------------------------------------------------- exceptions.ValueError Traceback (most recent call last) /home/swalton/ ValueError: Invalid type for array In [10]:numpyio.fwrite(fid,20,a,'i') From oliphant at ee.byu.edu Wed Oct 19 23:24:44 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 21:24:44 -0600 Subject: [SciPy-dev] numpyio.io problem In-Reply-To: <4356D26C.4010005@csun.edu> References: <4356D26C.4010005@csun.edu> Message-ID: <43570DFC.70005@ee.byu.edu> Stephen Walton wrote: >My last Bug of the Day, which is also triggered by the first test in >test_array_import.py, shows that numpyio.fwrite will not convert a real >array to shorts before writing them out. Note the last line succeeds in >writing ints. Is this new/old/desired behavior? 
> >In [1]:import scipy >Importing io to scipy >Importing special to scipy >Importing utils to scipy >Importing interpolate to scipy >Importing optimize to scipy >Importing linalg to scipy > >In [2]:import scipy.io.numpyio as numpyio > >In [3]:a=255*scipy.rand(20) > >In [4]:fid=open('foo','wb+') > >In [5]:numpyio.fwrite(fid,20,a,'s') >--------------------------------------------------------------------------- >exceptions.ValueError Traceback (most >recent call last) > >/home/swalton/ > >ValueError: Invalid type for array > > 's' is no longer the typecode character for "short". You are looking for 'h' here. This is the character used by the struct module. I guess it means 'half' or something. numpyio needs to be updated to take actual type objects and not just character. It is not a difficult fix, since PyArray_TypecodeConverter is available in the new C-API. -Travis From oliphant at ee.byu.edu Wed Oct 19 23:28:28 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 21:28:28 -0600 Subject: [SciPy-dev] Most tests pass now in full scipy. Message-ID: <43570EDC.4070507@ee.byu.edu> A few modifications to f2py (to update the flags after lazy transposing, and to allow array scalars to be used in complex conversions --- necessary because sadly there is no PyNumber_Complex function in Python). These modifications, plus an enhancement to the C-API PyArray_ScalarAsCtype(PyObject *scalar, void *ctypeptr) seems to make many of the tests pass. I'm not sure where the 64-bit fixes stand. I know many have been made. I'd like to see more reports. -Travis From oliphant at ee.byu.edu Thu Oct 20 00:19:55 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 19 Oct 2005 22:19:55 -0600 Subject: [SciPy-dev] Speed tests Message-ID: <43571AEB.7020502@ee.byu.edu> Currently, there is a bit more overhead for ufuncs in scipy core then there was in Numeric. This overhead is not especially significant for large arrays. 
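[Editorial note: the typecode correspondence Travis mentions comes from the struct module's format characters, where 'h' is a signed short; 's' there is a fixed-length byte string, not a short. A quick check, with the cast-then-write equivalent of the failing fwrite call sketched in current NumPy terms (the file name 'foo' just mirrors the example above):]

```python
import struct
import numpy as np

# struct format characters: 'h' is a signed C short, 'i' an int.
assert struct.calcsize('<h') == 2             # standard-size short: 2 bytes
assert struct.pack('<h', 300) == b'\x2c\x01'  # 300 = 0x012c, little-endian

# The numpyio call that failed above amounts to "write these doubles as
# shorts"; the explicit-cast equivalent is:
a = 255 * np.random.rand(20)
a.astype(np.int16).tofile('foo')              # convert first, then write
assert np.fromfile('foo', dtype=np.int16).shape == (20,)
```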
However, for small arrays, it can mean 2x-3x slowdown. The code attached shows a case where I observe a 2x slowdown. About half of the overhead is due to looking up an error mask, a callback function, and a buffer size in the local, global, and builtin scope. (If I just use defaults and don't try and look them up, the slow-down shown by the attached test goes to about 50% slower). So, the question is, Is this acceptable, or should a different approach to setting the error mask be taken? I can immediately see that even if we keep the current method, I need to fix things so that only one look-up is done (and a tuple returned) instead of three separate lookups. Any ideas? -Travis -------------- next part -------------- A non-text attachment was scrubbed... Name: numerictest.py Type: text/x-python Size: 361 bytes Desc: not available URL: From arnd.baecker at web.de Thu Oct 20 03:01:14 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 20 Oct 2005 09:01:14 +0200 (CEST) Subject: [SciPy-dev] Speed tests In-Reply-To: <43571AEB.7020502@ee.byu.edu> References: <43571AEB.7020502@ee.byu.edu> Message-ID: On Wed, 19 Oct 2005, Travis Oliphant wrote: > Currently, there is a bit more overhead for ufuncs in scipy core then > there was in Numeric. > > This overhead is not especially significant for large arrays. However, > for small arrays, it can mean 2x-3x slowdown. The code attached shows a > case where I observe a 2x slowdown. > > About half of the overhead is due to looking up an error mask, a > callback function, and a buffer size in the local, global, and builtin > scope. (If I just use defaults and don't try and look them up, the > slow-down shown by the attached test goes to about 50% slower). > > So, the question is, Is this acceptable, or should a different approach > to setting the error mask be taken? 
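[Editorial note: the small-array penalty Travis describes is easy to measure. A rough timeit comparison of the per-element cost of a ufunc on a tiny versus a large array; the absolute numbers are machine-dependent, only the ratio matters:]

```python
import timeit
import numpy as np

small = np.arange(10, dtype=float)
large = np.arange(100_000, dtype=float)

t_small = timeit.timeit('small + small', globals={'small': small}, number=10_000)
t_large = timeit.timeit('large + large', globals={'large': large}, number=100)

# The fixed per-call setup cost (argument parsing, error-state lookups,
# output allocation) is paid on every call, so tiny arrays cost far more
# per element than large ones.
per_elem_small = t_small / (10_000 * small.size)
per_elem_large = t_large / (100 * large.size)
print(per_elem_small / per_elem_large)  # typically well above 1
```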
Without knowing the meaning of the three calls I am tempted to say speed speed speed ;-) But this is really a statement in complete ignorance of what is going on ;-)... Best, Arnd From arnd.baecker at web.de Thu Oct 20 03:05:54 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 20 Oct 2005 09:05:54 +0200 (CEST) Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? In-Reply-To: References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> <4355F532.20102@ntc.zcu.cz> Message-ID: Hi Pearu, On Wed, 19 Oct 2005, Pearu Peterson wrote: > On Wed, 19 Oct 2005, Arnd Baecker wrote: > > > Hi, > > > > I *think* that in fblasmodule.c > > `initforcomb` and `nextforcomb` should be changed to > > > > static struct { intp nd, *d; intp *i,*i_tr,tr; } forcombcache; > > static struct { int nd; intp *d; int *i,*i_tr,tr; } forcombcache; > etc Autsch - sorry this was a typo - it should have been as you wrote (next time I will better send a diff, this is safer - modem access only for doing something like this is no fun ;-( ). Presently I get a hang on the opteron: check_y_stride (scipy.linalg.fblas.test_fblas.test_ccopy) ... ok check_default_beta_y (scipy.linalg.fblas.test_fblas.test_cgemv) ... No idea on this ... (Need to fist look into the test - and report later ...) Best, Arnd From nwagner at mecha.uni-stuttgart.de Thu Oct 20 03:12:48 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 20 Oct 2005 09:12:48 +0200 Subject: [SciPy-dev] Most tests pass now in full scipy. In-Reply-To: <43570EDC.4070507@ee.byu.edu> References: <43570EDC.4070507@ee.byu.edu> Message-ID: <43574370.60808@mecha.uni-stuttgart.de> Travis Oliphant wrote: >A few modifications to f2py (to update the flags after lazy transposing, >and to allow array scalars to be used in complex conversions --- >necessary because sadly there is no PyNumber_Complex function in Python). 
> >These modifications, plus an enhancement to the C-API > >PyArray_ScalarAsCtype(PyObject *scalar, void *ctypeptr) > >seems to make many of the tests pass. > >I'm not sure where the 64-bit fixes stand. I know many have been made. >I'd like to see more reports. > >-Travis > > > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net > >http://www.scipy.net/mailman/listinfo/scipy-dev > >>> scipy.base.__version__ '0.4.3.1326' scipy.test(1) results in only one error. Great. ====================================================================== ERROR: check_nils (scipy.linalg.matfuncs.test_matfuncs.test_logm) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_matfuncs.py", line 82, in check_nils logm((identity(7)*3.1+0j)-a) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/matfuncs.py", line 233, in logm errest = norm(expm(F)-A,1) / norm(A,1) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/basic.py", line 254, in norm x = asarray_chkfinite(x) File "/usr/local/lib/python2.4/site-packages/scipy/base/function_base.py", line 204, in asarray_chkfinite raise ValueError, "Array must not contain infs or nans." ValueError: Array must not contain infs or nans. ---------------------------------------------------------------------- Ran 386 tests in 1.376s FAILED (errors=1) Nils From nwagner at mecha.uni-stuttgart.de Thu Oct 20 03:20:24 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 20 Oct 2005 09:20:24 +0200 Subject: [SciPy-dev] Most tests pass now in full scipy. 
In-Reply-To: <43570EDC.4070507@ee.byu.edu> References: <43570EDC.4070507@ee.byu.edu> Message-ID: <43574538.3010501@mecha.uni-stuttgart.de> Travis Oliphant wrote: >A few modifications to f2py (to update the flags after lazy transposing, >and to allow array scalars to be used in complex conversions --- >necessary because sadly there is no PyNumber_Complex function in Python). > >These modifications, plus an enhancement to the C-API > >PyArray_ScalarAsCtype(PyObject *scalar, void *ctypeptr) > >seems to make many of the tests pass. > >I'm not sure where the 64-bit fixes stand. I know many have been made. >I'd like to see more reports. > >-Travis > > > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > scipy.test(5,10) yields ====================================================================== ERROR: bench_random (scipy.linalg.basic.test_basic.test_det) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 320, in bench_random print '| %6.2f ' % self.measure('basic_det(a)',repeat), File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 155, in measure exec code in globs,locs File "ScipyTestCase runner for test_det", line 1, in ? 
NameError: name 'basic_det' is not defined ====================================================================== ERROR: bench_random (scipy.linalg.basic.test_basic.test_inv) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 253, in bench_random print '| %6.2f ' % self.measure('basic_inv(a)',repeat), File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 155, in measure exec code in globs,locs File "ScipyTestCase runner for test_inv", line 1, in ? NameError: name 'basic_inv' is not defined ====================================================================== ERROR: bench_random (scipy.linalg.basic.test_basic.test_solve) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 182, in bench_random print '| %6.2f ' % self.measure('basic_solve(a,b)',repeat), File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 155, in measure exec code in globs,locs File "ScipyTestCase runner for test_solve", line 1, in ? NameError: name 'basic_solve' is not defined ====================================================================== FAIL: check_nils (scipy.linalg.matfuncs.test_matfuncs.test_signm) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_matfuncs.py", line 44, in check_nils assert_array_almost_equal(r,cr) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[ 1.2104571e+01 -2.3056788e-02j -1.0034348e+00 -1.8445430e-01j 1.5239715e+01 +1.1528394e-02j 2.1808571e+01 -2.3... 
Array 2: [[ 11.9493333 -2.2453333 15.3173333 21.6533333 -2.2453333] [ -3.8426667 0.4986667 -4.5906667 -7.1866667 0.498... ---------------------------------------------------------------------- Ran 395 tests in 6.387s FAILED (failures=1, errors=3) From nwagner at mecha.uni-stuttgart.de Thu Oct 20 04:19:58 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 20 Oct 2005 10:19:58 +0200 Subject: [SciPy-dev] AttributeError: 'scipy.ndarray' object has no attribute 'tocsc' Message-ID: <4357532E.4090200@mecha.uni-stuttgart.de> xx = sparse.lu_factor(A).solve(r) File "/usr/local/lib/python2.4/site-packages/scipy/sparse/Sparse.py", line 1521, in lu_factor csc = A.tocsc() AttributeError: 'scipy.ndarray' object has no attribute 'tocsc' From nwagner at mecha.uni-stuttgart.de Thu Oct 20 04:26:23 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 20 Oct 2005 10:26:23 +0200 Subject: [SciPy-dev] Segmentation fault in sparse_test.py Message-ID: <435754AF.4030009@mecha.uni-stuttgart.de> gdb) run Starting program: /usr/local/bin/python [Thread debugging using libthread_db enabled] [New Thread 1076102528 (LWP 21242)] Python 2.4 (#2, May 12 2005, 14:45:33) [GCC 3.3.3 (SuSE Linux)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import sparse_test.py Importing io to scipy Importing special to scipy Importing sparse to scipy Importing utils to scipy Importing interpolate to scipy Importing optimize to scipy Importing linalg to scipy Program received signal SIGSEGV, Segmentation fault. [Switching to Thread 1076102528 (LWP 21242)] 0x40350f4a in CDOUBLE_setitem (op=Variable "op" is not available. ) at arraytypes.inc:620 620 arraytypes.inc: No such file or directory. in arraytypes.inc (gdb) bt #0 0x40350f4a in CDOUBLE_setitem (op=Variable "op" is not available. ) at arraytypes.inc:620 #1 0x4032bcf1 in array_ass_item (self=Variable "self" is not available. 
) at arrayobject.c:1275 #2 0x08058ad1 in PySequence_SetItem (s=0x40c85920, i=57, o=0x4027f2c0) at Objects/abstract.c:1281 #3 0x4032b4cf in array_fromobject (op=Variable "op" is not available. ) at arrayobject.c:4697 #4 0x403403a7 in _array_fromobject (ignored=0x0, args=0x402927cc, kws=0x0) at arrayobject.c:5546 #5 0x0811d436 in PyCFunction_Call (func=0x40292f0c, arg=0x402927cc, kw=0x0) at Objects/methodobject.c:93 #6 0x080c66d9 in PyEval_EvalFrame (f=0x816c1a4) at Python/ceval.c:3547 #7 0x080c7312 in PyEval_EvalFrame (f=0x817c21c) at Python/ceval.c:3629 #8 0x080c7a54 in PyEval_EvalCodeEx (co=0x405e2720, globals=0x405d2e84, locals=0x0, args=0x8193a3c, argcount=1, kws=0x8193a40, kwcount=0, defs=0x405e3848, defcount=5, closure=0x0) at Python/ceval.c:2730 #9 0x080c52a8 in PyEval_EvalFrame (f=0x81938ec) at Python/ceval.c:3639 #10 0x080c7a54 in PyEval_EvalCodeEx (co=0x4028aee0, globals=0x4029313c, locals=0x4029313c, args=0x0, argcount=0, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2730 #11 0x080c7c85 in PyEval_EvalCode (co=0x4028aee0, globals=0x4029313c, locals=0x4029313c) at Python/ceval.c:484 #12 0x080eab75 in PyImport_ExecCodeModuleEx (name=0xbfffe110 "sparse_test", co=0x4028aee0, pathname=0xbfffdc80 "sparse_test.py") at Python/import.c:619 #13 0x080eaf33 in load_source_module (name=0xbfffe110 "sparse_test", pathname=0xbfffdc80 "sparse_test.py", fp=0x815f8d0) at Python/import.c:893 #14 0x080ebfa9 in load_module (name=0xbfffe110 "sparse_test", fp=Variable "fp" is not available. ) at Python/import.c:1656 #15 0x080eca51 in import_submodule (mod=0x81387a0, subname=0xbfffe110 "sparse_test", fullname=0xbfffe110 "sparse_test") at Python/import.c:2250 #16 0x080ecc99 in load_next (mod=0x81387a0, altmod=0x81387a0, p_name=Variable "p_name" is not available. ) at Python/import.c:2070 #17 0x080ed0d7 in import_module_ex (name=Variable "name" is not available. 
) at Python/import.c:1905 #18 0x080ed48d in PyImport_ImportModuleEx (name=0x402915d4 "sparse_test.py", globals=0x40259824, locals=0x40259824, fromlist=0x81387a0) at Python/import.c:1946 #19 0x080bb9c7 in builtin___import__ (self=0x0, args=0x40291644) at Python/bltinmodule.c:45 #20 0x0811d436 in PyCFunction_Call (func=0x4024ad6c, arg=0x40291644, kw=0x0) at Objects/methodobject.c:93 #21 0x0805928e in PyObject_Call (func=0x4024ad6c, arg=0x40291644, kw=0x0) at Objects/abstract.c:1746 #22 0x080c1b99 in PyEval_EvalFrame (f=0x8196bfc) at Python/ceval.c:3419 #23 0x080c7a54 in PyEval_EvalCodeEx (co=0x4027bf20, globals=0x40259824, locals=0x40259824, args=0x0, argcount=0, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2730 #24 0x080c7c85 in PyEval_EvalCode (co=0x4027bf20, globals=0x40259824, locals=0x40259824) at Python/ceval.c:484 #25 0x080f67e6 in PyRun_InteractiveOneFlags (fp=0x40235720, filename=0x8122a01 "", flags=0xbfffe924) at Python/pythonrun.c:1264 #26 0x080f6a49 in PyRun_InteractiveLoopFlags (fp=0x40235720, ---Type to continue, or q to quit--- filename=0x8122a01 "", flags=0xbfffe924) at Python/pythonrun.c:694 #27 0x080f6b70 in PyRun_AnyFileExFlags (fp=0x40235720, filename=0x8122a01 "", closeit=0, flags=0xbfffe924) at Python/pythonrun.c:657 #28 0x08055857 in Py_Main (argc=0, argv=0xbfffe9e4) at Modules/main.c:484 #29 0x08054f07 in main (argc=1, argv=0xbfffe9e4) at Modules/python.c:23 -------------- next part -------------- A non-text attachment was scrubbed... Name: sparse_test.py Type: text/x-python Size: 295 bytes Desc: not available URL: From bgoli at sun.ac.za Thu Oct 20 05:52:59 2005 From: bgoli at sun.ac.za (Brett Olivier) Date: Thu, 20 Oct 2005 11:52:59 +0200 Subject: [SciPy-dev] issues building new scipy/core on 64 bit Intel P4 Message-ID: <200510201153.01105.bgoli@sun.ac.za> Hi I'm trying to build new scipy/core on an 64bit Intel and am running into some problems (test segfaults etc). 
Unfortunately, my test machine is at home so these issues might already be known (if so my apologies). The build logs for new scipy/core and results of runing scipy.test are available at: http://glue.jjj.sun.ac.za/~bgoli/newscipy/ I'll try to update this page regularly with my latest results and am more than happy to provide more information/testing/debugging. Many thanks to the SciPy developers for producing this fantastic software! Brett Some example build warnings : newCore (0.4.3.1314) build ================== gcc: _configtest.c _configtest.c: In function `main': _configtest.c:50: warning: int format, different type arg (arg 3) _configtest.c:57: warning: int format, different type arg (arg 3) _configtest.c:72: warning: int format, different type arg (arg 3) gcc: _configtest.c _configtest.c: In function `main': _configtest.c:4: warning: statement with no effect _configtest.c:5: warning: control reaches end of non-void function gcc -pthread _configtest.o -lm -o _configtest newSciPy (0.4.2_1344) build ================== gcc: Lib/io/numpyiomodule.c Lib/io/numpyiomodule.c: In function `numpyio_tofile': Lib/io/numpyiomodule.c:282: warning: passing arg 1 of pointer to function from incompatible pointer type Lib/io/numpyiomodule.c: In function `numpyio_convert_objects': Lib/io/numpyiomodule.c:743: warning: passing arg 2 of pointer to function from incompatible pointer type Lib/io/numpyiomodule.c: In function `numpyio_tofile': gcc: build/src/build/src/scipy/linalg/fblasmodule.c build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_sger': build/src/build/src/scipy/linalg/fblasmodule.c:8045: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_dger': build/src/build/src/scipy/linalg/fblasmodule.c:8237: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_cgeru': 
build/src/build/src/scipy/linalg/fblasmodule.c:8429: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_zgeru': build/src/build/src/scipy/linalg/fblasmodule.c:8621: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_cgerc': build/src/build/src/scipy/linalg/fblasmodule.c:8813: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/build/src/scipy/linalg/fblasmodule.c: In function `f2py_rout_fblas_zgerc': build/src/build/src/scipy/linalg/fblasmodule.c:9005: warning: passing arg 1 of `initforcomb' from incompatible pointer type gcc: build/src/Lib/interpolate/dfitpackmodule.c build/src/Lib/interpolate/dfitpackmodule.c: In function `f2py_rout_dfitpack_fpcurf0': build/src/Lib/interpolate/dfitpackmodule.c:1304: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/Lib/interpolate/dfitpackmodule.c: In function `f2py_rout_dfitpack_fpcurfm1': build/src/Lib/interpolate/dfitpackmodule.c:1998: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/Lib/interpolate/dfitpackmodule.c: In function `f2py_rout_dfitpack_surfit_smth': build/src/Lib/interpolate/dfitpackmodule.c:2613: warning: passing arg 1 of `initforcomb' from incompatible pointer type build/src/Lib/interpolate/dfitpackmodule.c: In function `f2py_rout_dfitpack_surfit_lsq': build/src/Lib/interpolate/dfitpackmodule.c:3084: warning: passing arg 1 of `initforcomb' from incompatible pointer type -- Brett G. Olivier Postdoctoral Fellow Triple-J Group for Molecular Cell Physiology Stellenbosch University bgoli at sun dot ac dot za http://glue.jjj.sun.ac.za/~bgoli Tel +27-21-8082704 Fax +27-21-8085863 Mobile +27-82-7329306 I must Create a System, or be enslav'd by another Man's; I will not Reason and Compare; my business is to Create. 
-- William Blake, "Jerusalem" From nwagner at mecha.uni-stuttgart.de Thu Oct 20 07:25:32 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 20 Oct 2005 13:25:32 +0200 Subject: [SciPy-dev] Matlab versus scipy Message-ID: <43577EAC.1060908@mecha.uni-stuttgart.de> Hi all, Matlab offers a warning in case of nearly singular matrices Matrix is close to singular or badly scaled Results may be inaccurate scipy doesn't have this feature. For what reason ? Nils from scipy import * a = rand(3,3) #a = 0.5*(a+transpose(a)) w = linalg.eigvals(a) # # Singular matrix # s = a-w[0]*identity(3) dets=linalg.det(s) svds=linalg.svdvals(s) evals=linalg.eigvals(s) print evals print svds[0]/svds[-1] invs=linalg.inv(s) print invs From cimrman3 at ntc.zcu.cz Thu Oct 20 07:43:39 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 20 Oct 2005 13:43:39 +0200 Subject: [SciPy-dev] CS* constructor error Message-ID: <435782EB.7000905@ntc.zcu.cz> I have stumbled on a possible sparsetools problem - an array of shape [1,N] passed to the CSC matrix constructor leads to the error below, as well as [N,1] for the CSR format. ([N,1] for CSC, [1,N] for CSR and general [M,N], M > 1, N > 1, are ok...) I have looked at sparsetools.pyf.src and could not see any reason for this... Could it be a f2py problem? -- import scipy import scipy.sparse as sparse ar = scipy.array( [[1,2], [3,4]] ) a = sparse.csc_matrix( ar ) print 'a from array:', a #ar = scipy.array( [[1],[2]] ) ar = scipy.array( [[1,2,3,4,5],] ) a = sparse.csc_matrix( ar ) print 'a from array:', a --> 0-th dimension must be fixed to 2 but got 6 (real index=0) Traceback (most recent call last): File "test.py", line 28, in ? a = sparse.csc_matrix( ar ) File "/usr/lib/python2.4/site-packages/scipy/sparse/Sparse.py", line 317, in __init__ sparsetools.error: failed in converting 4th argument `ptra' of sparsetools.dfulltocsc to C/Fortran array r. 
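[Editorial note: on Nils's question above about MATLAB's near-singular warning: scipy never added one, but it is easy to layer on top of the SVD, much as his own snippet computes svds[0]/svds[-1]. A sketch; the wrapper name and threshold rule are invented for illustration and are not a scipy API:]

```python
import warnings
import numpy as np

def checked_solve(a, b, rcond_limit=None):
    """Solve a x = b, warning when `a` is close to singular.

    `rcond_limit` defaults to machine epsilon times the larger matrix
    dimension, a common rule of thumb (illustrative, not a scipy default).
    """
    s = np.linalg.svd(a, compute_uv=False)   # singular values, descending
    if rcond_limit is None:
        rcond_limit = max(a.shape) * np.finfo(a.dtype).eps
    if s[-1] <= rcond_limit * s[0]:
        warnings.warn("Matrix is close to singular or badly scaled; "
                      "results may be inaccurate.")
    return np.linalg.solve(a, b)

# A numerically near-singular (but not exactly singular) matrix triggers
# the warning while the solve itself still runs:
checked_solve(np.diag([1.0, 1e-16]), np.ones(2))
```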
From arnd.baecker at web.de Thu Oct 20 09:00:36 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Thu, 20 Oct 2005 15:00:36 +0200 (CEST) Subject: [SciPy-dev] missing Lib/__init__.py in newscipy? In-Reply-To: References: <200510181342.06980.dd55@cornell.edu> <200510181619.11283.dd55@cornell.edu> <4355F532.20102@ntc.zcu.cz> Message-ID: On Thu, 20 Oct 2005, Arnd Baecker wrote: > Presently I get a hang on the opteron: > > check_y_stride (scipy.linalg.fblas.test_fblas.test_ccopy) ... ok > check_default_beta_y (scipy.linalg.fblas.test_fblas.test_cgemv) ... Ok, a little bit further: the hang occurs in get_data of base_gemv (in test_fblas.py) class base_gemv(unittest.TestCase): def get_data(self,x_stride=1,y_stride=1): print "In getdata:" mult = array(1, dtype = self.dtype) print "In getdata: 1" if self.dtype in ['F', 'D']: print "In getdata: 1a" mult = array(1+1j, dtype = self.dtype) print "In getdata: 2" from scipy.basic.random import normal print "In getdata: 3" alpha = array(1., dtype = self.dtype) * mult print "In getdata: 4" beta = array(1.,dtype = self.dtype) * mult print "In getdata: 5" a = normal(0.,1.,(3,3)).astype(self.dtype) * mult print "In getdata: 6" x = arange(shape(a)[0]*x_stride,dtype=self.dtype) * mult print "In getdata: 7" y = arange(shape(a)[1]*y_stride,dtype=self.dtype) * mult print "In getdata: 8" print "GETDATA:", print alpha,beta,a,x,y return alpha,beta,a,x,y It gets until getdata: 5. Testing separately: from scipy.basic.random import normal a = normal(0.,1.,(3,3)) hangs at normal. I have to leave this now, because I have to prepare a talk for tomorrow, but maybe someone else has insight why normal could hang on 64 Bit (it does not do so on my PIV) OTOH, I did not see any warnings concerning mtrand... 
Best, Arnd From chanley at stsci.edu Thu Oct 20 10:45:26 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Thu, 20 Oct 2005 10:45:26 -0400 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News Message-ID: <4357AD86.5000805@stsci.edu> The good news is that I can now build the latest newcore version on Solaris with no modifications to the config.h or setup.py files. The bad news is that when I attempt an "import scipy" command I receive the following error message: basil> ipython Python 2.4.1 (#2, Apr 6 2005, 14:41:45) [C] Type "copyright", "credits" or "license" for more information. IPython 0.6.12 -- An enhanced Interactive Python. ? -> Introduction to IPython's features. %magic -> Information about IPython's 'magic' % functions. help -> Python's own help system. object? -> Details about 'object'. ?object also works, ?? prints more. In [1]: import scipy.base as sb --------------------------------------------------------------------------- exceptions.ImportError Traceback (most recent call last) /data/basil5/site-packages/lib/python/ /data/basil5/site-packages/lib/python/scipy/__init__.py 28 print 'Running from scipy core source directory.' 29 else: ---> 30 from scipy.base import * 31 import scipy.basic as basic 32 from scipy.basic.fft import fft, ifft /data/basil5/site-packages/lib/python/scipy/base/__init__.py 4 5 import multiarray ----> 6 import umath 7 import numerictypes as nt 8 multiarray.set_typeDict(nt.typeDict) ImportError: ld.so.1: /usr/stsci/pyssgx/Python-2.4.1/bin/ipython: fatal: relocation error: file scipy/base/umath.so: symbol isinf: referenced symbol not found In [2]: Does anyone have any suggestions on how I can correct this problem? 
Thanks, Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From nwagner at mecha.uni-stuttgart.de Thu Oct 20 12:32:06 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 20 Oct 2005 18:32:06 +0200 Subject: [SciPy-dev] 3 failures with '0.4.3.1330' Message-ID: scipy.test(1) using ATLAS results in ====================================================================== FAIL: check_random (scipy.linalg.decomp.test_decomp.test_cholesky) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_decomp.py", line 284, in check_random assert_array_almost_equal(a,a1) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[ 1.3592684e+01 1.3593857e+00 1.2131043e+00 4.4811041e-01 7.8969831e-01 1.3969793e+00 8.1902659e-01 1.... Array 2: [[ 184.7610566 18.4777003 16.4893434 6.0910231 10.7341195 18.9886978 11.1327695 17.8980693 24.0382271... ====================================================================== FAIL: check_random_complex (scipy.linalg.decomp.test_decomp.test_cholesky) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_decomp.py", line 298, in check_random_complex assert_array_almost_equal(a,a1) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[ 2.1277216e+01 +0.0000000e+00j 1.7653394e+00 -6.5097573e-02j 1.5406931e+00 +1.9640981e-01j 1.6667752e+00 +3.0... 
Array 2: [[ 4.5271991e+02 +0.0000000e+00j 3.7561508e+01 -1.3850951e+00j 3.2781659e+01 +4.1790538e+00j 3.5464335e+01 +6.5... ====================================================================== FAIL: check_simple_complex (scipy.linalg.decomp.test_decomp.test_cholesky) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_decomp.py", line 270, in check_simple_complex assert_array_almost_equal(a,a1) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[ 3.1622777e+00 +0.0000000e+00j 4.1109610e+00 +2.8460499e+00j 4.7434165e+00 -1.5811388e+00j] [ 0.0000000e+00 +... Array 2: [[ 1.0000000e+01 +0.0000000e+00j 1.3000000e+01 +9.0000000e+00j 1.5000000e+01 -5.0000000e+00j] [ 1.3000000e+01 -... ---------------------------------------------------------------------- Ran 386 tests in 0.987s FAILED (failures=3) From dd55 at cornell.edu Thu Oct 20 12:40:17 2005 From: dd55 at cornell.edu (Darren Dale) Date: Thu, 20 Oct 2005 12:40:17 -0400 Subject: [SciPy-dev] 3 failures with '0.4.3.1330' In-Reply-To: References: Message-ID: <200510201240.17605.dd55@cornell.edu> I also see these errors. 
Darren On Thursday 20 October 2005 12:32 pm, Nils Wagner wrote: > scipy.test(1) using ATLAS results in > > ====================================================================== > FAIL: check_random > (scipy.linalg.decomp.test_decomp.test_cholesky) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_decomp.py", > line 284, in check_random > assert_array_almost_equal(a,a1) > File > "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", > line 727, in assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [[ 1.3592684e+01 1.3593857e+00 > 1.2131043e+00 4.4811041e-01 > 7.8969831e-01 1.3969793e+00 8.1902659e-01 1.... > Array 2: [[ 184.7610566 18.4777003 16.4893434 > 6.0910231 10.7341195 > 18.9886978 11.1327695 17.8980693 24.0382271... > > > ====================================================================== > FAIL: check_random_complex > (scipy.linalg.decomp.test_decomp.test_cholesky) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_decomp.py", > line 298, in check_random_complex > assert_array_almost_equal(a,a1) > File > "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", > line 727, in assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [[ 2.1277216e+01 +0.0000000e+00j > 1.7653394e+00 -6.5097573e-02j > 1.5406931e+00 +1.9640981e-01j 1.6667752e+00 +3.0... > Array 2: [[ 4.5271991e+02 +0.0000000e+00j > 3.7561508e+01 -1.3850951e+00j > 3.2781659e+01 +4.1790538e+00j 3.5464335e+01 +6.5... 
> > > ====================================================================== > FAIL: check_simple_complex > (scipy.linalg.decomp.test_decomp.test_cholesky) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_decomp.py", > line 270, in check_simple_complex > assert_array_almost_equal(a,a1) > File > "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", > line 727, in assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [[ 3.1622777e+00 +0.0000000e+00j > 4.1109610e+00 +2.8460499e+00j > 4.7434165e+00 -1.5811388e+00j] > [ 0.0000000e+00 +... > Array 2: [[ 1.0000000e+01 +0.0000000e+00j > 1.3000000e+01 +9.0000000e+00j > 1.5000000e+01 -5.0000000e+00j] > [ 1.3000000e+01 -... > > > ---------------------------------------------------------------------- > Ran 386 tests in 0.987s > > FAILED (failures=3) > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev -- Dr. Darren S. Dale dd55 at cornell.edu From charles.harris at sdl.usu.edu Thu Oct 20 13:16:11 2005 From: charles.harris at sdl.usu.edu (Charles Harris) Date: Thu, 20 Oct 2005 11:16:11 -0600 Subject: [SciPy-dev] [SciPy-user] operations on int8 arrays References: <43566C59.6090508@psychology.nottingham.ac.uk> <4356AAE4.7020009@ee.byu.edu> Message-ID: Travis, > It would be possible to make the default reduce type for integers 32-bit > on 32-bit platforms and 64-bit on 64-bit platforms. the long integer type. > > Do people think that would be a good idea? These kinds of questions do > come up. > I think it would be best to upcast the type. As others have pointed out, the unexpected rollover of the sum can cause trouble. Rollover is even more of a problem with the int8 type being used for booleans. 
I suppose one could always typecast the array *before* summing or add a flag to indicate that a higher precision should be used. Actually, loss of precision can even be a problem for large float arrays that should really be accumulated in double precision, so some sort of flag could be useful in any case. Using a larger type is unlikely to cause space problems, as the sum reduces the number of elements in the array. Chuck From oliphant at ee.byu.edu Thu Oct 20 13:33:17 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 20 Oct 2005 11:33:17 -0600 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <4357AD86.5000805@stsci.edu> References: <4357AD86.5000805@stsci.edu> Message-ID: <4357D4DD.8070509@ee.byu.edu> Christopher Hanley wrote: >The good news is that I can now build the latest newcore version on >Solaris with no modifications to the config.h or setup.py files. > > O.K. we are making progress. Does your config.h file contain HAVE_ISNAN? We are assuming that if you have the isnan function, then you have the isinf function as well. Can you check on your system to see if that is the case? Try man isinf and man isnan from the command line. Or do a google search on your platform type or check your system documentation. Perhaps this assumption is not justified and isinf must be detected independently.
-Travis From chanley at stsci.edu Thu Oct 20 14:20:57 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Thu, 20 Oct 2005 14:20:57 -0400 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <4357D4DD.8070509@ee.byu.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> Message-ID: <4357E009.7010308@stsci.edu> Travis Oliphant wrote: > O.K. we are making progress. Does your config.h file > contain HAVE_ISNAN? > The config.h file does contain HAVE_ISNAN. For reference, the file contents are:

/* #define SIZEOF_SHORT 2 */
/* #define SIZEOF_INT 4 */
/* #define SIZEOF_LONG 4 */
/* #define SIZEOF_FLOAT 4 */
/* #define SIZEOF_DOUBLE 8 */
#define SIZEOF_LONG_DOUBLE 16
#define SIZEOF_PY_INTPTR_T 4
/* #define SIZEOF_LONG_LONG 8 */
#define SIZEOF_PY_LONG_LONG 8
/* #define CHAR_BIT 8 */
#define MATHLIB m
#define HAVE_INVERSE_HYPERBOLIC
#define HAVE_ISNAN

> We are assuming that if you have the isnan function, then you have the > isinf function as well. Can you check on your system to see if that is > the case? Try man isinf and man isnan from the command line. Or do > a google search on your platform type or check your system documentation Both the isinf and isnan functions are installed on this version of Solaris. I have also just rebuilt using gcc 4.0.1 and receive the same error on import. Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From oliphant at ee.byu.edu Fri Oct 21 01:16:44 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 20 Oct 2005 23:16:44 -0600 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <4357E009.7010308@stsci.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> Message-ID: <435879BC.1080009@ee.byu.edu> Christopher Hanley wrote: >Travis Oliphant wrote: > > >>O.K. we are making progress.
Does your config.h file >>contain HAVE_ISNAN? >> >> >> >The config.h file does contain HAVE_ISNAN. For reference, the file >contents are: >/* #define SIZEOF_SHORT 2 */ >/* #define SIZEOF_INT 4 */ >/* #define SIZEOF_LONG 4 */ >/* #define SIZEOF_FLOAT 4 */ >/* #define SIZEOF_DOUBLE 8 */ >#define SIZEOF_LONG_DOUBLE 16 >#define SIZEOF_PY_INTPTR_T 4 >/* #define SIZEOF_LONG_LONG 8 */ >#define SIZEOF_PY_LONG_LONG 8 >/* #define CHAR_BIT 8 */ >#define MATHLIB m >#define HAVE_INVERSE_HYPERBOLIC >#define HAVE_ISNAN > > >The isinf as well as the isnan function are installed on this version of >Solaris. > > If they are installed, then they are not being linked, apparently. Are there special command switches that must be used to link isinf on that system? It appears that isinf is being found and linked against correctly at configure time, but not at build time. I don't know why. Since HAVE_ISNAN is defined, the system does not define a separate isinf function (which it would if HAVE_ISNAN were not defined). What happens if you comment out the #define HAVE_ISNAN? -Travis From nwagner at mecha.uni-stuttgart.de Fri Oct 21 03:13:11 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 21 Oct 2005 09:13:11 +0200 Subject: [SciPy-dev] TypeError: __array_data__ must return a string providing the pointer to data Message-ID: <43589507.4050003@mecha.uni-stuttgart.de> Python 2.4 (#2, May 12 2005, 14:45:33) [GCC 3.3.3 (SuSE Linux)] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> from MLab import * >>> A = rand(3,3) >>> A array([[ 0.9462472 , 0.30814099, 0.65772529], [ 0.29040523, 0.58193587, 0.25232399], [ 0.8339496 , 0.36152322, 0.58090246]]) >>> triu(A) array([[ 0.9462472 , 0.30814099, 0.65772529], [ 0. , 0.58193587, 0.25232399], [ 0. , 0. , 0.58090246]]) >>> tril(A) array([[ 0.9462472 , 0. , 0. ], [ 0.29040523, 0.58193587, 0.
], [ 0.8339496 , 0.36152322, 0.58090246]]) >>> from scipy import * Importing io to scipy Importing special to scipy Importing sparse to scipy Importing utils to scipy Importing interpolate to scipy Importing optimize to scipy Importing linalg to scipy >>> triu(A) Traceback (most recent call last): File "<stdin>", line 1, in ? File "/usr/local/lib/python2.4/site-packages/scipy/base/twodim_base.py", line 103, in triu m = asarray(m) File "/usr/local/lib/python2.4/site-packages/scipy/base/numeric.py", line 36, in asarray return array(a, dtype, copy=0, fortran=fortran) TypeError: __array_data__ must return a string providing the pointer to data. From chanley at stsci.edu Fri Oct 21 10:40:28 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Fri, 21 Oct 2005 10:40:28 -0400 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <435879BC.1080009@ee.byu.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> Message-ID: <4358FDDC.9040203@stsci.edu> Travis Oliphant wrote: > > If they are installed, then they are not being linked, apparently. Are > there special command switches that must be used to link isinf on that > system? > > It appears that isinf is being found and linked against correctly at > configure time, but not at build time. I don't know why. > > Since HAVE_ISNAN is defined, the system does not define a separate isinf > function (which it would if HAVE_ISNAN were not defined). What happens > if you comment out the #define HAVE_ISNAN? > > -Travis > Hi Travis, Well, commenting out #define HAVE_ISNAN has resulted in a "Good News" / "Disturbing News" situation. I am now able to successfully import scipy.base with no error messages. However, when I attempt scipy.test() I receive a bus error. The session log is below: basil> ipython Python 2.4.1 (#2, Apr 6 2005, 14:41:45) [C] Type "copyright", "credits" or "license" for more information.
IPython 0.6.12 -- An enhanced Interactive Python. ? -> Introduction to IPython's features. %magic -> Information about IPython's 'magic' % functions. help -> Python's own help system. object? -> Details about 'object'. ?object also works, ?? prints more. In [1]: import scipy.base as sb In [2]: sb.test() Found 2 tests for scipy.base.umath Found 23 tests for scipy.base.function_base Found 3 tests for scipy.base.getlimits Found 9 tests for scipy.base.twodim_base Found 3 tests for scipy.base.matrix Found 44 tests for scipy.base.shape_base Found 42 tests for scipy.base.type_check Found 4 tests for scipy.base.index_tricks Found 0 tests for __main__ .......................Bus error (core dumped) basil> Chris From stephen.walton at csun.edu Fri Oct 21 11:22:28 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Fri, 21 Oct 2005 08:22:28 -0700 Subject: [SciPy-dev] TypeError: __array_data__ must return a string providing the pointer to data In-Reply-To: <43589507.4050003@mecha.uni-stuttgart.de> References: <43589507.4050003@mecha.uni-stuttgart.de> Message-ID: <435907B4.5070506@csun.edu> Nils Wagner wrote: >Python 2.4 (#2, May 12 2005, 14:45:33) >[GCC 3.3.3 (SuSE Linux)] on linux2 >Type "help", "copyright", "credits" or "license" for more information. > > >>>>from MLab import * >>>> >>>> MLab is part of Numeric 23.8 (at least on my machine) and doesn't make arrays which are quite 100% compatible with the new scipy core. You'll see the same error message if you try to use matplotlib to plot the contents of a new scipy core array. 
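The `__array_data__` error Nils hit comes from a mismatch in this protocol: Numeric 23.8 exposes an older draft of the array interface than the one newcore expects. The idea survives today as a single `__array_interface__` dict (the 2005 draft split it across separate attributes such as `__array_data__`, `__array_shape__`, and `__array_typestr__`, which is why the error message names one of them). A minimal sketch of a foreign object that modern `numpy.asarray` can consume through the interface — illustrative only, not the Numeric code:

```python
import numpy as np

class Exposed:
    """A non-array object advertising its memory via the array interface."""

    def __init__(self):
        # four int32 values held in a plain Python buffer
        self._buf = bytearray(np.arange(4, dtype=np.int32).tobytes())

    @property
    def __array_interface__(self):
        return {
            'version': 3,
            'shape': (4,),
            'typestr': np.dtype(np.int32).str,  # native-endian int32
            'data': self._buf,  # any buffer; the old __array_data__ exposed the pointer
        }

arr = np.asarray(Exposed())
print(arr)  # [0 1 2 3]
```

If the attributes (or the dict's fields) do not match what the consumer expects, asarray raises exactly this kind of TypeError, which is why mixing CVS-era Numeric with newcore was fragile.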
From oliphant at ee.byu.edu Fri Oct 21 12:52:00 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 10:52:00 -0600 Subject: [SciPy-dev] TypeError: __array_data__ must return a string providing the pointer to data In-Reply-To: <435907B4.5070506@csun.edu> References: <43589507.4050003@mecha.uni-stuttgart.de> <435907B4.5070506@csun.edu> Message-ID: <43591CB0.8040302@ee.byu.edu> Stephen Walton wrote: >Nils Wagner wrote: > > > >>Python 2.4 (#2, May 12 2005, 14:45:33) >>[GCC 3.3.3 (SuSE Linux)] on linux2 >>Type "help", "copyright", "credits" or "license" for more information. >> >> >> >> >>>>>from MLab import * >>>> >>>> >>>>> >>>>> >>>>> >>>>> >MLab is part of Numeric 23.8 (at least on my machine) and doesn't make >arrays which are quite 100% compatible with the new scipy core. You'll >see the same error message if you try to use matplotlib to plot the >contents of a new scipy core array. > > You need the latest Numeric from CVS to do this correctly. -Travis From oliphant at ee.byu.edu Fri Oct 21 12:57:40 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 10:57:40 -0600 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <4358FDDC.9040203@stsci.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> Message-ID: <43591E04.1040101@ee.byu.edu> Christopher Hanley wrote: >Travis Oliphant wrote: > > >>If they are installed, then they are not being linked apparently. Are >>there special command switches that must be used to link isinf on that >>system? >> >>This appears that isinf is being found and linked against correctly at >>configure time, but not at build time. I don't know why. >> >>Since HAVE_ISNAN is defined, the system does not define a separate isinf >>command (which it would if HAVE_ISNAN where not defined). What happens >>if you comment out the #define HAVE_ISNAN? 
>> >>-Travis >> >> >> > >Hi Travis, > >Well, commenting out #define HAVE_ISNAN has resulted in a "Good News" / >"Disturbing News" situation. I am now able to successfully import >scipy.base with no error messages. However, when I attempted >scipy.test() I am receiving a bus error. The session log is below > > Could you please run scipy.test(10,10) to get a better idea where the bus error is occurring? It would really help if I could log in to a Solaris machine somewhere. Also run print scipy.__core_version__ before you do that to verify the version you are running. We still have to figure out why the isnan function is being picked up by configure but not by the compilation stage. -Travis From chanley at stsci.edu Fri Oct 21 13:14:29 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Fri, 21 Oct 2005 13:14:29 -0400 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <43591E04.1040101@ee.byu.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> <43591E04.1040101@ee.byu.edu> Message-ID: <435921F5.9010302@stsci.edu> > Could you please run scipy.test(10,10) to get a better idea where the > bus error is occurring? >>> import scipy >>> scipy.test(10,10) scipy/base/tests/test_scimath.py !! No test file 'test_scimath.py' found for scipy/basic/tests/test_fft.py !! No test file 'test_fft.py' found for scipy/tests/test_scipy.py !! No test file 'test_scipy.py' found for scipy/test/tests/test_testing.py !! No test file 'test_testing.py' found for Found 2 tests for scipy.base.umath Found 23 tests for scipy.base.function_base scipy/base/tests/test_machar.py !! No test file 'test_machar.py' found for scipy/test/tests/test_test.py !! No test file 'test_test.py' found for scipy/base/tests/test_ma.py !! No test file 'test_ma.py' found for scipy/base/tests/test_numerictypes.py !! 
No test file 'test_numerictypes.py' found for scipy/basic/tests/test_basic_lite.py !! No test file 'test_basic_lite.py' found for scipy/tests/test_core_version.py !! No test file 'test_core_version.py' found for Found 4 tests for scipy.base.getlimits Found 9 tests for scipy.base.twodim_base scipy/basic/tests/test_fft_lite.py !! No test file 'test_fft_lite.py' found for scipy/base/tests/test__compiled_base.py !! No test file 'test__compiled_base.py' found for scipy/base/tests/test_info_scipy_base.py !! No test file 'test_info_scipy_base.py' found for scipy/base/tests/test_ufunclike.py !! No test file 'test_ufunclike.py' found for scipy/test/tests/test_info_scipy_test.py !! No test file 'test_info_scipy_test.py' found for scipy/basic/tests/test_basic.py !! No test file 'test_basic.py' found for scipy/basic/tests/test_linalg.py !! No test file 'test_linalg.py' found for scipy/test/tests/test_scipy_test_version.py !! No test file 'test_scipy_test_version.py' found for scipy/base/tests/test_base.py !! No test file 'test_base.py' found for scipy/base/tests/test_convertcode.py !! No test file 'test_convertcode.py' found for scipy/lib/tests/test_lib.py !! No test file 'test_lib.py' found for scipy/base/tests/test_multiarray.py !! No test file 'test_multiarray.py' found for Found 3 tests for scipy.base.matrix scipy/base/tests/test_oldnumeric.py !! No test file 'test_oldnumeric.py' found for Found 44 tests for scipy.base.shape_base scipy/base/tests/test_numeric.py !! No test file 'test_numeric.py' found for scipy/lib/tests/test_fftpack_lite.py !! No test file 'test_fftpack_lite.py' found for scipy/lib/tests/test_lapack_lite.py !! No test file 'test_lapack_lite.py' found for scipy/base/tests/test_polynomial.py !! No test file 'test_polynomial.py' found for Found 42 tests for scipy.base.type_check scipy/test/tests/test_test.py !! No test file 'test_test.py' found for scipy/base/tests/test_machar.py !! 
No test file 'test_machar.py' found for Found 4 tests for scipy.base.index_tricks scipy/basic/tests/test_random.py !! No test file 'test_random.py' found for Found 0 tests for __main__ check_reduce_complex (scipy.base.umath.test_umath.test_maximum) ... ok check_reduce_complex (scipy.base.umath.test_umath.test_minimum) ... ok check_basic (scipy.base.function_base.test_function_base.test_all) ... ok check_nd (scipy.base.function_base.test_function_base.test_all) ... ok check_basic (scipy.base.function_base.test_function_base.test_amax) ... ok check_basic (scipy.base.function_base.test_function_base.test_amin) ... ok check_basic (scipy.base.function_base.test_function_base.test_angle) ... ok check_basic (scipy.base.function_base.test_function_base.test_any) ... ok check_nd (scipy.base.function_base.test_function_base.test_any) ... ok check_basic (scipy.base.function_base.test_function_base.test_cumprod) ... ok check_basic (scipy.base.function_base.test_function_base.test_cumsum) ... ok check_basic (scipy.base.function_base.test_function_base.test_diff) ... ok check_nd (scipy.base.function_base.test_function_base.test_diff) ... ok check_basic (scipy.base.function_base.test_function_base.test_extins) ... ok check_both (scipy.base.function_base.test_function_base.test_extins) ... ok check_insert (scipy.base.function_base.test_function_base.test_extins) ... ok check_basic (scipy.base.function_base.test_function_base.test_linspace) ... ok check_basic (scipy.base.function_base.test_function_base.test_logspace) ... ok check_basic (scipy.base.function_base.test_function_base.test_prod) ... ok check_basic (scipy.base.function_base.test_function_base.test_ptp) ... ok check_basic (scipy.base.function_base.test_function_base.test_trim_zeros) ... ok check_leading_skip (scipy.base.function_base.test_function_base.test_trim_zeros) ... ok check_trailing_skip (scipy.base.function_base.test_function_base.test_trim_zeros) ... 
ok check_scalar (scipy.base.function_base.test_function_base.test_vectorize) ... Bus error (core dumped) basil> dir core scipy/ > Also run > > print scipy.__core_version__ > In [1]: import scipy In [2]: scipy.__core_version__ Out[2]: '0.4.3.1330' > We still have to figure out why the isnan function is being picked up by > configure but not by the compilation stage. I have not found any compile time flags that are specific to isinf or isnan. I don't believe this is a compiler problem given its occurrence under both Sun and GNU compilers. I will keep looking. Chris From nwagner at mecha.uni-stuttgart.de Fri Oct 21 13:48:25 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 21 Oct 2005 19:48:25 +0200 Subject: [SciPy-dev] TypeError: __array_data__ must return a string providing the pointer to data In-Reply-To: <43591CB0.8040302@ee.byu.edu> References: <43589507.4050003@mecha.uni-stuttgart.de> <435907B4.5070506@csun.edu> <43591CB0.8040302@ee.byu.edu> Message-ID: On Fri, 21 Oct 2005 10:52:00 -0600 Travis Oliphant wrote: > Stephen Walton wrote: > >>Nils Wagner wrote: >> >> >> >>>Python 2.4 (#2, May 12 2005, 14:45:33) >>>[GCC 3.3.3 (SuSE Linux)] on linux2 >>>Type "help", "copyright", "credits" or "license" for more >>>information. >>> >>> >>> >>> >>>>>>from MLab import * >>>>> >>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>MLab is part of Numeric 23.8 (at least on my machine) and >>doesn't make >>arrays which are quite 100% compatible with the new scipy >>core. You'll >>see the same error message if you try to use matplotlib >>to plot the >>contents of a new scipy core array. >> >> > You need the latest Numeric from CVS to do this >correctly. > Traceback (most recent call last): File "test_lu.py", line 4, in ?
lu, piv = linalg.lu_factor(a) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/decomp.py", line 189, in lu_factor a1 = asarray(a) File "/usr/local/lib/python2.4/site-packages/scipy/base/numeric.py", line 36, in asarray return array(a, dtype, copy=0, fortran=fortran) TypeError: __array_data__ must return a string providing the pointer to data. >>> import Numeric >>> Numeric.__version__ '24.0b2' >>> > -Travis From oliphant at ee.byu.edu Fri Oct 21 14:34:43 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 12:34:43 -0600 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <435921F5.9010302@stsci.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> <43591E04.1040101@ee.byu.edu> <435921F5.9010302@stsci.edu> Message-ID: <435934C3.8090806@ee.byu.edu> Christopher Hanley wrote: >>Could you please run scipy.test(10,10) to get a better idea where the >>bus error is occurring? >> >> > [snip: the full scipy.test(10,10) log quoted from the previous message] >check_scalar >(scipy.base.function_base.test_function_base.test_vectorize) ... Bus >error (core dumped) > > O.K. this is an object array problem. There are several possible causes. Could you comment out the test_vectorize and try the tests again. If you encounter any other tests that fail, please comment them out and continue. It would be nice to know all of the tests that are failing on Solaris.
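For reference, check_scalar in test_vectorize exercises vectorize's object-array machinery — the ufunc_frompyfunc path where the bus error is occurring. The classic example of what vectorize does, written against modern NumPy (the 2005 test body may have differed):

```python
import numpy as np

def addsubtract(a, b):
    # a branchy scalar-only function; vectorize broadcasts it elementwise
    if a > b:
        return a - b
    return a + b

vec_addsubtract = np.vectorize(addsubtract)
result = vec_addsubtract([0, 3, 6, 9], [1, 3, 5, 7])
print(result)  # [1 6 1 2]
```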
-Travis From oliphant at ee.byu.edu Fri Oct 21 15:07:53 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 13:07:53 -0600 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <435921F5.9010302@stsci.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> <43591E04.1040101@ee.byu.edu> <435921F5.9010302@stsci.edu> Message-ID: <43593C89.8060904@ee.byu.edu> Christopher Hanley wrote: >I have not found any compile time flags that are specific to isinf or >isnan. I don't believe this is a compiler problem given its occurrence >under both Sun and GNU compilers. > > The problem is why does the configure step in distutils find an isnan function when it apparently is not there? This is a Python distutils function we are calling. Have you verified whether or not isnan and isinf are available on Solaris? The distutils function just builds a simple C program that makes reference to the function and then tries to compile and link it. Perhaps the problem is that the link succeeds (because the compiler is allowing unresolved references until run time) when it really shouldn't. Do we need to actually try and run the program? I don't know if check_func does that. -Travis From oliphant at ee.byu.edu Fri Oct 21 15:10:45 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 13:10:45 -0600 Subject: [SciPy-dev] TypeError: __array_data__ must return a string providing the pointer to data In-Reply-To: References: <43589507.4050003@mecha.uni-stuttgart.de> <435907B4.5070506@csun.edu> <43591CB0.8040302@ee.byu.edu> Message-ID: <43593D35.6050005@ee.byu.edu> Nils Wagner wrote: >>>>import Numeric >>>>Numeric.__version__ >>>> >>>> >'24.0b2' > > You still need the very latest CVS... The array interface protocol had to change to support discontiguous arrays (and read-only) arrays.
These changes are only in the latest CVS versions of numarray and Numeric. But, the new SVN of newcore might work too. -Travis From chanley at stsci.edu Fri Oct 21 15:27:28 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Fri, 21 Oct 2005 15:27:28 -0400 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <435934C3.8090806@ee.byu.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> <43591E04.1040101@ee.byu.edu> <435921F5.9010302@stsci.edu> <435934C3.8090806@ee.byu.edu> Message-ID: <43594120.6030809@stsci.edu> Travis Oliphant wrote: > > Could you comment out the test_vectorize and try the tests again. > Travis, After commenting out test_vectorize the remaining tests complete with no errors. Chris From chanley at stsci.edu Fri Oct 21 15:58:42 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Fri, 21 Oct 2005 15:58:42 -0400 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <43593C89.8060904@ee.byu.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> <43591E04.1040101@ee.byu.edu> <435921F5.9010302@stsci.edu> <43593C89.8060904@ee.byu.edu> Message-ID: <43594872.1030209@stsci.edu> Travis Oliphant wrote: > The problem is why does the configure step in distutils find an isnan > function when it apparently is not there? > > This is a Python distutils function we are calling. Have you verified > whether or not isnan and isinf are available on Solaris? Well, the math.h file in /usr/include has a definition of isnan but not isinf. There is an isinf function defined in sunmath.h but that header does not seem to get picked up by distutils. 
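When a platform lacks (or, as with sunmath.h here, hides) these functions, C code commonly falls back on IEEE-754 identities rather than the library calls. The same identities, sketched in Python — the standard trick, not the actual scipy_core fallback macros:

```python
def my_isnan(x):
    # NaN is the only floating-point value that is not equal to itself
    return x != x

def my_isinf(x):
    # for an infinity, x - x is NaN; for any finite x it is 0.0
    return (not my_isnan(x)) and my_isnan(x - x)

print(my_isnan(float('nan')), my_isinf(float('inf')), my_isinf(-1.5))
# True True False
```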
Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From oliphant at ee.byu.edu Fri Oct 21 16:15:16 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 14:15:16 -0600 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <43594872.1030209@stsci.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> <43591E04.1040101@ee.byu.edu> <435921F5.9010302@stsci.edu> <43593C89.8060904@ee.byu.edu> <43594872.1030209@stsci.edu> Message-ID: <43594C54.8090806@ee.byu.edu> Christopher Hanley wrote: >Travis Oliphant wrote: > > >>The problem is why does the configure step in distutils find an isnan >>function when it apparently is not there? >> >>This is a Python distutils function we are calling. Have you verified >>whether or not isnan and isinf are available on Solaris? >> >> > >Well, the math.h file in /usr/include has a definition of isnan but not >isinf. There is an isinf function defined in sunmath.h but that header >does not seem to get picked up by distutils. > > > O.K. this sounds like isinf is not getting picked up then even though isnan is there. I've added a separate check for isinf that should help. Back to the vectorize problem. Do you have gdb installed on Solaris? Can you run the code under gdb (or another debugger) to determine exactly where the bus error is occurring? 
Thanks, -Travis From chanley at stsci.edu Fri Oct 21 16:36:44 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Fri, 21 Oct 2005 16:36:44 -0400 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <43594C54.8090806@ee.byu.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> <43591E04.1040101@ee.byu.edu> <435921F5.9010302@stsci.edu> <43593C89.8060904@ee.byu.edu> <43594872.1030209@stsci.edu> <43594C54.8090806@ee.byu.edu> Message-ID: <4359515C.2040207@stsci.edu> Travis Oliphant wrote: > Christopher Hanley wrote: > O.K. this sounds like isinf is not getting picked up then even though > isnan is there. > > I've added a separate check for isinf that should help. Thanks. This has corrected the problem. > > Back to the vectorize problem. Do you have gdb installed on Solaris? > > Can you run the code under gdb (or another debugger) to determine > exactly where the bus error is occurring? I do have gdb installed on Solaris. I will work on tracking this problem down. Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From oliphant at ee.byu.edu Fri Oct 21 17:07:45 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 15:07:45 -0600 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <4359515C.2040207@stsci.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> <43591E04.1040101@ee.byu.edu> <435921F5.9010302@stsci.edu> <43593C89.8060904@ee.byu.edu> <43594872.1030209@stsci.edu> <43594C54.8090806@ee.byu.edu> <4359515C.2040207@stsci.edu> Message-ID: <435958A1.30509@ee.byu.edu> Christopher Hanley wrote: >I do have gdb installed on Solaris. I will work on tracking this >problem down. 
> >Chris > > I think I see the problem. In creating a dynamic ufunc, I'm allocating one chunk of memory and then using various offsets into it as other variables. Obviously some of these variables are then misaligned. I need to make sure the sub-blocks are all aligned on void* pointers. I'll fix this and then wait for another report. -Travis From chanley at stsci.edu Fri Oct 21 17:09:25 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Fri, 21 Oct 2005 17:09:25 -0400 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <43594C54.8090806@ee.byu.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> <43591E04.1040101@ee.byu.edu> <435921F5.9010302@stsci.edu> <43593C89.8060904@ee.byu.edu> <43594872.1030209@stsci.edu> <43594C54.8090806@ee.byu.edu> Message-ID: <43595905.4090001@stsci.edu> > Can you run the code under gdb (or another debugger) to determine > exactly where the bus error is occurring? > Well, GDB core dumps with the Python we have installed on Solaris. Using DBX this is the most useful error message I have been able to get out so far: >>> scipy.test() Found 2 tests for scipy.base.umath Found 23 tests for scipy.base.function_base Found 3 tests for scipy.base.getlimits Found 9 tests for scipy.base.twodim_base Found 3 tests for scipy.base.matrix Found 44 tests for scipy.base.shape_base Found 42 tests for scipy.base.type_check Found 3 tests for scipy.basic.helper Found 4 tests for scipy.base.index_tricks Found 0 tests for __main__ .......................signal BUS (invalid address alignment) in ufunc_frompyfunc at 0xfeb56674 0xfeb56674: ufunc_frompyfunc+0x01bc: st %l6, [%i5 + 0x4] I realize that this probably isn't particularly helpful. Todd has suggested I build my own private copy of Python with --with-pydebug so that I can better step through the code. 
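The one-malloc-with-offsets fix Travis sketches above — rounding each sub-block's offset up to a pointer-sized boundary — comes down to a little alignment arithmetic. An illustrative sketch, not the actual ufunc code; the 8-byte alignment is an assumption matching a 64-bit void*:

```python
def align_up(offset, alignment):
    """Round offset up to the next multiple of alignment (a power of two)."""
    return (offset + alignment - 1) & ~(alignment - 1)

# Carve three sub-blocks of the given byte sizes out of one allocation,
# aligning each on a hypothetical 8-byte boundary (a 64-bit void*).
sizes = [13, 7, 24]
offsets = []
pos = 0
for n in sizes:
    pos = align_up(pos, 8)   # without this step, block 2 starts at 13: misaligned
    offsets.append(pos)
    pos += n
print(offsets)  # -> [0, 16, 24]
```

Without the rounding step, the second block would start at offset 13, exactly the kind of misaligned store that raises SIGBUS on SPARC.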
Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From stephen.walton at csun.edu Fri Oct 21 17:26:26 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Fri, 21 Oct 2005 14:26:26 -0700 Subject: [SciPy-dev] TypeError: __array_data__ must return a string providing the pointer to data In-Reply-To: <43593D35.6050005@ee.byu.edu> References: <43589507.4050003@mecha.uni-stuttgart.de> <435907B4.5070506@csun.edu> <43591CB0.8040302@ee.byu.edu> <43593D35.6050005@ee.byu.edu> Message-ID: <43595D02.90709@csun.edu> Travis Oliphant wrote: >You still need the very latest CVS ... The array interface protocol had >to change to support discontiguous arrays (and read-only) arrays. These >changes are only in the latest CVS versions of numarray and Numeric. > > Sigh. Unfortunately on the two Linux releases I'm using (Ubuntu and Fedora Core 4), pygtk2 contains a dependency on Numeric and therefore you're more-or-less forced to install the distros' version, currently at 23.8. I know that's not up to you to fix, Travis! But, hypothetically, you're telling me that if I did forcibly install Numeric and numarray from current CVS, that they would play nice with the new scipy core and with matplotlib? Steve From rkern at ucsd.edu Fri Oct 21 17:33:22 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 21 Oct 2005 14:33:22 -0700 Subject: [SciPy-dev] TypeError: __array_data__ must return a string providing the pointer to data In-Reply-To: <43595D02.90709@csun.edu> References: <43589507.4050003@mecha.uni-stuttgart.de> <435907B4.5070506@csun.edu> <43591CB0.8040302@ee.byu.edu> <43593D35.6050005@ee.byu.edu> <43595D02.90709@csun.edu> Message-ID: <43595EA2.8050001@ucsd.edu> Stephen Walton wrote: > Travis Oliphant wrote: > >>You still need the very latest CVS ... The array interface protocol had >>to change to support discontiguous arrays (and read-only) arrays. 
These >>changes are only in the latest CVS versions of numarray and Numeric. > > Sigh. Unfortunately on the two Linux releases I'm using (Ubuntu and > Fedora Core 4), pygtk2 contains a dependency on Numeric and therefore > you're more-or-less forced to install the distros' version, currently at > 23.8. I know that's not up to you to fix, Travis! But, hypothetically, > you're telling me that if I did forcibly install Numeric and numarray > from current CVS, that they would play nice with the new scipy core and > with matplotlib? I know there's a way to make an empty deb to satisfy dependencies, and I'm told that there's a way to do the same on Fedora Core. http://www.us.debian.org/doc/manuals/apt-howto/ch-helpers.en.html#s-equivs -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Fri Oct 21 17:44:29 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 15:44:29 -0600 Subject: [SciPy-dev] TypeError: __array_data__ must return a string providing the pointer to data In-Reply-To: <43595D02.90709@csun.edu> References: <43589507.4050003@mecha.uni-stuttgart.de> <435907B4.5070506@csun.edu> <43591CB0.8040302@ee.byu.edu> <43593D35.6050005@ee.byu.edu> <43595D02.90709@csun.edu> Message-ID: <4359613D.3060800@ee.byu.edu> Stephen Walton wrote: >Travis Oliphant wrote: > > > >>You still need the very latest CVS ... The array interface protocol had >>to change to support discontiguous arrays (and read-only) arrays. These >>changes are only in the latest CVS versions of numarray and Numeric. >> >> >> >> >Sigh. Unfortunately on the two Linux releases I'm using (Ubuntu and >Fedora Core 4), pygtk2 contains a dependency on Numeric and therefore >you're more-or-less forced to install the distros' version, currently at >23.8. I know that's not up to you to fix, Travis! 
But, hypothetically, >you're telling me that if I did forcibly install Numeric and numarray >from current CVS, that they would play nice with the new scipy core and >with matplotlib? > > > You can always upgrade the Numeric version after installation. This should not cause problems with pygtk2. You need at least Numeric 24.0 for array interfaces to work at all. So Numeric 23.8 won't work at all. The recent SVN of scipy core should handle __array_data__ returning a buffer object as well (which was the previous behavior). -Travis From oliphant at ee.byu.edu Fri Oct 21 17:48:56 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 15:48:56 -0600 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <43595905.4090001@stsci.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> <43591E04.1040101@ee.byu.edu> <435921F5.9010302@stsci.edu> <43593C89.8060904@ee.byu.edu> <43594872.1030209@stsci.edu> <43594C54.8090806@ee.byu.edu> <43595905.4090001@stsci.edu> Message-ID: <43596248.7030708@ee.byu.edu> Christopher Hanley wrote: >>Can you run the code under gdb (or another debugger) to determine >>exactly where the bus error is occurring? >> >> >> > >Well, GDB core dumps with the Python we have installed on Solaris. 
>Using DBX this is the most useful error message I have been able to get >out so far: > > >>> scipy.test() > Found 2 tests for scipy.base.umath > Found 23 tests for scipy.base.function_base > Found 3 tests for scipy.base.getlimits > Found 9 tests for scipy.base.twodim_base > Found 3 tests for scipy.base.matrix > Found 44 tests for scipy.base.shape_base > Found 42 tests for scipy.base.type_check > Found 3 tests for scipy.basic.helper > Found 4 tests for scipy.base.index_tricks > Found 0 tests for __main__ >.......................signal BUS (invalid address alignment) in >ufunc_frompyfunc at 0xfeb56674 >0xfeb56674: ufunc_frompyfunc+0x01bc: st %l6, [%i5 + 0x4] > >I realize that this probably isn't particularly helpful. Todd has >suggested I build my own private copy of Python with --with-pydebug so >that I can better step through the code. > > Actually, this confirms my suspicion about the problem. In this one function I'm playing games to use only one malloc call as explained before. This is now (hopefully) fixed in SVN. I don't do the same thing anywhere else (well except for allocating buffers, but these are always increments of the data type size and so should be fine). All the pointers should now be on aligned addresses. -Travis From chanley at stsci.edu Fri Oct 21 18:10:01 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Fri, 21 Oct 2005 18:10:01 -0400 Subject: [SciPy-dev] scipy_core Solaris Build: Good and Bad News In-Reply-To: <43596248.7030708@ee.byu.edu> References: <4357AD86.5000805@stsci.edu> <4357D4DD.8070509@ee.byu.edu> <4357E009.7010308@stsci.edu> <435879BC.1080009@ee.byu.edu> <4358FDDC.9040203@stsci.edu> <43591E04.1040101@ee.byu.edu> <435921F5.9010302@stsci.edu> <43593C89.8060904@ee.byu.edu> <43594872.1030209@stsci.edu> <43594C54.8090806@ee.byu.edu> <43595905.4090001@stsci.edu> <43596248.7030708@ee.byu.edu> Message-ID: <43596739.2060305@stsci.edu> Travis Oliphant wrote: > Actually, this confirms my suspicion about the problem. 
In this one > function I'm playing > games to use only one malloc call as explained before. This is now > (hopefully) fixed in SVN. > > I don't do the same thing anywhere else (well except for allocating > buffers, but these are always increments of the data type size and so > should be fine). > > All the pointers should now be on aligned addresses. > > -Travis > Yeah!!! The latest and greatest version now builds on Solaris with no customizations. All of the tests now pass as well. basil> python Python 2.4.1 (#2, Apr 6 2005, 14:41:45) [C] on sunos5 Type "help", "copyright", "credits" or "license" for more information. >>> import scipy >>> scipy.test() Found 2 tests for scipy.base.umath Found 23 tests for scipy.base.function_base Found 3 tests for scipy.base.getlimits Found 9 tests for scipy.base.twodim_base Found 3 tests for scipy.base.matrix Found 44 tests for scipy.base.shape_base Found 42 tests for scipy.base.type_check Found 3 tests for scipy.basic.helper Found 4 tests for scipy.base.index_tricks Found 0 tests for __main__ ..................................................................................................................................... 
---------------------------------------------------------------------- Ran 133 tests in 1.777s OK >>> Thank you for your time and efforts, Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From oliphant at ee.byu.edu Fri Oct 21 18:48:40 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 16:48:40 -0600 Subject: [SciPy-dev] TypeError: __array_data__ must return a string providing the pointer to data In-Reply-To: <43595D02.90709@csun.edu> References: <43589507.4050003@mecha.uni-stuttgart.de> <435907B4.5070506@csun.edu> <43591CB0.8040302@ee.byu.edu> <43593D35.6050005@ee.byu.edu> <43595D02.90709@csun.edu> Message-ID: <43597048.5080509@ee.byu.edu> Stephen Walton wrote: >Travis Oliphant wrote: > > > >>You still need the very latest CVS ... The array interface protocol had >>to change to support discontiguous arrays (and read-only) arrays. These >>changes are only in the latest CVS versions of numarray and Numeric. >> >> >> >> >Sigh. Unfortunately on the two Linux releases I'm using (Ubuntu and >Fedora Core 4), pygtk2 contains a dependency on Numeric and therefore >you're more-or-less forced to install the distros' version, currently at >23.8. I know that's not up to you to fix, Travis! But, hypothetically, >you're telling me that if I did forcibly install Numeric and numarray >from current CVS, that they would play nice with the new scipy core and >with matplotlib? > > Yes, this works. I just tried it on my windows machine. Installing Numeric 24.0 and scipy core allows matplotlib to work with scipy core arrays (they get quickly converted to Numeric 24.0 arrays behind the scenes using the array interface). Unfortunately this does not work with Numeric 23.8 (even through the sequence interface) because the __array__ method of scipy core does not return a Numeric array (and Numeric doesn't do the right thing before Numeric 24.0). 
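The behind-the-scenes conversion Travis describes relies on the producer exposing its shape, type string, and raw data through special attributes that any consumer can read. A toy sketch of that handshake, with stand-in names modeled on the protocol under discussion (`__array_shape__`, `__array_typestr__`, `__array_data__`), simplified to return bytes rather than a pointer string:

```python
import struct

class FakeArray:
    """Pretend producer: float64 values held in a raw little-endian buffer."""
    def __init__(self, values):
        self._buf = struct.pack('<%dd' % len(values), *values)

    @property
    def __array_shape__(self):
        return (len(self._buf) // 8,)   # 8 bytes per float64

    @property
    def __array_typestr__(self):
        return '<f8'                    # little-endian 8-byte float

    @property
    def __array_data__(self):
        return self._buf                # simplified: bytes, not a pointer string

def consume(obj):
    """Pretend consumer (e.g. Numeric 24 converting a foreign array on the fly)."""
    n, = obj.__array_shape__
    assert obj.__array_typestr__ == '<f8'
    return list(struct.unpack('<%dd' % n, obj.__array_data__))

print(consume(FakeArray([1.0, 2.5, -3.0])))  # -> [1.0, 2.5, -3.0]
```

The point of the protocol is exactly this: the consumer never touches the producer's sequence interface, which is why Numeric 23.8 — predating the protocol — cannot participate.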
-Travis From oliphant at ee.byu.edu Fri Oct 21 19:48:36 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 17:48:36 -0600 Subject: [SciPy-dev] All tests pass for me on newscipy Message-ID: <43597E54.9020605@ee.byu.edu> A fix to the overwrite specification in decomp.py has led to all tests passing for me in the newscipy. There are still a few packages to port, but we are getting close. -Travis From oliphant at ee.byu.edu Fri Oct 21 20:19:31 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 18:19:31 -0600 Subject: [SciPy-dev] A quote of some relevance from Guido Message-ID: <43598593.7020805@ee.byu.edu> I just picked up this little tidbit from python-dev. It is a useful concept to keep in mind. -------------------------------------------------- The "Swiss Army Knife (...Not)" API design pattern -------------------------------------------------- This fortnight saw a number of different discussions on what Guido's guiding principles are in making design decisions about Python. Guido introduced the "Swiss Army Knife (...Not)" API design pattern, which has been lauded by some as `the long-lost 20th principle from the Zen of Python`_. A direct quote from Guido: [I]nstead of a single "swiss-army-knife" function with various options that choose different behavior variants, it's better to have different dedicated functions for each of the major functionality types. This principle is the basis for pairs like str.split() and str.rsplit () or str.find() and str.rfind(). The goal is to keep cognitive overhead down by associating with each use case a single function with a minimal number of parameters. From oliphant at ee.byu.edu Fri Oct 21 20:23:53 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 21 Oct 2005 18:23:53 -0600 Subject: [SciPy-dev] Making all classes new style classes Message-ID: <43598699.7010403@ee.byu.edu> I know it was mentioned that we should make all classes in scipy new style classes. 
This can be accomplished easily by a module level definition of __metaclass__ = type All classes in the module will be "new-style" (i.e. inherit from object) -Travis From rkern at ucsd.edu Fri Oct 21 22:29:48 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 21 Oct 2005 19:29:48 -0700 Subject: [SciPy-dev] A quote of some relevance from Guido In-Reply-To: <43598593.7020805@ee.byu.edu> References: <43598593.7020805@ee.byu.edu> Message-ID: <4359A41C.40108@ucsd.edu> Travis Oliphant wrote: > I just picked up this little tidbit from python-dev. It is a useful concept to keep in mind. > > -------------------------------------------------- > The "Swiss Army Knife (...Not)" API design pattern > -------------------------------------------------- > > This fortnight saw a number of different discussions on what Guido's > guiding principles are in making design decisions about Python. Guido > introduced the "Swiss Army Knife (...Not)" API design pattern, which > has been lauded by some as `the long-lost 20th principle from the Zen > of Python`_. A direct quote from Guido: > > [I]nstead of a single "swiss-army-knife" function with various > options that choose different behavior variants, it's better to have > different dedicated functions for each of the major functionality types. > > This principle is the basis for pairs like str.split() and str.rsplit > () or str.find() and str.rfind(). The goal is to keep cognitive > overhead down by associating with each use case a single function > with a minimal number of parameters. I think he made a more specific version of this rule: if you expect people to usually pass in a constant for a parameter, define another function. Booleans are especially good targets. 
For example, instead of str.find(x, y, fromright=False) str.find(x, y, fromright=True) define str.find(x, y) str.rfind(x, y) http://mail.python.org/pipermail/python-dev/2005-September/056202.html -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From fishburn at MIT.EDU Sat Oct 22 00:27:38 2005 From: fishburn at MIT.EDU (Matt Fishburn) Date: Sat, 22 Oct 2005 00:27:38 -0400 Subject: [SciPy-dev] include fit functions such as erf, exp, cosine etc. Message-ID: <4359BFBA.1000009@mit.edu> Hello, I recently needed to fit some data to an erf function, but I couldn't seem to find one in scipy (I ended up coding one using the outline on page 16 of the tutorial). Are there any functions to fit data to common functions such as erf, exp, cosine, and arbitrary degree polynomials, or support to include such functions in the future? -Matt Fishburn From rkern at ucsd.edu Sat Oct 22 00:56:55 2005 From: rkern at ucsd.edu (Robert Kern) Date: Fri, 21 Oct 2005 21:56:55 -0700 Subject: [SciPy-dev] include fit functions such as erf, exp, cosine etc. In-Reply-To: <4359BFBA.1000009@mit.edu> References: <4359BFBA.1000009@mit.edu> Message-ID: <4359C697.8000207@ucsd.edu> Matt Fishburn wrote: > Hello, > > I recently needed to fit some data to an erf function, but I couldn't > seem to find one in scipy (I ended up coding one using the outline on > page 16 of the tutorial). Are there any functions to fit data to common > functions such as erf, exp, cosine, and arbitrary degree polynomials, or > support to include such functions in the future? It might be good to write something that assists fitting polynomials, but otherwise, there's not much point. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From charles.harris at sdl.usu.edu Sat Oct 22 01:11:41 2005 From: charles.harris at sdl.usu.edu (Charles Harris) Date: Fri, 21 Oct 2005 23:11:41 -0600 Subject: [SciPy-dev] include fit functions such as erf, exp, cosine etc. References: <4359BFBA.1000009@mit.edu> <4359C697.8000207@ucsd.edu> Message-ID: > -----Original Message----- > From: scipy-dev-bounces at scipy.net on behalf of Robert Kern > Sent: Fri 10/21/2005 10:56 PM > To: SciPy Developers List > Subject: Re: [SciPy-dev] include fit functions such as erf, exp, cosine etc. > >Matt Fishburn wrote: >> Hello, >> >> I recently needed to fit some data to an erf function, but I couldn't >> seem to find one in scipy (I ended up coding one using the outline on >> page 16 of the tutorial). Are there any functions to fit data to common >> functions such as erf, exp, cosine, and arbitrary degree polynomials, or >> support to include such functions in the future? > > It might be good to write something that assists fitting polynomials, > but otherwise, there's not much point. My own preference here is to do a least squares fit with Chebychev polynomials as
1) The series is better conditioned with regard to coefficient errors.
2) The fitting matrix tends to be better conditioned.
3) Tends to yield an L_inf fit when a high degree series is truncated.
4) Is a plain old least squares polynomial fit when all the terms are used.
5) High degree polynomials are ill conditioned anyway.
6) For low degree fits the Chebychev series can be converted to a polynomial.
7) Reverse recursion makes the series easy to evaluate.
My $.02 . Chuck -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter _______________________________________________ Scipy-dev mailing list Scipy-dev at scipy.net http://www.scipy.net/mailman/listinfo/scipy-dev -------------- next part -------------- A non-text attachment was scrubbed...
Name: winmail.dat Type: application/ms-tnef Size: 3436 bytes Desc: not available URL: From nwagner at mecha.uni-stuttgart.de Sat Oct 22 02:17:14 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sat, 22 Oct 2005 08:17:14 +0200 Subject: [SciPy-dev] All tests pass for me on newscipy In-Reply-To: <43597E54.9020605@ee.byu.edu> References: <43597E54.9020605@ee.byu.edu> Message-ID: On Fri, 21 Oct 2005 17:48:36 -0600 Travis Oliphant wrote: > > A fix to the overwrite specification in decomp.py has >led to all tests > passing for me in the newscipy. > > There are still a few packages to port, but we are >getting close. > > -Travis > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev ====================================================================== FAIL: check_random (scipy.linalg.decomp.test_decomp.test_cholesky) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_decomp.py", line 284, in check_random assert_array_almost_equal(a,a1) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[ 2.2115142e+01 1.9096254e+00 8.8586238e-01 9.7084732e-01 1.1361606e+00 7.4402947e-01 4.6727755e-01 1.... Array 2: [[ 489.0795215 42.2316379 19.5909726 21.4704267 25.1263533 16.4543176 10.3339096 31.0853131 7.2599046... 
====================================================================== FAIL: check_random_complex (scipy.linalg.decomp.test_decomp.test_cholesky) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_decomp.py", line 298, in check_random_complex assert_array_almost_equal(a,a1) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[ 1.0194682e+01 +0.0000000e+00j 1.8373100e+00 +6.2774806e-01j 2.0980916e+00 -7.4906075e-01j 1.9508664e+00 -1.1... Array 2: [[ 1.0393155e+02 +0.0000000e+00j 1.8730792e+01 +6.3996920e+00j 2.1389377e+01 -7.6364363e+00j 1.9888463e+01 -1.1... ====================================================================== FAIL: check_simple_complex (scipy.linalg.decomp.test_decomp.test_cholesky) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_decomp.py", line 270, in check_simple_complex assert_array_almost_equal(a,a1) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[ 3.1622777e+00 +0.0000000e+00j 4.1109610e+00 +2.8460499e+00j 4.7434165e+00 -1.5811388e+00j] [ 0.0000000e+00 +... Array 2: [[ 1.0000000e+01 +0.0000000e+00j 1.3000000e+01 +9.0000000e+00j 1.5000000e+01 -5.0000000e+00j] [ 1.3000000e+01 -... 
---------------------------------------------------------------------- Ran 389 tests in 0.982s FAILED (failures=3) >>> scipy.base.__version__ '0.4.3.1335' From nwagner at mecha.uni-stuttgart.de Sat Oct 22 06:18:28 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sat, 22 Oct 2005 12:18:28 +0200 Subject: [SciPy-dev] SeDuMi in scipy ? Message-ID: Hi all, SeDuMi is released under the GNU/GPL open source license. http://sedumi.mcmaster.ca/ Is it of interest to scipy ? Nils From rkern at ucsd.edu Sat Oct 22 06:20:53 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sat, 22 Oct 2005 03:20:53 -0700 Subject: [SciPy-dev] SeDuMi in scipy ? In-Reply-To: References: Message-ID: <435A1285.6010100@ucsd.edu> Nils Wagner wrote: > Hi all, > > SeDuMi is released under the GNU/GPL open source license. > http://sedumi.mcmaster.ca/ > Is it of interest to scipy ? No, we're not accepting GPL code into the main scipy package. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From aisaac at american.edu Sat Oct 22 09:50:23 2005 From: aisaac at american.edu (Alan G Isaac) Date: Sat, 22 Oct 2005 09:50:23 -0400 Subject: [SciPy-dev] SeDuMi in scipy ? In-Reply-To: <435A1285.6010100@ucsd.edu> References: <435A1285.6010100@ucsd.edu> Message-ID: > Nils Wagner wrote: >> SeDuMi is released under the GNU/GPL open source license. >> http://sedumi.mcmaster.ca/ >> Is it of interest to scipy ? On Sat, 22 Oct 2005, Robert Kern apparently wrote: > No, we're not accepting GPL code into the main scipy package. Looks great though. Please ask them if they would consider a release under the MIT license. Cheers, Alan Isaac From oliphant at ee.byu.edu Sat Oct 22 13:45:45 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 22 Oct 2005 11:45:45 -0600 Subject: [SciPy-dev] Comments in C source Message-ID: <435A7AC9.10306@ee.byu.edu> I appreciate all the people who are helping to develop scipy. 
Just a reminder: don't add C++ style comments to the code if at all possible. Use /* */ instead of // Thanks, -Travis From stephen.walton at csun.edu Sat Oct 22 18:56:20 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Sat, 22 Oct 2005 15:56:20 -0700 Subject: [SciPy-dev] Numeric 24, matplotlib, etc. (was: Re: TypeError: __array_data__ must return a string providing the pointer to data) In-Reply-To: <43597048.5080509@ee.byu.edu> References: <43589507.4050003@mecha.uni-stuttgart.de> <435907B4.5070506@csun.edu> <43591CB0.8040302@ee.byu.edu> <43593D35.6050005@ee.byu.edu> <43595D02.90709@csun.edu> <43597048.5080509@ee.byu.edu> Message-ID: <435AC394.10803@csun.edu> Travis Oliphant wrote: >Installing Numeric 24.0 and scipy core allows matplotlib to work with >scipy core arrays (they get quickly converted to Numeric 24.0 arrays >behind the scenes using the array interface). > Is 24.0 still around for download somewhere? I used an old archive of 24.0b2 I had lying around, but didn't see Numeric 24 on Sourceforge, either as a download or from CVS. It gets worse, because it turns out that the Ubuntu numeric packagers have broken it up into python-numeric, python2.4-numeric, and python2.4-numeric-ext. John Hunter, are you out there? I know you use Ubuntu; are you just using old scipy until you get a chance to port matplotlib to new core? Followups redirected to scipy-user, since IMHO this is no longer a scipy-dev discussion. From arnd.baecker at web.de Sun Oct 23 07:18:11 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sun, 23 Oct 2005 13:18:11 +0200 (CEST) Subject: [SciPy-dev] random module not working on opteron Message-ID: Hi, I recently reported that from scipy.basic.random import normal a = normal(0.,1.,(3,3)) leads to a hang on the opteron. 
It turns out that the whole random module does not work as it should: In [2]: import scipy.basic.random In [3]: scipy.basic.random.uniform() Out[3]: 3804495653.9744735 In [4]: scipy.basic.random.uniform() Out[4]: 2804588197.3068576 In [5]: scipy.basic.random.uniform() Out[5]: 998177869.76781905 # this one also hangs In [6]: scipy.basic.random.randn(5) I have no idea what is causing this, in particular because there are no warnings during the compile of mtrand.c. Can anyone with a 64 Bit machine confirm this behaviour? What could one do to hunt this one down? (For the hangs one could add print statements to the .pyx source. Unfortunately I don't have the right pyxrc as I get a different output than the one from svn.). Best, Arnd From rkern at ucsd.edu Sun Oct 23 07:32:49 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sun, 23 Oct 2005 04:32:49 -0700 Subject: [SciPy-dev] random module not working on opteron In-Reply-To: References: Message-ID: <435B74E1.3050804@ucsd.edu> Arnd Baecker wrote: > Hi, > > I recently reported that > from scipy.basic.random import normal > a = normal(0.,1.,(3,3)) > leads to a hang on the opteron. > > It turns out that the whole random module does not > work as it should: > > In [2]: import scipy.basic.random > In [3]: scipy.basic.random.uniform() > Out[3]: 3804495653.9744735 > In [4]: scipy.basic.random.uniform() > Out[4]: 2804588197.3068576 > In [5]: scipy.basic.random.uniform() > Out[5]: 998177869.76781905 > # this one also hangs > In [6]: scipy.basic.random.randn(5) > > I have no idea what is causing this, > in particular because there are no warnings during the compile > of mtrand.c. > > Can anyone with a 64 Bit machine confirm this behaviour? > What could one do to hunt this one down? I would first check if the RandomKit functions work correctly from C. See the files randomkit.[ch] ; the functions should be self-explanatory. It's entirely possible that they're not 64-bit safe; I never had the opportunity to check. 
> (For the hangs one could add print statements to the > .pyx source. Unfortunately I don't have the right pyxrc > as I get a different output than the one from svn.). We're using David Cooke's patched version of Pyrex; his patches were posted to the Pyrex list. The changes are solely to make the generated code compile without warnings. Don't worry about them. The mtrand compiled from the unpatched Pyrex generated source should behave identically. And if they don't on 64-bit platforms, then that's something we need to know. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From arnd.baecker at web.de Sun Oct 23 16:19:37 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sun, 23 Oct 2005 22:19:37 +0200 (CEST) Subject: [SciPy-dev] random module not working on opteron In-Reply-To: <435B74E1.3050804@ucsd.edu> References: <435B74E1.3050804@ucsd.edu> Message-ID: On Sun, 23 Oct 2005, Robert Kern wrote: > Arnd Baecker wrote: [...] > > Can anyone with a 64 Bit machine confirm this behaviour? > > What could one do to hunt this one down? > > I would first check if the RandomKit functions work correctly from C. > See the files randomkit.[ch] ; the functions should be self-explanatory. > It's entirely possible that they're not 64-bit safe; I never had the > opportunity to check. Ok, referring to a famous scene of the Life of Brian: after more than 5 years of not speaking C at all you are stepping on my feet and I speak C again - a miracle! (What a pleasure, no need to pay attention to white space, I love all those variable declarations and ";"... Thank you ;-). Now to the facts: - the code below works fine on my PIV - on the opteron it gives problems and hangs: Value: 1791095845l max=4294967295l Value: 0.997185 Now with random seed: Value: 920722385l max=4294967295l Value: 915381373.352906 # <----- not ok! 
(rk_double) Now test the distributions: rk_uniform: 566004099.864036 # <----- not ok! rk_normal: # (hangs) The problem is caused by calling rk_random_seed (Commenting this one out, everything seems fine). I am not sure what is going on in that routine (and have to leave investigating this at the moment ...) Maybe you can have a look at that routine and spot anything which might be problematic? (either 64 Bit, or the clock stuff???) Best, Arnd

/* test_randomkit.c
   cd newcore/scipy/corelib/mtrand
   gcc -I/usr/include/python2.3/ test_randomkit.c randomkit.c distributions.c -lm */
#include <stdio.h>
#include "randomkit.h"
#include "distributions.h"

int main(int argc, char *argv[])
{
    rk_state state;
    unsigned long seed = 1, random_value;

    rk_seed(seed, &state);             /* Initialize the RNG */
    random_value = rk_random(&state);  /* Generate random values in [0..RK_MAX] */
    printf("Value: %ul max=%ul\n", random_value, RK_MAX);
    printf("Value: %f \n", rk_double(&state));
    printf("Value: %f \n", rk_double(&state));
    printf("Value: %f \n", rk_double(&state));
    printf("Now with random seed:\n");
    rk_randomseed(&state);
    printf("Value: %ul max=%ul\n", rk_random(&state), RK_MAX);
    printf("Value: %f \n", rk_double(&state));
    printf("Now test the distributions:\n");
    printf("rk_uniform: %f\n", rk_uniform(&state, 0.0, 1.0));
    printf("rk_normal: %f\n", rk_normal(&state, 0.0, 1.0));
    return(0);
}

From oliphant at ee.byu.edu Sun Oct 23 19:50:33 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 23 Oct 2005 17:50:33 -0600 Subject: [SciPy-dev] random module not working on opteron In-Reply-To: References: <435B74E1.3050804@ucsd.edu> Message-ID: <435C21C9.4060109@ee.byu.edu> Arnd Baecker wrote: >On Sun, 23 Oct 2005, Robert Kern wrote: > > >I am not sure what is going on in that routine >(and have to leave investigating this at the moment ...) >Maybe you can have a look at that routine >and spot anything which might be problematic? >(either 64 Bit, or the clock stuff???)
> > This is a good bit of detective work. A little more may track down the problem. rk_random_seed is not that complicated of a function. I suspect you could call the individual functions inside it separately to determine the problem. There are a lot of platform specific structures here. It's hard to see immediately where a problem is. -Travis From rkern at ucsd.edu Sun Oct 23 20:09:42 2005 From: rkern at ucsd.edu (Robert Kern) Date: Sun, 23 Oct 2005 17:09:42 -0700 Subject: [SciPy-dev] random module not working on opteron In-Reply-To: References: <435B74E1.3050804@ucsd.edu> Message-ID: <435C2646.4090007@ucsd.edu> Arnd Baecker wrote: > On Sun, 23 Oct 2005, Robert Kern wrote: > > >>Arnd Baecker wrote: > > > [...] > > >>>Can anyone with a 64 Bit machine confirm this behaviour? >>>What could one do to hunt this one down? >> >>I would first check if the RandomKit functions work correctly from C. >>See the files randomkit.[ch] ; the functions should be self-explanatory. >>It's entirely possible that they're not 64-bit safe; I never had the >>opportunity to check. > > > Ok, referring to a famous scene of the Life of Brian: > after more than 5 years of not speaking C at all > you are stepping on my feet and I speak C again - a miracle! > (What a pleasure, no need to pay attention to white space, > I love all those variable declarations and ";"... Thank you ;-). > > Now to the facts: > - the code below works fine on my PIV > - on the opteron it gives problems and hangs: > > Value: 1791095845l max=4294967295l > Value: 0.997185 > Now with random seed: > Value: 920722385l max=4294967295l > Value: 915381373.352906 # <----- not ok! (rk_double) > Now test the distributions: > rk_uniform: 566004099.864036 # <----- not ok! > rk_normal: # (hangs) > > The problem is caused by calling rk_random_seed > (Commenting this one out, everything seems fine). > > I am not sure what is going on in that routine > (and have to leave investigating this at the moment ...) 
> Maybe you can have a look at that routine > and spot anything which might be problematic? > (either 64 Bit, or the clock stuff???) I think the issue is that the state is an array of 624 unsigned longs. rk_seed() essentially just does 32-bit arithmetic and so fills those longs with numbers up to 2**32-1. rk_randomseed(), however, takes raw bits from /dev/urandom and drops them into the array. I believe that the algorithm is expecting 32-bit integers. I've applied a mask to the state after it gets seeded by /dev/urandom. Please checkout the latest version of randomkit.c from SVN and give it a try. We may need to make this more robust later, but for now it might work. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From charles.harris at sdl.usu.edu Sun Oct 23 20:29:48 2005 From: charles.harris at sdl.usu.edu (Charles Harris) Date: Sun, 23 Oct 2005 18:29:48 -0600 Subject: [SciPy-dev] random module not working on opteron References: <435B74E1.3050804@ucsd.edu> <435C2646.4090007@ucsd.edu> Message-ID: The mt19937 code on the author's site masks all the operations for 64 bit compatibility, although I suspect this is compiler dependent and shouldn't matter with gcc. But apparently there was a reason... ;) Also, device urandom delivers bytes, IIRC, like any other unix file read, so we need to read 624*sizeof(long) bytes to fill the array. Chuck -----Original Message----- From: scipy-dev-bounces at scipy.net on behalf of Robert Kern Sent: Sun 10/23/2005 6:09 PM To: SciPy Developers List Subject: Re: [SciPy-dev] random module not working on opteron Arnd Baecker wrote: > On Sun, 23 Oct 2005, Robert Kern wrote: > > >>Arnd Baecker wrote: > > > [...] > > >>>Can anyone with a 64 Bit machine confirm this behaviour? >>>What could one do to hunt this one down? >> >>I would first check if the RandomKit functions work correctly from C.
>>See the files randomkit.[ch] ; the functions should be self-explanatory. >>It's entirely possible that they're not 64-bit safe; I never had the >>opportunity to check. > > > Ok, referring to a famous scene of the Life of Brian: > after more than 5 years of not speaking C at all > you are stepping on my feet and I speak C again - a miracle! > (What a pleasure, no need to pay attention to white space, > I love all those variable declarations and ";"... Thank you ;-). > > Now to the facts: > - the code below works fine on my PIV > - on the opteron it gives problems and hangs: > > Value: 1791095845l max=4294967295l > Value: 0.997185 > Now with random seed: > Value: 920722385l max=4294967295l > Value: 915381373.352906 # <----- not ok! (rk_double) > Now test the distributions: > rk_uniform: 566004099.864036 # <----- not ok! > rk_normal: # (hangs) > > The problem is caused by calling rk_random_seed > (Commenting this one out, everything seems fine). > > I am not sure what is going on in that routine > (and have to leave investigating this at the moment ...) > Maybe you can have a look at that routine > and spot anything which might be problematic? > (either 64 Bit, or the clock stuff???) I think the issue is that the state is an array of 624 unsigned longs. rk_seed() essentially just does 32-bit arithmetic and so fills those longs with numbers up to 2**32-1. rk_randomseed(), however, takes raw bits from /dev/urandom and drops them into the array. I believe that the algorithm is expecting 32-bit integers. I've applied a mask to the state after it gets seeded by /dev/urandom. Please checkout the latest version of randomkit.c from SVN and give it a try. We may need to make this more robust later, but for now it might work. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter _______________________________________________ Scipy-dev mailing list Scipy-dev at scipy.net http://www.scipy.net/mailman/listinfo/scipy-dev From arnd.baecker at web.de Mon Oct 24 03:28:14 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Mon, 24 Oct 2005 09:28:14 +0200 (CEST) Subject: [SciPy-dev] random module not working on opteron In-Reply-To: <435C2646.4090007@ucsd.edu> References: <435C2646.4090007@ucsd.edu> Message-ID: On Sun, 23 Oct 2005, Robert Kern wrote: > I think the issue is that the state is an array of 624 unsigned longs. > rk_seed() essentially just does 32-bit arithmetic and so fills those > longs with numbers up to 2**32-1. rk_randomseed(), however, takes raw > bits from /dev/urandom and drops them into the array. I believe that the > algorithm is expecting 32-bit integers. > > I've applied a mask to the state after it gets seeded by /dev/urandom. > Please checkout the latest version of randomkit.c from SVN and give it a > try. We may need to make this more robust later, but for now it might work. I went straight to a full build (because that is done automatically via a script ;-): It looks good - scipy.test(10,10) runs through on the opteron the first time after quite a while! Ran 569 tests in 31.891s FAILED (failures=14, errors=30) Out[3]: Thanx to the master detectives! Best, Arnd P.S.: do you want to see the results of scipy.test(10,10) or is the number of failures and errors the same you get on other platforms? From nwagner at mecha.uni-stuttgart.de Mon Oct 24 08:10:28 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 24 Oct 2005 14:10:28 +0200 Subject: [SciPy-dev] xplt in newscipy Message-ID: <435CCF34.1030301@mecha.uni-stuttgart.de> Hi all, Is it still possible to use xplt in newscipy ? It is faster than matplotlib.
Nils From rkern at ucsd.edu Mon Oct 24 08:20:27 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 24 Oct 2005 05:20:27 -0700 Subject: [SciPy-dev] xplt in newscipy In-Reply-To: <435CCF34.1030301@mecha.uni-stuttgart.de> References: <435CCF34.1030301@mecha.uni-stuttgart.de> Message-ID: <435CD18B.3070203@ucsd.edu> Nils Wagner wrote: > Hi all, > > Is it still possible to use xplt in newscipy ? > It is faster than matplotlib. It's temporarily in scipy.sandbox.xplt . If I have anything to say about it, it won't return as scipy.xplt . The people who want to take over maintenance of xplt should find a home for it that's not scipy. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From schofield at ftw.at Mon Oct 24 08:27:29 2005 From: schofield at ftw.at (Ed Schofield) Date: Mon, 24 Oct 2005 14:27:29 +0200 Subject: [SciPy-dev] Sparse matrix module Message-ID: <435CD331.3070007@ftw.at> Robert Cimrman wrote: > Ed Schofield wrote: >> I wrote in the SVN change log: >> >> This is the beginning of a merge of Roman Geus's PySparse into scipy. >> The goals are to make it more 'Pythonic', integrate sparse matrix types >> into scipy so they act similarly to dense arrays, and to support >> a few more sparse data types, particularly CSC, in a nice OO hierarchy. > > > Does this basically mean to make Roman's spmatrix extension type also > a subclass of the original Travis' spmatrix? Or would you prefer to > keep scipy-ized PySparse separately and ultimately (merge &) replace > the original? > ... > > Here is a little summary of the situation as I see it now: > > existing module (TravisSparse): > - Python spmatrix class + a number of Python subclasses with a > possible fortran/C underlying implementation > - pros: trivial addition of new methods / attributes, speeding up > could be done "later"; handles (or will handle :-) all numeric scipy > types (ints, floats, doubles, ...)
automagically > - cons: speed in some situations? > > experimental module (RomanSparse) (random remarks, since I know nuts > about it): > - C extension class > - pros: constructor speed? and speed in general? > - cons: not so easy to change the low-level implementation, 'cause > it's on that level already?; the solvers (umfpack etc.) are > hand-wrapped - I would certainly prefer a generated interface (swig?) > which would be AFAIK much more flexible. > > I am personally in favor of the TravisSparse approach as the base with > RomanSparse subclass of spmatrix with a key feature "*speed* *speed* > *speed*". Also the _solvers_ should be split as much as practical from > the sparse matrix _type_. Of course, having some 'recommended format > hinting' (e.g. CSR for umfpack), that would tell the user "use this if > you want speed and to avoid implicit matrix conversion" would be > necessary. > Yes, I agree that a Python implementation would be simpler and more flexible than one in C. Perhaps our goal should be to build on the existing TravisSparse module but replace the sparsetools Fortran code with code from PySparse, which is probably better tested and debugged. I've had a look at how objects with both C and Python member functions are possible in the Python standard library. An example is the random module: there's a randommodule.c file that is imported in the Python random.py file and used as a base class for Python objects: import _random class Random(_random.Random): Then the functions exported by the C module are wrapped like this: def seed(self, a=None): super(Random, self).seed(a) Why don't we adopt a similar structure? Then we'd use Travis's spmatrix as the base class, and derive csr_matrix from both spmatrix and an object we import from C. Or is it possible for a C module to derive directly from a Python class?! 
-- Ed From nwagner at mecha.uni-stuttgart.de Mon Oct 24 09:25:49 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 24 Oct 2005 15:25:49 +0200 Subject: [SciPy-dev] NameError: global name 'hasattry' is not defined Message-ID: <435CE0DD.5060903@mecha.uni-stuttgart.de> w,vr = linalg.eig(Kd,Gd) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/decomp.py", line 120, in eig return _geneig(a1,b,left,right,overwrite_a,overwrite_b) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/decomp.py", line 52, in _geneig overwrite_b = overwrite_b or (b1 is not b and not hasattry(b,'__array__')) NameError: global name 'hasattry' is not defined From oliphant at ee.byu.edu Mon Oct 24 10:48:19 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 24 Oct 2005 08:48:19 -0600 Subject: [SciPy-dev] random module not working on opteron In-Reply-To: References: <435C2646.4090007@ucsd.edu> Message-ID: <435CF433.50603@ee.byu.edu> Arnd Baecker wrote: >On Sun, 23 Oct 2005, Robert Kern wrote: > > > >>I think the issue is that the state is an array of 624 unsigned longs. >>rk_seed() essentially just does 32-bit arithmetic and so fills those >>longs with numbers up to 2**32-1. rk_randomseed(), however, takes raw >>bits from /dev/urandom and drops them into the array. I believe that the >>algorithm is expecting 32-bit integers. >> >>I've applied a mask to the state after it gets seeded by /dev/urandom. >>Please checkout the latest version of randomkit.c from SVN and give it a >>try. We may need to make this more robust later, but for now it might work. >> >> > >I went straight to a full build (because that is done automatically >via a script ;-): > >It looks good - scipy.test(10,10) runs through on the opteron >the first time after quite a while! > >Ran 569 tests in 31.891s > >FAILED (failures=14, errors=30) >Out[3]: > > The build logs are still useful (I recently added the remaining packages to the build -- but they need 64-bit cleaning).
Also, I'm not getting that many errors and failures. So, the details would be helpful. -Travis From oliphant at ee.byu.edu Mon Oct 24 11:03:17 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 24 Oct 2005 09:03:17 -0600 Subject: [SciPy-dev] Sparse matrix module In-Reply-To: <435CD331.3070007@ftw.at> References: <435CD331.3070007@ftw.at> Message-ID: <435CF7B5.7050707@ee.byu.edu> Ed Schofield wrote: >Robert Cimrman wrote: > > >>Ed Schofield wrote: >> >> > > > >>>I wrote in the SVN change log: >>> >>>This is the beginning of a merge of Roman Geus's PySparse into scipy. >>>The goals are to make it more 'Pythonic', integrate sparse matrix types >>>into scipy so they act similarly to dense arrays, and to support >>>a few more sparse data types, particularly CSC, in a nice OO hierarchy. >>> >>> >>Does this basically mean to make Roman's spmatrix extension type also >>a subclass of the original Travis' spmatrix? Or would you prefer to >>keep scipy-ized PySparse separately and ultimately (merge &) replace >>the original? >>... >> >>Here is a little summary of the situation as I see it now: >> >>existing module (TravisSparse): >>- Python spmatrix class + a number of Python subclasses with a >>possible fortran/C underlying implementation >>- pros: trivial addition of new methods / attributes, speeding up >>could be done "later"; handles (or will handle :-) all numeric scipy >>types (ints, floats, doubles, ...) automagically >>- cons: speed in some situations? >> >>experimental module (RomanSparse) (random remarks, since I know nuts >>about it): >>- C extension class >>- pros: constructor speed? and speed in general? >>- cons: not so easy to change the low-level implementation, 'cause >>it's on that level already?; the solvers (umfpack etc.) are >>hand-wrapped - I would certainly prefer a generated interface (swig?) >>which would be AFAIK much more flexible.
>> >>I am personally in favor of the TravisSparse approach as the base with >>RomanSparse subclass of spmatrix with a key feature "*speed* *speed* >>*speed*". Also the _solvers_ should be split as much as practical from >>the sparse matrix _type_. Of course, having some 'recommended format >>hinting' (e.g. CSR for umfpack), that would tell the user "use this if >>you want speed and to avoid implicit matrix conversion" would be >>necessary. >> >> This is already the case. Currently the CSC type is used for SuperLU decomposition, and any matrix with a matvec method can be used with the iterative approaches. >> >> >Yes, I agree that a Python implementation would be simpler and more >flexible than one in C. Perhaps our goal should be to build on the >existing TravisSparse module but replace the sparsetools Fortran code >with code from PySparse, which is probably better tested and debugged. > > Let's not throw the baby out with the bath water :-) I spent a bit of time checking that code, and I don't know that PySparse code has the same functionality. If speed is the issue, then we first need to understand what is actually "slowing" us down. We are basing things off a few comments. My suspicion is that speed improvements will be seen by using linked-lists for sparse matrix construction and perhaps a faster back-end for matrix factorization (I'm not sure how umfpack compares to SuperLU).
However, you could take the typeobjects that PySparse defines and use them as mix-in classes, so that the Python csc_matrix class inherited from both the spmatrix base-class and the C-based class. >Why don't we adopt a similar structure? Then we'd use Travis's spmatrix >as the base class, and derive csr_matrix from both spmatrix and an >object we import from C. > I see that you thought of it already. >Or is it possible for a C module to derive >directly from a Python class?! > > This is possible, but not particularly easy. Let's solidify the desired structure and find out where the bottlenecks actually are before moving too much to C. -Travis From arnd.baecker at web.de Mon Oct 24 14:29:56 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Mon, 24 Oct 2005 20:29:56 +0200 (CEST) Subject: [SciPy-dev] random module not working on opteron In-Reply-To: <435CF433.50603@ee.byu.edu> References: <435CF433.50603@ee.byu.edu> Message-ID: On Mon, 24 Oct 2005, Travis Oliphant wrote: > The buld logs are still useful (I recently added the remaining packages > to the build -- but they need 64-bit cleaning). > > Also, I'm not getting that many errors and failures. So, the details > would be helpful. There are some warnings concerning pointers. 
A cat nohup.out | grep -A 5 -B 5 "incompatible pointer" gives: gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' creating build/temp.linux-x86_64-2.4/Lib/io compile options: '-I$BUILDDIR/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: Lib/io/numpyiomodule.c Lib/io/numpyiomodule.c: In function `numpyio_tofile': Lib/io/numpyiomodule.c:282: warning: passing arg 1 of pointer to function from incompatible pointer type Lib/io/numpyiomodule.c: In function `numpyio_convert_objects': Lib/io/numpyiomodule.c:743: warning: passing arg 2 of pointer to function from incompatible pointer type gcc -pthread -shared build/temp.linux-x86_64-2.4/Lib/io/numpyiomodule.o -Lbuild/temp.linux-x86_64-2.4 -o build/lib.linux-x86_64-2.4/scipy/io/numpyio.so building 'scipy.fftpack._fftpack' extension compiling C sources gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' creating build/temp.linux-x86_64-2.4/build/src/Lib -- from $BUILDDIR/newscipy/Lib/signal/firfilter.c:1: /usr/include/features.h:190:1: warning: this is the location of the previous definition $BUILDDIR/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include/scipy/__multiarray_api.h:699: warning: 'import_array' defined but not used gcc: $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c: In function `fill_buffer': $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1557: warning: initialization from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1558: warning: initialization from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c: In function `PyArray_OrderFilterND': $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1674: warning: passing arg 2 of pointer to function from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1707: warning: passing arg 3 of `compute_offsets' from 
incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1707: warning: passing arg 4 of `compute_offsets' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1707: warning: passing arg 5 of `compute_offsets' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1759: warning: passing arg 3 of `increment' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c: In function `Py_copy_info': $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1798: warning: assignment from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1800: warning: assignment from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c: In function `sigtools_convolve2d': $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1997: warning: passing arg 2 of `pylab_convolve_2d' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1997: warning: passing arg 4 of `pylab_convolve_2d' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1997: warning: passing arg 6 of `pylab_convolve_2d' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1997: warning: passing arg 7 of `pylab_convolve_2d' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:1997: warning: passing arg 8 of `pylab_convolve_2d' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c: In function `sigtools_linear_filter': $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:2094: warning: passing arg 2 of pointer to function from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:2098: warning: passing arg 2 of pointer to function from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c: In function `sigtools_median2d': $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:2307: warning: passing arg 2 of pointer to function from incompatible pointer type 
$BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:2316: warning: passing arg 4 of `b_medfilt2' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:2319: warning: passing arg 4 of `f_medfilt2' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.c:2322: warning: passing arg 4 of `d_medfilt2' from incompatible pointer type gcc -pthread -shared build/temp.linux-x86_64-2.4$BUILDDIR/newscipy/Lib/signal/sigtoolsmodule.o build/temp.linux-x86_64-2.4$BUILDDIR/newscipy/Lib/signal/firfilter.o build/temp.linux-x86_64-2.4$BUILDDIR/newscipy/Lib/signal/medianfilter.o -Lbuild/temp.linux-x86_64-2.4 -o build/lib.linux-x86_64-2.4/scipy/signal/sigtools.so building 'scipy.signal.spline' extension compiling C sources gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' compile options: '-I$BUILDDIR/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: $BUILDDIR/newscipy/Lib/signal/C_bspline_util.c gcc: $BUILDDIR/newscipy/Lib/signal/splinemodule.c $BUILDDIR/newscipy/Lib/signal/splinemodule.c: In function `cspline2d': $BUILDDIR/newscipy/Lib/signal/splinemodule.c:84: warning: passing arg 2 of pointer to function from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/splinemodule.c:89: warning: passing arg 1 of `convert_strides' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/splinemodule.c: In function `qspline2d': $BUILDDIR/newscipy/Lib/signal/splinemodule.c:143: warning: passing arg 2 of pointer to function from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/splinemodule.c:148: warning: passing arg 1 of `convert_strides' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/splinemodule.c: In function `FIRsepsym2d': $BUILDDIR/newscipy/Lib/signal/splinemodule.c:200: warning: passing arg 2 of pointer to function from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/splinemodule.c:205: warning: passing arg 1 of 
`convert_strides' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/splinemodule.c: In function `IIRsymorder1': $BUILDDIR/newscipy/Lib/signal/splinemodule.c:308: warning: passing arg 2 of pointer to function from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/splinemodule.c:312: warning: passing arg 1 of `convert_strides' from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/splinemodule.c: In function `IIRsymorder2': $BUILDDIR/newscipy/Lib/signal/splinemodule.c:428: warning: passing arg 2 of pointer to function from incompatible pointer type $BUILDDIR/newscipy/Lib/signal/splinemodule.c:432: warning: passing arg 1 of `convert_strides' from incompatible pointer type gcc: $BUILDDIR/newscipy/Lib/signal/S_bspline_util.c gcc: $BUILDDIR/newscipy/Lib/signal/bspline_util.c gcc: $BUILDDIR/newscipy/Lib/signal/D_bspline_util.c gcc: $BUILDDIR/newscipy/Lib/signal/Z_bspline_util.c gcc -pthread -shared build/temp.linux-x86_64-2.4$BUILDDIR/newscipy/Lib/signal/splinemodule.o build/temp.linux-x86_64-2.4$BUILDDIR/newscipy/Lib/signal/S_bspline_util.o build/temp.linux-x86_64-2.4$BUILDDIR/newscipy/Lib/signal/D_bspline_util.o build/temp.linux-x86_64-2.4$BUILDDIR/newscipy/Lib/signal/C_bspline_util.o build/temp.linux-x86_64-2.4$BUILDDIR/newscipy/Lib/signal/Z_bspline_util.o build/temp.linux-x86_64-2.4$BUILDDIR/newscipy/Lib/signal/bspline_util.o -Lbuild/temp.linux-x86_64-2.4 -o build/lib.linux-x86_64-2.4/scipy/signal/spline.so Further warnings ================ These should be observable on any platform. 
- _configtest.c: the usual warnings - gcc: scipy/corelib/mtrand/mtrand.c scipy/corelib/mtrand/mtrand.c: In function `__pyx_tp_new_6mtrand_RandomState': scipy/corelib/mtrand/mtrand.c:4908: warning: unused variable `p' - warning: callstatement is defined without callprotoargument y,t,istate = dvode(f,jac,y,t,tout,rtol,atol,itask,istate, rwork,iwork,mf, [f_extra_args,jac_extra_args,overwrite_y]) Wrote C/API module "vode" to file "build/src/Lib/integrate/vodemodule.c" - many "might be used uninitialized" (I don't like that, but OTOH it means to look into oldish Fortran code. Not sure if this is worth the effort ..) Concerns: dfftpack,quadpack,odepack,amos,toms,cdflib, specfun,minpack,fitpack, ... - Lib/special/cephes/sincos.c:232: warning: conflicting types for built-in function 'sincos' - SuperLU seems to be in bad shape: *many* implicit declarations like: Lib/sparse/SuperLU/SRC/clacon.c:150: warning: implicit declaration of function `ccopy_' many "suggest parentheses around assignment used as truth value" - several warnings in: build/src/Lib/linalg/iterative/BiCGREVCOM.f: In subroutine `sbicgrevcom': build/src/Lib/linalg/iterative/BiCGREVCOM.f:135: warning: IF (RLBL .eq. 2) GOTO 2 1 build/src/Lib/linalg/iterative/BiCGREVCOM.f:239: (continued): 2 CONTINUE 2 Reference to label at (1) is outside block containing definition at (2) Best, Arnd From schofield at ftw.at Mon Oct 24 17:26:31 2005 From: schofield at ftw.at (Ed Schofield) Date: Mon, 24 Oct 2005 23:26:31 +0200 Subject: [SciPy-dev] Sparse module update Message-ID: <435D5187.8080509@ftw.at> I've been working on the sparse matrix module. Now all but three tests pass on my machine (x86 Linux). Could someone please look at the distutils setup for the sparse module, especially the sparsetools Fortran code? I'm not sure I'm doing the right thing.
I needed to change a line in the test module from: from sparse import csc_matrix, csr_matrix, dok_matrix to from scipy.sparse import csc_matrix, csr_matrix, dok_matrix -- Ed From cimrman3 at ntc.zcu.cz Tue Oct 25 04:39:27 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 25 Oct 2005 10:39:27 +0200 Subject: [SciPy-dev] Sparse module update In-Reply-To: <435D5187.8080509@ftw.at> References: <435D5187.8080509@ftw.at> Message-ID: <435DEF3F.4000604@ntc.zcu.cz> Ed Schofield wrote: > I've been working on the sparse matrix module. Now all but three tests > pass on my machine (x86 Linux). Great work! After a small fix all tests pass now. Still there remains the strange error I have reported earlier with the subject "CS* constructor error"... To sum it up again, passing a single row to CSC and a single column to the CSR constructors does not work - a suspect for me is f2py. r. From cimrman3 at ntc.zcu.cz Tue Oct 25 04:56:54 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 25 Oct 2005 10:56:54 +0200 Subject: [SciPy-dev] Sparse matrix module In-Reply-To: <435CF7B5.7050707@ee.byu.edu> References: <435CD331.3070007@ftw.at> <435CF7B5.7050707@ee.byu.edu> Message-ID: <435DF356.9060702@ntc.zcu.cz> Travis Oliphant wrote: > Ed Schofield wrote: > > >>Robert Cimrman wrote: >>> ... >>>I am personally in favor of the TravisSparse approach as the base with >>>RomanSparse subclass of spmatrix with a key feature "*speed* *speed* >>>*speed*". Also the _solvers_ should be split as much as practical from >>>the sparse matrix _type_. Of course, having some 'recommended format >>>hinting' (e.g. CSR for umfpack), that would tell the user "use this if >>>you want speed and to avoid implicit matrix conversion" would be >>>necessary. >>> >>> > > This is already the case. Currently the CSC type is used for SuperLU > decomposition, and any matrix with a matvec method can be used with the > iterative approaches. Great. (In my remark I was talking about "RomanSparse"...) 
>>Yes, I agree that a Python implementation would be simpler and more >>flexible than one in C. Perhaps our goal should be to build on the >>existing TravisSparse module but replace the sparsetools Fortran code >>with code from PySparse, which is probably better tested and debugged. >> >> > > Let's not throw the baby out with the bath water :-) I spent a bit of > time checking that code, and I don't know that PySparse code has the > same functionality. It does not - it is tied to 'double' (IMHO). On the other hand it does have some hand-tailored optimizations in its routines. But I agree with you that we should see first where the real bottlenecks are - the premature optimization is evil :-) > If speed is the issue, then we first need to understand what is actually > "slowing" us down. We are basing things off a few comments. > > My suspicion is that speed improvements will be seen by using > linked-lists for sparse matrix construction and perhaps a faster > back-end for matrix factorization (I'm not sure how umfpack compares to > SuperLU). Yes, this is the point where I see great application for the LL matrix of PySparse. > We could do this, but I'm really not sure of the improvement. The > problem is that the different sparse matrices need such different > structures underneath that there is not much benefit to having the base > class in C. +1 r.
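The build-with-lists-then-convert workflow discussed in this thread can be sketched in a few lines of Python. This is a toy illustration of the idea, not PySparse's LL matrix or any real scipy code; the lil_matrix name and its methods are made up for the example:

```python
# Toy sketch of linked-list (row-list) sparse construction: cheap
# per-entry inserts while building, then a one-time conversion to the
# flat CSR arrays (data, indices, indptr) that solvers prefer.

class lil_matrix(object):
    def __init__(self, nrows, ncols):
        self.shape = (nrows, ncols)
        self.rows = [[] for _ in range(nrows)]  # sorted (col, value) pairs

    def __setitem__(self, key, value):
        i, j = key
        row = self.rows[i]
        for k, (col, _) in enumerate(row):
            if col == j:
                row[k] = (j, value)        # overwrite an existing entry
                return
            if col > j:
                row.insert(k, (j, value))  # keep columns sorted
                return
        row.append((j, value))

    def tocsr(self):
        # Flatten the row lists into the standard CSR triple.
        data, indices, indptr = [], [], [0]
        for row in self.rows:
            for col, value in row:
                indices.append(col)
                data.append(value)
            indptr.append(len(data))
        return data, indices, indptr
```

The point of the design is that insertion order does not matter during assembly (each row list stays sorted), while the conversion cost is paid once, after which arithmetic can run over contiguous arrays.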
From dd55 at cornell.edu Tue Oct 25 11:31:06 2005 From: dd55 at cornell.edu (Darren Dale) Date: Tue, 25 Oct 2005 11:31:06 -0400 Subject: [SciPy-dev] test results Message-ID: <200510251131.06346.dd55@cornell.edu> I just updated from svn, and get the following message when I import: from scipy import * Importing io to scipy Importing special to scipy Importing fftpack to scipy Importing cluster to scipy Importing sparse to scipy Importing signal to scipy Failed to import signal cannot import name comb Importing utils to scipy Importing interpolate to scipy Importing integrate to scipy Importing optimize to scipy Importing linalg to scipy Here are my test results. My system is a P4, gentoo linux with fftw-2.1.5, blas/lapack/atlas version 3.6, and gcc-3.4.4. The last three errors are caused by signbit returning False for any number on my system, positive or negative. I have tested the libc signbit macro and it is working properly, though my example has to be compiled with ANSI C to gain access to the macro. Others here have not been able to reproduce this signbit error, and I am unsure how to proceed. 
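One way to cross-check the reported behaviour without compiling anything is to read the IEEE-754 sign bit of a double directly from Python. The signbit function below is a local helper built on the struct module, not the C99 macro, so it only shows what the platform's float representation contains:

```python
import struct

def signbit(x):
    """Return True if the IEEE-754 sign bit of the double x is set.

    Pure-Python helper (not libc's signbit macro): pack the value as a
    little-endian double and test the top bit of the last byte.
    """
    packed = struct.pack('<d', x)
    return (ord(packed[7:8]) & 0x80) != 0
```

If this reports signs correctly (including for -0.0) while the compiled check does not, the problem is likely in how the C macro is reached on that toolchain rather than in the floating-point representation itself.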
Darren ====================================================================== ERROR: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 88, in check_definition assert_array_almost_equal(diff(sin(x)),direct_diff(sin(x))) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 48, in diff if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_expr (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-pakages/scipy/fftpack/tests/test_pseudo_diffs.py", line 131, in check_expr d1 = diff(f) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 48, in diff if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_expr_large (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 145, in check_expr_large assert_array_almost_equal(diff(f),df) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 48, in diff if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_int (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback 
(most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 153, in check_int assert_array_almost_equal(diff(sin(x),-1),-cos(x)) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 48, in diff if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_period (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 109, in check_period assert_array_almost_equal(diff(sin(2*pi*x),period=1), File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 48, in diff if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_random_even (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 165, in check_random_even f = diff(diff(f,1),-1) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 48, in diff if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_random_odd (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 177, in check_random_odd assert_array_almost_equal(diff(diff(f,k),-k),f) File 
"/usr/lib/python2.4/site-packages/scipy_test/testing.py", line 675, in assert_array_almost_equal reduced = ravel(equal(less_equal(around(abs(x-y),decimal),10.0**(-decimal)),1)) TypeError: unsupported operand type(s) for -: 'str' and 'float' ====================================================================== ERROR: check_sin (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 117, in check_sin assert_array_almost_equal(diff(sin(x)),cos(x)) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 48, in diff if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_zero_nyquist (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_diff) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 187, in check_zero_nyquist f = diff(diff(f,1),-1) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 48, in diff if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 302, in check_definition y = hilbert(sin(x)) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 175, in hilbert if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined 
====================================================================== ERROR: check_random_even (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 333, in check_random_even f = diff(diff(f,1),-1) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 48, in diff if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_random_odd (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 324, in check_random_odd assert_array_almost_equal(ihilbert(hilbert(f)),f) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 175, in hilbert if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_tilbert_relation (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_hilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 312, in check_tilbert_relation y = hilbert(f) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 175, in hilbert if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_ihilbert) 
---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 371, in check_definition y = ihilbert(sin(x)) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 204, in ihilbert return -hilbert(x) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 175, in hilbert if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_itilbert_relation (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_ihilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 381, in check_itilbert_relation y = ihilbert(f) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 204, in ihilbert return -hilbert(x) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 175, in hilbert if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_itilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 289, in check_definition y = itilbert(sin(x),h) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 136, in itilbert if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_shift) 
---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 393, in check_definition assert_array_almost_equal(shift(sin(x),a),direct_shift(sin(x),a)) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 404, in shift if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_definition (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 228, in check_definition y = tilbert(sin(x),h) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 100, in tilbert if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_random_even (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 243, in check_random_even assert_array_almost_equal(direct_tilbert(direct_itilbert(f,h),h),f) File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 61, in direct_itilbert w = 1j*tanh(w) AttributeError: 'float64_arrtype' object has no attribute 'tanh' ====================================================================== ERROR: check_random_odd (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) ---------------------------------------------------------------------- Traceback (most recent call last): File 
"/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", line 252, in check_random_odd assert_array_almost_equal(itilbert(tilbert(f,h),h),f) File "/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", line 100, in tilbert if tmp.dtype in (scipy.Complex32, scipy.Complex64): NameError: global name 'scipy' is not defined ====================================================================== ERROR: check_djbfft (scipy.fftpack.basic.test_basic.test_irfft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 361, in check_djbfft y1 = ifft2(x1) File "/usr/lib/python2.4/site-packages/Numeric/FFT/FFT.py", line 102, in inverse_fft a = Numeric.asarray(a).astype(Numeric.Complex) File "/usr/lib/python2.4/site-packages/Numeric/Numeric.py", line 134, in asarray return multiarray.array(a, typecode, copy=0, savespace=savespace) TypeError: only length-1 arrays can be converted to Python scalars ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 101, in check_definition assert_array_almost_equal(y,y1) File "/usr/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 2.0000000e+01 +0.0000000e+00j 0.0000000e+00 +0.0000000e+00j -4.0000000e+00 +4.0000000e+00j 0.0000000e+00 +0.000... Array 2: [ 20. +3.j -0.7071068+0.7071068j -7. +4.j -0.7071068-0.7071068j -4. -3.j 0.707106... 
====================================================================== FAIL: check_n_argument_complex (scipy.fftpack.basic.test_basic.test_fft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 118, in check_n_argument_complex assert_array_almost_equal(y[0],direct_dft(x1)) File "/usr/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 1.0000000e+01+0.j -2.0000000e+00+2.j -2.0000000e+00+0.j -2.0000000e+00-2.j] Array 2: [ 10.+1.j -3.+2.j -2.-1.j -1.-2.j] ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 419, in check_definition assert_array_almost_equal(y,direct_dftn(x)) File "/usr/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 22.2222222222%): Array 1: [[ 4.5000000e+01+0.j -4.5000000e+00+2.5980762j -4.5000000e+00-2.5980762j] [ -1.3500000e+01+7.7942286j 0.0... Array 2: [[ 4.5000000e+01+0.j -4.5000000e+00+2.5980762j -4.5000000e+00-2.5980762j] [ -1.3500000e+01+0.j 0.0... 
====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 184, in check_definition assert_array_almost_equal(y,y1) File "/usr/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 2.5000000e+00 +0.0000000e+00j 0.0000000e+00 -0.0000000e+00j -5.0000000e-01 -5.0000000e-01j 0.0000000e+00 -0.000... Array 2: [ 2.5 +0.375j 0.0883883+0.0883883j -0.125 -0.5j 0.0883883-0.0883883j -0.5 -0.375j -0.0883883-0.0... ====================================================================== FAIL: check_random_real (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 218, in check_random_real assert_array_almost_equal (ifft(fft(x)),x) File "/usr/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 98.0392156863%): Array 1: [ 2.7010335e-01 +0.0000000e+00j 1.5051924e-01 +1.9549653e-17j 3.9770871e-01 -4.2662293e-17j 5.5451658e-01 -5.476... Array 2: [ 0.2701033 0.0145507 0.5296276 0.2231588 0.6244757 0.1734576 0.4209751 0.6859024 0.888794 0.3217651 0.42293... 
====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 586, in check_definition assert_array_almost_equal(y,direct_idftn(x)) File "/usr/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 22.2222222222%): Array 1: [[ 5.0000000e+00+0.j -5.0000000e-01-0.2886751j -5.0000000e-01+0.2886751j] [ -1.5000000e+00-0.8660254j 0.0... Array 2: [[ 5.0000000e+00+0.j -5.0000000e-01-0.2886751j -5.0000000e-01+0.2886751j] [ -1.5000000e+00+0.j 0.0... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_irfft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 338, in check_definition assert_array_almost_equal(y,ifft(x1)) File "/usr/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 50.0%): Array 1: [ 2.625 -1.6856602 -0.375 -1.1856602 0.625 0.4356602 -0.375 0.9356602] Array 2: [ 2.6250000e+00+0.j -3.7500000e-01-0.j -3.7500000e-01-0.j -3.7500000e-01-0.j 6.2500000e-01+0.j -3.7500000e-01+0.... 
====================================================================== FAIL: check_generic (scipy.base.type_check.test_type_check.test_isneginf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 193, in check_generic assert(vals[0] == 1) AssertionError ====================================================================== FAIL: check_generic (scipy.base.type_check.test_type_check.test_isposinf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 186, in check_generic assert(vals[0] == 0) AssertionError ====================================================================== FAIL: check_generic (scipy.base.type_check.test_type_check.test_nan_to_num) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 200, in check_generic assert_all(vals[0] < -1e10) and assert_all(isfinite(vals[0])) File "/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", line 12, in assert_all assert(all(x)), x AssertionError: False ---------------------------------------------------------------------- Ran 550 tests in 1.718s FAILED (failures=10, errors=21) From arnd.baecker at web.de Tue Oct 25 11:37:59 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Tue, 25 Oct 2005 17:37:59 +0200 (CEST) Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <1129157213.4839.5.camel@E011704> References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> Message-ID: On Wed, 12 Oct 2005, Charles R Harris wrote: > On Wed, 2005-10-12 at 16:33 -0600, 
Fernando Perez wrote:
> > Travis Oliphant wrote:
> >
> > >>With all that, my vote on Travis's specific question: if conversion of
> > >>an N-bit integer in scipy_core is required, it gets converted to an
> > >>N-bit float. The only cases in which precision will be lost are if the
> > >>integer is large enough to require more than (N-e) bits for its
> > >>representation, where e is the number of bits in the exponent of the
> > >>floating point representation.
> > >
> > > Yes, it is only for large integers that problems arise. I like this
> > > scheme and it would be very easy to implement, and it would provide a
> > > consistent interface.
> > >
> > > The only problem is that it would mean that on current 32-bit systems
> > > sqrt(2) would cast 2 to a "single-precision" float and return a
> > > single-precision result.
> > >
> > > If that is not a problem, then great...
> > >
> > > Otherwise, a more complicated (and less consistent) rule like
> > >
> > >     integer    float
> > >     ==================
> > >     8-bit      32-bit
> > >     16-bit     32-bit
> > >     32-bit     64-bit
> > >     64-bit     64-bit
> > >
> > > would be needed (this is also not too hard to do).
> >
> > Here's a different way to think about this issue: instead of thinking in
> > terms of bit-width, let's look at it in terms of exact vs inexact
> > numbers. Integers are exact, and their bit size only impacts the range
> > of them which is representable.
> >
> > If we look at it this way, then it seems to me justifiable to suggest
> > that sqrt(2) would upcast to the highest-available precision floating
> > point format. Obviously this can have an enormous memory impact if we're
> > talking about a big array of numbers instead of sqrt(2), so I'm not 100%
> > sure it's the right solution. However, I think that the rule 'if you
> > apply "floating point" operations to integer inputs, the system will
> > upcast the integers to give you as much precision as possible' is a
> > reasonable one.
Users needing tight
> > memory control could always first convert their small integers to the
> > smallest existing floats, and then operate on that.
>
> I think it is a good idea to keep double as the default, if only because
> Python expects it. If someone needs more control over the precision of
> arrays, why not do as C does and add functions sqrtf and sqrtl?

I also think that double should be kept as the default. If I understand
things correctly, both normal Python and all the libraries for scipy can
only deal with that at the moment. The need for long double precision (and
even multiple-precision arithmetic) does arise in some situations, but I am
not sure it will be the default in the near future.

Still, it would be great if there were a `long double` version of scipy on
those platforms which support it natively. This would require long double
versions of the basic math/cmath functions, of cephes (and all other
routines from scipy.special), of fft, ATLAS, root finding, etc. This would
require major work, I fear, as for example several constants are hard-coded
to work for double precision and nothing else.

Does this mean that one would need a ``parallel`` installation of
scipy_long_double to do

    import scipy_long_double as scipy

to perform all computations using `long double` (possibly after some
modifications to the array declarations)?

If double precision is kept as the default, a conversion of a large integer
would raise an OverflowError as it is done right now.
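The promotion table quoted earlier in this thread is easy to express directly. The sketch below (hypothetical names, not scipy_core code) also shows the precision-loss boundary for float64, where only integers of magnitude up to 2**53 survive the conversion exactly:

```python
# Sketch of the quoted "less consistent" promotion rule, plus a check
# of when an integer survives the int -> float round trip exactly.
PROMOTION = {8: 32, 16: 32, 32: 64, 64: 64}   # integer bits -> float bits

def promote(int_bits):
    """Float width the quoted table assigns to an integer of this width."""
    return PROMOTION[int_bits]

def exact_in_float64(n):
    # float64 carries a 53-bit significand, so integers of magnitude
    # up to 2**53 convert without precision loss.
    return int(float(n)) == n

print(promote(32))                  # 64: a 32-bit int becomes a 64-bit float
print(exact_in_float64(2**53))      # True
print(exact_in_float64(2**53 + 1))  # False: needs more than 53 bits
```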
Best, Arnd From nwagner at mecha.uni-stuttgart.de Tue Oct 25 12:38:19 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Tue, 25 Oct 2005 18:38:19 +0200 Subject: [SciPy-dev] test results In-Reply-To: <200510251131.06346.dd55@cornell.edu> References: <200510251131.06346.dd55@cornell.edu> Message-ID: On Tue, 25 Oct 2005 11:31:06 -0400 Darren Dale wrote: > I just updated from svn, and get the following message >when I import: > > from scipy import * > Importing io to scipy > Importing special to scipy > Importing fftpack to scipy > Importing cluster to scipy > Importing sparse to scipy > Importing signal to scipy >Failed to import signal > cannot import name comb > Importing utils to scipy > Importing interpolate to scipy > Importing integrate to scipy > Importing optimize to scipy > Importing linalg to scipy > > Here are my test results. My system is a P4, gentoo >linux with fftw-2.1.5, > blas/lapack/atlas version 3.6, and gcc-3.4.4. > > The last three errors are caused by signbit returning >False for any number on > my system, positive or negative. I have tested the libc >signbit macro and it > is working properly, though my example has to be >compiled with ANSI C to gain > access to the macro. Others here have not been able to >reproduce this signbit > error, and I am unsure how to proceed. 
> [... quoted test output snipped; the full tracebacks appear verbatim in Darren's original message above ...]
File > "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", > line 243, in check_random_even > assert_array_almost_equal(direct_tilbert(direct_itilbert(f,h),h),f) > File > "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", > line 61, in direct_itilbert > w = 1j*tanh(w) > AttributeError: 'float64_arrtype' object has no >attribute 'tanh' > > ====================================================================== > ERROR: check_random_odd > (scipy.fftpack.pseudo_diffs.test_pseudo_diffs.test_tilbert) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_pseudo_diffs.py", > line 252, in check_random_odd > assert_array_almost_equal(itilbert(tilbert(f,h),h),f) > File >"/usr/lib/python2.4/site-packages/scipy/fftpack/pseudo_diffs.py", >line > 100, in tilbert > if tmp.dtype in (scipy.Complex32, scipy.Complex64): > NameError: global name 'scipy' is not defined > > ====================================================================== > ERROR: check_djbfft >(scipy.fftpack.basic.test_basic.test_irfft) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File >"/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", > line 361, in check_djbfft > y1 = ifft2(x1) > File >"/usr/lib/python2.4/site-packages/Numeric/FFT/FFT.py", >line 102, in > inverse_fft > a = Numeric.asarray(a).astype(Numeric.Complex) > File >"/usr/lib/python2.4/site-packages/Numeric/Numeric.py", >line 134, in > asarray > return multiarray.array(a, typecode, copy=0, >savespace=savespace) > TypeError: only length-1 arrays can be converted to >Python scalars > > ====================================================================== >FAIL: check_definition >(scipy.fftpack.basic.test_basic.test_fft) > ---------------------------------------------------------------------- > 
Traceback (most recent call last): > File >"/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", > line 101, in check_definition > assert_array_almost_equal(y,y1) > File >"/usr/lib/python2.4/site-packages/scipy/test/testing.py", >line 727, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [ 2.0000000e+01 +0.0000000e+00j > 0.0000000e+00 > +0.0000000e+00j > -4.0000000e+00 +4.0000000e+00j 0.0000000e+00 >+0.000... > Array 2: [ 20. +3.j > -0.7071068+0.7071068j -7. > +4.j > -0.7071068-0.7071068j -4. -3.j > 0.707106... > > > ====================================================================== >FAIL: check_n_argument_complex >(scipy.fftpack.basic.test_basic.test_fft) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File >"/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", > line 118, in check_n_argument_complex > assert_array_almost_equal(y[0],direct_dft(x1)) > File >"/usr/lib/python2.4/site-packages/scipy/test/testing.py", >line 727, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [ 1.0000000e+01+0.j -2.0000000e+00+2.j > -2.0000000e+00+0.j > -2.0000000e+00-2.j] > Array 2: [ 10.+1.j -3.+2.j -2.-1.j -1.-2.j] > > > ====================================================================== >FAIL: check_definition >(scipy.fftpack.basic.test_basic.test_fftn) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File >"/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", > line 419, in check_definition > assert_array_almost_equal(y,direct_dftn(x)) > File >"/usr/lib/python2.4/site-packages/scipy/test/testing.py", >line 727, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 22.2222222222%): > Array 1: [[ 
4.5000000e+01+0.j > -4.5000000e+00+2.5980762j > -4.5000000e+00-2.5980762j] > [ -1.3500000e+01+7.7942286j 0.0... > Array 2: [[ 4.5000000e+01+0.j > -4.5000000e+00+2.5980762j > -4.5000000e+00-2.5980762j] > [ -1.3500000e+01+0.j 0.0... > > > ====================================================================== >FAIL: check_definition >(scipy.fftpack.basic.test_basic.test_ifft) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File >"/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", > line 184, in check_definition > assert_array_almost_equal(y,y1) > File >"/usr/lib/python2.4/site-packages/scipy/test/testing.py", >line 727, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 100.0%): > Array 1: [ 2.5000000e+00 +0.0000000e+00j > 0.0000000e+00 > -0.0000000e+00j > -5.0000000e-01 -5.0000000e-01j 0.0000000e+00 >-0.000... > Array 2: [ 2.5 +0.375j > 0.0883883+0.0883883j -0.125 -0.5j > 0.0883883-0.0883883j -0.5 -0.375j > -0.0883883-0.0... > > > ====================================================================== >FAIL: check_random_real >(scipy.fftpack.basic.test_basic.test_ifft) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File >"/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", > line 218, in check_random_real > assert_array_almost_equal (ifft(fft(x)),x) > File >"/usr/lib/python2.4/site-packages/scipy/test/testing.py", >line 727, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 98.0392156863%): > Array 1: [ 2.7010335e-01 +0.0000000e+00j > 1.5051924e-01 > +1.9549653e-17j > 3.9770871e-01 -4.2662293e-17j 5.5451658e-01 >-5.476... > Array 2: [ 0.2701033 0.0145507 0.5296276 > 0.2231588 0.6244757 > 0.1734576 > 0.4209751 0.6859024 0.888794 0.3217651 0.42293... 
> > > ====================================================================== >FAIL: check_definition >(scipy.fftpack.basic.test_basic.test_ifftn) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File >"/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", > line 586, in check_definition > assert_array_almost_equal(y,direct_idftn(x)) > File >"/usr/lib/python2.4/site-packages/scipy/test/testing.py", >line 727, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 22.2222222222%): > Array 1: [[ 5.0000000e+00+0.j > -5.0000000e-01-0.2886751j > -5.0000000e-01+0.2886751j] > [ -1.5000000e+00-0.8660254j 0.0... > Array 2: [[ 5.0000000e+00+0.j > -5.0000000e-01-0.2886751j > -5.0000000e-01+0.2886751j] > [ -1.5000000e+00+0.j 0.0... > > > ====================================================================== >FAIL: check_definition >(scipy.fftpack.basic.test_basic.test_irfft) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File >"/usr/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", > line 338, in check_definition > assert_array_almost_equal(y,ifft(x1)) > File >"/usr/lib/python2.4/site-packages/scipy/test/testing.py", >line 727, in > assert_array_almost_equal > assert cond,\ > AssertionError: > Arrays are not almost equal (mismatch 50.0%): > Array 1: [ 2.625 -1.6856602 -0.375 > -1.1856602 0.625 > 0.4356602 -0.375 > 0.9356602] > Array 2: [ 2.6250000e+00+0.j -3.7500000e-01-0.j > -3.7500000e-01-0.j > -3.7500000e-01-0.j 6.2500000e-01+0.j > -3.7500000e-01+0.... 
> > > ====================================================================== >FAIL: check_generic >(scipy.base.type_check.test_type_check.test_isneginf) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File >"/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", > line 193, in check_generic > assert(vals[0] == 1) > AssertionError > > ====================================================================== >FAIL: check_generic >(scipy.base.type_check.test_type_check.test_isposinf) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File >"/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", > line 186, in check_generic > assert(vals[0] == 0) > AssertionError > > ====================================================================== >FAIL: check_generic >(scipy.base.type_check.test_type_check.test_nan_to_num) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File >"/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", > line 200, in check_generic > assert_all(vals[0] < -1e10) and >assert_all(isfinite(vals[0])) > File >"/usr/lib/python2.4/site-packages/scipy/base/tests/test_type_check.py", > line 12, in assert_all > assert(all(x)), x > AssertionError: False > > ---------------------------------------------------------------------- > Ran 550 tests in 1.718s > >FAILED (failures=10, errors=21) > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fft) ---------------------------------------------------------------------- Traceback (most recent call last): File 
"/usr/local/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 101, in check_definition assert_array_almost_equal(y,y1) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 2.0000000e+01 +0.0000000e+00j 0.0000000e+00 +0.0000000e+00j -4.0000000e+00 +4.0000000e+00j 0.0000000e+00 +0.000... Array 2: [ 20. +3.j -0.7071068+0.7071068j -7. +4.j -0.7071068-0.7071068j -4. -3.j 0.707106... ====================================================================== FAIL: check_n_argument_complex (scipy.fftpack.basic.test_basic.test_fft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 118, in check_n_argument_complex assert_array_almost_equal(y[0],direct_dft(x1)) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 1.0000000e+01+0.j -2.0000000e+00+2.j -2.0000000e+00+0.j -2.0000000e+00-2.j] Array 2: [ 10.+1.j -3.+2.j -2.-1.j -1.-2.j] ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_fftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 419, in check_definition assert_array_almost_equal(y,direct_dftn(x)) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 22.2222222222%): Array 1: [[ 4.5000000e+01+0.j -4.5000000e+00+2.5980762j -4.5000000e+00-2.5980762j] [ -1.3500000e+01+7.7942286j 0.0... 
Array 2: [[ 4.5000000e+01+0.j -4.5000000e+00+2.5980762j -4.5000000e+00-2.5980762j] [ -1.3500000e+01+0.j 0.0... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 184, in check_definition assert_array_almost_equal(y,y1) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 2.5000000e+00 +0.0000000e+00j 0.0000000e+00 -0.0000000e+00j -5.0000000e-01 -5.0000000e-01j 0.0000000e+00 -0.000... Array 2: [ 2.5 +0.375j 0.0883883+0.0883883j -0.125 -0.5j 0.0883883-0.0883883j -0.5 -0.375j -0.0883883-0.0... ====================================================================== FAIL: check_random_real (scipy.fftpack.basic.test_basic.test_ifft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 218, in check_random_real assert_array_almost_equal (ifft(fft(x)),x) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 98.0392156863%): Array 1: [ 1.3389235e-01 +0.0000000e+00j 4.4822517e-01 -1.7485417e-17j 4.1562521e-01 +5.1530429e-17j 1.4863933e-01 +4.821... Array 2: [ 0.1338923 0.6217639 0.4406348 0.0668647 0.1173414 0.7171446 0.4180512 0.2046611 0.9841973 0.3370606 0.13347... 
====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_ifftn) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 586, in check_definition assert_array_almost_equal(y,direct_idftn(x)) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 22.2222222222%): Array 1: [[ 5.0000000e+00+0.j -5.0000000e-01-0.2886751j -5.0000000e-01+0.2886751j] [ -1.5000000e+00-0.8660254j 0.0... Array 2: [[ 5.0000000e+00+0.j -5.0000000e-01-0.2886751j -5.0000000e-01+0.2886751j] [ -1.5000000e+00+0.j 0.0... ====================================================================== FAIL: check_definition (scipy.fftpack.basic.test_basic.test_irfft) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/fftpack/tests/test_basic.py", line 338, in check_definition assert_array_almost_equal(y,ifft(x1)) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 50.0%): Array 1: [ 2.625 -1.6856602 -0.375 -1.1856602 0.625 0.4356602 -0.375 0.9356602] Array 2: [ 2.6250000e+00+0.j -3.7500000e-01-0.j -3.7500000e-01-0.j -3.7500000e-01-0.j 6.2500000e-01+0.j -3.7500000e-01+0.... 
---------------------------------------------------------------------- Ran 406 tests in 1.947s FAILED (failures=7) >>> scipy.base.__version__ '0.4.3.1343' Nils From pearu at scipy.org Tue Oct 25 11:40:03 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 25 Oct 2005 10:40:03 -0500 (CDT) Subject: [SciPy-dev] array dtypechar wrong or not Message-ID: Hi, Note that In [142]: array(array([1,2],dtype='I'),dtype='L').dtypechar Out[142]: 'I' In [143]: array(array([1,2],dtype='L'),dtype='I').dtypechar Out[143]: 'L' I expected 'L' and 'I', respectively. Is this a bug? Pearu From schofield at ftw.at Tue Oct 25 13:31:33 2005 From: schofield at ftw.at (Ed Schofield) Date: Tue, 25 Oct 2005 19:31:33 +0200 Subject: [SciPy-dev] Segmentation fault triggered by optimize/lbfgsb.py Message-ID: <435E6BF5.3070103@ftw.at> The code in optimize/lbfgsb.py raises an exception and then dumps core: Traceback (most recent call last): ... File "/home/schofield/Test/lib/python2.4/site-packages/scipy/optimize/lbfgsb.py", line 182, in fmin_l_bfgs_b isave, dsave) ValueError: string_from_pyobj failed in converting 14th argument `csave' of _lbfgsb.setulb to C string Segmentation fault (core dumped) What is interesting is that I can get a segfault with the same backtrace even *before* the call to the Fortran function, just by executing some of the statements up to line 182 and exiting the python interpreter with ^D, implying that the interpreter is no longer in a stable state. # scipy.base.__version__ gives '0.4.3.1343' >>> import scipy >>> task = scipy.zeros((60,), scipy.Character) >>> task[:] = 'blah' >>> ^D Segmentation fault Here's a backtrace from gdb: #0 0x080de87d in Py_GetPath () #1 0x08087860 in PyTuple_Pack () #2 0x080df0c0 in _PyObject_GC_Track () #3 0x080df897 in PyGC_Collect () #4 0x080d88f9 in Py_Finalize () #5 0x080556f0 in Py_Main () #6 0x4938dea2 in __libc_start_main () from /lib/tls/i686/cmov/libc.so.6 #7 0x08054f21 in _start () Anyone want to take the baton? 
;) -- Ed From oliphant at ee.byu.edu Tue Oct 25 15:05:51 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 25 Oct 2005 13:05:51 -0600 Subject: [SciPy-dev] array dtypechar wrong or not In-Reply-To: References: Message-ID: <435E820F.3030201@ee.byu.edu> Pearu Peterson wrote: >Hi, > >Note that > >In [142]: array(array([1,2],dtype='I'),dtype='L').dtypechar >Out[142]: 'I' > >In [143]: array(array([1,2],dtype='L'),dtype='I').dtypechar >Out[143]: 'L' > >I have expected 'L' and 'I', respectively. It this a bug? > > > It's not really a bug. Basically, on your platform 'I' and 'L' are equivalent types, and so conversion is not done. I suppose that the typenumber could change, though, instead of doing nothing. Basically, it's the code in array_fromarray. Because 'I' and 'L' are equivalent types on your platform, only an INCREF is done. If we change the typecode as well as INCREF, then the original array would be changed, which might be surprising as well: a = array([1,2],dtype='I') b = array(a,dtype='L') b will point to a. If we change b to have typecode 'L', then a will also change. This would be surprising, I think. So, I guess, I don't think we should change the behavior. -Travis From oliphant at ee.byu.edu Tue Oct 25 15:08:08 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 25 Oct 2005 13:08:08 -0600 Subject: [SciPy-dev] Segmentation fault triggered by optimize/lbfgsb.py In-Reply-To: <435E6BF5.3070103@ftw.at> References: <435E6BF5.3070103@ftw.at> Message-ID: <435E8298.3020301@ee.byu.edu> Ed Schofield wrote: >The code in optimize/lbfgsb.py raises an exception and then dumps core: >Traceback (most recent call last): > ... 
> File >"/home/schofield/Test/lib/python2.4/site-packages/scipy/optimize/lbfgsb.py", >line 182, in fmin_l_bfgs_b > isave, dsave) >ValueError: string_from_pyobj failed in converting 14th argument `csave' >of _lbfgsb.setulb to C string >Segmentation fault (core dumped) > > >What is interesting is that I can get a segfault with the same backtrace >even *before* the call to the Fortran function, just by executing some >of the statements up to line 182 and exiting the python interpreter with >^D, implying that the interpreter is no longer in a stable state. > > ># scipy.base.__version__ gives '0.4.3.1343' > > > >>>>import scipy >>>> >>>> > > > >>>>task = scipy.zeros((60,), scipy.Character) >>>>task[:] = 'blah' >>>>^D >>>> >>>> Interesting. I did not know scipy.Character was being used. It should probably be changed to something else. But, yes, this is a bug in scipy core. -Travis From pearu at scipy.org Tue Oct 25 14:24:47 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 25 Oct 2005 13:24:47 -0500 (CDT) Subject: [SciPy-dev] array dtypechar wrong or not In-Reply-To: <435E820F.3030201@ee.byu.edu> References: <435E820F.3030201@ee.byu.edu> Message-ID: On Tue, 25 Oct 2005, Travis Oliphant wrote: > Pearu Peterson wrote: > >> Hi, >> >> Note that >> >> In [142]: array(array([1,2],dtype='I'),dtype='L').dtypechar >> Out[142]: 'I' >> >> In [143]: array(array([1,2],dtype='L'),dtype='I').dtypechar >> Out[143]: 'L' >> >> I have expected 'L' and 'I', respectively. It this a bug? >> >> >> > It's not really a bug, Basically, on your platform 'I' and 'L' are > equivalent types, and so conversion is not done. I suppose that the > typenumber could change, though, instead of doing nothing. > > Basically, it's the code in array_fromarray. Because 'I' and 'L' are > equivalent types on your platform, only an INCREF is done. 
If we > change the typecode as well as INCREF, then the original array would be > changed, which might be surprising as well: > > a = array([1,2],dtype='I') > > b = array(a,dtype='L') > > b will point to a. If we change b to have typecode 'L', then a will > also change. This would be surprising, I think. I agree with you up to a point that by default array should make a copy. Changing `b` should not change `a`, so why should `a` change on changing `b` typecode anyway? As it is now, array does not perform a complete copy, that is, it ignores dtype argument. In practical application this issue doesn't matter much probably. I noticed this behaviour and was forced to add some extra hooks to deal with this behavior for f2py unittests that I am implementing now. I am not requesting to change the current behavior. I am just pointing out an unexpected behaviour. If there is no obvious way to change this, then nevermind, but if there is, then that would be good. Pearu From oliphant at ee.byu.edu Tue Oct 25 15:34:46 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 25 Oct 2005 13:34:46 -0600 Subject: [SciPy-dev] array dtypechar wrong or not In-Reply-To: References: <435E820F.3030201@ee.byu.edu> Message-ID: <435E88D6.8080002@ee.byu.edu> >>b will point to a. If we change b to have typecode 'L', then a will >>also change. This would be surprising, I think. >> >> > >I agree with you up to a point that by default array should make a copy. >Changing `b` should not change `a`, so why should `a` change on changing >`b` typecode anyway? As it is now, array does not perform a complete >copy, that is, it ignores dtype argument. > > Ah, wait a minute here. You were using array which makes a copy by default. I think if a copy is made, then the typecode should change as you expected. I'll look into this. >In practical application this issue doesn't matter much probably. 
I >noticed this behaviour and was forced to add some extra hooks to deal >with this behavior for f2py unittests that I am implementing now. > > No, it doesn't matter because the types are identical and should behave the same. -Travis From pearu at scipy.org Tue Oct 25 14:45:33 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 25 Oct 2005 13:45:33 -0500 (CDT) Subject: [SciPy-dev] array dtypechar wrong or not In-Reply-To: <435E88D6.8080002@ee.byu.edu> References: <435E820F.3030201@ee.byu.edu><435E88D6.8080002@ee.byu.edu> Message-ID: On Tue, 25 Oct 2005, Travis Oliphant wrote: > >>> b will point to a. If we change b to have typecode 'L', then a will >>> also change. This would be surprising, I think. >>> >>> >> >> I agree with you up to a point that by default array should make a copy. >> Changing `b` should not change `a`, so why should `a` change on changing >> `b` typecode anyway? As it is now, array does not perform a complete >> copy, that is, it ignores dtype argument. >> >> > Ah, wait a minute here. You were using array which makes a copy by > default. I think if a copy is made, then the typecode should change > as you expected. I'll look into this. Ah, great! I was just having a trouble with creating an array with specified dtypechar and fortran flag and having OWNDATA true from another array but the code is getting worse and worse. But if this is a bug then the code simplifies immediately. Thanks, Pearu From oliphant at ee.byu.edu Tue Oct 25 15:54:56 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 25 Oct 2005 13:54:56 -0600 Subject: [SciPy-dev] array dtypechar wrong or not In-Reply-To: References: <435E820F.3030201@ee.byu.edu><435E88D6.8080002@ee.byu.edu> Message-ID: <435E8D90.7020700@ee.byu.edu> Pearu Peterson wrote: >Ah, great! I was just having a trouble with creating an array with >specified dtypechar and fortran flag and having OWNDATA true from another >array but the code is getting worse and worse. 
But if this is a bug then >the code simplifies immediately. > > Just changed it; I was using the old typenumber when I should have been using the new. Perhaps only a semantic question at the core, but I think this was a good catch. -Travis From drewdemento at yahoo.com Tue Oct 25 16:20:20 2005 From: drewdemento at yahoo.com (Andrew D) Date: Tue, 25 Oct 2005 13:20:20 -0700 (PDT) Subject: [SciPy-dev] Building newcore on the Itanium Message-ID: <20051025202020.12376.qmail@web50703.mail.yahoo.com> Hi, I have been trying to compile the latest scipy "newcore" on an Intel Itanium machine with Red Hat Linux. On the latest svn code (updated today, 1pm) the command "python setup.py build" gives the output documented at http://agenda.ms11.net/code/itanium_build.log In particular it stops when trying to link with the command: error: Command "f77 -L/home/phillips/faculty/dandrew/usr/lib build/temp.linux-ia64-2.3/scipy/corelib/lapack_lite/lapack_litemodule.o -L/home/phillips/faculty/dandrew/usr/lib/ -llapack -lptf77blas -lptcblas -latlas -lg2c -o build/lib.linux-ia64-2.3/scipy/lib/lapack_lite.so" failed with exit status 1 Also, when I do not have the path set for ATLAS to be found, the build fails at: error: Command "f77 -L/home/phillips/faculty/dandrew/usr/lib build/temp.linux-ia64-2.3/scipy/corelib/blasdot/_dotblas.o -L/usr/lib -lblas -lg2c -o build/lib.linux-ia64-2.3/scipy/lib/_dotblas.so" failed with exit status 1 Upon manually linking with the command "g77 -shared ...", newcore then builds and installs. [When using the Intel compilers (ifort, icc), extra flags are required to link. However, the build process ignores the flags put in the scipy/distutils/fcompiler/intel.py file, AND THE LINKING IS DONE WITH THE GCC COMPILER (see the logfile). To successfully build with the Intel compiler, all libraries must be manually linked with the commands: "icc -i_dynamic -lirc -pthread -shared ..." "ifort -i_dynamic -lirc -pthread -shared ..." 
Build log: http://agenda.ms11.net/code/itanium_build_icc.log ] The tests run successfully: >>from scipy import * >>test(10,10) . . ---------------------------------------------------------------------- Ran 138 tests in 1.942s OK I have not tried to compile newscipy due to the linking problems. Too much manual linking... Andrew Computer information: ~>cat /proc/cpuinfo processor : 0 vendor : GenuineIntel arch : IA-64 family : Itanium 2 model : 1 revision : 5 archrev : 0 features : branchlong cpu number : 0 cpu regs : 4 cpu MHz : 1300.000000 itc MHz : 1300.000000 BogoMIPS : 1946.15 [repeated 4 times, it's a 4-cpu box] ~>free total used free shared buffers cached Mem: 24779536 13069312 11710224 0 808784 10497376 -/+ buffers/cache: 1763152 23016384 Swap: 51477632 960 51476672 ~>cat /etc/issue Red Hat Enterprise Linux AS release 3 (Taroon Update 6) ~>gcc -v Reading specs from /usr/lib/gcc-lib/ia64-redhat-linux/3.2.3/specs Configured with: ../configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --enable-shared --enable-threads=posix --disable-checking --with-system-zlib --enable-__cxa_atexit --host=ia64-redhat-linux Thread model: posix gcc version 3.2.3 20030502 (Red Hat Linux 3.2.3-53) ->python Python 2.3 (#2, Oct 31 2003, 11:54:00) [GCC 3.2.3 20030502 (Red Hat Linux 3.2.3-20)] on linux2 From oliphant at ee.byu.edu Tue Oct 25 23:51:54 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 25 Oct 2005 21:51:54 -0600 Subject: [SciPy-dev] Segmentation fault triggered by optimize/lbfgsb.py In-Reply-To: <435E6BF5.3070103@ftw.at> References: <435E6BF5.3070103@ftw.at> Message-ID: <435EFD5A.4050702@ee.byu.edu> Ed Schofield wrote: >The code in optimize/lbfgsb.py raises an exception and then dumps core: >Traceback (most recent call last): > ... 
> File >"/home/schofield/Test/lib/python2.4/site-packages/scipy/optimize/lbfgsb.py", >line 182, in fmin_l_bfgs_b > isave, dsave) >ValueError: string_from_pyobj failed in converting 14th argument `csave' >of _lbfgsb.setulb to C string >Segmentation fault (core dumped) > > >What is interesting is that I can get a segfault with the same backtrace >even *before* the call to the Fortran function, just by executing some >of the statements up to line 182 and exiting the python interpreter with >^D, implying that the interpreter is no longer in a stable state. > > ># scipy.base.__version__ gives '0.4.3.1343' > > > >>>>import scipy >>>> >>>> > > > >>>>task = scipy.zeros((60,), scipy.Character) >>>>task[:] = 'blah' >>>>^D >>>> >>>> >Segmentation fault > >Here's a backtrace from gdb: >#0 0x080de87d in Py_GetPath () >#1 0x08087860 in PyTuple_Pack () >#2 0x080df0c0 in _PyObject_GC_Track () >#3 0x080df897 in PyGC_Collect () >#4 0x080d88f9 in Py_Finalize () >#5 0x080556f0 in Py_Main () >#6 0x4938dea2 in __libc_start_main () from /lib/tls/i686/cmov/libc.so.6 >#7 0x08054f21 in _start () > >Anyone want to take the baton? ;) > > > One part of the problem has been fixed in new scipy core. Now, f2py may need to change to handle character arrays differently. Character arrays work a little differently than they used to. Before they were just special-cases of UBYTE arrays. Now, character arrays are arrays of strings. Thus, the lbfgsb.py code should use task = zeros(1, 'S60') so that task[:] = 'blah' will do the right thing. 
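Travis's two cases translate directly to modern NumPy, where the newcore 'S' dtypes live on as fixed-width byte strings (a sketch using numpy names, not the 2005 scipy.base API):

```python
import numpy as np

# One 60-byte string element: assignment keeps the whole word.
task = np.zeros(1, dtype='S60')
task[:] = 'blah'
print(task[0])        # b'blah'

# Single-character elements ('S1'): the value is silently truncated
# to its first byte on assignment, filling the array with b'b'.
chars = np.zeros(4, dtype='S1')
chars[:] = 'blah'
print(chars)          # [b'b' b'b' b'b' b'b']
```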
task = zeros(1,Character) is equivalent to task = zeros(1,'S1') so that task[:] = 'blah' is the same as task[:] = array('blah','S1') which fills the array with 'b' -Travis From nwagner at mecha.uni-stuttgart.de Wed Oct 26 09:38:53 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 26 Oct 2005 15:38:53 +0200 Subject: [SciPy-dev] newscipy Message-ID: <435F86ED.6080404@mecha.uni-stuttgart.de> Hi all, How can I obtain information about the installed version of newscipy ? >>> scipy.base.__version__ '0.4.3.1354' yields the current version of newcore. Nils From pearu at scipy.org Wed Oct 26 09:08:07 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 26 Oct 2005 08:08:07 -0500 (CDT) Subject: [SciPy-dev] newscipy In-Reply-To: <435F86ED.6080404@mecha.uni-stuttgart.de> References: <435F86ED.6080404@mecha.uni-stuttgart.de> Message-ID: On Wed, 26 Oct 2005, Nils Wagner wrote: > How can I obtain information about the installed version of newscipy ? > >>>> scipy.base.__version__ > '0.4.3.1354' > > yields the current version of newcore. With recent newcore use In [1]: import scipy In [2]: scipy.__core_version__ Out[2]: '0.4.3.1356' In [3]: scipy.__scipy_version__ Out[3]: '0.4.2_1371' Pearu From nwagner at mecha.uni-stuttgart.de Wed Oct 26 10:41:57 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 26 Oct 2005 16:41:57 +0200 Subject: [SciPy-dev] newscipy In-Reply-To: References: <435F86ED.6080404@mecha.uni-stuttgart.de> Message-ID: <435F95B5.3070409@mecha.uni-stuttgart.de> Pearu Peterson wrote: >On Wed, 26 Oct 2005, Nils Wagner wrote: > > >>How can I obtain information about the installed version of newscipy ? >> >> >>>>>scipy.base.__version__ >>>>> >>'0.4.3.1354' >> >>yields the current version of newcore.
>> > >With recent newcore use > >In [1]: import scipy > >In [2]: scipy.__core_version__ >Out[2]: '0.4.3.1356' > >In [3]: scipy.__scipy_version__ >Out[3]: '0.4.2_1371' > >Pearu > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > Thank you - works fine for me. >>> scipy.__core_version__ '0.4.3.1356' >>> scipy.__scipy_version__ '0.4.2_1372' BTW, scipy.test(1,10) yields Ran 406 tests in 1.541s FAILED (failures=6, errors=75) Most errors are due to AttributeError: 'module' object has no attribute 'has_column_major_storage' error: failed in converting 1st keyword `s' of _fftpack.zfftnd to C/Fortran array Nils From stephen.walton at csun.edu Wed Oct 26 12:00:34 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 26 Oct 2005 09:00:34 -0700 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> Message-ID: <435FA822.2050501@csun.edu> Arnd Baecker wrote: >I also think that double should be kept as default. >If I understand things correctly, both normal python and >all the libraries for scipy can only deal with that at the moment. > > I respectfully disagree that double should be the default target for upcasts. This is a holdover from C and was a bad decision when made for that language. And, as Pearu points out, it has dire consequences for storage. If I get a 16 Megapixel image from HST with two-byte integers, I definitely would not want that image upcast to 64 or, heaven forfend, 128 bits the first time I did an operation on it. 
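Walton's storage point is easy to quantify. A quick sketch with modern NumPy (the 4096x4096 shape is an assumed stand-in for a 16-megapixel HST frame):

```python
import numpy as np

# A hypothetical 16-megapixel frame of two-byte integers (4096 x 4096).
img = np.zeros((4096, 4096), dtype=np.int16)
print(img.nbytes // 2**20)    # 32 (MiB)

# Upcasting that image to double quadruples its footprint.
out = img.astype(np.float64)
print(out.nbytes // 2**20)    # 128 (MiB)
```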
From nwagner at mecha.uni-stuttgart.de Wed Oct 26 12:37:28 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Wed, 26 Oct 2005 18:37:28 +0200 Subject: [SciPy-dev] building newcore, newscipy Message-ID: Hi all, Building newcore and newscipy is much faster, when ATLAS is available. Is it possible to accelerate the building process when ATLAS isn't installed. Nils From charles.harris at sdl.usu.edu Wed Oct 26 12:53:57 2005 From: charles.harris at sdl.usu.edu (Charles R Harris) Date: Wed, 26 Oct 2005 10:53:57 -0600 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <435FA822.2050501@csun.edu> References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> Message-ID: <1130345637.4822.24.camel@E011704> On Wed, 2005-10-26 at 09:00 -0700, Stephen Walton wrote: > Arnd Baecker wrote: > > >I also think that double should be kept as default. > >If I understand things correctly, both normal python and > >all the libraries for scipy can only deal with that at the moment. > > > > > I respectfully disagree that double should be the default target for > upcasts. This is a holdover from C and was a bad decision when made for > that language. And, as Pearu points out, it has dire consequences for > storage. If I get a 16 Megapixel image from HST with two-byte integers, > I definitely would not want that image upcast to 64 or, heaven forfend, > 128 bits the first time I did an operation on it. I think there are two goals here: 1) it just works 2) it is efficient. These goals are not always compatible. In order to just work, certain defaults need to be assumed. Python works like that, it is one of the reasons it is so convenient. 
On the other hand, efficiency, space efficiency in particular, requires greater control on the part of the programmer who has to take the trouble to pick the types he wants to use, making a trade between precision, space, and speed. So I think that we should choose reasonable defaults that carry on the Python spirit, while leaving open options for the programmer who wants more control. How to do this without making a mess is the question. Now, python does the following: >>> from math import * >>> sqrt(2) 1.4142135623730951 and if we are going to overload sqrt we should keep this precision. Do we really want to make a distinction in this case between math.sqrt and Numeric.sqrt? I myself don't think so. On the other hand, it is reasonable that scipy not promote float types in this situation. Integral types remain a problem. What about uint8 vs uint64 for instance? Maybe we should either require a cast of integral types to a float type for arrays or define distinct functions like sqrtf and sqrtl to handle this. I note that a complaint has been made that this is unwieldy and a throwback, but I don't think so. The integer case is, after all, ambiguous. The automatic selection of type only really makes sense for floats or if we explicitly state that maximum precision, but no more than necessary, should be maintained. But what happens then for int64 when we have a machine whose default float is double double?
Chuck From Fernando.Perez at colorado.edu Wed Oct 26 13:38:26 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 26 Oct 2005 11:38:26 -0600 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <1130345637.4822.24.camel@E011704> References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> <1130345637.4822.24.camel@E011704> Message-ID: <435FBF12.9010406@colorado.edu> Charles R Harris wrote: [...] > Now, python does the following: > > >>>>from math import * >>>>sqrt(2) > > 1.4142135623730951 > > and if we are going to overload sqrt we should keep this precision. Do > we really want to make a distinction in this case between math.sqrt and > Numeric.sqrt ? I myself don't think so. On the other hand, it is > reasonable that scipy not promote float types in this situation. > Integral types remain a problem. What about uint8 vs uint64 for > instance? Again, I find it simplest to think about this problem in terms of exact/approximate numbers. All integer types (of any bit-width) are exact, all float numbers are approximate. The question is then how to handle functions, which can be (in terms of their domain/range relation): 1. f : exact -> exact 2. f : exact -> approximate etc. My argument is that for #2, there should be upcasting to the widest possible approximate type, in an attempt to preserve as much of the original information as we can. For example, sqrt(2) should upcast to double, because truncation to integer makes very little practical sense. The case of accumulators is special, because they are of type 1 above, but the result may not (and often doesn't) fit in the input type. Travis already agreed that in this case, an upcast was a reasonable compromise. However, for functions of the kind 3. 
f : approx -> approx there should be in general no upcasting (except for accumulators, as we've discussed). Doing a*b to two float arrays should certainly not produce an enormous result, which may not even fit in memory. Just my opinion. Cheers, f From charlesr.harris at gmail.com Wed Oct 26 14:48:14 2005 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 26 Oct 2005 12:48:14 -0600 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <435FBF12.9010406@colorado.edu> References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> <1130345637.4822.24.camel@E011704> <435FBF12.9010406@colorado.edu> Message-ID: On 10/26/05, Fernando Perez wrote: > > Charles R Harris wrote: > > [...] > > > Now, python does the following: > > > > > >>>>from math import * > >>>>sqrt(2) > > > > 1.4142135623730951 > > > > and if we are going to overload sqrt we should keep this precision. Do > > we really want to make a distinction in this case between math.sqrt and > > Numeric.sqrt ? I myself don't think so. On the other hand, it is > > reasonable that scipy not promote float types in this situation. > > Integral types remain a problem. What about uint8 vs uint64 for > > instance? > > Again, I find it simplest to think about this problem in terms of > exact/approximate numbers. All integer types (of any bit-width) are exact, > all float numbers are approximate. The question is then how to handle > functions, which can be (in terms of their domain/range relation): > > 1. f : exact -> exact > 2. f : exact -> approximate > > etc. > > My argument is that for #2, there should be upcasting to the widest > possible > approximate type, in an attempt to preserve as much of the original > information as we can. 
For example, sqrt(2) should upcast to double, > because > truncation to integer makes very little practical sense. > Yes, I agree with this. The only problem I see is if someone wants to save space when taking the sqrt of an integral array. There are at least three possibilities: 1. cast the result to a float 2. cast the argument to a float 3. use a special sqrtf function The first two options use more temporary space, take more time, and look uglier (IMHO). On the other hand, the needed commands are already implemented. The last option is clear and concise, but needs a new ufunc. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From Fernando.Perez at colorado.edu Wed Oct 26 14:56:03 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 26 Oct 2005 12:56:03 -0600 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> <1130345637.4822.24.camel@E011704> <435FBF12.9010406@colorado.edu> Message-ID: <435FD143.1050903@colorado.edu> Charles R Harris wrote: > Yes, I agree with this. The only problem I see is if someone wants to save > space when taking the sqrt of an integral array. There are at least three > possibilities: > > 1. cast the result to a float > 2. cast the argument to a float > 3. use a special sqrtf function > > The first two options use more temporary space, take more time, and look > uglier (IMHO). On the other hand, the needed commands are already > implemented. The last option is clear and concise, but needs a new ufunc. Well, while I agree with the recently posted design guideline from Guido of using different functions for different purposes rather than flags, this may be a case where a flag would be a good choice.
Especially because we already have a conceptual precedent for the accumulators of specifying the return type via a flag: a.sum(rtype=int). Since the 'mental slot' is already in scipy users' heads for saying 'modify the default output of this function to accumulate/store data in a different type', I think it would be reasonable to offer sqrt(a,rtype=float) as an optional way to prevent automatic upcasting in cases where users want that kind of very fine level control. This can be done uniformly across the library, rather than growing a zillion foof/food/foo* post-fixed forms of every ufunc in the library. We would then have: - A basic principle for how upcasting is done, driven by the idea of 'protecting precision even at the cost of storage'. This principle forces sqrt(2) to be a double and anint_array.sum() to accumulate to a wider type. - A uniform mechanism for overriding upcasting across the library, via the rtype flag. If most/all of scipy implements this, it seems like a small price of learning to pay for a reasonable balance between convenience, correctness and efficiency. Or am I missing some usage case that this would not satisfy? Cheers, f From charlesr.harris at gmail.com Wed Oct 26 16:54:52 2005 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 26 Oct 2005 14:54:52 -0600 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <435FD143.1050903@colorado.edu> References: <4349EDA2.2090802@ee.byu.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> <1130345637.4822.24.camel@E011704> <435FBF12.9010406@colorado.edu> <435FD143.1050903@colorado.edu> Message-ID: On 10/26/05, Fernando Perez wrote: > > Charles R Harris wrote: > > > Yes, I agree with this. The only problem I see is if someone wants to > save > > space when taking the sqrt of an integral array. There are at least > three > > possibilities: > > > > 1.
cast the result to a float > > 2. cast the argument to a float > > 3. use a special sqrtf function > > > > The first two options use more temporary space, take more time, and look > > uglier (IMHO). On the other hand, the needed commands are already > > implemented. The last option is clear and concise, but needs a new > ufunc. > > Well, while I agree with the recently posted design guideline from Guido > of > using different functions for different purposes rather than flags, this > may > be a case where a flag would be a good choice. Especially because we > already > have a conceptual precedent for the accumulators of specifying the return > type > via a flag: a.sum(rtype=int). > > Since the 'mental slot' is already in scipy's users heads for saying > 'modify > the default output of this function to accumulate/store data in a > different > type', I think it would be reasonable to offer > > sqrt(a,rtype=float) > > as an optional way to prevent automatic upcasting in cases where users > want > that kind of very fine level control. This can be done uniformly across > the > library, rather than growing a zillion foof/food/foo* post-fixed forms of > every ufunc in the library. > > We would then have: > > - A basic principle for how upcasting is done, driven by the idea of > 'protecting precision even at the cost of storage'. This principle forces > sqrt(2) to be a double and anint_array.sum() to accumulate to a wider > type. > > - A uniform mechanism for overriding upcasting across the library, via the > rtype flag. If most/all of scipy implements this, it seems like a small > price > of learning to pay for a reasonable balance between convenience, > correctness > and efficiency. Yes, I think that would work well. Most of us, most of the time, could then rely on the unmodified functions to do the right thing. On the rare occasion that space really mattered, there would be a fallback position. 
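The rtype flag sketched in this thread never landed under that name; modern NumPy ufuncs instead grew a dtype keyword that plays the same role. A hedged sketch of the present-day equivalent:

```python
import numpy as np

a = np.arange(10, dtype=np.int16)

# Default behaviour: integer input is upcast to the smallest safe float.
print(np.sqrt(a).dtype)                      # float32

# The per-call override discussed here: pin the computation/output type.
print(np.sqrt(a, dtype=np.float32).dtype)    # float32
```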
It would also be easy to use a global type string mytype = 'Float32' and call everything critical with rtype=mytype. That would make it easy to change the behaviour of fairly large programs. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From pearu at scipy.org Wed Oct 26 16:40:11 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 26 Oct 2005 15:40:11 -0500 (CDT) Subject: [SciPy-dev] All newscipy tests pass, except on 64-bit system Message-ID: Hi, Yes! Now all newscipy tests pass on my 32-bit system but on Opteron I get segfaults in fftpack tests. So far I have found that swapaxes might cause these faults but I am not sure. For example, the following codelet tmp = zeros((1,1,1,1)) swapaxes(tmp, 0, -1) sometimes fails with the error message free(): invalid pointer 0xac89c0! I could not produce a simple example that would segfault on Opteron. So, I would be interested if anyone else is experiencing segfaults on 64-bit systems, especially when running: from scipy import * def direct_dftn(x): x = asarray(x) for axis in range(len(x.shape)): x = fft(x,axis=axis) return x direct_dftn(zeros((1,1,1,1,1))) Pearu From oliphant at ee.byu.edu Wed Oct 26 18:26:13 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 26 Oct 2005 16:26:13 -0600 Subject: [SciPy-dev] All newscipy tests pass, except on 64-bit system In-Reply-To: References: Message-ID: <43600285.3090304@ee.byu.edu> Pearu Peterson wrote: >Hi, >Yes! Now all newscipy tests pass on my 32-bit system but on Opteron I get >segfaults in fftpack tests. So far I have found that swapaxes might cause >these faults but I am not sure. For example, the following codelet > > tmp = zeros((1,1,1,1)) > swapaxes(tmp, 0, -1) > >sometimes fails with the error message > > free(): invalid pointer 0xac89c0! > > Does this only happen on 64-bit systems? Or do others see this as well...
-Travis From oliphant at ee.byu.edu Wed Oct 26 18:35:58 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 26 Oct 2005 16:35:58 -0600 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: References: <4349EDA2.2090802@ee.byu.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> <1130345637.4822.24.camel@E011704> <435FBF12.9010406@colorado.edu> <435FD143.1050903@colorado.edu> Message-ID: <436004CE.2030604@ee.byu.edu> Charles R Harris wrote: > Since the 'mental slot' is already in scipy's users heads for > saying 'modify > the default output of this function to accumulate/store data in a > different > type', I think it would be reasonable to offer > > sqrt(a,rtype=float) > This would require some rewriting of the internals which might be tricky to get right because of the conflict with the optional output arguments that are already available. Look at sqrt.types This shows you the types that actually have functions available. Everything else has to be cast to something. Right now, the rule is basically, don't cast unless we can "safely," where the notion of "safely" is defined in a switch-statement. I suppose some way to bypass this default function selection and pick the function you specify instead might be useful, especially because with the way type conversion is handled now through a buffer, it is a lot different (for large arrays) to cast during calculation of a ufunc than to do sqrt(a.astype(float)) which would make a copy of the data first. -Travis From pearu at scipy.org Wed Oct 26 17:39:08 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 26 Oct 2005 16:39:08 -0500 (CDT) Subject: [SciPy-dev] All newscipy tests pass, except on 64-bit system In-Reply-To: <43600285.3090304@ee.byu.edu> References: <43600285.3090304@ee.byu.edu> Message-ID: On Wed, 26 Oct 2005, Travis Oliphant wrote: > Pearu Peterson wrote: >> Hi, >> Yes!
Now all newscipy tests pass on my 32-bit system but on Opteron I get >> segfaults in fftpack tests. So far I have found that swapaxes might cause >> these faults but I am not sure. For example, the following codelet >> >> tmp = zeros((1,1,1,1)) >> swapaxes(tmp, 0, -1) >> >> sometimes fails with intermediate error message >> >> free(): invalid pointer 0xac89c0! >> >> > Does this only happen on 64-bit systems? Or do others see this as well... This happened only on 64-bit here. But note that I have now enabled tests for signal, stats, special, etc that give also a bunch of errors/failures on 32-bit system. signal tests give also segfaults, so, there remains some work to do. Pearu From oliphant at ee.byu.edu Wed Oct 26 18:43:16 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 26 Oct 2005 16:43:16 -0600 Subject: [SciPy-dev] All newscipy tests pass, except on 64-bit system In-Reply-To: References: Message-ID: <43600684.3010200@ee.byu.edu> Pearu Peterson wrote: >Hi, >Yes! Now all newscipy tests pass on my 32-bit system but on Opteron I get >segfaults in fftpack tests. So far I have found that swapaxes might cause >these faults but I am not sure. For example, the following codelet > > tmp = zeros((1,1,1,1)) > swapaxes(tmp, 0, -1) > >sometimes fails with intermediate error message > > Hey, I just saw a problem with some malloc code in PyArray_Transpose that was using sizeof(int) instead of sizeof(intp). This could definitely be the problem. Hopefully it is fixed now in SVN. Thanks for all the great work getting tests to pass. -Travis From oliphant at ee.byu.edu Wed Oct 26 18:44:32 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 26 Oct 2005 16:44:32 -0600 Subject: [SciPy-dev] All newscipy tests pass, except on 64-bit system In-Reply-To: References: <43600285.3090304@ee.byu.edu> Message-ID: <436006D0.2040703@ee.byu.edu> Pearu Peterson wrote: >On Wed, 26 Oct 2005, Travis Oliphant wrote: > > > >>Pearu Peterson wrote: >> >> >> >>>Hi, >>>Yes! 
Now all newscipy tests pass on my 32-bit system but on Opteron I get >>>segfaults in fftpack tests. So far I have found that swapaxes might cause >>>these faults but I am not sure. For example, the following codelet >>> >>> tmp = zeros((1,1,1,1)) >>> swapaxes(tmp, 0, -1) >>> >>>sometimes fails with intermediate error message >>> >>> free(): invalid pointer 0xac89c0! >>> >>> >>> >>Does this only happen on 64-bit systems? Or do others see this as well... >> >> > >This happened only on 64-bit here. > >But note that I have now enabled tests for signal, stats, special, etc >that give also a bunch of errors/failures on 32-bit system. signal tests >give also segfaults, so, there remains some work to do. > > Yes, I noticed that the signal tools module has lots of places where it is assumed that dimensions and so forth are integer pointers. Are there segfaults on 32-bit systems with signal tools? -Travis From pearu at scipy.org Wed Oct 26 17:48:50 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 26 Oct 2005 16:48:50 -0500 (CDT) Subject: [SciPy-dev] All newscipy tests pass, except on 64-bit system In-Reply-To: <436006D0.2040703@ee.byu.edu> References: <43600285.3090304@ee.byu.edu><436006D0.2040703@ee.byu.edu> Message-ID: On Wed, 26 Oct 2005, Travis Oliphant wrote: >> But note that I have now enabled tests for signal, stats, special, etc >> that give also a bunch of errors/failures on 32-bit system. signal tests >> give also segfaults, so, there remains some work to do. >> >> > Yes, I noticed that the signal tools module has lots of places where it > is assumed that dimensions and so forth are integer pointers. Are there > segfaults on 32-bit systems with signal tools? Yes. I have check_basic (scipy.signal.signaltools.test_signaltools.test_convolve) ... ok check_basic (scipy.signal.signaltools.test_signaltools.test_medfilt) ... ok check_basic (scipy.signal.signaltools.test_signaltools.test_wiener) ...
Segmentation fault Pearu From pearu at scipy.org Wed Oct 26 17:53:00 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 26 Oct 2005 16:53:00 -0500 (CDT) Subject: [SciPy-dev] All newscipy tests pass, except on 64-bit system In-Reply-To: <43600684.3010200@ee.byu.edu> References: <43600684.3010200@ee.byu.edu> Message-ID: On Wed, 26 Oct 2005, Travis Oliphant wrote: > Pearu Peterson wrote: > >> Hi, >> Yes! Now all newscipy tests pass on my 32-bit system but on Opteron I get >> segfaults in fftpack tests. So far I have found that swapaxes might cause >> these faults but I am not sure. For example, the following codelet >> >> tmp = zeros((1,1,1,1)) >> swapaxes(tmp, 0, -1) >> >> sometimes fails with intermediate error message >> >> > Hey, I just saw a problem with some malloc code in PyArray_Transpose > that was using sizeof(int) instead of sizeof(intp). This could > definitely be the problem. Hopefully it is fixed now in SVN. Yes, that fixed fftpack tests on Opteron. Thanks, Pearu From stephen.walton at csun.edu Wed Oct 26 22:35:22 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 26 Oct 2005 19:35:22 -0700 Subject: [SciPy-dev] All newscipy tests pass, except on 64-bit system In-Reply-To: References: Message-ID: <43603CEA.2060308@csun.edu> Pearu Peterson wrote: >Hi, >Yes! Now all newscipy tests pass on my 32-bit system > Sorry, but this is not true here. I traced a failure in scipy.stats to a bug which the following code demonstrates (the BIG array is cribbed from scipy.stats.tests.test_stats.py): from scipy import stats from scipy.base import array,Float mean = 99999995. 
BIG=array([99999991,99999992,99999993,99999994,99999995,99999996,99999997,99999998,99999999],Float) print BIG-mean print BIG-array(mean) print BIG-array([mean]) On an up to date Ubuntu 5.10 system, the output of the above script is: swalton at ubuntu:~$ python buglet.py Importing io to scipy Importing special to scipy Importing fftpack to scipy Importing cluster to scipy Importing sparse to scipy Importing signal to scipy Failed to import signal cannot import name comb Importing utils to scipy Importing interpolate to scipy Importing integrate to scipy Importing optimize to scipy Importing linalg to scipy [-4. -3. -2. -1. 0. 1. 2. 3. 4.] [-4. -3. -2. -1. 0. 1. 2. 3. 4.] [ -4.00000000e+000 9.99999920e+007 9.99999930e+007 9.99999940e+007 9.99999950e+007 nan -2.62192373e+257 -3.50744949e+010 The last line is, I think, not quite correct. A Fedora Core 4 system with Absoft Fortran gives similarly wacky results. Steve From oliphant at ee.byu.edu Wed Oct 26 23:06:53 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 26 Oct 2005 21:06:53 -0600 Subject: [SciPy-dev] All newscipy tests pass, except on 64-bit system In-Reply-To: <43603CEA.2060308@csun.edu> References: <43603CEA.2060308@csun.edu> Message-ID: <4360444D.5050404@ee.byu.edu> Stephen Walton wrote: >Pearu Peterson wrote: > > > >>Hi, >>Yes! Now all newscipy tests pass on my 32-bit system >> >> >> >Sorry, but this is not true here. I traced a failure in scipy.stats to >a bug which the following code demonstrates (the BIG array is cribbed >from scipy.stats.tests.test_stats.py): > > from scipy import stats > from scipy.base import array,Float > mean = 99999995. > BIG=array([99999991,99999992,99999993,99999994,99999995,99999996,99999997,99999998,99999999],Float) > print BIG-mean > print BIG-array(mean) > print BIG-array([mean]) > > > We are getting errors in stats too. He said this before he added back the tests for the untested packages. >The last line is, I think, not quite correct. 
A Fedora Core 4 system >with Absoft Fortran gives similarly wacky results. > > Good catch. This was a bug introduced by some optimizations I did a while ago. It was executing the fastest section of code but with incorrect step sizes. This should be fixed in SVN. -Travis From stephen.walton at csun.edu Wed Oct 26 23:49:19 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 26 Oct 2005 20:49:19 -0700 Subject: [SciPy-dev] All newscipy tests pass, except on 64-bit system In-Reply-To: <4360444D.5050404@ee.byu.edu> References: <43603CEA.2060308@csun.edu> <4360444D.5050404@ee.byu.edu> Message-ID: <43604E3F.3060009@csun.edu> Travis Oliphant wrote: >Good catch. This was a bug introduced by some optimizations I did a >while ago. It was executing the fastest section of code but with >incorrect step sizes. This should be fixed in SVN. > > Yup. This change all by itself dropped the number of fails in scipy.test() to 19 from 32 for me. Thanks! Steve From cimrman3 at ntc.zcu.cz Thu Oct 27 02:20:27 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Thu, 27 Oct 2005 08:20:27 +0200 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <1130345637.4822.24.camel@E011704> References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> <1130345637.4822.24.camel@E011704> Message-ID: <436071AB.202@ntc.zcu.cz> Charles R Harris wrote: > On Wed, 2005-10-26 at 09:00 -0700, Stephen Walton wrote: > >>Arnd Baecker wrote: >> >> >>>I also think that double should be kept as default. >>>If I understand things correctly, both normal python and >>>all the libraries for scipy can only deal with that at the moment. >>> >>> >> >>I respectfully disagree that double should be the default target for >>upcasts. This is a holdover from C and was a bad decision when made for >>that language. 
And, as Pearu points out, it has dire consequences for >>storage. If I get a 16 Megapixel image from HST with two-byte integers, >>I definitely would not want that image upcast to 64 or, heaven forfend, >>128 bits the first time I did an operation on it. > > > I think there are two goals here: 1) it just works 2) it is efficient. > These goals are not always compatible. In order to just work, certain > defaults need to be assumed. Python works like that, it is one of the > reasons it is so convenient. On the other hand, efficiency, space > efficiency in particular, requires greater control on the part of the > programmer who has to take the trouble to pick the types he wants to use, > making a trade between precision, space, and speed. So I think that we > should choose reasonable defaults that carry on the Python spirit, while > leaving open options for the programmer who wants more control. How to > do this without making a mess is the question. Maybe the arrays could have some 'manual type control' flag (which could be switched on, e.g. when explicitly stating the type in an array constructor) - then 1) everything would just work and 2) a user could always set 'manual on', causing all ops on that array to return the array of the same (or given (via rtype?)) type. I know, it still does not solve how to do the 'it just works' part. with just my 2 cents, r.
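Cimrman's 'manual type control' flag can be roughly prototyped in modern NumPy with an ndarray subclass; LockedType below is a hypothetical name and only one possible reading of the idea (cast every ufunc result back to the array's own dtype):

```python
import numpy as np

class LockedType(np.ndarray):
    # Cast the result of every ufunc back to this array's own dtype,
    # i.e. 'manual type control' is permanently switched on.
    def __array_wrap__(self, out, context=None, return_scalar=False):
        return np.asarray(out).astype(self.dtype).view(LockedType)

a = np.arange(1, 5, dtype=np.int16).view(LockedType)
b = np.sqrt(a)        # computed in float internally, cast back here
print(b.dtype)        # int16
```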
From nwagner at mecha.uni-stuttgart.de Thu Oct 27 03:29:09 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 27 Oct 2005 09:29:09 +0200 Subject: [SciPy-dev] Results of scipy.test(1, 10) with latest svn versions of newcore and newscipy Message-ID: <436081C5.2090101@mecha.uni-stuttgart.de> Ran 1106 tests in 4.692s FAILED (failures=20, errors=75) >>> scipy.__core_version__ '0.4.3.1361' >>> >>> scipy.__scipy_version__ '0.4.2_1381' >>> NameError: global name 'logical_and' is not defined y = scipy.round(ROUND[i]) AttributeError: 'module' object has no attribute 'round' if vals.typecode() not in scipy.typecodes['AllInteger']: AttributeError: 'scipy.ndarray' object has no attribute 'typecode' const = 1.0/(1+exp(-a)) NameError: global name 'exp' is not defined From pearu at scipy.org Thu Oct 27 05:12:30 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 27 Oct 2005 04:12:30 -0500 (CDT) Subject: [SciPy-dev] Results of scipy.test(1, 10) with latest svn versions of newcore and newscipy In-Reply-To: <436081C5.2090101@mecha.uni-stuttgart.de> References: <436081C5.2090101@mecha.uni-stuttgart.de> Message-ID: On Thu, 27 Oct 2005, Nils Wagner wrote: > Ran 1106 tests in 4.692s > > FAILED (failures=20, errors=75) ... is now FAILED (failures=14) on a 32-bit (mostly special failures, 1 from stats) and FAILED (failures=26, errors=2) on a 64-bit machine (sparse, special, linalg failures, 1 from stats). Pearu From nwagner at mecha.uni-stuttgart.de Thu Oct 27 06:56:05 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 27 Oct 2005 12:56:05 +0200 Subject: [SciPy-dev] Results of scipy.test(1, 10) with latest svn versions of newcore and newscipy In-Reply-To: References: <436081C5.2090101@mecha.uni-stuttgart.de> Message-ID: <4360B245.4010606@mecha.uni-stuttgart.de> Pearu Peterson wrote: >On Thu, 27 Oct 2005, Nils Wagner wrote: > > >>Ran 1106 tests in 4.692s >> >>FAILED (failures=20, errors=75) >> > >... 
is now > >FAILED (failures=14) > >on a 32-bit (mostly special failures, 1 from stats) and > >FAILED (failures=26, errors=2) > >on a 64-bit machine (sparse, special, linalg failures, 1 from stats). > >Pearu > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > on a 32-bit machine without ATLAS Ran 1106 tests in 4.710s FAILED (failures=15, errors=1) >>> scipy.__scipy_version__ '0.4.2_1390' >>> scipy.__core_version__ '0.4.3.1363' >>> ====================================================================== ERROR: check_nils (scipy.linalg.matfuncs.test_matfuncs.test_logm) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_matfuncs.py", line 82, in check_nils logm((identity(7)*3.1+0j)-a) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/matfuncs.py", line 232, in logm errest = norm(expm(F)-A,1) / norm(A,1) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/basic.py", line 254, in norm x = asarray_chkfinite(x) File "/usr/local/lib/python2.4/site-packages/scipy/base/function_base.py", line 204, in asarray_chkfinite raise ValueError, "array must not contain infs or nans" ValueError: array must not contain infs or nans I do not use ATLAS here. Can you reproduce this error with/without ATLAS ? 
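The ValueError in the traceback above comes from asarray_chkfinite, which refuses any array containing infs or NaNs; that is the check logm's error estimate trips over here. A minimal illustration with present-day NumPy:

```python
import numpy as np

# Finite input passes through unchanged...
print(np.asarray_chkfinite([1.0, 2.0, 3.0]))

# ...but a single NaN (or inf) raises ValueError.
try:
    np.asarray_chkfinite([1.0, np.nan])
except ValueError as e:
    print("rejected:", e)
```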
Nils From schofield at ftw.at Thu Oct 27 11:38:23 2005 From: schofield at ftw.at (Ed Schofield) Date: Thu, 27 Oct 2005 17:38:23 +0200 Subject: [SciPy-dev] Casting and rtype arguments [Was: Question about 64-bit integers being cast to double precision] In-Reply-To: References: <4349EDA2.2090802@ee.byu.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> <1130345637.4822.24.camel@E011704> <435FBF12.9010406@colorado.edu> <435FD143.1050903@colorado.edu> Message-ID: <4360F46F.9020800@ftw.at> Charles R Harris wrote: > On 10/26/05, *Fernando Perez* > wrote: > > Charles R Harris wrote: > > Since the 'mental slot' is already in scipy's users heads for > saying 'modify > the default output of this function to accumulate/store data in a > different > type', I think it would be reasonable to offer > > sqrt(a,rtype=float) > > as an optional way to prevent automatic upcasting in cases where > users want > that kind of very fine level control. This can be done uniformly > across the > library, rather than growing a zillion foof/food/foo* post-fixed > forms of > every ufunc in the library. > > We would then have: > > - A basic principle for how upcasting is done, driven by the idea of > 'protecting precision even at the cost of storage'. This > principle forces > sqrt(2) to be a double and anint_array.sum() to accumulate to a > wider type. > > - A uniform mechanism for overriding upcasting across the library, > via the > rtype flag. If most/all of scipy implements this, it seems like a > small price > of learning to pay for a reasonable balance between convenience, > correctness > and efficiency. > > > Yes, I think that would work well. Most of us, most of the time, could > then rely on the unmodified functions to do the right thing. On the > rare occasion that space really mattered, there would be a fallback > position. 
It would also be easy to use a global type string mytype = > 'Float32' and call everything critical with rtype=mytype. That would > make it easy to change the behaviour of fairly large programs. I agree that this would be a nice consistent interface (if we can implement it) :) I've added text to the docstrings for a.sum() and a.mean() to reflect their new behaviour (re. thread on int8 array operations) and the role of the 'rtype' argument there. Let me know if you think anything's wrong. Otherwise we could aim to migrate gradually to similar behaviour with other functions. I'm not sure that 'rtype' (for 'return type'?) is the most accurate name. For a.mean() the rtype is currently the type used for intermediate calculations (in a.sum()), not the return type. (The return type is float, even if the 'rtype' is int, and I agree with this behaviour.) The same is true, in a sense, for a.sum(). The second example in the new a.sum() docstring is: >>> array([0.5, 1.5]).sum(rtype=int32) 1 where the floats are downcast to int32 before the sum. My guess is that a user who goes to the trouble of specifying a non-default data type for an operation is at least as interested in the data type of the intermediate operations as in the return type. Perhaps we should think instead about the data types used for intermediate operations, as sum() and mean() do now, and rename the argument 'itype'. Another option would be to change the behaviour of a.sum() and a.mean() so they really do return the given type. But I'm not keen on this, since we can already achieve this without any 'rtype' argument by casting the output to the desired type, and this leaves us less control over what actually goes on behind the scenes... Comments?! 
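For reference, this is essentially the behaviour that survived into later NumPy, where the argument ended up being named dtype rather than rtype: the elements are cast to the requested type before accumulation, so Ed's "intermediate type" reading is the right one.

```python
import numpy as np

# The floats are truncated to int32 *before* the sum: 0.5 -> 0, 1.5 -> 1
print(np.array([0.5, 1.5]).sum(dtype=np.int32))   # 1

# Without the argument, the accumulation stays in float64
print(np.array([0.5, 1.5]).sum())                 # 2.0
```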
-- Ed From rstanchak at yahoo.com Thu Oct 27 11:43:53 2005 From: rstanchak at yahoo.com (Roman Stanchak) Date: Thu, 27 Oct 2005 08:43:53 -0700 (PDT) Subject: [SciPy-dev] newcore atlas info on Gentoo In-Reply-To: <43566FB3.4090108@csun.edu> Message-ID: <20051027154353.63983.qmail@web35602.mail.mud.yahoo.com> The solution to this issue was hinted at, but never explicitly stated. Just for completeness, on my Gentoo system, to detect ATLAS with newcore I placed the following in newcore/scipy/distutils/site.cfg [atlas] library_dirs = /usr/lib/blas/atlas:/usr/lib/lapack/atlas atlas_libs = lapack, blas, cblas, atlas Also, pls email if you're interested in scipy-core-svn and scipy-svn ebuilds --Roman __________________________________ Yahoo! Mail - PC Magazine Editors' Choice 2005 http://mail.yahoo.com From chanley at stsci.edu Thu Oct 27 11:57:52 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Thu, 27 Oct 2005 11:57:52 -0400 Subject: [SciPy-dev] complex data type Message-ID: <4360F900.3010305@stsci.edu> Hi Travis, While working on the records port I have run into some confusion over the handling of the "complex" data type. I would like to know if numarray's Complex32 is equivalent to a complex32 in scipy_Core? When you say complex32, are both the real and imaginary parts each 32 bits in size? Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From oliphant at ee.byu.edu Thu Oct 27 12:45:58 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 27 Oct 2005 10:45:58 -0600 Subject: [SciPy-dev] complex data type In-Reply-To: <4360F900.3010305@stsci.edu> References: <4360F900.3010305@stsci.edu> Message-ID: <43610446.3020302@ee.byu.edu> Christopher Hanley wrote: >Hi Travis, > >While working on the records port I have run into some confusion over >the handling of the "complex" data type. 
I would like to know if >numarray's Complex32 is equivalent to a complex32 in scipy_Core? When >you say complex32, are both the real and imaginary parts each 32 bits in >size? > > > No. Complex32 is for backwards compatibility only and it is equivalent to complex64 Bit-widths should be bit-widths for the whole type. complex64 and complex128 are the preferred bit-width names now. There is no complex32. -Travis From schofield at ftw.at Thu Oct 27 12:55:25 2005 From: schofield at ftw.at (Ed Schofield) Date: Thu, 27 Oct 2005 18:55:25 +0200 Subject: [SciPy-dev] Segmentation fault triggered by optimize/lbfgsb.py In-Reply-To: <435EFD5A.4050702@ee.byu.edu> References: <435E6BF5.3070103@ftw.at> <435EFD5A.4050702@ee.byu.edu> Message-ID: <4361067D.2010906@ftw.at> Travis Oliphant wrote: >Ed Schofield wrote: > > > >>The code in optimize/lbfgsb.py raises an exception and then dumps core ... >> >> >One part of the problem has been fixed in new scipy core. > > I've unearthed some more problems with rank-0 arrays. >>> import scipy # base version '0.4.3.1366' >>> f = scipy.array(0.0, scipy.float64) >>> f array(0.0) >>> f.shape () >>> scipy.rank(f) 0 >>> f[0] Traceback (most recent call last): File "", line 1, in ? IndexError: 0-d arrays can't be indexed >>> f[0] = 10 >>> f array(10.0) >>> f[-1] = 10 Segmentation fault I've committed a patch that fixes the first bug; now assigning to f[0] throws an IndexError like reading from f[0] does. The second bug is still there. Is there a way to build scipy with debugging symbols using distutils? If not, how would you recommend doing this? 
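For the record, the semantics later NumPy settled on reject both reads and writes through f[0] and f[-1] on a rank-0 array (an IndexError rather than a segfault); the empty tuple is the one valid index:

```python
import numpy as np

f = np.array(0.0)
print(f.shape)            # ()

try:
    f[-1] = 10            # assignment raises IndexError instead of crashing
except IndexError as e:
    print("rejected:", e)

f[()] = 10                # the empty tuple addresses the single element
print(f)                  # 10.0
```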
-- Ed From oliphant at ee.byu.edu Thu Oct 27 15:58:48 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 27 Oct 2005 13:58:48 -0600 Subject: [SciPy-dev] Segmentation fault triggered by optimize/lbfgsb.py In-Reply-To: <4361067D.2010906@ftw.at> References: <435E6BF5.3070103@ftw.at> <435EFD5A.4050702@ee.byu.edu> <4361067D.2010906@ftw.at> Message-ID: <43613178.4040405@ee.byu.edu> Ed Schofield wrote: >Is there a way to build scipy with debugging symbols using distutils? >If not, how would you recommend doing this? > > If you build Python with debugging symbols, then distutils automatically builds extensions with debugging symbols. I've built extensions with debugging symbols in the past when they are not in the main Python, but it is less effective. -Travis From oliphant at ee.byu.edu Thu Oct 27 20:01:39 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 27 Oct 2005 18:01:39 -0600 Subject: [SciPy-dev] All but two tests passing for me Message-ID: <43616A63.4070205@ee.byu.edu> After a fix to PyArray_Correlate all but two tests are passing for me on a 32-bit platform. I'd like to see if the problems are gone from signaltools for 64-bit platforms (there may still be some lurking int -> intp incompatibilities --- build logs from those platforms are particularly valuable). Are there any more tests that are not "turned on." What is left before we can make a release of newscipy? -Travis From Fernando.Perez at colorado.edu Thu Oct 27 20:08:31 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 27 Oct 2005 18:08:31 -0600 Subject: [SciPy-dev] All but two tests passing for me In-Reply-To: <43616A63.4070205@ee.byu.edu> References: <43616A63.4070205@ee.byu.edu> Message-ID: <43616BFF.4070404@colorado.edu> Travis Oliphant wrote: > After a fix to PyArray_Correlate all but two tests are passing for me on > a 32-bit platform. 
> > I'd like to see if the problems are gone from signaltools for 64-bit > platforms (there may still be some lurking int -> intp incompatibilities > --- build logs from those platforms are particularly valuable). > > Are there any more tests that are not "turned on." What is left before > we can make a release of newscipy? Did you have a look at the logs posted yesterday by Andrew about the Itanium builds? It seems like getting the build process on that box to work still is very manual-labor intensive. If you have any pointers, they'd be greatly appreciated. Andrew - I can try to work on this with you a little tomorrow if you are interested. We can try to get the full newscipy to build on Itanium2 and report back. I'd really like to have full 64-bit support out of the box when the new code is released. Cheers, f From oliphant at ee.byu.edu Thu Oct 27 20:31:44 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 27 Oct 2005 18:31:44 -0600 Subject: [SciPy-dev] All but two tests passing for me In-Reply-To: <43616A63.4070205@ee.byu.edu> References: <43616A63.4070205@ee.byu.edu> Message-ID: <43617170.9080506@ee.byu.edu> Travis Oliphant wrote: >After a fix to PyArray_Correlate all but two tests are passing for me on >a 32-bit platform. > > > Now all tests are passing. I'm sure issues will still be resolved. But, I'm so pleased, I just had to show everybody :-) ---------------------------------------------------------------------- Ran 1127 tests in 85.453s OK Thanks for all who are helping in the bug-fixing and conversion process. 
-Travis From oliphant at ee.byu.edu Thu Oct 27 20:38:06 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 27 Oct 2005 18:38:06 -0600 Subject: [SciPy-dev] All but two tests passing for me In-Reply-To: <43616BFF.4070404@colorado.edu> References: <43616A63.4070205@ee.byu.edu> <43616BFF.4070404@colorado.edu> Message-ID: <436172EE.3080208@ee.byu.edu> Fernando Perez wrote: >Did you have a look at the logs posted yesterday by Andrew about the Itanium >builds? It seems like getting the build process on that box to work still is >very manual-labor intensive. If you have any pointers, they'd be greatly >appreciated. > > > Yes, I looked at the logs he reported. His problems look like an issue with the Python installation. There are a lot of undefined references to basic Python C-API calls. Somehow he is not linking against the Python library? I don't know why that is the case. Perhaps scipy.distutils is not picking up where his Python library is located (it looks like he has Python installed in some non-standard place). Or when he links with f77 the Python library is not included. I'm not really sure what's going on. So, I think the problems Andrew is having have to do with non-standard installation issues and not 64-bit issues. >Andrew - I can try to work on this with you a little tomorrow if you are >interested. We can try to get the full newscipy to build on Itanium2 and >report back. I'd really like to have full 64-bit support out of the box when >the new code is released. > > I don't think his issues are 64-bit related at all. Scipy core is pretty clean with 64-bit support. Newscipy is mostly clean (signaltools had a lot of places where int * was assumed for strides and dimensions that I think were changed --- but I'm not sure if everything was fixed).
-Travis From Fernando.Perez at colorado.edu Thu Oct 27 20:42:32 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 27 Oct 2005 18:42:32 -0600 Subject: [SciPy-dev] All but two tests passing for me In-Reply-To: <436172EE.3080208@ee.byu.edu> References: <43616A63.4070205@ee.byu.edu> <43616BFF.4070404@colorado.edu> <436172EE.3080208@ee.byu.edu> Message-ID: <436173F8.8050708@colorado.edu> Travis Oliphant wrote: > Fernando Perez wrote: > > >>Did you have a look at the logs posted yesterday by Andrew about the Itanium >>builds? It seems like getting the build process on that box to work still is >>very manual-labor intensive. If you have any pointers, they'd be greatly >>appreciated. >> >> >> > > Yes, I looked at the logs he reported. His problems look like an > issue with the Python installation. There are a lot of underfined > references to basic Python C-API calls. Somehow he is not linking > against the Python library? I don't know why that is the case. > > Perhaps scipy.distutils is not picking up where his Python library is > located (it looks like he has Python installed in some non-standard > place). Or when he links with f77 the Python library is not included. > > I'm not really sure what's going on. > > So, I think the problems Andrew is having have to do with non-standard > installation issues and not 64-bit issues. OK, thanks for the feedback, Travis. I'll try to play with Andrew with this problem, and we'll report back, or ask for help if we get stuck. I know he was building against a Python built in /usr/local instead of the default system one, but in principle that should work. I'll let you know if we see problems, for now your message is very useful. 
Cheers, f From nwagner at mecha.uni-stuttgart.de Fri Oct 28 02:03:48 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 28 Oct 2005 08:03:48 +0200 Subject: [SciPy-dev] All but two tests passing for me In-Reply-To: <43617170.9080506@ee.byu.edu> References: <43616A63.4070205@ee.byu.edu> <43617170.9080506@ee.byu.edu> Message-ID: <4361BF44.6010909@mecha.uni-stuttgart.de> Travis Oliphant wrote: >Travis Oliphant wrote: > > >>After a fix to PyArray_Correlate all but two tests are passing for me on >>a 32-bit platform. >> >> >> >> >Now all tests are passing. > >I'm sure issues will still be resolved. But, I'm so pleased, I just had >to show everybody :-) > >---------------------------------------------------------------------- >Ran 1127 tests in 85.453s > >OK > > >Thanks for all who are helping in the bug-fixing and conversion process. > >-Travis > >_______________________________________________ >Scipy-dev mailing list >Scipy-dev at scipy.net >http://www.scipy.net/mailman/listinfo/scipy-dev > Hi Travis, scipy.test(1,10) on a 32-bit system yields ====================================================================== ERROR: check_nils (scipy.linalg.matfuncs.test_matfuncs.test_logm) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_matfuncs.py", line 82, in check_nils logm((identity(7)*3.1+0j)-a) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/matfuncs.py", line 232, in logm errest = norm(expm(F)-A,1) / norm(A,1) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/basic.py", line 254, in norm x = asarray_chkfinite(x) File "/usr/local/lib/python2.4/site-packages/scipy/base/function_base.py", line 211, in asarray_chkfinite raise ValueError, "array must not contain infs or NaNs" ValueError: array must not contain infs or NaNs ====================================================================== FAIL: check_nils 
(scipy.linalg.matfuncs.test_matfuncs.test_signm) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_matfuncs.py", line 44, in check_nils assert_array_almost_equal(r,cr) File "/usr/local/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[ 1.2104571e+01 -2.3056788e-02j -1.0034348e+00 -1.8445430e-01j 1.5239715e+01 +1.1528394e-02j 2.1808571e+01 -2.3... Array 2: [[ 11.9493333 -2.2453333 15.3173333 21.6533333 -2.2453333] [ -3.8426667 0.4986667 -4.5906667 -7.1866667 0.498... ---------------------------------------------------------------------- Ran 1106 tests in 4.545s FAILED (failures=1, errors=1) >>> scipy.__core_version__ '0.4.3.1371' >>> scipy.__scipy_version__ '0.4.2_1393' I guess these issues are connected with the fact that I do not use ATLAS here. Any idea how to resolve this ? 
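For anyone wanting to check the signm numbers by hand: for a diagonalizable matrix the sign function simply maps each eigenvalue to +/-1, so a reference value can be sketched in a few lines (illustrative only; scipy's signm is implemented differently, via funm with error correction):

```python
import numpy as np

def signm_ref(a):
    # sign(A) = V * sign(diag(w)) * V^-1 for a diagonalizable matrix A
    w, v = np.linalg.eig(np.asarray(a, dtype=float))
    return v @ np.diag(np.sign(w.real).astype(complex)) @ np.linalg.inv(v)

# Every eigenvalue positive => the matrix sign is the identity
a = np.array([[4.0, 1.0], [0.0, 2.0]])
print(np.round(signm_ref(a).real, 10))
```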
Nils From nwagner at mecha.uni-stuttgart.de Fri Oct 28 02:26:18 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 28 Oct 2005 08:26:18 +0200 Subject: [SciPy-dev] Possibly bug in logm Message-ID: <4361C48A.9070305@mecha.uni-stuttgart.de> >>> linalg.logm(1.0*identity(2)) array([[ 0.00000000e+000, -5.28723020e-270], [ 0.00000000e+000, 0.00000000e+000]]) >>> linalg.logm(1.0*identity(2)) array([[ 0.00000000e+000, 1.03655723e-269], [ 0.00000000e+000, 0.00000000e+000]]) >>> linalg.logm(1.0*identity(2)) array([[ 0., 0.], [ 0., 0.]]) >>> linalg.logm(1.0*identity(2)) array([[ 0.00000000e+000, 1.54221980e-269], [ 0.00000000e+000, 0.00000000e+000]]) >>> linalg.logm(1.0*identity(2)) array([[ 0., 0.], [ 0., 0.]]) >>> linalg.logm(1.0*identity(2)) array([[ 0., 0.], [ 0., 0.]]) >>> linalg.logm(1.0*identity(2)) array([[ 0.00000000e+000, -9.25299312e-270], [ 0.00000000e+000, 0.00000000e+000]]) >>> linalg.logm(1.0*identity(2)) array([[ 0., 0.], [ 0., 0.]]) >>> linalg.logm(1.0*identity(2)) array([[ 0., 0.], [ 0., 0.]]) >>> linalg.logm(1.0*identity(2)) array([[ 0.00000000e+000, -1.01171794e-269], [ 0.00000000e+000, 0.00000000e+000]]) >>> linalg.logm(1.0*identity(2)) array([[ 0. +0.00000000e+00j, 0. +6.80000020e+01j], [ 0. +0.00000000e+00j, 0. +0.00000000e+00j]]) >>> linalg.logm(1.0*identity(2)) array([[ 0., 0.], [ 0., 0.]]) >>> linalg.logm(1.0*identity(2)) array([[ 0. +0.00000000e+00j, 0. -9.91687036e+00j], [ 0. +0.00000000e+00j, 0. +0.00000000e+00j]]) The repeated computation of logm() yields strange results. Can someone reproduce this behaviour with latest svn versions ? 
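On the logm oddity itself: for the identity matrix the exact answer is the zero matrix (log 1 = 0 for every eigenvalue), so the tiny varying entries above are garbage, which suggests uninitialized memory somewhere in the routine. A deterministic reference via an eigendecomposition (a sketch for diagonalizable input only, not scipy's funm-based routine):

```python
import numpy as np

def logm_ref(a):
    # log(A) = V * log(diag(w)) * V^-1 for a diagonalizable matrix A
    w, v = np.linalg.eig(np.asarray(a, dtype=complex))
    return v @ np.diag(np.log(w)) @ np.linalg.inv(v)

# log of the identity is exactly zero -- and stays zero on repeated calls
print(logm_ref(np.eye(2)))
```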
Nils From nwagner at mecha.uni-stuttgart.de Fri Oct 28 03:02:42 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 28 Oct 2005 09:02:42 +0200 Subject: [SciPy-dev] logm Message-ID: <4361CD12.60109@mecha.uni-stuttgart.de> def logm(A,disp=1): """Matrix logarithm, inverse of expm.""" # Compute using general funm but then use better error estimator and # make one step in improving estimate using a rotation matrix. A = mat(asarray(A)) All other matrix functions use A = asarray(A) instead of mat(...) . Nils From arnd.baecker at web.de Fri Oct 28 04:14:19 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 28 Oct 2005 10:14:19 +0200 (CEST) Subject: [SciPy-dev] All but two tests passing for me In-Reply-To: <43617170.9080506@ee.byu.edu> References: <43616A63.4070205@ee.byu.edu> <43617170.9080506@ee.byu.edu> Message-ID: Hi Travis, On Thu, 27 Oct 2005, Travis Oliphant wrote: > Travis Oliphant wrote: > > >After a fix to PyArray_Correlate all but two tests are passing for me on > >a 32-bit platform. > > > > > > > Now all tests are passing. > > I'm sure issues will still be resolved. But, I'm so pleased, I just had > to show everybody :-) > > ---------------------------------------------------------------------- > Ran 1127 tests in 85.453s > > OK > > > Thanks for all who are helping in the bug-fixing and conversion process. That is good news!! From arnd.baecker at web.de Fri Oct 28 04:29:45 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 28 Oct 2005 10:29:45 +0200 (CEST) Subject: [SciPy-dev] All but two tests passing for me In-Reply-To: <43617170.9080506@ee.byu.edu> References: <43616A63.4070205@ee.byu.edu> <43617170.9080506@ee.byu.edu> Message-ID: umpf - pressed send too early .... Hi Travis On Thu, 27 Oct 2005, Travis Oliphant wrote: > Travis Oliphant wrote: > > >After a fix to PyArray_Correlate all but two tests are passing for me on > >a 32-bit platform. > > > > > > > Now all tests are passing. > > I'm sure issues will still be resolved.
But, I'm so pleased, I just had > to show everybody :-) > > ---------------------------------------------------------------------- > Ran 1127 tests in 85.453s > > OK > > > Thanks for all who are helping in the bug-fixing and conversion process. That is good news!! I don't want to spoil your party, but I think I have to ;-) In [3]: scipy.__core_version__ Out[3]: '0.4.3.1371' In [4]: scipy.__scipy_version__ Out[4]: '0.4.2_1393' scipy.test(10,verbosity=10) gives on the opteron ====================================================================== ERROR: check_integer (scipy.io.array_import.test_array_import.test_read_array) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/io/tests/test_array_import.py", line 62, in check_integer b = io.read_array(fname,atype=N.Int) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/io/array_import.py", line 359, in read_array raise ValueError, "One of the array types is invalid, k=%d" % k ValueError: One of the array types is invalid, k=0 This might be caused by: compile options: '-I/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: Lib/io/numpyiomodule.c Lib/io/numpyiomodule.c: In function `numpyio_tofile': Lib/io/numpyiomodule.c:282: warning: passing arg 1 of pointer to function from incompatible pointer type Lib/io/numpyiomodule.c: In function `numpyio_convert_objects': Lib/io/numpyiomodule.c:743: warning: passing arg 2 of pointer to function from incompatible pointer type gcc -pthread -shared build/temp.linux-x86_64-2.4/Lib/io/numpyiomodule.o -Lbuild/temp.linux-x86_64-2.4 -o build/lib.linux-x86_64-2.4/scipy/io/numpyio.so building 'scipy.fftpack._fftpack' extension compiling C sources ======================================================================
ERROR: check_simple_sym (scipy.linalg.basic.test_basic.test_solve) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 80, in check_simple_sym x = solve(a,b,sym_pos=1,lower=lower) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/basic.py", line 127, in solve raise LinAlgError, "singular matrix" LinAlgError: singular matrix ====================================================================== FAIL: check_simple (scipy.linalg.decomp.test_decomp.test_cholesky) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/tests/test_decomp.py", line 261, in check_simple assert_array_almost_equal(dot(transpose(c),c),a) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[-1885965976648932823 7796773222197633577 -761725725645001531] [ 7796773222197633577 -6619736815450336083 789149371... 
Array 2: [[8 2 3] [2 9 3] [3 3 6]] ====================================================================== FAIL: check_matvec (scipy.sparse.sparse.test_sparse.test_csc) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 8.3991160e-323 6.9169190e-323 4.4465908e-323] Array 2: [ 17. 14. 9.] ====================================================================== FAIL: check_matvec (scipy.sparse.sparse.test_sparse.test_csr) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 8.3991160e-323 6.9169190e-323 4.4465908e-323] Array 2: [ 17. 14. 9.] 
====================================================================== FAIL: check_matvec (scipy.sparse.sparse.test_sparse.test_dok) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 8.3991160e-323 6.9169190e-323 4.4465908e-323] Array 2: [ 17. 14. 9.] ====================================================================== FAIL: check_simple (scipy.linalg.basic.test_basic.test_det) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 273, in check_simple assert_almost_equal(a_det,-2.0) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: -2.0 ACTUAL: -0.0 ====================================================================== FAIL: check_simple (scipy.linalg.basic.test_basic.test_inv) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 202, in check_simple [[1,0],[0,1]]) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ 
AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[9216616637413720064 -4503599627370496] [9207609438158979072 -9007199254740992]] Array 2: [[1 0] [0 1]] ====================================================================== FAIL: check_simple_exact (scipy.linalg.basic.test_basic.test_lstsq) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 356, in check_simple_exact assert_array_almost_equal(Numeric.matrixmultiply(a,x),b) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 50.0%): Array 1: [0 0] Array 2: [1 0] ====================================================================== FAIL: check_simple_overdet (scipy.linalg.basic.test_basic.test_lstsq) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 363, in check_simple_overdet assert_array_almost_equal(x,direct_lstsq(a,b)) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [-9223372036854775808 1] Array 2: [-0.4285714 0.8571429] ====================================================================== FAIL: check_simple (scipy.linalg.basic.test_basic.test_solve) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/linalg/tests/test_basic.py", line 74, in 
check_simple assert_array_almost_equal(Numeric.matrixmultiply(a,x),b) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [[ nan nan] [ nan nan]] Array 2: [[1 0] [0 1]] ====================================================================== FAIL: check_matvec (scipy.sparse.test_sparse.test_csc) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 8.3991160e-323 6.9169190e-323 4.4465908e-323] Array 2: [ 17. 14. 9.] ====================================================================== FAIL: check_matvec (scipy.sparse.test_sparse.test_csr) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 8.3991160e-323 6.9169190e-323 4.4465908e-323] Array 2: [ 17. 14. 9.] 
====================================================================== FAIL: check_matvec (scipy.sparse.test_sparse.test_dok) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/sparse/tests/test_sparse.py", line 66, in check_matvec assert_array_almost_equal([1,2,3,4]*b,dot([1,2,3,4],b.todense())) File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/test/testing.py", line 727, in assert_array_almost_equal assert cond,\ AssertionError: Arrays are not almost equal (mismatch 100.0%): Array 1: [ 8.3991160e-323 6.9169190e-323 4.4465908e-323] Array 2: [ 17. 14. 9.] ---------------------------------------------------------------------- Ran 1124 tests in 45.943s FAILED (failures=12, errors=2) ========================================================== ========================================================== Further pointer problems: cat build_log | grep -A 5 -B 5 "incompatible pointer" from Lib/signal/sigtoolsmodule.c:9: /scr/python/include/python2.4/pyconfig.h:835:1: warning: "_POSIX_C_SOURCE" redefined In file included from /usr/include/setjmp.h:26, from Lib/signal/sigtoolsmodule.c:8: /usr/include/features.h:190:1: warning: this is the location of the previous definition Lib/signal/sigtoolsmodule.c:1610: warning: initialization from incompatible pointer type Lib/signal/sigtoolsmodule.c:1610: warning: initialization from incompatible pointer type Lib/signal/sigtoolsmodule.c:1610: warning: initialization from incompatible pointer type Lib/signal/sigtoolsmodule.c:1610: warning: initialization from incompatible pointer type Lib/signal/sigtoolsmodule.c: In function `sigtools_convolve2d': Lib/signal/sigtoolsmodule.c:1763: warning: passing arg 2 of pointer to function from incompatible pointer type Lib/signal/sigtoolsmodule.c:1773: warning: passing arg 2 of pointer to function from incompatible 
pointer type Lib/signal/sigtoolsmodule.c: In function `sigtools_median2d': Lib/signal/sigtoolsmodule.c:2140: warning: passing arg 3 of `b_medfilt2' from incompatible pointer type Lib/signal/sigtoolsmodule.c:2143: warning: passing arg 3 of `f_medfilt2' from incompatible pointer type Lib/signal/sigtoolsmodule.c:2146: warning: passing arg 3 of `d_medfilt2' from incompatible pointer type gcc: Lib/signal/medianfilter.c gcc: Lib/signal/firfilter.c gcc -pthread -shared build/temp.linux-x86_64-2.4/Lib/signal/sigtoolsmodule.o build/temp.linux-x86_64-2.4/Lib/signal/firfilter.o build/temp.linux-x86_64-2.4/Lib/signal/medianfilter.o -Lbuild/temp.linux-x86_64-2.4 -o build/lib.linux-x86_64-2.4/scipy/signal/sigtools.so building 'scipy.signal.spline' extension compiling C sources gcc options: '-pthread -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC' compile options: '-I/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c' gcc: Lib/signal/Z_bspline_util.c gcc: Lib/signal/splinemodule.c Lib/signal/splinemodule.c: In function `cspline2d': Lib/signal/splinemodule.c:84: warning: passing arg 2 of pointer to function from incompatible pointer type Lib/signal/splinemodule.c:89: warning: passing arg 1 of `convert_strides' from incompatible pointer type Lib/signal/splinemodule.c: In function `qspline2d': Lib/signal/splinemodule.c:143: warning: passing arg 2 of pointer to function from incompatible pointer type Lib/signal/splinemodule.c:148: warning: passing arg 1 of `convert_strides' from incompatible pointer type Lib/signal/splinemodule.c: In function `FIRsepsym2d': Lib/signal/splinemodule.c:200: warning: passing arg 2 of pointer to function from incompatible pointer type Lib/signal/splinemodule.c:205: warning: passing arg 1 of `convert_strides' from incompatible pointer type Lib/signal/splinemodule.c: In function `IIRsymorder1': Lib/signal/splinemodule.c:308: warning: passing arg 2 of 
pointer to function from incompatible pointer type Lib/signal/splinemodule.c:312: warning: passing arg 1 of `convert_strides' from incompatible pointer type Lib/signal/splinemodule.c: In function `IIRsymorder2': Lib/signal/splinemodule.c:428: warning: passing arg 2 of pointer to function from incompatible pointer type Lib/signal/splinemodule.c:432: warning: passing arg 1 of `convert_strides' from incompatible pointer type gcc: Lib/signal/S_bspline_util.c gcc: Lib/signal/D_bspline_util.c gcc: Lib/signal/bspline_util.c gcc: Lib/signal/C_bspline_util.c gcc -pthread -shared build/temp.linux-x86_64-2.4/Lib/signal/splinemodule.o build/temp.linux-x86_64-2.4/Lib/signal/S_bspline_util.o build/temp.linux-x86_64-2.4/Lib/signal/D_bspline_util.o build/temp.linux-x86_64-2.4/Lib/signal/C_bspline_util.o build/temp.linux-x86_64-2.4/Lib/signal/Z_bspline_util.o build/temp.linux-x86_64-2.4/Lib/signal/bspline_util.o -Lbuild/temp.linux-x86_64-2.4 -o build/lib.linux-x86_64-2.4/scipy/signal/spline.so I am completely swamped at the moment, so I won't have time today to track any of these - maybe much later today. 
Best, Arnd P.S.: gcc -v Reading specs from /scr/python/bin/../lib/gcc/x86_64-unknown-linux-gnu/3.4.4/specs Configured with: ../gcc-3.4.4/configure --prefix=/scr/python/ --enable-shared --enable-threads=posix --enable-__cxa_atexit --enable-clocale=gnu --enable-languages=c,c++,f77,objc Thread model: posix gcc version 3.4.4 python -V Python 2.4.2 In [2]: scipy.__core_config__.show() atlas_threads_info: libraries = ['lapack', 'ptf77blas', 'ptcblas', 'atlas'] library_dirs = ['/scr/python/lib64'] language = f77 define_macros = [('ATLAS_INFO', '"\\"3.7.11\\""')] include_dirs = ['/scr/python/include'] blas_opt_info: libraries = ['ptf77blas', 'ptcblas', 'atlas'] library_dirs = ['/scr/python/lib64'] define_macros = [('ATLAS_INFO', '"\\"3.7.11\\""')] language = c include_dirs = ['/scr/python/include'] atlas_blas_threads_info: libraries = ['ptf77blas', 'ptcblas', 'atlas'] library_dirs = ['/scr/python/lib64'] language = c define_macros = [('ATLAS_INFO', '"\\"3.7.11\\""')] include_dirs = ['/scr/python/include'] lapack_opt_info: libraries = ['lapack', 'ptf77blas', 'ptcblas', 'atlas'] library_dirs = ['/scr/python/lib64'] define_macros = [('ATLAS_INFO', '"\\"3.7.11\\""')] language = f77 include_dirs = ['/scr/python/include'] In [3]: scipy.__scipy_config__.show() lapack_opt_info: libraries = ['lapack', 'ptf77blas', 'ptcblas', 'atlas'] library_dirs = ['/scr/python/lib64'] define_macros = [('ATLAS_INFO', '"\\"3.7.11\\""')] language = f77 include_dirs = ['/scr/python/include'] blas_opt_info: libraries = ['ptf77blas', 'ptcblas', 'atlas'] library_dirs = ['/scr/python/lib64'] define_macros = [('ATLAS_INFO', '"\\"3.7.11\\""')] language = c include_dirs = ['/scr/python/include'] atlas_blas_threads_info: libraries = ['ptf77blas', 'ptcblas', 'atlas'] library_dirs = ['/scr/python/lib64'] language = c define_macros = [('ATLAS_INFO', '"\\"3.7.11\\""')] include_dirs = ['/scr/python/include'] djbfft_info: NOT AVAILABLE fftw_info: libraries = ['rfftw', 'fftw'] library_dirs = ['/scr/python/lib'] 
define_macros = [('SCIPY_FFTW_H', None)] include_dirs = ['/scr/python/include'] atlas_threads_info: libraries = ['lapack', 'ptf77blas', 'ptcblas', 'atlas'] library_dirs = ['/scr/python/lib64'] language = f77 define_macros = [('ATLAS_INFO', '"\\"3.7.11\\""')] include_dirs = ['/scr/python/include'] From arnd.baecker at web.de Fri Oct 28 05:09:37 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 28 Oct 2005 11:09:37 +0200 (CEST) Subject: [SciPy-dev] All but two tests passing for me In-Reply-To: References: <43616A63.4070205@ee.byu.edu> <43617170.9080506@ee.byu.edu> Message-ID: Hi, On Fri, 28 Oct 2005, Arnd Baecker wrote: > umpf - pressed send to early .... > > Hi Travis > > On Thu, 27 Oct 2005, Travis Oliphant wrote: > > > Travis Oliphant wrote: > > > > >After a fix to PyArray_Correlate all but two tests are passing for me on > > >a 32-bit platform. > > > > > > > > > > > Now all tests are passing. > > > > I'm sure issues will still be resolved. But, I'm so pleased, I just had > > to show everybody :-) > > > > ---------------------------------------------------------------------- > > Ran 1127 tests in 85.453s > > > > OK > > > > > > Thanks for all who are helping in the bug-fixing and conversion process. > > That is good news!! 
> > I don't want to spoil your party, but I think I have to ;-) > > In [3]: scipy.__core_version__ > Out[3]: '0.4.3.1371' > In [4]: scipy.__scipy_version__ > Out[4]: '0.4.2_1393' > > scipy.test(10,verbosity=10) > > gives on the opteron > > > ====================================================================== > ERROR: check_integer > (scipy.io.array_import.test_array_import.test_read_array) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/io/tests/test_array_import.py", > line 62, in check_integer > b = io.read_array(fname,atype=N.Int) > File > "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/io/array_import.py", > line 359, in read_array > raise ValueError, "One of the array types is invalid, k=%d" % k > ValueError: One of the array types is invalid, k=0 > > > This is might be caused by: > > compile options: > '-I/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include > -I/scr/python/include/python2.4 -c' > gcc: Lib/io/numpyiomodule.c > Lib/io/numpyiomodule.c: In function `numpyio_tofile': > Lib/io/numpyiomodule.c:282: warning: passing arg 1 of pointer to function > from incompatible pointer type > Lib/io/numpyiomodule.c: In function `numpyio_convert_objects': > Lib/io/numpyiomodule.c:743: warning: passing arg 2 of pointer to function > from incompatible pointer type > gcc -pthread -shared build/temp.linux-x86_64-2.4/Lib/io/numpyiomodule.o > -Lbuild/temp.linux-x86_64-2.4 -o > build/lib.linux-x86_64-2.4/scipy/io/numpyio.so > building 'scipy.fftpack._fftpack' extension > compiling C sources I tried to look into this one. 
The offending line is:

    buffer_size = _PyArray_multiply_list(arr->dimensions + k, arr->nd - k);

and later on

    out = (PyArrayObject *)PyArray_FromDims(RANK(arr), DIMS(arr), int_type);

with the definition:

    #define DIMS(arr) ((arr)->dimensions)

So it might be that arr->dimensions is not of the type it should be. So I looked for the definition of arr:

    if ((arr = (PyArrayObject *)PyArray_FromDims(1,(int*)&n,out_type)) == NULL)
        return NULL;

Is there some int * --> intp * needed? Well, I thought this was the lowest-hanging fruit ... Anyway, I really have to stop doing this now.

Best, Arnd

From pearu at scipy.org Fri Oct 28 04:52:24 2005
From: pearu at scipy.org (Pearu Peterson)
Date: Fri, 28 Oct 2005 03:52:24 -0500 (CDT)
Subject: [SciPy-dev] All but two tests passing for me
In-Reply-To:
References: <43616A63.4070205@ee.byu.edu> <43617170.9080506@ee.byu.edu>
Message-ID:

On Fri, 28 Oct 2005, Arnd Baecker wrote:

>> ======================================================================
>> ERROR: check_integer
>> (scipy.io.array_import.test_array_import.test_read_array)
>> ----------------------------------------------------------------------
>> Traceback (most recent call last):
>> File "/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/io/tests/test_array_import.py", line 62, in check_integer
>> b = io.read_array(fname,atype=N.Int)
>> File "/home/abaecker/BUILDS2/Build_52//inst_scipy_newcore/lib/python2.4/site-packages/scipy/io/array_import.py", line 359, in read_array
>> raise ValueError, "One of the array types is invalid, k=%d" % k
>> ValueError: One of the array types is invalid, k=0
>>
>> This might be caused by:
>>
>> compile options: '-I/home/abaecker/BUILDS2/Build_52/inst_scipy_newcore/lib/python2.4/site-packages/scipy/base/include -I/scr/python/include/python2.4 -c'
>> gcc: Lib/io/numpyiomodule.c
>> Lib/io/numpyiomodule.c: In function `numpyio_tofile':
>> Lib/io/numpyiomodule.c:282: warning: passing arg 1 of
pointer to function
>> from incompatible pointer type
>> Lib/io/numpyiomodule.c: In function `numpyio_convert_objects':
>> Lib/io/numpyiomodule.c:743: warning: passing arg 2 of pointer to function
>> from incompatible pointer type
>> gcc -pthread -shared build/temp.linux-x86_64-2.4/Lib/io/numpyiomodule.o
>> -Lbuild/temp.linux-x86_64-2.4 -o
>> build/lib.linux-x86_64-2.4/scipy/io/numpyio.so
>> building 'scipy.fftpack._fftpack' extension
>> compiling C sources
>
> I tried to look into this one. The offending line is:
>
> buffer_size = _PyArray_multiply_list(arr->dimensions + k, arr->nd - k);
>
> and later on
>
> out = (PyArrayObject *)PyArray_FromDims(RANK(arr), DIMS(arr), int_type);
>
> with the definition:
>
> #define DIMS(arr) ((arr)->dimensions)
>
> So it might be that arr->dimensions is not of the type it should be.

The type of arr->dimensions has been changed from int* to intp* in newcore. So, when porting extension modules to newcore, the following replacements are needed:

    int* dimensions, strides;  ->  intp* dimensions, strides;
    PyArray_FromDims           ->  PyArray_SimpleNew
    PyArray_FromDimsAndData    ->  PyArray_SimpleNewFromData

> So I looked for the definition of arr:
>
> if ((arr = (PyArrayObject *)PyArray_FromDims(1,(int*)&n,out_type)) ==
> NULL)
> return NULL;
>
> Is there some int * --> intp * needed ?

Yes, I have committed the patch already to the SVN repository.

Pearu

From arnd.baecker at web.de Fri Oct 28 07:18:44 2005
From: arnd.baecker at web.de (Arnd Baecker)
Date: Fri, 28 Oct 2005 13:18:44 +0200 (CEST)
Subject: [SciPy-dev] All but two tests passing for me
In-Reply-To:
References: <43616A63.4070205@ee.byu.edu> <43617170.9080506@ee.byu.edu>
Message-ID:

Hi Pearu,

On Fri, 28 Oct 2005, Pearu Peterson wrote:

> The type of arr->dimensions has been changed from int* to intp* in
> newcore.
So, when porting extension modules to newcore, the following > replacements are needed > > int* dimensions, strides; -> intp* dimensions, strides; > PyArray_FromDims -> PyArray_SimpleNew > PyArray_FromDimsAndData -> PyArray_SimpleNewFromData thanks for this conversion table! > > So I looked for the definition of arr: > > > > if ((arr = (PyArrayObject *)PyArray_FromDims(1,(int*)&n,out_type)) == > > NULL) > > return NULL; > > > > Is there some int * --> intp * needed ? > > Yes, I have commited the patch already to SVN repository. No compile error anymore, but does not quite work yet: ====================================================================== ERROR: check_integer (scipy.io.array_import.test_array_import.test_read_array) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_53/inst_scipy_newcore/lib/python2.4/site-packages/scipy/io/tests/test_array_import.py", line 62, in check_integer b = io.read_array(fname,atype=N.Int) File "/home/abaecker/BUILDS2/Build_53//inst_scipy_newcore/lib/python2.4/site-packages/scipy/io/array_import.py", line 359, in read_array raise ValueError, "One of the array types is invalid, k=%d" % k ValueError: One of the array types is invalid, k=0 In [2]: scipy.__core_version__ Out[2]: '0.4.3.1376' In [3]: scipy.__scipy_version__ Out[3]: '0.4.2_1398' Sorry, have to rush now - will be back later, Arnd From nwagner at mecha.uni-stuttgart.de Fri Oct 28 11:12:31 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 28 Oct 2005 17:12:31 +0200 Subject: [SciPy-dev] ERROR: check_simple_todense Message-ID: ====================================================================== ERROR: check_simple_todense (scipy.io.mmio.test_mmio.test_mmio_coordinate) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/io/tests/test_mmio.py", line 152, in 
check_simple_todense b = mmread(fn).todense() File "/usr/local/lib/python2.4/site-packages/scipy/io/mmio.py", line 215, in mmread a = coo_matrix(data,(row,col),M=rows,N=cols,dtype=dtype) TypeError: __init__() got an unexpected keyword argument 'M' ---------------------------------------------------------------------- Ran 1124 tests in 92.966s From schofield at ftw.at Fri Oct 28 12:49:05 2005 From: schofield at ftw.at (Ed Schofield) Date: Fri, 28 Oct 2005 18:49:05 +0200 Subject: [SciPy-dev] ERROR: check_simple_todense In-Reply-To: References: Message-ID: <43625681.6040807@ftw.at> Nils Wagner wrote: >====================================================================== >ERROR: check_simple_todense >(scipy.io.mmio.test_mmio.test_mmio_coordinate) >---------------------------------------------------------------------- >Traceback (most recent call last): > File >"/usr/local/lib/python2.4/site-packages/scipy/io/tests/test_mmio.py", >line 152, in check_simple_todense > b = mmread(fn).todense() > File >"/usr/local/lib/python2.4/site-packages/scipy/io/mmio.py", >line 215, in mmread > a = >coo_matrix(data,(row,col),M=rows,N=cols,dtype=dtype) >TypeError: __init__() got an unexpected keyword argument >'M' > > Oops, my fault! Should be fixed now. 
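The TypeError above is a common porting hazard: a call site still passing keyword arguments that a refactored constructor no longer accepts. A minimal, self-contained Python sketch of the failure mode (the class name and keywords below are illustrative stand-ins, not scipy's actual coo_matrix signature):

```python
class CooLike:
    # Newer-style constructor: the matrix shape travels as a single
    # argument instead of separate M=/N= keywords.
    def __init__(self, data, ij, dims=None):
        self.data, self.ij, self.dims = data, ij, dims

# Old-style call site, written against the previous signature:
try:
    CooLike([1.0], ([0], [2]), M=5, N=5)
except TypeError as exc:
    print(exc)  # ... got an unexpected keyword argument 'M'

# Updated call site matching the new signature:
m = CooLike([1.0], ([0], [2]), dims=(5, 5))
print(m.dims)
```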
-- Ed From nwagner at mecha.uni-stuttgart.de Fri Oct 28 13:02:15 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 28 Oct 2005 19:02:15 +0200 Subject: [SciPy-dev] ERROR: check_simple_todense In-Reply-To: <43625681.6040807@ftw.at> References: <43625681.6040807@ftw.at> Message-ID: On Fri, 28 Oct 2005 18:49:05 +0200 Ed Schofield wrote: > Nils Wagner wrote: > >>====================================================================== >>ERROR: check_simple_todense >>(scipy.io.mmio.test_mmio.test_mmio_coordinate) >>---------------------------------------------------------------------- >>Traceback (most recent call last): >> File >>"/usr/local/lib/python2.4/site-packages/scipy/io/tests/test_mmio.py", >>line 152, in check_simple_todense >> b = mmread(fn).todense() >> File >>"/usr/local/lib/python2.4/site-packages/scipy/io/mmio.py", >>line 215, in mmread >> a = >>coo_matrix(data,(row,col),M=rows,N=cols,dtype=dtype) >>TypeError: __init__() got an unexpected keyword argument >>'M' >> >> > Oops, my fault! Should be fixed now. > Yes. Thank you. Nils > -- Ed > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev From arnd.baecker at web.de Fri Oct 28 13:54:17 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 28 Oct 2005 19:54:17 +0200 (CEST) Subject: [SciPy-dev] All but two tests passing for me In-Reply-To: References: <43616A63.4070205@ee.byu.edu> <43617170.9080506@ee.byu.edu> Message-ID: On Fri, 28 Oct 2005, Arnd Baecker wrote: [...] 
> No compile error anymore, but does not quite work yet:
>
> ======================================================================
> ERROR: check_integer
> (scipy.io.array_import.test_array_import.test_read_array)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
> File "/home/abaecker/BUILDS2/Build_53/inst_scipy_newcore/lib/python2.4/site-packages/scipy/io/tests/test_array_import.py", line 62, in check_integer
> b = io.read_array(fname,atype=N.Int)
> File "/home/abaecker/BUILDS2/Build_53//inst_scipy_newcore/lib/python2.4/site-packages/scipy/io/array_import.py", line 359, in read_array
> raise ValueError, "One of the array types is invalid, k=%d" % k
> ValueError: One of the array types is invalid, k=0
>
> In [2]: scipy.__core_version__
> Out[2]: '0.4.3.1376'
> In [3]: scipy.__scipy_version__
> Out[3]: '0.4.2_1398'
>
> Sorry, have to rush now - will be back later,

Looking at the corresponding test:

    def check_integer(self):
        from scipy import stats
        a = stats.randint.rvs(1,20,size=(3,4))
        fname = tempfile.mktemp('.dat')
        io.write_array(fname,a)
        b = io.read_array(fname,atype=N.Int)
        assert_array_equal(a,b)
        os.remove(fname)

and executing this line by line shows the error for

    b = io.read_array(fname,atype=N.Int)

Doing

    b = io.read_array(fname)

reads in the array, but it gives floats. However,

    b = io.read_array(fname,atype=N.Int32)

works. If this is the intended behaviour (also on 32Bit), the unit test should be changed accordingly...

Best, Arnd

From arnd.baecker at web.de Fri Oct 28 14:00:11 2005
From: arnd.baecker at web.de (Arnd Baecker)
Date: Fri, 28 Oct 2005 20:00:11 +0200 (CEST)
Subject: [SciPy-dev] newscipy and ATLAS
Message-ID:

Hi,

I tried to install newcore and newscipy on a different machine and ATLAS is not detected. The reason for this is that on this debian sarge box

> dpkg -L atlas3-base
/.
/usr /usr/lib /usr/lib/libatlas.so.3.0 /usr/lib/libcblas.so.3.0 /usr/lib/libf77blas.so.3.0 /usr/lib/liblapack_atlas.so.3.0 /usr/lib/atlas /usr/lib/atlas/libblas.so.3.0 /usr/lib/atlas/liblapack.so.3.0 So there are no symbolic links from /usr/lib/libatlas.so to /usr/lib/libatlas.so.3.0 Would it be possible that scipy distutils also searches for the .so.X.Y ones - or would this lead to problems in the linking stage? Best, Arnd From arnd.baecker at web.de Fri Oct 28 14:37:35 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Fri, 28 Oct 2005 20:37:35 +0200 (CEST) Subject: [SciPy-dev] float exception, fblas cgemv - newscipy Message-ID: Hi, on another debian sarge based 32Bit machine (where I made a couple of links so that ATLAS is found) I get the following check_default_beta_y (scipy.linalg.fblas.test_fblas.test_cgemv) ... zsh: 8527 floating point exception The shortest example to reproduce this is gdb file /usr/bin/python run from scipy import * from scipy.basic.random import normal a = normal(0.,1.,(3,3)) x = arange(shape(a)[0]) print a print x linalg.fblas.cgemv(1,a,x) Giving >>> print a [[-0.70685647 0.65494695 0.9179007 ] [-0.30862233 0.58137533 2.01614468] [ 0.18585164 -0.8134587 1.23613689]] >>> print x [0 1 2] >>> linalg.fblas.cgemv(1,a,x) Program received signal SIGFPE, Arithmetic exception. [Switching to Thread 16384 (LWP 26791)] 0x404167a0 in ATL_cgemvC_a1_x1_bXi0_y1 () from /usr/lib/sse2/libatlas.so.3 Does anyone else see this? Any ideas what to do with this? 
Best, Arnd System details: ipython import scipy scipy.__core_version__ scipy.__scipy_version__ scipy.__scipy_config__.show() gives Python 2.3.5 (#2, Sep 4 2005, 22:01:42) In [1]:import scipy Importing io to scipy Importing fftpack to scipy Importing special to scipy Importing cluster to scipy Importing sparse to scipy Importing utils to scipy Importing interpolate to scipy Importing integrate to scipy Importing signal to scipy Importing optimize to scipy Importing linalg to scipy Importing stats to scipy In [2]:scipy.__core_version__ Out[2]:'0.4.3.1376' In [3]:scipy.__scipy_version__ Out[3]:'0.4.2_1400' In [4]:scipy.__scipy_config__.show() dfftw_info: NOT AVAILABLE blas_opt_info: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/sse2'] define_macros = [('ATLAS_INFO', '"\\"3.6.0\\""')] language = c djbfft_info: NOT AVAILABLE atlas_blas_threads_info: NOT AVAILABLE lapack_opt_info: libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/atlas', '/usr/lib/'] define_macros = [('ATLAS_INFO', '"\\"3.6.0\\""')] language = f77 atlas_info: libraries = ['lapack', 'f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/atlas', '/usr/lib/'] language = f77 define_macros = [('ATLAS_INFO', '"\\"3.6.0\\""')] fftw_info: NOT AVAILABLE atlas_blas_info: libraries = ['f77blas', 'cblas', 'atlas'] library_dirs = ['/usr/lib/sse2'] language = c define_macros = [('ATLAS_INFO', '"\\"3.6.0\\""')] atlas_threads_info: NOT AVAILABLE Atlas: ls -l /usr/lib/atlas total 17168 lrwxrwxrwx 1 root root 12 Oct 25 00:32 libblas.so -> libblas.so.3 lrwxrwxrwx 1 root root 14 Mar 23 2005 libblas.so.2 -> libblas.so.2.3 -rw-r--r-- 1 root root 3051888 Mar 4 2005 libblas.so.2.3 lrwxrwxrwx 1 root root 14 Nov 18 2004 libblas.so.3 -> libblas.so.3.0 -rw-r--r-- 1 root root 3443424 Oct 27 2004 libblas.so.3.0 lrwxrwxrwx 1 root root 14 Oct 25 00:32 liblapack.so -> liblapack.so.3 lrwxrwxrwx 1 root root 16 Mar 23 2005 liblapack.so.2 -> liblapack.so.2.3 -rw-r--r-- 1 root root 5500688 
Mar 4 2005 liblapack.so.2.3
lrwxrwxrwx 1 root root 16 Nov 18 2004 liblapack.so.3 -> liblapack.so.3.0
-rw-r--r-- 1 root root 5537840 Oct 27 2004 liblapack.so.3.0
drwxr-xr-x 2 root root 4096 Nov 18 2004 sse2

ls -l /usr/lib/ | grep atlas
drwxr-xr-x 3 root root 4096 Oct 25 00:32 atlas
lrwxrwxrwx 1 root root 16 Mar 22 2004 libatlas.so -> sse2/libatlas.so
lrwxrwxrwx 1 root root 15 Mar 23 2005 libatlas.so.2 -> libatlas.so.2.3
-rw-r--r-- 1 root root 2835272 Mar 4 2005 libatlas.so.2.3
lrwxrwxrwx 1 root root 15 Nov 18 2004 libatlas.so.3 -> libatlas.so.3.0
-rw-r--r-- 1 root root 3234288 Oct 27 2004 libatlas.so.3.0
lrwxrwxrwx 1 root root 34 Mar 22 2004 liblapack.so -> /usr/lib/atlas/sse2/liblapack.so.3
lrwxrwxrwx 1 root root 25 Mar 22 2004 liblapack_atlas.so -> sse2/liblapack_atlas.so.3
lrwxrwxrwx 1 root root 22 Mar 23 2005 liblapack_atlas.so.2 -> liblapack_atlas.so.2.3
-rw-r--r-- 1 root root 60568 Mar 4 2005 liblapack_atlas.so.2.3
lrwxrwxrwx 1 root root 22 Nov 18 2004 liblapack_atlas.so.3 -> liblapack_atlas.so.3.0
-rw-r--r-- 1 root root 131224 Oct 27 2004 liblapack_atlas.so.3.0

From ryanlists at gmail.com Fri Oct 28 14:49:42 2005
From: ryanlists at gmail.com (Ryan Krauss)
Date: Fri, 28 Oct 2005 14:49:42 -0400
Subject: [SciPy-dev] patch for minpack.py Newton
Message-ID:

I am using the Newton algorithm in minpack.py to find the roots of a complex expression. This seems to work correctly, but the error raised if it doesn't converge wasn't written with the possibility of complex expressions in mind. I have changed the ending of the newton function (line 330) from

    raise RuntimeError, "Failed to converge after %d iterations, value is %f" % (maxiter,p)

to

    if type(p) is complex:
        raise RuntimeError, "Failed to converge after %d iterations, value is %f%+fj" % (maxiter,real(p),imag(p))
    else:
        raise RuntimeError, "Failed to converge after %d iterations, value is %f" % (maxiter,p)

This just makes the function raise a more helpful error message. Otherwise it reads "error: float is required".
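The "float is required" failure is easy to reproduce in plain Python, independent of scipy: the %f conversion rejects complex values, while %s formats both types uniformly, so a single error string works without branching on type(p). A small sketch (the iteration count 50 is illustrative):

```python
p_complex = 1.5 - 0.5j
p_float = 1.5

# %f only accepts values convertible to float, so a complex root
# turns the intended RuntimeError text into a TypeError instead.
try:
    "Failed to converge after %d iterations, value is %f" % (50, p_complex)
except TypeError as exc:
    print("formatting failed:", exc)

# %s handles either type, so no type(p) branch is needed:
print("Failed to converge after %d iterations, value is %s" % (50, p_complex))
print("Failed to converge after %d iterations, value is %s" % (50, p_float))
```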
I made this change on my machine. Is it valuable and what is the procedure to make it part of scipy? And is it worth doing if I am not in the newscipy yet? Thanks, Ryan From oliphant at ee.byu.edu Fri Oct 28 14:49:20 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 28 Oct 2005 12:49:20 -0600 Subject: [SciPy-dev] [matplotlib-devel] new scipy and numerix In-Reply-To: <200510280957.53966.dd55@cornell.edu> References: <200510280957.53966.dd55@cornell.edu> Message-ID: <436272B0.8010503@ee.byu.edu> Darren Dale wrote: >I was wondering, has anyone looked into extending numerix to handle the new >scipy? All scipy tests now pass on my 32bit system, but there is some >incompatibility between new scipy and mpl. For example: > >from pylab import * >from scipy import * >plot(linspace(-1,1,100)) >show() > > Which version of Numeric do you have? Currently you need Numeric 24.0 to use matplotlib with scipy. Numeric 24.0 uses the array interface to convert scipy arrays. If you do have the latest version. What problems are you seeing? I think the array interface is a better way to go then the numerix approach. John Hunter has indicated that he will eventually move to requiring newcore for matplotlib. I would rather push that direction, then have a messy 3-way numerix maintenance. -Travis From pearu at scipy.org Fri Oct 28 13:51:44 2005 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 28 Oct 2005 12:51:44 -0500 (CDT) Subject: [SciPy-dev] newscipy and ATLAS In-Reply-To: References: Message-ID: On Fri, 28 Oct 2005, Arnd Baecker wrote: > Hi, > > I tried to install newcore and newscipy on a different machine > and ATLAS is not detected. > The reason for this is that on this debian sarge box > >> dpkg -L atlas3-base > /. 
> /usr
> /usr/lib
> /usr/lib/libatlas.so.3.0
> /usr/lib/libcblas.so.3.0
> /usr/lib/libf77blas.so.3.0
> /usr/lib/liblapack_atlas.so.3.0
> /usr/lib/atlas
> /usr/lib/atlas/libblas.so.3.0
> /usr/lib/atlas/liblapack.so.3.0
>
> So there are no symbolic links from
> /usr/lib/libatlas.so to /usr/lib/libatlas.so.3.0
>
> Would it be possible that scipy distutils also searches
> for the .so.X.Y ones - or would this lead to problems
> in the linking stage?

In principle it would be possible. However, users should install the atlas3-base-dev package when building scipy themselves. That's why *-dev packages are provided in the first place. I guess only debian developers or packagers should build scipy against .so.X.Y libraries.

Pearu

From arnd.baecker at web.de Fri Oct 28 15:15:46 2005
From: arnd.baecker at web.de (Arnd Baecker)
Date: Fri, 28 Oct 2005 21:15:46 +0200 (CEST)
Subject: [SciPy-dev] newscipy and ATLAS
In-Reply-To:
References:
Message-ID:

On Fri, 28 Oct 2005, Pearu Peterson wrote:

> On Fri, 28 Oct 2005, Arnd Baecker wrote:
>
> > Hi,
> >
> > I tried to install newcore and newscipy on a different machine
> > and ATLAS is not detected.
> > The reason for this is that on this debian sarge box
> >
> >> dpkg -L atlas3-base
> > /.
> > /usr
> > /usr/lib
> > /usr/lib/libatlas.so.3.0
> > /usr/lib/libcblas.so.3.0
> > /usr/lib/libf77blas.so.3.0
> > /usr/lib/liblapack_atlas.so.3.0
> > /usr/lib/atlas
> > /usr/lib/atlas/libblas.so.3.0
> > /usr/lib/atlas/liblapack.so.3.0
> >
> > So there are no symbolic links from
> > /usr/lib/libatlas.so to /usr/lib/libatlas.so.3.0
> >
> > Would it be possible that scipy distutils also searches
> > for the .so.X.Y ones - or would this lead to problems
> > in the linking stage?
>
> In principle it would be possible. However, users should install
> atlas3-base-dev package when building scipy themselves. That's why *-dev
> packages are provided in the first place.
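What the -dev package adds can be mimicked in a few lines. This hedged sketch (temporary directory, illustrative file names) shows the packaging split: the runtime package ships only the versioned library, while the -dev package supplies the unversioned name that a `-latlas` link line, and a library search like scipy distutils', actually resolves:

```python
import os
import tempfile

libdir = tempfile.mkdtemp()

# atlas3-base ships only the versioned runtime library ...
open(os.path.join(libdir, "libatlas.so.3.0"), "w").close()

# ... while atlas3-base-dev adds the unversioned symlink that the
# linker resolves when asked for -latlas.
os.symlink("libatlas.so.3.0", os.path.join(libdir, "libatlas.so"))

print(sorted(os.listdir(libdir)))
print(os.path.islink(os.path.join(libdir, "libatlas.so")))
```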
Fair enough - I thought the links were done already on the atlas3-base level and that something has changed in the transition to sarge. In my case atlas3-sse2-dev was not installed, which (BTW) clearly says """ This package includes the static libraries and symbolic links needed for program development. """ > I guess only debian developers or packagers should build scipy against > .so.X.Y libraries. Alright, so I better don't do that ;-) Many thanks for the clarification, Arnd From cookedm at physics.mcmaster.ca Fri Oct 28 15:24:03 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Fri, 28 Oct 2005 15:24:03 -0400 Subject: [SciPy-dev] patch for minpack.py Newton In-Reply-To: (Ryan Krauss's message of "Fri, 28 Oct 2005 14:49:42 -0400") References: Message-ID: Ryan Krauss writes: > I am using the Newton algorithm in minpack.py to find the roots of a > complex expression. This seems to work correctly, but the error > raised if it doesn't convert wasn't written with the possiblity of > complex expressions in mind. I have changed the ending of the newton > function (line 330) from > > raise RuntimeError, "Failed to converge after %d iterations, value is > %f" % (maxiter,p) > > to > > if type(p) is complex: > raise RuntimeError, "Failed to converge after %d iterations, > value is %f%+fj" % (maxiter,real(p),imag(p)) > else: > raise RuntimeError, "Failed to converge after %d iterations, > value is %f" % (maxiter,p) > > This just makes the function raise a more helpful error message. > Otherwise it reads, error float is required. > > I made this change on my machine. Is it valuable and what is the > procedure to make it part of scipy? And is it worth doing if I am not > in the newscipy yet? > > Thanks, > > Ryan Hi Ryan, I've added a fix to the newscipy branch. I use %s instead of pulling real and imaginary parts out. Reporting here is fine (for now :). 
I don't think people are keeping an eye on the bug tracker, as we're busy with the -newcore and -newscipy branches. -- |>|\/|< /--------------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From ryanlists at gmail.com Fri Oct 28 16:04:59 2005 From: ryanlists at gmail.com (Ryan Krauss) Date: Fri, 28 Oct 2005 16:04:59 -0400 Subject: [SciPy-dev] patch for minpack.py Newton In-Reply-To: References: Message-ID: Thanks David. I didn't realize %s would work for complex or float. That definitely seems like the right solution. Ryan On 10/28/05, David M. Cooke wrote: > Ryan Krauss writes: > > > I am using the Newton algorithm in minpack.py to find the roots of a > > complex expression. This seems to work correctly, but the error > > raised if it doesn't converge wasn't written with the possibility of > > complex expressions in mind. I have changed the ending of the newton > > function (line 330) from > > > > raise RuntimeError, "Failed to converge after %d iterations, value is > > %f" % (maxiter,p) > > > > to > > > > if type(p) is complex: > > raise RuntimeError, "Failed to converge after %d iterations, > > value is %f%+fj" % (maxiter,real(p),imag(p)) > > else: > > raise RuntimeError, "Failed to converge after %d iterations, > > value is %f" % (maxiter,p) > > > > This just makes the function raise a more helpful error message. > > Otherwise it reads, error float is required. > > > > I made this change on my machine. Is it valuable and what is the > > procedure to make it part of scipy? And is it worth doing if I am not > > in the newscipy yet? > > > > Thanks, > > > > Ryan > > Hi Ryan, > > I've added a fix to the newscipy branch. I use %s instead of pulling > real and imaginary parts out. > > Reporting here is fine (for now :). I don't think people are keeping > an eye on the bug tracker, as we're busy with the -newcore and > -newscipy branches.
> > -- > |>|\/|< > /--------------------------------------------------------------------------\ > |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ > |cookedm at physics.mcmaster.ca > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From oliphant at ee.byu.edu Fri Oct 28 17:50:43 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 28 Oct 2005 15:50:43 -0600 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <435FD143.1050903@colorado.edu> References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> <1130345637.4822.24.camel@E011704> <435FBF12.9010406@colorado.edu> <435FD143.1050903@colorado.edu> Message-ID: <43629D33.1000704@ee.byu.edu> Fernando Perez wrote: >Charles R Harris wrote: > > > >>Yes, I agree with this. The only problem I see is if someone wants to save >>space when taking the sqrt of an integral array. There are at least three >>possibilities: >> >>1. cast the result to a float >>2. cast the argument to a float >>3. use a special sqrtf function >> >>The first two options use more temporary space, take more time, and look >>uglier (IMHO). On the other hand, the needed commands are already >>implemented. The last option is clear and concise, but needs a new ufunc. >> >> > >Well, while I agree with the recently posted design guideline from Guido of >using different functions for different purposes rather than flags, this may >be a case where a flag would be a good choice. Especially because we already >have a conceptual precedent for the accumulators of specifying the return type >via a flag: a.sum(rtype=int).
> >Since the 'mental slot' is already in scipy users' heads for saying 'modify >the default output of this function to accumulate/store data in a different >type', I think it would be reasonable to offer > >sqrt(a,rtype=float) > > One thing we could do is take advantage of the "indexing capabilities" of the ufunc object which are unused at the moment and do something like sqrt[float](a) Where the argument to the index would be the desired output type or something. -Travis From dd55 at cornell.edu Fri Oct 28 20:55:48 2005 From: dd55 at cornell.edu (Darren Dale) Date: Fri, 28 Oct 2005 20:55:48 -0400 Subject: [SciPy-dev] [matplotlib-devel] new scipy and numerix In-Reply-To: <436272B0.8010503@ee.byu.edu> References: <200510280957.53966.dd55@cornell.edu> <436272B0.8010503@ee.byu.edu> Message-ID: <200510282055.48259.dd55@cornell.edu> On Friday 28 October 2005 2:49 pm, Travis Oliphant wrote: > Darren Dale wrote: > >I was wondering, has anyone looked into extending numerix to handle the > > new scipy? All scipy tests now pass on my 32bit system, but there is some > > incompatibility between new scipy and mpl. For example: > > > >from pylab import * > >from scipy import * > >plot(linspace(-1,1,100)) > >show() > > Which version of Numeric do you have? Currently you need Numeric 24.0 > to use matplotlib with scipy. Numeric 24.0 uses the array interface > to convert scipy arrays. If you do have the latest version, what > problems are you seeing? Sorry, I didn't realize installing Numeric 24 would improve things. I just upgraded and the problem is gone. > I think the array interface is a better way to go than the numerix > approach. John Hunter has indicated that he will eventually move to > requiring newcore for matplotlib. I would rather push that direction > than have a messy 3-way numerix maintenance. Sounds good to me.
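The ufunc-indexing idea Travis floats above (sqrt[float](a)) can be prototyped in pure Python (a sketch only: the real ufunc object would implement this in C, and the TypedUfunc name is invented here):

```python
import math

class TypedUfunc:
    """Hypothetical wrapper: f[outtype](x) evaluates f and casts the
    result to outtype, mimicking the proposed sqrt[float](a) syntax."""

    def __init__(self, func):
        self.func = func

    def __call__(self, x):
        # plain call: unchanged behaviour
        return self.func(x)

    def __getitem__(self, outtype):
        # indexing selects the desired output type
        return lambda x: outtype(self.func(x))

sqrt = TypedUfunc(math.sqrt)
```

With this, sqrt(4) behaves as usual while sqrt[int](9) returns an int -- which is roughly the same 'mental slot' the proposed rtype flag would occupy.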
Darren From arnd.baecker at web.de Sat Oct 29 11:20:40 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sat, 29 Oct 2005 17:20:40 +0200 (CEST) Subject: [SciPy-dev] newscipy - special problems Message-ID: I installed newcore and newscipy on my laptop and ran into two types of problems for scipy.test(10,10): a) linalg problems: Both test_cgemv and test_sgemv give a floating point exception. Defining around line 402 in site-packages/scipy/linalg/tests/test_fblas.py class test_cgemv: pass class test_sgemv: pass skips those tests and all other linalg related ones are ok. b) special problems The first one is: check_gdtrix (scipy.special.basic.test_basic.test_cephes) ... ok check_hankel1 (scipy.special.basic.test_basic.test_cephes) floating point exception If I comment out the following tests all other tests pass! - check_hankel1 cbesh_wrap1 - check_hankel1e cbesh_wrap1_e - check_hankel2 cbesh_wrap2 - check_hankel2e cbesh_wrap2_e - check_h1vp uses hankel1 - check_h2vp uses hankel1 - test_ive cbesi_wrap_e - test_jve cbesj_wrap_e - test_k0 test uses k0(0.1) and kv(0,0.1) - test_k0e _cephesmodule.c, direct - test_k1 _cephesmodule.c, direct - test_k1e _cephesmodule.c, direct - test_kv cbesk_wrap - test_kve cbesk_wrap_e - test_kvp uses kv - test_y0_zeros uses: specfun.cyzo(nt,kf,kc) - test_yve ? It looks as if most (all?)
are coming from the amos_wrappers.c E.g.: For import scipy.special._cephes as cephes cephes.hankel1(1,1) I still see all printf's before the call of F_FUNC Py_complex cbesh_wrap1( double v, Py_complex z) { int n = 1; int kode = 1; int m = 1; int nz, ierr; Py_complex cy; printf("IN: cbesh_wrap1\n"); printf("v %e\n",v); printf("z %e\n",z.real); printf("z %e\n",z.imag); F_FUNC(zbesh,ZBESH)(CADDR(z), &v, &kode, &m, &n, CADDR(cy), &nz, &ierr); printf("IN2: cbesh 2 ...\n"); DO_MTHERR("hankel1:"); return cy; } Inserting some print's in zbesh.f shows that floating point exception happens here: print *, " in ZBESH1 C" BB=DBLE(FLOAT(I1MACH(9)))*0.5D0 print *, " in ZBESH1 D" At this point I am stuck - What could go wrong? Is there a way to compile this particular wrapper (or the whole cephes) without optimization to find out if this is a compiler bug? Best, Arnd SystemInfo: - debian sarge - gcc version 3.3.5 (Debian 1:3.3.5-13) - CPU information: getNCPUs has_mmx has_sse is_32bit is_Intel is_Pentium is_PentiumII is_PentiumIII is_i686 is_singleCPU ------ - In [2]: scipy.__core_version__ Out[2]: '0.4.3.1386' - In [4]: scipy.__scipy_version__ Out[4]: '0.4.2_1402' From Ralf_Ahlbrink at web.de Sat Oct 29 13:45:05 2005 From: Ralf_Ahlbrink at web.de (Ralf Ahlbrink) Date: Sat, 29 Oct 2005 19:45:05 +0200 Subject: [SciPy-dev] vectorize and arguments with default settings Message-ID: <200510291945.06001.Ralf_Ahlbrink@web.de> Hello! With the old scipy_core vectorize-ing functions, which have arguments with default settings, was possible. In the newcore (up to the current rev. 1386) the code is introspected, but default settings are not taken into account. I've attached a patch. Ralf. -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: function_base-vectorize_introspection.diff Type: text/x-diff Size: 1347 bytes Desc: not available URL: From oliphant at ee.byu.edu Sat Oct 29 14:16:28 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 29 Oct 2005 12:16:28 -0600 Subject: [SciPy-dev] vectorize and arguments with default settings In-Reply-To: <200510291945.06001.Ralf_Ahlbrink@web.de> References: <200510291945.06001.Ralf_Ahlbrink@web.de> Message-ID: <4363BC7C.3000701@ee.byu.edu> Ralf Ahlbrink wrote: > Hello! > > With the old scipy_core vectorize-ing functions, which have arguments > with default settings, was possible. > > In the newcore (up to the current rev. 1386) the code is introspected, > > but default settings are not taken into account. > > I've attached a patch. > Thanks for the patch. I modified it so that the ufunc is created again if you call it with a different number of arguments, and included it. -Travis From oliphant at ee.byu.edu Sat Oct 29 14:25:31 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 29 Oct 2005 12:25:31 -0600 Subject: [SciPy-dev] newscipy - special problems In-Reply-To: References: Message-ID: <4363BE9B.8050300@ee.byu.edu> Arnd Baecker wrote: > print *, " in ZBESH1 C" > BB=DBLE(FLOAT(I1MACH(9)))*0.5D0 > print *, " in ZBESH1 D" > > Is the problem with i1mach? You could replace the line with BB=DBLE(2147483647)*0.5D0 or BB=DBLE(FLOAT(21477483647))*0.5D0 and see if that helps. The other problems may be ATLAS issues. Yes, it could be compiler optimization problems. Presumably distutils has a way to turn off compiler optimizations. You could compile the amos library manually without compiler optimizations and just place it in the build/tempYYYYY/ directory where it will be picked up by distutils. 
-Travis From arnd.baecker at web.de Sat Oct 29 17:36:13 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sat, 29 Oct 2005 23:36:13 +0200 (CEST) Subject: [SciPy-dev] newscipy - special problems In-Reply-To: <4363BE9B.8050300@ee.byu.edu> References: <4363BE9B.8050300@ee.byu.edu> Message-ID: On Sat, 29 Oct 2005, Travis Oliphant wrote: > Arnd Baecker wrote: > > > print *, " in ZBESH1 C" > > BB=DBLE(FLOAT(I1MACH(9)))*0.5D0 > > print *, " in ZBESH1 D" > > > > > Is the problem with i1mach? I don't think so (at least not directly) as BB=I1MACH(9) got me beyond that point. > You could replace the line with > > BB=DBLE(2147483647)*0.5D0 also gets me beyond that point but then gets a floating point exception when calling CALL ZBKNU(ZNR, ZNI, FNU, KODE, NN, CYR, CYI, NZ, TOL, ELIM, ALIM) (whose code contains similarly DBLE(FLOAT(I1MACH(14)-1)) - but I did not follow this further ...) > or > > BB=DBLE(FLOAT(21477483647))*0.5D0 leads to: ^ Integer at (^) too large error: Command "/usr/bin/g77 -Wall -fno-second-underscore -fPIC -O3 -funroll-loops -march=pentium3 -mmmx -msse -fomit-frame-pointer -malign-double -c -c Lib/special/amos/zbesh.f -o build/temp.linux-i686-2.3/Lib/special/amos/zbesh.o" failed with exit status 1 > and see if that helps. > > > The other problems may be ATLAS issues. Though that ATLAS works fine with "old" scipy and the corresponding debian packages. So I am not sure what to do about it ... > Yes, it could be compiler > optimization problems. Presumably distutils has a way to turn off > compiler optimizations. That would be the nicest solution, I'd guess. > You could compile the amos library manually > without compiler optimizations and just place it in the build/tempYYYYY/ > directory where it will be picked up by distutils. I will have to leave that for tomorrow (if I find a way how to do this - at first glance it was not obvious to me ...) 
Best, Arnd From arnd.baecker at web.de Sat Oct 29 17:41:52 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sat, 29 Oct 2005 23:41:52 +0200 (CEST) Subject: [SciPy-dev] newscipy, 64Bit, det problem Message-ID: Hi, as previously reported I get on the opteron (In [11]: scipy.__core_version__ Out[11]: '0.4.3.1385' In [12]: scipy.__scipy_version__ Out[12]: '0.4.2_1402' ): ====================================================================== FAIL: check_simple (scipy.linalg.basic.test_basic.test_det) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/abaecker/BUILDS2/Build_55/inst_scipy_newcore/lib/python2.4/site-pa ckages/scipy/linalg/tests/test_basic.py", line 273, in check_simple assert_almost_equal(a_det,-2.0) File "/home/abaecker/BUILDS2/Build_55//inst_scipy_newcore/lib/python2.4/site-p ackages/scipy/test/testing.py", line 649, in assert_almost_equal assert round(abs(desired - actual),decimal) == 0, msg AssertionError: Items are not equal: DESIRED: -2.0 ACTUAL: -0.0 The corresponding test routine is test_det def check_simple(self): a = [[1,2],[3,4]] a_det = det(a) assert_almost_equal(a_det,-2.0) Testing this line by line gives for: from scipy import * a = [[1,2],[3,4]] print type(a) linalg.det(a) the following: In [3]: a = [[1,2],[3,4]] In [4]: print type(a) In [5]: linalg.det(a) Out[5]: -0.0 Interestingly the following gives the expected answer from scipy import * a = zeros( (2,2)) a[0,0]=1 a[0,1]=2 a[1,0]=3 a[1,1]=4 print type(a) linalg.det(a) In [7]: print type(a) In [8]: linalg.det(a) Out[8]: -2.0 Travis, does this help you in some way to understand what is going on? (Seems to be some list->array conversion problem ...) I'd guess that some of the other failures are of the same reason. 
Also note that the above works with newscipy on my laptop In [2]: a = [[1,2],[3,4]] In [3]: print type(a) In [4]: linalg.det(a) Out[4]: -2.0 Best, Arnd From cookedm at physics.mcmaster.ca Sat Oct 29 18:52:53 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Sat, 29 Oct 2005 18:52:53 -0400 Subject: [SciPy-dev] Using doctest Message-ID: <6CED96CB-0887-416D-9D0C-261AB76A37B2@physics.mcmaster.ca> I've added some tests for scipy.base.polynomial and scipy.base.ufunclike, but I used doctests instead of writing out something with unittest. They're run by using a test_suite(level) function in the appropriate test module. They were much easier to write than using the unittest API: copy-and- paste from my terminal window, mostly :-) I'm not familiar with the internals of scipy.test.testing, so I don't know if there's something I'm missing when setting them up. It does require Python 2.3 or higher to make a unittest test suite from the doctest, though. On another note: the shadowing of the scipy.test package by the scipy.test function can be annoying, especially if someone wants to reuse it. All usages of scipy.test.testing in scipy right now are of the form "from scipy.test.testing import *". But because of the test () function, you can't do "import scipy.test.testing as T" for instance, which is just plain confusing, and difficult for us in the "no-import-*" camp :-). Now, you can do "from scipy.test import testing as T", but that's not obvious. Personally, I'd suggest renaming the test() function to runtests(), but I can see how this is a problem considering the documentation, etc., about test. -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From cookedm at physics.mcmaster.ca Sat Oct 29 20:13:41 2005 From: cookedm at physics.mcmaster.ca (David M. 
Cooke) Date: Sat, 29 Oct 2005 20:13:41 -0400 Subject: [SciPy-dev] Easier now for extension writers to reuse type names Message-ID: Alright, one thing that was bothering me was if you use arrayobject.h, and you're writing an extension that wraps some library, you may have a conflict with typedefs (Bool, intp, longlong, etc.). arrayobject.h is not such a good neighbour there. So, I've fixed up arrayobject.h and ufuncobject.h so that if the preprocessor symbol PY_ARRAY_TYPES_PREFIX is defined before they're imported, the typedefs in those header files define types prefixed with whatever the value of PY_ARRAY_TYPES_PREFIX is. Example: #define PY_ARRAY_TYPES_PREFIX PA_ #include "scipy/arrayobject.h" /* maybe this is in a header file for a library being wrapped */ typedef int Bool; This will work fine, as the typedef in arrayobject.h is instead typedef unsigned char PA_Bool; and all the PyArray API functions will be declared using PA_Bool instead within the module. This isn't a problem with calling the API, as the types are the same (just different names). This is analogous to PY_ARRAY_UNIQUE_SYMBOL used to store the API in a unique array. cheers! -- |>|\/|< /------------------------------------------------------------------\ |David M. 
Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From nwagner at mecha.uni-stuttgart.de Sun Oct 30 01:19:16 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sun, 30 Oct 2005 07:19:16 +0100 Subject: [SciPy-dev] NameError: global name 'nx' is not defined Message-ID: ====================================================================== ERROR: check_assoc_laguerre (scipy.special.basic.test_basic.test_assoc_laguerre) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", line 527, in check_assoc_laguerre a1 = genlaguerre(11,1) File "/usr/local/lib/python2.4/site-packages/scipy/special/orthogonal.py", line 273, in genlaguerre p = orthopoly1d(x,w,hn,kn,wfunc,(0,inf),monic) File "/usr/local/lib/python2.4/site-packages/scipy/special/orthogonal.py", line 73, in __init__ poly1d.__init__(self, roots, r=1) File "/usr/local/lib/python2.4/site-packages/scipy/base/polynomial.py", line 372, in __init__ c_or_r = poly(c_or_r) File "/usr/local/lib/python2.4/site-packages/scipy/base/polynomial.py", line 70, in poly if isinstance(a.dtype, nx.complexfloating): NameError: global name 'nx' is not defined From nwagner at mecha.uni-stuttgart.de Sun Oct 30 02:40:12 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Sun, 30 Oct 2005 08:40:12 +0100 Subject: [SciPy-dev] A = mat(asarray(A)) versus A = asarray(A) Message-ID: Hi all, I have used asarray(A) instead of mat(asarray(A)) in matfuncs.py. Now the tests concerning logm passed. For what reason ? 
# A = mat(asarray(A)) A = asarray(A) Nils From arnd.baecker at web.de Sun Oct 30 03:03:38 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sun, 30 Oct 2005 09:03:38 +0100 (CET) Subject: [SciPy-dev] NameError: global name 'nx' is not defined In-Reply-To: References: Message-ID: On Sun, 30 Oct 2005, Nils Wagner wrote: > ====================================================================== > ERROR: check_assoc_laguerre > (scipy.special.basic.test_basic.test_assoc_laguerre) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/usr/local/lib/python2.4/site-packages/scipy/special/tests/test_basic.py", > line 527, in check_assoc_laguerre > a1 = genlaguerre(11,1) > File > "/usr/local/lib/python2.4/site-packages/scipy/special/orthogonal.py", > line 273, in genlaguerre > p = orthopoly1d(x,w,hn,kn,wfunc,(0,inf),monic) > File > "/usr/local/lib/python2.4/site-packages/scipy/special/orthogonal.py", > line 73, in __init__ > poly1d.__init__(self, roots, r=1) > File > "/usr/local/lib/python2.4/site-packages/scipy/base/polynomial.py", > line 372, in __init__ > c_or_r = poly(c_or_r) > File > "/usr/local/lib/python2.4/site-packages/scipy/base/polynomial.py", > line 70, in poly > if isinstance(a.dtype, nx.complexfloating): > NameError: global name 'nx' is not defined Here the fix is easy: just replace all "nx." by "NX." Arnd From arnd.baecker at web.de Sun Oct 30 03:05:17 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Sun, 30 Oct 2005 09:05:17 +0100 (CET) Subject: [SciPy-dev] Using doctest In-Reply-To: <6CED96CB-0887-416D-9D0C-261AB76A37B2@physics.mcmaster.ca> References: <6CED96CB-0887-416D-9D0C-261AB76A37B2@physics.mcmaster.ca> Message-ID: On Sat, 29 Oct 2005, David M. Cooke wrote: > I've added some tests for scipy.base.polynomial and > scipy.base.ufunclike, but I used doctests instead of writing out > something with unittest. 
They're run by using a test_suite(level) > function in the appropriate test module. > > They were much easier to write than using the unittest API: copy-and- > paste from my terminal window, mostly :-) > > I'm not familiar with the internals of scipy.test.testing, so I don't > know if there's something I'm missing when setting them up. It does > require Python 2.3 or higher to make a unittest test suite from the > doctest, though. One problem is that the doctests screw up when ipython is used, e.g.: ipython import scipy scipy.__core_version__ scipy.__core_config__.show() scipy.test(10) Giving many failures of this type. ***************************************************************** Failure in example: p from line #5 of scipy.base.polynomial.test_polynomial Expected: poly1d([ 1., 2., 3.]) Got: ***************************************************************** Failure in example: q from line #11 of scipy.base.polynomial.test_polynomial Expected: poly1d([ 3., 2., 1.]) Got: Note that I use IPython 0.6.13 and I vaguely remember that something like this was discussed on the IPython mailing list some time ago. Another point: Running the tests of newcore from "normal" python gives: [...]
====================================================================== ERROR: doctest of scipy.base.polynomial.test_polynomial ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/lib/python2.3/unittest.py", line 423, in runTest self.__testFunc() File "/usr/lib/python2.3/doctest.py", line 1359, in runit _utest(tester, name, doc, filename, lineno) File "/usr/lib/python2.3/doctest.py", line 1309, in _utest raise DocTestTestFailure('Failed doctest test for %s\n' DocTestTestFailure: Failed doctest test for scipy.base.polynomial.test_polynomial File "/tmp/SCIPY3/INST/lib/python2.3/site-packages/scipy/base/tests/test_polynomial.py", line 1 (or above), in test_polynomial ***************************************************************** Failure in example: p / q from line #28 of scipy.base.polynomial.test_polynomial Exception raised: Traceback (most recent call last): File "/usr/lib/python2.3/doctest.py", line 442, in _run_examples_inner compileflags, 1) in globs File "", line 1, in ? File "/tmp/SCIPY3/INST//lib/python2.3/site-packages/scipy/base/polynomial.py", line 489, in __div__ return map(poly1d, polydiv(self.coeffs, other.coeffs)) File "/tmp/SCIPY3/INST//lib/python2.3/site-packages/scipy/base/polynomial.py", line 313, in polydiv q, r = deconvolve(a1, a2) File "/tmp/SCIPY3/INST//lib/python2.3/site-packages/scipy/base/polynomial.py", line 305, in deconvolve quot = scipy.signal.lfilter(num, den, input) UnboundLocalError: local variable 'scipy' referenced before assignment The reason is the use of scipy.signal in polynomial.py, def deconvolve(signal, divisor): """Deconvolves divisor out of signal. Requires scipy.signal library """ try: import scipy.signal except ImportError: print "You need scipy.signal to use this function." [...] This routine does not work if only newcore is installed. If I understood things correctly, newcore should be independent from newscipy, or? 
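The UnboundLocalError in the traceback above is easy to reproduce in isolation: an import inside a function binds a local name, and the except-and-print guard leaves that name unbound when the import fails. A minimal sketch (the module name is deliberately nonexistent, and deconvolve_sketch is an invented stand-in for the real function):

```python
def deconvolve_sketch(data):
    # mirrors the failing pattern in scipy.base.polynomial.deconvolve:
    # if the import fails, only a message is printed and the local
    # name stays unbound, so the next line blows up with
    # UnboundLocalError instead of a helpful error.
    try:
        import no_such_signal_module
    except ImportError:
        print("You need scipy.signal to use this function.")
    return no_such_signal_module.lfilter(data)
```

Raising inside the except block -- or not catching ImportError at all -- would give a clearer failure.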
Best, Arnd From oliphant at ee.byu.edu Sun Oct 30 11:14:00 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 30 Oct 2005 09:14:00 -0700 Subject: [SciPy-dev] Using doctest In-Reply-To: References: <6CED96CB-0887-416D-9D0C-261AB76A37B2@physics.mcmaster.ca> Message-ID: <4364F148.1070202@ee.byu.edu> >Another point: >Running the tests of newcore from "normal" python gives: > >[...] > >====================================================================== >ERROR: doctest of scipy.base.polynomial.test_polynomial >---------------------------------------------------------------------- >Traceback (most recent call last): > File "/usr/lib/python2.3/unittest.py", line 423, in runTest > self.__testFunc() > File "/usr/lib/python2.3/doctest.py", line 1359, in runit > _utest(tester, name, doc, filename, lineno) > File "/usr/lib/python2.3/doctest.py", line 1309, in _utest > raise DocTestTestFailure('Failed doctest test for %s\n' >DocTestTestFailure: Failed doctest test for >scipy.base.polynomial.test_polynomial > File >"/tmp/SCIPY3/INST/lib/python2.3/site-packages/scipy/base/tests/test_polynomial.py", >line 1 (or above), in test_polynomial > >***************************************************************** >Failure in example: p / q >from line #28 of scipy.base.polynomial.test_polynomial >Exception raised: >Traceback (most recent call last): > File "/usr/lib/python2.3/doctest.py", line 442, in _run_examples_inner > compileflags, 1) in globs > File "", line 1, in ? 
> File >"/tmp/SCIPY3/INST//lib/python2.3/site-packages/scipy/base/polynomial.py", >line 489, in __div__ > return map(poly1d, polydiv(self.coeffs, other.coeffs)) > File >"/tmp/SCIPY3/INST//lib/python2.3/site-packages/scipy/base/polynomial.py", >line 313, in polydiv > q, r = deconvolve(a1, a2) > File >"/tmp/SCIPY3/INST//lib/python2.3/site-packages/scipy/base/polynomial.py", >line 305, in deconvolve > quot = scipy.signal.lfilter(num, den, input) >UnboundLocalError: local variable 'scipy' referenced before assignment > > >The reason is the use of scipy.signal in polynomial.py, > >def deconvolve(signal, divisor): > """Deconvolves divisor out of signal. Requires scipy.signal library > """ > try: > import scipy.signal > except ImportError: > print "You need scipy.signal to use this function." > [...] > >This routine does not work if only newcore is installed. >If I understood things correctly, newcore should >be independent from newscipy, or? > > newcore should be installable, but there can be a few functions that require other scipy packages. This is the appropriate way to handle it, I think. The test for deconvolve should be moved over to scipy.signal or else an appropriate try block used. -Travis From stephen.walton at csun.edu Sun Oct 30 12:42:24 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Sun, 30 Oct 2005 09:42:24 -0800 Subject: [SciPy-dev] newcore atlas info on Gentoo In-Reply-To: <20051027154353.63983.qmail@web35602.mail.mud.yahoo.com> References: <20051027154353.63983.qmail@web35602.mail.mud.yahoo.com> Message-ID: <43650600.3020108@csun.edu> Roman Stanchak wrote: >The solution to this issue was hinted at, but never >explicitly stated. 
Just for completeness, on my >Gentoo system, to detect ATLAS with newcore I placed >the following in newcore/scipy/distutils/site.cfg > >[atlas] >library_dirs = >/usr/lib/blas/atlas:/usr/lib/lapack/atlas >atlas_libs = lapack, blas, cblas, atlas > > Again, and for completeness, Ubuntu and now Fedora Core 4 have used a similar setup in their distributed ATLAS packages, where architecture-specific versions of atlas go in /usr/lib/${ARCH}, /usr/lib/atlas, and/or /usr/lib/atlas/${ARCH}, where ARCH can be sse, sse2 or 3dnow. Thus I think the following site.cfg works for both Ubuntu and FC4 (tested for Ubuntu 5.10, being tested now for FC4): [atlas] library_dirs=/usr/lib/sse:/usr/lib/atlas/sse:/usr/lib/atlas:/usr/lib atlas_libs=lapack,blas,cblas,atlas (change sse to sse2 or 3dnow as appropriate for your system) From pearu at scipy.org Sun Oct 30 16:04:34 2005 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 30 Oct 2005 15:04:34 -0600 (CST) Subject: [SciPy-dev] PyArray_CanCastSafely(exact,inexact) on 64-bit Message-ID: Hi Travis, A number of scipy.test failures occurring on 64-bit systems are due to the fact that PyArray_CanCastSafely(PyArray_LONG,PyArray_DOUBLE) returns true. Though sizeof(long)==sizeof(double) holds on 64-bit machines, shouldn't PyArray_CanCastSafely return false on (exact,inexact) arguments and vice versa? By the definition of can_cast, no bitwise information is lost, but it is not meaningful to pass (double*)() to a numeric function, for example. Or maybe I should use some different function than PyArray_CanCastSafely in this situation? Pearu From cookedm at physics.mcmaster.ca Sun Oct 30 17:37:59 2005 From: cookedm at physics.mcmaster.ca (David M.
Cooke) Date: Sun, 30 Oct 2005 17:37:59 -0500 Subject: [SciPy-dev] Using doctest In-Reply-To: <4364F148.1070202@ee.byu.edu> References: <6CED96CB-0887-416D-9D0C-261AB76A37B2@physics.mcmaster.ca> <4364F148.1070202@ee.byu.edu> Message-ID: <9BFB276D-2370-4255-91CC-5940EB10AFAA@physics.mcmaster.ca> On Oct 30, 2005, at 11:14, Travis Oliphant wrote: >> Another point: >> Running the tests of newcore from "normal" python gives: >> >> [...] >> >> ===================================================================== >> = >> ERROR: doctest of scipy.base.polynomial.test_polynomial >> --------------------------------------------------------------------- >> - >> Traceback (most recent call last): >> File "/usr/lib/python2.3/unittest.py", line 423, in runTest >> self.__testFunc() >> File "/usr/lib/python2.3/doctest.py", line 1359, in runit >> _utest(tester, name, doc, filename, lineno) >> File "/usr/lib/python2.3/doctest.py", line 1309, in _utest >> raise DocTestTestFailure('Failed doctest test for %s\n' >> DocTestTestFailure: Failed doctest test for >> scipy.base.polynomial.test_polynomial >> File >> "/tmp/SCIPY3/INST/lib/python2.3/site-packages/scipy/base/tests/ >> test_polynomial.py", >> line 1 (or above), in test_polynomial >> >> ***************************************************************** >> Failure in example: p / q >> from line #28 of scipy.base.polynomial.test_polynomial >> Exception raised: >> Traceback (most recent call last): >> File "/usr/lib/python2.3/doctest.py", line 442, in >> _run_examples_inner >> compileflags, 1) in globs >> File "", line 1, in ? 
>> File >> "/tmp/SCIPY3/INST//lib/python2.3/site-packages/scipy/base/ >> polynomial.py", >> line 489, in __div__ >> return map(poly1d, polydiv(self.coeffs, other.coeffs)) >> File >> "/tmp/SCIPY3/INST//lib/python2.3/site-packages/scipy/base/ >> polynomial.py", >> line 313, in polydiv >> q, r = deconvolve(a1, a2) >> File >> "/tmp/SCIPY3/INST//lib/python2.3/site-packages/scipy/base/ >> polynomial.py", >> line 305, in deconvolve >> quot = scipy.signal.lfilter(num, den, input) >> UnboundLocalError: local variable 'scipy' referenced before >> assignment >> >> >> The reason is the use of scipy.signal in polynomial.py, >> >> def deconvolve(signal, divisor): >> """Deconvolves divisor out of signal. Requires scipy.signal >> library >> """ >> try: >> import scipy.signal >> except ImportError: >> print "You need scipy.signal to use this function." >> [...] >> >> This routine does not work if only newcore is installed. >> If I understood things correctly, newcore should >> be independent from newscipy, or? >> >> >> > > newcore should be installable, but there can be a few functions that > require other scipy packages. This is the appropriate way to > handle it, > I think. > > The test for deconvolve should be moved over to scipy.signal or > else an > appropriate try block used. It's not the test for deconvolve (didn't add one). It's the test for polynomial division. I'll have a look at not using deconvolve for division. I've changed deconvolve so it just does import scipy.signal, and doesn't try to catch the ImportError. This is less confusing in terms of error messages, at least. -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From cookedm at physics.mcmaster.ca Sun Oct 30 18:02:41 2005 From: cookedm at physics.mcmaster.ca (David M. 
Cooke) Date: Sun, 30 Oct 2005 18:02:41 -0500 Subject: [SciPy-dev] Using doctest In-Reply-To: <9BFB276D-2370-4255-91CC-5940EB10AFAA@physics.mcmaster.ca> References: <6CED96CB-0887-416D-9D0C-261AB76A37B2@physics.mcmaster.ca> <4364F148.1070202@ee.byu.edu> <9BFB276D-2370-4255-91CC-5940EB10AFAA@physics.mcmaster.ca> Message-ID: <3BD9B7E3-BE0A-4B37-A186-8746BC4F73E9@physics.mcmaster.ca> On Oct 30, 2005, at 17:37, David M. Cooke wrote: > On Oct 30, 2005, at 11:14, Travis Oliphant wrote: >> >> newcore should be installable, but there can be a few functions that >> require other scipy packages. This is the appropriate way to >> handle it, >> I think. >> >> The test for deconvolve should be moved over to scipy.signal or >> else an >> appropriate try block used. >> > > It's not the test for deconvolve (didn't add one). It's the test for > polynomial division. I'll have a look at not using deconvolve for > division. And done. It uses synthetic division now. It could be more efficient: the quotient is built up as a list, for instance, but it passes the test. -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From cookedm at physics.mcmaster.ca Sun Oct 30 18:13:59 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Sun, 30 Oct 2005 18:13:59 -0500 Subject: [SciPy-dev] Using doctest In-Reply-To: References: <6CED96CB-0887-416D-9D0C-261AB76A37B2@physics.mcmaster.ca> Message-ID: On Oct 30, 2005, at 03:05, Arnd Baecker wrote: > On Sat, 29 Oct 2005, David M. Cooke wrote: >> I've added some tests for scipy.base.polynomial and >> scipy.base.ufunclike, but I used doctests instead of writing out >> something with unittest. They're run by using a test_suite(level) >> function in the appropriate test module. 
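The synthetic division David mentions can be sketched in a few lines of plain Python. This is a hypothetical reconstruction, not the committed code; it builds the quotient up as a list, exactly the inefficiency he points out:

```python
def polydiv(u, v):
    """Synthetic (long) division of polynomials given as coefficient
    lists, highest degree first.  Returns (quotient, remainder)."""
    u = [float(c) for c in u]
    v = [float(c) for c in v]
    n = len(v) - 1          # degree of the divisor
    quot = []               # quotient built up as a list
    rem = u[:]
    while len(rem) - 1 >= n:
        scale = rem[0] / v[0]
        quot.append(scale)
        # subtract scale*v aligned at the leading term; the leading
        # coefficient cancels exactly, so drop it
        rem = [r - scale * c for r, c in zip(rem[1:], v[1:])] + rem[len(v):]
    return quot, rem

# (x**2 + 2x + 3) / (x + 1)  ->  quotient x + 1, remainder 2
assert polydiv([1, 2, 3], [1, 1]) == ([1.0, 1.0], [2.0])
```

An empty quotient list here means the divisor's degree exceeds the dividend's, i.e. a zero quotient.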
>> >> They were much easier to write than using the unittest API: copy-and- >> paste from my terminal window, mostly :-) >> >> I'm not familiar with the internals of scipy.test.testing, so I don't >> know if there's something I'm missing when setting them up. It does >> require Python 2.3 or higher to make a unittest test suite from the >> doctest, though. >> > > One problem is that the doctests screw up when > ipython is used, e.g.: > > ipython > import scipy > scipy.__core_version__ > scipy.__core_config__.show() > scipy.test(10) > > Giving many of this type. > > ***************************************************************** > Failure in example: p > from line #5 of scipy.base.polynomial.test_polynomial > Expected: poly1d([ 1., 2., 3.]) > Got: > ***************************************************************** > Failure in example: q > from line #11 of scipy.base.polynomial.test_polynomial > Expected: poly1d([ 3., 2., 1.]) > Got: > > > Note that I use IPython 0.6.13 and I vaguely > remember that something like this > was discussed on the IPython mailing list some time ago. Fixed. There's some interaction with IPython's sys.displayhook that doctest doesn't like. Now, scipy.test will switch to the builtin displayhook (stored at sys.__displayhook__) before running tests. -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From oliphant at ee.byu.edu Sun Oct 30 20:40:15 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 30 Oct 2005 18:40:15 -0700 Subject: [SciPy-dev] PyArray_CanCastSafely(exact,inexact) on 64-bit In-Reply-To: References: Message-ID: <436575FF.8080201@ee.byu.edu> Pearu Peterson wrote: >Hi Travis, > >Number of scipy.test failures occuring on 64-bit systems are due to the >fact that PyArray_CanCastSafely(PyArray_LONG,PyArray_DOUBLE) returns true. 
> > > We had this discussion about a week ago, so you should definitely look at other comments from that discussion. What exactly are you trying to do that is causing the errors? >Though sizeof(long)==sizeof(double) holds on 64-bit machines, shouldn't >PyArray_CanCastSafely return false on (exact,inexact) arguments and vice >versa? > > It's never done that in the past, so this would represent quite a switch. It has always been possible to convert from integers to floats "safely." On old Numeric you could cast from int8 and int16 to float32 "safely" and from int32 to float64 "safely". The problem is with 64-bit integers. They can't be represented by 64-bit floats, and the original code only allowed them to be converted safely to long double floats. But, because on 64-bit platforms, the long is often 64-bits and that's the default Python integer, it would mean always doing long double calculations. PyArray_CanCastSafely is used largely for determining the "coercion" rules for ufuncs. The current function causes sqrt(2) to take place with double precision. >By the definition of can_cast, no bitwise information is lost, but >it is not meaningful to pass (double*)() to a >numeric function, for example. > > But, that's never been what CanCastSafely has been used to represent. Perhaps we need a different function that distinguishes between different kinds. Again, what are you trying to do, specifically? >Or may be I should use some different function than PyArray_CanCastSafely >in this situation? > > I suspect so... 
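The representability problem behind all this is easy to demonstrate with plain Python floats; the following illustrates the principle only and is not the scipy coercion code:

```python
# A C double has a 53-bit significand, so not every 64-bit integer is
# representable: casting int64 -> float64 can silently lose low bits,
# which is why calling it a "safe" cast is questionable -- and why the
# original rules only allowed 64-bit ints to go to long double.
assert float(2**53) == 2**53             # integers up to 2**53 are exact
assert float(2**53 + 1) == float(2**53)  # 2**53 + 1 rounds back down
assert int(float(2**53 + 1)) != 2**53 + 1
```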
-Travis From pearu at scipy.org Sun Oct 30 22:57:01 2005 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 30 Oct 2005 21:57:01 -0600 (CST) Subject: [SciPy-dev] PyArray_CanCastSafely(exact,inexact) on 64-bit In-Reply-To: <436575FF.8080201@ee.byu.edu> References: <436575FF.8080201@ee.byu.edu> Message-ID: On Sun, 30 Oct 2005, Travis Oliphant wrote: > Pearu Peterson wrote: > >> By the definition of can_cast, no bitwise information is lost, but >> it is not meaningful to pass (double*)() to a >> numeric function, for example. >> >> > But, that's never been what CanCastSafely has been used to represent. > Perhaps we need a different function that distinguishes between > different kinds. Again, what are you trying to do, specifically? > >> Or may be I should use some different function than PyArray_CanCastSafely >> in this situation? >> > I suspect so... PyArray_CanCastSafely has been used in array_from_pyobj function (see fortranobject.c) to decide whether the data pointer of an input array can be directly passed on to the Fortran or C function: if ((! (intent & F2PY_INTENT_COPY)) && PyArray_ITEMSIZE(arr)==descr->elsize && PyArray_CanCastSafely(arr->descr->type_num,type_num) ) { if ((intent&F2PY_INTENT_C)?PyArray_ISCARRAY(arr):PyArray_ISFARRAY(arr)) { if ((intent & F2PY_INTENT_OUT)) { Py_INCREF(arr); } /* Returning input array */ return arr; } } /* else apply arr.astype() */ But I now realize that this is a incorrect usage of PyArray_CanCastSafely. The above codelet should probably then read as if ((! (intent & F2PY_INTENT_COPY)) && PyArray_ITEMSIZE(arr)==descr->elsize && ( (PyArray_ISINTEGER(arr) && PyTypeNum_ISINTEGER(type_num)) || (PyArray_ISFLOAT(arr) && PyTypeNum_ISFLOAT(type_num)) || (PyArray_ISCOMPLEX(arr) && PyTypeNum_ISCOMPLEX(type_num)) || (PyArray_ISSTRING(arr) && PyTypeNum_ISSTRING(type_num)) ) ) { ... though I am not sure about the STRING case, e.g. what happens when passing unicode arr->data to fortran character*(*) argument, for instance. 
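The revised kind-based test can be mirrored at the Python level; this is only an illustrative sketch using modern NumPy dtype attributes (the 2005 C API spells the checks differently):

```python
import numpy as np

def passes_unchanged(arr_dtype, wanted_dtype):
    """Rough Python analogue of the revised fortranobject.c test: hand
    the array's data pointer straight to Fortran/C only when the
    itemsizes match AND both types are of the same kind (integer,
    float, complex, string); otherwise fall back to astype()."""
    a, w = np.dtype(arr_dtype), np.dtype(wanted_dtype)
    return a.itemsize == w.itemsize and a.kind == w.kind

assert passes_unchanged(np.int64, np.int64)
assert not passes_unchanged(np.int64, np.float64)  # same size, wrong kind
assert not passes_unchanged(np.int32, np.int64)    # same kind, wrong size
```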
Thanks, Pearu From pearu at scipy.org Sun Oct 30 23:40:35 2005 From: pearu at scipy.org (Pearu Peterson) Date: Sun, 30 Oct 2005 22:40:35 -0600 (CST) Subject: [SciPy-dev] PyArray_CanCastSafely(exact,inexact) on 64-bit In-Reply-To: References: <436575FF.8080201@ee.byu.edu> Message-ID: After fixing the usage of PyArray_CanCastSafely, now all scipy tests, except one, are succesfull on a 64-bit machine: ====================================================================== ERROR: check_integer (scipy.io.array_import.test_array_import.test_read_array) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.3/site-packages/scipy/io/tests/test_array_import.py", line 62, in check_integer b = io.read_array(fname,atype=N.Int) File "/usr/local/lib/python2.3/site-packages/scipy/io/array_import.py", line 359, in read_array raise ValueError, "One of the array types is invalid, k=%d" % k ValueError: One of the array types is invalid, k=0 ---------------------------------------------------------------------- Ran 1334 tests in 57.772s FAILED (errors=1) Pearu From oliphant at ee.byu.edu Mon Oct 31 01:40:10 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 30 Oct 2005 23:40:10 -0700 Subject: [SciPy-dev] PyArray_CanCastSafely(exact,inexact) on 64-bit In-Reply-To: References: <436575FF.8080201@ee.byu.edu> Message-ID: <4365BC4A.1070704@ee.byu.edu> >PyArray_CanCastSafely has been used in array_from_pyobj function (see >fortranobject.c) to decide whether the data pointer of an input array can >be directly passed on to the Fortran or C function: > > if ((! 
(intent & F2PY_INTENT_COPY)) > && PyArray_ITEMSIZE(arr)==descr->elsize > && PyArray_CanCastSafely(arr->descr->type_num,type_num) > ) { > if ((intent&F2PY_INTENT_C)?PyArray_ISCARRAY(arr):PyArray_ISFARRAY(arr)) { > if ((intent & F2PY_INTENT_OUT)) { > Py_INCREF(arr); > } > /* Returning input array */ > return arr; > } > } > /* else apply arr.astype() */ > > > > There are new functions in the C-API called PyArray_EquivalentTypes and PyArray_EquivArrType that can be used to check what you want to check. These functions are used internally to decide whether or not two types are compatible and can be substituted one for another. -Travis From oliphant at ee.byu.edu Mon Oct 31 01:41:16 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 30 Oct 2005 23:41:16 -0700 Subject: [SciPy-dev] PyArray_CanCastSafely(exact,inexact) on 64-bit In-Reply-To: References: <436575FF.8080201@ee.byu.edu> Message-ID: <4365BC8C.4000502@ee.byu.edu> Pearu Peterson wrote: >After fixing the usage of PyArray_CanCastSafely, now all scipy tests, >except one, are succesfull on a 64-bit machine: > >====================================================================== >ERROR: check_integer >(scipy.io.array_import.test_array_import.test_read_array) >---------------------------------------------------------------------- >Traceback (most recent call last): > File >"/usr/local/lib/python2.3/site-packages/scipy/io/tests/test_array_import.py", >line 62, in check_integer > b = io.read_array(fname,atype=N.Int) > File "/usr/local/lib/python2.3/site-packages/scipy/io/array_import.py", >line 359, in read_array > raise ValueError, "One of the array types is invalid, k=%d" % k >ValueError: One of the array types is invalid, k=0 > >---------------------------------------------------------------------- >Ran 1334 tests in 57.772s > > Fantastic. I'm sure we can track down this problem. 
-Travis From arnd.baecker at web.de Mon Oct 31 04:00:32 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Mon, 31 Oct 2005 10:00:32 +0100 (CET) Subject: [SciPy-dev] Using doctest In-Reply-To: References: <6CED96CB-0887-416D-9D0C-261AB76A37B2@physics.mcmaster.ca> Message-ID: On Sun, 30 Oct 2005, David M. Cooke wrote: > Fixed. There's some interaction with IPython's sys.displayhook that > doctest doesn't like. Now, scipy.test will switch to the builtin > displayhook (stored at sys.__displayhook__) before running tests. Excellent - many thanx!! Arnd From arnd.baecker at web.de Mon Oct 31 04:01:22 2005 From: arnd.baecker at web.de (Arnd Baecker) Date: Mon, 31 Oct 2005 10:01:22 +0100 (CET) Subject: [SciPy-dev] PyArray_CanCastSafely(exact,inexact) on 64-bit In-Reply-To: References: <436575FF.8080201@ee.byu.edu> Message-ID: On Sun, 30 Oct 2005, Pearu Peterson wrote: > After fixing the usage of PyArray_CanCastSafely, now all scipy tests, > except one, are succesfull on a 64-bit machine: > > ====================================================================== > ERROR: check_integer > (scipy.io.array_import.test_array_import.test_read_array) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/usr/local/lib/python2.3/site-packages/scipy/io/tests/test_array_import.py", > line 62, in check_integer > b = io.read_array(fname,atype=N.Int) > File "/usr/local/lib/python2.3/site-packages/scipy/io/array_import.py", > line 359, in read_array > raise ValueError, "One of the array types is invalid, k=%d" % k > ValueError: One of the array types is invalid, k=0 > > ---------------------------------------------------------------------- > Ran 1334 tests in 57.772s > > FAILED (errors=1) That is great news! I also get this on my 64 Bit machine! 
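The remaining failure fits a common 64-bit pattern: the platform's default integer is 8 bytes there, so a test keyed to a generic Int type reads back a different width than it wrote. A hedged illustration with modern NumPy names (`np.intp` is the pointer-sized integer; this is the general principle, not the read_array internals):

```python
import numpy as np

# np.intp is pointer-sized: 8 bytes on 64-bit platforms, 4 on 32-bit,
# so code keyed to the "default" integer behaves differently per arch.
default_width = np.dtype(np.intp).itemsize

# A portable unit test pins the width explicitly instead:
a = np.arange(12, dtype=np.int32).reshape(3, 4)
assert a.dtype.itemsize == 4                 # identical on every platform
assert np.dtype(np.int64).itemsize == 8
```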
Just in case it has fallen through the cracks: Concerning the check_integer problem: def check_integer(self): from scipy import stats a = stats.randint.rvs(1,20,size=(3,4)) fname = tempfile.mktemp('.dat') io.write_array(fname,a) b = io.read_array(fname,atype=N.Int) assert_array_equal(a,b) os.remove(fname) Executing this line by line shows the error for b = io.read_array(fname,atype=N.Int) Doing b = io.read_array(fname) reads in the array, but it gives floats. However, b = io.read_array(fname,atype=N.Int32) works. If this is the intended behaviour (also on 32Bit), the unit test should be changed accordingly... BTW: wouldn't io be much better in newcore instead of newscipy? It seems like something of very common use. Best, Arnd From nwagner at mecha.uni-stuttgart.de Mon Oct 31 05:07:02 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 31 Oct 2005 11:07:02 +0100 Subject: [SciPy-dev] signm Message-ID: <4365ECC6.3030602@mecha.uni-stuttgart.de> Hi all, Can someone reproduce the following (wrong) result using newscipy w i t h o u t ATLAS [[ 4.37893189 +1.16931579j -10.35820234 -0.85562669j 22.88516273 +5.26136512j 10.29316548 +1.10228001j 8.41737971 +2.08144719j] [ -0.92449912 -0.38734485j 2.55899387 +0.31688062j -5.74188412 -1.87245744j -2.95587871 -0.35141682j -3.00842621 -0.7724415j ] [ -1.04771148 -0.4097215j 2.82256666 +0.33074828j -6.31761424 -1.96343222j -3.18685611 -0.37353876j -3.15301437 -0.80605845j] [ -2.09461007 -0.38040313j 4.48073743 +0.23575022j -9.78741206 -1.5465713j -3.92661581 -0.37607293j -2.4579986 -0.5714881j ] [ 0.91756308 +0.42099707j -2.63664054 -0.34995887j 5.93570303 +2.05662992j 3.13741039 +0.37967162j 3.30630428 +0.85330874j]] 0.4.3.1401 0.4.2_1407 It should be [[ 11.94933333 -2.24533333 15.31733333 21.65333333 -2.24533333] [ -3.84266667 0.49866667 -4.59066667 -7.18666667 0.49866667] [ -4.08 0.56 -4.92 -7.6 0.56 ] [ -4.03466667 1.04266667 -5.59866667 -7.02666667 1.04266667] [ 4.15733333 -0.50133333 4.90933333 7.81333333 
-0.50133333]] Nils -------------- next part -------------- A non-text attachment was scrubbed... Name: test_signm.py Type: text/x-python Size: 365 bytes Desc: not available URL: From cimrman3 at ntc.zcu.cz Mon Oct 31 05:39:19 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 31 Oct 2005 11:39:19 +0100 Subject: [SciPy-dev] array logical ops error? Message-ID: <4365F457.4060202@ntc.zcu.cz> Hi all, is this the expected behaviour? IMHO (b * c) == (b and c), (b + c) == (b or c) should hold... In [1]:import scipy In [2]:print scipy.__scipy_version__ 0.4.2_1407 In [3]:print scipy.__core_version__ 0.4.3.1401 In [4]:a = scipy.array( [1,2,3,4] ) In [5]:a Out[5]:array([1, 2, 3, 4]) In [6]:b = a == 3 In [7]:b Out[7]:array([False, False, True, False], dtype=bool) In [8]:c = a > 3 In [9]:c Out[9]:array([False, False, False, True], dtype=bool) In [10]:b and c Out[10]:array([False, False, False, True], dtype=bool) In [11]:b * c Out[11]:array([False, False, False, False], dtype=bool) In [12]:b or c Out[12]:array([False, False, True, False], dtype=bool) In [13]:b + c Out[13]:array([False, False, True, True], dtype=bool) r. From aisaac at american.edu Mon Oct 31 07:37:31 2005 From: aisaac at american.edu (Alan G Isaac) Date: Mon, 31 Oct 2005 07:37:31 -0500 Subject: [SciPy-dev] array logical ops error? In-Reply-To: <4365F457.4060202@ntc.zcu.cz> References: <4365F457.4060202@ntc.zcu.cz> Message-ID: On Mon, 31 Oct 2005, Robert Cimrman apparently wrote: > is this the expected behaviour? > IMHO (b * c) == (b and c), (b + c) == (b or c) should hold... I expected the Boolean operations to yield element-by-element comparisons. What are they?? In contrast, the + and * operators give the expected results. 
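The short answer to the question above: Python's `and`/`or` cannot be overloaded, so they short-circuit on the truth value of a whole operand and return one operand unchanged; they are never elementwise. The elementwise forms are the logical ufuncs (equivalently `&`/`|` on boolean arrays), which do satisfy the identities Robert expects. A sketch using modern NumPy names, which may differ slightly from newcore's:

```python
import numpy as np

a = np.array([1, 2, 3, 4])
b = a == 3          # [F, F, T, F]
c = a > 3           # [F, F, F, T]

# Elementwise logic is spelled with ufuncs / bitwise operators,
# never with the `and`/`or` keywords:
assert list(np.logical_and(b, c)) == [False, False, False, False]
assert list(np.logical_or(b, c)) == [False, False, True, True]

# On boolean arrays, * and + agree with them, as expected:
assert list(b * c) == list(b & c)
assert list(b + c) == list(b | c)
```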
>>> a=scipy.array([1,2,3,4]) >>> b= a==3 >>> c= a>3 >>> b array([False, False, True, False], dtype=bool) >>> c array([False, False, False, True], dtype=bool) >>> b and c array([False, False, False, True], dtype=bool) >>> b or c array([False, False, True, False], dtype=bool) Cheers, Alan Isaac From schofield at ftw.at Mon Oct 31 07:52:44 2005 From: schofield at ftw.at (Ed Schofield) Date: Mon, 31 Oct 2005 13:52:44 +0100 Subject: [SciPy-dev] Adding to an array with __radd__() Message-ID: <4366139C.2060007@ftw.at> Sparse matrix objects now support adding a sparse matrix to a dense matrix of the same dimensions with the syntax: newdense = sparse + dense I'd like to add support for the opposite order too: newdense = dense + sparse but this doesn't seem possible currently. This operation calls the __radd__() method of the sparse matrix object multiple times, each time passing a single element from the dense matrix. The problem is that the sparse matrix's __radd__ function also needs to know the shape of the matrix and the position of the element it's receiving. Could we change the default behaviour of ndarray objects to invoke the right-hand object's __radd__ function just once, passing the dense array? I think this is the default behaviour for Python objects. What code would break if we made this change? -- Ed From nwagner at mecha.uni-stuttgart.de Mon Oct 31 08:49:56 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Mon, 31 Oct 2005 14:49:56 +0100 Subject: [SciPy-dev] Reliability of results Message-ID: <43662104.7060409@mecha.uni-stuttgart.de> Hi all, We should take care of using linalg.inv blindly (test_inv.py). So how can we improve the behaviour of linalg.solve, linalg.lu, linalg.inv etc. with respect to "nearly" singular matrices (floating point arithmetic) ? Any pointer would be appreciated. Is it possibe to add the condition number to the output of linalg.solve, linalg.inv, ... ? For example linalg.lstsq returns the singular values. 
lstsq(a, b, cond=None, overwrite_a=0, overwrite_b=0) lstsq(a, b, cond=None, overwrite_a=0, overwrite_b=0) -> x,resids,rank,s Return least-squares solution of a * x = b. Inputs: a -- An M x N matrix. b -- An M x nrhs matrix or M vector. cond -- Used to determine effective rank of a. Outputs: x -- The solution (N x nrhs matrix) to the minimization problem: 2-norm(| b - a * x |) -> min resids -- The residual sum-of-squares for the solution matrix x (only if M>N and rank==N). rank -- The effective rank of a. s -- Singular values of a in decreasing order. The condition number of a is abs(s[0]/s[-1]). Nils -------------- next part -------------- A non-text attachment was scrubbed... Name: test_inv.py Type: text/x-python Size: 475 bytes Desc: not available URL: From perry at stsci.edu Mon Oct 31 09:22:04 2005 From: perry at stsci.edu (Perry Greenfield) Date: Mon, 31 Oct 2005 09:22:04 -0500 Subject: [SciPy-dev] alternate .flags interface In-Reply-To: <4366139C.2060007@ftw.at> References: <4366139C.2060007@ftw.at> Message-ID: I'm wondering if the current flags interface couldn't be made a bit easier to use to allow mapping of dictionary keys in to attributes. For example: instead of arr.flags["CONTIGUOUS"] arr.flags.contiguous If using a dictionary is considered useful (e.g., for getting the whole state, it is still possible to allow arr.flags to return a dictionary derived object that can be used wherever dictionaries are accepted, and likewise, assigning to arr.flags should be able to take a dictionary of flags. Any reason we can't support this interface? 
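The `asdict()`/`fromdict()` suggestion sidesteps the name-clash problem entirely: the flags object exposes attributes, and conversion to or from a real dict is explicit, so flag names can never shadow dict methods like `update()`. A hypothetical pure-Python sketch (the flag names are invented for illustration, not newcore's actual set):

```python
class Flags(object):
    """Attribute access for individual flags plus explicit dict
    conversion -- the object never has to *be* a dict."""
    _names = ('contiguous', 'fortran', 'writeable', 'aligned',
              'updateifcopy')

    def __init__(self, **kw):
        for name in self._names:
            setattr(self, name, bool(kw.get(name, False)))

    def asdict(self):
        return dict((n.upper(), getattr(self, n)) for n in self._names)

    @classmethod
    def fromdict(cls, d):
        return cls(**dict((k.lower(), v) for k, v in d.items()))

f = Flags(contiguous=True, aligned=True)
assert f.contiguous and not f.fortran
assert Flags.fromdict(f.asdict()).aligned
```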
Perry From Fernando.Perez at colorado.edu Mon Oct 31 09:34:23 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 31 Oct 2005 07:34:23 -0700 Subject: [SciPy-dev] alternate .flags interface In-Reply-To: References: <4366139C.2060007@ftw.at> Message-ID: <43662B6F.8030000@colorado.edu> Perry Greenfield wrote: > I'm wondering if the current flags interface couldn't be made a bit > easier to use to allow mapping of dictionary keys in to attributes. For > example: > > instead of arr.flags["CONTIGUOUS"] > > arr.flags.contiguous > > If using a dictionary is considered useful (e.g., for getting the whole > state, it is still possible to allow arr.flags to return a dictionary > derived object that can be used wherever dictionaries are accepted, and > likewise, assigning to arr.flags should be able to take a dictionary of > flags. Any reason we can't support this interface? Mmh. What happens when you say arr.flags.update = 1 and then pass arr.flags to a dict-expecting method which calls arr.flags.update(otherdict) ? I do like the idea of named attribute access, but the issue of name conflicts with the existing dict API needs to be dealt with first. Either certain attribute names are disallowed (case in which you can just steal IPython.Struct for the implementation, which already does all of this), or some other policy must be devised. Regards, f From perry at stsci.edu Mon Oct 31 09:42:07 2005 From: perry at stsci.edu (Perry Greenfield) Date: Mon, 31 Oct 2005 09:42:07 -0500 Subject: [SciPy-dev] alternate .flags interface In-Reply-To: <43662B6F.8030000@colorado.edu> References: <4366139C.2060007@ftw.at> <43662B6F.8030000@colorado.edu> Message-ID: <8b0cd7c40082728c68cea914676fe362@stsci.edu> On Oct 31, 2005, at 9:34 AM, Fernando Perez wrote: > Mmh. What happens when you say > > arr.flags.update = 1 > > and then pass arr.flags to a dict-expecting method which calls > > arr.flags.update(otherdict) > > ? 
> > I do like the idea of named attribute access, but the issue of name > conflicts > with the existing dict API needs to be dealt with first. Either > certain > attribute names are disallowed (case in which you can just steal > IPython.Struct for the implementation, which already does all of > this), or > some other policy must be devised. > > Regards, > > f Isn't the flag attribute UPDATEIFCOPY? (admittedly, I'm going by a potentially dated version of the "Guide to SciPy") Perry From chanley at stsci.edu Mon Oct 31 09:43:19 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Mon, 31 Oct 2005 09:43:19 -0500 Subject: [SciPy-dev] Broken Python Type Alias Message-ID: <43662D87.5080800@stsci.edu> Greetings, The alias to the python type "sb.float" that previously was working in my regression testing system now raises an attribute error. However, "sb.float_" still functions correctly. Example: In [16]: x = sb.ones((10,),dtype=sb.float) --------------------------------------------------------------------------- exceptions.AttributeError Traceback (most recent call last) /data/sparty1/dev/devCode/ AttributeError: 'module' object has no attribute 'float' My regression tests first showed this behavior yesterday morning with the checkout of newcore revision 1396. Chris -- Christopher Hanley Systems Software Engineer Space Telescope Science Institute 3700 San Martin Drive Baltimore MD, 21218 (410) 338-4338 From Fernando.Perez at colorado.edu Mon Oct 31 09:45:40 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 31 Oct 2005 07:45:40 -0700 Subject: [SciPy-dev] alternate .flags interface In-Reply-To: <8b0cd7c40082728c68cea914676fe362@stsci.edu> References: <4366139C.2060007@ftw.at> <43662B6F.8030000@colorado.edu> <8b0cd7c40082728c68cea914676fe362@stsci.edu> Message-ID: <43662E14.7050107@colorado.edu> Perry Greenfield wrote: > On Oct 31, 2005, at 9:34 AM, Fernando Perez wrote: > >>Mmh. 
What happens when you say >> >>arr.flags.update = 1 > Isn't the flag attribute UPDATEIFCOPY? (admittedly, I'm going by a > potentially dated version of the "Guide to SciPy") I used 'update' simply as an example to illustrate the issue of potential name clashes. I didn't actually know what the flag names were, what I worry about is that now or in the future either the flag names or the dict method list can grow in a direction that causes a clash. Sorry if a poor choice of example caused confusion, I hope the issue I'm trying to point out is clear now. Regards, f From perry at stsci.edu Mon Oct 31 09:54:40 2005 From: perry at stsci.edu (Perry Greenfield) Date: Mon, 31 Oct 2005 09:54:40 -0500 Subject: [SciPy-dev] alternate .flags interface In-Reply-To: <43662E14.7050107@colorado.edu> References: <4366139C.2060007@ftw.at> <43662B6F.8030000@colorado.edu> <8b0cd7c40082728c68cea914676fe362@stsci.edu> <43662E14.7050107@colorado.edu> Message-ID: <2d2ece8de6de039916e478662dd5d5a4@stsci.edu> On Oct 31, 2005, at 9:45 AM, Fernando Perez wrote: > I used 'update' simply as an example to illustrate the issue of > potential name > clashes. I didn't actually know what the flag names were, what I > worry about > is that now or in the future either the flag names or the dict method > list can > grow in a direction that causes a clash. > > Sorry if a poor choice of example caused confusion, I hope the issue > I'm > trying to point out is clear now. > Sure, it's a valid concern. (I did look at the list to see if any current one was a conflict; your mail make me think I missed something!) Another possible solution is to have .asdict() and .fromdict() methods to return a dictionary and a set from a dictionary and avoid the whole attribute conflict issue. 
Perry From Fernando.Perez at colorado.edu Mon Oct 31 10:04:45 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Mon, 31 Oct 2005 08:04:45 -0700 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <43629D33.1000704@ee.byu.edu> References: <4349EDA2.2090802@ee.byu.edu> <434A9453.2010705@csun.edu> <434A98EB.7030407@ee.byu.edu> <434D8F27.100@colorado.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> <1130345637.4822.24.camel@E011704> <435FBF12.9010406@colorado.edu> <435FD143.1050903@colorado.edu> <43629D33.1000704@ee.byu.edu> Message-ID: <4366328D.8090106@colorado.edu> Travis Oliphant wrote: > Fernando Perez wrote: >>Since the 'mental slot' is already in scipy's users heads for saying 'modify >>the default output of this function to accumulate/store data in a different >>type', I think it would be reasonable to offer >> >>sqrt(a,rtype=float) >> >> > > One thing we could do is take advantage of the "indexing capabilities" > of the ufunc object which are unused at the moment and do something like > > sqrt[float](a) > > Where the argument to the index would be the desired output type or > something. This one is a bit intriguing. I kind of like it, but I worry that it's a bit too unique of a usage. I've never seen this kind of use elsewhere in python code 'in the wild', and I wonder if it's not too orthogonal to common usage to force people to learn this particular special case. Cheers, f From chanley at stsci.edu Mon Oct 31 10:10:01 2005 From: chanley at stsci.edu (Christopher Hanley) Date: Mon, 31 Oct 2005 10:10:01 -0500 Subject: [SciPy-dev] Broken Python Type Alias In-Reply-To: <43662D87.5080800@stsci.edu> References: <43662D87.5080800@stsci.edu> Message-ID: <436633C9.4060008@stsci.edu> I should add that this is also true for the other Python types as well (int, bool, etc...). 
Christopher Hanley wrote: > Greetings, > > The alias to the python type "sb.float" that previously was working in > my regression testing system now raises an attribute error. However, > "sb.float_" still functions correctly. Example: > > > > In [16]: x = sb.ones((10,),dtype=sb.float) > --------------------------------------------------------------------------- > exceptions.AttributeError Traceback (most > recent call last) > > /data/sparty1/dev/devCode/ > > AttributeError: 'module' object has no attribute 'float' > > > > My regression tests first showed this behavior yesterday morning with > the checkout of newcore revision 1396. > > Chris > From jmiller at stsci.edu Mon Oct 31 11:31:25 2005 From: jmiller at stsci.edu (Todd Miller) Date: Mon, 31 Oct 2005 11:31:25 -0500 Subject: [SciPy-dev] Is "bool" the right name? Message-ID: <436646DD.1060306@stsci.edu> Trying to add some newcore compatibility to numarray, I bowled through the obvious name collision between the new numarray "bool" and the Python scalar type "bool." The name collision is easy enough to work around in the numarray internals, but it occurred to me that "bool8" might be a better name since it is more explicit, consistent with the other type names, and clearly resolves the recurring question "Is this a 1-bit bool or not?". So, while newcore is still young, does it make sense to rename "bool" to "bool8"? 
Regards, Todd From charlesr.harris at gmail.com Mon Oct 31 12:30:43 2005 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 31 Oct 2005 10:30:43 -0700 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: <4366328D.8090106@colorado.edu> References: <4349EDA2.2090802@ee.byu.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> <1130345637.4822.24.camel@E011704> <435FBF12.9010406@colorado.edu> <435FD143.1050903@colorado.edu> <43629D33.1000704@ee.byu.edu> <4366328D.8090106@colorado.edu> Message-ID: On 10/31/05, Fernando Perez wrote: > > Travis Oliphant wrote: > > Fernando Perez wrote: > > >>Since the 'mental slot' is already in scipy's users heads for saying > 'modify > >>the default output of this function to accumulate/store data in a > different > >>type', I think it would be reasonable to offer > >> > >>sqrt(a,rtype=float) > >> > >> > > > > One thing we could do is take advantage of the "indexing capabilities" > > of the ufunc object which are unused at the moment and do something like > > > > sqrt[float](a) > > > > Where the argument to the index would be the desired output type or > > something. > > This one is a bit intriguing. I kind of like it, but I worry that it's a > bit > too unique of a usage. I've never seen this kind of use elsewhere in > python > code 'in the wild', and I wonder if it's not too orthogonal to common > usage to > force people to learn this particular special case. > I kind of like it too. I'm not to worried about the special usage case because controlling types in python is itself a special usage. I guess another question is why the indexing capability was originally added. Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From oliphant at ee.byu.edu Mon Oct 31 12:39:03 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 10:39:03 -0700 Subject: [SciPy-dev] Question about 64-bit integers being cast to double precision In-Reply-To: References: <4349EDA2.2090802@ee.byu.edu> <1129157213.4839.5.camel@E011704> <435FA822.2050501@csun.edu> <1130345637.4822.24.camel@E011704> <435FBF12.9010406@colorado.edu> <435FD143.1050903@colorado.edu> <43629D33.1000704@ee.byu.edu> <4366328D.8090106@colorado.edu> Message-ID: <436656B7.6080708@ee.byu.edu> > I kind of like it too. I'm not to worried about the special usage case > because controlling types in python is itself a special usage. I guess > another question is why the indexing capability was originally added. I'm not sure what you mean. Perhaps my wording was confusing. There is currently no indexing capability of ufunc objects. But, because ufuncs are Python types, we could add indexing capability to accomodate a use such as this one. Most likely the use case would return a special ufunc object that would handle casting differently then the default. -Travis From oliphant at ee.byu.edu Mon Oct 31 12:31:09 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 10:31:09 -0700 Subject: [SciPy-dev] Is "bool" the right name? In-Reply-To: <436646DD.1060306@stsci.edu> References: <436646DD.1060306@stsci.edu> Message-ID: <436654DD.3070601@ee.byu.edu> Todd Miller wrote: >So, while newcore is still young, does it make sense to rename "bool" >to "bool8"? > > I think that bool8 would be an appropriate addition to the bit-width names. Good idea. 
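Since ufuncs are Python objects, the `sqrt[float](a)` idea discussed above can be prototyped without touching C. A hypothetical sketch with modern NumPy (the `TypedUfunc` wrapper and its behaviour are inventions for illustration, not the proposed implementation):

```python
import numpy as np

class TypedUfunc(object):
    """Indexing the wrapper with an output type returns a callable
    that forces that type, instead of relying on default coercion."""
    def __init__(self, ufunc):
        self.ufunc = ufunc

    def __call__(self, *args):
        return self.ufunc(*args)          # default coercion rules

    def __getitem__(self, rtype):
        def typed(*args):
            # cast inputs so the computation runs in the requested type
            return self.ufunc(*[np.asarray(a, dtype=rtype) for a in args])
        return typed

sqrt = TypedUfunc(np.sqrt)
a = np.array([1, 2, 3])                    # integer input
assert sqrt[np.float32](a).dtype == np.float32
assert sqrt(a).dtype == np.float64         # default: double precision
```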
-Travis From oliphant at ee.byu.edu Mon Oct 31 12:18:36 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 10:18:36 -0700 Subject: [SciPy-dev] Broken Python Type Alias In-Reply-To: <43662D87.5080800@stsci.edu> References: <43662D87.5080800@stsci.edu> Message-ID: <436651EC.2090904@ee.byu.edu> Christopher Hanley wrote: >Greetings, > >The alias to the python type "sb.float" that previously was working in >my regression testing system now raises an attribute error. However, >"sb.float_" still functions correctly. Example: > > > > > Before making a beta release, I decided against overwriting the Python names, so it's sb.float_. But, dtype=float still works and maps to the c-double type. -Travis From oliphant at ee.byu.edu Mon Oct 31 12:22:30 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 10:22:30 -0700 Subject: [SciPy-dev] alternate .flags interface In-Reply-To: <43662B6F.8030000@colorado.edu> References: <4366139C.2060007@ftw.at> <43662B6F.8030000@colorado.edu> Message-ID: <436652D6.5020001@ee.byu.edu> Fernando Perez wrote: >Perry Greenfield wrote: > > >>I'm wondering if the current flags interface couldn't be made a bit >>easier to use to allow mapping of dictionary keys into attributes. For >>example: >> >>instead of arr.flags["CONTIGUOUS"] >> >>arr.flags.contiguous >> >>If using a dictionary is considered useful (e.g., for getting the whole >>state), it is still possible to allow arr.flags to return a dictionary >>derived object that can be used wherever dictionaries are accepted, and >>likewise, assigning to arr.flags should be able to take a dictionary of >>flags. Any reason we can't support this interface? >> >> > >Mmh. What happens when you say > >arr.flags.update = 1 > >and then pass arr.flags to a dict-expecting method which calls > >arr.flags.update(otherdict) > > I'm not sure I follow what the problem is. Is there an update flag?
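Perry's proposal (a dict-derived object whose keys are also readable as attributes) is easy to sketch, and the sketch also shows what happens in the collision case Fernando raises: real dict methods like update take precedence over same-named flags. This is a hypothetical illustration only; the actual flags object in scipy/base/_internal.py differs.

```python
class FlagsDict(dict):
    """Toy flags object: usable wherever a dict is accepted, with
    lowercase attribute access mapped onto uppercase flag keys."""
    def __getattr__(self, name):
        # Only reached when normal attribute lookup fails, so genuine
        # dict methods (update, keys, ...) still shadow same-named flags.
        try:
            return self[name.upper()]
        except KeyError:
            raise AttributeError(name)

flags = FlagsDict(CONTIGUOUS=True, FORTRAN=False)
print(flags["CONTIGUOUS"], flags.contiguous)  # True True
print(callable(flags.update))  # True: the dict method wins
```

Since __getattr__ fires only after ordinary lookup fails, flags.update remains the dict method, which is exactly the ambiguity Fernando is pointing at: a hypothetical UPDATE flag could not be read as flags.update.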
First of all, arr.flags already returns a special dictionary (so that tests like Fortran but not contiguous are easy). If you want to use attribute access to get and set the dictionary items, I don't see how that couldn't be done. -Travis From oliphant at ee.byu.edu Mon Oct 31 12:17:12 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 10:17:12 -0700 Subject: [SciPy-dev] alternate .flags interface In-Reply-To: References: <4366139C.2060007@ftw.at> Message-ID: <43665198.2080601@ee.byu.edu> Perry Greenfield wrote: >I'm wondering if the current flags interface couldn't be made a bit >easier to use to allow mapping of dictionary keys into attributes. For >example: > >instead of arr.flags["CONTIGUOUS"] > >arr.flags.contiguous > >If using a dictionary is considered useful (e.g., for getting the whole >state), it is still possible to allow arr.flags to return a dictionary >derived object that can be used wherever dictionaries are accepted, and >likewise, assigning to arr.flags should be able to take a dictionary of >flags. Any reason we can't support this interface? > > > Actually this would be easy, because arr.flags already returns a special object defined in scipy/base/_internal.py. So yes, it could be done. -Travis From oliphant at ee.byu.edu Mon Oct 31 12:24:38 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 10:24:38 -0700 Subject: [SciPy-dev] Is "bool" the right name? In-Reply-To: <436646DD.1060306@stsci.edu> References: <436646DD.1060306@stsci.edu> Message-ID: <43665356.4020409@ee.byu.edu> Todd Miller wrote: >Trying to add some newcore compatibility to numarray, I bowled through >the obvious name collision between the new numarray "bool" and the >Python scalar type "bool."
The name collision is easy enough to work >around in the numarray internals, but it occurred to me that "bool8" >might be a better name since it is more explicit, consistent with the >other type names, and clearly resolves the recurring question "Is this >a 1-bit bool or not?". > >So, while newcore is still young, does it make sense to rename "bool" >to "bool8"? > > Actually, I renamed everything that was conflicting with standard Python types with an appended underscore. When used as a data type, the standard Python scalars actually work to identify what is desired as well. So dtype=bool_ and dtype=bool produce the same result, but of course bool_(...) produces an array of bools and bool() produces a single Python bool. -Travis From oliphant at ee.byu.edu Mon Oct 31 12:26:26 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 10:26:26 -0700 Subject: [SciPy-dev] array logical ops error? In-Reply-To: References: <4365F457.4060202@ntc.zcu.cz> Message-ID: <436653C2.5030003@ee.byu.edu> Alan G Isaac wrote: >On Mon, 31 Oct 2005, Robert Cimrman apparently wrote: > > >>is this the expected behaviour? >>IMHO (b * c) == (b and c), (b + c) == (b or c) should hold... >> >> > >I expected the Boolean operations to yield >element-by-element comparisons. What are they?? In >contrast, the + and * operators give the expected results. > > > This is a Python deal. It would be nice if b and c did the same thing as b * c, but Python does not allow overloading of the "and" and "or" operators (a PEP proposing this would be possible). Thus, "b and c" evaluates the truth of b and the truth of c as a whole (not elementwise), and there is no way to override this.
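Travis's point can be made concrete with a toy class: Python resolves "b and c" by truth-testing the whole left operand, which no special method can make elementwise, while & and | dispatch to __and__ and __or__ and so can be overloaded. A minimal sketch in modern Python (Vec is an invented class standing in for a boolean array, not a scipy type):

```python
class Vec:
    """Toy boolean vector showing why 'and'/'or' can't be elementwise."""
    def __init__(self, data):
        self.data = list(data)

    def __and__(self, other):
        # '&' is overloadable, so elementwise AND is possible here.
        return Vec(a and b for a, b in zip(self.data, other.data))

    def __or__(self, other):
        # '|' is overloadable, so elementwise OR is possible here.
        return Vec(a or b for a, b in zip(self.data, other.data))

    def __bool__(self):
        # 'b and c' calls this on the whole object: no elementwise hook.
        raise ValueError("truth value of a vector is ambiguous")

b = Vec([True, False])
c = Vec([True, True])
print((b & c).data)  # [True, False]
print((b | c).data)  # [True, True]
```

Writing "b and c" with this class raises the ValueError from __bool__, which is the same structural limitation Travis describes (and the reason David Cooke's advice in the next message is to use & and |).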
-Travis From oliphant at ee.byu.edu Mon Oct 31 12:42:49 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 10:42:49 -0700 Subject: [SciPy-dev] Adding to an array with __radd__() In-Reply-To: <4366139C.2060007@ftw.at> References: <4366139C.2060007@ftw.at> Message-ID: <43665799.9010801@ee.byu.edu> Ed Schofield wrote: >Sparse matrix objects now support adding a sparse matrix to a dense >matrix of the same dimensions with the syntax: > >newdense = sparse + dense > >I'd like to add support for the opposite order too: > >newdense = dense + sparse > >but this doesn't seem possible currently. This operation calls the >__radd__() method of the sparse matrix object multiple times, each time >passing a single element from the dense matrix. > Why does it do this? This does not seem to be the way Python would handle it. Can you track exactly what gets called when dense + sparse is performed? -Travis From cookedm at physics.mcmaster.ca Mon Oct 31 13:15:00 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Mon, 31 Oct 2005 13:15:00 -0500 Subject: [SciPy-dev] Is "bool" the right name? In-Reply-To: <43665356.4020409@ee.byu.edu> References: <436646DD.1060306@stsci.edu> <43665356.4020409@ee.byu.edu> Message-ID: <800E5A70-4997-423F-950F-6316AB254E02@physics.mcmaster.ca> On Oct 31, 2005, at 12:24, Travis Oliphant wrote: > Todd Miller wrote: >> Trying to add some newcore compatibility to numarray, I bowled >> through >> the obvious name collision between the new numarray "bool" and the >> Python scalar type "bool." The name collision is easy enough to >> work >> around in the numarray internals, but it occurred to me that "bool8" >> might be a better name since it is more explicit, consistent with the >> other type names, and clearly resolves the recurring question "Is >> this >> a 1-bit bool or not?". >> >> So, while newcore is still young, does it make sense to rename >> "bool" >> to "bool8"? 
>> > Actually, I renamed everything that was conflicting with standard > Python > types with an appended underscore. ... and I also added an __all__ to scipy.base.numerictypes that doesn't export the Python types (this is probably what Christopher Hanley ran into). I get real nervous overwriting builtin types like that, even though it's supposedly the same object. Static checkers (like pylint; haven't got pychecker to work right yet) complain mightily when that happens. -- |>|\/|< /------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From cookedm at physics.mcmaster.ca Mon Oct 31 13:18:24 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Mon, 31 Oct 2005 13:18:24 -0500 Subject: [SciPy-dev] array logical ops error? In-Reply-To: References: <4365F457.4060202@ntc.zcu.cz> Message-ID: On Oct 31, 2005, at 07:37, Alan G Isaac wrote: > On Mon, 31 Oct 2005, Robert Cimrman apparently wrote: > >> is this the expected behaviour? >> IMHO (b * c) == (b and c), (b + c) == (b or c) should hold... >> > > I expected the Boolean operations to yield > element-by-element comparisons. What are they?? In > contrast, the + and * operators give the expected results. Use & and | instead of 'and' and 'or'. -- |>|\/|< /------------------------------------------------------------------\ |David M.
Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From schofield at ftw.at Mon Oct 31 13:29:55 2005 From: schofield at ftw.at (Ed Schofield) Date: Mon, 31 Oct 2005 19:29:55 +0100 Subject: [SciPy-dev] Adding to an array with __radd__() In-Reply-To: <43665799.9010801@ee.byu.edu> References: <4366139C.2060007@ftw.at> <43665799.9010801@ee.byu.edu> Message-ID: <436662A3.4020101@ftw.at> Travis Oliphant wrote: >Ed Schofield wrote: > > > >>Sparse matrix objects now support adding a sparse matrix to a dense >>matrix of the same dimensions with the syntax: >> >>newdense = sparse + dense >> >>I'd like to add support for the opposite order too: >> >>newdense = dense + sparse >> >>but this doesn't seem possible currently. This operation calls the >>__radd__() method of the sparse matrix object multiple times, each time >>passing a single element from the dense matrix. >> >> >Why does it do this? This does not seem to be the way Python would >handle it. > >Can you track exactly what gets called when > >dense + sparse > >is performed? > > Sure. Actually, it looks like scipy 0.3.2 handled it fine.
This script: ----------------------- class A: def __radd__(a, b=None, c=None): print "A.__radd__() called, with arguments (%s, %s, %s)" % (a, b, c) return "A.radd()\n" def __add__(a, b=None, c=None): print "A.__add__() called, with arguments (%s, %s, %s)" % (a, b, c) return "A.add()\n" class B: def __radd__(a, b=None, c=None): print "B.__radd__() called, with arguments (%s, %s, %s)" % (a, b, c) return "B.radd()\n" a = A() b = B() print "a + b is: " + str(a + b) print "b + a is: " + str(b + a) import scipy c = scipy.array([[1,2,3],[4,5,6]]) print "c + a is: " + str(c + a) print "a + c is: " + str(a + c) ------------------ produces this output with scipy 0.3.2: ------------------ A.__add__() called, with arguments (<__main__.A instance at 0xb7eaff8c>, <__main__.B instance at 0xb7eaffec>, None) a + b is: A.add() A.__radd__() called, with arguments (<__main__.A instance at 0xb7eaff8c>, <__main__.B instance at 0xb7eaffec>, None) b + a is: A.radd() A.__radd__() called, with arguments (<__main__.A instance at 0xb7eaff8c>, [[1 2 3] [4 5 6]], None) c + a is: A.radd() A.__add__() called, with arguments (<__main__.A instance at 0xb7eaff8c>, [[1 2 3] [4 5 6]], None) a + c is: A.add() --------------------- as expected. 
With newcore (0.4.3.1401) I get: --------------------- A.__add__() called, with arguments (<__main__.A instance at 0xb7e98f8c>, <__main__.B instance at 0xb7e98fec>, None) a + b is: A.add() A.__radd__() called, with arguments (<__main__.A instance at 0xb7e98f8c>, <__main__.B instance at 0xb7e98fec>, None) b + a is: A.radd() Importing io to scipy Importing interpolate to scipy Importing fftpack to scipy Importing special to scipy Importing cluster to scipy Importing sparse to scipy Importing signal to scipy Failed to import signal cannot import name comb Importing utils to scipy Importing lib to scipy Importing integrate to scipy Importing optimize to scipy Importing linalg to scipy Importing stats to scipy A.__radd__() called, with arguments (<__main__.A instance at 0xb7ecaf8c>, 1, None) A.__radd__() called, with arguments (<__main__.A instance at 0xb7ecaf8c>, 2, None) A.__radd__() called, with arguments (<__main__.A instance at 0xb7ecaf8c>, 3, None) A.__radd__() called, with arguments (<__main__.A instance at 0xb7ecaf8c>, 4, None) A.__radd__() called, with arguments (<__main__.A instance at 0xb7ecaf8c>, 5, None) A.__radd__() called, with arguments (<__main__.A instance at 0xb7ecaf8c>, 6, None) c + a is: [[A.radd() A.radd() A.radd() ] [A.radd() A.radd() A.radd() ]] A.__add__() called, with arguments (<__main__.A instance at 0xb7ecaf8c>, [[1 2 3] [4 5 6]], None) a + c is: A.add() -------------------- Tracing through the code, it seems the following functions are called: array_add() PyArray_GenericBinaryFunction(dense, newsparse, n_ops.add) ... a wrapper I don't understand ... then PyNumber_Add() from __umath_generated.c (which supposedly works elementwise) PyArray_FromAny() ... and ... array_fromobject() which queries various attributes like __array__, __array_shape__, and __len__ I haven't yet checked further than this ... e.g. whether the type == PyArray_NOTYPE succeeds ... 
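For contrast with the trace above, the protocol Python itself follows for mixed-type addition is: call the left operand's __add__, and if that returns NotImplemented, call the right operand's __radd__ exactly once with the whole left object. The per-element __radd__ calls in the newcore output therefore come from the array's own elementwise machinery, not from Python's dispatch. A minimal sketch (Dense and Sparse here are stand-in classes, not the scipy ones):

```python
class Sparse:
    def __radd__(self, other):
        # Invoked once, with the entire left operand, not per element.
        return ("radd", other)

class Dense:
    def __add__(self, other):
        if isinstance(other, Sparse):
            # A well-behaved left operand defers to the right operand;
            # Python then tries other.__radd__(self).
            return NotImplemented
        return ("add", other)

d, s = Dense(), Sparse()
result = d + s
print(result[0])  # radd
```

Under this protocol, dense + sparse should produce a single __radd__ call receiving the dense object, which is exactly the scipy 0.3.2 behavior in Ed's output.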
-- Ed From oliphant at ee.byu.edu Mon Oct 31 17:43:29 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 15:43:29 -0700 Subject: [SciPy-dev] PyArray_CanCastSafely(exact,inexact) on 64-bit In-Reply-To: References: <436575FF.8080201@ee.byu.edu> Message-ID: <43669E11.7000605@ee.byu.edu> Arnd Baecker wrote: >That is great news! I also get this on my 64 Bit machine! > > Wonderful... >Just in case it has fallen through the cracks: >Concerning the check_integer problem: > > def check_integer(self): > from scipy import stats > a = stats.randint.rvs(1,20,size=(3,4)) > fname = tempfile.mktemp('.dat') > io.write_array(fname,a) > b = io.read_array(fname,atype=N.Int) > assert_array_equal(a,b) > os.remove(fname) > >Executing this line by line shows the error for > b = io.read_array(fname,atype=N.Int) > >Doing > b = io.read_array(fname) >reads in the array, but it gives floats. > >However, > b = io.read_array(fname,atype=N.Int32) >works. > >If this is the intended behaviour (also on 32Bit), >the unit test should be changed accordingly... > > What is the type of 'a' (i.e. stats.randint.rvs(1,20,size=(3,4)) ) on your platform? This does look like a problem of type mismatch on 64-bit that we can handle much better now. It looks like randint is returning 32-bit numbers, but N.Int is 'l' which on your 64-bit platform is a 64-bit integer. This would definitely be the problem. I've changed the test... >BTW: wouldn't io be much better in newcore instead >of newscipy? It seems like something of very common use. > > > Yes, that is a strong possibility. The only issue is that scipy arrays can now be directly written to files (using the tofile method) and read from files (using fromfile function). The scipy.io code essentially re-does all of that (but not for the new flexible arrays). It would need to be adapted before I think it would be a good fit. 
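The width mismatch Travis diagnoses (the 'l' type is the C long, which is 8 bytes on most 64-bit Unix platforms but 4 bytes on 32-bit ones) can be demonstrated with the standard library alone. This sketch shows the general failure mode of treating 32-bit integer data as 64-bit integers; it illustrates the mismatch itself, not the read_array code path:

```python
import ctypes
import struct

# C 'long' width varies by platform: 8 bytes on LP64, 4 on 32-bit.
print(ctypes.sizeof(ctypes.c_long))

# Pack four 32-bit ints, then misread the same bytes as two 64-bit ints.
values = (1, 2, 3, 4)
buf = struct.pack("<4i", *values)      # 16 bytes of 32-bit data
misread = struct.unpack("<2q", buf)    # reinterpreted at 64-bit width
print(misread)  # adjacent 32-bit values fused pairwise, not (1, 2, 3, 4)
```

Each misread 64-bit value fuses two of the original 32-bit values (low word plus high word shifted by 32 bits), which is why asking for a wider integer type than was written cannot round-trip the data.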
-Travis From oliphant at ee.byu.edu Mon Oct 31 19:07:41 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 31 Oct 2005 17:07:41 -0700 Subject: [SciPy-dev] Adding to an array with __radd__() In-Reply-To: <436662A3.4020101@ftw.at> References: <4366139C.2060007@ftw.at> <43665799.9010801@ee.byu.edu> <436662A3.4020101@ftw.at> Message-ID: <4366B1CD.6050805@ee.byu.edu> Ed Schofield wrote: I think this is a result of the fact that arrays are now new style numbers and the fact that array(B) created a perfectly nice object array. I've committed a change that special-cases this so the reflected operands will work if they are defined for something that becomes an object array. -Travis From rkern at ucsd.edu Mon Oct 31 20:32:49 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 31 Oct 2005 17:32:49 -0800 Subject: [SciPy-dev] Interpolating scattered 2D data Message-ID: <4366C5C1.7050604@ucsd.edu> Short post: http://svn.scipy.org/svn/scipy/branches/newscipy/Lib/sandbox/delaunay/ Long post: I finally found some code to do 2-D Delaunay triangulations that was * efficient and robust * BSD-ish licensed * coded to behave nicely as a library rather than a standalone program * compilable with a modern compiler http://mapviewer.skynet.ie/voronoi.html It's a modest C++-ification of Steve Fortune's classic sweepline code (which previously failed the last two criteria). It needed some tweaking to expose all of the information I needed for interpolation, but that was pretty easy. I've implemented two interpolation algorithms. One is just a linear interpolation on each triangle using the three node values to determine a plane. I've also implemented natural neighbor interpolation via Sibson's method. Currently, only interpolation to a regular, rectangular grid is supported, but I'll add general interpolation shortly.
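The linear scheme Robert describes (three node values determining a plane over each triangle) amounts to barycentric interpolation. A self-contained sketch follows; linear_interp is an invented helper for illustration, not the actual scipy.sandbox.delaunay API:

```python
def linear_interp(tri, z, p):
    """Evaluate the plane through three (x, y, z) samples at point p,
    using barycentric weights on triangle tri."""
    (x1, y1), (x2, y2), (x3, y3) = tri
    px, py = p
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (px - x3) + (x3 - x2) * (py - y3)) / det
    w2 = ((y3 - y1) * (px - x3) + (x1 - x3) * (py - y3)) / det
    w3 = 1.0 - w1 - w2
    # The weights sum to 1, so this reproduces the plane z = a*x + b*y + c.
    return w1 * z[0] + w2 * z[1] + w3 * z[2]

tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
z = [0.0, 1.0, 2.0]   # node values lying on the plane z = x + 2*y
print(linear_interp(tri, z, (0.25, 0.25)))  # 0.75
```

Because the interpolant is exactly the plane through the three nodes, it is continuous across shared triangle edges, which is why the per-triangle scheme produces a continuous (if faceted) surface over the whole triangulation.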
Here's an example: In [5]: import scipy In [6]: from matplotlib.pylab import * In [7]: from scipy.sandbox import delaunay In [8]: x = scipy.random.uniform(-0.5, 1.5, size=1000) In [9]: y = scipy.random.uniform(-0.5, 1.5, size=1000) In [10]: z = 2.0*cos(10.0*x)*sin(10.0*y) + sin(10.0*x*y) In [11]: tri = delaunay.Triangulation(x, y) In [12]: nni = tri.nn_interpolator(z) In [13]: x0 = y0 = 0.0 In [14]: x1 = y1 = 1.0 In [15]: f = nni[y0:y1:101j, x0:x1:101j] In [16]: imshow(f, extent=(x0,x1,y0,y1)) Out[16]: You can see a bunch of images of test functions here: http://starship.python.net/crew/kernr/nn-images/ Note that the contours aren't being computed in any intelligent way. I just interpolated to a 101x101 grid and used pylab's contour() function. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From dd55 at cornell.edu Mon Oct 31 21:38:48 2005 From: dd55 at cornell.edu (Darren Dale) Date: Mon, 31 Oct 2005 21:38:48 -0500 Subject: [SciPy-dev] scipy.org and numarray Message-ID: <200510312138.49269.dd55@cornell.edu> I just wanted to suggest that the following page on www.scipy.org is now out of date. http://www.scipy.org/documentation/numarraydiscuss.html Darren