From ravi at ati.com Thu Sep 1 11:07:49 2005 From: ravi at ati.com (Ravikiran Rajagopal) Date: Thu, 1 Sep 2005 11:07:49 -0400 Subject: [SciPy-dev] Submitting Patches In-Reply-To: <4569.195.251.195.65.1125474924.squirrel@webmail1.b-one.nu> References: <4569.195.251.195.65.1125474924.squirrel@webmail1.b-one.nu> Message-ID: <200509011107.49825.ravi@ati.com> Hi, > what is the appropriate procedure to submit patches? I sent to the list > a patch for RQ decomposition a month ago but got no response. Will it be > committed? Will it not? The current SVN version of scipy did not compile the last time I checked. At this point, it is sort of hard to get any meaningful testing done. Once all the issues introduced by the migration from CVS to SVN are done, I will test the patch. Regards, Ravi From oliphant at ee.byu.edu Fri Sep 2 17:32:15 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 02 Sep 2005 15:32:15 -0600 Subject: [SciPy-dev] New Lyx Version eas(ier) to install on Windows. Message-ID: <4318C4DF.2090805@ee.byu.edu> There is a new Lyx Version that works on Windows quite well -- windows is now officially supported. So, those of you who have been hesitant to try LyX because it supposedly doesn't work on Windows, have no more excuse ;-) An executable installer is available at http://www.lyx.org The setup process guides you through the other programs that are needed (ImageMagick, Ghostscript, a minimal shell like minsys, Python, and optionally Perl) to run lyx well. -Travis From oliphant at ee.byu.edu Fri Sep 2 19:53:49 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 02 Sep 2005 17:53:49 -0600 Subject: [SciPy-dev] scipy core (Numeric3) win32 binaries to play with Message-ID: <4318E60D.4040304@ee.byu.edu> If anybody has just been waiting for a windows binary to try out the new Numeric (scipy.base) you can download this. from scipy.base import * (replaces from Numeric import *) The installer is here: http://numeric.scipy.org/files/scipy_core-0.4.0.win32-py2.4.exe From aisaac at american.edu Fri Sep 2 20:38:15 2005 From: aisaac at american.edu (Alan G Isaac) Date: Fri, 2 Sep 2005 20:38:15 -0400 Subject: [SciPy-dev] scipy core (Numeric3) win32 binaries to play with In-Reply-To: <4318E60D.4040304@ee.byu.edu> References: <4318E60D.4040304@ee.byu.edu> Message-ID: On Fri, 02 Sep 2005, Travis Oliphant apparently wrote: > http://numeric.scipy.org/files/scipy_core-0.4.0.win32-py2.4.exe So far so good. Thanks! Alan Isaac From aisaac at american.edu Sat Sep 3 02:23:57 2005 From: aisaac at american.edu (Alan G Isaac) Date: Sat, 3 Sep 2005 02:23:57 -0400 Subject: [SciPy-dev] New Lyx Version eas(ier) to install on Windows. In-Reply-To: <4318C4DF.2090805@ee.byu.edu> References: <4318C4DF.2090805@ee.byu.edu> Message-ID: On Fri, 02 Sep 2005, Travis Oliphant apparently wrote: > those of you who have been hesitant to try LyX because it supposedly > doesn't work on Windows, have no more excuse ;-) > An executable installer is available at http://www.lyx.org > The setup process guides you through the other programs that are needed > (ImageMagick, Ghostscript, a minimal shell like minsys, Python, and > optionally Perl) to run lyx well. 1. A question: what ImageMagick file is needed? http://sourceforge.net/project/showfiles.php?group_id=24099 (I.e., what do you get out of "dynamic"?) Comment: ImageMagick seems very RAM intensive. 2. I do not see the shell requirement. Is that just your recommendation? 3. Are you using it on Windows? If so, how's it going? 
Thanks, Alan Isaac From oliphant at ee.byu.edu Sun Sep 4 05:00:29 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sun, 04 Sep 2005 03:00:29 -0600 Subject: [SciPy-dev] New Lyx Version eas(ier) to install on Windows. In-Reply-To: References: <4318C4DF.2090805@ee.byu.edu> Message-ID: <431AB7AD.50709@ee.byu.edu> Alan G Isaac wrote: >On Fri, 02 Sep 2005, Travis Oliphant apparently wrote: > > >>those of you who have been hesitant to try LyX because it supposedly >>doesn't work on Windows, have no more excuse ;-) >> >> > > > >>An executable installer is available at http://www.lyx.org >> >> > > > >>The setup process guides you through the other programs that are needed >>(ImageMagick, Ghostscript, a minimal shell like minsys, Python, and >>optionally Perl) to run lyx well. >> >> > >1. A question: what ImageMagick file is needed? > http://sourceforge.net/project/showfiles.php?group_id=24099 > (I.e., what do you get out of "dynamic"?) > Comment: ImageMagick seems very RAM intensive. > > You just need the program convert (use the Q8 version which assumes 8 bits per channel rather than the more RAM intensive 16 bit-per channel Q16 version). Lots of ImageMagick versions will work. If you don't have the convert.exe program, the program should still work, but you won't be able to see graphics included in the file on screen. >2. I do not see the shell requirment. Is that just your > recommendation? > > The new installer for Lyx 1.3.6 on windows will guide you through and ask for the directory where sh.exe lives, or I think it tries to download msys for you. >3. Are you using it on Windows? If so, how's it going? > > I edited on Windows last night, and it seemed just fine. My main platform is not Windows, however. -Travis From oliphant at ee.byu.edu Tue Sep 6 13:43:21 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 06 Sep 2005 11:43:21 -0600 Subject: [SciPy-dev] Re: [SciPy-user] Upgrading scipy.io.loadmat In-Reply-To: <431DBDE9.6030901@mit.edu> References: <431DBDE9.6030901@mit.edu> Message-ID: <431DD539.1090406@ee.byu.edu> Nick Fotopoulos wrote: > Greetings, > > A couple months ago, I posted to scipy-user asking if anyone knew how to > fix scipy.io.loadmat to read Matlab 7 (R14) mat-files. Receiving no > response, I hacked the mio.py myself and can now read in the new version > of mat-files. While I was tinkering, I fixed the in-place-but-broken > support for complex arrays; it returned only the real parts before. Excellent, your R14 contribution is greatly appreciated. By the way, I believe the complex array problem was fixed (which version of SciPy did you start from?). At any rate, I would be glad to apply your patches. There is a bug-tracker at sourceforge where you can store a patch to make sure it gets recorded. But, make sure you let scipy-dev know about the patch so we can apply it. Finally, scipy is undergoing some changes and so don't be discouraged if your patch doesn't get applied immediately. It's on the queue.... -Travis O. From jmiller at stsci.edu Wed Sep 7 11:08:09 2005 From: jmiller at stsci.edu (Todd Miller) Date: Wed, 07 Sep 2005 11:08:09 -0400 Subject: [SciPy-dev] numarray weave commit? weave maintainer? Message-ID: <1126105689.2353.19.camel@halloween.stsci.edu> I have a patch for weave for numarray which passes all of the default self-tests. In doing my final checkout, I noticed that weave.test(level=10) has some errors and eventually segfaults... for Numeric. This was on RHEL3 with Python-2.4.1. Is anyone actively maintaining weave? 
Does anyone have a problem with me committing changes which enable numarray to pass weave.test() even though the level=10 tests are still showing problems for both numarray and Numeric? Todd From jmiller at stsci.edu Fri Sep 9 11:12:36 2005 From: jmiller at stsci.edu (Todd Miller) Date: Fri, 09 Sep 2005 11:12:36 -0400 Subject: [SciPy-dev] Re: [SciPy-user] More installation troubles In-Reply-To: References: <9e8c52a205090801005dbc0485@mail.gmail.com> <9e8c52a2050908051838abe874@mail.gmail.com> Message-ID: <1126278755.27040.282.camel@halloween.stsci.edu> On Thu, 2005-09-08 at 07:43, Pearu Peterson wrote: > On Thu, 8 Sep 2005, Alexander Borghgraef wrote: > > > > Yep. Removing numarray solved that problem. Now I'm back to the include > > problem described in my other post. I had some comments and questions about this: 1. numarray and Numeric/scipy coexist fine for me. numarray can't yet replace Numeric in scipy but using both has not been a problem. 2. I can't reproduce the problem with Numeric importing numarray's dotblas during scipy setup but if there is one, we want to fix it. What's your OS and compiler toolset? What versions of numarray/Numeric/scipy did you install? 3. Has anyone else ever seen a problem with numarray's dotblas conflicting with Numeric's dotblas during scipy installation? Thanks, Todd From rstanchak at yahoo.com Fri Sep 9 14:02:03 2005 From: rstanchak at yahoo.com (Roman Stanchak) Date: Fri, 9 Sep 2005 11:02:03 -0700 (PDT) Subject: [SciPy-dev] xeon 64bit patch for scipy_core Message-ID: <20050909180203.2255.qmail@web30509.mail.mud.yahoo.com> gcc 3.4 didn't like -malign-double on a 64 bit machine. This is against yesterday's SVN repository. Scipy builds, and simple things seem to work, but scipy.test() segfaults, so it seems there are larger problems. -------------- next part -------------- A non-text attachment was scrubbed... Name: nocona.patch Type: text/x-diff Size: 1622 bytes Desc: 2927178388-nocona.patch URL: From prabhu_r at users.sf.net Fri Sep 9 16:25:04 2005 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Sat, 10 Sep 2005 01:55:04 +0530 Subject: [SciPy-dev] numarray weave commit? weave maintainer? In-Reply-To: <1126105689.2353.19.camel@halloween.stsci.edu> References: <1126105689.2353.19.camel@halloween.stsci.edu> Message-ID: <17185.61344.250357.788979@monster.linux.in> >>>>> "Todd" == Todd Miller writes: Todd> I have a patch for weave for numarray which passes all of Todd> the default self-tests. In doing my final checkout, I Todd> noticed that weave.test(level=10) has some errors and Todd> eventually segfaults... for Numeric. This was on RHEL3 with Todd> Python-2.4.1. Todd> Is anyone actively maintaining weave? Does anyone have a Todd> problem with me committing changes which enable numarray to Todd> pass weave.test() even though the level=10 tests are still Todd> showing problems for both numarray and Numeric? Well, I do use weave for some of my work and do try to keep the swig and vtk spec code in shape but do not maintain the core. I think Eric would know more about the tests and why they fail. However, I suspect that he is submerged with too much work. weave.test() itself passes for me but like you I get errors and a segfault with level=10. I suspect that tests for things like the wx_spec will not pass with the new versions of wxPython. So I am not sure if all the tests are supposed to pass at all. 
I would say go ahead and check in the stuff to get weave working with numarray while passing weave.test(). If you break any of our code I'm sure we will holler. :) cheers, prabhu From jmiller at stsci.edu Fri Sep 9 20:03:32 2005 From: jmiller at stsci.edu (Todd Miller) Date: Fri, 09 Sep 2005 20:03:32 -0400 Subject: [SciPy-dev] numarray weave commit? weave maintainer? In-Reply-To: <17185.61344.250357.788979@monster.linux.in> References: <1126105689.2353.19.camel@halloween.stsci.edu> <17185.61344.250357.788979@monster.linux.in> Message-ID: <1126310613.5295.2.camel@localhost.localdomain> On Sat, 2005-09-10 at 01:55 +0530, Prabhu Ramachandran wrote: > >>>>> "Todd" == Todd Miller writes: > > Todd> I have a patch for weave for numarray which passes all of > Todd> the default self-tests. In doing my final checkout, I > Todd> noticed that weave.test(level=10) has some errors and > Todd> eventually segfaults... for Numeric. This was on RHEL3 with > Todd> Python-2.4.1. > > Todd> Is anyone actively maintaining weave? Does anyone have a > Todd> problem with me committing changes which enable numarray to > Todd> pass weave.test() even though the level=10 tests are still > Todd> showing problems for both numarray and Numeric? > > Well, I do use weave for some of my work and do try to keep the swig > and vtk spec code in shape but do not maintain the core. I think Eric > would know more about the tests and why they fail. However, I suspect > that he is submerged with too much work. > > weave.test() itself passes for me but like you I get errors and a > segfault with level=10. I suspect that tests for like the wx_spec > will not pass with the new versions of wxPython. So I am not sure if > all the tests are supposed to pass at all. > > I would say go ahead and check in the stuff to get weave working with > numarray while passing weave.test(). If you break any of our code I'm > sure we will holler. :) > > cheers, > prabhu OK, Thanks Prabhu. It's checked in. Todd From stephen.walton at csun.edu Tue Sep 13 18:39:20 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Tue, 13 Sep 2005 15:39:20 -0700 Subject: [SciPy-dev] Scipy vs. Fedora Core 4 Message-ID: <43275518.1010405@csun.edu> In late June, in this message: http://www.scipy.org/mailinglists/mailman?fn=scipy-dev/2005-June/003041.html Ravikiran Rajagopal posted a patch to scipy_core/scipy_distutils/gnufcompiler.py to cause it to use gfortran if it was available, as well as some patches to routines in scipy/Lib/linalg to get them to compile with gfortran. In a followup message in mid August, he also suggested building ATLAS by changing 'g77' to 'gfortran' in Make.Linux_ARCH. I have a few comments and questions, after successfully building scipy on FC4 with the latest versions of gfortran and gcc 4, and using the patches Ravi posted on my copy of scipy (up to date from subversion as of today, which is to say scipy_core revision 986, scipy revision 1269). In addition to changing 'g77' to 'gfortran' in ATLAS's Make.Linux_ARCH, one also has to add '-ff2c' to the Fortran compilation switches. The reason is that ATLAS's setup procedure uses g77 to determine the Fortran to C calling conventions, but these conventions are different in gfortran. 'gfortran -ff2c' forces gfortran to be compatible with g77, which works the same as the old f2c command in this regard. 
In particular, -ff2c causes external Fortran names which contain embedded underscores to have two underscores appended; gfortran's default behavior is to only append one underscore to all external Fortran routine names. I believe it's OK to do this with atlas without requiring -ff2c to be used systemwide, as the only ATLAS routine names which are exposed to the outside world do not have embedded underscores in their names and therefore have only a single underscore appended by either gfortran or g77. To build numarray 1.3.3 and Numeric 23.8 against gfortran, it is sufficient to change 'g2c' to 'gfortran' in the list of libraries numarray links against (in cfg_packages.py for numarray, and in setup.py in Numeric 23.8). libg2c does not need to be linked against; in fact it won't get found by the numarray build procedure unless you force use of gcc 3.2 by using the gcc32 command. Having done this, the problems with hangups (routines running forever) that Ravi reported seem not to occur. What I do see, though, is that scipy.test(level=1,verbosity=10) issues a "STOP 778" for the test scipy.special.basic.test_basic.test_airye, causing python/ipython to exit at that point. This STOP is originating in Lib/special/d1mach.f, which is being miscompiled by gfortran, as one can see by comparing the values of d1mach(3), d1mach(4), and d1mach(5) when compiled with g77 and with gfortran. This is a known gfortran bug which originates in the fact that gfortran does not SAVE EQUIVALENCE variables properly (see http://gcc.gnu.org/bugzilla/show_bug.cgi?id=18518). I've reported a copy of the same bug to bugzilla.redhat.com as bug 168252. Despite the above, I think Ravi's patch should be incorporated into Scipy. I also think that the changes he made to the routines in linalg should be communicated upstream to the developers of those routines, as the things he fixed (GOTO's which jump into the middle of IF/ENDIF blocks) are clearly bugs. Steve Walton From stephen.walton at csun.edu Wed Sep 14 12:38:56 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 14 Sep 2005 09:38:56 -0700 Subject: [SciPy-dev] Scipy vs. Fedora Core 4 In-Reply-To: <43275518.1010405@csun.edu> References: <43275518.1010405@csun.edu> Message-ID: <43285220.8080001@csun.edu> Stephen Walton wrote: > Despite the above, I think Ravi's patch should be incorporated into > Scipy. You know, I thought about this comment of mine overnight and decided it's silly. i1mach, r1mach, and d1mach are fundamental to many of the Netlib routines which Scipy incorporates. If gfortran won't compile them correctly, then you won't get the right answer and gfortran should not be used to build Scipy. Fedora Core 4 is feeling more and more like a beta to me. From Fernando.Perez at colorado.edu Wed Sep 14 12:45:22 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 14 Sep 2005 10:45:22 -0600 Subject: [SciPy-dev] Scipy vs. Fedora Core 4 In-Reply-To: <43285220.8080001@csun.edu> References: <43275518.1010405@csun.edu> <43285220.8080001@csun.edu> Message-ID: <432853A2.2090808@colorado.edu> Stephen Walton wrote: > Stephen Walton wrote: > > >>Despite the above, I think Ravi's patch should be incorporated into >>Scipy. > > > You know, I thought about this comment of mine overnight and decided > it's silly. i1mach, r1mach, and d1mach are fundamental to many of the > Netlib routines which Scipy incorporates. If gfortran won't compile > them correctly, then you won't get the right answer and gfortran should > not be used to build Scipy. 
> > Fedora Core 4 is feeling more and more like a beta to me. To be honest, after seeing their head-first jump into gcc4/gfortran, without looking if there was any water in the pool at all, I decided to stay behind a little. FC3 continues to be patched (yesterday they FINALLY fixed a 2-year old kernel bug which prevented reliable access to USB storage devices), and everything builds there. Making a python2.4-based environment takes a bit of extra work, but I'm mostly using 2.3 only so that's fine for me. It seems to me that gcc4/gfortran are not really ready for production use, if you want to spend your time doing work instead of chasing build/compile problems. In a very selfish way, thank you for going first and getting the arrows on your back ;) Cheers, f From ravi at ati.com Wed Sep 14 14:56:58 2005 From: ravi at ati.com (Ravikiran Rajagopal) Date: Wed, 14 Sep 2005 14:56:58 -0400 Subject: [SciPy-dev] Scipy vs. Fedora Core 4 In-Reply-To: <432853A2.2090808@colorado.edu> References: <43275518.1010405@csun.edu> <43285220.8080001@csun.edu> <432853A2.2090808@colorado.edu> Message-ID: <200509141456.58431.ravi@ati.com> Hi, On Wednesday 14 September 2005 12:45, Fernando Perez wrote: > Stephen Walton wrote: > >>Despite the above, I think Ravi's patch should be incorporated into > >>Scipy. > > [ ... ] If gfortran won't compile > > them correctly, then you won't get the right answer and gfortran should > > not be used to build Scipy. The application of the patches (with your suggested modifications along with a big notice saying that gfortran is not supported) will not change anything for the people who do not use gfortran. However, it would make life easier for those who do and might get us more testers who can make scipy usable with gfortran. At some point, gfortran will need to be supported, since g77 is unmaintained now. As noted, the bug fixes for linalg will need to be made at some point too. > In a very selfish way, thank you for going first and getting the arrows on > your back ;) Going first - but only because I need Python 2.4. Regards, Ravi From oliphant at ee.byu.edu Wed Sep 14 15:33:26 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 14 Sep 2005 13:33:26 -0600 Subject: [SciPy-dev] scipy.base (Numeric3) now on scipy.org svn server Message-ID: <43287B06.3000301@ee.byu.edu> Now that the new scipy.base (that can replace Numeric) is pretty much complete, I'm working on bringing the other parts of Numeric along so that scipy_core can replace Numeric (and numarray in functionality) for all users. I'm now using a branch of scipy_core to do this work. The old Numeric3 CVS directory on sourceforge will start to wither... The branch is at http://svn.scipy.org/svn/scipy_core/branches/newcore I'm thinking about how to structure the new scipy_core. Right now under the new scipy_core we have Hierarchy Imports as ==================== base/ --> scipy.base (namespace also available under scipy itself) distutils/ --> scipy.distutils test/ --> scipy.test weave/ --> weave We need to bring over basic linear algebra, statistics, and fft's from Numeric. So where do we put them and how do they import? Items to consider: * the basic functionality will be expanded / replaced by anybody who installs the entire scipy library. * are we going to get f2py to live in scipy_core (I say yes)... * I think scipy_core should install a working basic scipy (i.e. import scipy as Numeric) should work and be an effective replacement for import Numeric). 
Of course the functionality will be quite a bit less than if full scipy was installed, but basic functions should still work. With that in mind I propose the additions Hiearchy Imports as ========================== corelib/lapack_lite/ --> scipy.lapack_lite corelib/fftpack_lite/ --> scipy.fftpack_lite corelib/random_lite/ --> scipy.random_lite linalg/ --> scipy.linalg fftpack/ --> scipy.fftpack stats/ --> scipy.stats Users would typically use only the functions in scipy.linalg, scipy.fftpack, and scipy.stats. Notice that scipy also has modules names linalg, fftpack, and stats. These would add / replace functionality available in the basic core system. Comments, -Travis O. From pearu at scipy.org Wed Sep 14 14:51:14 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 14 Sep 2005 13:51:14 -0500 (CDT) Subject: [SciPy-dev] scipy.base (Numeric3) now on scipy.org svn server In-Reply-To: <43287B06.3000301@ee.byu.edu> References: <43287B06.3000301@ee.byu.edu> Message-ID: On Wed, 14 Sep 2005, Travis Oliphant wrote: > Now that the new scipy.base (that can replace Numeric) is pretty much > complete, I'm working on > bringing the other parts of Numeric along so that scipy_core can replace > Numeric (and numarray in functionality) for all users. > > I'm now using a branch of scipy_core to do this work. The old Numeric3 CVS > directory on sourceforge will start to wither... > > The branch is at > > http://svn.scipy.org/svn/scipy_core/branches/newcore > > I'm thinking about how to structure the new scipy_core. > Right now under the new scipy_core we have > > Hierarchy Imports as > ==================== > base/ --> scipy.base (namespace also available under > scipy itself) > distutils/ --> scipy.distutils > test/ --> scipy.test > weave/ --> weave > > We need to bring over basic linear algebra, statistics, and fft's from > Numeric. So where do we put them and how do they import? I have done some work in this direction but have not commited to repository yet because it needs more testing. Basically, (not commited) scipy.distutils has support to build Fortran or f2c'd versions of various libraries (currently I have tested it on blas) depending on whether Fortran compiler is available or not. > Items to consider: > > * the basic functionality will be expanded / replaced by anybody who > installs the entire scipy library. > > * are we going to get f2py to live in scipy_core (I say yes)... That would simplify many things, so I'd also say yes. On the other hand, I have not decided what to do with f2py2e CVS repository. Suggestions are welcome (though I understand that this might be my personal problem). > * I think scipy_core should install a working basic scipy (i.e. import scipy > as Numeric) should work and be an effective replacement for import Numeric). > Of course the functionality will be quite a bit less than if full scipy was > installed, but basic functions should still work. > > With that in mind I propose the additions > > Hiearchy Imports as > ========================== > corelib/lapack_lite/ --> scipy.lapack_lite corelib/fftpack_lite/ --> > scipy.fftpack_lite > corelib/random_lite/ --> scipy.random_lite > linalg/ --> scipy.linalg > fftpack/ --> scipy.fftpack > stats/ --> scipy.stats > > Users would typically use only the functions in scipy.linalg, scipy.fftpack, > and scipy.stats. > > Notice that scipy also has modules names linalg, fftpack, and stats. These > would add / replace functionality available in the basic core system. 
Since lapack_lite, fftpack_lite can be copied from Numeric then there's no rush for me to commit my scipy.distutils work, I guess. I'll do that when it is more or less stable and then we can gradually apply f2c to various scipy modules that currently have fortran sources which would allow compiling the whole scipy without having fortran compiler around. Pearu From oliphant at ee.byu.edu Wed Sep 14 17:31:55 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 14 Sep 2005 15:31:55 -0600 Subject: [SciPy-dev] scipy.base (Numeric3) now on scipy.org svn server In-Reply-To: References: <43287B06.3000301@ee.byu.edu> Message-ID: <432896CB.2090106@ee.byu.edu> Pearu Peterson wrote: >> >> We need to bring over basic linear algebra, statistics, and fft's >> from Numeric. So where do we put them and how do they import? > > > I have done some work in this direction but have not commited to > repository yet because it needs more testing. Basically, (not > commited) scipy.distutils has support to build Fortran or f2c'd > versions of various libraries (currently I have tested it on blas) > depending on whether Fortran compiler is available or not. Ahh... :-) This is a very, very good idea. But, it might take some ironing out. I think a copy of what is currently in Numeric is a good starting point. > > That would simplify many things, so I'd also say yes. On the other > hand, I have not decided what to do with f2py2e CVS repository. > Suggestions are welcome (though I understand that this might be my > personal problem). Do you need the history of the CVS repository? If so, Joe has worked out cvs2svn issues and may be able to help you convert over. Or, you could just import the latest copy into newcore/f2py and go from there. > > Since lapack_lite, fftpack_lite can be copied from Numeric then > there's no rush for me to commit my scipy.distutils work, I guess. > I'll do that when it is more or less stable and then we can gradually > apply f2c to various scipy modules that currently have fortran sources > which would allow compiling the whole scipy without having fortran > compiler around. I like the vision you've put forward. In the meantime, it would be good to have something that installs well. I'd really, really like to get a new download available by the scipy conference. So, if scipy.distutils is not ready for that, then, I'm going to move ahead with the more mundane approach I've outlined... I could use help on configuration to use blas for the new system when appropriate... Numeric has a manual system that isn't too bad, but I'm wondering if with scipy.disutils we can automate it somewhat... -Travis From stephen.walton at csun.edu Thu Sep 15 13:39:24 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Thu, 15 Sep 2005 10:39:24 -0700 Subject: [SciPy-dev] Scipy vs. Fedora Core 4 In-Reply-To: <200509141456.58431.ravi@ati.com> References: <43275518.1010405@csun.edu> <43285220.8080001@csun.edu> <432853A2.2090808@colorado.edu> <200509141456.58431.ravi@ati.com> Message-ID: <4329B1CC.2010103@csun.edu> Ravikiran Rajagopal wrote: >The application of the patches (with your suggested modifications along with a >big notice saying that gfortran is not supported) will not change anything >for the people who do not use gfortran. However, it would make life easier >for those who do and might get us more testers who can make scipy usable with >gfortran. 
> Fair enough, but the difficulty is that with the present patches gfortran is used in preference to g77 if both are present, and gfortran as I noted has a sufficiently serious bug that it cannot compile Scipy correctly at the present time. Steve From oliphant at ee.byu.edu Thu Sep 15 15:14:33 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 15 Sep 2005 13:14:33 -0600 Subject: [SciPy-dev] scipy core now builds from svn Message-ID: <4329C819.6070302@ee.byu.edu> This is to officially announce that the new replacement for Numeric (scipy_core) is available at SVN. Read permission is open to everyone so a simple checkout: svn co http://svn.scipy.org/svn/scipy_core/branches/newcore newcore should get you the distribution that should install with cd newcore python setup.py install I'm in the process of adding the linear algebra routines, fft, random, and dotblas from Numeric. This should be done by the conference. I will make a windows binary release for the SciPy conference, but not before then. There is a script in newcore/scipy/base/convertcode.py that will take code written for Numeric (or numerix) and convert it to code for the new scipy base object. This code is not foolproof, but it takes care of the minor incompatibilities (a few search and replaces are done). The compatibility issues are documented (mostly in the typecode characters and a few method name changes). The one bigger incompatibility is that a.flat does something a little different (a 1-d iterator object). The convert code script changes uses of a.flat that are not indexing or set attribute related to a.ravel() C-code should build for the new system with a change of #include Numeric/arrayobject.h to #include scipy/arrayobject.h --- though you may want to enhance your code to take advantage of the new features (and more extensive C-API). I also still need to add the following ufuncs: isnan, isfinite, signbit, isinf, frexp, and ldexp. This should not take too long. -Travis O. From ravi at ati.com Thu Sep 15 16:06:00 2005 From: ravi at ati.com (Ravikiran Rajagopal) Date: Thu, 15 Sep 2005 16:06:00 -0400 Subject: [SciPy-dev] Scipy vs. Fedora Core 4 In-Reply-To: <4329B1CC.2010103@csun.edu> References: <43275518.1010405@csun.edu> <200509141456.58431.ravi@ati.com> <4329B1CC.2010103@csun.edu> Message-ID: <200509151606.00699.ravi@ati.com> > Fair enough, but the difficulty is that with the present patches > gfortran is used in preference to g77 if both are present, and gfortran > as I noted has a sufficiently serious bug that it cannot compile Scipy > correctly at the present time. Good point. In that case, I propose that gfortran be tried iff an environment variable SCIPY_USE_GFORTRAN is present. I do not want to try it only after not finding g77. The reason is that anyone who has gfortran installed will almost certainly have g77 installed as well (simply as a backup given the current state of gfortran). In regards to the d1mach.f problem, I have a solution that I have been using successfully. I replaced d1mach.f, r1mach.f and i1mach.f with much cleaner C implementations; the example C implementation in i1mach.f needs to be modified to unistd.h. I am attaching them. Could someone make sure that this works on windows as well? Regards, Ravi -------------- next part -------------- A non-text attachment was scrubbed... Name: d1mach.c Type: text/x-csrc Size: 383 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: i1mach.c Type: text/x-csrc Size: 967 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: g.diff Type: text/x-diff Size: 1307 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: r1mach.c Type: text/x-csrc Size: 385 bytes Desc: not available URL: From Fernando.Perez at colorado.edu Thu Sep 15 17:19:05 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 15 Sep 2005 15:19:05 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] scipy core now builds from svn In-Reply-To: <4329C819.6070302@ee.byu.edu> References: <4329C819.6070302@ee.byu.edu> Message-ID: <4329E549.1060009@colorado.edu> Travis Oliphant wrote: > This is to officially announce that the new replacement for Numeric > (scipy_core) is available at SVN. Read permission is open to everyone > so a simple checkout: > > svn co http://svn.scipy.org/svn/scipy_core/branches/newcore newcore Great news, congratulations and many thanks! > There is a script in newcore/scipy/base/convertcode.py that will take > code written for Numeric (or numerix) and convert it to code for the new > scipy base object. This code is not foolproof, but it takes care of the > minor incompatibilities (a few search and replaces are done). The > compatibility issues are documented (mostly in the typecode characters > and a few method name changes). The one bigger incompatibility is that > a.flat does something a little different (a 1-d iterator object). The > convert code script changes uses of a.flat that are not indexing or set > attribute related to a.ravel() Quick question: is it still OK to pass a.flat to weave.inline'd code? I use this a lot, to do fast operations over arrays where I don't care about the shape but only about the elements. Or should one use .ravel() for such cases from now on? And if so, does the ravel call incur a significant performance hit compared to the old (fast but brittle, since .flat can fail) approach? Thanks for any insight. Cheers, f From stephen.walton at csun.edu Thu Sep 15 17:51:47 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Thu, 15 Sep 2005 14:51:47 -0700 Subject: [SciPy-dev] Scipy vs. Fedora Core 4 In-Reply-To: <200509151606.00699.ravi@ati.com> References: <43275518.1010405@csun.edu> <200509141456.58431.ravi@ati.com> <4329B1CC.2010103@csun.edu> <200509151606.00699.ravi@ati.com> Message-ID: <4329ECF3.9010400@csun.edu> Ravikiran Rajagopal wrote: >In that case, I propose that gfortran be tried iff an environment >variable SCIPY_USE_GFORTRAN is present. I do not want to try it only after >not finding g77. The reason is that anyone who has gfortran installed will >almost certainly have g77 installed as well (simply as a backup given the >current state of gfortran). > > We're trying to simplify installation, not make it more complicated. I'd rather Scipy compiled properly on the default platform. Anyway, I just finished another test. I compiled LAPACK, ATLAS, and Scipy with g77 on FC4. I built Scipy with the current Subversion checkout, without your patches, so it will use g77 and gcc. However, gcc on this platform is gcc4, which doesn't see libg2c. 
I created symbolic links to fix this: ln -s /usr/lib/gcc-lib/i386-redhat-linux/3.2.3/libg2c.a /usr/lib/gcc/i386-redhat-linux/4.0.1/libg2c.a ln -s /usr/lib/gcc-lib/i386-redhat-linux/3.2.3/libg2c.so /usr/lib/gcc/i386-redhat-linux/4.0.1/libg2c.so Scipy compiled and linked, but now we're back to the situation where scipy.test(level=1,verbosity=10) shows that it hangs nearly forever in scipy.special.basic.test_basic.test_assoc_laguerre. >In regards to the d1mach.f problem, I have a solution that I have been using >successfully. I replaced d1mach.f, r1mach.f and i1mach.f with much cleaner C >implementations; > I'm not sure the C versions are correct. For example, i1mach(1) returns 0, and i1mach(2) returns 1, which are the C conventions for stdin and stdout. Most Fortran compilers use 5 and 6, respectively. I'd rather stick with the Fortran versions, myself, as they are the Netlib standards. I'm afraid I've spent rather enough time on all this, as I have other work to get done, and since unlike Ravi I don't require Python 2.4 for anything I'm doing, I'm going to back out FC4 on my system and install FC3. Steve From Fernando.Perez at colorado.edu Fri Sep 16 11:45:00 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 16 Sep 2005 09:45:00 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] scipy core now builds from svn In-Reply-To: <432A5CF4.9050909@polyu.edu.hk> References: <4329C819.6070302@ee.byu.edu> <432A5CF4.9050909@polyu.edu.hk> Message-ID: <432AE87C.6030406@colorado.edu> LUK ShunTim wrote: >> I got this time out error when I tried, several times. :-( >> >> svn: REPORT request failed on '/svn/scipy_core/!svn/vcc/default' >> svn: REPORT of '/svn/scipy_core/!svn/vcc/default': Could not read status >> line: Connection timed out (http://svn.scipy.org) >> >> Please see if this is an server configuration issue. No, it's an issue with your setup, not something on scipy's side. You are behind a proxy blocking REPORT requests. See this for details: http://www.sipfoundry.org/tools/svn-tips.html which says: What does 'REPORT request failed' mean? When I try to check out a subversion repository > svn co http://scm.sipfoundry.org/rep/project/main project I get an error like: svn: REPORT request failed on '/rep/project/!svn/vcc/default' svn: REPORT of '/rep/project/!svn/vcc/default': 400 Bad Request (http://scm.sipfoundry.org) You are behind a web proxy that is not passing the WebDAV methods that subversion uses. You can work around the problem by using SSL to hide what you're doing from the proxy: > svn co https://scm.sipfoundry.org/rep/project/main project Cheers, f From Fernando.Perez at colorado.edu Fri Sep 16 11:48:54 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 16 Sep 2005 09:48:54 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] scipy core now builds from svn In-Reply-To: <432AE87C.6030406@colorado.edu> References: <4329C819.6070302@ee.byu.edu> <432A5CF4.9050909@polyu.edu.hk> <432AE87C.6030406@colorado.edu> Message-ID: <432AE966.90007@colorado.edu> Fernando Perez wrote: > You are behind a web proxy that is not passing the WebDAV methods that > subversion uses. You can work around the problem by using SSL to hide what > you're doing from the proxy: > > > svn co https://scm.sipfoundry.org/rep/project/main project I forgot to add that the https method will NOT work with scipy, which doesn't provide svn/ssl support. You need to fix your proxy config. 
Cheers, f From stefan at sun.ac.za Fri Sep 16 13:44:33 2005 From: stefan at sun.ac.za (Stefan van der Walt) Date: Fri, 16 Sep 2005 19:44:33 +0200 Subject: [SciPy-dev] Re: [Numpy-discussion] scipy core now builds from svn In-Reply-To: <432AE966.90007@colorado.edu> References: <4329C819.6070302@ee.byu.edu> <432A5CF4.9050909@polyu.edu.hk> <432AE87C.6030406@colorado.edu> <432AE966.90007@colorado.edu> Message-ID: <20050916174433.GA15411@rajah> On Fri, Sep 16, 2005 at 09:48:54AM -0600, Fernando Perez wrote: > Fernando Perez wrote: > > >You are behind a web proxy that is not passing the WebDAV methods that > >subversion uses. You can work around the problem by using SSL to hide what > >you're doing from the proxy: > > > > > svn co https://scm.sipfoundry.org/rep/project/main project > > I forgot to add that the https method will NOT work with scipy, which > doesn't provide svn/ssl support. You need to fix your proxy config. We had the same problem here. It was fixed (for Squid) by following these instructions (off the web somewhere): """ Next, you need to make sure the proxy server itself supports all the HTTP methods Subversion uses. Some proxy servers do not support these methods by default: PROPFIND, REPORT, MERGE, MKACTIVITY, CHECKOUT. In general, solving this depends on the particular proxy software. For Squid, the config option is # TAG: extension_methods # Squid only knows about standardized HTTP request methods. # You can add up to 20 additional "extension" methods here. # #Default: # none extension_methods REPORT MERGE MKACTIVITY CHECKOUT (Squid 2.4 and later already knows about PROPFIND.) """ Regards Stéfan From Fernando.Perez at colorado.edu Sat Sep 17 01:09:45 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Fri, 16 Sep 2005 23:09:45 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] scipy core now builds from svn In-Reply-To: <432BA2B2.7090307@polyu.edu.hk> References: <4329C819.6070302@ee.byu.edu> <432A5CF4.9050909@polyu.edu.hk> <432AE87C.6030406@colorado.edu> <432BA2B2.7090307@polyu.edu.hk> Message-ID: <432BA519.3040705@colorado.edu> LUK ShunTim wrote: > Fernando Perez wrote: > >>LUK ShunTim wrote: > Thanks very much. However no luck. :-( I now got this error > > >>svn: PROPFIND of '/svn/scipy_core/branches/newcore': 405 Method Not Allowed (https://svn.scipy.org) That's what I said in the message immediately afterward, because I hit send too soon: that the https:// approach would NOT work with scipy. You need to have your proxy fixed, I'm afraid. Cheers, f From stephen.walton at csun.edu Mon Sep 19 12:28:51 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Mon, 19 Sep 2005 09:28:51 -0700 Subject: [SciPy-dev] Absoft and current versions Message-ID: <432EE743.60302@csun.edu> I finally decided that building scipy with the Absoft compiler, which I have a perfectly good license for, is preferred to dropping back to FC3 on several machines already running FC4. I've put a patch up on the scipy.org Bug Tracker for absoftfcompiler.py from scipy_distutils for Absoft Version 9.0. After this fix, I can verify that "python fcompiler.py config_fc --verbose" in scipy_core/scipy_distutils lists the correct switches and lists absoft as an available compiler on my system. However, after python setup.py config_fc --fcompiler=absoft build in the main scipy directory (current svn checkout) I see from the output that scipy is still being built with g77. Where might this problem be? 
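[Aside, not part of the original thread: one thing worth ruling out when a --fcompiler choice seems to be ignored is a stray distutils configuration file, a possibility Robert Kern raises further down this thread before the problem gets narrowed to bdist_rpm. The snippet below is purely illustrative -- neither the file nor the values come from the thread -- and it assumes that config_fc, like most distutils commands, also reads its options from setup.cfg or ~/.pydistutils.cfg, so a leftover section like this could quietly steer compiler selection, including for setup.py runs that are spawned indirectly.

    # setup.cfg or ~/.pydistutils.cfg -- hypothetical contents
    [config_fc]
    fcompiler = gnu

Conversely, setting fcompiler = absoft in such a file might be one way to make the choice stick across indirect invocations; again, this is a guess about the behaviour of the 2005 build machinery, not advice taken from the thread.]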
From pearu at scipy.org Tue Sep 20 01:36:41 2005 From: pearu at scipy.org (Pearu Peterson) Date: Tue, 20 Sep 2005 00:36:41 -0500 (CDT) Subject: [SciPy-dev] Absoft and current versions In-Reply-To: <432EE743.60302@csun.edu> References: <432EE743.60302@csun.edu> Message-ID: On Mon, 19 Sep 2005, Stephen Walton wrote: > I finally decided that building scipy with the Absoft compiler, which I have > a perfectly good license for, is preferred to dropping back to FC3 on several > machines already running FC4. I've put a patch up on the scipy.org Bug > Tracker for absoftfcompiler.py from scipy_distutils for Absoft Version 9.0. > After this fix, I can verify that "python fcompiler.py config_fc --verbose" > in scipy_core/scipy_distutils lists the correct switches and lists absoft as > an available compiler on my system. > > However, after > > python setup.py config_fc --fcompiler=absoft build > > in the main scipy directory (current svn checkout) I see from the output that > scipy is still being built with g77. Where might this problem be? Is python setup.py config_fc --help-fcompiler --verbose still showing that absoft compiler is available? The question is whether setup.py uses correct scipy_distutils or not. Pearu From aisaac at american.edu Tue Sep 20 11:10:37 2005 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 20 Sep 2005 11:10:37 -0400 Subject: [SciPy-dev] are arrays hashable? Message-ID: A function that used to work fine with SciPy arrays now chokes on the new scipy.base, claiming that the new arrays are an unhashable type. For illustration only: >>> from scipy.base import * >>> d = arange(11) >>> set(d) Traceback (most recent call last): File "", line 1, in ? TypeError: unhashable type Cheers, Alan Isaac PS I'm using the first announced Windows binary. From rkern at ucsd.edu Tue Sep 20 11:15:52 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 20 Sep 2005 08:15:52 -0700 Subject: [SciPy-dev] are arrays hashable? In-Reply-To: References: Message-ID: <433027A8.3080701@ucsd.edu> Alan G Isaac wrote: > A function that used to work fine with SciPy arrays > now chokes on the new scipy.base, claiming that > the new arrays are an unhashable type. > > For illustration only: > >>>>from scipy.base import * >>>>d = arange(11) >>>>set(d) > > Traceback (most recent call last): > File "", line 1, in ? > TypeError: unhashable type Arrays were never hashable, nor should they be since they are mutable. The problem here is that set() isn't recognizing the array as an iterable. With a slightly old CVS checkout, I get In [10]: from scipy.base import * In [11]: d = array(10) In [12]: set(d) --------------------------------------------------------------------------- exceptions.TypeError Traceback (most recent call last) /Users/kern/ TypeError: iteration over non-sequence In [13]: list(d) --------------------------------------------------------------------------- exceptions.TypeError Traceback (most recent call last) /Users/kern/ TypeError: iteration over non-sequence -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From aisaac at american.edu Tue Sep 20 11:44:46 2005 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 20 Sep 2005 11:44:46 -0400 Subject: [SciPy-dev] are arrays hashable? 
In-Reply-To: <433027A8.3080701@ucsd.edu> References: <433027A8.3080701@ucsd.edu> Message-ID: On Tue, 20 Sep 2005, Robert Kern apparently wrote: > Alan G Isaac wrote: >> A function that used to work fine with SciPy arrays >> now chokes on the new scipy.base, claiming that >> the new arrays are an unhashable type. >> For illustration only: >>>>> from scipy.base import * >>>>> d = arange(11) >>>>> set(d) >> Traceback (most recent call last): >> File "", line 1, in ? >> TypeError: unhashable type > Arrays were never hashable, nor should they be since they are mutable. Right. Sorry. But see below. > The problem here is that set() isn't recognizing the array as an > iterable. With a slightly old CVS checkout, I get Actually iterability is recognized. I can get a list from an array. But the elements of the list are type long_arrtype, which are not hashable! Surely these should be hashable? All variants of arrtype are apparently not considered hashable types. In the meantime, how can I produce a list of hashable types from a ndarray? Thank you, Alan Isaac From oliphant at ee.byu.edu Tue Sep 20 15:50:05 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 20 Sep 2005 13:50:05 -0600 Subject: [SciPy-dev] are arrays hashable? In-Reply-To: References: Message-ID: <433067ED.3060407@ee.byu.edu> Alan G Isaac wrote: >A function that used to work fine with SciPy arrays >now chokes on the new scipy.base, claiming that >the new arrays are an unhashable type. > >For illustration only: > > >>>>from scipy.base import * >>>>d = arange(11) >>>>set(d) >>>> >>>> >Traceback (most recent call last): > File "", line 1, in ? >TypeError: unhashable type > >Cheers, >Alan Isaac > >PS I'm using the first announced Windows binary. > > Thanks for the feedback, Alan (and Robert). If this used to work, it should still work. Yes, the new array scalars should be hashable (because they are immutable). I've probably overlooked something. I'll look into it. -Travis From rkern at ucsd.edu Tue Sep 20 16:19:56 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 20 Sep 2005 13:19:56 -0700 Subject: [SciPy-dev] are arrays hashable? In-Reply-To: <433067ED.3060407@ee.byu.edu> References: <433067ED.3060407@ee.byu.edu> Message-ID: <43306EEC.4060901@ucsd.edu> Travis Oliphant wrote: > Yes, the new array scalars should be hashable (because they are > immutable). I've probably overlooked something. I'll look into it. You need to implement tp_hash functions for all of the scalar types. It looks like it shouldn't be too bad. I think that except for the extra-long types, you can just piggyback on the implementations for int(), float(), and complex() objects. For the unsigned integer types, just cast to a long. For the signed, cast to a long (and if the casted version == -1, return -2 since -1 signals an error). For the float types, I think it's going to be acceptable to cast to a double and use _Py_HashDouble(). I have not the slightest idea what void objects should do. Possibly just return the hash of the pointer itself. For the extra long types ("long long", "unsigned long long", "long double", "complex long double"), you have to check if they are within the range of the regular types ("long", "unsigned long", etc.) and be sure to return the same hash value. I'm not sure what the right answer is for the other values, though. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." 
-- Richard Harter From oliphant at ee.byu.edu Tue Sep 20 18:55:27 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 20 Sep 2005 16:55:27 -0600 Subject: [SciPy-dev] are arrays hashable? In-Reply-To: <43306EEC.4060901@ucsd.edu> References: <433067ED.3060407@ee.byu.edu> <43306EEC.4060901@ucsd.edu> Message-ID: <4330935F.2040900@ee.byu.edu> Robert Kern wrote: >Travis Oliphant wrote: > > > >>Yes, the new array scalars should be hashable (because they are >>immutable). I've probably overlooked something. I'll look into it. >> >> > >You need to implement tp_hash functions for all of the scalar types. It >looks like it shouldn't be too bad. I think that except for the >extra-long types, you can just piggyback on the implementations for >int(), float(), and complex() objects. For the unsigned integer types, >just cast to a long. For the signed, cast to a long (and if the casted >version == -1, return -2 since -1 signals an error). For the float >types, I think it's going to be acceptable to cast to a double and use >_Py_HashDouble(). I have not the slightest idea what void objects should >do. Possibly just return the hash of the pointer itself. > >For the extra long types ("long long", "unsigned long long", "long >double", "complex long double"), you have to check if they are within >the range of the regular types ("long", "unsigned long", etc.) and be >sure to return the same hash value. I'm not sure what the right answer >is for the other values, though. > > > I'll do this, it shouldn't be too hard. In fact, I just now, added code to inherit the hash function from the Python builtins for those array scalars that can (int, intc-depending on platform, float, complex, string, and unicode): In particular, it solves the problem Alan had. I'll look into the others... -Travis From stephen.walton at csun.edu Wed Sep 21 11:44:38 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 21 Sep 2005 08:44:38 -0700 Subject: [SciPy-dev] Absoft and current versions In-Reply-To: References: <432EE743.60302@csun.edu> Message-ID: <43317FE6.6090502@csun.edu> Hello, Pearu, > Is > > python setup.py config_fc --help-fcompiler --verbose > > still showing that absoft compiler is available? The question is > whether setup.py uses correct scipy_distutils or not. Yes. It occurred to me to try the Intel Fortran compiler, which I had installed a copy of long ago and forgotten about. With the appropriate PATH changes to find the ifort command, the above lists all three of g77, Absoft, and Intel being available on my system. But scipy always builds with g77. Is there an easy way I can get a trace of the list of steps Scipy uses to determine which Fortran compiler to use? I'd think that if I request a compiler which isn't available for some reason it would throw an exception rather than using g77 anyway. Steve From pearu at scipy.org Wed Sep 21 11:01:05 2005 From: pearu at scipy.org (Pearu Peterson) Date: Wed, 21 Sep 2005 10:01:05 -0500 (CDT) Subject: [SciPy-dev] Absoft and current versions In-Reply-To: <43317FE6.6090502@csun.edu> References: <432EE743.60302@csun.edu> <43317FE6.6090502@csun.edu> Message-ID: On Wed, 21 Sep 2005, Stephen Walton wrote: > Hello, Pearu, > >> Is >> >> python setup.py config_fc --help-fcompiler --verbose >> >> still showing that absoft compiler is available? The question is whether >> setup.py uses correct scipy_distutils or not. > > Yes. It occurred to me to try the Intel Fortran compiler, which I had > installed a copy of long ago and forgotten about. 
With the appropriate PATH > changes to find the ifort command, the above lists all three of g77, Absoft, > and Intel being available on my system. But scipy always builds with g77. > Is there an easy way I can get a trace of the list of steps Scipy uses to > determine which Fortran compiler to use? I'd think that if I request a > compiler which isn't available for some reason it would throw an exception > rather than using g77 anyway. This is strange, here python setup.py build # uses gnu compiler python setup.py config_fc --fcompiler=g95 build # uses g95 compiler python setup.py config_fc --fcompiler=intel build # fails with 'ifc: command not found' because I don't have intel compiler installed at the moment python setup.py config_fc --fcompiler=blah build # fails with error: don't know how to compile Fortran code on platform 'posix' with 'blah' compiler. Supported compilers are: compaq,absoft,intel,gnu,sun,f,vast,ibm,lahey,intelv,g95,intele,pg,compaqv,mips,hpux,intelev,nag) So, in my system everything behaves normally. Could you send me the full output (both stdout and stderr) from your build command? Pearu From rkern at ucsd.edu Wed Sep 21 12:17:15 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 21 Sep 2005 09:17:15 -0700 Subject: [SciPy-dev] Absoft and current versions In-Reply-To: <43317FE6.6090502@csun.edu> References: <432EE743.60302@csun.edu> <43317FE6.6090502@csun.edu> Message-ID: <4331878B.3030807@ucsd.edu> Stephen Walton wrote: > Hello, Pearu, > >> Is >> >> python setup.py config_fc --help-fcompiler --verbose >> >> still showing that absoft compiler is available? The question is >> whether setup.py uses correct scipy_distutils or not. > > Yes. It occurred to me to try the Intel Fortran compiler, which I had > installed a copy of long ago and forgotten about. With the appropriate > PATH changes to find the ifort command, the above lists all three of > g77, Absoft, and Intel being available on my system. But scipy always > builds with g77. Is there an easy way I can get a trace of the list of > steps Scipy uses to determine which Fortran compiler to use? I'd think > that if I request a compiler which isn't available for some reason it > would throw an exception rather than using g77 anyway. Are you sure you don't have something set in your ~/.pydistutils.cfg ? -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From stephen.walton at csun.edu Wed Sep 21 12:57:26 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Wed, 21 Sep 2005 09:57:26 -0700 Subject: [SciPy-dev] Absoft and current versions In-Reply-To: References: <432EE743.60302@csun.edu> <43317FE6.6090502@csun.edu> Message-ID: <433190F6.2030504@csun.edu> Pearu Peterson wrote: > python setup.py config_fc --fcompiler=g95 build # uses g95 compiler > > python setup.py config_fc --fcompiler=intel build # fails with 'ifc: > command not found' because I don't have intel compiler installed at > the moment > > > So, in my system everything behaves normally. Could you send me the > full output (both stdout and stderr) from your build command? Ah HA! I apologize for incomplete information. It turns out that python setup.py config_fc --fcompiler=intel build uses the Intel compiler here too, but replace "build" with "bdist_rpm" and it always uses g77. Steve From travis at enthought.com Thu Sep 22 20:36:59 2005 From: travis at enthought.com (Travis N. 
Vaught) Date: Thu, 22 Sep 2005 19:36:59 -0500 Subject: [SciPy-dev] io errors Message-ID: <43334E2B.5000904@enthought.com> Greetings, I built scipy from svn source (first a bdist_wininst, then running the created setup_...exe) for python 2.4. Tests all seem to pass except the io package tests...reduced a bit, the following session causes a seg-fault in Windows XP: >>> from scipy.io import numpyio >>> import os, sys >>> import Numeric >>> import tempfile >>> from scipy import rand >>> a = 255*rand(20) >>> fname = tempfile.mktemp('.dat') >>> fid = open(fname,"wb") >>> numpyio.fwrite(fid, 20, a, Numeric.Int16) <> Environment: Python 2.4.1 (#65, Mar 30 2005, 09:13:57) >>> scipy.__version__ '0.3.3_303.4573' >>> Numeric.__version__ '23.8' >>> f2py2e.__version__.version '2.45.241_1926' Travis From Fernando.Perez at colorado.edu Fri Sep 23 01:21:19 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 22 Sep 2005 23:21:19 -0600 Subject: [SciPy-dev] layering discussion Message-ID: <433390CF.4020409@colorado.edu> sent from scipy bof... Level 0: A multidimensional array object (possibly in the core Python language) ------------------------------------------------------------------------------- These arrays should support at least arithmetic operations, and hopefully have a well-defined ufunc API. It would be even better if the math library functions could handle them either transparently or with a mirror 'arraymath/amath' library of equivalent array functions. In terms of additional functionality, only random arrays should be supported, perhaps via the random module in the stdlib. More sophisticated things like FFTs, linear algebra, Lapack access, etc., do not belong in the core. I'm torn about whether a Matrix object (for which A*B does true matrix multiplication and not an elementwise one) should go there as well. It feels a bit much, but it would round out the basic datatype support for numerical work really nicely. The library writers can then add the _functionality_ that would build upon these basic types. [ Note added by Pearu Peterson: Note that Matrix could use optimized blas/lapack libraries. So, either I wouldn't put Matrix into the level 0 layer or there should be a well-defined interface for enhancing matrix operations (matrix*vector, matrix*matrix, inverse(matrix), etc). Maybe level 1 is a better place for Matrix where linear algebra routines are available. ] Level 1 - Generic scientific computing functionality ----------------------------------------------------- The rest of what today goes into numerix, perhaps merged into a scipy_base package and made very easy to install, with all dependencies shipped and binary installers provided regularly for all platforms. This would provide the minimal set of functionality for common 'light' scientific use, and it should include plotting. It would be something at the level of a basic matlab or octave, without any extension toolboxes. Matplotlib seems like a good candidate for integration at this level. Yes, choice is good. But having to study the pros and cons of 40 plotting libraries before you can plot a line is a sure way to send new potential users running for the hills. The python community is having the same problem with the whole 'web frameworks' topic, and sometimes these issues are best addressed simply by making a solid choice, and living with it. Individual users can still use whatever personal favorite they want, but people can assume _something_ to look at data will work out of the (level 1) box. 
No compilers should be required for end-users at this point (obviously packagers will need compilers, including Fortran ones for lapack/blas wrappers). This level would include FFTs, basic linear algebra (lapack/blas wrapping), and perhaps numerical integration. This level will also include any additional packages which satisfy these constraints: a - Are 100% guaranteed, proven, to be easy and risk-free installations, with the possibility to ensure up-to-date distribution of binaries for *nix, OSX and win32. b - Do not require a compiler in the target system to be _used_. c - Have a reasonably wide potential audience (tools of interest only to super-specialized communities belong in level 3). Once this Level is reasonably well defined, a tutorial written specifically to target this would be extremely useful. It could be based on the current scipy/numerix/pylab documentation. Level 2 - Infrastructure and extension support level ---------------------------------------------------- A set of tools to support a more complex package infrastructure and extensions in other languages. Note that this is mostly a 'glue' layer, without much scientific functionality proper (except that the external code tools do end up being useful to scientists who do significant code development). Something like scipy_distutils, weave, f2py, and whatever else is needed for easy packaging, installation and distribution of the more sophisticated functionality. At this point, even end users (not only packagers) will be expected to have Fortran/C/C++ compilers available. This level means both a set of tools AND of guidelines for where things go, how to express dependencies, etc. Everything below this will assume this level to be satisfied, AND will conform to these guidelines. This means that some naming and packaging conventions will need to be clearly spelled out and documented. Level 3 - High-level, specialized and third-party packages ---------------------------------------------------------- These packages are installable independently of one another (though some may depend on others, case in which this dependency should be clearly stated and hopefully an automatic dependency mechanism will handle it). These could be considered 'toolboxes', and will range from fairly common things to domain-specific tools only of interest to specialists. Much of what today is scipy would live on level 3, and a scipy-full could still be distributed for most users. This scipy-full would encompass a lot of functionality, since as long as it comes packaged together, all it does is take space on disk for people. Much like what Mathematica is today: a huge library of mathematical functions, the bulk of which most users never need, but which is great to have at your fingertips just in case. But the specs laid out in level 2 (and the functionality level 2 provides) would also enable third-party projects to distribute their own level3 tools, even outside of the scipy project management and release cycles. In this manner, very specialized tools (only of interest to small communities, but potentially very important for them) could be distributed and live in harmony with the rest of the system, taking advantage of the common facilities provided by levels 0-2. At this level, pretty much anything goes. Difficult installations, platform-specific packages, compiler-dependent code, ... The scipy project may not distribute such nasties, but third parties are free to do as they wish. 
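To make that concrete, a hypothetical level-3 toolbox distributed outside the scipy project might carry a setup.py along these lines. The package name and paths are invented, and the include-directory value is only a placeholder standing in for whatever convention the level-2 tools end up providing.

from distutils.core import setup, Extension

# 'mytoolbox' is a made-up example package; a real toolbox would lean on the
# level-2 machinery (scipy.distutils, f2py, weave) instead of hard-coding
# compiler and include details here.
fastcore = Extension('mytoolbox._fastcore',
                     sources=['mytoolbox/_fastcoremodule.c'],
                     include_dirs=['include'])   # placeholder for the scipy headers

setup(name='mytoolbox',
      version='0.1',
      description='A domain-specific toolbox built on the level 0-2 stack',
      packages=['mytoolbox'],
      ext_modules=[fastcore])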
But we do hope that third-party packages will rely on the level 0-2 functionality and conventions, to make life easier for the community at large. It should be noted that scipy developers have started to move the wrappers to various libraries such as blas, lapack, fftpack, quadpack, etc. into a "lib" package that should be used by higher level (>=3) packages as basic computational tools. So all packages at this level can use scipy.lib.foo* as engines for their internal needs. Other notes by Pearu -------------------- The main difficulty in installing scipy is providing the optimized blas/lapack libraries that scipy.linalg can _optionally_ use. As a rule, when ATLAS or blas/lapack libraries are properly installed, the scipy build scripts have no problems using them. However, some systems provide broken or incomplete blas/lapack libraries, and when scipy uses them all sorts of bad things can happen. This is imho the main reason people find it difficult to install scipy: in such cases they end up rebuilding the ATLAS/lapack libraries themselves, which can take some experience to get right on the first attempt. So, imho there are no "difficult to install" packages in this layer, since linalg, the package most sensitive to broken systems, is already in level 1. Originally scipy also used linear levels to organize its tools, but that didn't work out for long. The ordering of mathematical concepts and of the corresponding implementation details often do not match, so such a simplified organization by linear levels may not be practical. A trivial example is where Matrix should go: mathematically it belongs at a low level, while its implementation (of useful methods) may require higher level tools. So, maybe we should first recognize which (scipy, numerix, etc) packages can be implemented independently of the others, work out the complete tree of dependencies implied by implementation details, and then see what a practical structure for the "scientific computing functionality for python" concept would be. From Fernando.Perez at colorado.edu Fri Sep 23 01:21:41 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 22 Sep 2005 23:21:41 -0600 Subject: [SciPy-dev] packaging bof notes Message-ID: <433390E5.1000702@colorado.edu> Notes ===== Thu Sep 22 21:57:50 MDT 2005 M. Knepley's build tools ------------------------- - no support for versioning. Only the latest version of anything is supported; they don't even attempt to support multiple versions, except for a few things like BLAS. - 'Version numbers are just names'. No significant semantic content in there. - Tests are done, as much as possible, for behavior. - setup.py knows nothing about dependencies; this knowledge has to be given to it. - Tools: a set of python files included with the PETSc distribution. - Interface is simple: naming convention (configure). - scipy.distutils: Michel Sanner's group ported their tools to scipy, and it worked for them. - Could the PETSc config system be brought into scipy.distutils? - MK's buildsys can ensure that libraries (atlas, fftw, etc) are actually there, and will download/build them if needed. M Sanner's questions -------------------- - These issues are not scipy-specific. Can these tools benefit other projects without making scipy a dependency for them? - Versioning? Python eggs ----------- - single zip files w/python code and metadata (version, dependencies, scripts to be unpacked, extension modules, ...) An egg can be a complete package, or it can be a 'namespace package'.
These are logical containers, and the actual packages are packed in their own eggs. - Package resources: library which can ask for specific versions, etc. Two issues: installation and verification of correctness. We can't prevent third-party libraries from making semantic changes across versions, explicitly specified or not. - support for subpackage building. Download all the source, but allow users to build only what they need. From oliphant at ee.byu.edu Sun Sep 25 00:51:27 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Sat, 24 Sep 2005 22:51:27 -0600 Subject: [SciPy-dev] Release of scipy core beta will happen next week. Message-ID: <43362CCF.5020100@ee.byu.edu> At the SciPy 2005 conference I announced that I was hoping to get a beta of the new scipy (core) (aka Numeric3 aka Numeric Next Generation) released by the end of the conference. This did not happen. Some last minute features were suggested by Fernando Perez that I think will be relatively easy to add and make the release that much stronger. Look for the beta announcement next week. For the impatient, the svn server is always available: http://svn.scipy.org/svn/scipy_core/branches/newcore -Travis O. From oliphant at ee.byu.edu Mon Sep 26 11:45:03 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 26 Sep 2005 09:45:03 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipy core beta will happen next week. In-Reply-To: <4337640B.2030405@ucsd.edu> References: <43362CCF.5020100@ee.byu.edu> <20050925191415.3a686369.gerard.vermeulen@grenoble.cnrs.fr> <4337640B.2030405@ucsd.edu> Message-ID: <4338177F.6090301@ee.byu.edu> Robert Kern wrote: >Gerard Vermeulen wrote: > > >>On Sat, 24 Sep 2005 22:51:27 -0600 >>Travis Oliphant wrote: >> >> >> >>>At the SciPy 2005 conference I announced that I was hoping to get a beta >>>of the new scipy (core) (aka Numeric3 aka Numeric Next Generation) >>>released by the end of the conference. >>> >>>This did not happen. Some last minute features were suggested by >>>Fernando Perez that I think will be relatively easy to add and make the >>>release that much stronger. >>> >>>Look for the beta announcement next week. >>> >>>For the impatient, the svn server is always available: >>> >>>http://svn.scipy.org/svn/scipy_core/branches/newcore >>> >>> >> >>Hi Travis, >> >>when I tried a few months ago to compile one of my C++ Python modules with >>Numeric3, g++-3.4.3 choked on the line >> >>typedef unsigned char bool; >> >>in arrayobject.h, because bool is a predefined type in C++. >> >>I see the offending line is still in SVN (did not try to build it though). >> >> > >Will this do the trick? > >#ifndef __cplusplus >typedef unsigned char bool; >#define false 0 >#define true 1 >#endif /* __cplusplus */ > > > >>Sorry for sitting on the bug so long; the main reason is that at the time >>(I suppose it is still the case) Numeric3 does not coexist well with >>Numeric in the same Python interpreter (I remember import conflicts). >>If a typical Linux user wants to play with Numeric3, he has either to remove >>Numeric (and break possible dependencies) or build his own Python for Numeric3. >>I think that most Linux users are not going to do this and that it will take more >>than a year before distros make the move. Hence, my lack of motivation for >>reporting bugs or giving it a real try. >> >> > >scipy_core does not interfere with Numeric anymore. It's installed as >scipy (so it *will* interfere with previous versions of scipy). 
> >While we're on the subject of bugs (for reference, I'm on OS X 10.4 with >Python 2.4.1): > >* When linking umath.so, distutils is including a bare "-l" that causes >the link to fail (gcc can't interpret the argument). I have no idea >where it's coming from. Just after the Extension object for umath.so is >created, the libraries attribute is empty, just like the other Extension >objects. > > I've seen this too. It looks like scipy.distutils is detecting a "blank" mathlibs for OS X. >* When linking against Accelerate.framework, it can't find cblas.h . I >have a patch in scipy.distutils.system_info for that (and also to remove >the -framework arguments for compiling; they're solely linker flags). > >* setup.py looks for an optimized BLAS through blas_info, but getting >lapack_info is commented out. Is this deliberate? > > It was commented out to test the build of the included lapack_lite. It remained commented out while I was trying to build on windows because I would get segfaults when trying to link against ATLAS on that platform (I didn't track down the problem). >* Despite comment lines claiming the contrary, scipy.linalg hasn't been >converted yet. basic_lite.py tries to import lapack and flapack. > > I've decided against improving the default interfaces for now. I'm just going to give the old Numeric interfaces in scipy_core. So, this will change. You caught it mid-changes. >* scipy.base.limits is missing (and is used in basic_lite.py). > > Should be fine now (again in-the-middle-of changes). >Feature request: > >* Bundle the include directory in with the package itself and provide a >function somewhere that returns the appropriate directory. People >writing setup.py scripts for extension modules that use scipy_core can >then do the following: > > from scipy.distutils import get_scipy_include > ... > someext = Extension('someext', include_dirs=[get_scipy_include(), > ...], ...) > > Sounds like a good idea. Not exactly sure how to do this. Anyone know?? -Travis From rkern at ucsd.edu Mon Sep 26 20:48:15 2005 From: rkern at ucsd.edu (Robert Kern) Date: Mon, 26 Sep 2005 17:48:15 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipy core beta will happen next week. In-Reply-To: <4338177F.6090301@ee.byu.edu> References: <43362CCF.5020100@ee.byu.edu> <20050925191415.3a686369.gerard.vermeulen@grenoble.cnrs.fr> <4337640B.2030405@ucsd.edu> <4338177F.6090301@ee.byu.edu> Message-ID: <433896CF.3090000@ucsd.edu> Travis Oliphant wrote: > Robert Kern wrote: >> * When linking umath.so, distutils is including a bare "-l" that causes >> the link to fail (gcc can't interpret the argument). I have no idea >> where it's coming from. Just after the Extension object for umath.so is >> created, the libraries attribute is empty, just like the other Extension >> objects. >> > I've seen this too. It looks like scipy.distutils is detecting a > "blank" mathlibs for OS X. Got it. Fixed it. >> Feature request: >> >> * Bundle the include directory in with the package itself and provide a >> function somewhere that returns the appropriate directory. People >> writing setup.py scripts for extension modules that use scipy_core can >> then do the following: >> >> from scipy.distutils import get_scipy_include >> ... >> someext = Extension('someext', include_dirs=[get_scipy_include(), >> ...], ...) > > Sounds like a good idea. Not exactly sure how to do this. Anyone know?? arrayobject.h and ufuncobject.h are easy; just include them as data. 
For the generated files, you only have to make some small modifications to the generate_*() functions; instead of dropping them into the build_dir, put them in the same directory as arrayobject.h and ufuncobject.h . I have a patch. Unfortunately, it has a side-effect of including the generated files in the sdist. We should be able to work around that by making a MANIFEST.in . With the caveat that this approach needs some more testing, I propose that we stop installing headers with the install_header command and the get_scipy_include() approach be the One Obvious Way To Do It. Comments? I also have mtrand ready to go in. It isn't entirely plug-compatible with RandomArray or RNG, but I think that it's clearly superior and should be the *single* default PRNG in scipy_core (I emphasize *single* because scipy.stats.ranf() is using one generator and the other distributions are using another. This is not good.). I have it as scipy.lib.mtrand at the moment. I haven't done anything to integrate it with scipy.stats, yet. Should I check it in along with the other fixes so people can play with it? Actually, I should test whether or not I *have* checkin privileges... -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From charles.harris at sdl.usu.edu Tue Sep 27 13:12:25 2005 From: charles.harris at sdl.usu.edu (Charles Harris) Date: Tue, 27 Sep 2005 11:12:25 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipy core betawill happen next week. Message-ID: >I also have mtrand ready to go in. It isn't entirely plug-compatible >with RandomArray or RNG, but I think that it's clearly superior and >should be the *single* default PRNG in scipy_core (I emphasize *single* >because scipy.stats.ranf() is using one generator and the other >distributions are using another. This is not good.). I assume mtrand is the Mersenne Twister. The Python module Random also uses the MT generator now. Did you look into that implementation by any chance? Chuck -------------- next part -------------- A non-text attachment was scrubbed... Name: winmail.dat Type: application/ms-tnef Size: 2800 bytes Desc: not available URL: From charles.harris at sdl.usu.edu Tue Sep 27 13:33:04 2005 From: charles.harris at sdl.usu.edu (Charles Harris) Date: Tue, 27 Sep 2005 11:33:04 -0600 Subject: [SciPy-dev] Release of scipy core beta will happen next week. Message-ID: Thought I should include a link to the random module documentation. http://docs.python.org/lib/module-random.html I've also implemented the ziggurat method -- with some small improvements -- for the generation of gaussian distributions if you are interested. Also, are the generated uniform doubles "thin" or full precision? For my own use, I have made the default doubles thin and on the open interval (0,1) in order to get the best speed and to make taking the log and other such functions that blow up at 0 and 1 not require special attention. Chuck d -------------- next part -------------- A non-text attachment was scrubbed... Name: winmail.dat Type: application/ms-tnef Size: 2766 bytes Desc: not available URL: From rkern at ucsd.edu Tue Sep 27 15:54:49 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 27 Sep 2005 12:54:49 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipy core betawill happen next week. In-Reply-To: References: Message-ID: <4339A389.6050608@ucsd.edu> Charles Harris wrote: >>I also have mtrand ready to go in. 
It isn't entirely plug-compatible >>with RandomArray or RNG, but I think that it's clearly superior and >>should be the *single* default PRNG in scipy_core (I emphasize *single* >>because scipy.stats.ranf() is using one generator and the other >>distributions are using another. This is not good.). > > I assume mtrand is the Mersenne Twister. The Python module Random also > uses the MT generator now. Did you look into that implementation by any chance? I'm using RandomKit by Jean-Sebastian Roy. Ultimately, the core algorithm in both implementations are cut-and-pasted from the reference implementation. I picked RandomKit because I could drop it in directly with minimal change. It passes around state in a plain C struct rather than a PyObject*, which is good, because I needed to extend that struct. RandomKit also has a nice feature that the stdlib implementation doesn't: if /dev/urandom is available (or the equivalent Windows API), RandomKit will use that to seed itself instead of the time if it wasn't given an explicit seed to begin with. When you have 624 words of state, it's good to use them all. http://www.jeannot.org/~js/code/index.en.html -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rkern at ucsd.edu Tue Sep 27 16:08:35 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 27 Sep 2005 13:08:35 -0700 Subject: [SciPy-dev] Release of scipy core beta will happen next week. In-Reply-To: References: Message-ID: <4339A6C3.40304@ucsd.edu> Charles Harris wrote: > Thought I should include a link to the random module documentation. > > http://docs.python.org/lib/module-random.html > > I've also implemented the ziggurat method -- with some small improvements -- > for the generation of gaussian distributions if you are interested. I think I'm sticking with Box-Mueller for gaussian variates. What it may lack in speed it makes up for in readability of code. But I would be interested in seeing your implementation. I might as well throw it in and actually see if the performance outweighs the code size. > Also, > are the generated uniform doubles "thin" or full precision? For my own use, I > have made the default doubles thin and on the open interval (0,1) in order to > get the best speed and to make taking the log and other such functions that blow > up at 0 and 1 not require special attention. It uses the full 53 bits of double precision on [0, 1) by default. I can add auxiliary methods that do different things. Heck, with a little work, I could probably make that selectable at runtime for all of the distributions. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From charles.harris at sdl.usu.edu Tue Sep 27 19:09:51 2005 From: charles.harris at sdl.usu.edu (Charles Harris) Date: Tue, 27 Sep 2005 17:09:51 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipy core betawillhappen next week. Message-ID: Robert Kern wrote >with minimal change. It passes around state in a plain C struct rather >than a PyObject*, which is good, because I needed to extend that struct. This is potentially a problem if the user has, say, a uniform double and a random integer going at the same time. There should be no correlation between the two, so they need to be using the same base generator, or two different generators with genuinely random seeds. 
I kept one base generator per module to deal with this, but I guess there might be a better way to keep things straight. Chuck _______________________________________________ -------------- next part -------------- A non-text attachment was scrubbed... Name: winmail.dat Type: application/ms-tnef Size: 2871 bytes Desc: not available URL: From charles.harris at sdl.usu.edu Tue Sep 27 19:54:37 2005 From: charles.harris at sdl.usu.edu (Charles Harris) Date: Tue, 27 Sep 2005 17:54:37 -0600 Subject: [SciPy-dev] Release of scipy core beta will happen next week. Message-ID: -----Original Message----- From: scipy-dev-bounces at scipy.net on behalf of Robert Kern Sent: Tue 9/27/2005 2:08 PM >Charles Harris wrote: >> Thought I should include a link to the random module documentation. >> >> http://docs.python.org/lib/module-random.html >> >> I've also implemented the ziggurat method -- with some small improvements -- >> for the generation of gaussian distributions if you are interested. > I think I'm sticking with Box-Mueller for gaussian variates. Fair enough. If we ever optimize for speed, there are some quirks to bear in mind for Intel (and AMD) architectures. For instance, longs convert to doubles faster than unsigned longs: http://www.doornik.com/research/randomdouble.pdf I've attached two parts of the Gaussian Ziggurat method. Note that I have used an exponential tail on the ziggurat instead of the usual Gaussian tail as the algorithm is a bit simpler IMHO. Anyway, the first attachment is a python program to compute the required constants a and r, the second a c++ class for the computation of the variates themselves. These aren't thoroughly tested, so there is the possibility of brown bags in my future. _______________________________________________ Scipy-dev mailing list Scipy-dev at scipy.net http://www.scipy.net/mailman/listinfo/scipy-dev -------------- next part -------------- A non-text attachment was scrubbed... Name: winmail.dat Type: application/ms-tnef Size: 5949 bytes Desc: not available URL: From rkern at ucsd.edu Tue Sep 27 20:11:06 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 27 Sep 2005 17:11:06 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipy core betawillhappen next week. In-Reply-To: References: Message-ID: <4339DF9A.9040909@ucsd.edu> Charles Harris wrote: > Robert Kern wrote > >>with minimal change. It passes around state in a plain C struct rather >>than a PyObject*, which is good, because I needed to extend that struct. > > This is potentially a problem if the user has, say, a uniform double and a > random integer going at the same time. There should be no correlation between > the two, so they need to be using the same base generator, or two different > generators with genuinely random seeds. I kept one base generator per module > to deal with this, but I guess there might be a better way to keep things straight. Actually, it's the *solution* to that problem. Unlike the older RANLIB-based modules, random.py and mtrand define generator objects that keep their state tied to the object. You can instantiate them as many times as you like with different states (another benefit of the /dev/urandom seeding over seeding with the time; there's no real chance that you'll seed two generators with the same state no matter how close in time you instantiate them). There are no globals. Give me a little bit of time to check it in and you can see for yourself. I do instantiate one generator in the module and make aliases to its methods for convenience. 
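A small sketch of what that object-based interface looks like in use; the module path and signatures below are taken from the mtrand code being discussed in this thread (it lives at scipy.lib.mtrand for now) and may well change before the release, so treat them as illustrative rather than final.

from scipy.lib.mtrand import RandomState

rs1 = RandomState()         # seeds itself from /dev/urandom (or the clock)
rs2 = RandomState()         # a second, completely independent generator
rs3 = RandomState(12345)    # explicit seed for a repeatable test sequence

gaussians = rs1.standard_normal(size=1000)
integers = rs2.randint(0, 100, size=1000)   # no shared global state to collide on

state = rs3.get_state()     # capture the full generator state...
rs3.set_state(state)        # ...and restore it later to replay the stream

The module-level convenience functions are then just the bound methods of one default RandomState instance.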
People who just need a random number here and there will probably use those. People who have need more control can instantiate their own generator. I do have one question on this point, though: Should this default generator be initialized with the same state (like scipy.lib.rng) or be initialized via /dev/urandom or time (like scipy.lib.ranlib and random.py)? I have 4992 hex digits of pi that are begging to be used. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Tue Sep 27 20:23:44 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 27 Sep 2005 18:23:44 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipy core betawillhappen next week. In-Reply-To: <4339DF9A.9040909@ucsd.edu> References: <4339DF9A.9040909@ucsd.edu> Message-ID: <4339E290.20603@ee.byu.edu> Robert Kern wrote: >Charles Harris wrote: > > >>Robert Kern wrote >> >> >> >>>with minimal change. It passes around state in a plain C struct rather >>>than a PyObject*, which is good, because I needed to extend that struct. >>> >>> >>This is potentially a problem if the user has, say, a uniform double and a >>random integer going at the same time. There should be no correlation between >>the two, so they need to be using the same base generator, or two different >>generators with genuinely random seeds. I kept one base generator per module >>to deal with this, but I guess there might be a better way to keep things straight. >> >> > >Actually, it's the *solution* to that problem. Unlike the older >RANLIB-based modules, random.py and mtrand define generator objects that >keep their state tied to the object. You can instantiate them as many >times as you like with different states (another benefit of the >/dev/urandom seeding over seeding with the time; there's no real chance >that you'll seed two generators with the same state no matter how close >in time you instantiate them). There are no globals. Give me a little >bit of time to check it in and you can see for yourself. > >I do instantiate one generator in the module and make aliases to its >methods for convenience. People who just need a random number here and >there will probably use those. People who have need more control can >instantiate their own generator. > >I do have one question on this point, though: Should this default >generator be initialized with the same state (like scipy.lib.rng) or be >initialized via /dev/urandom or time (like scipy.lib.ranlib and >random.py)? I have 4992 hex digits of pi that are begging to be used. > > > I think it should be initialized via /dev/urandom or time. Thanks for your help. Should I delay release of the beta until you get your code contributed? I'm ready to go with most else. -Travis From rkern at ucsd.edu Tue Sep 27 20:53:03 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 27 Sep 2005 17:53:03 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipy core betawillhappen next week. In-Reply-To: <4339E290.20603@ee.byu.edu> References: <4339DF9A.9040909@ucsd.edu> <4339E290.20603@ee.byu.edu> Message-ID: <4339E96F.1000200@ucsd.edu> Travis Oliphant wrote: > Robert Kern wrote: >> I do have one question on this point, though: Should this default >> generator be initialized with the same state (like scipy.lib.rng) or be >> initialized via /dev/urandom or time (like scipy.lib.ranlib and >> random.py)? I have 4992 hex digits of pi that are begging to be used. 
>> > I think it should be initialized via /dev/urandom or time. > > Thanks for your help. Should I delay release of the beta until you get > your code contributed? > > I'm ready to go with most else. Yes, I think that would be good. It will go in tonight. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From charles.harris at sdl.usu.edu Tue Sep 27 22:25:31 2005 From: charles.harris at sdl.usu.edu (Charles Harris) Date: Tue, 27 Sep 2005 20:25:31 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipy corebetawillhappen next week. Message-ID: Robert Kern wrote: > times as you like with different states (another benefit of the > /dev/urandom seeding over seeding with the time; there's no real chance I like /dev/urandom, but is it portable? What is the windows equivalent? I've been wondering, as I run linux and /dev/urandom is always handy. I also think some provision for setting a simple seed, perhaps just a long, is useful for testing when it is nice to have a repeatable sequences. Chuck -------------- next part -------------- A non-text attachment was scrubbed... Name: winmail.dat Type: application/ms-tnef Size: 2762 bytes Desc: not available URL: From rkern at ucsd.edu Tue Sep 27 22:42:32 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 27 Sep 2005 19:42:32 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipy corebetawillhappen next week. In-Reply-To: References: Message-ID: <433A0318.5060206@ucsd.edu> Charles Harris wrote: > Robert Kern wrote: > >>times as you like with different states (another benefit of the >>/dev/urandom seeding over seeding with the time; there's no real chance > > I like /dev/urandom, but is it portable? Lots of UNIX-like systems have it now. I don't think any OS has a /dev/urandom that *isn't* an uninterruptable stream of random bits. > What is the windows equivalent? Here's the snippet from RandomKit: /* Windows crypto */ #ifndef _WIN32_WINNT #define _WIN32_WINNT 0x0400 #endif #include #include #endif ... HCRYPTPROV hCryptProv; BOOL done; if (!CryptAcquireContext(&hCryptProv, NULL, NULL, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT) || !hCryptProv) return RK_ENODEV; done = CryptGenRandom(hCryptProv, size, (unsigned char *)buffer); CryptReleaseContext(hCryptProv, 0); if (done) return RK_NOERR; > I've been wondering, as I run linux and /dev/urandom is always handy. > > I also think some provision for setting a simple seed, perhaps just a long, > is useful for testing when it is nice to have a repeatable sequences. Way ahead of you. :-) def seed(self, seed=None): """Seed the generator. seed(seed=None) seed can be an integer, an array (or other sequence) of integers of any length, or None. If seed is None, then RandomState will try to read data from /dev/urandom (or the Windows analogue) if available or seed from the clock otherwise. """ -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From charles.harris at sdl.usu.edu Tue Sep 27 23:05:49 2005 From: charles.harris at sdl.usu.edu (Charles Harris) Date: Tue, 27 Sep 2005 21:05:49 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipycorebetawillhappen next week. Message-ID: OK, looks good. Thanks for doing all the work to get this into SciPY. 
Chuck -----Original Message----- From: scipy-dev-bounces at scipy.net on behalf of Robert Kern Sent: Tue 9/27/2005 8:42 PM To: SciPy Developers List Subject: Re: [SciPy-dev] Re: [Numpy-discussion] Release of scipycorebetawillhappen next week. Charles Harris wrote: > Robert Kern wrote: > >>times as you like with different states (another benefit of the >>/dev/urandom seeding over seeding with the time; there's no real chance > > I like /dev/urandom, but is it portable? Lots of UNIX-like systems have it now. I don't think any OS has a /dev/urandom that *isn't* an uninterruptable stream of random bits. > What is the windows equivalent? Here's the snippet from RandomKit: /* Windows crypto */ #ifndef _WIN32_WINNT #define _WIN32_WINNT 0x0400 #endif #include #include #endif ... HCRYPTPROV hCryptProv; BOOL done; if (!CryptAcquireContext(&hCryptProv, NULL, NULL, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT) || !hCryptProv) return RK_ENODEV; done = CryptGenRandom(hCryptProv, size, (unsigned char *)buffer); CryptReleaseContext(hCryptProv, 0); if (done) return RK_NOERR; > I've been wondering, as I run linux and /dev/urandom is always handy. > > I also think some provision for setting a simple seed, perhaps just a long, > is useful for testing when it is nice to have a repeatable sequences. Way ahead of you. :-) def seed(self, seed=None): """Seed the generator. seed(seed=None) seed can be an integer, an array (or other sequence) of integers of any length, or None. If seed is None, then RandomState will try to read data from /dev/urandom (or the Windows analogue) if available or seed from the clock otherwise. """ -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter _______________________________________________ Scipy-dev mailing list Scipy-dev at scipy.net http://www.scipy.net/mailman/listinfo/scipy-dev -------------- next part -------------- A non-text attachment was scrubbed... Name: winmail.dat Type: application/ms-tnef Size: 3901 bytes Desc: not available URL: From rkern at ucsd.edu Tue Sep 27 23:14:44 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 27 Sep 2005 20:14:44 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipycorebetawillhappen next week. In-Reply-To: References: Message-ID: <433A0AA4.1080705@ucsd.edu> Charles Harris wrote: > OK, looks good. Thanks for doing all the work to get this into SciPY. No problem. It's now checked in, so please test it to little bits. http://svn.scipy.org/svn/scipy_core/branches/newcore/ -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Tue Sep 27 23:33:03 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 27 Sep 2005 21:33:03 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipycorebetawillhappen next week. In-Reply-To: <433A0AA4.1080705@ucsd.edu> References: <433A0AA4.1080705@ucsd.edu> Message-ID: <433A0EEF.9050804@ee.byu.edu> Robert Kern wrote: >Charles Harris wrote: > > >>OK, looks good. Thanks for doing all the work to get this into SciPY. >> >> > >No problem. It's now checked in, so please test it to little bits. > >http://svn.scipy.org/svn/scipy_core/branches/newcore/ > > > Thanks Robert, I presume we need to convert all of the pdfs from RNG and random_lite now? 
-Travis From oliphant at ee.byu.edu Tue Sep 27 23:37:59 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Tue, 27 Sep 2005 21:37:59 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipycorebetawillhappen next week. In-Reply-To: <433A0AA4.1080705@ucsd.edu> References: <433A0AA4.1080705@ucsd.edu> Message-ID: <433A1017.6050905@ee.byu.edu> Robert Kern wrote: >Charles Harris wrote: > > >>OK, looks good. Thanks for doing all the work to get this into SciPY. >> >> > >No problem. It's now checked in, so please test it to little bits. > >http://svn.scipy.org/svn/scipy_core/branches/newcore/ > > > In particular, what can we get rid of out of random_lite and still keep the same interfaces to corresponding function calls? Robert, do you have time to sort through this? Or should I... -Travis From charles.harris at sdl.usu.edu Tue Sep 27 23:45:38 2005 From: charles.harris at sdl.usu.edu (Charles Harris) Date: Tue, 27 Sep 2005 21:45:38 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release ofscipycorebetawillhappen next week. Message-ID: Robert Kern wrote: > Charles Harris wrote: >> OK, looks good. Thanks for doing all the work to get this into SciPY. > > No problem. It's now checked in, so please test it to little bits. > > http://svn.scipy.org/svn/scipy_core/branches/newcore/ It might be a nice thought to include the original MT license. I had the impression the the authors expected it. http://www.math.sci.hiroshima-u.ac.jp/~m-mat/MT/MT2002/CODES/mt19937ar.c Chuck -------------- next part -------------- A non-text attachment was scrubbed... Name: winmail.dat Type: application/ms-tnef Size: 2768 bytes Desc: not available URL: From rkern at ucsd.edu Wed Sep 28 01:23:15 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 27 Sep 2005 22:23:15 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release ofscipycorebetawillhappen next week. In-Reply-To: References: Message-ID: <433A28C3.9010902@ucsd.edu> Charles Harris wrote: > It might be a nice thought to include the original MT license. I had the impression the > the authors expected it. > > http://www.math.sci.hiroshima-u.ac.jp/~m-mat/MT/MT2002/CODES/mt19937ar.c Yes! Sorry, that's an oversight that was carried over from RandomKit. Will be committed shortly. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rkern at ucsd.edu Wed Sep 28 01:50:28 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 27 Sep 2005 22:50:28 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipycorebetawillhappen next week. In-Reply-To: <433A0EEF.9050804@ee.byu.edu> References: <433A0AA4.1080705@ucsd.edu> <433A0EEF.9050804@ee.byu.edu> Message-ID: <433A2F24.906@ucsd.edu> Travis Oliphant wrote: > I presume we need to convert all of the pdfs from RNG and random_lite now? No, not really. 
mtrand.pyx: 261: def seed(self, seed=None): 282: def set_state(self, state): 293: def get_state(self): 303: def random_sample(self, size=None): 310: def tomaxint(self, size=None): 318: def randint(self, low, high=None, size=None): 352: def bytes(self, unsigned int length): 364: def uniform(self, double low=0.0, double high=1.0, size=None): 372: def standard_normal(self, size=None): 379: def normal(self, double loc=0.0, double scale=1.0, size=None): 388: def beta(self, double a, double b, size=None): 399: def exponential(self, double scale=1.0, size=None): 408: def standard_exponential(self, size=None): 415: def standard_gamma(self, double shape, size=None): 424: def gamma(self, double shape, double scale=1.0, size=None): 435: def f(self, double dfnum, double dfden, size=None): 446: def noncentral_f(self, double dfnum, double dfden, double nonc, size=None): 460: def chisquare(self, double df, size=None): 469: def noncentral_chisquare(self, double df, double nonc, size=None): 481: def standard_cauchy(self, size=None): 488: def standard_t(self, double df, size=None): 497: def vonmises(self, double mu, double kappa, size=None): 507: def pareto(self, double a, size=None): 516: def weibull(self, double a, size=None): 525: def power(self, double a, size=None): 534: def binomial(self, long n, double p, size=None): 547: def negative_binomial(self, long n, double p, size=None): 561: def poisson(self, double lam=1.0, size=None): 570: def multinomial(self, long n, object pvals, size=None): 620: def shuffle(self, object x): 635: def permutation(self, object x): Okay, I guess I missed lognormal, but that's just a transformation of normal variates. Easy. I originally wrote this to replace the PRNG in the old scipy.stats, not RandomArray or RNG. Here are the incompatibilities: * The RNG interface isn't there at all * Some small name differences: - F, noncentral_F -> f, noncentral_f - the "shape" argument -> "size" (following the convention in the old scipy.stats distributions) * I haven't carried over mean_var_test() since it doesn't belong there as an exposed API * Some changes in the naming of arguments * seed(x, y) -> seed([x, y]) if you must * get_seed() has nothing to correspond to (except maybe get_state(), but you can't pass that to seed(); well you could, but it won't get you what you want) Does anyone use the RNG interface? It's terribly limited. I *am* concerned about carrying over interfaces that no one is going to use. The missing bits of RNG (ignoring the generator itself) could easily be replicated by a small Python file. We could keep that for a release or two, suitably marked as deprecated via warnings.warn(). Then we can remove it. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Wed Sep 28 02:22:27 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 28 Sep 2005 00:22:27 -0600 Subject: ***[Possible UCE]*** Re: [SciPy-dev] Re: [Numpy-discussion] Release of scipycorebetawillhappen next week. In-Reply-To: <433A2F24.906@ucsd.edu> References: <433A0AA4.1080705@ucsd.edu> <433A0EEF.9050804@ee.byu.edu> <433A2F24.906@ucsd.edu> Message-ID: <433A36A3.4010003@ee.byu.edu> Robert Kern wrote: >Travis Oliphant wrote: > > > >>I presume we need to convert all of the pdfs from RNG and random_lite now? >> >> > >No, not really. 
> >mtrand.pyx: >261: def seed(self, seed=None): >282: def set_state(self, state): >293: def get_state(self): >303: def random_sample(self, size=None): >310: def tomaxint(self, size=None): >318: def randint(self, low, high=None, size=None): >352: def bytes(self, unsigned int length): >364: def uniform(self, double low=0.0, double high=1.0, size=None): >372: def standard_normal(self, size=None): >379: def normal(self, double loc=0.0, double scale=1.0, size=None): >388: def beta(self, double a, double b, size=None): >399: def exponential(self, double scale=1.0, size=None): >408: def standard_exponential(self, size=None): >415: def standard_gamma(self, double shape, size=None): >424: def gamma(self, double shape, double scale=1.0, size=None): >435: def f(self, double dfnum, double dfden, size=None): >446: def noncentral_f(self, double dfnum, double dfden, double nonc, >size=None): >460: def chisquare(self, double df, size=None): >469: def noncentral_chisquare(self, double df, double nonc, size=None): >481: def standard_cauchy(self, size=None): >488: def standard_t(self, double df, size=None): >497: def vonmises(self, double mu, double kappa, size=None): >507: def pareto(self, double a, size=None): >516: def weibull(self, double a, size=None): >525: def power(self, double a, size=None): >534: def binomial(self, long n, double p, size=None): >547: def negative_binomial(self, long n, double p, size=None): >561: def poisson(self, double lam=1.0, size=None): >570: def multinomial(self, long n, object pvals, size=None): >620: def shuffle(self, object x): >635: def permutation(self, object x): > >Okay, I guess I missed lognormal, but that's just a transformation of >normal variates. Easy. > > Ah, yes. Got it. Great job. >I originally wrote this to replace the PRNG in the old scipy.stats, not >RandomArray or RNG. Here are the incompatibilities: > >* The RNG interface isn't there at all >* Some small name differences: > - F, noncentral_F -> f, noncentral_f > - the "shape" argument -> "size" (following the convention in the old >scipy.stats distributions) >* I haven't carried over mean_var_test() since it doesn't belong there >as an exposed API >* Some changes in the naming of arguments >* seed(x, y) -> seed([x, y]) if you must >* get_seed() has nothing to correspond to (except maybe get_state(), but >you can't pass that to seed(); well you could, but it won't get you what >you want) > >Does anyone use the RNG interface? It's terribly limited. I *am* >concerned about carrying over interfaces that no one is going to use. >The missing bits of RNG (ignoring the generator itself) could easily be >replicated by a small Python file. We could keep that for a release or >two, suitably marked as deprecated via warnings.warn(). Then we can >remove it. > > I don't know. I never did. I would like to just toss it. But, to be fair, we ought to include a module, so that import RNG can be replaced with import that issues a warning on import. Can you take care of that? Also, can we eliminate the c-extension modules in the random_lite directory? No need to have them cluttering the distribution if we don't use them.... -Travis From rkern at ucsd.edu Wed Sep 28 02:31:14 2005 From: rkern at ucsd.edu (Robert Kern) Date: Tue, 27 Sep 2005 23:31:14 -0700 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipycorebetawillhappen next week. 
In-Reply-To: <433A36A3.4010003@ee.byu.edu> References: <433A0AA4.1080705@ucsd.edu> <433A0EEF.9050804@ee.byu.edu> <433A2F24.906@ucsd.edu> <433A36A3.4010003@ee.byu.edu> Message-ID: <433A38B2.6000906@ucsd.edu> Travis Oliphant wrote: > Robert Kern wrote: >> Does anyone use the RNG interface? It's terribly limited. I *am* >> concerned about carrying over interfaces that no one is going to use. >> The missing bits of RNG (ignoring the generator itself) could easily be >> replicated by a small Python file. We could keep that for a release or >> two, suitably marked as deprecated via warnings.warn(). Then we can >> remove it. > > I don't know. I never did. I would like to just toss it. But, to be > fair, we ought to include a module, so that import RNG can be replaced > with import that issues a warning on > import. > > Can you take care of that? Sure thing. > Also, can we eliminate the c-extension modules in the random_lite > directory? No need to have them cluttering the distribution if we don't > use them.... Yes, I think so. At this point, it's probably *easier* for users to migrate than keep using the older code. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rkern at ucsd.edu Wed Sep 28 04:07:58 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 28 Sep 2005 01:07:58 -0700 Subject: [SciPy-dev] scipy_core bugs Message-ID: <433A4F5E.5010507@ucsd.edu> I don't like the fact that scipy.{bool,int,float,complex,object,unicode} conflict with builtins. That was a bit surprising at the prompt. Unfortunately, they don't cleanly inherit from the Python counterparts. In [48]: from scipy import * In [49]: s = 'This is text.' In [50]: u = unicode(s) In [51]: len(u) --------------------------------------------------------------------------- exceptions.TypeError Traceback (most recent call last) /Users/kern/svk-projects/ipython/nbdoc/trunk/scipy_tutorial/ TypeError: len() of unsized object -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Wed Sep 28 04:16:39 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 28 Sep 2005 02:16:39 -0600 Subject: [SciPy-dev] scipy_core bugs In-Reply-To: <433A4F5E.5010507@ucsd.edu> References: <433A4F5E.5010507@ucsd.edu> Message-ID: <433A5167.3080006@ee.byu.edu> Robert Kern wrote: >I don't like the fact that scipy.{bool,int,float,complex,object,unicode} >conflict with builtins. That was a bit surprising at the prompt. >Unfortunately, they don't cleanly inherit from the Python counterparts. > > Well, they should, so I'll look into it. At any rate, you get back the python defined types with py infront. pyunicode('This is python unicode type') I thought long and hard about this one. Don't see a better solution. -Travis From oliphant at ee.byu.edu Wed Sep 28 04:45:23 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Wed, 28 Sep 2005 02:45:23 -0600 Subject: [SciPy-dev] scipy_core bugs In-Reply-To: <433A4F5E.5010507@ucsd.edu> References: <433A4F5E.5010507@ucsd.edu> Message-ID: <433A5823.3060800@ee.byu.edu> Robert Kern wrote: >I don't like the fact that scipy.{bool,int,float,complex,object,unicode} >conflict with builtins. That was a bit surprising at the prompt. >Unfortunately, they don't cleanly inherit from the Python counterparts. > >In [48]: from scipy import * > >In [49]: s = 'This is text.' 
> >In [50]: u = unicode(s) > >In [51]: len(u) > > Try it now. I just fixed the inheritance for string and unicode to come from the Python object first. This gives better usability, I think... -Travis From rkern at ucsd.edu Wed Sep 28 05:08:01 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 28 Sep 2005 02:08:01 -0700 Subject: [SciPy-dev] scipy_core bugs In-Reply-To: <433A5823.3060800@ee.byu.edu> References: <433A4F5E.5010507@ucsd.edu> <433A5823.3060800@ee.byu.edu> Message-ID: <433A5D71.30309@ucsd.edu> Travis Oliphant wrote: > Try it now. I just fixed the inheritance for string and unicode to come > from the Python object first. > > This gives better usability, I think... Much better, thank you. I'm still not entirely sold on shadowing builtin types, but until I find an actual bug, I'm content. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From Fernando.Perez at colorado.edu Wed Sep 28 05:19:39 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Wed, 28 Sep 2005 03:19:39 -0600 Subject: [SciPy-dev] Re: [Numpy-discussion] Release of scipy core betawillhappen next week. In-Reply-To: <4339DF9A.9040909@ucsd.edu> References: <4339DF9A.9040909@ucsd.edu> Message-ID: <433A602B.4060504@colorado.edu> Robert Kern wrote: > Actually, it's the *solution* to that problem. Unlike the older > RANLIB-based modules, random.py and mtrand define generator objects that > keep their state tied to the object. You can instantiate them as many > times as you like with different states (another benefit of the > /dev/urandom seeding over seeding with the time; there's no real chance > that you'll seed two generators with the same state no matter how close > in time you instantiate them). There are no globals. Give me a little > bit of time to check it in and you can see for yourself. Very nice. You may recall that at scipy'05, for the parallel demo I had to use a stupid serialized hack for seeding the RNGs in a cluster from the ipython control console, with a time.sleep(1.5) in between calls. This hack was _precisely_ to work around the problem you point to, and I'll be very happy not to have to use it in the future. Cheers, f From loredo at astro.cornell.edu Wed Sep 28 14:30:18 2005 From: loredo at astro.cornell.edu (Tom Loredo) Date: Wed, 28 Sep 2005 14:30:18 -0400 (EDT) Subject: [SciPy-dev] New random capability [was: Release of scipy core...] Message-ID: <200509281830.j8SIUIA03868@laplace.astro.cornell.edu> Robert, Wow, thanks for what sounds like some well thought out improvements in the "random" capability of scipy. Working on this has been on my todo list, but at lower priority than other things. I think it's in more capable hands in any case! > I also think some provision for setting a simple seed, perhaps just a long, > is useful for testing when it is nice to have a repeatable sequences. Actually, I think some users need more than this (I certainly speak at least for myself!). I think there should be capability to store the full current state, and restore it. This would let you not only repeat an earlier calculation verbatim, but also pick up where you left off, so to speak, guaranteeing you the properties of the full chain through many subsequent runs, as well as the ability to backtrack to some extent. I've copied below the code I use for this now. By default it stores the state in the CWD in ".scipy-rng-state" as a pickle. 
When it stores the state, if there happens to already be a stored state file, it renames it ".scipy-rng-state.1" (and if that exists, also it is renamed "...2"). In many years of doing Monte Carlo simulations and statistical calculations, there have been lots of times I realized I wanted to rerun a simulation just before the one I just started, so this backup capability has been very handy. Though scipy's current RNG has very simple state, I wrote these anticipating future RNG capability that might let one use different generators with different state. So the pickle is actually a tuple, (id, state), with id a string identifying the generator. If indeed there are ever multiple RNGs, restore_rng should probably set the current generator to the one whose state is restored, and not just restore the state of a RNG that might not actually be used. You'll probably come up with something better; the code is attached merely as a model of the capability I'm talking about. -Tom Loredo ~~~~~~~~~~~~~~~~~~~~~

import os, pickle
import os.path as path

# Note: use 'SSRand' since 'from scipy import *' would overwrite 'rand'
import scipy.stats.rand as SSRand

def restore_rng(fname='.scipy-rng-state'):
    """
    Restore the state of SciPy's RNG from the contents of a file in the
    CWD if the file exists; otherwise use a (fixed) default initialization.
    """
    if os.access(fname, os.R_OK | os.W_OK):
        rng_file = open(fname, 'r')
        id, state = pickle.load(rng_file)
        rng_file.close()
        print 'Recovered RNG state:', id, state
        if id == 'C2LCGX':
            SSRand.set_seeds(*state)
        else:
            raise ValueError, 'Invalid ID for RNG in %s!' % fname
    else:
        print 'No accessible RNG status file; using fixed default initialization.'
        SSRand.set_seeds(1234567890, 123456789)  # From L'Ecuyer & Cote (1991)

def save_rng(fname='.scipy-rng-state'):
    """
    Save the state of SciPy's RNG to a file in the CWD.  Backup the
    previous two saved states if present.

    If the RNG state file exists (from a previous save), rename the
    previous one with a '.1' suffix.  If a '.1' file exists, rename it
    with a '.2' suffix.
    """
    id = 'C2LCGX'  # SciPy's RNG from L'Ecuyer & Cote (1991)
    state = SSRand.get_seeds()
    if path.exists(fname):
        fname1 = fname + '.1'
        if path.exists(fname1):
            fname2 = fname + '.2'
            os.rename(fname1, fname2)
        os.rename(fname, fname1)
    ofile = open(fname, 'w')
    pickle.dump((id, state), ofile)
    ofile.close()

From charles.harris at sdl.usu.edu Wed Sep 28 15:01:49 2005 From: charles.harris at sdl.usu.edu (Charles R Harris) Date: Wed, 28 Sep 2005 13:01:49 -0600 Subject: [SciPy-dev] New random capability [was: Release of scipy core...] In-Reply-To: <200509281830.j8SIUIA03868@laplace.astro.cornell.edu> References: <200509281830.j8SIUIA03868@laplace.astro.cornell.edu> Message-ID: <1127934109.6099.6.camel@E011704> On Wed, 2005-09-28 at 14:30 -0400, Tom Loredo wrote: > Robert, > > Wow, thanks for what sounds like some well thought out improvements > in the "random" capability of scipy. Working on this has been on > my todo list, but at lower priority than other things. I think > it's in more capable hands in any case! > > > I also think some provision for setting a simple seed, perhaps just a long, > > is useful for testing when it is nice to have a repeatable sequences. > > Actually, I think some users need more than this (I certainly speak > at least for myself!). I think there should be capability to store > the full current state, and restore it. In the Mersenne Twister this also requires storing an index into the current state.
The MWC type generators need to store the value of the carry as well. Perhaps all that is needed is a way to serialize the generator or make a copy. Chuck From rkern at ucsd.edu Wed Sep 28 16:36:33 2005 From: rkern at ucsd.edu (Robert Kern) Date: Wed, 28 Sep 2005 13:36:33 -0700 Subject: [SciPy-dev] New random capability [was: Release of scipy core...] In-Reply-To: <1127934109.6099.6.camel@E011704> References: <200509281830.j8SIUIA03868@laplace.astro.cornell.edu> <1127934109.6099.6.camel@E011704> Message-ID: <433AFED1.4020608@ucsd.edu> Charles R Harris wrote: > On Wed, 2005-09-28 at 14:30 -0400, Tom Loredo wrote: > >>Robert, >> >>Wow, thanks for what sounds like some well thought out improvements >>in the "random" capability of scipy. Working on this has been on >>my todo list, but at lower priority than other things. I think >>it's in more capable hands in any case! >> >>>I also think some provision for setting a simple seed, perhaps just a long, >>>is useful for testing when it is nice to have a repeatable sequences. >> >>Actually, I think some users need more than this (I certainly speak >>at least for myself!). I think there should be capability to store >>the full current state, and restore it. > > In the Mersenne Twister this also requires storing an index into the > current state. The MWC type generators need to store the value of the > carry as well. Perhaps all that is needed is a way to serialize the > generator or make a copy. Ack! I forgot about the index. In a little while, the methods RandomState.{get,set}_state() will also deal with the index and well as the state array. And in fact, those methods are really just some underscores away from making the object itself pickleable ... Saving to and restoring from a file is a nice capability. I'll try to work it into the current object. Thanks, Tom! -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From charles.harris at sdl.usu.edu Wed Sep 28 17:22:07 2005 From: charles.harris at sdl.usu.edu (Charles R Harris) Date: Wed, 28 Sep 2005 15:22:07 -0600 Subject: [SciPy-dev] New random capability [was: Release of scipy core...] In-Reply-To: <433AFED1.4020608@ucsd.edu> References: <200509281830.j8SIUIA03868@laplace.astro.cornell.edu> <1127934109.6099.6.camel@E011704> <433AFED1.4020608@ucsd.edu> Message-ID: <1127942527.6099.9.camel@E011704> On Wed, 2005-09-28 at 13:36 -0700, Robert Kern wrote: > Charles R Harris wrote: > > On Wed, 2005-09-28 at 14:30 -0400, Tom Loredo wrote: > > > >>Robert, > >> > >>Wow, thanks for what sounds like some well thought out improvements > >>in the "random" capability of scipy. Working on this has been on > >>my todo list, but at lower priority than other things. I think > >>it's in more capable hands in any case! > >> > >>>I also think some provision for setting a simple seed, perhaps just a long, > >>>is useful for testing when it is nice to have a repeatable sequences. > >> > >>Actually, I think some users need more than this (I certainly speak > >>at least for myself!). I think there should be capability to store > >>the full current state, and restore it. > > > > In the Mersenne Twister this also requires storing an index into the > > current state. The MWC type generators need to store the value of the > > carry as well. Perhaps all that is needed is a way to serialize the > > generator or make a copy. > > Ack! I forgot about the index. 
;) From rkern at ucsd.edu Thu Sep 29 04:11:49 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 29 Sep 2005 01:11:49 -0700 Subject: [SciPy-dev] New random capability [was: Release of scipy core...] In-Reply-To: <200509281830.j8SIUIA03868@laplace.astro.cornell.edu> References: <200509281830.j8SIUIA03868@laplace.astro.cornell.edu> Message-ID: <433BA1C5.4070502@ucsd.edu> Tom Loredo wrote: > Actually, I think some users need more than this (I certainly speak > at least for myself!). I think there should be capability to store > the full current state, and restore it. This would let you not > only repeat an earlier calculation verbatim, but also pick up where > you left off, so to speak, guaranteeing you the properties of the > full chain through many subsequent runs, as well as the ability > to backtrack to some extent. scipy.stats.RandomState is now pickleable (and incidentally copyable through the copy module). -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From stephen.walton at csun.edu Thu Sep 29 14:05:46 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Thu, 29 Sep 2005 11:05:46 -0700 Subject: [SciPy-dev] absoft.py patch for newcore Message-ID: <433C2CFA.2060305@csun.edu> The attached patch puts the same fixes into newcore's absoft.py that I previously put over on scipy Bug Tracker for the old absoftfcompiler.py file. They basically make it compatible with Absoft Version 9, the current version and correct a typo or two. -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: absoft.diffs URL: From oliphant at ee.byu.edu Thu Sep 29 14:30:44 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 29 Sep 2005 12:30:44 -0600 Subject: [SciPy-dev] Help with error In-Reply-To: <433B22EA.6090701@colorado.edu> References: <433B0E01.1090608@ee.byu.edu> <433B0EFB.9070609@colorado.edu> <433B19AE.4060803@ee.byu.edu> <433B22EA.6090701@colorado.edu> Message-ID: <433C32D4.3070301@ee.byu.edu> Can anybody with experience on building RPMS help me. I tried bdist_rpm, but get the following error: error: Installed (but unpackaged) file(s) found: /usr/lib/python2.4/site-packages/atlas_version.so RPM build errors: Installed (but unpackaged) file(s) found: /usr/lib/python2.4/site-packages/atlas_version.so error: command 'rpmbuild' failed with exit status 1 I don't know how to get rpm to package this file -Travis From pearu at scipy.org Thu Sep 29 13:37:58 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 29 Sep 2005 12:37:58 -0500 (CDT) Subject: [SciPy-dev] Help with error In-Reply-To: <433C32D4.3070301@ee.byu.edu> References: <433B0E01.1090608@ee.byu.edu> <433B0EFB.9070609@colorado.edu> <433B19AE.4060803@ee.byu.edu> <433B22EA.6090701@colorado.edu> <433C32D4.3070301@ee.byu.edu> Message-ID: On Thu, 29 Sep 2005, Travis Oliphant wrote: > > Can anybody with experience on building RPMS help me. > > I tried bdist_rpm, but get the following error: > > > error: Installed (but unpackaged) file(s) found: > /usr/lib/python2.4/site-packages/atlas_version.so > > > RPM build errors: > Installed (but unpackaged) file(s) found: > /usr/lib/python2.4/site-packages/atlas_version.so > error: command 'rpmbuild' failed with exit status 1 > > > I don't know how to get rpm to package this file I don't think that atlas_version.so should be there at all. I have been working all day with packaging issues in newcore today and I am soon ready commit my patches. 
Before that I'll look into this issue as well. Pearu From oliphant at ee.byu.edu Thu Sep 29 14:56:52 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 29 Sep 2005 12:56:52 -0600 Subject: [SciPy-dev] Re: [SciPy-user] Gamma distribution questions In-Reply-To: <433C3591.8060304@csun.edu> References: <4320B63D.1070401@ucsd.edu> <433C3591.8060304@csun.edu> Message-ID: <433C38F4.9080408@ee.byu.edu> Stephen Walton wrote: > Robert Kern wrote, way back on September 8: > >> Currently, the fit() method of distributions is broken. There's a >> problem with the way it passes arguments to the nnlf() method; that >> problem seems to apply to all distributions. >> > I actually fixed this, kind of, some time ago. See > > > P.S. Does anyone read the Bug Tracker entries at scipy.org? My > morestats.py fix is over three months old. I don't read it regularly -- I read email much more. So make sure to post an announcement to this list. Also, I have been pre-occupied with the new core system. Also, the move to svn put a big delay on placing any fixes into scipy as several people lost write access. So, don't be discouraged. There have been some growing pains this summer. We also need more people who are willing to have write access to the actual source tree. Any takers? (I won't just let anybody make changes, but if you've already contributed to SciPy in some fashion, you are a likely candidate...) -Travis O. From stephen.walton at csun.edu Thu Sep 29 15:04:02 2005 From: stephen.walton at csun.edu (Stephen Walton) Date: Thu, 29 Sep 2005 12:04:02 -0700 Subject: [SciPy-dev] Re: [SciPy-user] Gamma distribution questions In-Reply-To: <433C38F4.9080408@ee.byu.edu> References: <4320B63D.1070401@ucsd.edu> <433C3591.8060304@csun.edu> <433C38F4.9080408@ee.byu.edu> Message-ID: <433C3AA2.90305@csun.edu> Travis Oliphant wrote: > We also need more people who are willing to have write access to the > actual source tree. > Any takers? I'd be willing, but I would be very reluctant to actually commit changes as I really only have one platform on which to test, namely FC4 (two if you count my Ubuntu laptop). In particular, I've never even attempted Scipy on Windows. From cookedm at physics.mcmaster.ca Thu Sep 29 15:21:21 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Thu, 29 Sep 2005 15:21:21 -0400 Subject: [SciPy-dev] Re: [SciPy-user] Gamma distribution questions In-Reply-To: <433C3AA2.90305@csun.edu> (Stephen Walton's message of "Thu, 29 Sep 2005 12:04:02 -0700") References: <4320B63D.1070401@ucsd.edu> <433C3591.8060304@csun.edu> <433C38F4.9080408@ee.byu.edu> <433C3AA2.90305@csun.edu> Message-ID: Stephen Walton writes: > Travis Oliphant wrote: > >> We also need more people who are willing to have write access to the >> actual source tree. Any takers? > > I'd be willing, but I would be very reluctant to actually commit changes > as I really only have one platform on which to test, namely FC4 (two if > you count my Ubuntu laptop). In particular, I've never even attempted > Scipy on Windows. I'm also willing; at the moment I'm kind of overflowing with other work, but I can look at some bugs, and clean up code. -- |>|\/|< /--------------------------------------------------------------------------\ |David M. 
Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From fonnesbeck at gmail.com Thu Sep 29 15:23:24 2005 From: fonnesbeck at gmail.com (Chris Fonnesbeck) Date: Thu, 29 Sep 2005 15:23:24 -0400 Subject: [SciPy-dev] Re: [SciPy-user] Gamma distribution questions In-Reply-To: <433C3AA2.90305@csun.edu> References: <4320B63D.1070401@ucsd.edu> <433C3591.8060304@csun.edu> <433C38F4.9080408@ee.byu.edu> <433C3AA2.90305@csun.edu> Message-ID: <723eb6930509291223s67a1c762y32d65608006add91@mail.gmail.com> On 9/29/05, Stephen Walton wrote: > Travis Oliphant wrote: > > > We also need more people who are willing to have write access to the > > actual source tree. > > Any takers? > > I'd be willing, but I would be very reluctant to actually commit changes > as I really only have one platform on which to test, namely FC4 (two if > you count my Ubuntu laptop). In particular, I've never even attempted > Scipy on Windows. I would be interested in overhauling the statistical distributions in general. I think they should be re-designed, or at least refactored. I have been developing a range of statistical likelihood functions for PyMC, using f2py. I would be willing to work on this if there is interest. Also, it would be nice to get a Bayesian fit, as well as a classical fit. Chris From Fernando.Perez at colorado.edu Thu Sep 29 15:25:50 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 29 Sep 2005 13:25:50 -0600 Subject: [SciPy-dev] Re: Help with error In-Reply-To: <433C32D4.3070301@ee.byu.edu> References: <433B0E01.1090608@ee.byu.edu> <433B0EFB.9070609@colorado.edu> <433B19AE.4060803@ee.byu.edu> <433B22EA.6090701@colorado.edu> <433C32D4.3070301@ee.byu.edu> Message-ID: <433C3FBE.9000409@colorado.edu> Travis Oliphant wrote: > Can anybody with experience on building RPMS help me. > > I tried bdist_rpm, but get the following error: > > > error: Installed (but unpackaged) file(s) found: > /usr/lib/python2.4/site-packages/atlas_version.so > > > RPM build errors: > Installed (but unpackaged) file(s) found: > /usr/lib/python2.4/site-packages/atlas_version.so > error: command 'rpmbuild' failed with exit status 1 > > > I don't know how to get rpm to package this file Pearu mentioned he'd commit some fixes. Please holler after he does so: if the problem persists after his fixes, I'll have a look as well. Cheers, f From prabhu_r at users.sf.net Thu Sep 29 15:54:55 2005 From: prabhu_r at users.sf.net (Prabhu Ramachandran) Date: Fri, 30 Sep 2005 01:24:55 +0530 Subject: [SciPy-dev] Re: [SciPy-user] Gamma distribution questions In-Reply-To: <433C38F4.9080408@ee.byu.edu> References: <4320B63D.1070401@ucsd.edu> <433C3591.8060304@csun.edu> <433C38F4.9080408@ee.byu.edu> Message-ID: <17212.18063.134339.730797@monster.linux.in> >>>>> "Travis" == Travis Oliphant writes: Travis> Any takers? (I won't just let anybody make changes, but Travis> if you've already contributed to SciPy in some fashion, Travis> you are a likely candidate...) Swamped currently, but I've been maintaining weave's VTK and SWIG2 support. cheers, prabhu From nwagner at mecha.uni-stuttgart.de Thu Sep 29 16:03:04 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 29 Sep 2005 22:03:04 +0200 Subject: [SciPy-dev] More bugs Message-ID: Apropos bugs ... optimize.fmin_powell(func,x0) works erroneous See rayleigh.py for details Nils -------------- next part -------------- A non-text attachment was scrubbed... 
Name: rayleigh.py Type: application/x-python Size: 1886 bytes Desc: not available URL: From charles.harris at sdl.usu.edu Thu Sep 29 17:30:42 2005 From: charles.harris at sdl.usu.edu (Charles Harris) Date: Thu, 29 Sep 2005 15:30:42 -0600 Subject: [SciPy-dev] Re: [SciPy-user] Gamma distribution questions Message-ID: Travis, I used to have access, but my password seems to have lapsed. There are a few cleanups and comments I could make to those small parts of Scipy I contributed. I also extended the sort functions in numarray and might look into porting them to the new core. Chuck -----Original Message----- From: scipy-dev-bounces at scipy.net on behalf of Travis Oliphant Sent: Thu 9/29/2005 12:56 PM To: SciPy Users List; SciPy Developers List Subject: [SciPy-dev] Re: [SciPy-user] Gamma distribution questions Stephen Walton wrote: > Robert Kern wrote, way back on September 8: > >> Currently, the fit() method of distributions is broken. There's a >> problem with the way it passes arguments to the nnlf() method; that >> problem seems to apply to all distributions. >> > I actually fixed this, kind of, some time ago. See > > > P.S. Does anyone read the Bug Tracker entries at scipy.org? My > morestats.py fix is over three months old. I don't read it regularly -- I read email much more. So make sure to post an announcement to this list. Also, I have been pre-occupied with the new core system. Also, the move to svn put a big delay on placing any fixes into scipy as several people lost write access. So, don't be discouraged. There have been some growing pains this summer. We also need more people who are willing to have write access to the actual source tree. Any takers? (I won't just let anybody make changes, but if you've already contributed to SciPy in some fashion, you are a likely candidate...) -Travis O. _______________________________________________ Scipy-dev mailing list Scipy-dev at scipy.net http://www.scipy.net/mailman/listinfo/scipy-dev -------------- next part -------------- A non-text attachment was scrubbed... Name: winmail.dat Type: application/ms-tnef Size: 3530 bytes Desc: not available URL: From pearu at scipy.org Thu Sep 29 17:57:43 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 29 Sep 2005 16:57:43 -0500 (CDT) Subject: [SciPy-dev] Re: Help with error In-Reply-To: <433C3FBE.9000409@colorado.edu> References: <433B0E01.1090608@ee.byu.edu> <433B0EFB.9070609@colorado.edu> <433B19AE.4060803@ee.byu.edu> <433B22EA.6090701@colorado.edu> <433C32D4.3070301@ee.byu.edu> <433C3FBE.9000409@colorado.edu> Message-ID: On Thu, 29 Sep 2005, Fernando Perez wrote: > Travis Oliphant wrote: >> Can anybody with experience on building RPMS help me. >> >> I tried bdist_rpm, but get the following error: >> >> >> error: Installed (but unpackaged) file(s) found: >> /usr/lib/python2.4/site-packages/atlas_version.so >> >> >> RPM build errors: >> Installed (but unpackaged) file(s) found: >> /usr/lib/python2.4/site-packages/atlas_version.so >> error: command 'rpmbuild' failed with exit status 1 >> >> >> I don't know how to get rpm to package this file > > Pearu mentioned he'd commit some fixes. Please holler after he does so: if > the problem persists after his fixes, I'll have a look as well. Have commited my patches --- now f2py2e installs as scipy.f2py, setup.py files are updated/fixed to the latest features in scipy.distutils, etc. On my debian box newcore build/install/bdist_rpm commands work ok, though on RH based systems bdist_rpm may behave differently. I haven't tested newcore on windows yet. 
Pearu From pearu at scipy.org Thu Sep 29 18:20:17 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 29 Sep 2005 17:20:17 -0500 (CDT) Subject: [SciPy-dev] status of newcore and f2py Message-ID: Hi, When forcing f2py to use newcore, that is, when changing #include "Numeric/arrayobject.h" to #include "scipy/arrayobject.h" in newcore/scipy/f2py2e/src/fortranobject.h, then I get the folloing messages /tmp/tmpqynjjs/src/fortranobject.c: In function 'fortran_setattr': /tmp/tmpqynjjs/src/fortranobject.c:233: warning: implicit declaration of function '_PyArray_multiply_list' /tmp/tmpqynjjs/src/fortranobject.c: In function 'copy_ND_array': /tmp/tmpqynjjs/src/fortranobject.c:980: warning: passing argument 2 of 'cast' makes pointer from integer without a cast /tmp/tmpqynjjs/src/fortranobject.c:980: warning: passing argument 3 of 'cast' makes integer from pointer without a cast /tmp/tmpqynjjs/src/fortranobject.c:980: warning: passing argument 4 of 'cast' makes pointer from integer without a cast /tmp/tmpqynjjs/src/fortranobject.c:980: warning: passing argument 5 of 'cast' makes pointer from integer without a cast /tmp/tmpqynjjs/src/fortranobject.c:1004: warning: passing argument 2 of 'cast' makes pointer from integer without a cast /tmp/tmpqynjjs/src/fortranobject.c:1004: warning: passing argument 3 of 'cast' makes integer from pointer without a cast /tmp/tmpqynjjs/src/fortranobject.c:1004: warning: passing argument 4 of 'cast' makes pointer from integer without a cast /tmp/tmpqynjjs/src/fortranobject.c:1004: warning: passing argument 5 of 'cast' makes pointer from integer without a cast /tmp/tmpqynjjs/src/fortranobject.c:1023: warning: passing argument 2 of 'cast' makes pointer from integer without a cast /tmp/tmpqynjjs/src/fortranobject.c:1023: warning: passing argument 3 of 'cast' makes integer from pointer without a cast /tmp/tmpqynjjs/src/fortranobject.c:1023: warning: passing argument 4 of 'cast' makes pointer from integer without a cast /tmp/tmpqynjjs/src/fortranobject.c:1023: warning: passing argument 5 of 'cast' makes pointer from integer without a cast /tmp/tmpqynjjs/src/fortranobject.c: At top level: /usr/local/include/python2.3/scipy/__multiarray_api.h:681: warning: 'import_array' defined but not used when compiling fortranobject.c and when trying to import the corresponding extension module, I get >>> import foo Traceback (most recent call last): File "", line 1, in ? ImportError: ./foo.so: undefined symbol: _PyArray_multiply_list So, we need to have a newcore replacement for _PyArray_multiply_list. Is there anything available already? Pearu From oliphant at ee.byu.edu Thu Sep 29 19:21:18 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 29 Sep 2005 17:21:18 -0600 Subject: [SciPy-dev] Re: Help with error In-Reply-To: References: <433B0E01.1090608@ee.byu.edu> <433B0EFB.9070609@colorado.edu> <433B19AE.4060803@ee.byu.edu> <433B22EA.6090701@colorado.edu> <433C32D4.3070301@ee.byu.edu> <433C3FBE.9000409@colorado.edu> Message-ID: <433C76EE.4040703@ee.byu.edu> Pearu Peterson wrote: > > > On Thu, 29 Sep 2005, Fernando Perez wrote: > >> Travis Oliphant wrote: >> >>> Can anybody with experience on building RPMS help me. 
>>> >>> I tried bdist_rpm, but get the following error: >>> >>> >>> error: Installed (but unpackaged) file(s) found: >>> /usr/lib/python2.4/site-packages/atlas_version.so >>> >>> >>> RPM build errors: >>> Installed (but unpackaged) file(s) found: >>> /usr/lib/python2.4/site-packages/atlas_version.so >>> error: command 'rpmbuild' failed with exit status 1 >>> >>> >>> I don't know how to get rpm to package this file >> >> >> Pearu mentioned he'd commit some fixes. Please holler after he does >> so: if the problem persists after his fixes, I'll have a look as well. > > > Have commited my patches --- now f2py2e installs as scipy.f2py, > setup.py files are updated/fixed to the latest features in > scipy.distutils, etc. > > On my debian box newcore build/install/bdist_rpm commands work ok, > though on RH based systems bdist_rpm may behave differently. > I haven't tested newcore on windows yet. I was having trouble linking against the ATLAS library before. Also, when blas is detected but cblas.h is not found problems follow. Should we just include cblas.h with dotblas -- that's the only thing that uses it... -Travis From oliphant at ee.byu.edu Thu Sep 29 19:34:00 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 29 Sep 2005 17:34:00 -0600 Subject: [SciPy-dev] Release going to happen tonight. Message-ID: <433C79E8.7050407@ee.byu.edu> With Pearu's commits, I'm about ready to make a beta release. Now, where do I place it? I'll probably put it on the sourceforge site as well as make a link on numeric.scipy.org. What do you all think? -Travis P.S. I resolved my rpm building error by deleting an old buildroot tree in /var/tmp. If anybody else has the same problem, you might look into deleting old copies of the rpm buildroot tree. Now, I have shiny rpm's of the distribution... -Travis From Fernando.Perez at colorado.edu Thu Sep 29 19:38:40 2005 From: Fernando.Perez at colorado.edu (Fernando Perez) Date: Thu, 29 Sep 2005 17:38:40 -0600 Subject: [SciPy-dev] Release going to happen tonight. In-Reply-To: <433C79E8.7050407@ee.byu.edu> References: <433C79E8.7050407@ee.byu.edu> Message-ID: <433C7B00.4060306@colorado.edu> Travis Oliphant wrote: > With Pearu's commits, I'm about ready to make a beta release. Now, > where do I place it? > > I'll probably put it on the sourceforge site as well as make a link on > numeric.scipy.org. > > What do you all think? Sourceforge is nice for distribution, if nothing else to take advantage of their extensive mirror network. No need to abuse Enthought's bandwith bill any more than necessary, I suppose :) As backup in case SF is inaccessible, a local copy would be nice. Best, f ps - in advance, a big cheer. From rkern at ucsd.edu Thu Sep 29 19:39:25 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 29 Sep 2005 16:39:25 -0700 Subject: [SciPy-dev] Release going to happen tonight. In-Reply-To: <433C79E8.7050407@ee.byu.edu> References: <433C79E8.7050407@ee.byu.edu> Message-ID: <433C7B2D.4060502@ucsd.edu> Travis Oliphant wrote: > > With Pearu's commits, I'm about ready to make a beta release. Now, > where do I place it? > > I'll probably put it on the sourceforge site as well as make a link on > numeric.scipy.org. > > What do you all think? We should make a final decision about where the headers need to go. I floated the idea a few days ago to install headers into the package itself instead of having them installed to the $prefix by the install_headers command. 
Some people won't be able to install to the main Python include directory because they don't have root access. PythonEggs don't have the ability to install headers either. AFAICT, only Numeric, numarray, and ScientificPython do this. We should take the opportunity now to stop this practice. There is now a function scipy.base.get_scipy_include(): def get_scipy_include(): """Return the directory in the package that contains the scipy/*.h header files. Extension modules that need to compile against scipy.base should use this function to locate the appropriate include directory. Using distutils: import scipy Extension('extension_name', ... include_dirs=[scipy.get_scipy_include()]) """ import os dir, fn = os.path.split(__file__) return os.path.join(dir, 'include') -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Thu Sep 29 19:57:19 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 29 Sep 2005 17:57:19 -0600 Subject: [SciPy-dev] status of newcore and f2py In-Reply-To: References: Message-ID: <433C7F5F.1040000@ee.byu.edu> Pearu Peterson wrote: > > Hi, > > When forcing f2py to use newcore, that is, when changing > > #include "Numeric/arrayobject.h" > > to > > #include "scipy/arrayobject.h" > > in newcore/scipy/f2py2e/src/fortranobject.h, then I get the folloing > messages > > /tmp/tmpqynjjs/src/fortranobject.c: In function 'fortran_setattr': > /tmp/tmpqynjjs/src/fortranobject.c:233: warning: implicit declaration > of function '_PyArray_multiply_list' > /tmp/tmpqynjjs/src/fortranobject.c: In function 'copy_ND_array': > /tmp/tmpqynjjs/src/fortranobject.c:980: warning: passing argument 2 of > 'cast' makes pointer from integer without a cast > /tmp/tmpqynjjs/src/fortranobject.c:980: warning: passing argument 3 of > 'cast' makes integer from pointer without a cast > /tmp/tmpqynjjs/src/fortranobject.c:980: warning: passing argument 4 of > 'cast' makes pointer from integer without a cast > /tmp/tmpqynjjs/src/fortranobject.c:980: warning: passing argument 5 of > 'cast' makes pointer from integer without a cast > /tmp/tmpqynjjs/src/fortranobject.c:1004: warning: passing argument 2 > of 'cast' makes pointer from integer without a cast > /tmp/tmpqynjjs/src/fortranobject.c:1004: warning: passing argument 3 > of 'cast' makes integer from pointer without a cast > /tmp/tmpqynjjs/src/fortranobject.c:1004: warning: passing argument 4 > of 'cast' makes pointer from integer without a cast > /tmp/tmpqynjjs/src/fortranobject.c:1004: warning: passing argument 5 > of 'cast' makes pointer from integer without a cast > /tmp/tmpqynjjs/src/fortranobject.c:1023: warning: passing argument 2 > of 'cast' makes pointer from integer without a cast > /tmp/tmpqynjjs/src/fortranobject.c:1023: warning: passing argument 3 > of 'cast' makes integer from pointer without a cast > /tmp/tmpqynjjs/src/fortranobject.c:1023: warning: passing argument 4 > of 'cast' makes pointer from integer without a cast > /tmp/tmpqynjjs/src/fortranobject.c:1023: warning: passing argument 5 > of 'cast' makes pointer from integer without a cast > /tmp/tmpqynjjs/src/fortranobject.c: At top level: > /usr/local/include/python2.3/scipy/__multiarray_api.h:681: warning: > 'import_array' defined but not used > > when compiling fortranobject.c and when trying to import the > corresponding extension module, I get > >>>> import foo >>> > Traceback (most recent call last): > File "", line 1, in ? 
> ImportError: ./foo.so: undefined symbol: _PyArray_multiply_list > > So, we need to have a newcore replacement for _PyArray_multiply_list. > Is there anything available already? Yes, PyArray_MultiplyList I should make a define. -Travis From oliphant at ee.byu.edu Thu Sep 29 19:58:47 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 29 Sep 2005 17:58:47 -0600 Subject: [SciPy-dev] Release going to happen tonight. In-Reply-To: <433C7B2D.4060502@ucsd.edu> References: <433C79E8.7050407@ee.byu.edu> <433C7B2D.4060502@ucsd.edu> Message-ID: <433C7FB7.2090104@ee.byu.edu> Robert Kern wrote: >Travis Oliphant wrote: > > >>With Pearu's commits, I'm about ready to make a beta release. Now, >>where do I place it? >> >>I'll probably put it on the sourceforge site as well as make a link on >>numeric.scipy.org. >> >>What do you all think? >> >> > >We should make a final decision about where the headers need to go. I >floated the idea a few days ago to install headers into the package >itself instead of having them installed to the $prefix by the >install_headers command. Some people won't be able to install to the >main Python include directory because they don't have root access. >PythonEggs don't have the ability to install headers either. AFAICT, >only Numeric, numarray, and ScientificPython do this. We should take the >opportunity now to stop this practice. > >There is now a function scipy.base.get_scipy_include(): > >def get_scipy_include(): > """Return the directory in the package that contains the scipy/*.h >header > files. > > Extension modules that need to compile against scipy.base should use >this > function to locate the appropriate include directory. Using distutils: > > import scipy > Extension('extension_name', ... > include_dirs=[scipy.get_scipy_include()]) > """ > import os > dir, fn = os.path.split(__file__) > return os.path.join(dir, 'include') > > Shouldn't this be in scipy.distutils? -Travis From oliphant at ee.byu.edu Thu Sep 29 20:05:13 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 29 Sep 2005 18:05:13 -0600 Subject: [SciPy-dev] status of newcore and f2py In-Reply-To: References: Message-ID: <433C8139.4050709@ee.byu.edu> Pearu Peterson wrote: > > Hi, > > When forcing f2py to use newcore, that is, when changing > > #include "Numeric/arrayobject.h" > > to > > #include "scipy/arrayobject.h" The copy_ND_array (in, out) function can be replaced with PyArray_CopyInto(destination, source) which does the right thing... -Travis From rkern at ucsd.edu Thu Sep 29 20:10:47 2005 From: rkern at ucsd.edu (Robert Kern) Date: Thu, 29 Sep 2005 17:10:47 -0700 Subject: [SciPy-dev] Release going to happen tonight. In-Reply-To: <433C7FB7.2090104@ee.byu.edu> References: <433C79E8.7050407@ee.byu.edu> <433C7B2D.4060502@ucsd.edu> <433C7FB7.2090104@ee.byu.edu> Message-ID: <433C8287.4010306@ucsd.edu> Travis Oliphant wrote: > Robert Kern wrote: >> There is now a function scipy.base.get_scipy_include(): >> >> def get_scipy_include(): ... > Shouldn't this be in scipy.distutils? Umm, yeah, probably. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From oliphant at ee.byu.edu Thu Sep 29 20:14:44 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 29 Sep 2005 18:14:44 -0600 Subject: [SciPy-dev] Release going to happen tonight. 
In-Reply-To: <433C8287.4010306@ucsd.edu> References: <433C79E8.7050407@ee.byu.edu> <433C7B2D.4060502@ucsd.edu> <433C7FB7.2090104@ee.byu.edu> <433C8287.4010306@ucsd.edu> Message-ID: <433C8374.4010300@ee.byu.edu> Robert Kern wrote: >Travis Oliphant wrote: > > >>Robert Kern wrote: >> >> > > > >>>There is now a function scipy.base.get_scipy_include(): >>> >>>def get_scipy_include(): >>> >>> >... > > >>Shouldn't this be in scipy.distutils? >> >> There is a misc_util.py file that could house it unless Pearu has a better idea. -Travis From eric at enthought.com Fri Sep 30 00:41:46 2005 From: eric at enthought.com (eric_imap) Date: Thu, 29 Sep 2005 23:41:46 -0500 Subject: [SciPy-dev] Release going to happen tonight. In-Reply-To: <433C7B00.4060306@colorado.edu> References: <433C79E8.7050407@ee.byu.edu> <433C7B00.4060306@colorado.edu> Message-ID: <433CC20A.3050404@enthought.com> Fernando Perez wrote: > Travis Oliphant wrote: > >> With Pearu's commits, I'm about ready to make a beta release. Now, >> where do I place it? >> >> I'll probably put it on the sourceforge site as well as make a link >> on numeric.scipy.org. >> >> What do you all think? > > > Sourceforge is nice for distribution, if nothing else to take > advantage of their extensive mirror network. No need to abuse > Enthought's bandwith bill any more than necessary, I suppose :) I think we have a fast download location for files that effectively doesn't have a bandwidth limit -- at least for files this size. Isn't this correct Joe? Placing copies in both places seems fine to me. eric > ps - in advance, a big cheer. ps. I 2nd that. Thanks Travis (and everyone else) for all the work. This is a huge service to the community, and I'm really excited after what I saw at the SciPy conference about the potential of scipy core. From pearu at scipy.org Thu Sep 29 23:47:43 2005 From: pearu at scipy.org (Pearu Peterson) Date: Thu, 29 Sep 2005 22:47:43 -0500 (CDT) Subject: [SciPy-dev] Release going to happen tonight. In-Reply-To: <433C8374.4010300@ee.byu.edu> References: <433C79E8.7050407@ee.byu.edu> <433C7B2D.4060502@ucsd.edu> <433C8374.4010300@ee.byu.edu> Message-ID: On Thu, 29 Sep 2005, Travis Oliphant wrote: > Robert Kern wrote: > >> Travis Oliphant wrote: >> >>> Robert Kern wrote: >>> >> >> >>>> There is now a function scipy.base.get_scipy_include(): >>>> >>>> def get_scipy_include(): >>>> >> ... >> >>> Shouldn't this be in scipy.distutils? >>> > > There is a misc_util.py file that could house it unless Pearu has a better > idea. Actually, there is no need to give scipy include_dirs in setup.py files. Since scipy.distutils will be used to build scipy specific extension modules anyway, then I thought that scipy include_dirs should be always used and in the current newcore this is implemented in scipy.distutils.command.build_ext module, it uses get_scipy_include_dirs function from misc_util. But I agree with Robert that it might be could idea to install scipy headers under /lib/pythonX.X/site-packages/scipy/base/include/scipy rather than under /include/pythonX.X/scipy. However, other places could be also considered: /lib/pythonX.X/site-packages/scipy/include for example. Let me know what would you prefer and I'll do the implementation. Pearu From rkern at UCSD.EDU Fri Sep 30 01:02:15 2005 From: rkern at UCSD.EDU (Robert Kern) Date: Thu, 29 Sep 2005 22:02:15 -0700 Subject: [SciPy-dev] Release going to happen tonight. 
In-Reply-To: References: <433C79E8.7050407@ee.byu.edu> <433C7B2D.4060502@ucsd.edu> <433C8374.4010300@ee.byu.edu> Message-ID: <433CC6D7.7010301@ucsd.edu> Pearu Peterson wrote: > > > On Thu, 29 Sep 2005, Travis Oliphant wrote: > >> Robert Kern wrote: >> >>> Travis Oliphant wrote: >>> >>>> Robert Kern wrote: >>>> >>>>> There is now a function scipy.base.get_scipy_include(): >>>>> >>>>> def get_scipy_include(): >>>>> >>> ... >>> >>>> Shouldn't this be in scipy.distutils? >>>> >> >> There is a misc_util.py file that could house it unless Pearu has a >> better idea. > > Actually, there is no need to give scipy include_dirs in setup.py files. > Since scipy.distutils will be used to build scipy specific extension > modules anyway, then I thought that scipy include_dirs should be always > used and in the current newcore this is implemented in > scipy.distutils.command.build_ext module, it uses get_scipy_include_dirs > function from misc_util. I'm not sure it's a good idea to require that scipy.distutils be used to build extensions if we can avoid it. There are plenty of other extensions to distutils and build_ext out there and we should play nicely with them if at all possible. People aren't going to switch out their build systems wholesale just compile against scipy. > But I agree with Robert that it might be could idea to install scipy > headers under > /lib/pythonX.X/site-packages/scipy/base/include/scipy rather > than under /include/pythonX.X/scipy. However, other > places could be also considered: > > /lib/pythonX.X/site-packages/scipy/include > > for example. Let me know what would you prefer and I'll do the > implementation. Don't try to do those path calculations in install_header. bdist_egg, for example, never runs install_header, and those calculations from would be wrong if it did. Try to put them in as package data. -- Robert Kern rkern at ucsd.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From pearu at scipy.org Fri Sep 30 01:40:00 2005 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 30 Sep 2005 00:40:00 -0500 (CDT) Subject: [SciPy-dev] Release going to happen tonight. In-Reply-To: <433CC6D7.7010301@ucsd.edu> References: <433C79E8.7050407@ee.byu.edu> <433C7B2D.4060502@ucsd.edu> <433C8374.4010300@ee.byu.edu> <433CC6D7.7010301@ucsd.edu> Message-ID: On Thu, 29 Sep 2005, Robert Kern wrote: > Pearu Peterson wrote: >> >> >> On Thu, 29 Sep 2005, Travis Oliphant wrote: >> >>> Robert Kern wrote: >>> >>>> Travis Oliphant wrote: >>>> >>>>> Robert Kern wrote: >>>>> >>>>>> There is now a function scipy.base.get_scipy_include(): >>>>>> >>>>>> def get_scipy_include(): >>>>>> >>>> ... >>>> >>>>> Shouldn't this be in scipy.distutils? >>>>> >>> >>> There is a misc_util.py file that could house it unless Pearu has a >>> better idea. >> >> Actually, there is no need to give scipy include_dirs in setup.py files. >> Since scipy.distutils will be used to build scipy specific extension >> modules anyway, then I thought that scipy include_dirs should be always >> used and in the current newcore this is implemented in >> scipy.distutils.command.build_ext module, it uses get_scipy_include_dirs >> function from misc_util. > > I'm not sure it's a good idea to require that scipy.distutils be used to > build extensions if we can avoid it. There are plenty of other > extensions to distutils and build_ext out there and we should play > nicely with them if at all possible. 
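In that spirit, a stock distutils setup.py that compiles against scipy.base could stay as short as the sketch below; only get_scipy_include() comes from the function quoted above, while the module name and source file are made up for illustration:

# setup.py -- sketch: build a third-party extension against the scipy.base
# headers with plain distutils, no scipy.distutils required.
from distutils.core import setup, Extension
import scipy.base

ext = Extension('myext',                    # hypothetical module name
                sources=['myextmodule.c'],  # hypothetical C source
                include_dirs=[scipy.base.get_scipy_include()])

setup(name='myext', version='0.1', ext_modules=[ext])
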
People aren't going to switch out > their build systems wholesale just compile against scipy. You might be right. Then I would prefer scipy.get_scipy_include() over putting it into scipy.distutils. And whenever scipy.distutils is used, one does not need to specify explicitely scipy include directories. >> But I agree with Robert that it might be could idea to install scipy >> headers under >> /lib/pythonX.X/site-packages/scipy/base/include/scipy rather >> than under /include/pythonX.X/scipy. However, other >> places could be also considered: >> >> /lib/pythonX.X/site-packages/scipy/include >> >> for example. Let me know what would you prefer and I'll do the >> implementation. > > Don't try to do those path calculations in install_header. bdist_egg, > for example, never runs install_header, and those calculations from > would be wrong if it did. Try to put them in as package data. Ok. Pearu From alexandre.fayolle at logilab.fr Fri Sep 30 02:57:00 2005 From: alexandre.fayolle at logilab.fr (Alexandre Fayolle) Date: Fri, 30 Sep 2005 08:57:00 +0200 Subject: [SciPy-dev] write access to the source tree In-Reply-To: <433C38F4.9080408@ee.byu.edu> References: <4320B63D.1070401@ucsd.edu> <433C3591.8060304@csun.edu> <433C38F4.9080408@ee.byu.edu> Message-ID: <20050930065700.GB17370@logilab.fr> On Thu, Sep 29, 2005 at 12:56:52PM -0600, Travis Oliphant wrote: > We also need more people who are willing to have write access to the > actual source tree. > > Any takers? (I won't just let anybody make changes, but if you've > already contributed to SciPy in some fashion, you are a likely candidate...) Hi, I'm part of the maintenance team for the scipy Debian package, and being able to contribute in a more direct fashion to the development of the package would be great. -- Alexandre Fayolle LOGILAB, Paris (France). http://www.logilab.com http://www.logilab.fr http://www.logilab.org -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 481 bytes Desc: Digital signature URL: From oliphant at ee.byu.edu Fri Sep 30 02:59:22 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 30 Sep 2005 00:59:22 -0600 Subject: [SciPy-dev] Release of SciPy Core 0.4 (Beta) Message-ID: <433CE24A.6040509@ee.byu.edu> This is to announce the release of SciPy Core 0.4.X (beta) It is available at sourceforge which you can access by going to http://numeric.scipy.org Thanks to all who helped me with this release, especially Robert Kern Pearu Peterson Now, let's start getting this stable.... -Travis Oliphant From oliphant at ee.byu.edu Fri Sep 30 03:17:19 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 30 Sep 2005 01:17:19 -0600 Subject: [SciPy-dev] Install of scipy core for current scipy users Message-ID: <433CE67F.3060907@ee.byu.edu> If you are a current user of scipy and want to try out the new scipy core. You can do so. However, the default install will change four __init__.py files on your system (which will probably "break" the rest of scipy). Hopefully we will get scipy as a whole to work with the new system shortly. There has also been talk of putting together a simple install to fixup the __init__ files so that old scipy and new scipy_core can live together peacefully (anyone who can help with that is welcomed). scipy\__init__.py scipy\linalg\__init__.py scipy\fftpack\__init__.py scipy\stats\__init__.py are the affected files. The ones from scipy can just be copied back, but then you won't get the functions from new scipy. 
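A rough sketch of what that fixup could look like, using only the standard library (the '.orig' suffix and the script name are arbitrary):

# backup_init.py -- sketch of the __init__ fixup idea: save the four affected
# __init__.py files before installing scipy core, so the old scipy versions
# can be copied back later if needed.
import os, shutil
from distutils.sysconfig import get_python_lib

AFFECTED = ['scipy/__init__.py',
            'scipy/linalg/__init__.py',
            'scipy/fftpack/__init__.py',
            'scipy/stats/__init__.py']

def backup(suffix='.orig'):
    site = get_python_lib()
    for rel in AFFECTED:
        fn = os.path.join(site, rel)
        if os.path.exists(fn):
            shutil.copy2(fn, fn + suffix)

def restore(suffix='.orig'):
    site = get_python_lib()
    for rel in AFFECTED:
        fn = os.path.join(site, rel)
        if os.path.exists(fn + suffix):
            shutil.copy2(fn + suffix, fn)
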
-Travis From hoel at gl-group.com Fri Sep 30 08:37:11 2005 From: hoel at gl-group.com (=?iso-8859-15?Q?Berthold_H=F6llmann?=) Date: Fri, 30 Sep 2005 14:37:11 +0200 Subject: [SciPy-dev] Re: Release of SciPy Core 0.4 (Beta) In-Reply-To: <433CE24A.6040509@ee.byu.edu> (Travis Oliphant's message of "Fri, 30 Sep 2005 00:59:22 -0600") References: <433CE24A.6040509@ee.byu.edu> Message-ID: Travis Oliphant writes: > This is to announce the release of SciPy Core 0.4.X (beta) > > It is available at sourceforge which you can access by going to > > http://numeric.scipy.org > > Thanks to all who helped me with this release, especially > > Robert Kern > Pearu Peterson > > Now, let's start getting this stable.... > > -Travis Oliphant Will the stable release support VC6 and DF6 on WinXP? These are cruical for my applications at the moment. I tried to install SciPy Core 0.4 in a WinXP machine but it seems the neccessary support is gone. I am able to install scipy_core from an older CVS checkout. Kind regards Berthold H?llmann -- Germanischer Lloyd AG CAE Development Vorsetzen 35 20459 Hamburg Phone: +49(0)40 36149-7374 Fax: +49(0)40 36149-7320 e-mail: hoel at gl-group.com Internet: http://www.gl-group.com This e-mail and any attachment thereto may contain confidential information and/or information protected by intellectual property rights for the exclusive attention of the intended addressees named above. Any access of third parties to this e-mail is unauthorised. Any use of this e-mail by unintended recipients such as total or partial copying, distribution, disclosure etc. is prohibited and may be unlawful. When addressed to our clients the content of this e-mail is subject to the General Terms and Conditions of GL's Group of Companies applicable at the date of this e-mail. If you have received this e-mail in error, please notify the sender either by telephone or by e-mail and delete the material from any computer. GL's Group of Companies does not warrant and/or guarantee that this message at the moment of receipt is authentic, correct and its communication free of errors, interruption etc. From cookedm at physics.mcmaster.ca Fri Sep 30 10:22:57 2005 From: cookedm at physics.mcmaster.ca (David M. Cooke) Date: Fri, 30 Sep 2005 10:22:57 -0400 Subject: [SciPy-dev] Supported Python versions? Message-ID: Currently, INSTALL.txt says we support Python 2.1, 2.2, and 2.3 (and presumably 2.4 :). How useful is 2.1 and 2.2 support? Does anybody test on these? I'd say that 2.3 support is still important, but if no one's checking for 2.1 or 2.2 compatibility, should we bother claiming they're suppported? Right now, for instance, scipy.base won't compile with 2.2. -- |>|\/|< /--------------------------------------------------------------------------\ |David M. Cooke http://arbutus.physics.mcmaster.ca/dmc/ |cookedm at physics.mcmaster.ca From pearu at scipy.org Fri Sep 30 09:31:53 2005 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 30 Sep 2005 08:31:53 -0500 (CDT) Subject: [SciPy-dev] Supported Python versions? In-Reply-To: References: Message-ID: On Fri, 30 Sep 2005, David M. Cooke wrote: > Currently, INSTALL.txt says we support Python 2.1, 2.2, and 2.3 (and > presumably 2.4 :). How useful is 2.1 and 2.2 support? Does anybody > test on these? I'd say that 2.3 support is still important, but if no > one's checking for 2.1 or 2.2 compatibility, should we bother claiming > they're suppported? The information in INSTALL.txt is out-of-date. 
Its header should point to http://www.scipy.org/documentation/buildscipy.txt that contains (at least in theory) up-to-date information on scipy prerequisites. Right now these hold for scipy, not for scipy_core, i.e. Numeric3. > Right now, for instance, scipy.base won't compile with 2.2. Not surprising. I suggest supporting only Python 2.3 and up. It would certainly be possible to support 2.2, but I, for instance, don't have the resources for it. Pearu From cimrman3 at ntc.zcu.cz Fri Sep 30 11:37:17 2005 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Fri, 30 Sep 2005 17:37:17 +0200 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <433CE24A.6040509@ee.byu.edu> References: <433CE24A.6040509@ee.byu.edu> Message-ID: <433D5BAD.8030801@ntc.zcu.cz> Hi all, I would like to know what is the status of the sparse matrix support in SciPy. Searching the scipy- mailing lists did not reveal anything recent. Is there some code to work on? I would like to help - though I am a newbie to the Scipy world, I do have experience with sparse matrices. If there is nothing yet, a quick possibility would be to wrap a library like sparskit2 and give it a pythonic feel. What are your opinions? Now when the new scipy is on the way and new types are being introduced, it could be a good time for a decent sparse matrix type too. -- Robert Cimrman From oliphant at ee.byu.edu Fri Sep 30 12:34:21 2005 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Fri, 30 Sep 2005 10:34:21 -0600 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <433D5BAD.8030801@ntc.zcu.cz> References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> Message-ID: <433D690D.7060008@ee.byu.edu> Robert Cimrman wrote: > Hi all, > > I would like to know what is the status of the sparse matrix support > in SciPy. Searching the scipy- mailing lists did not reveal anything > recent. Is there some code to work on? I would like to help - though I > am a newbie to the Scipy world, I do have experience with sparse > matrices. > > If there is nothing yet, a quick possibility would be to wrap a > library like sparskit2 and give it a pythonic feel. What are your > opinions? > > There is a sparse matrix in SciPy now: under Lib/sparse.
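To make the "pythonic feel" idea concrete, here is a toy dictionary-of-keys matrix; it is only an illustration of the kind of interface under discussion, not the Lib/sparse code:

# toy_sparse.py -- illustrative dictionary-of-keys sparse matrix; only the
# nonzero entries are stored.  Not the Lib/sparse implementation.
class dok_matrix_sketch:
    def __init__(self, shape):
        self.shape = shape
        self.data = {}                 # (row, col) -> value

    def __setitem__(self, key, value):
        if value != 0:
            self.data[key] = value
        elif key in self.data:
            del self.data[key]

    def __getitem__(self, key):
        return self.data.get(key, 0.0)

    def matvec(self, x):
        """Multiply by a dense vector given as a plain Python list."""
        y = [0.0] * self.shape[0]
        for (i, j), v in self.data.items():
            y[i] += v * x[j]
        return y
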
I've implemented about 3 or 4 sparse >matrices over the years, using various libraries as >back-ends. Right now, the sparse matrix code builds on >top of superlu and some home-grown caclulation tools. > I'd love some help with this. So, take a look at what's >there and see how you can help. > > -Travis How about ARPACK (http://www.caam.rice.edu/software/ARPACK/) ? It would be great to have eigs in scipy as well. Nils > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev From jswhit at fastmail.fm Fri Sep 30 14:30:17 2005 From: jswhit at fastmail.fm (Jeff Whitaker) Date: Fri, 30 Sep 2005 12:30:17 -0600 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> Message-ID: <433D8439.8010005@fastmail.fm> Nils Wagner wrote: > > How about ARPACK (http://www.caam.rice.edu/software/ARPACK/) ? > It would be great to have eigs in scipy as well. > > Nils > Nils: Searching the scipy-user archives I found: http://jrfonseca.dyndns.org/work/phd/ I think the Scipy developers have their hands full right now. -Jeff -- Jeffrey S. Whitaker Phone : (303)497-6313 Meteorologist FAX : (303)497-6449 NOAA/OAR/CDC R/CDC1 Email : Jeffrey.S.Whitaker at noaa.gov 325 Broadway Office : Skaggs Research Cntr 1D-124 Boulder, CO, USA 80303-3328 Web : http://tinyurl.com/5telg From nwagner at mecha.uni-stuttgart.de Fri Sep 30 14:37:18 2005 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 30 Sep 2005 20:37:18 +0200 Subject: [SciPy-dev] sparse matrix support status In-Reply-To: <433D8439.8010005@fastmail.fm> References: <433CE24A.6040509@ee.byu.edu> <433D5BAD.8030801@ntc.zcu.cz> <433D690D.7060008@ee.byu.edu> <433D8439.8010005@fastmail.fm> Message-ID: On Fri, 30 Sep 2005 12:30:17 -0600 Jeff Whitaker wrote: > Nils Wagner wrote: > >> >> How about ARPACK >>(http://www.caam.rice.edu/software/ARPACK/) ? >> It would be great to have eigs in scipy as well. >> >> Nils >> > > Nils: > > Searching the scipy-user archives I found: > > http://jrfonseca.dyndns.org/work/phd/ > > I think the Scipy developers have their hands full right >now. > > -Jeff But it doesn't seem to be optimal ... These bindings weren't generated with any automatic binding generation tool. Even though I initially tried both PyFortran and f2py, both showed to be inappropriate to handle the specificity of the ARPACK API. ARPACK uses a reverse communication interface where basically the API sucessively returns to caller which must take some update steps, and re-call the API with most arguments untouched. The intelligent (and silent) argument conversions made by the above tools made very difficult to implement and debug the most simple example. Also, for large-scale problems we wouldn't want any kind of array conversion/transposing happening behind the scenes as that would completely kill performance. Therefore the bindings are faithfull to the original FORTRAN API and are not very friendly as is. Nevertheless a Python wrapper to these calls can easily be made, where all kind of type conversions and dummy-safe actions can be made. > > -- > Jeffrey S. 
Whitaker Phone : (303)497-6313 > Meteorologist FAX : (303)497-6449 > NOAA/OAR/CDC R/CDC1 Email : >Jeffrey.S.Whitaker at noaa.gov > 325 Broadway Office : Skaggs Research >Cntr 1D-124 > Boulder, CO, USA 80303-3328 Web : >http://tinyurl.com/5telg > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev From pearu at scipy.org Fri Sep 30 13:54:08 2005 From: pearu at scipy.org (Pearu Peterson) Date: Fri, 30 Sep 2005 12:54:08 -0500 (CDT) Subject: [SciPy-dev] Re: [SciPy-user] Release of SciPy Core 0.4 (Beta) In-Reply-To: References: <433CE24A.6040509@ee.byu.edu> Message-ID: On Fri, 30 Sep 2005, Rob Managan wrote: > OK, I am giving this a go. > > I seem to be not setting a flag about some math functions availability. > > Here are the details (aorry about the length of it but I snipped a lot!): > > OSX 10.3.9, Python 2.4.1 > > Installed g77v3.4-bin.tar.gz from http://hpc.sourceforge.net/ as described > into /usr/local/... > Installed fftw-3.0.1-fma.tar.gz from http://www.fftw.org/download.html into > ~/Documents/local/... where I have other libraries I have installed. > > Got scipy_core-0.4.1.tar.gz from > http://sourceforge.net/project/showfiles.php?group_id=1369&package_id=6222, > wanted to try out the new replacement for Numeric > > tar -zxvf scipy_core-0.4.1.tar.gz > cd scipy_core-0.4.1 > python setup.py build > gcc: build/src/scipy/base/src/umathmodule.c > In file included from build/src/scipy/base/src/umathmodule.c:8: > scipy/base/include/scipy/arrayobject.h:84: warning: redefinition of `ushort' > /usr/include/sys/types.h:82: warning: `ushort' previously declared here > scipy/base/include/scipy/arrayobject.h:85: warning: redefinition of `uint' > /usr/include/sys/types.h:83: warning: `uint' previously declared here > build/src/scipy/base/src/umathmodule.c: In function `nc_floor_quotl': > build/src/scipy/base/src/umathmodule.c:1257: warning: implicit declaration of > function `floorl' > build/src/scipy/base/src/umathmodule.c: In function `nc_sqrtl': > build/src/scipy/base/src/umathmodule.c:1269: warning: implicit declaration of > function `sqrtl' > ... > build/src/scipy/base/__umath_generated.c:171: error: initializer element is > not constant > build/src/scipy/base/__umath_generated.c:171: error: (near initialization for > `arccosh_data[2]') Detection of the availability of long functions is not working properly in scipy_core-0.4.1. This is fixed in svn. So, try newcore from svn. On windows XP box with latest mingw32 I have experienced similar problems with float versions of inverse hyperbolic functions: it appears that while asinhf is defined then atanhf is not, I am in a middle of debugging process with it right now.. Pearu
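For what it's worth, that sort of availability detection can be scripted with the stock distutils config command. A sketch of such a configure-time probe (Unix-ish, links against libm; the function list is only an example):

# probe_math.py -- sketch of a configure-time check for C math functions,
# in the spirit of the availability detection discussed above.
from distutils.core import Distribution
from distutils.command.config import config

def have_func(name):
    """Return true if `name` can be declared, called and linked against libm."""
    cmd = config(Distribution())
    cmd.ensure_finalized()
    # decl=1 supplies our own declaration and call=1 emits a call, so a
    # missing symbol shows up as a link failure instead of a mere warning.
    return cmd.check_func(name, libraries=['m'], decl=1, call=1)

if __name__ == '__main__':
    for name in ['atanhf', 'asinhf', 'floorl', 'sqrtl']:
        print name, (have_func(name) and 'found' or 'missing')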