From oliphant.travis at ieee.org Fri Nov 1 00:57:20 2002
From: oliphant.travis at ieee.org (Travis Oliphant)
Date: 31 Oct 2002 22:57:20 -0700
Subject: [SciPy-dev] Statistics toolbox and nans
Message-ID: <1036130242.16585.8.camel@travis.local.net>

Hello developers.

What should we do about nans and the stats toolbox? Stats is one package where people may use nans to represent missing values. There are two options that I see.

1) MATLAB option

MATLAB defines 6 new functions nanmean, nanmedian, nansum, nanmin, nanmax, and nanstd that ignore nans properly. These can be used in place of the normal functions, which don't handle nans properly. Perhaps they did this as an afterthought. Note, this is an easy option and is (as of now) implemented in the CVS scipy. Other stats functions may or may not handle nans properly.

2) Integrated option

All stats functions handle nans properly.

The drawback to Option 2, which is otherwise the easier one to explain, is that every function is saddled with isnan checking, which may slow things down some. Following Knuth's policy of not optimizing prematurely, I tend toward number 2. Are there any other options anybody sees?

Thanks,

-Travis O.

From oliphant.travis at ieee.org Fri Nov 1 01:59:49 2002
From: oliphant.travis at ieee.org (Travis Oliphant)
Date: 31 Oct 2002 23:59:49 -0700
Subject: [SciPy-dev] Exact calculation
Message-ID: <1036133991.16586.12.camel@travis.local.net>

What is the feeling of this group about adding some kind of exact real number calculation module to scipy?

I know there is a wrapping of the GNU multiprecision library that is GPL. But there is an excellent self-contained module called real.py that is completely free (no expressed copyright) by Jurjen N.E. Bos (jurjen at q2c.nl). His module allows one to do infinite-digit calculations, which can come in very handy sometimes. Should we include a version of this module in scipy?

-Travis O.
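The kind of arbitrary-precision arithmetic real.py offers can be illustrated with Python's standard decimal module (a later addition to the language, so this is a sketch of the idea only, not of real.py's API):

```python
from decimal import Decimal, getcontext

# Ask for 50 significant digits instead of the ~16 a C double gives.
getcontext().prec = 50
third = Decimal(1) / Decimal(3)
print(third)  # 0.33333... carried out to 50 digits
```

gmpy's mpf type plays the same role with a binary representation, and is much faster.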
From kern at caltech.edu Fri Nov 1 03:47:49 2002 From: kern at caltech.edu (Robert Kern) Date: Fri, 1 Nov 2002 00:47:49 -0800 Subject: [SciPy-dev] Exact calculation In-Reply-To: <1036133991.16586.12.camel@travis.local.net> References: <1036133991.16586.12.camel@travis.local.net> Message-ID: <20021101084749.GA32134@taliesen.caltech.edu> On Thu, Oct 31, 2002 at 11:59:49PM -0700, Travis Oliphant wrote: > But, there is an excellent self-contained module called real.py that is > completely free (no expressed copyright) By Jurjen N.E. Bos > (jurjen at q2c.nl). By default, a work is copyrighted regardless of whether that copyright is expressed or claimed. I don't see anything in the documentation that actually gives permission to use it, much less copy or modify it. Of course, that's not what the author wants. It would be best to contact the author and ask him to agree to "rerelease" it under an actual license, Scipy's for preference. > His module allows one to to infinite digit calculations which can come > in very handy sometimes. Should we include a version of this module in > scipy? +1 Every time I use a new Python installation, I'm never sure if I've installed it or where to get it again. It's useful for when you are evaluating an expression a few times and want to get that extra digit or 20. > -Travis O. -- Robert Kern Ruddock House President kern at caltech.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter From rossini at blindglobe.net Fri Nov 1 09:15:03 2002 From: rossini at blindglobe.net (A.J. Rossini) Date: 01 Nov 2002 06:15:03 -0800 Subject: [SciPy-dev] Statistics toolbox and nans In-Reply-To: <1036130242.16585.8.camel@travis.local.net> References: <1036130242.16585.8.camel@travis.local.net> Message-ID: <871y651imw.fsf@jeeves.blindglobe.net> >>>>> "travis" == Travis Oliphant writes: travis> Hello developers. travis> What should we do about nan's and the stats toolbox. 
Stats is one
travis> package where people may use nans to represent missing values.

Yech. This is a hard issue, but NAN isn't the solution.

I believe someone who used to be tracking this list, John Barnard, had written some tools for implementing statistical procedures for handling missing values (imputation-based) in python. But I don't know the state of that work. (The issue being that one can want to distinguish between types of missing values.)

travis> There are two options that I see.
travis> 1) MATLAB option
travis> MATLAB defines 6 new functions nanmean, nanmedian, nansum, nanmin,
travis> nanmax, and nanstd that ignore nans properly. These can be used in
travis> place of the normal functions which don't use nans properly. Perhaps
travis> they did this as an afterthought.
travis> Note, this is an easy option and is (as of now) implemented in the CVS
travis> scipy.

Would it be possible, instead, to take the R approach, which is to have a missing-data handler? (i.e. mean(x,missing=missing.drop()), where the default is to drop, and the other options might be "replace with mean", "replace with random sample", "user defined function", "barf because we shouldn't compute with missings", etc.) Missing data handling is hard and, to be done right, needs to be handled flexibly.

best,
-tony

-- A.J. Rossini Rsrch. Asst. Prof. of Biostatistics U. of Washington Biostatistics rossini at u.washington.edu FHCRC/SCHARP/HIV Vaccine Trials Net rossini at scharp.org -------------- http://software.biostat.washington.edu/ ---------------- FHCRC: M: 206-667-7025 (fax=4812)|Voicemail is pretty sketchy/use Email UW: Th: 206-543-1044 (fax=3286)|Change last 4 digits of phone to FAX (my tuesday/wednesday/friday locations are completely unpredictable.)

From rossini at blindglobe.net Fri Nov 1 09:16:21 2002
From: rossini at blindglobe.net (A.J. Rossini)
Date: 01 Nov 2002 06:16:21 -0800
Subject: [SciPy-dev] Exact calculation
In-Reply-To: <20021101084749.GA32134@taliesen.caltech.edu>
References: <1036133991.16586.12.camel@travis.local.net> <20021101084749.GA32134@taliesen.caltech.edu>
Message-ID: <87wunxz87e.fsf@jeeves.blindglobe.net>

>>>>> "robert" == Robert Kern writes:

robert> By default, a work is copyrighted regardless of whether
robert> that copyright is expressed or claimed. I don't see

I know this holds true in the US, but does it hold true in Europe as well?

best,
-tony

-- A.J. Rossini Rsrch. Asst. Prof. of Biostatistics U. of Washington Biostatistics rossini at u.washington.edu FHCRC/SCHARP/HIV Vaccine Trials Net rossini at scharp.org -------------- http://software.biostat.washington.edu/ ---------------- FHCRC: M: 206-667-7025 (fax=4812)|Voicemail is pretty sketchy/use Email UW: Th: 206-543-1044 (fax=3286)|Change last 4 digits of phone to FAX (my tuesday/wednesday/friday locations are completely unpredictable.)

From skip at pobox.com Fri Nov 1 09:25:34 2002
From: skip at pobox.com (Skip Montanaro)
Date: Fri, 1 Nov 2002 08:25:34 -0600
Subject: [SciPy-dev] Exact calculation
In-Reply-To: <1036133991.16586.12.camel@travis.local.net>
References: <1036133991.16586.12.camel@travis.local.net>
Message-ID: <15810.36574.149986.667129@montanaro.dyndns.org>

Travis> What is the feeling of this group about adding some kind of
Travis> exact real number calculation module to scipy.
It's probably worth noting that there is a move afoot to include rational numbers in Python. You might want to skim the recent discussion on python-dev:

http://mail.python.org/pipermail/python-dev/2002-October/029051.htm

Also, the monthly summary:

http://mail.python.org/pipermail/python-dev/2002-October/thread.html

should get you to any separate threads that got disconnected from the original thread. Search for "Rational Numbers". Of course, you will probably want to read PEPs 239 and 240 as well:

http://www.python.org/peps/pep-0239.html
http://www.python.org/peps/pep-0240.html

As I recall, the consensus from the recent discussion seemed to be that rational numbers will get accepted, with syntactic sugar to follow later. I think a trailing 'r' seems to be the most likely syntactic specifier (unlike what was proposed in PEP 240). That would make "1r/3r", "1/3r" and "1r/3" all rational numbers representing 1/3. I don't recall if "0.33333r" would be valid as a rational representing a close approximation to 1/3 or not.

Skip

From pearu at cens.ioc.ee Fri Nov 1 09:54:04 2002
From: pearu at cens.ioc.ee (Pearu Peterson)
Date: Fri, 1 Nov 2002 16:54:04 +0200 (EET)
Subject: [SciPy-dev] Statistics toolbox and nans
In-Reply-To: <871y651imw.fsf@jeeves.blindglobe.net>
Message-ID:

On 1 Nov 2002, A.J. Rossini wrote:

> >>>>> "travis" == Travis Oliphant writes:
>
> travis> Hello developers.
> travis> What should we do about nan's and the stats toolbox. Stats is one
> travis> package where people may use nans to represent missing values.
>
> Yech. This is a hard issue, but NAN isn't the solution.

I think so too: using NaNs for representing missing values cannot be reliable. There's too much weirdness going on with NaNs depending on the local C library. For example, on linux

>>> nan=float('nan')
>>> nan==nan
1
>>> nan==1
1

while on Windows nan==1 returns 0, as I have been told.
See

http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&threadm=mailman.1035055286.17772.python-list%40python.org&rnum=6&prev=/groups%3Fq%3DPearu%2BPeterson%26hl%3Den%26lr%3D%26ie%3DUTF-8%26scoring%3Dd

Tim Peters has explained these NaN issues several times on Usenet; google for 'Tim Peters NaN'. Since "all IEEE-754 behavior visible from Python is a platform-dependent accident" [T.P.], I don't see that NaNs could be used in SciPy for anything useful in a platform-independent way. I would avoid using NaNs and Infs as much as possible until they become less platform-dependent, maybe by implementing special objects for Python instead of using float('nan'), float('inf') (which should not even work on Win32).

Pearu

From pearu at cens.ioc.ee Fri Nov 1 10:25:51 2002
From: pearu at cens.ioc.ee (Pearu Peterson)
Date: Fri, 1 Nov 2002 17:25:51 +0200 (EET)
Subject: [SciPy-dev] Exact calculation
In-Reply-To: <1036133991.16586.12.camel@travis.local.net>
Message-ID:

On 31 Oct 2002, Travis Oliphant wrote:

> What is the feeling of this group about adding some kind of
> exact real number calculation module to scipy.

I am positive about it, really.

> I know there is a wrapping of the gnu multiprecision library that is
> GPL.
>
> But, there is an excellent self-contained module called real.py that is
> completely free (no expressed copyright) By Jurjen N.E. Bos
> (jurjen at q2c.nl).

real.py looks very nice (shows errors, has various elementary functions, has docs but no unittests :(, etc.) but it is dead slow compared to GMPY. I would hope that in scipy we could have more efficient and longer-lasting tools for exact calculation than real.py.
For example,

peterson at utmws130:~/tmp > python bench_exact.py
real.py:101: DeprecationWarning: the regex module is deprecated; please use the re module
  import math, regex, marshal, string, sys
gmpy float multiplication took 3600.00 secs
real float multiplication took 107000.00 secs

where

# file bench_exact.py
from scipy_test.testing import jiffies
import gmpy
import real

gmpy.set_minprec(80)
real.floatPrecision = 80
n = 100000
a1 = 1/gmpy.mpf('3')
a2 = 1/real.r(3)

i = n
st = jiffies()
while i:
    i -= 1
    a1*a1
et = jiffies()
print 'gmpy float multiplication took %.2f secs' % (100*(et-st))

i = n
st = jiffies()
while i:
    i -= 1
    a2*a2
et = jiffies()
print 'real float multiplication took %.2f secs' % (100*(et-st))
# eof

From oliphant at ee.byu.edu Fri Nov 1 13:55:55 2002
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Fri, 1 Nov 2002 11:55:55 -0700 (MST)
Subject: [SciPy-dev] Statistics toolbox and nans
In-Reply-To:
Message-ID:

> On 1 Nov 2002, A.J. Rossini wrote:
>
> > >>>>> "travis" == Travis Oliphant writes:
> >
> > travis> Hello developers.
> > travis> What should we do about nan's and the stats toolbox. Stats is one
> > travis> package where people may use nans to represent missing values.
> >
> > Yech. This is a hard issue, but NAN isn't the solution.
>
> I think so too that using NANs for representing missing values cannot be
> reliable. There's too much weirdness going on with NaNs depending on the
> local C library. For example, on linux

Well, MATLAB is cross-platform and it uses NANs like this extensively, so I'm not sure I buy this argument.

> >>> nan=float('nan')
> >>> nan==nan
> 1
> >>> nan==1
> 1

So don't use nans that way. That's why we have isnan(x) to test where the nans are in an array. This function should work on the platforms where scipy works. I agree that equality testing of nans against another float should not be used in an algorithm.

> while on Windows nan==1 returns 0, as I have been told.
> See
>
> http://groups.google.com/groups?hl=en&lr=&ie=UTF-8&threadm=mailman.1035055286.17772.python-list%40python.org&rnum=6&prev=/groups%3Fq%3DPearu%2BPeterson%26hl%3Den%26lr%3D%26ie%3DUTF-8%26scoring%3Dd
>
> Tim Peters has explained these NaN issues several times on
> Usenet; google for 'Tim Peters NaN'.

Sure, but he hasn't gone into enough detail. Matlab successfully does it, so obviously it can be done (especially on modern machines that use IEEE 754).

> Since "all IEEE-754 behavior visible from Python is a platform-dependent
> accident" [T.P.], I don't see that NaNs could be used in SciPy for
> anything useful in a platform-independent way.
> I would avoid using NaNs
> and Infs as much as possible until they become less platform-dependent,
> maybe by implementing special objects for Python instead of using
> float('nan'), float('inf') (that even should not work on Win32).

Right now, to me this is a straw man (a hypothetical argument). We already are supporting nans in scipy. See what scipy_base.nan or scipy_base.inf gives you on your platform.

I would prefer specific examples that show where what scipy is doing now is not working on specific platforms that we want to support, rather than general arguments that refer to T.P.'s apparent distaste for nans. We have already borrowed heavily from the ideas T.P. espoused. Look deeper into scipy_base to see what I'm talking about. In short, I don't agree with the statements that nans don't or can't work.

Now, I agree that treating missing values using NaNs is somewhat of a kludge. And there are perhaps better ways to handle it. It is a rather efficient kludge that works much of the time. Even if you don't officially bless nans as "missing values," if they ever show up in your calculation, they essentially are missing values, and the question still remains as to how to deal with them (should you ignore them or let them ruin the rest of your calculation?)
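The nan-ignoring reductions discussed in this thread are straightforward to build from an isnan mask; a minimal sketch in modern numpy (the names differ from the 2002 scipy, so treat this as illustrative only):

```python
import numpy as np

def nanmean_sketch(x):
    # Drop NaN entries, then average what remains -- the behaviour
    # of the MATLAB-style nanmean discussed above.
    x = np.asarray(x, dtype=float)
    keep = ~np.isnan(x)
    if not keep.any():
        return float('nan')  # nothing but missing values
    return float(x[keep].mean())
```

The same mask works for nansum, nanmin, etc., which is why the MATLAB option is cheap to implement.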
-Travis

From oliphant at ee.byu.edu Fri Nov 1 14:01:04 2002
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Fri, 1 Nov 2002 12:01:04 -0700 (MST)
Subject: [SciPy-dev] Exact calculation
In-Reply-To:
Message-ID:

> real.py looks very nice (shows errors, has various elementary
> functions, has docs but no unittests:( etc.) but it is dead slow compared
> to GMPY. I would hope that in scipy we could have more efficient and
> longer lasting tools for exact calculation than real.py.

Thanks for the benchmarks. I agree that gmpy is altogether better than real.py. Its license is the LGPL. In theory this is not a problem for SciPy, because the LGPL does not limit you in any way except for changes made to gmpy itself. But Eric will have to speak to this. In practice, gmpy has a bit more baggage to carry around, whereas real is one file.

It would be nice to have on hand a multiple-precision library for calculating certain constants or functions, for example. I would be very happy if we went with gmpy.

-Travis O.

From kern at caltech.edu Fri Nov 1 14:30:59 2002
From: kern at caltech.edu (Robert Kern)
Date: Fri, 1 Nov 2002 11:30:59 -0800
Subject: [SciPy-dev] Exact calculation
In-Reply-To: <87wunxz87e.fsf@jeeves.blindglobe.net>
References: <1036133991.16586.12.camel@travis.local.net> <20021101084749.GA32134@taliesen.caltech.edu> <87wunxz87e.fsf@jeeves.blindglobe.net>
Message-ID: <20021101193059.GA3573@taliesen.caltech.edu>

On Fri, Nov 01, 2002 at 06:16:21AM -0800, A.J. Rossini wrote:
> >>>>> "robert" == Robert Kern writes:
>
> robert> By default, a work is copyrighted regardless of whether
> robert> that copyright is expressed or claimed. I don't see
>
> I know this holds true in the US, but does it hold true in Europe as
> well?

I believe it is true for all signatory countries of the Berne Convention, which includes most of Europe, IIRC. Besides, as long as the author's email still works, simply asking him should be an insignificant hassle.
> best,
> -tony

-- Robert Kern Ruddock House President kern at caltech.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter

From kern at caltech.edu Fri Nov 1 15:42:24 2002
From: kern at caltech.edu (Robert Kern)
Date: Fri, 1 Nov 2002 12:42:24 -0800
Subject: [SciPy-dev] Exact calculation
In-Reply-To: <87wunxz87e.fsf@jeeves.blindglobe.net>
References: <1036133991.16586.12.camel@travis.local.net> <20021101084749.GA32134@taliesen.caltech.edu> <87wunxz87e.fsf@jeeves.blindglobe.net>
Message-ID: <20021101204224.GA6623@taliesen.caltech.edu>

On Fri, Nov 01, 2002 at 06:16:21AM -0800, A.J. Rossini wrote:
> >>>>> "robert" == Robert Kern writes:
>
> robert> By default, a work is copyrighted regardless of whether
> robert> that copyright is expressed or claimed. I don't see
>
> I know this holds true in the US, but does it hold true in Europe as
> well?

My reading of

http://www.unesco.org/culture/copy/copyright/netherlands/netherlands.html

suggests that it holds true in the Netherlands. The law lists several specific instances where other individuals may copy a work, e.g. use by public officials and teachers, unless copyright is explicitly reserved. I believe other uses are still prohibited.

> best,
> -tony

-- Robert Kern Ruddock House President kern at caltech.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die." -- Richard Harter

From rossini at blindglobe.net Fri Nov 1 16:01:30 2002
From: rossini at blindglobe.net (A.J. Rossini)
Date: 01 Nov 2002 13:01:30 -0800
Subject: [SciPy-dev] Statistics toolbox and nans
In-Reply-To:
References:
Message-ID: <87fzulovh1.fsf@jeeves.blindglobe.net>

>>>>> "travis" == Travis Oliphant writes:

travis> Right now, to me this is a straw man (a hypothetical argument).

I agree (i.e. about NAN being the problem; it's not -- I'd probably complain about any value that could cause confusion).
travis> Now, I agree that treating missing values using NaNs is somewhat of a
travis> kludge. And there are perhaps better ways to handle it. It is a rather
travis> efficient kludge that works much of the time.

travis> Even if you don't officially bless nan's as "missing values," if they
travis> ever show up in your calculation, they essentially are missing values and
travis> the question still remains as to how to deal with them (should you ignore
travis> them or let them ruin the rest of your calculation?)

This is the crux of the issue -- from a statistical perspective (different from a numerical analyst's, from what I can tell), it would be important to flag different forms of missing data, in order to process them in different manners. Using a single NAN doesn't allow for this (i.e. numerical missingness vs. statistical missingness, both of which may be present depending on the data and the data-analysis algorithm used for processing).

best,
-tony

-- A.J. Rossini Rsrch. Asst. Prof. of Biostatistics U. of Washington Biostatistics rossini at u.washington.edu FHCRC/SCHARP/HIV Vaccine Trials Net rossini at scharp.org -------------- http://software.biostat.washington.edu/ ---------------- FHCRC: M: 206-667-7025 (fax=4812)|Voicemail is pretty sketchy/use Email UW: Th: 206-543-1044 (fax=3286)|Change last 4 digits of phone to FAX (my tuesday/wednesday/friday locations are completely unpredictable.)

From pearu at cens.ioc.ee Fri Nov 1 16:56:28 2002
From: pearu at cens.ioc.ee (Pearu Peterson)
Date: Fri, 1 Nov 2002 23:56:28 +0200 (EET)
Subject: [SciPy-dev] Statistics toolbox and nans
In-Reply-To:
Message-ID:

On Fri, 1 Nov 2002, Travis Oliphant wrote:

> > On 1 Nov 2002, A.J. Rossini wrote:
> >
> > > >>>>> "travis" == Travis Oliphant writes:
> > >
> > > travis> Hello developers.
> > > travis> What should we do about nan's and the stats toolbox. Stats is one
> > > travis> package where people may use nans to represent missing values.
> > >
> > > Yech. This is a hard issue, but NAN isn't the solution.
> >
> > I think so too that using NANs for representing missing values cannot be
> > reliable. There's too much weirdness going on with NaNs depending on the
> > local C library. For example, on linux
>
> Well, MATLAB is cross-platform and it uses NANs like this extensively. So
> I'm not sure I buy this argument.

The main problem is that Python does not support NANs. So, anything weird with NANs follows from that, like the example NaN==1 -> 1 on linux but 0 on win32. A few other examples:

>>> int(NaN)
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
OverflowError: float too large to convert
>>> NaN/0
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
ZeroDivisionError: float division

> > >>> nan=float('nan')
> > >>> nan==nan
> > 1
> > >>> nan==1
> > 1
>
> So don't use nan's that way. That's why we have isnan(x) to test where
> the nan's are in an array. This function should work on the platforms
> where scipy works.
>
> I agree that equality testing of nans against another float should not be
> used in an algorithm.

I know. But most scipy users will probably shoot themselves in the foot at least once when trying to use the NaN provided by scipy. Many of them will probably consider this a bug in SciPy/Python/... My point is that it's not good to provide support for something that does not work as expected. Sure, isnan(..) works, but everybody would assume that NaN==.. works as well. For example, the codelet

if r==2:
    r = 1
else:
    r = 2

would behave unexpectedly if r happens to be NaN: on Linux r becomes 1, while on Windows r becomes 2. To make the above codelet platform-independent, it should read

if r==2 and not isnan(r):
    r = 1
else:
    r = 2

Assuming that one is free to use NaN, or that scipy functions are expected to return NaN (sometimes), then all scipy and user code would have to be flooded with these isnan checks; otherwise these codes are potentially buggy.
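The guarded codelet above can be wrapped up for reuse; a sketch using math.isnan (which entered the standard library after this thread, so it stands in for scipy's isnan here):

```python
import math

def flip(r):
    # Without the isnan guard, r == 2 could evaluate either way for NaN
    # depending on the platform's C library; the guard pins down the result.
    if r == 2 and not math.isnan(r):
        return 1
    return 2
```

On a conforming IEEE-754 implementation NaN already compares unequal to everything, making the guard redundant there -- but it makes the intended treatment of NaN explicit.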
Note that the whole issue is just due to the fact that currently Python does not support NaN values, and trying to support them in Scipy is too painful if NaN is defined as in scipy_base. See below for a possible alternative.

> > Tim Peters has explained these NaN issues several times on
> > Usenet; google for 'Tim Peters NaN'.
>
> Sure, but he hasn't gone into enough detail. Matlab successfully does it
> so obviously it can be done (especially on modern machines that use
> IEEE754)

There's no doubt that it can be done; it just has not been done in Python yet. For more discussion about the state of Python's IEEE support, see

http://www.python.org/dev/summary/2000-10-1.html

> > Since "all IEEE-754 behavior visible from Python is a platform-dependent
> > accident" [T.P.], I don't see that NaNs could be used in SciPy for
> > anything useful in a platform-independent way.
> > I would avoid using NaNs
> > and Infs as much as possible until they become less platform-dependent,
> > maybe by implementing special objects for Python instead of using
> > float('nan'), float('inf') (that even should not work on Win32).
>
> Right now, to me this is a straw man (a hypothetical argument).
>
> We already are supporting nan's in scipy. See what scipy_base.nan or
> scipy_base.inf gives you on your platform.
>
> I would prefer specific examples that show where what scipy is doing now
> is not working on specific platforms that we want to support, rather than
> general arguments that refer to T.P.'s apparent distaste for nan's. We
> have already borrowed heavily from the ideas T.P. espoused. Look deeper
> into scipy_base to see what I'm talking about.
>
> In short, I don't agree with the statements that nans don't or can't work.

nan's can work, but they do not work with the current Python; see the example above. I remember tracking down a difficult bug that turned out to be trivially caused by the NaN==1 -> 1 "feature".
It was difficult to track because all tests passed (seemingly), but actually one of the tests should have failed, as the bug was giving a NaN result. As a result I spent quite some time finding out which of the tests should have failed... And this convinced me not to use NaNs with the current Python.

> Now, I agree that treating missing values using NaNs is somewhat of a
> kludge. And there are perhaps better ways to handle it. It is a rather
> efficient kludge that works much of the time.
>
> Even if you don't officially bless nan's as "missing values," if they
> ever show up in your calculation, they essentially are missing values and
> the question still remains as to how to deal with them (should you ignore
> them or let them ruin the rest of your calculation?)

Interpretation of nan's is application-dependent. They can be interpreted as "missing values", but certainly they are not "essentially missing values". NaN means "Not a Number" or "Not any Number", which is given as the result of an operation when its argument is out of range. See

http://www.cs.berkeley.edu/~wkahan/ieee754status/ieee754.ps
http://grouper.ieee.org/groups/754/

Depending on the situation, certain nan's can be ignored, but nan's can also ruin whole calculations when ignored.

Here follows an idea of how to improve NaN support in scipy. Instead of using NaN=Inf-Inf as a definition of nan, let's define it as follows:

# File nan.py
class NaN_class(float):
    def __eq__(self,other):
        return 0
    __lt__ = __gt__ = __le__ = __ge__ = __eq__
    def __add__(self,other):
        return self
    __sub__ = __mul__ = __div__ = __pow__ = __radd__ = __rsub__ \
            = __rmul__ = __rdiv__ = __rpow__ = __add__
    def __neg__(self):
        return self
    __pos__ = __abs__ = __neg__
    def __str__(self):
        return 'NaN'
    def __repr__(self):
        return 'NaN'
    def __float__(self):
        return self
    def __int__(self):
        raise ValueError,'cannot convert NaN to integer'
    # ...
NaN = NaN_class()

def test():
    assert not NaN==1
    assert not NaN==NaN
    assert not NaN<1
    assert not NaN>1
    assert not NaN>=1.0
    assert not NaN<=1
    assert not 1<NaN
    assert not 1>NaN
    assert not 1<=NaN
    assert not 1>=NaN
    assert not NaN<NaN
    assert not NaN>NaN
    assert NaN+1 is NaN
    assert 1+NaN is NaN

if __name__ == "__main__":
    test()
# eof

Probably also Inf should be similarly constructed. In addition, these classes should be implemented in C so that other extension modules could use them. There are a number of other issues that I haven't considered (NaN's in Numeric arrays, complex NaN's, etc) and resolving all of them is not a trivial project.

Pearu

From oliphant at ee.byu.edu Fri Nov 1 19:32:03 2002
From: oliphant at ee.byu.edu (Travis Oliphant)
Date: Fri, 1 Nov 2002 17:32:03 -0700 (MST)
Subject: [SciPy-dev] Nan's
In-Reply-To:
Message-ID:

Hi all.

I think the issues with nans can be resolved. Matlab uses them. Octave uses them. It seems that scipy should be able to handle them properly on the same platforms where these codes work. IEEE has a well-defined standard for them that most compilers can support if configured correctly. The problem is when you use a compiler in a configuration where it intentionally breaks the IEEE standard.

For example, gcc CANNOT use the -ffast-math flag on a file where nan comparisons will be taking place. So, removing -ffast-math from a gcc-compiled file suddenly makes nan comparisons work correctly as defined by the IEEE standard.

I'm not as good at figuring out the distutils stuff as others on this list, so could somebody show me how to make sure that the flag -ffast-math is not turned on while compiling fastumathmodule.c (and some of the cephes library's files as well)? You can't expect the standard to work when you use a compiler in a way that intentionally breaks it.
-Travis

-- Travis Oliphant Assistant Professor 459 CB Electrical and Computer Engineering Brigham Young University Provo, UT 84602 Tel: (801) 422-3108 oliphant.travis at ieee.org

From pearu at cens.ioc.ee Sat Nov 2 03:42:24 2002
From: pearu at cens.ioc.ee (Pearu Peterson)
Date: Sat, 2 Nov 2002 10:42:24 +0200 (EET)
Subject: [SciPy-dev] Nan's
In-Reply-To:
Message-ID:

On Fri, 1 Nov 2002, Travis Oliphant wrote:

> I think the issues with nans can be resolved.

I hope so. I would very much like NaN and Infs to work under Python. Unfortunately there are a number of subtle Python-related issues with them that will nullify all their advantages if these issues are not resolved first. The main issue is related to platform-independence.

> Matlab uses them. Octave uses them.

I note again: the difference between these programs and Python is that they *support* IEEE while Python does not.

> It seems that scipy should be able to handle them properly on the same
> platforms where these codes work.

That is, scipy must implement the missing IEEE support, but it cannot rely on Python's default behaviour on IEEE stuff, as this behaviour currently *is* platform-dependent. For me at least, it is crucial that the result of r==1.0 be the same on all major platforms and for all floats (including NaN). Currently it is not, and therefore it is a source of difficult-to-find bugs if NaN is allowed.

> IEEE has a well-defined standard for them, also that most compilers can
> support if configured correctly. The problem is when you use a
> compiler in a configuration where it intentionally breaks IEEE standards.
>
> For example,
>
> gcc CANNOT use the -ffast-math flag on a file where nan comparisons will
> be taking place.
>
> So, removing -ffast-math from a gcc compiled file suddenly makes nan
> comparisons work correctly as defined by the IEEE

Nothing in distutils or scipy_distutils forces gcc to use the -ffast-math flag.
For example, the default flags used to compile C extension modules with the Debian-distributed Python are

  -DNDEBUG -g -O3 -Wall -Wstrict-prototypes

You can find out these flags by executing the following code:

#
from distutils import sysconfig
sysconfig._init_posix()
print sysconfig._config_vars['OPT']
#

The only way the -ffast-math flag can get into this option list is if whoever compiled Python specifically forced its use. Do Mandrake, Suse, Redhat, etc. use the -ffast-math flag for compiling Python? If yes, then I can implement hooks for scipy_distutils that will disable -ffast-math for scipy modules. If none of the Python distributors use the -ffast-math flag, then the users must recompile their Python without the -ffast-math flag, I presume.

> I'm not as good at figuring out the distutils stuff as others on this
> list, so could somebody show me how to make sure that the flag -ffast-math
> is not turned on while compiling fastumathmodule.c (and some of the
> cephes library's files as well).

A quick hack for removing the -ffast-math flag from OPT is

#
from distutils import sysconfig
save_init_posix = sysconfig._init_posix
def my_init_posix():
    save_init_posix()
    g = sysconfig._config_vars
    g['OPT'] = g['OPT'].replace('-ffast-math','')
sysconfig._init_posix = my_init_posix
#

which should be placed into one of the setup files (similar hooks should be implemented for _init_nt, _init_mac).

Pearu

From wagner.nils at vdi.de Sat Nov 2 16:24:01 2002 From: wagner.nils at vdi.de (Nils Wagner) Date: Sat, 02 Nov 2002 22:24:01 +0100 Subject: [SciPy-dev] ValueError: frames are not aligned Message-ID: <20021102213307.DC1783EADA@www.scipy.com>

Hi, I tried to compute the pseudo-inverse of a random matrix...
>>> scipy.__version__
'0.2.0_alpha_145.4411'
>>> from scipy import *
>>> a = rand(5,3)
>>> a
array([[ 0.81884205, 0.3865217 , 0.9218021 ],
       [ 0.99365133, 0.36057177, 0.64316332],
       [ 0.00589046, 0.56506401, 0.97917145],
       [ 0.43969154, 0.44840959, 0.26978564],
       [ 0.11854674, 0.56671292, 0.8699165 ]])
>>> ap = linalg.pinv2(a)
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "/usr/lib/python2.2/site-packages/scipy/linalg/basic.py", line 327, in pinv2
    return dot(tran(conj(vh)),tran(conj(u))*s[:,NewAxis])
ValueError: frames are not aligned

What is the reason for that? However, pinv works fine:

>>> ap = linalg.pinv(a)
>>> ap
array([[ 0.34069276, 0.65084773, -0.51301133, 0.26982581, -0.34844927],
       [-1.32339071, -0.14174113, 0.06124646, 2.61495834, 0.62720737],
       [ 0.88231874, -0.0535125 , 0.55819992, -1.55919289, 0.10939951]])
>>> dot(ap,a)
array([[ 1.00000000e+00, -6.69765892e-17, 1.14030963e-16],
       [ 7.31768704e-17, 1.00000000e+00, -2.01878445e-16],
       [ -2.80238310e-16, -1.01511817e-16, 1.00000000e+00]])
>>>

Nils

From pearu at cens.ioc.ee Sat Nov 2 19:32:18 2002 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Sun, 3 Nov 2002 02:32:18 +0200 (EET) Subject: [SciPy-dev] ValueError: frames are not aligned In-Reply-To: <20021102213307.DC1783EADA@www.scipy.com> Message-ID:

On Sat, 2 Nov 2002, Nils Wagner wrote:

> Hi,
>
> I tried to compute the pseudo-inverse of a random matrix...
>
> >>> scipy.__version__
> '0.2.0_alpha_145.4411'
>
> >>> from scipy import *
> >>> a = rand(5,3)
> >>> a
> array([[ 0.81884205, 0.3865217 , 0.9218021 ],
> [ 0.99365133, 0.36057177, 0.64316332],
> [ 0.00589046, 0.56506401, 0.97917145],
> [ 0.43969154, 0.44840959, 0.26978564],
> [ 0.11854674, 0.56671292, 0.8699165 ]])
> >>> ap = linalg.pinv2(a)
> Traceback (most recent call last):
> File "<stdin>", line 1, in ?
> File "/usr/lib/python2.2/site-packages/scipy/linalg/basic.py", line 327, in pinv2
> return dot(tran(conj(vh)),tran(conj(u))*s[:,NewAxis])
> ValueError: frames are not aligned
>
> What is the reason for that ?

It is now fixed in SciPy CVS.

Pearu

From travis at enthought.com Sat Nov 2 20:39:54 2002 From: travis at enthought.com (Travis N. Vaught) Date: Sat, 2 Nov 2002 19:39:54 -0600 Subject: [SciPy-dev] ValueError: frames are not aligned In-Reply-To: Message-ID:

All, I just 'kicked' the nightly build, so the updates should now be in the win32 binary download.

Travis

> -----Original Message-----
> From: scipy-dev-admin at scipy.net [mailto:scipy-dev-admin at scipy.net]On
> Behalf Of Pearu Peterson
> Sent: Saturday, November 02, 2002 6:32 PM
> To: scipy-dev at scipy.net
> Subject: Re: [SciPy-dev] ValueError: frames are not aligned
>
> On Sat, 2 Nov 2002, Nils Wagner wrote:
>
> > Hi,
> >
> > I tried to compute the pseudo-inverse of a random matrix...
> >
> > >>> scipy.__version__
> > '0.2.0_alpha_145.4411'
> >
> > >>> from scipy import *
> > >>> a = rand(5,3)
> > >>> a
> > array([[ 0.81884205, 0.3865217 , 0.9218021 ],
> > [ 0.99365133, 0.36057177, 0.64316332],
> > [ 0.00589046, 0.56506401, 0.97917145],
> > [ 0.43969154, 0.44840959, 0.26978564],
> > [ 0.11854674, 0.56671292, 0.8699165 ]])
> > >>> ap = linalg.pinv2(a)
> > Traceback (most recent call last):
> > File "<stdin>", line 1, in ?
> > File "/usr/lib/python2.2/site-packages/scipy/linalg/basic.py", line 327, in pinv2
> > return dot(tran(conj(vh)),tran(conj(u))*s[:,NewAxis])
> > ValueError: frames are not aligned
> >
> > What is the reason for that ?
>
> It is now fixed in SciPy CVS.
> > Pearu > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev > From oliphant at ee.byu.edu Mon Nov 4 20:20:59 2002 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 4 Nov 2002 18:20:59 -0700 (MST) Subject: [SciPy-dev] Mandrake 9.0 Message-ID: Hi all, I just compiled scipy on Mandrake 9.0 (with gcc-3.2) and it passed all 565 tests at level 10 scipy.test(10) I'd been wondering about gcc-3 compatibility, but I guess it's just the weave module that doesn't work. weave.test(10) gives 41 errors, but when I run as root I get only 1 error so it must be some problem with being able to access certain directories. -Travis O. From fperez at pizero.colorado.edu Mon Nov 4 20:27:21 2002 From: fperez at pizero.colorado.edu (Fernando Perez) Date: Mon, 4 Nov 2002 18:27:21 -0700 (MST) Subject: [SciPy-dev] Mandrake 9.0 In-Reply-To: Message-ID: On Mon, 4 Nov 2002, Travis Oliphant wrote: > I just compiled scipy on Mandrake 9.0 (with gcc-3.2) > > and it passed all 565 tests at level 10 > > scipy.test(10) > > I'd been wondering about gcc-3 compatibility, but I guess it's just the > weave module that doesn't work. > > weave.test(10) gives 41 errors, > > but > > when I run as root I get only 1 error so it must be some problem with > being able to access certain directories. I (for one) would appreciate if you could post when you get full success with Mandrake 9.0. I've been contemplating an upgrade but I don't want to break weave, which I need to _just work_. Cheers, f. From oliphant.travis at ieee.org Tue Nov 5 18:52:09 2002 From: oliphant.travis at ieee.org (Travis Oliphant) Date: 05 Nov 2002 16:52:09 -0700 Subject: [SciPy-dev] Mandrake 9.0 In-Reply-To: References: Message-ID: <1036540331.24645.33.camel@travis.local.net> On Mon, 2002-11-04 at 18:27, Fernando Perez wrote: > > I (for one) would appreciate if you could post when you get full success > with Mandrake 9.0. 
> I've been contemplating an upgrade but I don't want to
> break weave, which I need to _just work_.

I'm only getting one error with weave right now on Mandrake 9.0 (the other errors were due to permission problems when running as a regular user). Here's the error I'm getting (i.e. if you don't use what's being tested then you could still be O.K.):

/usr/lib/python2.2/site-packages/weave/blitz-20001213/blitz/ops.h:220: no match for `std::complex<double>& / double&' operator

ERROR: result[1:-1,1:-1] = (b[1:-1,1:-1] + b[2:,1:-1] + b[:-2,1:-1]
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python2.2/site-packages/weave/tests/test_blitz_tools.py", line 150, in check_5point_avg_2d
    self.generic_2d(expr)
....
....
....

(I've deleted quite a bit of the output. I think I've included the relevant gcc error).

-Travis O.

From eric at enthought.com Tue Nov 5 19:17:42 2002 From: eric at enthought.com (eric jones) Date: Tue, 5 Nov 2002 18:17:42 -0600 Subject: [SciPy-dev] Mandrake 9.0 In-Reply-To: <1036540331.24645.33.camel@travis.local.net> Message-ID: <004101c28529$ef20c210$8901a8c0@ERICDESKTOP>

It looks like blitz() is the only part that is failing, then. I'm not sure if there are any inline() tests that use converters.blitz. If there are and they are passing, then this indicates a problem with the blitz() function. If not, then I'm betting that there is an error with compiling blitz++ code on that platform. It would also mean I need to add one (an inline() test that uses converters.blitz). The C++ classes in weave have changed (much cleaner now), so current users will have to change their code a bit to get it to work.

Travis, can you set the verbose=2 flag in the test that is failing and send me the output from it? I seem to remember that the last tests I ran passed on RH 8.0.
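For context, the failing weave.blitz test compiles a 5-point averaging stencil like the one in the truncated ERROR line above. The same slice expression can be written in plain array code; a sketch using modern numpy rather than the Numeric/weave of the time, with the stencil inferred (and simplified) from the truncated expression:

```python
import numpy as np

def five_point_avg(b):
    # Average each interior cell with its four neighbours -- the same
    # stencil shape the weave.blitz test compiles to C++ with blitz++.
    result = np.zeros_like(b)
    result[1:-1, 1:-1] = (b[1:-1, 1:-1] + b[2:, 1:-1] + b[:-2, 1:-1]
                          + b[1:-1, 2:] + b[1:-1, :-2]) / 5.0
    return result

b = np.ones((4, 4))
print(five_point_avg(b)[1, 1])  # interior cells average to 1.0
```

The gcc error itself is about a missing division operator overload in blitz++'s expression templates for complex arrays, not about the stencil.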
Thanks,

Eric

----------------------------------------------
eric jones 515 Congress Ave
www.enthought.com Suite 1614
512 536-1057 Austin, Tx 78701

> -----Original Message-----
> From: scipy-dev-admin at scipy.net [mailto:scipy-dev-admin at scipy.net] On
> Behalf Of Travis Oliphant
> Sent: Tuesday, November 05, 2002 5:52 PM
> To: scipy-dev at scipy.net
> Subject: Re: [SciPy-dev] Mandrake 9.0
>
> On Mon, 2002-11-04 at 18:27, Fernando Perez wrote:
> >
> > I (for one) would appreciate if you could post when you get full success
> > with Mandrake 9.0. I've been contemplating an upgrade but I don't want
> to
> > break weave, which I need to _just work_.
>
> I'm only getting one error with weave right now on Mandrake 9.0 (the
> other errors were due to permission problems when running as a single
> user).
>
> Here's the error I'm getting (i.e. if you don't use what's being tested
> then you could still be O.K.)
>
> /usr/lib/python2.2/site-packages/weave/blitz-20001213/blitz/ops.h:220:
> no match for `std::complex<double>& / double&' operator
>
> ERROR: result[1:-1,1:-1] = (b[1:-1,1:-1] + b[2:,1:-1] + b[:-2,1:-1]
> ----------------------------------------------------------------------
> Traceback (most recent call last):
> File
> "/usr/lib/python2.2/site-packages/weave/tests/test_blitz_tools.py", line
> 150, in check_5point_avg_2d
> self.generic_2d(expr)
> ....
> ....
> ....
>
> (I've deleted quite a bit of the output. I think I've included the
> relevant gcc error).
>
> -Travis O.
> > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev From nwagner at mecha.uni-stuttgart.de Thu Nov 7 08:41:38 2002 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Thu, 07 Nov 2002 14:41:38 +0100 Subject: [SciPy-dev] NameError: global name 'asarray' is not defined Message-ID: <3DCA6D92.33B356B6@mecha.uni-stuttgart.de> Hi, I would like to write an array to an ascii stream l = integrate.odeint(F0, l0[0], t) file=open("eigenpath.dat",'w') io.write_array(file,l) Traceback (most recent call last): File "moody.py", line 87, in ? io.write_array(file,l) File "/usr/local/lib/python2.1/site-packages/scipy/io/array_import.py", line 385, in write_array arr = asarray(arr) NameError: global name 'asarray' is not defined Any suggestion ? Nils From oliphant at ee.byu.edu Thu Nov 7 11:49:41 2002 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Thu, 7 Nov 2002 09:49:41 -0700 (MST) Subject: [SciPy-dev] NameError: global name 'asarray' is not defined In-Reply-To: <3DCA6D92.33B356B6@mecha.uni-stuttgart.de> Message-ID: > Hi, > > I would like to write an array to an ascii stream > > l = integrate.odeint(F0, l0[0], t) > file=open("eigenpath.dat",'w') > io.write_array(file,l) > > Traceback (most recent call last): > File "moody.py", line 87, in ? > io.write_array(file,l) > File > "/usr/local/lib/python2.1/site-packages/scipy/io/array_import.py", line > 385, in write_array > arr = asarray(arr) > NameError: global name 'asarray' is not defined > Fixed in CVS. But, just go to the file /usr/local/lib/python2.1/site-packages/scipy/io/array_import.py and on the line that says from Numeric import XXXXX, XXXXX add asarray at the end. -Travis O. 
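The bug class behind that NameError is generic: a helper used inside a library function without being imported only fails when the function is first called. A minimal pure-Python sketch of the fixed write_array pattern (the list-based coercion below is a hypothetical stand-in for Numeric's asarray, for illustration only; it is not scipy's actual implementation):

```python
import io

def write_array(fileobj, arr):
    # Coerce the input to a rectangular list of rows -- this is the role
    # played by Numeric's asarray(), whose missing import in
    # scipy/io/array_import.py caused the NameError at call time.
    rows = [list(row) for row in arr]
    for row in rows:
        fileobj.write(" ".join("%g" % float(v) for v in row) + "\n")

buf = io.StringIO()
write_array(buf, [[1, 2.5], [3, 4]])
print(buf.getvalue(), end="")  # two whitespace-separated rows
```

Because the bad name sat inside a function body, `import scipy.io` succeeded and only the first call to write_array blew up, which is why the bug survived into a release.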
From nwagner at mecha.uni-stuttgart.de Fri Nov 8 05:04:27 2002 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 08 Nov 2002 11:04:27 +0100 Subject: [SciPy-dev] gplt and multiple windows Message-ID: <3DCB8C2B.EABEE01A@mecha.uni-stuttgart.de>

Hi, Is it possible to plot curves in two different "windows" using gplt.plot(t,y) ? Nils

From wagner.nils at vdi.de Sat Nov 9 14:51:36 2002 From: wagner.nils at vdi.de (Nils Wagner) Date: Sat, 09 Nov 2002 20:51:36 +0100 Subject: [SciPy-dev] Special matrices Message-ID: <20021109200120.1F18E3EACE@www.scipy.com>

Hi, I wonder if there will be modules for other special matrices in scipy for example Hilbert, Frank, Vandermonde matrix etc. Afaik only Toeplitz and Hankel matrices are supported by scipy. Any suggestion ? Nils

From wagner.nils at vdi.de Fri Nov 15 17:53:18 2002 From: wagner.nils at vdi.de (Nils Wagner) Date: Fri, 15 Nov 2002 23:53:18 +0100 Subject: [SciPy-dev] error: Input/output error Message-ID: <20021115230330.A9D603EACE@www.scipy.com>

Hi all, I try to install scipy via latest cvs. However python setup.py install --force failed

copying build/lib.linux-i686-2.2/scipy/stats/rv2.py -> /usr/lib/python2.2/site-packages/scipy/stats
error: Input/output error

Any idea ? Nils

From pearu at cens.ioc.ee Fri Nov 15 18:17:35 2002 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Sat, 16 Nov 2002 01:17:35 +0200 (EET) Subject: [SciPy-dev] error: Input/output error In-Reply-To: <20021115230330.A9D603EACE@www.scipy.com> Message-ID:

On Fri, 15 Nov 2002, Nils Wagner wrote:

> I try to install scipy via latest cvs.
> However python setup.py install --force failed
>
> copying build/lib.linux-i686-2.2/scipy/stats/rv2.py ->
> /usr/lib/python2.2/site-packages/scipy/stats
> error: Input/output error
>
> Any idea ?

I don't think that this is a SciPy issue. Anyway, the possible reasons for this kind of error are too numerous, so I'll not speculate any further with the data given. You can try to remove /usr/lib/python2.2/site-packages/scipy manually. For further hints, see google.

Pearu

From datafeed at SoftHome.net Sat Nov 16 00:25:47 2002 From: datafeed at SoftHome.net (M. Evans) Date: Fri, 15 Nov 2002 22:25:47 -0700 Subject: [SciPy-dev] Psycho about performance Message-ID: <3841201174.20021115222547@SoftHome.net>

Gotta love the name! Looks very interesting; perhaps SciPy could help with this baby while the author finishes his Ph.D. Nice loose MIT license.

-Mark

http://psyco.sourceforge.net/introduction.html

"The actual performance gains can be very large. For common code, expect at least a 2x speed-up, more typically 4x. But where Psyco shines is when running algorithmical code --- these are the first pieces of code that you would consider rewriting in C for performance. If you are in this situation, consider using Psyco instead! You might get 10x to 100x speed-ups. It is theoretically possible to actually speed up this kind of code up to the performance of C itself."

From datafeed at SoftHome.net Sat Nov 16 01:40:55 2002 From: datafeed at SoftHome.net (M. Evans) Date: Fri, 15 Nov 2002 23:40:55 -0700 Subject: [SciPy-dev] Fresco for Chaco In-Reply-To: <3841201174.20021115222547@SoftHome.net> References: <3841201174.20021115222547@SoftHome.net> Message-ID: <13145708815.20021115234055@SoftHome.net>

Have you considered Fresco in connection with the Chaco project?
http://wiki.fresco.org/index.cgi/FrescoVsX

Mark

From oliphant at ee.byu.edu Mon Nov 18 14:45:09 2002 From: oliphant at ee.byu.edu (Travis Oliphant) Date: Mon, 18 Nov 2002 12:45:09 -0700 (MST) Subject: [SciPy-dev] Changes to CVS In-Reply-To: Message-ID:

Because we are getting ready for releasing Scipy 0.2, I should make people aware that I'm in the process of re-writing some of the modules in the stats subpackage so that the random variable functions are implemented as classes.

Currently there is a separate function pdf, cdf, etc. for each random variable. The new design defines a class _gen and an instance of the class that has methods .pdf, .cdf, etc. This approach makes it very easy to add a new random variable by inheriting from the base class and over-riding either the ._pdf or ._cdf method (and others if you want speed improvements). Everything else can be defined from these in the generic case.

I'm testing the system right now and should be ready to check it in a week or two. (I'm also writing the discrete random variable part.)

-Travis O.

From eric at enthought.com Mon Nov 18 15:12:06 2002 From: eric at enthought.com (eric jones) Date: Mon, 18 Nov 2002 14:12:06 -0600 Subject: [SciPy-dev] Changes to CVS In-Reply-To: Message-ID: <001301c28f3e$c74753c0$8901a8c0@ERICDESKTOP>

> Because we are getting ready for releasing Scipy 0.2, I should make people
> aware that I'm in the process of re-writing some of the modules in the
> stats subpackage so that the random variable functions
> are implemented as classes.
>
> Currently there is a separate function pdf, cdf, etc. for
> each random variable. The new design defines a class _gen
> and an instance of the class that has methods .pdf,
> .cdf, etc.
>
> This approach makes it very
> easy to add a new random variable by inheriting from the base class and
> over-riding either the ._pdf, or ._cdf method (and others if you want
> speed improvements). Everything else can be defined from these in the
> generic case.
> I'm testing the system right now and should be ready to check it in a week
> or two. (I'm also writing the discrete random variable part.)

This sounds a lot cleaner. Good to hear it will be ready for SciPy-0.2.

eric

From pearu at cens.ioc.ee Mon Nov 18 20:03:09 2002 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Tue, 19 Nov 2002 03:03:09 +0200 (EET) Subject: [SciPy-dev] Building/distributing SciPy Message-ID:

Hi!

Recently we discussed SciPy packaging issues and reached some consensus on how to resolve them so that SciPy and its friends could be packaged without dependency conflicts. See the recent "Debian package(s)?" thread in the scipy-user mailing list. Now I have made a few steps towards implementing the required hooks and, before finishing the implementation, I'd like to sketch the new way of building, installing, and distributing SciPy and its friends in order to get some feedback, as there are still some open questions/choices that may need to be discussed.

Distributing
============

Creating tarballs from CVS Repositories
---------------------------------------

scipy_core:
  cd cvs/scipy
  python setup_scipy_core.py -f -t MANIFEST_scipy_core.in
  # -> SciPy_core-0.2.0_alpha_4.258.tar.gz
  # .. contains scipy_test and scipy_distutils Python packages
  # FINISHED

f2py:
  cd cvs/f2py2e
  python setup.py sdist -f
  # -> F2PY-2.23.190-1372.tar.gz
  # .. contains f2py2e Python package (without scipy_distutils)
  # TODO: disable scipy_distutils in setup.py

scipy:
  cd cvs/scipy
  python setup_scipy.py sdist -f
  # -> SciPy-0.2.0_alpha_150.4447.tar.gz
  # .. contains scipy Python package (without scipy_test, scipy_distutils, weave)
  # TODO: deprecate using setup.py

weave:
  cd cvs/scipy/weave
  python setup_weave.py sdist -f
  # -> weave-0.0.0.tar.gz
  # .. contains weave (without scipy_distutils, scipy_test, scipy_base)
  # TODO: ?? deprecate using setup.py
  # ?? start using __cvs_version__.py as other packages do to fix version numbering. Eric, what do you think?
All these tarballs are downloadable from scipy.org.

Questions: Do we need a tarball, say SciPy_all-0.2.0_alpha_150.4447.tar.gz, that contains all modules including scipy_core, f2py2e, scipy, weave, chaco, etc.? Will that simplify building SciPy for Windows platforms?

Recommendations for deb/rpm packagers. Tree of dependencies
------------------------------------------------------------------

python-scipy_core:
  python-dev
  python-NumPy

python-f2py:
  python-scipy_core
  C and F77 compilers

python-weave:
  python-scipy_core
  C and C++ compilers
  (is scipy_base actually required???)

python-scipy:
  python-f2py (required for building)
  atlas or lapack/blas
  fftw (optional)
  python-weave (optional or require???)
  gnuplot,wxPython (optional or require???, will chaco replace gplt,xplt,plt???)

Binaries
========

??? Build using SciPy_all-???.tar.gz? Note that python-scipy is the only package that contains extension modules; the other packages are pure Python and are actually not required for a python-scipy binary (except maybe scipy_test).

Installing
==========

Make sure that all SciPy prerequisites are installed and working properly.

From CVS repository
-------------------

f2py:
  cd cvs/f2py2e
  python setup.py install
  # TODO: remove scipy_distutils dependence from setup.py
  # Note that f2py2e can be installed before scipy_core, but not used
  # until scipy_core is installed.

scipy_core and scipy:
  cd cvs/scipy
  python setup_scipy_core.py install
  python setup_scipy.py install
  # TODO: deprecate using setup.py

weave:
  cd cvs/scipy/weave
  python setup_weave.py install
  # TODO: ?? deprecate using setup.py

From tarballs
-------------

Unpack the tarballs and execute the same commands as in "From CVS repository" in the corresponding directories.

From deb files
--------------

???

From rpm files
--------------

???

Windows installation
--------------------

???

Mac OS X installation
---------------------

???

Feel free to fill in or make suggestions for the ??? parts.
Regards,
Pearu

From kern at caltech.edu Tue Nov 19 01:28:55 2002 From: kern at caltech.edu (Robert Kern) Date: Mon, 18 Nov 2002 22:28:55 -0800 Subject: [SciPy-dev] Module Submission: Orthogonal Distance Regression Message-ID: <20021119062855.GA3999@taliesen.caltech.edu>

Some time ago, I wrote an extension module to use ODRPACK, a FORTRAN library that performs Orthogonal Distance Regression (ODR) as well as non-linear Ordinary Least-Squares (OLS) and implicit regression (fitting to, say, an ellipse).

ODRPACK is very flexible. It can handle multi-dimensional input and output variables and weighted observations (with correlations between dimensions). It solves the ODR problem, i.e. finding "parameter estimates that minimize the sum of the squares of the weighted orthogonal distances between each observed data point and the curve described by a nonlinear equation" [1], as well as the nonlinear OLS problem, where uncertainties on the input variables are assumed to be negligible. Further, the model equation can be implicit, e.g. x0^2+x1^2-r^2 = 0 will fit the data to a circle and estimate the radius. ODRPACK will estimate the uncertainties and covariances of the estimated parameters. ODRPACK has a flexible reporting system to generate reports on all stages of the computation.

SciPy issues: I have wrapped the base extension module in a set of classes to facilitate the handling of all the options. These classes model my use of the module: fitting experimental data and fiddling on the interactive interpreter to get it right. Perhaps this leads to a cumbersome API, or another one needs to be exposed in addition.

I believe I have followed the SciPy style guide except for the classes, which are MixedCase.

I think the docstrings are pretty thorough, but I might need to write a tutorial to get people jump-started.

Tests: I have tests based on ODRPACK's FORTRAN examples and some of my own data. They are not in unittest form yet, nor do they completely cover all features.
ODRPACK comes with a comprehensive FORTRAN testsuite which could be converted to Python, but it will be a fair bit of work.

I use Numeric's LinearAlgebra to do a matrix inversion. If adopted into SciPy, that would change to scipy.inv.

Availability: http://starship.python.net/crew/kernr/source/odr-0.6.tar.gz
License: Public Domain

References: [1] http://www.boulder.nist.gov/mcsd/Staff/JRogers/odrpack.html

ODRPACK comes from http://www.netlib.org/odrpack/index.html and is public domain.

--
Robert Kern
Ruddock House President
kern at caltech.edu

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
-- Richard Harter

From falted at openlc.org Tue Nov 19 05:10:23 2002 From: falted at openlc.org (Francesc Alted) Date: Tue, 19 Nov 2002 11:10:23 +0100 Subject: [SciPy-dev] Building/distributing SciPy In-Reply-To: References: Message-ID: <20021119101023.GB1164@openlc.org>

On Tue, Nov 19, 2002 at 03:03:09AM +0200, Pearu Peterson wrote:

> Questions: Do we need a tarball, say
> SciPy_all-0.2.0_alpha_150.4447.tar.gz,
> that contains all modules including scipy_core, f2py2e, scipy, weave,
> chaco, etc.? Will that simplify building SciPy for Windows platforms?

For Windows I don't know, but for Debian (and maybe also for RPM) definitely yes, IMO. In Debian it is quite typical to derive multiple binary packages from only one source tarball. That way, you can group Debian scripts better, avoid redundancies and improve maintenance. You are doing a very good job organizing the scipy sources; this task may seem quite tough but it is very important to spread scipy. Go ahead.
Cheers,

--
Francesc Alted
PGP KeyID: 0x61C8C11F
Scientific applications developer
Public PGP key available: http://www.openlc.org/falted_at_openlc.asc
Key fingerprint = 1518 38FE 3A3D 8BE8 24A0 3E5B 1328 32CC 61C8 C11F

From eric at enthought.com Tue Nov 19 10:38:19 2002 From: eric at enthought.com (eric jones) Date: Tue, 19 Nov 2002 09:38:19 -0600 Subject: [SciPy-dev] Module Submission: Orthogonal Distance Regression In-Reply-To: <20021119062855.GA3999@taliesen.caltech.edu> Message-ID: <004c01c28fe1$b2d3e540$8901a8c0@ERICDESKTOP>

Hey Robert,

Thanks for the submission! I had to make a few changes to setup.py and test.py to get it flying on windows. The diffs are below. Here's a change summary: I turned off blas optimization, stuck an import of mingw32_support at the top of the file, and changed the odr package path from '' to '.'. On test.py, there was a syntax error on line 64 -- it needed a line continuation character.

As we move forward with this, should we group the odr tools in a package with other data fitting tools such as leastsq fit, spline fits, etc.? While they use different methods, they all serve similar purposes.

eric

-----------------------------------------------------------------------
C:\third\odr-0.6>diff -c setup.py setup_eric.py
*** setup.py Sun Nov 10 01:28:40 2002
--- setup_eric.py Tue Nov 19 09:16:19 2002
***************
*** 13,24 ****
  # separately-compiled LINPACK. Edit the Extension instances
  # below to reflect where your libraries are.
! opt_blas = 1
  # We need to compile FORTRAN files, so check if we would be
  # using gcc anyways: then we can use g77 with impunity.
  # Man, this is such a hack. I'm really, really sorry.
! cc = new_compiler()
  sysconfig.customize_compiler(cc)
--- 13,24 ----
  # separately-compiled LINPACK. Edit the Extension instances
  # below to reflect where your libraries are.
! opt_blas = 0
  # We need to compile FORTRAN files, so check if we would be
  # using gcc anyways: then we can use g77 with impunity.
  # Man, this is such a hack. I'm really, really sorry.
! from scipy_distutils import mingw32_support
  cc = new_compiler()
  sysconfig.customize_compiler(cc)
***************
*** 104,110 ****
  # platforms=["POSIX", "Windows"],
  packages=['odr'],
! package_dir={'odr': ''},
  ext_modules=[ext]
  )
--- 104,110 ----
  # platforms=["POSIX", "Windows"],
  packages=['odr'],
! package_dir={'odr': '.'},
  ext_modules=[ext]
  )
-----------------------------------------------------------------------
C:\third\odr-0.6\test>diff -c test.py test_eric.py
*** test.py Sun Nov 10 04:26:08 2002
--- test_eric.py Tue Nov 19 09:18:50 2002
***************
*** 61,67 ****
  # Implicit Example
  def impl_fcn(B, x, power=power):
! return B[2]*power(x[0]-B[0], 2) + 2.0*B[3]*(x[0]-B[0])*(x[1]-B[1]) + B[4]*power(x[1]-B[1], 2) - 1.0
  impl_mod = Model(impl_fcn, implicit=1,
  meta={'name': 'Sample Implicit Model',
--- 61,67 ----
  # Implicit Example
  def impl_fcn(B, x, power=power):
! return B[2]*power(x[0]-B[0], 2) + 2.0*B[3]*(x[0]-B[0])*(x[1]-B[1]) + \
! B[4]*power(x[1]-B[1], 2) - 1.0
  impl_mod = Model(impl_fcn, implicit=1,
  meta={'name': 'Sample Implicit Model',

----------------------------------------------
eric jones 515 Congress Ave
www.enthought.com Suite 1614
512 536-1057 Austin, Tx 78701

> -----Original Message-----
> From: scipy-dev-admin at scipy.net [mailto:scipy-dev-admin at scipy.net] On
> Behalf Of Robert Kern
> Sent: Tuesday, November 19, 2002 12:29 AM
> To: scipy-dev at scipy.org
> Subject: [SciPy-dev] Module Submission: Orthogonal Distance Regression
>
> Some time ago, I wrote an extension module to use ODRPACK, a FORTRAN
> library that performs Orthogonal Distance Regression (ODR) as well as
> non-linear Ordinary Least-Squares (OLS) and implicit regression (fitting
> to, say, an ellipse).
>
> ODRPACK is very flexible. It can handle multi-dimensional input
> and output variables, weighted observations (with correlations between
> dimensions). It solves the ODR problem, i.e.
finding "parameter > estimates that minimize the sum of the squares of the weighted > orthogonal distances between each observed data point and the curve > described by a nonlinear equation" [1] as well as the nonlinear OLS > problem, where uncertainties on the input variables are assumed to be > negligible. Further, the model equation can be implicit, e.g. > x0^2+x1^2-r^2 = 0 will fit the data to a circle and estimate the radius. > ODRPACK will estimate the uncertainties and covariances of the estimated > parameters. ODRPACK has a flexible reporting system to generate reports > on all stages of the computation. > > SciPy issues: > I have wrapped the base extension module in a set of classes to > facilitate the handling of all the options. These classes model my use > of the module: fitting experimental data and fiddling on the interactive > interpreter to get it right. Perhaps this is leads to a cumbersome API, > or another needs to be exposed in addition. > > I believe I have followed the SciPy style guide except for the classes, > which are MixedCase. > > I think the docstrings are pretty thorough, but I might need to write a > tutorial to get people jump-started. > > Tests: I have tests based on ODRPACK's FORTRAN examples and some of my > own data. They are not in unittest form yet, nor do they completely > cover all features. ODRPACK comes with a comprehensive FORTRAN testsuite > which could be converted to Python, but it will be a fair bit of work. > > I use Numeric's LinearAlgebra to do a matrix inversion. If adopted into > SciPy, that would change to scipy.inv . > > Availability: > http://starship.python.net/crew/kernr/source/odr-0.6.tar.gz > License: > Public Domain > > References: > [1] http://www.boulder.nist.gov/mcsd/Staff/JRogers/odrpack.html > > ODRPACK comes from http://www.netlib.org/odrpack/index.html and is > public domain. 
> > -- > Robert Kern > Ruddock House President > kern at caltech.edu > > "In the fields of hell where the grass grows high > Are the graves of dreams allowed to die." > -- Richard Harter > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.net > http://www.scipy.net/mailman/listinfo/scipy-dev From pearu at cens.ioc.ee Tue Nov 19 11:25:45 2002 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Tue, 19 Nov 2002 18:25:45 +0200 (EET) Subject: [SciPy-dev] Module Submission: Orthogonal Distance Regression In-Reply-To: <20021119062855.GA3999@taliesen.caltech.edu> Message-ID: On Mon, 18 Nov 2002, Robert Kern wrote: > Some time ago, I wrote an extension module to use ODRPACK, a FORTRAN > library that performs Orthogonal Distance Regression (ODR) as well as > non-linear Ordinary Least-Squares (OLS) and implicit regression (fitting > to, say, an ellipse). > > ODRPACK is very flexible. It can handle multi-dimensional input > and output variables, weighted observations (with correlations between > dimensions). It solves the ODR problem, i.e. finding "parameter > estimates that minimize the sum of the squares of the weighted > orthogonal distances between each observed data point and the curve > described by a nonlinear equation" [1] as well as the nonlinear OLS > problem, where uncertainties on the input variables are assumed to be > negligible. Further, the model equation can be implicit, e.g. > x0^2+x1^2-r^2 = 0 will fit the data to a circle and estimate the radius. > ODRPACK will estimate the uncertainties and covariances of the estimated > parameters. ODRPACK has a flexible reporting system to generate reports > on all stages of the computation. > > SciPy issues: > I have wrapped the base extension module in a set of classes to > facilitate the handling of all the options. These classes model my use > of the module: fitting experimental data and fiddling on the interactive > interpreter to get it right. 
Perhaps this leads to a cumbersome API, > or another needs to be exposed in addition. > > I believe I have followed the SciPy style guide except for the classes, > which are MixedCase. > > I think the docstrings are pretty thorough, but I might need to write a > tutorial to get people jump-started. Here are some more SciPy issues: - odrpack wrapper will probably work fine with the g77 compiler but not necessarily with other compilers (this is related to the Fortran-names-in-C issue). This can be easily solved using the F_FUNC macro though. - odrpack wrapper is handwritten and the code base is quite large. This may cause some maintenance issues. - Using the wrapper is sensitive to the Fortran/C multi-dimensional array issue (the different data ordering stuff). I think that the last one is the most serious one. Probably the easiest way to solve this is to use f2py generated wrappers. This would also reduce the maintenance issues. Otherwise, odrpack looks good and its proper place would probably be the interpolate module. I think that we should consider odrpack inclusion in SciPy after releasing SciPy-0.2, as the interpolate module also needs some revision and odrpack can be added while doing that. Thanks, Pearu From wagner.nils at vdi.de Tue Nov 19 15:24:45 2002 From: wagner.nils at vdi.de (Nils Wagner) Date: Tue, 19 Nov 2002 21:24:45 +0100 Subject: [SciPy-dev] Never change a running system Message-ID: <20021119203517.5629D3EACE@www.scipy.com> Hi, Due to some problems I have reinstalled scipy via latest cvs. python setup.py build works fine.
However python setup.py install failed:

  File "scipy_distutils/core.py", line 42, in setup
    return old_setup(**new_attr)
  File "/usr/lib/python2.2/distutils/core.py", line 138, in setup
    dist.run_commands()
  File "/usr/lib/python2.2/distutils/dist.py", line 893, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python2.2/distutils/dist.py", line 913, in run_command
    cmd_obj.run()
  File "/usr/lib/python2.2/distutils/command/install.py", line 495, in run
    self.run_command(cmd_name)
  File "/usr/lib/python2.2/distutils/cmd.py", line 330, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python2.2/distutils/dist.py", line 913, in run_command
    cmd_obj.run()
  File "/usr/lib/python2.2/distutils/command/install_lib.py", line 93, in run
    self.byte_compile(outfiles)
  File "/usr/lib/python2.2/distutils/command/install_lib.py", line 130, in byte_compile
    verbose=self.verbose, dry_run=self.dry_run)
  File "/usr/lib/python2.2/distutils/util.py", line 440, in byte_compile
    compile(file, cfile, dfile)
  File "/usr/lib/python2.2/py_compile.py", line 62, in compile
    codeobject = __builtin__.compile(codestring, dfile or file, 'exec')
TypeError: compile() argument 1 must be string without null bytes, not str

Any suggestion ? Nils

Python 2.2.1 (#1, Sep 10 2002, 17:49:17) [GCC 3.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.

From pearu at cens.ioc.ee Tue Nov 19 16:03:46 2002 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Tue, 19 Nov 2002 23:03:46 +0200 (EET) Subject: [SciPy-dev] Never change a running system In-Reply-To: <20021119203517.5629D3EACE@www.scipy.com> Message-ID: On Tue, 19 Nov 2002, Nils Wagner wrote: > Due to some problems I have reinstalled scipy via > latest cvs. python setup.py build works fine. > However python setup.py install failed The latest scipy CVS works fine here.
> File "/usr/lib/python2.2/py_compile.py", line 62, in compile > codeobject = __builtin__.compile(codestring, dfile or file, > 'exec') > TypeError: compile() argument 1 must be string without null bytes, > not str > > Any suggestion ? Clean up your scipy installation and try:

cd cvs/scipy
python setup_scipy_core.py install
python setup_scipy.py install

Pearu

From wagner.nils at vdi.de Tue Nov 19 16:51:00 2002 From: wagner.nils at vdi.de (Nils Wagner) Date: Tue, 19 Nov 2002 22:51:00 +0100 Subject: [SciPy-dev] python setup_scipy.py install Message-ID: <20021119220132.3CF4E3EACE@www.scipy.com> Pearu, python setup_scipy_core.py install works fine here. But python setup_scipy.py install failed again:

Traceback (most recent call last):
  File "setup_scipy.py", line 85, in ?
    setup_package()
  File "setup_scipy.py", line 76, in setup_package
    url = "http://www.scipy.org",
  File "scipy_distutils/core.py", line 42, in setup
    return old_setup(**new_attr)
  File "/usr/lib/python2.2/distutils/core.py", line 138, in setup
    dist.run_commands()
  File "/usr/lib/python2.2/distutils/dist.py", line 893, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python2.2/distutils/dist.py", line 913, in run_command
    cmd_obj.run()
  File "/usr/lib/python2.2/distutils/command/install.py", line 495, in run
    self.run_command(cmd_name)
  File "/usr/lib/python2.2/distutils/cmd.py", line 330, in run_command
    self.distribution.run_command(command)
  File "/usr/lib/python2.2/distutils/dist.py", line 913, in run_command
    cmd_obj.run()
  File "/usr/lib/python2.2/distutils/command/install_lib.py", line 93, in run
    self.byte_compile(outfiles)
  File "/usr/lib/python2.2/distutils/command/install_lib.py", line 130, in byte_compile
    verbose=self.verbose, dry_run=self.dry_run)
  File "/usr/lib/python2.2/distutils/util.py", line 440, in byte_compile
    compile(file, cfile, dfile)
  File "/usr/lib/python2.2/py_compile.py", line 62, in compile
    codeobject = __builtin__.compile(codestring, dfile or file, 'exec')
TypeError:
compile() argument 1 must be string without null bytes, not str Nils From dmorrill at enthought.com Tue Nov 19 18:10:12 2002 From: dmorrill at enthought.com (David C. Morrill) Date: Tue, 19 Nov 2002 17:10:12 -0600 Subject: [SciPy-dev] Tkinter/OpenGL version of Chaco/Kiva status and request for help Message-ID: <002801c29020$cffd9190$6501a8c0@Dave> First the good news: The Tkinter/OpenGL based version of Chaco/Kiva is very near completion. The major remaining work items are: 1) Convert Eric's wxPython based FreeType Kiva code to work with the OpenGL based version of Kiva. 2) Improve overall performance dramatically. Some people voiced a concern that the wxPython based version of Chaco seems a little slow. Well, compared to the current OpenGL version, it hums like a well-oiled machine :-( Hence the request for help... Most of the performance issues with the OpenGL version of Chaco have to do with interactive operations, like drawing the cursor crosshair lines or the graph and axis selection regions. In the wxPython version, we maintain an off-screen bitmap of the PlotWindow contents and either xor the cursor or selection regions into the screen or blit the off-screen bitmap onto the screen (e.g. to refresh the window when a region of the PlotWindow is exposed). Being fairly inexperienced with OpenGL, I have not been able to find a way to do something equivalent in OpenGL: - The OpenGL 'overlay' buffers do not appear to be supported by the version of 'togl' supplied with the current PyOpenGL package. - Trying to do a glReadPixels(...) call to read the contents of the OpenGL buffer seems to take about a second on a >1 GHz processor (hardly useful for interactive work). So the question is: Does anyone know how to efficiently 'rubberband' things in OpenGL? If the answer is to use the overlay buffer, does anyone know how to get it to work with a Tkinter based OpenGL widget? Thanks in advance for any help...
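[The xor-into-the-screen trick mentioned above works because XOR is self-inverse; a small NumPy stand-in for a framebuffer (an illustration only, not Chaco code) shows why drawing the band a second time restores the original pixels with no off-screen copy needed:]

```python
import numpy as np

# XOR rubberbanding: because a ^ b ^ b == a, drawing a band into the
# framebuffer with XOR and then drawing it again erases it exactly.
# (A uint8 NumPy array stands in for the framebuffer here.)
frame = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
band = np.zeros_like(frame)
band[1, :] = 0xFF          # a horizontal "rubberband" row
shown = frame ^ band       # band drawn
restored = shown ^ band    # band erased
print((restored == frame).all())  # True
```

[This is also the principle behind glLogicOp(GL_XOR) suggested later in this thread.]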
Dave Morrill -------------- next part -------------- An HTML attachment was scrubbed... URL: From kern at caltech.edu Wed Nov 20 00:42:32 2002 From: kern at caltech.edu (Robert Kern) Date: Tue, 19 Nov 2002 21:42:32 -0800 Subject: [SciPy-dev] Module Submission: Orthogonal Distance Regression In-Reply-To: <004c01c28fe1$b2d3e540$8901a8c0@ERICDESKTOP> References: <20021119062855.GA3999@taliesen.caltech.edu> <004c01c28fe1$b2d3e540$8901a8c0@ERICDESKTOP> Message-ID: <20021120054232.GA15250@taliesen.caltech.edu> On Tue, Nov 19, 2002 at 09:38:19AM -0600, eric jones wrote: > Hey Robert, > > Thanks for the submission! > > I had to make a few changes to setup.py and test.py to get it flying on > windows. The diffs are below. > > Here's a change summary: > I turned off blas optimization, stuck an import of mingw32_support at > the top of the file, and changed the odr package path from '' to '.'. Ah. I haven't tried installing it separately in a while. setup_odr.py allows it to be installed as a subpackage of scipy. I haven't done any testing on Windows. > On test.py, there was a syntax error on line 64 -- it needed a line > continuation character. Fixed. > As we move forward with this, should we group the odr tools in a package > with other data fitting tools such as leastsq fit, spline fits, etc.? > While they use different methods, they all serve similar purposes. The optimize package is probably the best place. > eric -- Robert Kern Ruddock House President kern at caltech.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die."
-- Richard Harter From kern at caltech.edu Wed Nov 20 03:58:44 2002 From: kern at caltech.edu (Robert Kern) Date: Wed, 20 Nov 2002 00:58:44 -0800 Subject: [SciPy-dev] Module Submission: Orthogonal Distance Regression In-Reply-To: References: <20021119062855.GA3999@taliesen.caltech.edu> Message-ID: <20021120085844.GB15250@taliesen.caltech.edu> On Tue, Nov 19, 2002 at 06:25:45PM +0200, Pearu Peterson wrote: [snip] > Here follows more SciPy issues: > - odrpack wrapper will probably work fine with g77 compiler but not > necessarily with other compilers (this is related to Fortran names in > C issue). This can be easily solved using F_FUNC macro though. I'll look into this. > - odrpack wraper is handwritten and the code base is quite large. This > may cause some maintainance issues. > - Using the wrapper is sensitive to Fortran/C multi-dimensional array > issue (the different data ordering stuff). > > I think that the last one is the most serious one. Probably the > easiest way to solve this is to use f2py generated wrappers. This would > also reduce the maintainance issues. There are a couple of reasons why I didn't use f2py. I do have an f2py version that mostly works, however, * ODRPACK allows "structured arguments." For example, say we have 5 observations of a 2-dimensional input variable and each observation has the same weights, but the second dimension is weighted twice as much as the first, we can either input wd=[1,2] or wd=[[1,1,1,1,1],[2,2,2,2,2]]. The flexibility of these arguments is difficult to recreate with f2py. My f2py implementation expands these arguments in Python to full arrays. * With my usage of the module, I prefer using arrays already in FORTRAN order. N=number of observations, M=number of dimensions per observation in the input variable. ODRPACK expects (N,M) FORTRAN arrays which are (M,N) Python arrays. These get passed to the model functions. 
For the most part, one wants to implement y=f(x) without explicit looping over each observation; I find it convenient to use x[i] rather than x[:,i]. I have to put explicit transposes to get the f2py'd code to accept the arrays I'm passing around. I think there's some combination of options to get f2py to do what I want, but I haven't found it yet. I could also be wrongheaded about my preferences. I don't think "N-first" indexing works well with Python's slicing abilities. * The FORTRAN driver routine takes a single function as a callback. This function calculates either the function, the Jacobian wrt parameters or the Jacobian wrt the input variables based on an integer argument. The f2py implementation generates a Python function which calls the appropriate Python functions (if supplied; exception if not provided; the Jacobians are optional). The handmade implementation has a C function that does the same thing. * This is the showstopper: one of the examples exhibits numerical problems in the f2py version and in no other. I have no explanation, but one of the Jacobians is consistently wrong and generates results that are incorrect beyond machine eps. The driver routine can check the provided Jacobians by numerically calculating the derivatives; they are flagged as suspect. > Otherwise, odrpack looks good and its proper place would probably be the > interpolate module. I think that we should consider odrpack inclusion > to SciPy after releasing SciPy-0.2 as also the interpolate module needs > some revision and odrpack can be added while doing that. I would have thought optimize. It does various forms of nonlinear least squares. I agree about waiting until post-SciPy-0.2. > Thanks, > Pearu -- Robert Kern Ruddock House President kern at caltech.edu "In the fields of hell where the grass grows high Are the graves of dreams allowed to die."
-- Richard Harter From j_r_fonseca at yahoo.co.uk Wed Nov 20 07:25:43 2002 From: j_r_fonseca at yahoo.co.uk (José Fonseca) Date: Wed, 20 Nov 2002 12:25:43 +0000 Subject: [SciPy-dev] Module Submission: Orthogonal Distance Regression In-Reply-To: <20021120085844.GB15250@taliesen.caltech.edu> References: <20021119062855.GA3999@taliesen.caltech.edu> <20021120085844.GB15250@taliesen.caltech.edu> Message-ID: <20021120122543.GB13652@localhost.localdomain> On Wed, Nov 20, 2002 at 12:58:44AM -0800, Robert Kern wrote: > On Tue, Nov 19, 2002 at 06:25:45PM +0200, Pearu Peterson wrote: > > - odrpack wrapper is handwritten and the code base is quite large. This > > may cause some maintenance issues. > > - Using the wrapper is sensitive to Fortran/C multi-dimensional array > > issue (the different data ordering stuff). > > > > I think that the last one is the most serious one. Probably the > > easiest way to solve this is to use f2py generated wrappers. This would > > also reduce the maintenance issues. > > There are a couple of reasons why I didn't use f2py. I do have an f2py > version that mostly works, however, > [...] > > * With my usage of the module, I prefer using arrays already in FORTRAN > order. N=number of observations, M=number of dimensions per > observation in the input variable. ODRPACK expects (N,M) FORTRAN > arrays which are (M,N) Python arrays. These get passed to the model > functions. For the most part, one wants to implement y=f(x) without > explicit looping over each observation; I find it convenient to use > x[i] rather than x[:,i]. I have to put explicit transposes to get > the f2py'd code to accept the arrays I'm passing around. I think > there's some combination of options to get f2py to do what I want, > but I haven't found it yet. > > I could also be wrongheaded about my preferences. I don't think > "N-first" indexing works well with Python's slicing abilities.
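[The ordering point in the quote above amounts to a transpose over a single buffer; this sketch uses modern NumPy, on the assumption that the Numeric of the era behaves analogously:]

```python
import numpy as np

# N=5 observations of an M=2 dimensional input variable, stored the
# way ODRPACK wants it: an (N, M) array in Fortran (column-major) order.
a = np.asfortranarray(np.arange(10.0).reshape(5, 2))

# The identical buffer read row-major, as Python/C does, is the (M, N)
# transpose -- which is why x[i] in a model function picks out one
# input dimension rather than one observation.
b = a.T
print(b.shape, b.flags['C_CONTIGUOUS'])  # (2, 5) True
```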
I felt exactly the same when generating a python binding to ARPACK (a large-scale eigenvalue solver, available at http://www.netlib.org/arpack), which I'm still finishing. I started by trying both pyfortran and f2py, but both failed to run the most simple example. ARPACK uses a "reverse communication" scheme where a subroutine is repeatedly called, updates all its arguments in each iteration, and asks the caller to perform a specific task. I could avoid all this by making a more "user friendly" wrapper to ARPACK, which would just take one or two matrices and spit out the eigenvalues/eigenvectors, but that would take away the possibility to do a lot of optimizations (in-place operations, matrix shapes and properties, etc). So I've decided to make the bindings by hand. And a "user-friendly" wrapper can still easily be made in python, doing all sorts of transposing/casting - for large scale problems, you don't want those to happen. By using some carefully coded C macros, all I need to do is CHECK_ARRAY_2D(v, PyArray_FLOAT, == ncv, >= n); to check that the array is a contiguous 2D array, check the dimensions, and output an error message if not. I'm still finishing it, but you can take a peek at http://jrfonseca.dyndns.org/work/phd/python/modules/ . BTW, this and other things (like Python never copying an object but giving a reference), gave a 10x boost to my original Matlab FE code, when implemented in Numeric Python. José Fonseca __________________________________________________ Do You Yahoo!? Everything you'll ever need on one web page from News and Sport to Email and Music Charts http://uk.my.yahoo.com From gregc at cgl.ucsf.edu Wed Nov 20 13:28:15 2002 From: gregc at cgl.ucsf.edu (Greg Couch) Date: Wed, 20 Nov 2002 10:28:15 -0800 (PST) Subject: [SciPy-dev] Re: [SciPy-user] Tkinter/OpenGL version of Chaco/Kiva status and request for help Message-ID: Just thought I'd clear up some of the OpenGL/Togl/Tk issues in your post.
Togl's overlay buffers need to be supported by the graphics card. Most PC graphics cards do not support overlay planes. You need a "workstation" card, such as an NVidia Quadro, a 3dLabs Wildcat, or an ATI FireGL card. glReadPixels is slow because graphics card vendors say there is no incentive to make it fast. Talk to your vendor. OpenGL rubberbanding is best done with glLogicOp(GL_XOR). Tcl/Tk is slow compared to WxWindows. If you want GUI speed, go back to WxPython. OpenGL is the same speed whether you are using WxWindows' wxGLCanvas or Tk's Togl. Hope this helps, Greg Couch UCSF Computer Graphics Lab From pearu at cens.ioc.ee Wed Nov 20 16:46:56 2002 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Wed, 20 Nov 2002 23:46:56 +0200 (EET) Subject: [SciPy-dev] Fire in UTwente Message-ID: Hi, We had a large fire in our department at UTwente that destroyed most of our servers and the uplink to the Internet. See e.g. http://slashdot.org/articles/02/11/20/132259.shtml for more info. The computer guys here have done an incredible job to get up another uplink to the Internet, but this may be a temporary solution. I use this chance to let you know that I might be offline for a few days until things stabilize here... Regards, Pearu From roybryant at seventwentyfour.com Wed Nov 20 22:21:05 2002 From: roybryant at seventwentyfour.com (Roy at SEVENtwentyfour Inc.) Date: Wed, 20 Nov 2002 22:21:05 -0500 Subject: [SciPy-dev] Broken link in www.scipy.org Message-ID: <20021121032950.DB2CD3EACE@www.scipy.com> An embedded and charset-unspecified text was scrubbed... Name: not available URL: From datafeed at SoftHome.net Thu Nov 21 03:32:23 2002 From: datafeed at SoftHome.net (M. Evans) Date: Thu, 21 Nov 2002 01:32:23 -0700 Subject: [SciPy-dev] DisplayPDF for Cross-platform GUI Widgets Message-ID: <6550109313.20021121013223@SoftHome.net> Please correct me if I'm wrong about anything - my understanding of Chaco is cursory. (I have read the web pages.)
I wish to present a concept and solicit your opinions. This is not a proposal for work under the Scipy banner, though it certainly could complement the effort. Chaco is a means to draw resolution-independent plots in "canvas windows" supplied by wxWindows, Tk, OpenGL, whatever. It is not a means to draw user interface widgets. Those are supplied by the host library. Chaco merely renders, from DisplayPDF model data, static views in "paint windows" supplied by the various libraries. This is my understanding. Now the idea: Use DisplayPDF to create a resolution-independent, cross-platform device context API in which to draw user widgets themselves. The idea resembles Chaco, except that the context also tracks OS events. (I assume Chaco doesn't do this without help from the host library.) The idea then is to eliminate the host library altogether. The device context layer serves as the foundation for a truly remarkable cross-platform GUI library. Remarkable in that the widgets are not defined in the traditional pixel-wise manner, but via scalable, parametric vectors and arcs. Native look & feel could be emulated a la Qt, http://linux.sarang.net/ftp/mirror/development/widget/qt/pdf/qt-whitepaper-v10.pdf The device context would naturally require ports to different OS APIs. However, it could instead ride on top of an existing pixel-based context like SDL or GGI. I know that Mac OS X utilizes Display PDF so perhaps the device context for that OS would be very "thin" -- while requiring more work on Windows and Xlib. The wxWindows folks are moving to wxUniversal which is akin to this idea except that (as far as I know) wxUniversal is going to use the traditional pixel-based definition of widgets. Instead of defining widgets as so many centimeters high and various scalability constraints, they will be so many pixels high. Comments, opinions? 
Mark From prabhu at aero.iitm.ernet.in Thu Nov 21 07:19:39 2002 From: prabhu at aero.iitm.ernet.in (Prabhu Ramachandran) Date: Thu, 21 Nov 2002 17:49:39 +0530 Subject: [SciPy-dev] DisplayPDF for Cross-platform GUI Widgets In-Reply-To: <6550109313.20021121013223@SoftHome.net> References: <6550109313.20021121013223@SoftHome.net> Message-ID: <15836.53083.201795.601913@monster.linux.in> >>>>> "ME" == M Evans writes: [snip] ME> Now the idea: Use DisplayPDF to create a ME> resolution-independent, cross-platform device context API in ME> which to draw user widgets themselves. The idea resembles ME> Chaco, except that the context also tracks OS events. (I ME> assume Chaco doesn't do this without help from the host ME> library.) You might want to take a look at Fresco (http://www.iuk.tu-harburg.de/Fresco/HomePage.html) and Berlin (http://www.berlin-consortium.org/). IIRC they have a similar idea. I could be wrong, though; it's been a while since I looked at these. cheers, prabhu From datafeed at SoftHome.net Thu Nov 21 17:07:42 2002 From: datafeed at SoftHome.net (M. Evans) Date: Thu, 21 Nov 2002 15:07:42 -0700 Subject: [SciPy-dev] Re: DisplayPDF for Cross-platform GUI Widgets In-Reply-To: <20021121180002.3525.81901.Mailman@shaft> References: <20021121180002.3525.81901.Mailman@shaft> Message-ID: <16712896203.20021121150742@SoftHome.net> sdrsn> ME> Now the idea: Use DisplayPDF to create a sdrsn> ME> resolution-independent, cross-platform device context API in sdrsn> ME> which to draw user widgets themselves. The idea resembles sdrsn> ME> Chaco, sdrsn> You might want to take a look at Fresco and Berlin sdrsn> IIRC they have a similar idea. sdrsn> cheers, sdrsn> prabhu Fresco and Mac OS X may be the only two systems that use such a concept. I already knew about Fresco, but it is only for Linux. It is not cross-platform. My question revolves around the cross-platform aspect of the idea.
I've never done DisplayPDF, and wanted a bit of feedback about this idea of different "back ends" for a DisplayPDF cross-platform graphics layer. There are many cross-platform widget toolkits. Some take a layered approach and some an emulating approach. The layered approach uses native OS widgets and tries to wrap them with a cross-platform API. An example would be wxWindows (not wxUniversal). The emulating approach draws its own widgets in a cross-platform device context. An example of that would be Qt. That is the approach which interests me, with the major difference that I want a vector-based device context, not a pixel-based context. It has to be more than just a graphics layer, since it must also track mouse and keyboard events, i.e. things like mouse dragging to enlarge a rectangle. What do you think about DisplayPDF's suitability for this purpose? Evidently Apple thinks highly of it, but then again, the Chaco pages explain that DisplayPDF has no real standard, so it takes some work to figure out. Mark From dmorrill at enthought.com Thu Nov 21 18:26:20 2002 From: dmorrill at enthought.com (David C. Morrill) Date: Thu, 21 Nov 2002 17:26:20 -0600 Subject: [SciPy-dev] Tkinter/OpenGL version of Chaco/Kiva status follow-up Message-ID: <009501c291b5$65f77720$6501a8c0@Dave> I just wanted to update everyone on the results of my recent request for help... With the kind help of both Perry Greenfield and especially Greg Couch, I have been able to resolve all of the major performance bottlenecks in the Tkinter/OpenGL Chaco/Kiva code. Yeah! The net result is that the OpenGL version now performs comparably to the wxPython version. While neither version is as fast as we (and users) would like, we think we know how to go after the remaining performance issues. However, we will probably wait until the code has matured a little more before doing any further optimization work. 
I just wanted to thank Perry and Greg for helping me with some nuances of OpenGL that had eluded and baffled me previously. Thanks a lot guys!!! Dave Morrill
From nwagner at mecha.uni-stuttgart.de Fri Nov 22 10:47:07 2002 From: nwagner at mecha.uni-stuttgart.de (Nils Wagner) Date: Fri, 22 Nov 2002 16:47:07 +0100 Subject: [SciPy-dev] [Fwd: [Numpy-discussion] Ann: ARPACK bindings to Numeric Python] Message-ID: <3DDE517B.64FF96E5@mecha.uni-stuttgart.de> Hi all, I wonder if ARPACK will be included in scipy ? Nils -------------- next part -------------- An embedded message was scrubbed... From: José Fonseca Subject: [Numpy-discussion] Ann: ARPACK bindings to Numeric Python Date: Fri, 22 Nov 2002 13:33:08 +0000 Size: 5102 URL: From j_r_fonseca at yahoo.co.uk Fri Nov 22 14:33:11 2002 From: j_r_fonseca at yahoo.co.uk (José Fonseca) Date: Fri, 22 Nov 2002 19:33:11 +0000 Subject: [SciPy-dev] [Fwd: [Numpy-discussion] Ann: ARPACK bindings to Numeric Python] In-Reply-To: <3DDE517B.64FF96E5@mecha.uni-stuttgart.de> References: <3DDE517B.64FF96E5@mecha.uni-stuttgart.de> Message-ID: <20021122193310.GA28583@localhost.localdomain> On Fri, Nov 22, 2002 at 04:47:07PM +0100, Nils Wagner wrote: > Hi all, > > I wonder if ARPACK will be included in scipy ? > > Nils > Date: Fri, 22 Nov 2002 13:33:08 +0000 > From: José Fonseca > Subject: [Numpy-discussion] Ann: ARPACK bindings to Numeric Python > To: numpy-discussion at lists.sourceforge.net > > I've made a Numeric Python binding of the ARPACK library. ARPACK is a [...] I decided not to announce it on scipy-dev because I'm not a scipy user yet. For the last month I've been porting all my MATLAB code written last year to Numeric Python and I'm still setting up a familiar environment. I did download CVS scipy though, mostly to study Travis Oliphant's work on sparse matrices.
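[For readers coming to this thread later: ARPACK did eventually become reachable from SciPy, as scipy.sparse.linalg.eigs/eigsh. A minimal sketch of that descendant interface, assuming a current SciPy install (the 2002 bindings differ in detail):]

```python
import numpy as np
from scipy.sparse.linalg import eigsh

# A small symmetric operator; eigsh drives ARPACK's reverse-communication
# routines internally, performing the repeated matrix-vector products
# the post describes on the caller's behalf.
A = np.diag([1.0, 2.0, 3.0, 4.0, 5.0])
vals, vecs = eigsh(A, k=2, which='LM')  # two largest-magnitude eigenvalues
print(sorted(vals))
```

[The low-level reverse-communication loop is hidden; callers who need the in-place control José describes can pass a LinearOperator instead of a matrix.]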
I must say that I find the scipy project most interesting and I plan to use it more in the future (therefore having subscribed to both the sci-user and sci-dev lists), but at the moment I'm still striving to get all my code to the point it was with MATLAB. If there is interest in a future inclusion of ARPACK in scipy I'll be glad to help. Also concerning sparse matrices, I'm writing bindings for UMFPACK 4.0 (sparse matrix LU factorization) and for the NIST Sparse BLAS Toolkit (for sparse matrix mat-mat and mat-vec operations), with the same ideals used in the ARPACK binding (i.e., focusing on matching the features of the package rather than being user-friendly). José Fonseca From wagner.nils at vdi.de Fri Nov 22 16:11:59 2002 From: wagner.nils at vdi.de (Nils Wagner) Date: Fri, 22 Nov 2002 22:11:59 +0100 Subject: [SciPy-dev] error: /tmp/__dummy.f: Permission denied Message-ID: <20021122212249.4E2843EACE@www.scipy.com> Hi all, I tried to build scipy via latest cvs. However python setup.py build failed:

using F2PY 2.23.190-1369
using F2PY 2.23.190-1369
using F2PY 2.23.190-1369
running build_flib
running find_fortran_compiler
error: /tmp/__dummy.f: Permission denied

Any idea ? Nils From pearu at cens.ioc.ee Fri Nov 22 16:33:37 2002 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Fri, 22 Nov 2002 23:33:37 +0200 (EET) Subject: [SciPy-dev] error: /tmp/__dummy.f: Permission denied In-Reply-To: <20021122212249.4E2843EACE@www.scipy.com> Message-ID: On Fri, 22 Nov 2002, Nils Wagner wrote: > running find_fortran_compiler > error: /tmp/__dummy.f: Permission denied > > Any idea ? Check your permissions to /tmp. Or maybe /tmp/__dummy.f already exists, created by root or another user, and you don't have permission to overwrite it. Or...
there are a number of other possible reasons to see this message that are probably not related to scipy at all. Pearu From bhoel at web.de Fri Nov 22 16:20:19 2002 From: bhoel at web.de (Berthold Höllmann) Date: 22 Nov 2002 22:20:19 +0100 Subject: [SciPy-dev] Re: error: /tmp/__dummy.f: Permission denied In-Reply-To: <20021122212249.4E2843EACE@www.scipy.com> References: <20021122212249.4E2843EACE@www.scipy.com> Message-ID: Nils Wagner writes: > Hi all, > > I tried to build scipy via latest cvs. > However python setup.py build failed > using F2PY 2.23.190-1369 > using F2PY 2.23.190-1369 > using F2PY 2.23.190-1369 > running build_flib > running find_fortran_compiler > error: /tmp/__dummy.f: Permission denied Somebody else ran f2py on that machine. Unfortunately f2py does not clean up this temporary file when it checks for an installed Fortran compiler. I'd say it's a bug in f2py. Greetings Berthold -- bhoel at web.de / http://starship.python.net/crew/bhoel/ It is unlawful to use this email address for unsolicited ads (USC Title 47 Sec.227). I will assess a US$500 charge for reviewing and deleting each unsolicited ad. From pearu at cens.ioc.ee Fri Nov 22 17:58:43 2002 From: pearu at cens.ioc.ee (Pearu Peterson) Date: Sat, 23 Nov 2002 00:58:43 +0200 (EET) Subject: [SciPy-dev] Re: error: /tmp/__dummy.f: Permission denied In-Reply-To: Message-ID: On 22 Nov 2002, Berthold Höllmann wrote: > Nils Wagner writes: > > > Hi all, > > > > I tried to build scipy via latest cvs. > > However python setup.py build failed > > using F2PY 2.23.190-1369 > > using F2PY 2.23.190-1369 > > using F2PY 2.23.190-1369 > > running build_flib > > running find_fortran_compiler > > error: /tmp/__dummy.f: Permission denied > > Somebody else ran f2py on that machine. Unfortunately f2py does not > clean up this temporary file when it checks for installed Fortran > compiler. I'd say it's a bug in f2py. Actually, it was a build_flib.py bug and it is now fixed in CVS.
Pearu From datafeed at SoftHome.net Sat Nov 23 23:17:10 2002 From: datafeed at SoftHome.net (M. Evans) Date: Sat, 23 Nov 2002 21:17:10 -0700 Subject: [SciPy-dev] Anti-Grain Geometry Message-ID: <4832724285.20021123211710@SoftHome.net> http://www.antigrain.com/agg_docs/doc_overview.html Interesting library with a liberal license. "Anti-Grain Geometry (AGG) is a platform independent C++ graphic rendering library that is designed to be small and efficient." Mark From eric at enthought.com Sun Nov 24 17:18:26 2002 From: eric at enthought.com (eric jones) Date: Sun, 24 Nov 2002 16:18:26 -0600 Subject: [SciPy-dev] Anti-Grain Geometry In-Reply-To: <4832724285.20021123211710@SoftHome.net> Message-ID: <000001c29407$6be3ac40$8901a8c0@ERICDESKTOP> Hey Mark, Thanks for the link. This looks *very* interesting for Kiva. It is path and affine transform based. It also supports alpha transparency, anti-aliasing, and non-zero and odd-even polygon fills. Best of all, it is small at 10K lines of code. Looking at its "path_storage" class, it has the following methods: move_to line_to curve3 curve4 Besides arc and arc_to, this covers all that Kiva needs. Anti-grain doesn't allow paths to have matrix transforms stored in them (which would provide a direct mapping to kiva), but this can probably be handled (or added) without much fuss. Adding anti-grain as a backend would provide a render engine that was completely independent of a GUI system. We have wanted this, but haven't wanted to do the work.:-) anti-grain, along with the freetype engine, would remove any reliance on the underlying platform for drawing. People wanting GIF would really like this. It would also make a non-OpenGL TkInter easy -- just blit in the buffer after anti-grain has rendered it. Anti-grain does use C++ templates, but compiles on MSVC 6.0 which means gcc probably doesn't have problems with it. It looks like anti-grain is mainly developed on windows. Has anyone tried it on Unix with gcc? 
I played with the C++ examples on windows, and they seemed very snappy.
I'm going to play around with it a little tonight to see if any
fundamental problems show up.

Thanks again,
Eric

----------------------------------------------
eric jones                 515 Congress Ave
www.enthought.com          Suite 1614
512 536-1057               Austin, Tx 78701

> -----Original Message-----
> From: scipy-dev-admin at scipy.net [mailto:scipy-dev-admin at scipy.net] On
> Behalf Of M. Evans
> Sent: Saturday, November 23, 2002 10:17 PM
> To: scipy-dev at scipy.net
> Subject: [SciPy-dev] Anti-Grain Geometry
>
>
> http://www.antigrain.com/agg_docs/doc_overview.html
>
> Interesting library with a liberal license.
>
> "Anti-Grain Geometry (AGG) is a platform independent C++ graphic
> rendering library that is designed to be small and efficient."
>
> Mark
>
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev at scipy.net
> http://www.scipy.net/mailman/listinfo/scipy-dev

From eric at enthought.com Sun Nov 24 19:04:19 2002
From: eric at enthought.com (eric jones)
Date: Sun, 24 Nov 2002 18:04:19 -0600
Subject: [SciPy-dev] Anti-Grain Geometry
In-Reply-To: <000001c29407$6be3ac40$8901a8c0@ERICDESKTOP>
Message-ID: <000001c29416$36fb85c0$8901a8c0@ERICDESKTOP>

Here is a quick example that demonstrates anti-grain using weave. I don't
quite grok the interface yet, but at least this works (on windows).
See ya,
Eric

----------------------------------------------------------------
# tst.py
# This will produce a grey filled triangle;
#
# C:\third\agg>python -i tst.py
# >>> import gui_thread
#
# >>> from scipy import plt
# >>> plt.image(res[:,:,0])

import os
from glob import glob
import weave

agg_path = "C:/third/agg"
src_files = glob(os.path.join(agg_path, 'src', '*.cpp'))
src_files += glob(os.path.join(agg_path, 'src', 'win32', '*.cpp'))
include_dirs = [os.path.join(agg_path, 'include')]
headers = [
    '"agg_basics.h"',
    '"agg_render_buffer.h"',
    '"agg_render_bgr24_solid.h"',
    '"agg_polyfill.h"',
    '"agg_affine_matrix.h"',
    '"agg_converters.h"',
    '"agg_pipe_converters.h"',
    '"win32/agg_win32_bmp.h"',
]

code = """
// Note: Need to make way for pixel_map not to own its memory.

// Here we create the rendering buffer using class agg::pixel_map
// which is not a part of AGG and provided only for your convenience
// if you program under the Windows platform.
// !! Make pointer so that memory isn't destroyed (fix leak later).
agg::pixel_map* pmap = new agg::pixel_map();
pmap->create(100, 100, agg::org_color24);

// Then create render_buffer and render_bgr24. The only purpose of the
// following four lines is to clear the buffer.
agg::render_buffer buf;
buf.attach(pmap->get_buf(), pmap->get_width(), pmap->get_height(),
           pmap->get_row_bytes());
agg::render_bgr24 rr(&buf);
rr.clear(agg::rgba8(255, 255, 255));

agg::color color = agg::bgr8_packed(0, 20);

// build a path
agg::render_bgr24_solid r(&buf);
agg::polyfill g_polyfill;
agg::path_storage g_path;
g_path.move_to(20, 20);
g_path.line_to(80, 80);
g_path.line_to(40, 80);

agg::pipe_path_storage p2(&g_path);   // Wrapper for path_storage
agg::pipe_conv_curve curve(&p2);      // Wrapper for conv_curve
agg::pipe_conv_polygon poly(&curve);  // Convert polygon to its polyfill,
                                      // used in trans2

// Below our pipeline divides into two different streams.
// One (trans1) transforms our polygon directly, accepting
// curve as a data source; the other (trans2) uses the polygon's
// outline to render a thin border.
agg::affine_matrix mtx;
agg::pipe_conv_transform trans1(&curve, &mtx);
agg::pipe_conv_transform trans2(&poly, &mtx);

//-------------------------------------
// Setting the thickness of the outline. It can be done anytime
// before rendering.
poly.set_thickness(2.0);

// Render the polygon itself
r.set_attribute(agg::rgba8(100, 200, 200, 255));
agg::make_polygon(&g_polyfill, &trans1, 0);
agg::render_polygon(&r, &g_polyfill);

// Render black outline
r.set_attribute(agg::rgba8(0, 0, 0, 200));
agg::make_polygon(&g_polyfill, &trans2, 0);
agg::render_polygon(&r, &g_polyfill);

//--------------------------------------------------------------------
// Get buffer from anti-grain lib and put it in a Numeric array.
//--------------------------------------------------------------------
// vars for building array
char real_type = 'b';  // UInt8  // & ~SAVESPACEBIT;
int nd = 3;
int d[3] = {100, 100, 3};
int sd, i;
PyArray_Descr *descr;
PyObject *op;
unsigned char* data;

// create descriptor
if ((descr = PyArray_DescrFromType((int)real_type)) == NULL)
    return NULL;

// calculate size of array
sd = descr->elsize;
for (i = nd - 1; i >= 0; i--) {
    if (d[i] < 0)
        throw_error(PyExc_ValueError,
                    "negative dimensions are not allowed");
    /* This may waste some space, but it seems to be
       (unsurprisingly) unhealthy to allow strides that
       are longer than sd. */
    sd *= d[i] ? d[i] : 1;
}

/* Make sure we're aligned on ints.
*/
sd += sizeof(int) - sd % sizeof(int);

op = PyArray_FromDimsAndDataAndDescr(nd, d, descr, (char*)pmap->get_buf());
// we own data
((PyArrayObject *)op)->flags |= OWN_DATA;
return_val = op;
"""

res = weave.inline(code, sources=src_files, headers=headers,
                   include_dirs=include_dirs, libraries=['gdi32'],
                   force=1)
print 'shape', res.shape
print 'pixel one:', res[0,0,:]

# C:\third\agg>python -i tst.py
# >>> import gui_thread
#
# >>> from scipy import plt
# >>> plt.image(res[:,:,0])
----------------------------------------------
eric jones                 515 Congress Ave
www.enthought.com          Suite 1614
512 536-1057               Austin, Tx 78701

> -----Original Message-----
> From: scipy-dev-admin at scipy.net [mailto:scipy-dev-admin at scipy.net] On
> Behalf Of eric jones
> Sent: Sunday, November 24, 2002 4:18 PM
> To: scipy-dev at scipy.net
> Subject: RE: [SciPy-dev] Anti-Grain Geometry
>
> Hey Mark,
>
> Thanks for the link. This looks *very* interesting for Kiva. It is
> path and affine transform based. It also supports alpha transparency,
> anti-aliasing, and non-zero and odd-even polygon fills. Best of all, it
> is small at 10K lines of code. Looking at its "path_storage" class, it
> has the following methods:
>
> move_to
> line_to
> curve3
> curve4
>
> Besides arc and arc_to, this covers all that Kiva needs. Anti-grain
> doesn't allow paths to have matrix transforms stored in them (which
> would provide a direct mapping to kiva), but this can probably be
> handled (or added) without much fuss.
>
> Adding anti-grain as a backend would provide a render engine that was
> completely independent of a GUI system. We have wanted this, but
> haven't wanted to do the work. :-) anti-grain, along with the freetype
> engine, would remove any reliance on the underlying platform for
> drawing. People wanting GIF would really like this. It would also make
> a non-OpenGL TkInter easy -- just blit in the buffer after anti-grain
> has rendered it.
>
> Anti-grain does use C++ templates, but compiles on MSVC 6.0, which means
> gcc probably doesn't have problems with it. It looks like anti-grain is
> mainly developed on windows. Has anyone tried it on Unix with gcc?
>
> I played with the C++ examples on windows, and they seemed very snappy.
> I'm going to play around with it a little tonight to see if any
> fundamental problems show up.
>
> Thanks again,
> Eric
>
> ----------------------------------------------
> eric jones                 515 Congress Ave
> www.enthought.com          Suite 1614
> 512 536-1057               Austin, Tx 78701
>
>
> > -----Original Message-----
> > From: scipy-dev-admin at scipy.net [mailto:scipy-dev-admin at scipy.net] On
> > Behalf Of M. Evans
> > Sent: Saturday, November 23, 2002 10:17 PM
> > To: scipy-dev at scipy.net
> > Subject: [SciPy-dev] Anti-Grain Geometry
> >
> >
> > http://www.antigrain.com/agg_docs/doc_overview.html
> >
> > Interesting library with a liberal license.
> >
> > "Anti-Grain Geometry (AGG) is a platform independent C++ graphic
> > rendering library that is designed to be small and efficient."
> >
> > Mark
> >
> > _______________________________________________
> > Scipy-dev mailing list
> > Scipy-dev at scipy.net
> > http://www.scipy.net/mailman/listinfo/scipy-dev
>
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev at scipy.net
> http://www.scipy.net/mailman/listinfo/scipy-dev

From skip at pobox.com Tue Nov 26 20:49:20 2002
From: skip at pobox.com (Skip Montanaro)
Date: Tue, 26 Nov 2002 19:49:20 -0600
Subject: [SciPy-dev] error: /tmp/__dummy.f: Permission denied
In-Reply-To:
References: <20021122212249.4E2843EACE@www.scipy.com>
Message-ID: <15844.9376.590677.919894@montanaro.dyndns.org>

    >> running find_fortran_compiler
    >> error: /tmp/__dummy.f: Permission denied

    Pearu> Check your permissions to /tmp.

I believe I solved this at one point but never checked in the fix. /tmp
shouldn't be used as a temporary directory during builds.
If you happen to run on a machine shared by multiple people, you're bound
to have problems. Many multi-user machines are administered so that anyone
can create files in /tmp but only the owner can modify them. Fiddling with
permissions on /tmp isn't a wise idea in such contexts.

Skip

From DavidA at ActiveState.com Fri Nov 29 16:48:29 2002
From: DavidA at ActiveState.com (David Ascher)
Date: Fri, 29 Nov 2002 13:48:29 -0800
Subject: [SciPy-dev] Re: [Scipy-cvs] world/kiva pilcore2d.py,NONE,1.1
In-Reply-To: <20021129215014.51DC53EACE@www.scipy.com>
References: <20021129215014.51DC53EACE@www.scipy.com>
Message-ID: <3DE7E0AD.4090500@ActiveState.com>

eric at ActiveState.com wrote:

> Update of /home/cvsroot/world/kiva
> In directory shaft:/tmp/cvs-serv18261
>
> Added Files:
> 	pilcore2d.py
> Log Message:
> Added David Ascher's first cut at a PIL backend for kiva.

I have some changes already. Should I send them to you, or do you want to
grant me checkin rights?

Note that you/we also need to define how to deal with fonts.

--david

From datafeed at SoftHome.net Sat Nov 30 11:40:53 2002
From: datafeed at SoftHome.net (M. Evans)
Date: Sat, 30 Nov 2002 09:40:53 -0700
Subject: [SciPy-dev] ROOT
In-Reply-To: <20021125180002.28669.72341.Mailman@shaft>
References: <20021125180002.28669.72341.Mailman@shaft>
Message-ID: <180949975.20021130094053@SoftHome.net>

http://root.cern.ch/root/

Seems similar to the SciPy effort, except that it's more C++-centric. They
have an embedded C++ interpreter (yuk, I'd rather use Python). Maybe some
room for collaboration?

Mark