From ralf.gommers at googlemail.com Wed Feb 1 01:33:45 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 1 Feb 2012 07:33:45 +0100 Subject: [SciPy-Dev] Scikit Signal or similar In-Reply-To: References: Message-ID: 2012/2/1 St?fan van der Walt > Hi Stuart > > On Tue, Jan 31, 2012 at 5:20 AM, Stuart Mumford > wrote: > > I am interested in contributing to the scikit-signal project, I have been > > working on a wavelet package recently which I believe would be useful. > > https://github.com/Cadair/scikit-signal > > We'd also be interested in having wavelet code in scikits-image > (http://skimage.org), since we need it for denoising (I was planning > on just incorporating pywavelets). An advantage is that you'd get a > "free" vehicle for distribution and packaging, but since we focus on > image processing, there may be reasons why you'd rather have it in a > stand-alone package. > Don't really understand the stand-alone bit, since that is a scikit-signal fork. But I think the above shows that this really belongs in scipy. I think we should either improve scipy.signal.wavelets or look at merging pywavelets into scipy. This particular wheel gets reinvented way too often. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From travis at continuum.io Wed Feb 1 02:53:26 2012 From: travis at continuum.io (Travis Oliphant) Date: Wed, 1 Feb 2012 01:53:26 -0600 Subject: [SciPy-Dev] Scikit Signal or similar In-Reply-To: References: Message-ID: <370248BF-77F2-40CC-888F-99AC5A518736@continuum.io> On Feb 1, 2012, at 12:33 AM, Ralf Gommers wrote: > > > 2012/2/1 St?fan van der Walt > Hi Stuart > > On Tue, Jan 31, 2012 at 5:20 AM, Stuart Mumford wrote: > > I am interested in contributing to the scikit-signal project, I have been > > working on a wavelet package recently which I believe would be useful. > > https://github.com/Cadair/scikit-signal > > We'd also be interested in having wavelet code in scikits-image > (http://skimage.org), since we need it for denoising (I was planning > on just incorporating pywavelets). An advantage is that you'd get a > "free" vehicle for distribution and packaging, but since we focus on > image processing, there may be reasons why you'd rather have it in a > stand-alone package. > > Don't really understand the stand-alone bit, since that is a scikit-signal fork. But I think the above shows that this really belongs in scipy. I think we should either improve scipy.signal.wavelets or look at merging pywavelets into scipy. This particular wheel gets reinvented way too often. > +1 --- I've wanted to see a wavelets package in SciPy for a long time. -Travis > Ralf > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From stuart at mumford.me.uk Wed Feb 1 04:53:47 2012 From: stuart at mumford.me.uk (Stuart Mumford) Date: Wed, 1 Feb 2012 09:53:47 +0000 Subject: [SciPy-Dev] scikit-signal or Similar Message-ID: Hello all, > *That's great. Have you been following the discussion that's happened > about this package earlier on this list? Here's a summary I made - > http://brocabrain.blogspot.in/2012/01/scikit-signal-python-for-signal.html > * > I have read the discussion and your blog post. I think that development in scikit-signal is a good thing as long as we keep open the possibility of merging bits (all) of it into other places later. 
It really depends on where the project goes, as you said in your blog post, I don't really understand the intricacies of namespaces either so I am just happy to work on some code. *But I think the above shows that this really belongs in scipy. I think we > should either improve scipy.signal.wavelets or look at merging pywavelets > into scipy. This particular wheel gets reinvented way too often.* > I agree that scipy.signal.wavelets needs improving, the reason myself and my friend started developing this wavelet code, was the only piece of Continuous Wavelet Transform code we could find was the piece we based what is now in the GitHub on. Even that had major omissions to what we needed and therefore we have spent time making the code fit our needs. As for pyWavelets that seems to be a good standalone project and appears to be good at Discrete Wavelet Transforms, which I have not looked into. Again there is no need to reinvent the wheel so I don't think implementing a DWT into SciPy is necessary, however with the amount of applications for CWT I feel it would be better off in SciPy when it is ready. *We'd also be interested in having wavelet code in scikits-image ( > http://skimage.org), since we need it for denoising * I am interested in this application for wavelets, and how to expand the current 1D CWT into 2D. Do you know the advantages / disadvantages for CWT / DWT for your applications? *Wow, I took a look at the wavelet.py code. I, for one would learn like to > learn from you. I want to learn to start coding like that.* > I am flattered! I have never been taught OOP I just sort of blunder through so I hope what I produce is decent code! *I don't think pyHHT will be a part of scikit-signal for some time, both > are projects in their infancy.* > pyHHT certainly needs a lot of work and scikit-signal needs some code, but I think eventually this could be the aim. *Right now I'm working on time-frequency analysis (for the scikit-signal).* > Cool, what? *Although HHT too is ultimately a tool for time-frequency analysis, we need > to create enough motivation for using the HHT over other conventional > methods.* > >From my (very limited) knowledge of what is good for what in signal processing, HHT to me is pretty impressive in what it can do. But as for using something like that it's all about what data you have and what you want to learn from it. I am studying the Sun for my PhD and the primary reason I wish to use HHT (well EMD) is to calculate the periods of oscillation in my very short data sets. [And I will go to extreme lengths to avoid using IDL] *But of course, as an independent project, you are welcome to contribute. > I've put a crude version up at https://github.com/jaidevd/pyhht* > I have already cloned it and started tinkering, but I need to do some more theoretical research first as I don't fully understand it. Stuart -------------- next part -------------- An HTML attachment was scrubbed... URL: From travis at continuum.io Wed Feb 1 10:10:29 2012 From: travis at continuum.io (Travis Oliphant) Date: Wed, 1 Feb 2012 09:10:29 -0600 Subject: [SciPy-Dev] scikit-signal or Similar In-Reply-To: References: Message-ID: <48B4808F-7FD1-46CE-B53A-423EB6D5576A@continuum.io> On Feb 1, 2012, at 3:53 AM, Stuart Mumford wrote: > Hello all, > > That's great. Have you been following the discussion that's happened about this package earlier on this list? 
Here's a summary I made - http://brocabrain.blogspot.in/2012/01/scikit-signal-python-for-signal.html > > I have read the discussion and your blog post. I think that development in scikit-signal is a good thing as long as we keep open the possibility of merging bits (all) of it into other places later. It really depends on where the project goes, as you said in your blog post, I don't really understand the intricacies of namespaces either so I am just happy to work on some code. > > But I think the above shows that this really belongs in scipy. I think we should either improve scipy.signal.wavelets or look at merging pywavelets into scipy. This particular wheel gets reinvented way too often. > > I agree that scipy.signal.wavelets needs improving, the reason myself and my friend started developing this wavelet code, was the only piece of Continuous Wavelet Transform code we could find was the piece we based what is now in the GitHub on. Even that had major omissions to what we needed and therefore we have spent time making the code fit our needs. > > As for pyWavelets that seems to be a good standalone project and appears to be good at Discrete Wavelet Transforms, which I have not looked into. Again there is no need to reinvent the wheel so I don't think implementing a DWT into SciPy is necessary, however with the amount of applications for CWT I feel it would be better off in SciPy when it is ready. The DWT is exactly the kind of tool SciPy needs. The goal would not be to re-invent DWT with SciPy, but simply integrate pywavelets into SciPy if that is at all possible. Having so many packages is good for developers, but not very good for consumers as people have to collect a lot of different packages together to get what they want. Some of this pressure is being alleviated by "distributions" of Python, and I expect that trend will continue. But, it is still useful for SciPy to grow "fundamental" libraries like a DWT. Thanks for your contributions an comments. Best regards, -Travis > > We'd also be interested in having wavelet code in scikits-image (http://skimage.org), since we need it for denoising > > I am interested in this application for wavelets, and how to expand the current 1D CWT into 2D. Do you know the advantages / disadvantages for CWT / DWT for your applications? > > Wow, I took a look at the wavelet.py code. I, for one would learn like to learn from you. I want to learn to start coding like that. > > I am flattered! I have never been taught OOP I just sort of blunder through so I hope what I produce is decent code! > > I don't think pyHHT will be a part of scikit-signal for some time, both are projects in their infancy. > > pyHHT certainly needs a lot of work and scikit-signal needs some code, but I think eventually this could be the aim. > > Right now I'm working on time-frequency analysis (for the scikit-signal). > > Cool, what? > > Although HHT too is ultimately a tool for time-frequency analysis, we need to create enough motivation for using the HHT over other conventional methods. > > From my (very limited) knowledge of what is good for what in signal processing, HHT to me is pretty impressive in what it can do. But as for using something like that it's all about what data you have and what you want to learn from it. I am studying the Sun for my PhD and the primary reason I wish to use HHT (well EMD) is to calculate the periods of oscillation in my very short data sets. 
[And I will go to extreme lengths to avoid using IDL] > > But of course, as an independent project, you are welcome to contribute. I've put a crude version up at https://github.com/jaidevd/pyhht > > I have already cloned it and started tinkering, but I need to do some more theoretical research first as I don't fully understand it. > > Stuart > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Wed Feb 1 15:43:52 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 1 Feb 2012 21:43:52 +0100 Subject: [SciPy-Dev] scikit-signal or Similar In-Reply-To: <48B4808F-7FD1-46CE-B53A-423EB6D5576A@continuum.io> References: <48B4808F-7FD1-46CE-B53A-423EB6D5576A@continuum.io> Message-ID: On Wed, Feb 1, 2012 at 4:10 PM, Travis Oliphant wrote: > > On Feb 1, 2012, at 3:53 AM, Stuart Mumford wrote: > > Hello all, > > >> *That's great. Have you been following the discussion that's happened >> about this package earlier on this list? Here's a summary I made - >> http://brocabrain.blogspot.in/2012/01/scikit-signal-python-for-signal.html >> * >> > > I have read the discussion and your blog post. I think that development in > scikit-signal is a good thing as long as we keep open the possibility of > merging bits (all) of it into other places later. It really depends on > where the project goes, as you said in your blog post, I don't really > understand the intricacies of namespaces either so I am just happy to work > on some code. > > *But I think the above shows that this really belongs in scipy. I think >> we should either improve scipy.signal.wavelets or look at merging >> pywavelets into scipy. This particular wheel gets reinvented way too often. >> * >> > > I agree that scipy.signal.wavelets needs improving, the reason myself and > my friend started developing this wavelet code, was the only piece of > Continuous Wavelet Transform code we could find was the piece we based what > is now in the GitHub on. Even that had major omissions to what we needed > and therefore we have spent time making the code fit our needs. > > There is http://projects.scipy.org/scipy/ticket/922 which the author has kept on developing despite the unfortunate lack of feedback. It looks quite far along and may be useful for you: https://github.com/lesserwhirls/scipy-cwt There is also a cwt function in signal.wavelets, but it's very limited. > As for pyWavelets that seems to be a good standalone project and appears > to be good at Discrete Wavelet Transforms, which I have not looked into. > Again there is no need to reinvent the wheel so I don't think implementing > a DWT into SciPy is necessary, however with the amount of applications for > CWT I feel it would be better off in SciPy when it is ready. > > Agreed. The DWT is exactly the kind of tool SciPy needs. The goal would not be to > re-invent DWT with SciPy, but simply integrate pywavelets into SciPy if > that is at all possible. Having so many packages is good for developers, > but not very good for consumers as people have to collect a lot of > different packages together to get what they want. Some of this pressure > is being alleviated by "distributions" of Python, and I expect that trend > will continue. But, it is still useful for SciPy to grow "fundamental" > libraries like a DWT. 
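For reference, a minimal sketch of the DWT interface that pywavelets already exposes (this assumes the pywt package is installed; the sample values are purely illustrative):

import pywt

# One level of the discrete wavelet transform with the Haar ('db1') wavelet:
# returns the approximation (cA) and detail (cD) coefficients.
data = [3.0, 7.0, 1.0, 1.0, -2.0, 5.0, 4.0, 6.0]
cA, cD = pywt.dwt(data, 'db1')

# The transform is invertible, so the original samples can be recovered.
reconstructed = pywt.idwt(cA, cD, 'db1')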
> > I'll send the pywavelets author an email, would be great to get his input on this. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Wed Feb 1 16:01:49 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 1 Feb 2012 14:01:49 -0700 Subject: [SciPy-Dev] Heads up and macro deprecation. In-Reply-To: References: Message-ID: Hi All, Two things here. 1) Some macros for threading and the iterator now require a trailing semicolon. This change will be reverted before the 1.7 release so that scipy 0.10 will compile, but because it is desirable in the long term it would be helpful if folks maintaining c extensions using numpy would try compiling them against current development and adding the semicolon where needed. The added semicolon will be backward compatible with earlier versions of numpy. 2) It is proposed to deprecate all of the macros in the old_defines.h file and require the use of their replacements. Numpy itself will have made this change after pull-189 is merged and getting rid of the surplus macros will help clean up the historical detritus that has built up over the years, easing maintenance, clarifying code, and making the eventual transition to 2.0 a bit easier. There is a sed script in the tools directory as part of the pull request that can be used to make the needed substitutions. Thoughts? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From cournape at gmail.com Wed Feb 1 17:43:24 2012 From: cournape at gmail.com (David Cournapeau) Date: Wed, 1 Feb 2012 22:43:24 +0000 Subject: [SciPy-Dev] Heads up and macro deprecation. In-Reply-To: References: Message-ID: On Wed, Feb 1, 2012 at 9:01 PM, Charles R Harris wrote: > Hi All, > > Two things here. > > 1) Some macros for threading and the iterator now require a trailing > semicolon. This change will be reverted before the 1.7 release so that scipy > 0.10 will compile, but because it is desirable in the long term it would be > helpful if folks maintaining c extensions using numpy would try compiling > them against current development and adding the semicolon where needed. The > added semicolon will be backward compatible with earlier versions of numpy. > > 2) It is proposed to deprecate all of the macros in the old_defines.h file > and require the use of their replacements. Numpy itself will have made this > change after pull-189 is merged and getting rid of the surplus macros will > help clean up the historical detritus that has built up over the years, > easing maintenance, clarifying code, and making the eventual transition to > 2.0 a bit easier. There is a sed script in the tools directory as part of > the pull request that can be used to make the needed substitutions. > > Thoughts? Long needed cleanup, thanks for taking care of this. Is there a need to review anything, or has this already been merged ? David From charlesr.harris at gmail.com Wed Feb 1 18:01:54 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 1 Feb 2012 16:01:54 -0700 Subject: [SciPy-Dev] Heads up and macro deprecation. In-Reply-To: References: Message-ID: On Wed, Feb 1, 2012 at 3:43 PM, David Cournapeau wrote: > On Wed, Feb 1, 2012 at 9:01 PM, Charles R Harris > wrote: > > Hi All, > > > > Two things here. > > > > 1) Some macros for threading and the iterator now require a trailing > > semicolon. 
This change will be reverted before the 1.7 release so that > scipy > > 0.10 will compile, but because it is desirable in the long term it would > be > > helpful if folks maintaining c extensions using numpy would try compiling > > them against current development and adding the semicolon where needed. > The > > added semicolon will be backward compatible with earlier versions of > numpy. > > > > 2) It is proposed to deprecate all of the macros in the old_defines.h > file > > and require the use of their replacements. Numpy itself will have made > this > > change after pull-189 is merged and getting rid of the surplus macros > will > > help clean up the historical detritus that has built up over the years, > > easing maintenance, clarifying code, and making the eventual transition > to > > 2.0 a bit easier. There is a sed script in the tools directory as part of > > the pull request that can be used to make the needed substitutions. > > > > Thoughts? > > Long needed cleanup, thanks for taking care of this. > > Is there a need to review anything, or has this already been merged ? > > It hasn't been merged yet. The main question has been if/how we should deprecate the old macros. The announcement also serves notice ;) I've got a patch doing this for scipy put haven't put it up yet. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Thu Feb 2 01:32:33 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 2 Feb 2012 07:32:33 +0100 Subject: [SciPy-Dev] commit rights for Jake Vanderplas Message-ID: Hi, I propose to give Jake Vanderplas commit rights. He has been been making a lot of small fixes recently, as well as larger contributions on Arpack and now graph algorithms. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren.weckesser at enthought.com Thu Feb 2 01:39:41 2012 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Thu, 2 Feb 2012 00:39:41 -0600 Subject: [SciPy-Dev] commit rights for Jake Vanderplas In-Reply-To: References: Message-ID: On Thu, Feb 2, 2012 at 12:32 AM, Ralf Gommers wrote: > Hi, > > I propose to give Jake Vanderplas commit rights. He has been been making a > lot of small fixes recently, as well as larger contributions on Arpack and > now graph algorithms. > > +1 Warren -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefan at sun.ac.za Thu Feb 2 02:31:29 2012 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Wed, 1 Feb 2012 23:31:29 -0800 Subject: [SciPy-Dev] commit rights for Jake Vanderplas In-Reply-To: References: Message-ID: On Wed, Feb 1, 2012 at 10:39 PM, Warren Weckesser wrote: >> I propose to give Jake Vanderplas commit rights. He has been been making a >> lot of small fixes recently, as well as larger contributions on Arpack and >> now graph algorithms. +1 St?fan From travis at continuum.io Thu Feb 2 02:36:29 2012 From: travis at continuum.io (Travis Oliphant) Date: Thu, 2 Feb 2012 01:36:29 -0600 Subject: [SciPy-Dev] commit rights for Jake Vanderplas In-Reply-To: References: Message-ID: +1 On Feb 2, 2012, at 12:32 AM, Ralf Gommers wrote: > Hi, > > I propose to give Jake Vanderplas commit rights. He has been been making a lot of small fixes recently, as well as larger contributions on Arpack and now graph algorithms. 
> > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From matthew.brett at gmail.com Thu Feb 2 02:42:42 2012 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 2 Feb 2012 07:42:42 +0000 Subject: [SciPy-Dev] commit rights for Jake Vanderplas In-Reply-To: References: Message-ID: Hi, On Thu, Feb 2, 2012 at 7:36 AM, Travis Oliphant wrote: > +1 > > On Feb 2, 2012, at 12:32 AM, Ralf Gommers wrote: > >> Hi, >> >> I propose to give Jake Vanderplas commit rights. He has been been making a lot of small fixes recently, as well as larger contributions on Arpack and now graph algorithms. Obviously kudos and thanks to Jake - but - do we have a policy on commit rights for Scipy in the new github age? I mean - are commit rights for: * making your own commits OR * merging other people's pull requests I think I'm asking the same old git vs svn workflow question. See you, Matthew From stuart at mumford.me.uk Thu Feb 2 04:17:49 2012 From: stuart at mumford.me.uk (Stuart Mumford) Date: Thu, 2 Feb 2012 09:17:49 +0000 Subject: [SciPy-Dev] scikit-signal or Similar In-Reply-To: References: <48B4808F-7FD1-46CE-B53A-423EB6D5576A@continuum.io> Message-ID: Hello, > There is http://projects.scipy.org/scipy/ticket/922 which the author has > kept on developing despite the unfortunate lack of feedback. It looks quite > far along and may be useful for you: > https://github.com/lesserwhirls/scipy-cwt > > There is also a cwt function in signal.wavelets, but it's very limited. > Whoa ... nice one, that's much neater code than my attempt. It's an interesting way it has been implemented though. I wonder if someone can explain the logic of having a class that needs the mother wavelet fed in as an argument rather than subclassing? I shall work on improving that code, I can implement more and more general Mother wavelets and also write some examples and update the plotting routine to use mpl's make_axes_locatable if people think that is a better way to go. I shall send the original author an email, to talk to him about it. The DWT is exactly the kind of tool SciPy needs. The goal would not be >> to re-invent DWT with SciPy, but simply integrate pywavelets into SciPy if >> that is at all possible. Having so many packages is good for developers, >> but not very good for consumers as people have to collect a lot of >> different packages together to get what they want. Some of this pressure >> is being alleviated by "distributions" of Python, and I expect that trend >> will continue. But, it is still useful for SciPy to grow "fundamental" >> libraries like a DWT. >> >> I'll send the pywavelets author an email, would be great to get his input > on this. > I agree that wavelet transforms, both DWT and CWT, are well within SciPy's scope, if we could integrate pyWavelets into SciPy that would be great. It does beg the question though, if that is the path we are going down would it be better to implement a CWT in the same style as pyWavelets to have uniformity in use if they are both integrated in SciPy. Stuart -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From stefan at sun.ac.za Thu Feb 2 05:47:21 2012 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 2 Feb 2012 02:47:21 -0800 Subject: [SciPy-Dev] commit rights for Jake Vanderplas In-Reply-To: References: Message-ID: Hey Matthew On Wed, Feb 1, 2012 at 11:42 PM, Matthew Brett wrote: > I mean - are commit rights for: > > * making your own commits OR > * merging other people's pull requests > > I think I'm asking the same old git vs svn workflow question. I think the "policy" for #1 is currently to make a PR, wait for feedback, and, if none is forthcoming, to proceed with the merge unless you know the change warrants further discussion (think of doc fixes, etc). Essentially, it relies on the good judgement of the author. Ideally, of course, authors shouldn't have to merge their own PRs, and Ralf and Warren have been very good at reviewing and merging. St?fan From jean-louis at durrieu.ch Thu Feb 2 07:31:26 2012 From: jean-louis at durrieu.ch (Jean-Louis Durrieu) Date: Thu, 2 Feb 2012 13:31:26 +0100 Subject: [SciPy-Dev] scipy.io.wavfile In-Reply-To: References: Message-ID: Hi Warren, On Jan 22, 2012, at 9:40 PM, Warren Weckesser wrote: > Yes, this *is* the right place to discuss this! Sorry if my terse email gave the wrong impression. No worries! I must admit I did not check enough whether this issue had been raised before, I ll check the wiki more often! > As Pauli also said, it is just a matter of someone contributing the implementation. I think it would be great to have a more robust wav file reader. I m wondering if it would not be possible to directly integrate the source from libsndfile? As many might know, David Cournapeau did a "wrapper" for it (scikits.audiolab), but I guess there is some licensing issue... In the meantime, it would be a good trade-off if there were some warning messages when importing scipy.io.wavfile, stating for which data type and which format it is intended to work. But yeah, it is up to someone to contribute... Best regards, Jean-Louis Durrieu From travis at continuum.io Thu Feb 2 10:59:49 2012 From: travis at continuum.io (Travis Oliphant) Date: Thu, 2 Feb 2012 09:59:49 -0600 Subject: [SciPy-Dev] scikit-signal or Similar In-Reply-To: References: <48B4808F-7FD1-46CE-B53A-423EB6D5576A@continuum.io> Message-ID: On Feb 2, 2012, at 3:17 AM, Stuart Mumford wrote: > > The DWT is exactly the kind of tool SciPy needs. The goal would not be to re-invent DWT with SciPy, but simply integrate pywavelets into SciPy if that is at all possible. Having so many packages is good for developers, but not very good for consumers as people have to collect a lot of different packages together to get what they want. Some of this pressure is being alleviated by "distributions" of Python, and I expect that trend will continue. But, it is still useful for SciPy to grow "fundamental" libraries like a DWT. > > I'll send the pywavelets author an email, would be great to get his input on this. > > I agree that wavelet transforms, both DWT and CWT, are well within SciPy's scope, if we could integrate pyWavelets into SciPy that would be great. It does beg the question though, if that is the path we are going down would it be better to implement a CWT in the same style as pyWavelets to have uniformity in use if they are both integrated in SciPy. > That would be ideal, I think. But, I don't know how much work it would be convert one to the other style. 
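As a point of comparison, the cwt currently in scipy.signal.wavelets is roughly this small to use (a sketch assuming the ricker mother wavelet that ships with it; the test signal is made up):

import numpy as np
from scipy.signal import wavelets

t = np.linspace(0, 1, 400)
data = np.cos(2 * np.pi * 7 * t) + 0.1 * np.random.randn(t.size)

# cwt(data, wavelet, widths): the wavelet callable takes (length, width) and
# returns a window of that length; ricker fits that signature.
widths = np.arange(1, 31)
coefs = wavelets.cwt(data, wavelets.ricker, widths)  # shape (len(widths), len(data))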
-Travis > > Stuart > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Thu Feb 2 14:13:17 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 2 Feb 2012 20:13:17 +0100 Subject: [SciPy-Dev] commit rights for Jake Vanderplas In-Reply-To: References: Message-ID: 2012/2/2 St?fan van der Walt > Hey Matthew > > On Wed, Feb 1, 2012 at 11:42 PM, Matthew Brett > wrote: > > I mean - are commit rights for: > > > > * making your own commits OR > > * merging other people's pull requests > > > > I think I'm asking the same old git vs svn workflow question. > > I think the "policy" for #1 is currently to make a PR, wait for > feedback, and, if none is forthcoming, to proceed with the merge > unless you know the change warrants further discussion (think of doc > fixes, etc). Essentially, it relies on the good judgement of the > author. Ideally, of course, authors shouldn't have to merge their own > PRs, and Ralf and Warren have been very good at reviewing and merging. > Indeed, nothing (except truly trivial things like fixing a typo) should go in without review. Of course it's still possible to merge your own commits if another developer has reviewed them and commented that it's OK to do so. Besides that, I think commit rights are also a kind of recognition for making significant good quality contributions, and that it's healthy for scipy to have a larger number of active developers which can review and merge PRs. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From magetter at gmx.de Thu Feb 2 14:32:28 2012 From: magetter at gmx.de (Dominik Maxein) Date: Thu, 02 Feb 2012 20:32:28 +0100 Subject: [SciPy-Dev] Website: Contradiction in FAQs Message-ID: <20120202193228.232810@gmx.net> ... concerning Python3 compatibility. Hello! While browsing on the SciPy website today, I came along two different FAQs which differed (amongst other things) in their statement about the Python3 compatibility of NumPy and SciPy. They are found here: http://new.scipy.org/faq.html http://www.scipy.org/FAQ I found the first one first and was quite disappointed (it basically says "no Python 3 supported yet, unclear when it will be"), until I stumbled by accident over the other one later. Probably the first one is some left-over or archive, but that was not clear for me from the page. Both can be reached by clicking links from seemingly-current pages on the SciPy site. Best, Dominik From scott.sinclair.za at gmail.com Fri Feb 3 00:40:40 2012 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Fri, 3 Feb 2012 07:40:40 +0200 Subject: [SciPy-Dev] Website: Contradiction in FAQs In-Reply-To: <20120202193228.232810@gmx.net> References: <20120202193228.232810@gmx.net> Message-ID: On 2 February 2012 21:32, Dominik Maxein wrote: > While browsing on the SciPy website today, I came along two different FAQs which differed (amongst other things) in their statement about the Python3 compatibility of NumPy and SciPy. They are found here: > > http://new.scipy.org/faq.html > > http://www.scipy.org/FAQ > > I found the first one first and was quite disappointed (it basically says "no Python 3 supported yet, unclear when it will be"), until I stumbled by accident over the other one later. 
Probably the first one is some left-over or archive, but that was not clear for me from the page. Both can be reached by clicking links from seemingly-current pages on the SciPy site. Thanks for the report. Pull request updating the FAQ of new.scipy.org at https://github.com/scipy/scipy.org-new/pull/2 Preview of updated page at http://scottza.github.com/faq.html#do-numpy-and-scipy-support-python-3-x Cheers, Scott From jba at sdf.lonestar.org Fri Feb 3 10:41:33 2012 From: jba at sdf.lonestar.org (Jeffrey Armstrong) Date: Fri, 3 Feb 2012 15:41:33 +0000 (UTC) Subject: [SciPy-Dev] Interest in a "controls" scikit? Message-ID: After seeing the discussion started by the scikit-signal proposal on the list, I wanted to see if there would be interest in a scikit related to controls. While I wouldn't purport to be an expert, I have managed to author some control design equation solvers, notably algebraic Riccati and Lyapunov equation solvers, and I have a pythonic wrapper for LAPACK's Sylvester equation solver. Combined with much of the LTI code in scipy.signal, I think there's a good start here towards providing some control system design tools. I wanted to see if others would be interested in helping out and whether the scikit path is the right choice (I admit I was a little confused by the outcome of the scikit-signal discussion). Thanks for any input! -Jeff Jeff Armstrong - jba at sdf.lonestar.org SDF Public Access UNIX System - http://sdf.lonestar.org From deshpande.jaidev at gmail.com Fri Feb 3 10:46:42 2012 From: deshpande.jaidev at gmail.com (Jaidev Deshpande) Date: Fri, 3 Feb 2012 21:16:42 +0530 Subject: [SciPy-Dev] Interest in a "controls" scikit? In-Reply-To: References: Message-ID: Hi Jefferey I've taken a couple of undergrad courses in control systems, and I liked them. But that's where my knowledge ends. I'd be interested in doing the coding, though. How do you intend to get started? Where can I read more about this? Thanks From pav at iki.fi Fri Feb 3 11:07:10 2012 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 03 Feb 2012 17:07:10 +0100 Subject: [SciPy-Dev] Interest in a "controls" scikit? In-Reply-To: References: Message-ID: 03.02.2012 16:41, Jeffrey Armstrong kirjoitti: > After seeing the discussion started by the scikit-signal proposal on the > list, I wanted to see if there would be interest in a scikit related to > controls. While I wouldn't purport to be an expert, I have managed to > author some control design equation solvers, notably algebraic Riccati and > Lyapunov equation solvers, and I have a pythonic wrapper for LAPACK's > Sylvester equation solver. Combined with much of the LTI code in > scipy.signal, I think there's a good start here towards providing some > control system design tools. The algebraic Sylvester solver code sounds like it would be a good addition to scipy.linalg --- that equation tends to pop up not only in control theory, so it would be good to have the LAPACK routine wrapped. -- Pauli Virtanen From sarms at unidata.ucar.edu Fri Feb 3 11:22:44 2012 From: sarms at unidata.ucar.edu (Sean Arms) Date: Fri, 03 Feb 2012 09:22:44 -0700 Subject: [SciPy-Dev] scikit-signal or Similar Message-ID: <4F2C09D4.2080201@unidata.ucar.edu> Greetings! > Hello, > > > > There is http://projects.scipy.org/scipy/ticket/922 which the author has > > kept on developing despite the unfortunate lack of feedback. 
It > looks quite > > far along and may be useful for you: > > https://github.com/lesserwhirls/scipy-cwt > > > > There is also a cwt function in signal.wavelets, but it's very limited. > > > > Whoa ... nice one, that's much neater code than my attempt. It's an > interesting way it has been implemented though. I wonder if someone can > explain the logic of having a class that needs the mother wavelet fed > in as > an argument rather than subclassing? > Sorry - I've been off the list for awhile as I was transitioning to my first 'real world' job. Now that I'm back - hello! This was my first attempt at object oriented programming - I'll need to take a look at the code again to see what I was (or was not) thinking at the time :-) > I shall work on improving that code, I can implement more and more general > Mother wavelets and also write some examples and update the plotting > routine to use mpl's make_axes_locatable if people think that is a better > way to go. > I'd be happy to start working on this again. I'm still finishing up my PhD, but I am now working as a developer at UCAR / Unidata and have some time I can officially spend on 'guerrilla projects' like this (especially since it can/will benefit the atmospheric science community)! Let me know how you'd like to proceed :-) Cheers! Sean > I shall send the original author an email, to talk to him about it. > > The DWT is exactly the kind of tool SciPy needs. The goal would not be > >> to re-invent DWT with SciPy, but simply integrate pywavelets into > SciPy if > >> that is at all possible. Having so many packages is good for > developers, > >> but not very good for consumers as people have to collect a lot of > >> different packages together to get what they want. Some of this > pressure > >> is being alleviated by "distributions" of Python, and I expect that > trend > >> will continue. But, it is still useful for SciPy to grow > "fundamental" > >> libraries like a DWT. > >> > >> I'll send the pywavelets author an email, would be great to get his > input > > on this. > > > > I agree that wavelet transforms, both DWT and CWT, are well within SciPy's > scope, if we could integrate pyWavelets into SciPy that would be great. It > does beg the question though, if that is the path we are going down would > it be better to implement a CWT in the same style as pyWavelets to have > uniformity in use if they are both integrated in SciPy. > > > Stuart From nwagner at iam.uni-stuttgart.de Fri Feb 3 11:28:32 2012 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Fri, 03 Feb 2012 17:28:32 +0100 Subject: [SciPy-Dev] Interest in a "controls" scikit? In-Reply-To: References: Message-ID: On Fri, 03 Feb 2012 17:07:10 +0100 Pauli Virtanen wrote: > 03.02.2012 16:41, Jeffrey Armstrong kirjoitti: >> After seeing the discussion started by the scikit-signal >>proposal on the >> list, I wanted to see if there would be interest in a >>scikit related to >> controls. While I wouldn't purport to be an expert, I >>have managed to >> author some control design equation solvers, notably >>algebraic Riccati and >> Lyapunov equation solvers, and I have a pythonic wrapper >>for LAPACK's >> Sylvester equation solver. Combined with much of the >>LTI code in >> scipy.signal, I think there's a good start here towards >>providing some >> control system design tools. 
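For concreteness, the Sylvester equation A*X + X*B = Q can be solved in a few lines through the Kronecker-product identity; this is only a naive sketch for small matrices (not the LAPACK *trsyl wrapper under discussion, which uses the Bartels-Stewart algorithm):

import numpy as np

def solve_sylvester_naive(A, B, Q):
    # Vectorize: (I_m kron A + B^T kron I_n) vec(X) = vec(Q), using
    # column-stacking (Fortran-order) vec. Cost grows like (n*m)**3.
    n, m = A.shape[0], B.shape[0]
    K = np.kron(np.eye(m), A) + np.kron(B.T, np.eye(n))
    x = np.linalg.solve(K, Q.flatten(order='F'))
    return x.reshape((n, m), order='F')

# The continuous Lyapunov equation A*X + X*A^T = -W is the special case B = A^T.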
> > The algebraic Sylvester solver code sounds like it would >be a good > addition to scipy.linalg --- that equation tends to pop >up not only in > control theory, so it would be good to have the LAPACK >routine wrapped. > > -- > Pauli Virtanen > +1 https://github.com/avventi/Slycot might be useful. Nils From gustavo.goretkin at gmail.com Sat Feb 4 15:55:41 2012 From: gustavo.goretkin at gmail.com (Gustavo Goretkin) Date: Sat, 4 Feb 2012 15:55:41 -0500 Subject: [SciPy-Dev] Interest in a "controls" scikit? In-Reply-To: References: Message-ID: I don't think this project was mentioned yet: http://sourceforge.net/apps/mediawiki/python-control/index.php?title=Main_Page On Fri, Feb 3, 2012 at 11:28 AM, Nils Wagner wrote: > On Fri, 03 Feb 2012 17:07:10 +0100 > ?Pauli Virtanen wrote: >> 03.02.2012 16:41, Jeffrey Armstrong kirjoitti: >>> After seeing the discussion started by the scikit-signal >>>proposal on the >>> list, I wanted to see if there would be interest in a >>>scikit related to >>> controls. ?While I wouldn't purport to be an expert, I >>>have managed to >>> author some control design equation solvers, notably >>>algebraic Riccati and >>> Lyapunov equation solvers, and I have a pythonic wrapper >>>for LAPACK's >>> Sylvester equation solver. ?Combined with much of the >>>LTI code in >>> scipy.signal, I think there's a good start here towards >>>providing some >>> control system design tools. >> >> The algebraic Sylvester solver code sounds like it would >>be a good >> addition to scipy.linalg --- that equation tends to pop >>up not only in >> control theory, so it would be good to have the LAPACK >>routine wrapped. >> >> -- >> Pauli Virtanen >> > > +1 > > https://github.com/avventi/Slycot > > might be useful. > > Nils > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From ralf.gommers at googlemail.com Sat Feb 4 16:52:24 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 4 Feb 2012 22:52:24 +0100 Subject: [SciPy-Dev] scipy 0.10.1 release Message-ID: Hi all, A 0.10.1 bugfix release soon is starting to look like a good idea. The two fixes that are most needed are the single-precision Arpack problems and the fix to remain compatible with the macro changes in numpy that Charles made. While we're at it, we may as well backport some other bug and doc fixes. I'll have a look at recent commits for that, suggestions welcome. An RC within a week, and release the week after seems feasible. Thoughts, comments? Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Sun Feb 5 05:17:01 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 5 Feb 2012 11:17:01 +0100 Subject: [SciPy-Dev] anyone want to fix Mann-Whitney test? Message-ID: Hi, There's a bug report and a number of new tests for mannwhitneyu at http://projects.scipy.org/scipy/ticket/1593. These plus a fix were contributed by Sebastian P?lsterl, unfortunately he based his initial fix on GPL'ed R code. Therefore I think we can't use that, even after he modified it. I looked at the GPL code too; I think we need someone who didn't do that to implement a new fix based only on the tests and bug report. Any takers? Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at googlemail.com Sun Feb 5 06:59:55 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 5 Feb 2012 12:59:55 +0100 Subject: [SciPy-Dev] Trac performance? Message-ID: Hi, Is there an end in sight for the troubles with Trac? It's driving me completely crazy. So if not, I'm willing to invest some time in moving all the relevant content on the Trac wikis somewhere else (suggestions welcome). I'm thinking about things like development plans, GSOC info, list of maintainers per module, etc. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Sun Feb 5 07:19:44 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 5 Feb 2012 07:19:44 -0500 Subject: [SciPy-Dev] anyone want to fix Mann-Whitney test? In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 5:17 AM, Ralf Gommers wrote: > Hi, > > There's a bug report and a number of new tests for mannwhitneyu at > http://projects.scipy.org/scipy/ticket/1593. These plus a fix were > contributed by Sebastian P?lsterl, unfortunately he based his initial fix > on GPL'ed R code. Therefore I think we can't use that, even after he > modified it. I looked at the GPL code too; I think we need someone who > didn't do that to implement a new fix based only on the tests and bug > report. > > Any takers? > >From what I remember my impression is that this is only a "cosmetic" change, or better a change in what is returned. >>> v, pval = stats.mannwhitneyu(x, y) >>> len(x)*len(y) - v 498.0 >>> pval*2 9.188326533255e-05 docstring says: The reported p-value is for a one-sided hypothesis, to get the two-sided p-value multiply the returned p-value by 2. currently I think none of the tests that uses normal or t distribution has one versus two sided option, but I think they could be added everywhere. One argument in favor of adding two one-sided options is that we return the correct tail instead of the smaller tail. I haven't looked at the pull request. Josef > > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Sun Feb 5 08:02:04 2012 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 05 Feb 2012 14:02:04 +0100 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: Hi, 05.02.2012 12:59, Ralf Gommers kirjoitti: > Is there an end in sight for the troubles with Trac? It's driving me > completely crazy. So if not, I'm willing to invest some time in moving > all the relevant content on the Trac wikis somewhere else (suggestions > welcome). I'm thinking about things like development plans, GSOC info, > list of maintainers per module, etc. Newer versions of Trac are supposed to alleviate the "database is locked" error, which is a PITA. So upgrading the software could help. I think moving stuff out from the Trac could be useful in any case, as its account management is not so transparent. As for the developer wiki info --- how about turning on wikis on Github and using them? Does someone have experience on how usable they actually are? As for dropping Trac completely -- last time around, I think the Github issues system lacked features. I think they've added some stuff since then, though. 
Pauli From ralf.gommers at googlemail.com Sun Feb 5 08:28:42 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 5 Feb 2012 14:28:42 +0100 Subject: [SciPy-Dev] anyone want to fix Mann-Whitney test? In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 1:19 PM, wrote: > > > On Sun, Feb 5, 2012 at 5:17 AM, Ralf Gommers wrote: > >> Hi, >> >> There's a bug report and a number of new tests for mannwhitneyu at >> http://projects.scipy.org/scipy/ticket/1593. These plus a fix were >> contributed by Sebastian P?lsterl, unfortunately he based his initial fix >> on GPL'ed R code. Therefore I think we can't use that, even after he >> modified it. I looked at the GPL code too; I think we need someone who >> didn't do that to implement a new fix based only on the tests and bug >> report. >> >> Any takers? >> > > From what I remember my impression is that this is only a "cosmetic" > change, or better a change in what is returned. > > >>> v, pval = stats.mannwhitneyu(x, y) > >>> len(x)*len(y) - v > 498.0 > Ah, okay. I'm not sure if this is a desirable change then. Any idea why it was implemented like this? > > >>> pval*2 > 9.188326533255e-05 > > > docstring says: > The reported p-value is for a one-sided hypothesis, to get the > two-sided > p-value multiply the returned p-value by 2. > > currently I think none of the tests that uses normal or t distribution has > one versus two sided option, but I think they could be added everywhere. > One argument in favor of adding two one-sided options is that we return > the correct tail instead of the smaller tail. > fisher_exact, kstest and ks_twosamp have less/greater/two-sided. I also think it makes sense to add them where possible. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Sun Feb 5 09:11:50 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 5 Feb 2012 15:11:50 +0100 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 2:02 PM, Pauli Virtanen wrote: > Hi, > > 05.02.2012 12:59, Ralf Gommers kirjoitti: > > Is there an end in sight for the troubles with Trac? It's driving me > > completely crazy. So if not, I'm willing to invest some time in moving > > all the relevant content on the Trac wikis somewhere else (suggestions > > welcome). I'm thinking about things like development plans, GSOC info, > > list of maintainers per module, etc. > > Newer versions of Trac are supposed to alleviate the "database is > locked" error, which is a PITA. So upgrading the software could help. > > Moving from SQLite to PostgreSQL is also supposed to help. > I think moving stuff out from the Trac could be useful in any case, as > its account management is not so transparent. As for the developer wiki > info --- how about turning on wikis on Github and using them? Does > someone have experience on how usable they actually are? > Haven't got experience, but see that they were just revamped two weeks ago: https://github.com/blog/774-git-powered-wikis-improved. They still look quite simplistic, but we don't need much. > > As for dropping Trac completely -- last time around, I think the Github > issues system lacked features. I think they've added some stuff since > then, though. > If the database locked errors can be solved, I think Trac is still better than Github. Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From josef.pktd at gmail.com Sun Feb 5 09:28:39 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 5 Feb 2012 09:28:39 -0500 Subject: [SciPy-Dev] anyone want to fix Mann-Whitney test? In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 8:28 AM, Ralf Gommers wrote: > > > On Sun, Feb 5, 2012 at 1:19 PM, wrote: > >> >> >> On Sun, Feb 5, 2012 at 5:17 AM, Ralf Gommers > > wrote: >> >>> Hi, >>> >>> There's a bug report and a number of new tests for mannwhitneyu at >>> http://projects.scipy.org/scipy/ticket/1593. These plus a fix were >>> contributed by Sebastian P?lsterl, unfortunately he based his initial fix >>> on GPL'ed R code. Therefore I think we can't use that, even after he >>> modified it. I looked at the GPL code too; I think we need someone who >>> didn't do that to implement a new fix based only on the tests and bug >>> report. >>> >>> Any takers? >>> >> >> From what I remember my impression is that this is only a "cosmetic" >> change, or better a change in what is returned. >> >> >>> v, pval = stats.mannwhitneyu(x, y) >> >>> len(x)*len(y) - v >> 498.0 >> > > Ah, okay. I'm not sure if this is a desirable change then. Any idea why it > was implemented like this? > No, I was just fixing bugs. This was one of the early tests I worked on when I didn't have stronger opinions what the standard or more informative returns are. Since the pvalues are correct, I didn't care too much about which test statistic is reported. Looking a bit closer, I'm in favor of the change. Returning the short tail instead of the asked for tail in a one-sided test is not really "clean", and trying to rewrite this, it's not easy to figure out which is which, 210 or 498. I haven't finished yet. I like requests with a full test suite. If I remember correctly, then we return almost all the time the two-sided test, so adding the option for one-sided test will be backwards compatible, but for mannwhitneyu it might not be possible. > >> >>> pval*2 >> 9.188326533255e-05 >> >> >> docstring says: >> The reported p-value is for a one-sided hypothesis, to get the >> two-sided >> p-value multiply the returned p-value by 2. >> >> currently I think none of the tests that uses normal or t distribution >> has one versus two sided option, but I think they could be added everywhere. >> One argument in favor of adding two one-sided options is that we return >> the correct tail instead of the smaller tail. >> > > fisher_exact, kstest and ks_twosamp have less/greater/two-sided. I also > think it makes sense to add them where possible. > None of these have a symmetric test distribution, AFAI remember. So, for those it's not easy to figure out how to move from one sided short tail to two-sided or the other way around. Josef > > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Sun Feb 5 09:49:34 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 5 Feb 2012 09:49:34 -0500 Subject: [SciPy-Dev] anyone want to fix Mann-Whitney test? 
In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 9:28 AM, wrote: > > > > On Sun, Feb 5, 2012 at 8:28 AM, Ralf Gommers wrote: >> >> >> >> On Sun, Feb 5, 2012 at 1:19 PM, wrote: >>> >>> >>> >>> On Sun, Feb 5, 2012 at 5:17 AM, Ralf Gommers wrote: >>>> >>>> Hi, >>>> >>>> There's a bug report and a number of new tests for mannwhitneyu at http://projects.scipy.org/scipy/ticket/1593. These plus a fix were contributed by Sebastian P?lsterl, unfortunately he based his initial fix on GPL'ed R code. Therefore I think we can't use that, even after he modified it. I looked at the GPL code too; I think we need someone who didn't do that to implement a new fix based only on the tests and bug report. >>>> >>>> Any takers? >>> >>> >>> From what I remember my impression is that this is only a "cosmetic" change, or better a change in what is returned. >>> >>> >>> v, pval = stats.mannwhitneyu(x, y) >>> >>> len(x)*len(y) - v >>> 498.0 >> >> >> Ah, okay. I'm not sure if this is a desirable change then. Any idea why it was implemented like this? > > > No, I was just fixing bugs. This was one of the early tests I worked on when I didn't have stronger opinions what the standard or more informative returns are. Since the pvalues are correct, I didn't care too much about which test statistic is reported. > > Looking a bit closer, I'm in favor of the change. Returning the short tail instead of the asked for tail in a one-sided test is not really "clean", and trying to rewrite this, it's not easy to figure out which is which, 210 or 498. I haven't finished yet. I like requests with a full test suite. > > If I remember correctly, then we return almost all the time the two-sided test, so adding the option for one-sided test will be backwards compatible, but for mannwhitneyu it might not be possible. rewrite as a standalone function is attached the last test was missing a self And I initially had a test failure, because I preferred the keyword arguments in reversed sequence and the tests use a keyword arguments as positional argument. Also just tried to match the tests without trying to understand every detail again. I think it would be better if the default is two-sided but this will double the reported p-value compared to the current version. > > > >>> >>> >>> >>> pval*2 >>> 9.188326533255e-05 >>> >>> >>> docstring says: >>> ??? The reported p-value is for a one-sided hypothesis, to get the two-sided >>> ??? p-value multiply the returned p-value by 2. >>> >>> currently I think none of the tests that uses normal or t distribution has one versus two sided option, but I think they could be added everywhere. >>> One argument in favor of adding two one-sided options is that we return the correct tail instead of the smaller tail. >> >> >> fisher_exact, kstest and ks_twosamp have less/greater/two-sided. I also think it makes sense to add them where possible. > > > None of these have a symmetric test distribution, AFAI remember. So, for those it's not easy to figure out how to move from one sided short tail to two-sided or the other way around. > > Josef > >> >> >> Ralf >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > -------------- next part -------------- A non-text attachment was scrubbed... 
Name: try_mannwhitenyu.py Type: text/x-python Size: 8256 bytes Desc: not available URL: From ralf.gommers at googlemail.com Sun Feb 5 10:39:29 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 5 Feb 2012 16:39:29 +0100 Subject: [SciPy-Dev] anyone want to fix Mann-Whitney test? In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 3:49 PM, wrote: > On Sun, Feb 5, 2012 at 9:28 AM, wrote: > > > > > > > > On Sun, Feb 5, 2012 at 8:28 AM, Ralf Gommers < > ralf.gommers at googlemail.com> wrote: > >> > >> > >> > >> On Sun, Feb 5, 2012 at 1:19 PM, wrote: > >>> > >>> > >>> > >>> On Sun, Feb 5, 2012 at 5:17 AM, Ralf Gommers < > ralf.gommers at googlemail.com> wrote: > >>>> > >>>> Hi, > >>>> > >>>> There's a bug report and a number of new tests for mannwhitneyu at > http://projects.scipy.org/scipy/ticket/1593. These plus a fix were > contributed by Sebastian P?lsterl, unfortunately he based his initial fix > on GPL'ed R code. Therefore I think we can't use that, even after he > modified it. I looked at the GPL code too; I think we need someone who > didn't do that to implement a new fix based only on the tests and bug > report. > >>>> > >>>> Any takers? > >>> > >>> > >>> From what I remember my impression is that this is only a "cosmetic" > change, or better a change in what is returned. > >>> > >>> >>> v, pval = stats.mannwhitneyu(x, y) > >>> >>> len(x)*len(y) - v > >>> 498.0 > >> > >> > >> Ah, okay. I'm not sure if this is a desirable change then. Any idea why > it was implemented like this? > > > > > > No, I was just fixing bugs. This was one of the early tests I worked on > when I didn't have stronger opinions what the standard or more informative > returns are. Since the pvalues are correct, I didn't care too much about > which test statistic is reported. > > > > Looking a bit closer, I'm in favor of the change. Returning the short > tail instead of the asked for tail in a one-sided test is not really > "clean", and trying to rewrite this, it's not easy to figure out which is > which, 210 or 498. I haven't finished yet. I like requests with a full test > suite. > > > > If I remember correctly, then we return almost all the time the > two-sided test, so adding the option for one-sided test will be backwards > compatible, but for mannwhitneyu it might not be possible. > > rewrite as a standalone function is attached > Looks good, thanks. I updated the docstring and put it at https://github.com/rgommers/scipy/tree/mannwhitneyu-tests. > > the last test was missing a self > > And I initially had a test failure, because I preferred the keyword > arguments in reversed sequence and the tests use a keyword arguments > as positional argument. > Users can do that too, so you should never insert new keywords in the middle. > Also just tried to match the tests without trying to understand every > detail again. > > I think it would be better if the default is two-sided but this will > double the reported p-value compared to the current version. > > Not worth breaking backwards compatibility for I'd think. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Sun Feb 5 11:14:09 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 5 Feb 2012 11:14:09 -0500 Subject: [SciPy-Dev] anyone want to fix Mann-Whitney test? 
In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 10:39 AM, Ralf Gommers wrote: > > > On Sun, Feb 5, 2012 at 3:49 PM, wrote: >> >> On Sun, Feb 5, 2012 at 9:28 AM, wrote: >> > >> > >> > >> > On Sun, Feb 5, 2012 at 8:28 AM, Ralf Gommers >> > wrote: >> >> >> >> >> >> >> >> On Sun, Feb 5, 2012 at 1:19 PM, wrote: >> >>> >> >>> >> >>> >> >>> On Sun, Feb 5, 2012 at 5:17 AM, Ralf Gommers >> >>> wrote: >> >>>> >> >>>> Hi, >> >>>> >> >>>> There's a bug report and a number of new tests for mannwhitneyu at >> >>>> http://projects.scipy.org/scipy/ticket/1593. These plus a fix were >> >>>> contributed by Sebastian P?lsterl, unfortunately he based his initial fix on >> >>>> GPL'ed R code. Therefore I think we can't use that, even after he modified >> >>>> it. I looked at the GPL code too; I think we need someone who didn't do that >> >>>> to implement a new fix based only on the tests and bug report. >> >>>> >> >>>> Any takers? >> >>> >> >>> >> >>> From what I remember my impression is that this is only a "cosmetic" >> >>> change, or better a change in what is returned. >> >>> >> >>> >>> v, pval = stats.mannwhitneyu(x, y) >> >>> >>> len(x)*len(y) - v >> >>> 498.0 >> >> >> >> >> >> Ah, okay. I'm not sure if this is a desirable change then. Any idea why >> >> it was implemented like this? >> > >> > >> > No, I was just fixing bugs. This was one of the early tests I worked on >> > when I didn't have stronger opinions what the standard or more informative >> > returns are. Since the pvalues are correct, I didn't care too much about >> > which test statistic is reported. >> > >> > Looking a bit closer, I'm in favor of the change. Returning the short >> > tail instead of the asked for tail in a one-sided test is not really >> > "clean", and trying to rewrite this, it's not easy to figure out which is >> > which, 210 or 498. I haven't finished yet. I like requests with a full test >> > suite. >> > >> > If I remember correctly, then we return almost all the time the >> > two-sided test, so adding the option for one-sided test will be backwards >> > compatible, but for mannwhitneyu it might not be possible. >> >> rewrite as a standalone function is attached > > > Looks good, thanks. I updated the docstring and put it at > https://github.com/rgommers/scipy/tree/mannwhitneyu-tests. I had managed to work with git, it just takes some time (and I cannot test with current scipy) https://github.com/josef-pkt/scipy/commit/30aa361fc76dea7f7fd76c3f4f7babcd288f7c01 a bit more streamlining and I increased significant to 14 >> >> >> the last test was missing a self >> >> And I initially had a test failure, because I preferred the keyword >> arguments in reversed sequence and the tests use a keyword arguments >> as positional argument. > > > Users can do that too, so you should never insert new keywords in the > middle. > >> >> Also just tried to match the tests without trying to understand every >> detail again. >> >> I think it would be better if the default is two-sided but this will >> double the reported p-value compared to the current version. >> > Not worth breaking backwards compatibility for I'd think. I think, there is no option in this version that would be backwards compatible, since the old version calculated the two-sided test statistic but reported only pvalue/2. The old version was "correct" only with the right interpretation or usage. Since the results will be difficult, I would have really broken with the old version and used the nicer ordering of keywords. 
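[Editorial aside: to make the numbers in this exchange concrete, here is a minimal sketch of the bookkeeping being discussed, using only the public scipy.stats.mannwhitneyu call as it behaves in the version under discussion; the sample arrays are made up purely for illustration.]

import numpy as np
from scipy import stats

# Two made-up samples, only to illustrate the relationships discussed above.
x = np.random.randn(30)
y = np.random.randn(20) + 0.5

u, p_one_sided = stats.mannwhitneyu(x, y)

# The two candidate U statistics always sum to len(x)*len(y), which is why
# "210 or 498" is purely a question of which one gets reported.
u_other = len(x) * len(y) - u

# Per the current docstring, the returned p-value is one-sided; doubling it
# gives the two-sided p-value.
p_two_sided = 2 * p_one_sided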
If we need to deprecate and break backwards compatibility, then it might be worth to review the entire group of rank tests again. There is at least one ticket on this, also with the comparison of various options and results with matlab and R. Josef > > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From josef.pktd at gmail.com Sun Feb 5 11:25:46 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 5 Feb 2012 11:25:46 -0500 Subject: [SciPy-Dev] anyone want to fix Mann-Whitney test? In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 11:14 AM, wrote: > On Sun, Feb 5, 2012 at 10:39 AM, Ralf Gommers > wrote: >> >> >> On Sun, Feb 5, 2012 at 3:49 PM, wrote: >>> >>> On Sun, Feb 5, 2012 at 9:28 AM, wrote: >>> > >>> > >>> > >>> > On Sun, Feb 5, 2012 at 8:28 AM, Ralf Gommers >>> > wrote: >>> >> >>> >> >>> >> >>> >> On Sun, Feb 5, 2012 at 1:19 PM, wrote: >>> >>> >>> >>> >>> >>> >>> >>> On Sun, Feb 5, 2012 at 5:17 AM, Ralf Gommers >>> >>> wrote: >>> >>>> >>> >>>> Hi, >>> >>>> >>> >>>> There's a bug report and a number of new tests for mannwhitneyu at >>> >>>> http://projects.scipy.org/scipy/ticket/1593. These plus a fix were >>> >>>> contributed by Sebastian P?lsterl, unfortunately he based his initial fix on >>> >>>> GPL'ed R code. Therefore I think we can't use that, even after he modified >>> >>>> it. I looked at the GPL code too; I think we need someone who didn't do that >>> >>>> to implement a new fix based only on the tests and bug report. >>> >>>> >>> >>>> Any takers? >>> >>> >>> >>> >>> >>> From what I remember my impression is that this is only a "cosmetic" >>> >>> change, or better a change in what is returned. >>> >>> >>> >>> >>> v, pval = stats.mannwhitneyu(x, y) >>> >>> >>> len(x)*len(y) - v >>> >>> 498.0 >>> >> >>> >> >>> >> Ah, okay. I'm not sure if this is a desirable change then. Any idea why >>> >> it was implemented like this? >>> > >>> > >>> > No, I was just fixing bugs. This was one of the early tests I worked on >>> > when I didn't have stronger opinions what the standard or more informative >>> > returns are. Since the pvalues are correct, I didn't care too much about >>> > which test statistic is reported. >>> > >>> > Looking a bit closer, I'm in favor of the change. Returning the short >>> > tail instead of the asked for tail in a one-sided test is not really >>> > "clean", and trying to rewrite this, it's not easy to figure out which is >>> > which, 210 or 498. I haven't finished yet. I like requests with a full test >>> > suite. >>> > >>> > If I remember correctly, then we return almost all the time the >>> > two-sided test, so adding the option for one-sided test will be backwards >>> > compatible, but for mannwhitneyu it might not be possible. >>> >>> rewrite as a standalone function is attached >> >> >> Looks good, thanks. I updated the docstring and put it at >> https://github.com/rgommers/scipy/tree/mannwhitneyu-tests. > > I had managed to work with git, it just takes some time (and I cannot > test with current scipy) > https://github.com/josef-pkt/scipy/commit/30aa361fc76dea7f7fd76c3f4f7babcd288f7c01 > > a bit more streamlining and I increased significant to 14 > >>> >>> >>> the last test was missing a self >>> >>> And I initially had a test failure, because I preferred the keyword >>> arguments in reversed sequence and the tests use a keyword arguments >>> as positional argument. 
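[Editorial aside: the keyword-ordering remark above, and the reply that follows, are easier to see with a toy signature; the function names and the one_sided argument here are hypothetical, not the actual patch.]

def mwu_old(x, y, use_continuity=True):
    pass

# Inserting a new keyword *before* an existing one ...
def mwu_new(x, y, one_sided=True, use_continuity=True):
    pass

# ... silently changes the meaning of existing calls that passed the old
# keyword positionally, which is what the test suite did:
mwu_old(1, 2, False)   # False binds to use_continuity
mwu_new(1, 2, False)   # False now binds to one_sided; no error, new behaviour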
>> >> >> Users can do that too, so you should never insert new keywords in the >> middle. >> >>> >>> Also just tried to match the tests without trying to understand every >>> detail again. >>> >>> I think it would be better if the default is two-sided but this will >>> double the reported p-value compared to the current version. >>> >> Not worth breaking backwards compatibility for I'd think. > > I think, there is no option in this version that would be backwards > compatible, since the old version calculated the two-sided test > statistic but reported only pvalue/2. > The old version was "correct" only with the right interpretation or usage. > > Since the results will be difficult, I would have really broken with > the old version and used the nicer ordering of keywords. > > If we need to deprecate and break backwards compatibility, then it > might be worth to review the entire group of rank tests again. There > is at least one ticket on this, also with the comparison of various > options and results with matlab and R. here is the relevant ticket http://projects.scipy.org/scipy/ticket/901 3 years ago I was in favor of backwards compatibility instead of "clean". Now, I would prefer "clean" there is one that can be closed, I think, http://projects.scipy.org/scipy/ticket/1289 and the ancient stats review ticket http://projects.scipy.org/scipy/ticket/109 Josef > > Josef > >> >> Ralf >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> From ralf.gommers at googlemail.com Sun Feb 5 12:05:17 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 5 Feb 2012 18:05:17 +0100 Subject: [SciPy-Dev] scipy 0.10.1 release In-Reply-To: References: Message-ID: On Sat, Feb 4, 2012 at 10:52 PM, Ralf Gommers wrote: > Hi all, > > A 0.10.1 bugfix release soon is starting to look like a good idea. The two > fixes that are most needed are the single-precision Arpack problems and the > fix to remain compatible with the macro changes in numpy that Charles made. > While we're at it, we may as well backport some other bug and doc fixes. > I'll have a look at recent commits for that, suggestions welcome. An RC > within a week, and release the week after seems feasible. > > Thoughts, comments? > https://github.com/scipy/scipy/pull/150 Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ognen at enthought.com Sun Feb 5 12:09:09 2012 From: ognen at enthought.com (Ognen Duzlevski) Date: Sun, 5 Feb 2012 11:09:09 -0600 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: Hello, On Sun, Feb 5, 2012 at 8:11 AM, Ralf Gommers wrote: > > > On Sun, Feb 5, 2012 at 2:02 PM, Pauli Virtanen wrote: >> >> Hi, >> >> 05.02.2012 12:59, Ralf Gommers kirjoitti: >> > Is there an end in sight for the troubles with Trac? It's driving me >> > completely crazy. So if not, I'm willing to invest some time in moving >> > all the relevant content on the Trac wikis somewhere else (suggestions >> > welcome). I'm thinking about things like development plans, GSOC info, >> > list of maintainers per module, etc. >> >> Newer versions of Trac are supposed to alleviate the "database is >> locked" error, which is a PITA. So upgrading the software could help. >> > Moving from SQLite to PostgreSQL is also supposed to help. > >> >> I think moving stuff out from the Trac could be useful in any case, as >> its account management is not so transparent. 
As for the developer wiki >> info --- how about turning on wikis on Github and using them? Does >> someone have experience on how usable they actually are? > > > Haven't got experience, but see that they were just revamped two weeks ago: > https://github.com/blog/774-git-powered-wikis-improved. They still look > quite simplistic, but we don't need much. >> >> >> As for dropping Trac completely -- last time around, I think the Github >> issues system lacked features. I think they've added some stuff since >> then, though. > > > If the database locked errors can be solved, I think Trac is still better > than Github. Hello, Can I be of any assistance in this? I have an instance running on Amazon that is slated for the new scipy.org box. Is anyone willing to work with me to help migrate these things (or install a brand new spanking instance of Trac on this separate machine, perhaps powered by Postgres)? Would that be of any help? My problem is that having the one scipy.org box right now that is critical does not allow me much freedom to make a mistake which could possibly result in the scipy.org box being down for some time (in order for me to back out of it)... Ognen From ralf.gommers at googlemail.com Sun Feb 5 12:28:18 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 5 Feb 2012 18:28:18 +0100 Subject: [SciPy-Dev] anyone want to fix Mann-Whitney test? In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 5:25 PM, wrote: > On Sun, Feb 5, 2012 at 11:14 AM, wrote: > > On Sun, Feb 5, 2012 at 10:39 AM, Ralf Gommers > > wrote: > >> > >> > >> On Sun, Feb 5, 2012 at 3:49 PM, wrote: > >>> > >>> On Sun, Feb 5, 2012 at 9:28 AM, wrote: > >>> > > >>> > > >>> > > >>> > On Sun, Feb 5, 2012 at 8:28 AM, Ralf Gommers > >>> > wrote: > >>> >> > >>> >> > >>> >> > >>> >> On Sun, Feb 5, 2012 at 1:19 PM, wrote: > >>> >>> > >>> >>> > >>> >>> > >>> >>> On Sun, Feb 5, 2012 at 5:17 AM, Ralf Gommers > >>> >>> wrote: > >>> >>>> > >>> >>>> Hi, > >>> >>>> > >>> >>>> There's a bug report and a number of new tests for mannwhitneyu at > >>> >>>> http://projects.scipy.org/scipy/ticket/1593. These plus a fix > were > >>> >>>> contributed by Sebastian P?lsterl, unfortunately he based his > initial fix on > >>> >>>> GPL'ed R code. Therefore I think we can't use that, even after he > modified > >>> >>>> it. I looked at the GPL code too; I think we need someone who > didn't do that > >>> >>>> to implement a new fix based only on the tests and bug report. > >>> >>>> > >>> >>>> Any takers? > >>> >>> > >>> >>> > >>> >>> From what I remember my impression is that this is only a > "cosmetic" > >>> >>> change, or better a change in what is returned. > >>> >>> > >>> >>> >>> v, pval = stats.mannwhitneyu(x, y) > >>> >>> >>> len(x)*len(y) - v > >>> >>> 498.0 > >>> >> > >>> >> > >>> >> Ah, okay. I'm not sure if this is a desirable change then. Any idea > why > >>> >> it was implemented like this? > >>> > > >>> > > >>> > No, I was just fixing bugs. This was one of the early tests I worked > on > >>> > when I didn't have stronger opinions what the standard or more > informative > >>> > returns are. Since the pvalues are correct, I didn't care too much > about > >>> > which test statistic is reported. > >>> > > >>> > Looking a bit closer, I'm in favor of the change. Returning the short > >>> > tail instead of the asked for tail in a one-sided test is not really > >>> > "clean", and trying to rewrite this, it's not easy to figure out > which is > >>> > which, 210 or 498. I haven't finished yet. 
I like requests with a > full test > >>> > suite. > >>> > > >>> > If I remember correctly, then we return almost all the time the > >>> > two-sided test, so adding the option for one-sided test will be > backwards > >>> > compatible, but for mannwhitneyu it might not be possible. > >>> > >>> rewrite as a standalone function is attached > >> > >> > >> Looks good, thanks. I updated the docstring and put it at > >> https://github.com/rgommers/scipy/tree/mannwhitneyu-tests. > > > > I had managed to work with git, it just takes some time (and I cannot > > test with current scipy) > > > https://github.com/josef-pkt/scipy/commit/30aa361fc76dea7f7fd76c3f4f7babcd288f7c01 > > > > a bit more streamlining and I increased significant to 14 > > > >>> > >>> > >>> the last test was missing a self > >>> > >>> And I initially had a test failure, because I preferred the keyword > >>> arguments in reversed sequence and the tests use a keyword arguments > >>> as positional argument. > >> > >> > >> Users can do that too, so you should never insert new keywords in the > >> middle. > >> > >>> > >>> Also just tried to match the tests without trying to understand every > >>> detail again. > >>> > >>> I think it would be better if the default is two-sided but this will > >>> double the reported p-value compared to the current version. > >>> > >> Not worth breaking backwards compatibility for I'd think. > > > > I think, there is no option in this version that would be backwards > > compatible, since the old version calculated the two-sided test > > statistic but reported only pvalue/2. > > The old version was "correct" only with the right interpretation or > usage. > This seems worth fixing then. Since there's no good way to do this with a deprecation and this function is not used often, why not give a warning on usage in 0.11.0 and immediately switch to the fixed version? > Since the results will be difficult, I would have really broken with > > the old version and used the nicer ordering of keywords. > > Let's switch them around then. If we're breaking compatibility anyway, it doesn't matter. Ralf > If we need to deprecate and break backwards compatibility, then it > > might be worth to review the entire group of rank tests again. There > > is at least one ticket on this, also with the comparison of various > > options and results with matlab and R. > > here is the relevant ticket http://projects.scipy.org/scipy/ticket/901 > > 3 years ago I was in favor of backwards compatibility instead of "clean". > Now, I would prefer "clean" > > there is one that can be closed, I think, > http://projects.scipy.org/scipy/ticket/1289 > and the ancient stats review ticket > http://projects.scipy.org/scipy/ticket/109 > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Sun Feb 5 13:09:03 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 5 Feb 2012 13:09:03 -0500 Subject: [SciPy-Dev] anyone want to fix Mann-Whitney test? In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 12:28 PM, Ralf Gommers wrote: > > > On Sun, Feb 5, 2012 at 5:25 PM, wrote: >> >> On Sun, Feb 5, 2012 at 11:14 AM, ? 
wrote: >> > On Sun, Feb 5, 2012 at 10:39 AM, Ralf Gommers >> > wrote: >> >> >> >> >> >> On Sun, Feb 5, 2012 at 3:49 PM, wrote: >> >>> >> >>> On Sun, Feb 5, 2012 at 9:28 AM, wrote: >> >>> > >> >>> > >> >>> > >> >>> > On Sun, Feb 5, 2012 at 8:28 AM, Ralf Gommers >> >>> > wrote: >> >>> >> >> >>> >> >> >>> >> >> >>> >> On Sun, Feb 5, 2012 at 1:19 PM, wrote: >> >>> >>> >> >>> >>> >> >>> >>> >> >>> >>> On Sun, Feb 5, 2012 at 5:17 AM, Ralf Gommers >> >>> >>> wrote: >> >>> >>>> >> >>> >>>> Hi, >> >>> >>>> >> >>> >>>> There's a bug report and a number of new tests for mannwhitneyu >> >>> >>>> at >> >>> >>>> http://projects.scipy.org/scipy/ticket/1593. These plus a fix >> >>> >>>> were >> >>> >>>> contributed by Sebastian P?lsterl, unfortunately he based his >> >>> >>>> initial fix on >> >>> >>>> GPL'ed R code. Therefore I think we can't use that, even after he >> >>> >>>> modified >> >>> >>>> it. I looked at the GPL code too; I think we need someone who >> >>> >>>> didn't do that >> >>> >>>> to implement a new fix based only on the tests and bug report. >> >>> >>>> >> >>> >>>> Any takers? >> >>> >>> >> >>> >>> >> >>> >>> From what I remember my impression is that this is only a >> >>> >>> "cosmetic" >> >>> >>> change, or better a change in what is returned. >> >>> >>> >> >>> >>> >>> v, pval = stats.mannwhitneyu(x, y) >> >>> >>> >>> len(x)*len(y) - v >> >>> >>> 498.0 >> >>> >> >> >>> >> >> >>> >> Ah, okay. I'm not sure if this is a desirable change then. Any idea >> >>> >> why >> >>> >> it was implemented like this? >> >>> > >> >>> > >> >>> > No, I was just fixing bugs. This was one of the early tests I worked >> >>> > on >> >>> > when I didn't have stronger opinions what the standard or more >> >>> > informative >> >>> > returns are. Since the pvalues are correct, I didn't care too much >> >>> > about >> >>> > which test statistic is reported. >> >>> > >> >>> > Looking a bit closer, I'm in favor of the change. Returning the >> >>> > short >> >>> > tail instead of the asked for tail in a one-sided test is not really >> >>> > "clean", and trying to rewrite this, it's not easy to figure out >> >>> > which is >> >>> > which, 210 or 498. I haven't finished yet. I like requests with a >> >>> > full test >> >>> > suite. >> >>> > >> >>> > If I remember correctly, then we return almost all the time the >> >>> > two-sided test, so adding the option for one-sided test will be >> >>> > backwards >> >>> > compatible, but for mannwhitneyu it might not be possible. >> >>> >> >>> rewrite as a standalone function is attached >> >> >> >> >> >> Looks good, thanks. I updated the docstring and put it at >> >> https://github.com/rgommers/scipy/tree/mannwhitneyu-tests. >> > >> > I had managed to work with git, it just takes some time (and I cannot >> > test with current scipy) >> > >> > https://github.com/josef-pkt/scipy/commit/30aa361fc76dea7f7fd76c3f4f7babcd288f7c01 >> > >> > a bit more streamlining and I increased significant to 14 >> > >> >>> >> >>> >> >>> the last test was missing a self >> >>> >> >>> And I initially had a test failure, because I preferred the keyword >> >>> arguments in reversed sequence and the tests use a keyword arguments >> >>> as positional argument. >> >> >> >> >> >> Users can do that too, so you should never insert new keywords in the >> >> middle. >> >> >> >>> >> >>> Also just tried to match the tests without trying to understand every >> >>> detail again. 
>> >>> >> >>> I think it would be better if the default is two-sided but this will >> >>> double the reported p-value compared to the current version. >> >>> >> >> Not worth breaking backwards compatibility for I'd think. >> > >> > I think, there is no option in this version that would be backwards >> > compatible, since the old version calculated the two-sided test >> > statistic but reported only pvalue/2. >> > The old version was "correct" only with the right interpretation or >> > usage. > > > This seems worth fixing then. Since there's no good way to do this with a > deprecation and this function is not used often, why not give a warning on > usage in 0.11.0 and immediately switch to the fixed version? +1 Josef > >> > Since the results will be difficult, I would have really broken with >> > the old version and used the nicer ordering of keywords. >> > Let's switch them around then. If we're breaking compatibility anyway, it > doesn't matter. > > Ralf > >> > If we need to deprecate and break backwards compatibility, then it >> > might be worth to review the entire group of rank tests again. There >> > is at least one ticket on this, also with the comparison of various >> > options and results with matlab and R. >> >> here is the relevant ticket http://projects.scipy.org/scipy/ticket/901 >> >> 3 years ago I was in favor of backwards compatibility instead of "clean". >> Now, I would prefer "clean" >> >> there is one that can be closed, I think, >> http://projects.scipy.org/scipy/ticket/1289 >> and the ancient stats review ticket >> http://projects.scipy.org/scipy/ticket/109 >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From jba at sdf.lonestar.org Sun Feb 5 15:38:06 2012 From: jba at sdf.lonestar.org (Jeffrey Armstrong) Date: Sun, 5 Feb 2012 20:38:06 +0000 (UTC) Subject: [SciPy-Dev] Interest in a "controls" scikit? In-Reply-To: References: Message-ID: On Fri, 3 Feb 2012, Nils Wagner wrote: > https://github.com/avventi/Slycot > > might be useful. > Slycot and SLICOT are GPL'd code, so I was under the impression that they couldn't be included in SciPy directly. I don't know how people feel about GPL in a scikit. Jeff Armstrong - jba at sdf.lonestar.org SDF Public Access UNIX System - http://sdf.lonestar.org From jba at sdf.lonestar.org Sun Feb 5 15:40:07 2012 From: jba at sdf.lonestar.org (Jeffrey Armstrong) Date: Sun, 5 Feb 2012 20:40:07 +0000 (UTC) Subject: [SciPy-Dev] Interest in a "controls" scikit? In-Reply-To: References: Message-ID: On Fri, 3 Feb 2012, Pauli Virtanen wrote: > The algebraic Sylvester solver code sounds like it would be a good > addition to scipy.linalg --- that equation tends to pop up not only in > control theory, so it would be good to have the LAPACK routine wrapped. The LAPACK routine (*trsyl) was wrapped in pull request #124, but a more Pythonic interface would probably be advisable. Would Lyapunov or Riccati solver have a place in scipy.linalg? Jeff Armstrong - jba at sdf.lonestar.org SDF Public Access UNIX System - http://sdf.lonestar.org From josef.pktd at gmail.com Sun Feb 5 18:07:13 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 5 Feb 2012 18:07:13 -0500 Subject: [SciPy-Dev] doc edit: discrete ppf Message-ID: based on http://projects.scipy.org/scipy/ticket/1421#comment:8 I added an explanation to the stats tutorial for the ppf of discrete distributions, but I don't quite know how to create a link. 
http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/stats.rst anyone for review and fixing some rst? Thanks, Josef From jsseabold at gmail.com Sun Feb 5 18:16:43 2012 From: jsseabold at gmail.com (Skipper Seabold) Date: Sun, 5 Feb 2012 18:16:43 -0500 Subject: [SciPy-Dev] doc edit: discrete ppf In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 6:07 PM, wrote: > > based on http://projects.scipy.org/scipy/ticket/1421#comment:8 > > I added an explanation to the stats tutorial for the ppf of discrete > distributions, but I don't quite know how to create a link. > I just changed the link, if it's what you meant. `this is the link text `__ My two go-to reST tutorials. http://docutils.sourceforge.net/docs/user/rst/quickref.html#hyperlink-targets http://people.ee.ethz.ch/~creller/web/tricks/reST.html#hypertext-links > > http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/stats.rst > > anyone for review and fixing some rst? > > Thanks, > > Josef > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From josef.pktd at gmail.com Sun Feb 5 19:35:49 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 5 Feb 2012 19:35:49 -0500 Subject: [SciPy-Dev] doc edit: discrete ppf In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 6:16 PM, Skipper Seabold wrote: > On Sun, Feb 5, 2012 at 6:07 PM, wrote: >> >> based on http://projects.scipy.org/scipy/ticket/1421#comment:8 >> >> I added an explanation to the stats tutorial for the ppf of discrete >> distributions, but I don't quite know how to create a link. >> > > I just changed the link, if it's what you meant. > > `this is the link text > `__ Sorry, I wasn't clear, I had left this link to figure out how to make the link, but the link I wanted is an internal reference to an attached rst file just before that :ref:`(ppf of discrete random variables) ` This seems to be working now, but it wasn't before. maybe just some processing delay Thanks, Josef > > My two go-to reST tutorials. > > http://docutils.sourceforge.net/docs/user/rst/quickref.html#hyperlink-targets > http://people.ee.ethz.ch/~creller/web/tricks/reST.html#hypertext-links > >> >> http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/stats.rst >> >> anyone for review and fixing some rst? >> >> Thanks, >> >> Josef >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From ralf.gommers at googlemail.com Mon Feb 6 02:16:56 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Mon, 6 Feb 2012 08:16:56 +0100 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 6:09 PM, Ognen Duzlevski wrote: > Hello, > > On Sun, Feb 5, 2012 at 8:11 AM, Ralf Gommers > wrote: > > > > > > On Sun, Feb 5, 2012 at 2:02 PM, Pauli Virtanen wrote: > >> > >> Hi, > >> > >> 05.02.2012 12:59, Ralf Gommers kirjoitti: > >> > Is there an end in sight for the troubles with Trac? It's driving me > >> > completely crazy. So if not, I'm willing to invest some time in moving > >> > all the relevant content on the Trac wikis somewhere else (suggestions > >> > welcome). I'm thinking about things like development plans, GSOC info, > >> > list of maintainers per module, etc. 
> >> > >> Newer versions of Trac are supposed to alleviate the "database is > >> locked" error, which is a PITA. So upgrading the software could help. > >> > > Moving from SQLite to PostgreSQL is also supposed to help. > > > >> > >> I think moving stuff out from the Trac could be useful in any case, as > >> its account management is not so transparent. As for the developer wiki > >> info --- how about turning on wikis on Github and using them? Does > >> someone have experience on how usable they actually are? > > > > > > Haven't got experience, but see that they were just revamped two weeks > ago: > > https://github.com/blog/774-git-powered-wikis-improved. They still look > > quite simplistic, but we don't need much. > >> > >> > >> As for dropping Trac completely -- last time around, I think the Github > >> issues system lacked features. I think they've added some stuff since > >> then, though. > > > > > > If the database locked errors can be solved, I think Trac is still better > > than Github. > > Hello, > > Can I be of any assistance in this? I have an instance running on > Amazon that is slated for the new scipy.org box. Is anyone willing to > work with me to help migrate these things (or install a brand new > spanking instance of Trac on this separate machine, perhaps powered by > Postgres)? Would that be of any help? My problem is that having the > one scipy.org box right now that is critical does not allow me much > freedom to make a mistake which could possibly result in the scipy.org > box being down for some time (in order for me to back out of it)... > Thanks Ognen, that would help a lot. Just getting Trac to work would be the best solution here. I'm not one of the admins of the old machine and not very experienced with Trac, but if the current admins don't have time to help you I'll do what I can to help out. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From vanforeest at gmail.com Mon Feb 6 03:52:43 2012 From: vanforeest at gmail.com (nicky van foreest) Date: Mon, 6 Feb 2012 09:52:43 +0100 Subject: [SciPy-Dev] doc edit: discrete ppf In-Reply-To: References: Message-ID: Hi Josef, I don't quite know which parts to review, as I don't know quite how to figure out what parts you changed. BTW, while searching for `percent' in http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/stats.rst/ I encountered the following text: ..... We can obtain the list of available distribution through introspection: >>> dist_continu = [d for d in dir(stats) if ... isinstance(getattr(stats,d), stats.rv_continuous)] >>> dist_discrete = [d for d in dir(stats) if ... isinstance(getattr(stats,d), stats.rv_discrete)] >>> print 'number of continuous distributions:', len(dist_continu) number of continuous distributions: 84 >>> print 'number of discrete distributions: ', len(dist_discrete) number of discrete distributions: 12 Distributions can be used in one of two ways, either by passing all distribution parameters to each method call or by freezing the parameters for the instance of the distribution. As an example, we can get the median of the distribution by using the percent point function, ppf, which is the inverse of the cdf: ........... The sentence ` Distributions can be ...' seems to off-topic here, at least to me. If is intentionally located at this place, the example of the ppf is not the way to demonstrate how to freeze the rv. 
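[Editorial aside: for reference, the distinction in the quoted tutorial text is the following kind of example; only a sketch using the Poisson distribution, not a proposed wording for the tutorial.]

from scipy import stats

# passing the shape parameter on every call
median = stats.poisson.ppf(0.5, 3)     # ppf is the inverse of the cdf

# versus freezing the parameter once and reusing the instance
rv = stats.poisson(3)
median = rv.ppf(0.5)

# For a discrete distribution the ppf is a step function: ppf(q) is the
# smallest integer k with cdf(k) >= q, so feeding an exact cdf value back
# in recovers the same integer.
k = rv.ppf(rv.cdf(2))                  # 2.0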
Nicky On 6 February 2012 01:35, wrote: > On Sun, Feb 5, 2012 at 6:16 PM, Skipper Seabold wrote: >> On Sun, Feb 5, 2012 at 6:07 PM, wrote: >>> >>> based on http://projects.scipy.org/scipy/ticket/1421#comment:8 >>> >>> I added an explanation to the stats tutorial for the ppf of discrete >>> distributions, but I don't quite know how to create a link. >>> >> >> I just changed the link, if it's what you meant. >> >> `this is the link text >> `__ > > Sorry, I wasn't clear, I had left this link to figure out how to make > the link, but the link I wanted is an internal reference to an > attached rst file just before that > > :ref:`(ppf of discrete random variables) ` > > This seems to be working now, but it wasn't before. maybe just some > processing delay > > Thanks, > > Josef > >> >> My two go-to reST tutorials. >> >> http://docutils.sourceforge.net/docs/user/rst/quickref.html#hyperlink-targets >> http://people.ee.ethz.ch/~creller/web/tricks/reST.html#hypertext-links >> >>> >>> http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/stats.rst >>> >>> anyone for review and fixing some rst? >>> >>> Thanks, >>> >>> Josef >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/scipy-dev >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From stuart at mumford.me.uk Mon Feb 6 05:13:24 2012 From: stuart at mumford.me.uk (Stuart Mumford) Date: Mon, 6 Feb 2012 10:13:24 +0000 Subject: [SciPy-Dev] scikit-signal or Similar In-Reply-To: <4F2C09D4.2080201@unidata.ucar.edu> References: <4F2C09D4.2080201@unidata.ucar.edu> Message-ID: Hello, Sorry - I've been off the list for awhile as I was transitioning to my > first 'real world' job. Now that I'm back - hello! This was my first > attempt at object oriented programming - I'll need to take a look at the > code again to see what I was (or was not) thinking at the time :-) > Nice to have you around to help and I am glad you now have some time to help out! For a first attempt at OOP it's very good! (well much better than my first attempt was!) I was thinking about how would be best to implement it, we want to try and mirror as closely as possible the calling sequence of PyWavelets and it's DWT routines. They use the approach of >>> cA, cD = pywt.dwt(data,mother) where mother is either a string or an instance of a mother class. So I was thinking would the best code structure to be have a "Wavelet" class which is subclassed for cwt, icwt ccwt etc. and a Mother class which is subclassed for each family like Morlet, DOG etc. Then into the Wavelet instance you can pass either a subclass of mother or a string and go from there. This is pretty close to how your code is structured at the moment, just need to make the main cwt etc. functions subclasses of Wavelet? As for results I think it would be best to return a class when cwt etc. is called and then have attributes such as power, data, coefficients etc. ie wave = scipy.signal.wavelet.cwt.cwt(data,'morlet') pwr = wave.power wave.scalogram() etc. 
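[Editorial aside: purely as a strawman for the structure sketched above, it could look something like the following; the class names, the Morlet normalisation, the default scales and the string lookup are all placeholders rather than a worked-out proposal.]

import numpy as np

class Mother(object):
    """Base class for mother wavelets (placeholder, not an agreed API)."""
    def time_rep(self, t, scale):
        raise NotImplementedError

class Morlet(Mother):
    def __init__(self, w0=6.0):
        self.w0 = w0
    def time_rep(self, t, scale):
        x = t / float(scale)
        return np.pi ** -0.25 * np.exp(1j * self.w0 * x - x ** 2 / 2.0)

class cwt(object):
    """Continuous wavelet transform; the result object carries the data,
    the coefficients and derived quantities as attributes."""
    _mothers = {'morlet': Morlet}

    def __init__(self, data, mother='morlet', scales=None):
        self.data = np.asarray(data, dtype=float)
        self.mother = (mother if isinstance(mother, Mother)
                       else self._mothers[mother]())
        self.scales = np.arange(1, 33) if scales is None else np.asarray(scales)
        n = len(self.data)
        t = np.arange(n) - n // 2
        self.coefficients = np.array(
            [np.convolve(self.data,
                         self.mother.time_rep(t, s) / np.sqrt(s),
                         mode='same')
             for s in self.scales])

    @property
    def power(self):
        return np.abs(self.coefficients) ** 2

# usage mirroring the calling sequence proposed above
# wave = cwt(data, 'morlet')
# pwr = wave.power

Whether a scalogram() plotting method belongs on the result object or only in the documentation examples is the open question raised further down.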
> > > I shall work on improving that code, I can implement more and more > general > > Mother wavelets and also write some examples and update the plotting > > routine to use mpl's make_axes_locatable if people think that is a better > > way to go. > > > > I'd be happy to start working on this again. I'm still finishing up my > PhD, but I am now working as a developer at UCAR / Unidata and have some > time I can officially spend on 'guerrilla projects' like this > (especially since it can/will benefit the atmospheric science community)! > I am also doing a Solar Physics PhD so this stuff gets used in lots of places!! I think as a todo list things that need doing are: * Agree on a calling and code structure and implement. * Implement other wavelet families and make all general in order m (ie Mexican hat is just m=2 of DOG) * Implement a significance contouring routine. * Develop some tests to show it works! There will probably be many little things to find on the way along, but I think they are the major points? One other question, do we want a plotting routine inside the module or keep it as an example in the documentation? I ask because I don't know what is the norm. Stuart -------------- next part -------------- An HTML attachment was scrubbed... URL: From ognen at enthought.com Mon Feb 6 08:41:55 2012 From: ognen at enthought.com (Ognen Duzlevski) Date: Mon, 6 Feb 2012 07:41:55 -0600 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: Ralf, > Thanks Ognen, that would help a lot. Just getting Trac to work would be the > best solution here. I'm not one of the admins of the old machine and not > very experienced with Trac, but if the current admins don't have time to > help you I'll do what I can to help out. I am open to suggestions or help with Trac on the current machine or moving it to the Amazon EC2 instance (where all of it will live eventually). I don't have much experience with Trac either and that's why I am reluctant to just upgrade it as is on the "production" server. Ognen From denis.laxalde at mcgill.ca Mon Feb 6 09:20:32 2012 From: denis.laxalde at mcgill.ca (Denis Laxalde) Date: Mon, 6 Feb 2012 09:20:32 -0500 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: <20120206092032.74fb25cf@mcgill.ca> Ralf Gommers wrote: > > I think moving stuff out from the Trac could be useful in any case, > > as its account management is not so transparent. As for the > > developer wiki info --- how about turning on wikis on Github and > > using them? Does someone have experience on how usable they > > actually are? > > Haven't got experience, but see that they were just revamped two > weeks ago: https://github.com/blog/774-git-powered-wikis-improved. > They still look quite simplistic, but we don't need much. I also think a github wiki is sufficient. It's quite easy to work with and the fact that the content is stored in git is definitely a plus IMO. Otherwise, there is also the current scipy.org wiki (MoinMoin) but I guess there's a good reason for not using it? > > As for dropping Trac completely -- last time around, I think the > > Github issues system lacked features. I think they've added some > > stuff since then, though. > > > > If the database locked errors can be solved, I think Trac is still > better than Github. And you probably also keep a higher control on data since github issues are not stored in git but in a database (though there's an API to access it). 
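[Editorial aside: the API mentioned here is GitHub's v3 REST interface; as a rough illustration, pulling issue data back out could look like the snippet below. The repository name is only an example and pagination is ignored.]

import json
import urllib2   # Python 2 standard library, matching the era of this thread

# List open issues for a repository through the GitHub v3 API.
url = 'https://api.github.com/repos/numpy/numpy/issues?state=open'
issues = json.load(urllib2.urlopen(url))
for issue in issues:
    print issue['number'], issue['title']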
-- Denis From denis.laxalde at mcgill.ca Mon Feb 6 11:30:41 2012 From: denis.laxalde at mcgill.ca (Denis Laxalde) Date: Mon, 6 Feb 2012 11:30:41 -0500 Subject: [SciPy-Dev] updating optimize tutorial & doc questions Message-ID: <20120206113041.6ce57313@mcgill.ca> Hi, I've completed the minimize function which provides a common interface to both unconstrained an constrained minimizers of multivariate functions. Now I wonder if the optimize tutorial should be updated in order to put this forward. I think new users should be pointed to this interface instead of individual functions and that it should be clearer in the tutorial. What do others think? Besides, a couple of questions about the documentation on : - How is it actually updated? - The content of reference pages (e.g. ) appears to be generated using automodule. There is also some bits in packages' __init__.py file but this is not consistent with the latter. So what's the documentation in __init__.py? Thanks. -- Denis From josef.pktd at gmail.com Mon Feb 6 11:46:50 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Mon, 6 Feb 2012 11:46:50 -0500 Subject: [SciPy-Dev] updating optimize tutorial & doc questions In-Reply-To: <20120206113041.6ce57313@mcgill.ca> References: <20120206113041.6ce57313@mcgill.ca> Message-ID: On Mon, Feb 6, 2012 at 11:30 AM, Denis Laxalde wrote: > Hi, > > I've completed the minimize function which provides a common interface > to both unconstrained an constrained minimizers of multivariate > functions. Now I wonder if the optimize tutorial should be updated > in order to put this forward. I think new users should be pointed to > this interface instead of individual functions and that it should be > clearer in the tutorial. What do others think? > > Besides, a couple of questions about the documentation on > : > > ?- How is it actually updated? either edit the source in git or use the online editor, e.g. http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/optimize.rst/ I like the editor because it provides an immediate check on the correctness of the rst. The pages edited on the online documentation editor get merged into the source (ir)regularly > ?- The content of reference pages (e.g. > ? ) appears to > ? be generated using automodule. There is also some bits in packages' > ? __init__.py file but this is not consistent with the latter. So > ? what's the documentation in __init__.py? It looks to me that it is now all in the module docstring, so the content of the main subpackage page should be directly the information in the __init__.py. What's the inconsistency? I think updating the tutorial will be very useful in this case. Josef > > Thanks. > > -- > Denis > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From denis.laxalde at mcgill.ca Mon Feb 6 12:03:04 2012 From: denis.laxalde at mcgill.ca (Denis Laxalde) Date: Mon, 6 Feb 2012 12:03:04 -0500 Subject: [SciPy-Dev] updating optimize tutorial & doc questions In-Reply-To: References: <20120206113041.6ce57313@mcgill.ca> Message-ID: <20120206120304.5d1b0083@mcgill.ca> josef.pktd at gmail.com wrote: > > ?- The content of reference pages (e.g. > > ? ) appears to > > ? be generated using automodule. There is also some bits in packages' > > ? __init__.py file but this is not consistent with the latter. So > > ? what's the documentation in __init__.py? 
> > It looks to me that it is now all in the module docstring, so the > content of the main subpackage page should be directly the information > in the __init__.py. > What's the inconsistency? It's kind of a mix of both contents. For example, in sparse/__init__.py, there is: eye - Sparse MxN matrix whose k-th diagonal is all ones the online doc shows: eye(m, n[, k, dtype, format]) eye(m, n) returns a sparse (m x n) matrix where the k-th diagonal The latter description is the first line of `eye`'s docstring. This is expected due to the `.. automodule:: scipy.sparse` which appears in doc/source/sparse.rst. Some other parts of __init__.py (like examples) are propagated "correctly". -- Denis From josef.pktd at gmail.com Mon Feb 6 12:26:14 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Mon, 6 Feb 2012 12:26:14 -0500 Subject: [SciPy-Dev] updating optimize tutorial & doc questions In-Reply-To: <20120206120304.5d1b0083@mcgill.ca> References: <20120206113041.6ce57313@mcgill.ca> <20120206120304.5d1b0083@mcgill.ca> Message-ID: On Mon, Feb 6, 2012 at 12:03 PM, Denis Laxalde wrote: > josef.pktd at gmail.com wrote: >> > ?- The content of reference pages (e.g. >> > ? ) appears to >> > ? be generated using automodule. There is also some bits in packages' >> > ? __init__.py file but this is not consistent with the latter. So >> > ? what's the documentation in __init__.py? >> >> It looks to me that it is now all in the module docstring, so the >> content of the main subpackage page should be directly the information >> in the __init__.py. >> What's the inconsistency? > > It's kind of a mix of both contents. For example, in sparse/__init__.py, > there is: > > ?eye - Sparse MxN matrix whose k-th diagonal is all ones > > the online doc shows: > > ?eye(m, n[, k, dtype, format]) ? ? ? ? eye(m, n) returns a sparse (m x n) matrix where the k-th diagonal > > The latter description is the first line of `eye`'s docstring. This > is expected due to the `.. automodule:: scipy.sparse` which appears in > doc/source/sparse.rst. Some other parts of __init__.py (like examples) > are propagated "correctly". Ok, Warren just had a case where there was a rendering problem on one side. I haven't kept up with all the changes to the docs (since the info.py got dropped) The problem was always how to make it readable in the sphinx rendered docs and in the interpreter. The content of __init__.py shows up in help(sparse) or print sparse.__doc__ and has the extra information, that obviously gets overwritten by sphinx. Josef > > -- > Denis > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From warren.weckesser at enthought.com Mon Feb 6 12:52:24 2012 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Mon, 6 Feb 2012 11:52:24 -0600 Subject: [SciPy-Dev] updating optimize tutorial & doc questions In-Reply-To: References: <20120206113041.6ce57313@mcgill.ca> <20120206120304.5d1b0083@mcgill.ca> Message-ID: On Mon, Feb 6, 2012 at 11:26 AM, wrote: > On Mon, Feb 6, 2012 at 12:03 PM, Denis Laxalde > wrote: > > josef.pktd at gmail.com wrote: > >> > - The content of reference pages (e.g. > >> > ) appears to > >> > be generated using automodule. There is also some bits in packages' > >> > __init__.py file but this is not consistent with the latter. So > >> > what's the documentation in __init__.py? 
> >> > >> It looks to me that it is now all in the module docstring, so the > >> content of the main subpackage page should be directly the information > >> in the __init__.py. > >> What's the inconsistency? > > > > It's kind of a mix of both contents. For example, in sparse/__init__.py, > > there is: > > > > eye - Sparse MxN matrix whose k-th diagonal is all ones > > > > the online doc shows: > > > > eye(m, n[, k, dtype, format]) eye(m, n) returns a sparse (m x > n) matrix where the k-th diagonal > > > > The latter description is the first line of `eye`'s docstring. This > > is expected due to the `.. automodule:: scipy.sparse` which appears in > > doc/source/sparse.rst. Some other parts of __init__.py (like examples) > > are propagated "correctly". > > Ok, Warren just had a case where there was a rendering problem on one side. > > I haven't kept up with all the changes to the docs (since the info.py > got dropped) > > The problem was always how to make it readable in the sphinx rendered > docs and in the interpreter. The content of __init__.py shows up in > help(sparse) or print sparse.__doc__ and has the extra information, > that obviously gets overwritten by sphinx. > Expanding on Josef's comment: The module-level docs violate the "don't repeat yourself" rule (but it is much better now than it used to be, thanks to Ralf). Here's an example: In scipy/integrate/__init__.py, you'll find: """ ============================================= Integration and ODEs (:mod:`scipy.integrate`) ============================================= .. currentmodule:: scipy.integrate Integrating functions, given function object ============================================ .. autosummary:: :toctree: generated/ quad -- General purpose integration. dblquad -- General purpose double integration. tplquad -- General purpose triple integration. fixed_quad -- Integrate func(x) using Gaussian quadrature of order n. quadrature -- Integrate with given tolerance using Gaussian quadrature. romberg -- Integrate func using Romberg integration. etc. The documentation generated by Sphinx looks like this: http://docs.scipy.org/doc/scipy/reference/integrate.html The autosummary directive tells Sphinx to get the descriptions from the actual docstrings of the functions; the text after the function names listed in __init__.py is ignored. If you are in an interactive shell, however, and you enter >>> from scipy import integrate >>> help(integrate) you will see the text exactly as it appears in __init__.py, so we don't want to remove the (redundant) short descriptions in that file. This mean you must copy the first line from the function's docstring into __init__.py when you create a new function or modify a function's docstring. Warren -------------- next part -------------- An HTML attachment was scrubbed... URL: From denis.laxalde at mcgill.ca Mon Feb 6 13:00:40 2012 From: denis.laxalde at mcgill.ca (Denis Laxalde) Date: Mon, 6 Feb 2012 13:00:40 -0500 Subject: [SciPy-Dev] updating optimize tutorial & doc questions In-Reply-To: References: <20120206113041.6ce57313@mcgill.ca> <20120206120304.5d1b0083@mcgill.ca> Message-ID: <20120206130040.08b145e4@mcgill.ca> Warren Weckesser wrote: > This mean you must copy the first line from the > function's docstring into __init__.py when you create a new > function or modify a function's docstring. Ok. Thanks! 
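[Editorial aside: concretely, the two consumers of that text can be compared from an interactive session (Python 2 syntax, matching the docs of this era).]

from scipy import integrate

# help() shows the hand-written module docstring from __init__.py,
# including the "quad -- General purpose integration." one-liners:
help(integrate)

# Sphinx's autosummary ignores those one-liners and re-extracts the first
# line of each function's own docstring instead, i.e. this line:
print integrate.quad.__doc__.strip().splitlines()[0]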
-- Denis From denis.laxalde at mcgill.ca Mon Feb 6 13:05:07 2012 From: denis.laxalde at mcgill.ca (Denis Laxalde) Date: Mon, 6 Feb 2012 13:05:07 -0500 Subject: [SciPy-Dev] updating optimize tutorial & doc questions In-Reply-To: References: <20120206113041.6ce57313@mcgill.ca> Message-ID: <20120206130507.0dac5998@mcgill.ca> josef.pktd at gmail.com wrote: > > ?- How is it actually updated? > > either edit the source in git or use the online editor, e.g. > http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/optimize.rst/ > I like the editor because it provides an immediate check on the > correctness of the rst. > > The pages edited on the online documentation editor get merged into > the source (ir)regularly Once merged in git source, how is the website updated? Is there any build bot or is this done manually? -- Denis From josef.pktd at gmail.com Mon Feb 6 13:11:45 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Mon, 6 Feb 2012 13:11:45 -0500 Subject: [SciPy-Dev] updating optimize tutorial & doc questions In-Reply-To: <20120206130507.0dac5998@mcgill.ca> References: <20120206113041.6ce57313@mcgill.ca> <20120206130507.0dac5998@mcgill.ca> Message-ID: On Mon, Feb 6, 2012 at 1:05 PM, Denis Laxalde wrote: > josef.pktd at gmail.com wrote: >> > ?- How is it actually updated? >> >> either edit the source in git or use the online editor, e.g. >> http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/optimize.rst/ >> I like the editor because it provides an immediate check on the >> correctness of the rst. >> >> The pages edited on the online documentation editor get merged into >> the source (ir)regularly > > Once merged in git source, how is the website updated? Is there any > build bot or is this done manually? Pauli had it set up so it gets automatically updated, I think it usually took 2 to 4 days. I assume it's still the same. Josef > > -- > Denis > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From ralf.gommers at googlemail.com Mon Feb 6 14:02:31 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Mon, 6 Feb 2012 20:02:31 +0100 Subject: [SciPy-Dev] updating optimize tutorial & doc questions In-Reply-To: References: <20120206113041.6ce57313@mcgill.ca> <20120206130507.0dac5998@mcgill.ca> Message-ID: On Mon, Feb 6, 2012 at 7:11 PM, wrote: > On Mon, Feb 6, 2012 at 1:05 PM, Denis Laxalde > wrote: > > josef.pktd at gmail.com wrote: > >> > - How is it actually updated? > >> > >> either edit the source in git or use the online editor, e.g. > >> http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/optimize.rst/ > >> I like the editor because it provides an immediate check on the > >> correctness of the rst. > >> > >> The pages edited on the online documentation editor get merged into > >> the source (ir)regularly > > > > Once merged in git source, how is the website updated? Is there any > > build bot or is this done manually? > > Pauli had it set up so it gets automatically updated, I think it > usually took 2 to 4 days. I assume it's still the same. > > Still the same, development docs get built automatically, for a release they have to be built and uploaded by hand. Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From denis.laxalde at mcgill.ca Mon Feb 6 16:23:21 2012 From: denis.laxalde at mcgill.ca (Denis Laxalde) Date: Mon, 6 Feb 2012 16:23:21 -0500 Subject: [SciPy-Dev] numpydoc installation and path Message-ID: <20120206162321.072f8795@mcgill.ca> Hi, While trying to build scipy's documentation, I had to install numpydoc (python setup.py --install --user). This apparently generates an 'easy-install.pth' file which contains: import sys; sys.__plen = len(sys.path) ./numpydoc-0.4-py2.6.egg /usr/lib/pymodules/python2.6 import sys; new=sys.path[sys.__plen:]; del sys.path[sys.__plen:]; p=getattr(sys,'__egginsert',0); sys.path[p:p]=new; sys.__egginsert = p+len(new) This happens to break the python path prepending /usr/lib/pymodules/python2.6 and numpydoc-0.4-py2.6.egg. What is the purpose of this? Is that intended? Thanks. -- Denis From pav at iki.fi Mon Feb 6 16:37:05 2012 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 06 Feb 2012 22:37:05 +0100 Subject: [SciPy-Dev] numpydoc installation and path In-Reply-To: <20120206162321.072f8795@mcgill.ca> References: <20120206162321.072f8795@mcgill.ca> Message-ID: 06.02.2012 22:23, Denis Laxalde kirjoitti: > While trying to build scipy's documentation, I had to install numpydoc > (python setup.py --install --user). This apparently generates an > 'easy-install.pth' file which contains: > > import sys; sys.__plen = len(sys.path) > ./numpydoc-0.4-py2.6.egg > /usr/lib/pymodules/python2.6 > import sys; new=sys.path[sys.__plen:]; del sys.path[sys.__plen:]; p=getattr(sys,'__egginsert',0); sys.path[p:p]=new; sys.__egginsert = p+len(new) > > This happens to break the python path prepending > /usr/lib/pymodules/python2.6 and numpydoc-0.4-py2.6.egg. > > What is the purpose of this? Is that intended? That's some setuptools stuff. Probably nothing specific to numpydoc, but would happen for any setuptools-using package: http://packages.python.org/distribute/ http://peak.telecommunity.com/DevCenter/EasyInstall From thomas at kluyver.me.uk Mon Feb 6 16:47:05 2012 From: thomas at kluyver.me.uk (Thomas Kluyver) Date: Mon, 6 Feb 2012 21:47:05 +0000 Subject: [SciPy-Dev] new.scipy.org Message-ID: I've found that searches can turn up pages from new.scipy.org, a subdomain which was seemingly last updated in 2009, and now has out of date info - e.g. references to SVN (http://new.scipy.org/download.html) and FAQ about Python 3 support (http://new.scipy.org/faq.html#python-version-support). I can't see any details of who maintains the website, but I think leaving out of date information around is liable to cause confusion - can it be taken down or brought up to date? Thanks, Thomas -------------- next part -------------- An HTML attachment was scrubbed... URL: From denis.laxalde at mcgill.ca Mon Feb 6 16:55:09 2012 From: denis.laxalde at mcgill.ca (Denis Laxalde) Date: Mon, 6 Feb 2012 16:55:09 -0500 Subject: [SciPy-Dev] numpydoc installation and path In-Reply-To: References: <20120206162321.072f8795@mcgill.ca> Message-ID: <20120206165509.513c1757@mcgill.ca> Pauli Virtanen wrote: > That's some setuptools stuff. Probably nothing specific to numpydoc, but > would happen for any setuptools-using package: > > http://packages.python.org/distribute/ > > http://peak.telecommunity.com/DevCenter/EasyInstall I think I currently use the former. Is there any alternative? -- Denis From pav at iki.fi Mon Feb 6 16:58:55 2012 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 06 Feb 2012 22:58:55 +0100 Subject: [SciPy-Dev] Trac performance? 
In-Reply-To: References: Message-ID: Hi, 06.02.2012 14:41, Ognen Duzlevski kirjoitti: [clip] > I am open to suggestions or help with Trac on the current machine or > moving it to the Amazon EC2 instance (where all of it will live > eventually). I don't have much experience with Trac either and that's > why I am reluctant to just upgrade it as is on the "production" > server. If the long term hosting plan is to move things onto EC2, I think your suggestion about moving services there rather than just working on the current box makes much sense. If you need extra hands there, I can reserve time for helping out. I guess all of the *.scipy.org stuff, including the scipy.org wiki would be moving? Moving the Trac could be a useful pilot. Setting it up in principle should be relatively straightforward. The only question is whether it really works without a SVN repo, and whether the tiny Github integration plugin works out of the box for the new Trac version. I can take a look at the latter two on my own box to smoothen the road a bit here. It would also make sense to use a real DB as the backend rather than SQLite, but I don't know beforehand if there are bumps in SQLite -> something else conversion. Pauli From pav at iki.fi Mon Feb 6 17:01:32 2012 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 06 Feb 2012 23:01:32 +0100 Subject: [SciPy-Dev] numpydoc installation and path In-Reply-To: <20120206165509.513c1757@mcgill.ca> References: <20120206162321.072f8795@mcgill.ca> <20120206165509.513c1757@mcgill.ca> Message-ID: 06.02.2012 22:55, Denis Laxalde kirjoitti: > Pauli Virtanen wrote: >> That's some setuptools stuff. Probably nothing specific to numpydoc, but >> would happen for any setuptools-using package: >> >> http://packages.python.org/distribute/ >> >> http://peak.telecommunity.com/DevCenter/EasyInstall > > I think I currently use the former. Is there any alternative? Getting OT, but removing the 'import setuptools' etc. lines from setup.py should work. Or, you can just delete the *.pth file, and manually unpack the packages from inside the egg files (they're zip files). -- Pauli Virtanen From cournape at gmail.com Mon Feb 6 17:09:01 2012 From: cournape at gmail.com (David Cournapeau) Date: Mon, 6 Feb 2012 22:09:01 +0000 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 1:02 PM, Pauli Virtanen wrote: > Hi, > > 05.02.2012 12:59, Ralf Gommers kirjoitti: >> Is there an end in sight for the troubles with Trac? It's driving me >> completely crazy. So if not, I'm willing to invest some time in moving >> all the relevant content on the Trac wikis somewhere else (suggestions >> welcome). I'm thinking about things like development plans, GSOC info, >> list of maintainers per module, etc. > > Newer versions of Trac are supposed to alleviate the "database is > locked" error, which is a PITA. So upgrading the software could help. > > I think moving stuff out from the Trac could be useful in any case, as > its account management is not so transparent. As for the developer wiki > info --- how about turning on wikis on Github and using them? Does > someone have experience on how usable they actually are? > > As for dropping Trac completely -- last time around, I think the Github > issues system lacked features. I think they've added some stuff since > then, though. They have added milestone, which is something that is a must for release management IMO. 
Otherwise, it is quite limited, but OTOH, that's one less thing to worry about, and it is not like trac is a great issue tracker either. github actually has a few features that are sorely lacking in our current trac, like a decent REST API and mass-editing of existing issues. cheers, David From ognen at enthought.com Mon Feb 6 18:39:50 2012 From: ognen at enthought.com (Ognen Duzlevski) Date: Mon, 6 Feb 2012 17:39:50 -0600 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: Pauli, On Mon, Feb 6, 2012 at 3:58 PM, Pauli Virtanen wrote: > Hi, > > 06.02.2012 14:41, Ognen Duzlevski kirjoitti: > [clip] >> I am open to suggestions or help with Trac on the current machine or >> moving it to the Amazon EC2 instance (where all of it will live >> eventually). I don't have much experience with Trac either and that's >> why I am reluctant to just upgrade it as is on the "production" >> server. > > If the long term hosting plan is to move things onto EC2, I think your > suggestion about moving services there rather than just working on the > current box makes much sense. If you need extra hands there, I can > reserve time for helping out. I guess all of the *.scipy.org stuff, > including the scipy.org wiki would be moving? > > Moving the Trac could be a useful pilot. Setting it up in principle > should be relatively straightforward. The only question is whether it > really works without a SVN repo, and whether the tiny Github integration > plugin works out of the box for the new Trac version. I can take a look > at the latter two on my own box to smoothen the road a bit here. It > would also make sense to use a real DB as the backend rather than > SQLite, but I don't know beforehand if there are bumps in SQLite -> > something else conversion. Yes, moving to EC2 is a goal - this is where the hosting will end up, on a large instance with enough CPU, RAM etc. Any help (from you or anyone else) would be appreciated. I will send you an email and we can discuss the steps further. Thanks, Ognen From charlesr.harris at gmail.com Mon Feb 6 19:39:01 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 6 Feb 2012 17:39:01 -0700 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: On Mon, Feb 6, 2012 at 3:09 PM, David Cournapeau wrote: > On Sun, Feb 5, 2012 at 1:02 PM, Pauli Virtanen wrote: > > Hi, > > > > 05.02.2012 12:59, Ralf Gommers kirjoitti: > >> Is there an end in sight for the troubles with Trac? It's driving me > >> completely crazy. So if not, I'm willing to invest some time in moving > >> all the relevant content on the Trac wikis somewhere else (suggestions > >> welcome). I'm thinking about things like development plans, GSOC info, > >> list of maintainers per module, etc. > > > > Newer versions of Trac are supposed to alleviate the "database is > > locked" error, which is a PITA. So upgrading the software could help. > > > > I think moving stuff out from the Trac could be useful in any case, as > > its account management is not so transparent. As for the developer wiki > > info --- how about turning on wikis on Github and using them? Does > > someone have experience on how usable they actually are? > > > > As for dropping Trac completely -- last time around, I think the Github > > issues system lacked features. I think they've added some stuff since > > then, though. > > They have added milestone, which is something that is a must for > release management IMO. 
Otherwise, it is quite limited, but OTOH, > that's one less thing to worry about, and it is not like trac is a > great issue tracker either. github actually has a few features that > are sorely lacking in our current trac, like a decent REST API and > mass-editing of existing issues. > > And github will probably continue to improve issue tracking in the future at a decent rate. It might make sense to get on that train now instead of running after the caboose later on. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Mon Feb 6 21:24:51 2012 From: fperez.net at gmail.com (Fernando Perez) Date: Mon, 6 Feb 2012 18:24:51 -0800 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: On Mon, Feb 6, 2012 at 4:39 PM, Charles R Harris wrote: > > And github will probably continue to improve issue tracking in the future at > a decent rate. It might make sense to get on that train now instead of > running after the caboose later on. Agreed. I think it's really worth deciding explicitly if it makes sense to continue self-hosting things like issue trackers or wikis at this point. For IPython we decided to self-host a wiki, but we had our reasons and it wasn't the main site anymore, just a user-oriented wiki. With the new improvements to the github wiki, even that decision might have been different for us (not sure, but at least we'd consider it). There's a non-negligible cost in overhead (even if Enthought generously provides the resources and shoulders the bill) for a project in manually hosted solutions, the streamlined administration that GH offers as well as the ability to trivially delegate by creating ad-hoc teams with new membership, is IMHO something worth considering. The only thing we kept on scipy.org for ipython were the mailing lists: that's the one service for which there's never been a single problem, slowdown or issue, and we saw no reason to make changes there. But for everything else, the move to the GH hosting facilities has really been a net positive for us. Just offering our perspective, not saying that scipy must necessarily make the same choices we did. Best, f From travis at continuum.io Mon Feb 6 22:09:20 2012 From: travis at continuum.io (Travis Oliphant) Date: Mon, 6 Feb 2012 21:09:20 -0600 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: <50CD3388-B3CE-4D68-AA6E-FF40C2B75953@continuum.io> On Feb 6, 2012, at 6:39 PM, Charles R Harris wrote: > > > On Mon, Feb 6, 2012 at 3:09 PM, David Cournapeau wrote: > On Sun, Feb 5, 2012 at 1:02 PM, Pauli Virtanen wrote: > > Hi, > > > > 05.02.2012 12:59, Ralf Gommers kirjoitti: > >> Is there an end in sight for the troubles with Trac? It's driving me > >> completely crazy. So if not, I'm willing to invest some time in moving > >> all the relevant content on the Trac wikis somewhere else (suggestions > >> welcome). I'm thinking about things like development plans, GSOC info, > >> list of maintainers per module, etc. > > > > Newer versions of Trac are supposed to alleviate the "database is > > locked" error, which is a PITA. So upgrading the software could help. > > > > I think moving stuff out from the Trac could be useful in any case, as > > its account management is not so transparent. As for the developer wiki > > info --- how about turning on wikis on Github and using them? Does > > someone have experience on how usable they actually are? 
> > > > As for dropping Trac completely -- last time around, I think the Github > > issues system lacked features. I think they've added some stuff since > > then, though. > > They have added milestone, which is something that is a must for > release management IMO. Otherwise, it is quite limited, but OTOH, > that's one less thing to worry about, and it is not like trac is a > great issue tracker either. github actually has a few features that > are sorely lacking in our current trac, like a decent REST API and > mass-editing of existing issues. > > > And github will probably continue to improve issue tracking in the future at a decent rate. It might make sense to get on that train now instead of running after the caboose later on. > > Chuck I'm very supportive of this because of the ease of management and the APIs available on github. I met Tom Preston-Warner (founder of GitHub) last fall and had breakfast with him. He is aware of the SciPy community, and I mentioned the issue tracking problems when we talked. Perhaps a few of us could contact him, thank him for the great progress made so far on the issue tracking, and continue to gently encourage him in the direction of better Issue Tracking. It does sound like things are getting better on that front. Best regards, -Travis -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Mon Feb 6 22:36:23 2012 From: fperez.net at gmail.com (Fernando Perez) Date: Mon, 6 Feb 2012 19:36:23 -0800 Subject: [SciPy-Dev] Trac performance? In-Reply-To: <50CD3388-B3CE-4D68-AA6E-FF40C2B75953@continuum.io> References: <50CD3388-B3CE-4D68-AA6E-FF40C2B75953@continuum.io> Message-ID: On Mon, Feb 6, 2012 at 7:09 PM, Travis Oliphant wrote: > I met Tom Preston-Warner (founder of GitHub) last fall and had breakfast > with him. ? He is aware of the SciPy community, and I mentioned the issue > tracking problems when we talked. ? Perhaps a few of us could contact him, > thank him for the great progress made so far on the issue tracking, and > continue to gently encourage him in the direction of better Issue Tracking. Already done :) Ariel Rokem met some githubbers recently, and he gave a great talk at github a few weeks ago, mostly focusing on neuroscience but where the discussion tailed off into issues that include the growing role of github in open source scientific computing and python. We now plan to continue that discussion with a talk I'm set to deliver there in a few weeks, focusing entirely on the scipy ecosystem and github's role in it. We haven't set a date yet b/c right now I'm traveling abroad and pydata/pycon are right after I get back, so we're looking for the right slot calendar-wise. It will most likely be in early April. Obviously, I'd love to have specific points anyone else may have in mind (I have my own list), so by all means post them here, and in a few weeks when I go over there I'll bring them up. I will then report back on-list. Cheers, f From scott.sinclair.za at gmail.com Tue Feb 7 01:24:39 2012 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Tue, 7 Feb 2012 08:24:39 +0200 Subject: [SciPy-Dev] new.scipy.org In-Reply-To: References: Message-ID: On 6 February 2012 23:47, Thomas Kluyver wrote: > I've found that searches can turn up pages from new.scipy.org, a subdomain > which was seemingly last updated in 2009, and now has out of date info - > e.g. 
references to SVN (http://new.scipy.org/download.html) and FAQ about > Python 3 support (http://new.scipy.org/faq.html#python-version-support). I > can't see any details of who maintains the website, but I think leaving out > of date information around is liable to cause confusion - can it be taken > down or brought up to date? Hi Thomas, This issue is currently getting some attention (see this thread on the Numpy list - http://thread.gmane.org/gmane.comp.python.numeric.general/47464). The updated content from new.scipy.org is now at http://scipy.github.com and since this mailing list thread is well named, we may as well continue the discussion here. I'd welcome suggestions/criticism etc. regarding the content of scipy.github.com and the migration plan. Pull requests against https://github.com/scipy/scipy.org-new would also be great. Cheers, Scott From sturla at molden.no Tue Feb 7 08:38:41 2012 From: sturla at molden.no (Sturla Molden) Date: Tue, 07 Feb 2012 14:38:41 +0100 Subject: [SciPy-Dev] [Numpy-discussion] Moving to gcc 4.* for win32 installers ? In-Reply-To: References: Message-ID: <4F312961.5040706@molden.no> On 27.10.2011 15:02, David Cournapeau wrote: > - we need to recompile atlas (but I can take care of it) May I suggest GotoBLAS2 instead of ATLAS? Is is faster (comparable to MKL), easier to build, and now released under BSD licence. http://www.tacc.utexas.edu/tacc-projects/gotoblas2 Sturla From matt.terry at gmail.com Tue Feb 7 13:27:00 2012 From: matt.terry at gmail.com (Matt Terry) Date: Tue, 7 Feb 2012 10:27:00 -0800 Subject: [SciPy-Dev] slices of sparse matrixes Message-ID: Do scipy sparse matrixes have the same slicing semantics as numpy arrays? I would expect the same slice of a numpy array and a scipy array to have the same shape. In this simple test, they do not. I'm using scipy 0.10. >>> d = numpy.zeros( (4,3) ) >>> s = scipy.sparse.lil_matrix(d) >>> d[:,0].shape == s[:,0].shape False Bug or feature? -matt From matt.terry at gmail.com Tue Feb 7 13:31:53 2012 From: matt.terry at gmail.com (Matt Terry) Date: Tue, 7 Feb 2012 10:31:53 -0800 Subject: [SciPy-Dev] slices of sparse matrixes In-Reply-To: References: Message-ID: sparse matrixes are *matrixes* not *ndarrays* The two slice differently. >>> d = numpy.matrix( (4,3) ) >>> s = scipy.sparse.lil_matrix(d) >>> d[:,0].shape == s[:,0].shape True It seems the best way to answer your own dumb question is to ask it publicly : ) -matt From warren.weckesser at enthought.com Tue Feb 7 13:34:22 2012 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Tue, 7 Feb 2012 12:34:22 -0600 Subject: [SciPy-Dev] slices of sparse matrixes In-Reply-To: References: Message-ID: On Tue, Feb 7, 2012 at 12:31 PM, Matt Terry wrote: > sparse matrixes are *matrixes* not *ndarrays* The two slice differently. > > >>> d = numpy.matrix( (4,3) ) > >>> s = scipy.sparse.lil_matrix(d) > >>> d[:,0].shape == s[:,0].shape > True > > It seems the best way to answer your own dumb question is to ask it > publicly : ) > > Been there, done that. :) I think it was on the scikits.learn mailing list that I saw some interest in a sparse ndarray object. That would be a nice addition to the scipy sparse library. Warren -------------- next part -------------- An HTML attachment was scrubbed... 
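A minimal sketch confirming the matrix-style semantics with a non-degenerate example (note that numpy.matrix( (4,3) ) above actually builds the 1x2 matrix [[4, 3]], so that comparison is trivially true); nothing here beyond plain numpy/scipy calls:

import numpy
import scipy.sparse

d = numpy.zeros((4, 3))
m = numpy.matrix(d)                  # a genuine 4x3 matrix
s = scipy.sparse.lil_matrix(d)

d[:, 0].shape   # (4,)   -- ndarray slicing drops the dimension
m[:, 0].shape   # (4, 1) -- matrix slicing stays 2-D
s[:, 0].shape   # (4, 1) -- sparse matrices follow the matrix semantics
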
URL: From nwagner at iam.uni-stuttgart.de Tue Feb 7 14:22:39 2012 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 07 Feb 2012 20:22:39 +0100 Subject: [SciPy-Dev] FAIL: Minimize, method=TNC, 1b (approx gradient) version 0.11.0.dev-a4cd737 Message-ID: FAIL: Minimize, method=TNC, 1b (approx gradient) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/tests/test_optimize.py", line 623, in test_minimize_tnc1b assert_allclose(self.f1(x), self.f1(xopt), atol=1e-6) File "/home/nwagner/local/lib64/python2.7/site-packages/numpy/testing/utils.py", line 1213, in assert_allclose verbose=verbose, header=header) File "/home/nwagner/local/lib64/python2.7/site-packages/numpy/testing/utils.py", line 677, in assert_array_compare raise AssertionError(msg) AssertionError: Not equal to tolerance rtol=1e-07, atol=1e-06 (mismatch 100.0%) x: array(6.811012399502852e-06) y: array(0.0) From denis.laxalde at mcgill.ca Tue Feb 7 14:36:57 2012 From: denis.laxalde at mcgill.ca (Denis Laxalde) Date: Tue, 7 Feb 2012 14:36:57 -0500 Subject: [SciPy-Dev] FAIL: Minimize, method=TNC, 1b (approx gradient) version 0.11.0.dev-a4cd737 In-Reply-To: References: Message-ID: <20120207143657.73a077be@mcgill.ca> Nils Wagner wrote: > FAIL: Minimize, method=TNC, 1b (approx gradient) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/tests/test_optimize.py", > line 623, in test_minimize_tnc1b > assert_allclose(self.f1(x), self.f1(xopt), atol=1e-6) > File > "/home/nwagner/local/lib64/python2.7/site-packages/numpy/testing/utils.py", > line 1213, in assert_allclose > verbose=verbose, header=header) > File > "/home/nwagner/local/lib64/python2.7/site-packages/numpy/testing/utils.py", > line 677, in assert_array_compare > raise AssertionError(msg) > AssertionError: > Not equal to tolerance rtol=1e-07, atol=1e-06 > > (mismatch 100.0%) > x: array(6.811012399502852e-06) > y: array(0.0) Looks like 08c4211 was overwritten during merge. Will fix that soon. -- Denis From pav at iki.fi Wed Feb 8 04:05:51 2012 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 08 Feb 2012 10:05:51 +0100 Subject: [SciPy-Dev] Trac performance? In-Reply-To: References: Message-ID: 06.02.2012 22:58, Pauli Virtanen kirjoitti: [clip] > Moving the Trac could be a useful pilot. Setting it up in principle > should be relatively straightforward. The only question is whether it > really works without a SVN repo, and whether the tiny Github integration > plugin works out of the box for the new Trac version. OK, the plugin needed small tweaks, [1] and it seems that Trac is happy to run without a repository. So it should be mostly smooth sailing... [1] https://github.com/pv/githubsimple-trac From ralf.gommers at googlemail.com Wed Feb 8 15:40:14 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 8 Feb 2012 21:40:14 +0100 Subject: [SciPy-Dev] scikit-signal or Similar In-Reply-To: References: <4F2C09D4.2080201@unidata.ucar.edu> Message-ID: On Mon, Feb 6, 2012 at 11:13 AM, Stuart Mumford wrote: > Hello, > > There will probably be many little things to find on the way along, but I > think they are the major points? One other question, do we want a plotting > routine inside the module or keep it as an example in the documentation? I > ask because I don't know what is the norm. 
> > Plotting routines don't really belong in scipy. Scipy also doesn't depend on Matplotlib or any other plotting library, so if you'd want to provide some plotting routine it would have to be optional. In examples you can use Matplotlib. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From sarms at unidata.ucar.edu Wed Feb 8 16:37:11 2012 From: sarms at unidata.ucar.edu (Sean Arms) Date: Wed, 08 Feb 2012 14:37:11 -0700 Subject: [SciPy-Dev] scikit-signal or Similar In-Reply-To: References: <4F2C09D4.2080201@unidata.ucar.edu> Message-ID: <4F32EB07.5050203@unidata.ucar.edu> Greetings! On 02/06/12 03: 13, Stuart Mumford wrote: > Hello, > > > Sorry - I've been off the list for awhile as I was transitioning to my > first 'real world' job. Now that I'm back - hello! This was my first > attempt at object oriented programming - I'll need to take a look > at the > code again to see what I was (or was not) thinking at the time :-) > > > Nice to have you around to help and I am glad you now have some time > to help out! For a first attempt at OOP it's very good! (well much > better than my first attempt was!) I was thinking about how would be > best to implement it, we want to try and mirror as closely as possible > the calling sequence of PyWavelets and it's DWT routines. They use the > approach of > > >>> cA, cD = pywt.dwt(data,mother) > where mother is either a string or an instance of a mother class. > I completely agree. > So I was thinking would the best code structure to be have a "Wavelet" > class which is subclassed for cwt, icwt ccwt etc. and a Mother class > which is subclassed for each family like Morlet, DOG etc. > > Then into the Wavelet instance you can pass either a subclass of > mother or a string and go from there. > > This is pretty close to how your code is structured at the moment, > just need to make the main cwt etc. functions subclasses of Wavelet? > Yes, I think that will work nicely. > As for results I think it would be best to return a class when cwt > etc. is called and then have attributes such as power, data, > coefficients etc. > > ie > > wave = scipy.signal.wavelet.cwt.cwt(data,'morlet') > pwr = wave.power > wave.scalogram() > > etc. That makes sense too. > > > I shall work on improving that code, I can implement more and > more general > > Mother wavelets and also write some examples and update the plotting > > routine to use mpl's make_axes_locatable if people think that is > a better > > way to go. > > > > I'd be happy to start working on this again. I'm still finishing up my > PhD, but I am now working as a developer at UCAR / Unidata and > have some > time I can officially spend on 'guerrilla projects' like this > (especially since it can/will benefit the atmospheric science > community)! > > > I am also doing a Solar Physics PhD so this stuff gets used in lots of > places!! I think as a todo list things that need doing are: > > * Agree on a calling and code structure and implement. Would you like to do the quick restructure? > * Implement other wavelet families and make all general in order m (ie > Mexican hat is just m=2 of DOG) I started on that path, but ran into a few speed bumps because the admissibility constants can be quite tricky to determine without using numerical integration. Maybe not too big of a deal in reality, but the two mother wavelets I implemented did the trick for my dissertation needs. > * Implement a significance contouring routine. This was on my to-do list as well. 
However, I don't think we need to implement a contouring routine for SciPy, as that should go to Matplotlib or a wavelet cookbook (let's not worry too much about that right now - we can use what's there for testing and strip it out later). > * Develop some tests to show it works! > I've been wanting to come up with some simple analytical tests but just have not had time. You are welcome to take a crack at that, but I will eventually get to it. Cheers! Sean > There will probably be many little things to find on the way along, > but I think they are the major points? One other question, do we want > a plotting routine inside the module or keep it as an example in the > documentation? I ask because I don't know what is the norm. > > Stuart > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Thu Feb 9 15:52:50 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 9 Feb 2012 21:52:50 +0100 Subject: [SciPy-Dev] scipy 0.10.1 release In-Reply-To: References: Message-ID: On Sun, Feb 5, 2012 at 6:05 PM, Ralf Gommers wrote: > > > On Sat, Feb 4, 2012 at 10:52 PM, Ralf Gommers > wrote: > >> Hi all, >> >> A 0.10.1 bugfix release soon is starting to look like a good idea. The >> two fixes that are most needed are the single-precision Arpack problems and >> the fix to remain compatible with the macro changes in numpy that Charles >> made. While we're at it, we may as well backport some other bug and doc >> fixes. I'll have a look at recent commits for that, suggestions welcome. An >> RC within a week, and release the week after seems feasible. >> >> Thoughts, comments? >> > > https://github.com/scipy/scipy/pull/150 > > I plan to merge this tomorrow night and prepare an RC this weekend. If there's anything else you want to see backported, please send a PR within the next 24 hours. Thanks, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From nwagner at iam.uni-stuttgart.de Sat Feb 11 05:45:02 2012 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Sat, 11 Feb 2012 11:45:02 +0100 Subject: [SciPy-Dev] Sylvester equation solver for complex matrices Message-ID: Hi all, I run a short test on the recent solver for Sylvester equations. In case of complex input matrices the method returns ** On entry to ZTRSYL parameter number 2 had an illegal value Nils import numpy as np from scipy import linalg as la np.random.seed(10) a = np.random.rand(20,20)+1j*np.random.rand(20,20) b = np.random.rand(20,20)+1j*np.random.rand(20,20) q = np.random.rand(20,20)+1j*np.random.rand(20,20) x = la.solve_sylvester(a,b,q) res = la.norm(np.dot(a,x)+np.dot(x,b)-q) From Nicolas.Rougier at inria.fr Sat Feb 11 05:52:21 2012 From: Nicolas.Rougier at inria.fr (Nicolas Rougier) Date: Sat, 11 Feb 2012 11:52:21 +0100 Subject: [SciPy-Dev] fftconvolve speedup / non powers of two Message-ID: <620E0CCF-E04B-4B86-83F1-77DD9765DAA6@inria.fr> Hi, FFTW documentation (http://www.fftw.org/fftw2_doc/fftw_3.html) states that: "FFTW is best at handling sizes of the form 2^a*3^b*5^c*7^d*11^e*13^f, where e+f is either 0 or 1." But the code from fftconvolve is only using (internally) sizes that are power of two, is there a specific reason ? 
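For reference, a minimal sketch of a helper that rounds a length up to the next size of the form 2^a*3^b*5^c (the factorizations FFTPACK provides, per the reply below); unlike the FFTW rule quoted above it ignores the optional factor of 7, 11 or 13, and it is only an illustration, not the fftconvolve.py attached to this message:

def _next_regular(n):
    # Smallest integer >= n whose only prime factors are 2, 3 and 5,
    # so FFTPACK-style FFTs of that length remain fast.
    if n <= 5:
        return n
    best = 2 ** ((n - 1).bit_length())   # next power of two always works
    p5 = 1
    while p5 < best:
        p35 = p5
        while p35 < best:
            quotient = -(-n // p35)      # ceil(n / p35) without floats
            p2 = 2 ** ((quotient - 1).bit_length())
            candidate = p2 * p35
            if candidate == n:
                return n
            if candidate < best:
                best = candidate
            p35 *= 3
        p5 *= 5
    return best

# e.g. _next_regular(257) -> 270, _next_regular(97) -> 100
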
I tried a naive implementation handling size of the above form and it seems to speedup things a bit: In [1]: from fftconvolve import * In [2]: %timeit fftconvolve(Z,K,'full') 10 loops, best of 3: 57.3 ms per loop In [3]: %timeit fftconvolve2(Z,K,'full') 100 loops, best of 3: 13.2 ms per loop This is for the worst case where the internal size is 257. fftconvolve uses a size of 512 while fftconvolve2 uses 260. For powers of two, it should not change performances (only the time to compute best fft shape that may be probably improved). Nicolas Here is the code: -------------- next part -------------- A non-text attachment was scrubbed... Name: fftconvolve.py Type: text/x-python-script Size: 2358 bytes Desc: not available URL: From pav at iki.fi Sat Feb 11 07:46:17 2012 From: pav at iki.fi (Pauli Virtanen) Date: Sat, 11 Feb 2012 13:46:17 +0100 Subject: [SciPy-Dev] fftconvolve speedup / non powers of two In-Reply-To: <620E0CCF-E04B-4B86-83F1-77DD9765DAA6@inria.fr> References: <620E0CCF-E04B-4B86-83F1-77DD9765DAA6@inria.fr> Message-ID: 11.02.2012 11:52, Nicolas Rougier kirjoitti: > FFTW documentation (http://www.fftw.org/fftw2_doc/fftw_3.html) states that: > > "FFTW is best at handling sizes of the form 2^a*3^b*5^c*7^d*11^e*13^f, where e+f is either 0 or 1." > > But the code from fftconvolve is only using (internally) sizes that are > power of two, is there a specific reason ? The underlying library is not FFTW, so the FFTW documentation does not apply. However, IIRC, FFTPACK had factorizations for 2^a*3^b*5^c, so there might be room for similar improvement. However, one would need to check if there are some additional restrictions. -- Pauli Virtanen From ralf.gommers at googlemail.com Sat Feb 11 09:11:06 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 11 Feb 2012 15:11:06 +0100 Subject: [SciPy-Dev] ANN: SciPy 0.10.1 release candidate 1 Message-ID: Hi all, I am pleased to announce the availability of the first release candidate of SciPy 0.10.1. Please try out this release and report any problems on the scipy-dev mailing list. If no problems are found, the final release will be available in one week. Sources and binaries can be found at http://sourceforge.net/projects/scipy/files/scipy/0.10.1rc1/, release notes are copied below. Cheers, Ralf ========================== SciPy 0.10.1 Release Notes ========================== .. contents:: SciPy 0.10.1 is a bug-fix release with no new features compared to 0.10.0. Main changes ------------ The most important changes are:: 1. The single precision routines of ``eigs`` and ``eigsh`` in ``scipy.sparse.linalg`` have been disabled (they internally use double precision now). 2. A compatibility issue related to changes in NumPy macros has been fixed, in order to make scipy 0.10.1 compile with the upcoming numpy 1.7.0 release. Other issues fixed ------------------ - #835: stats: nan propagation in stats.distributions - #1202: io: netcdf segfault - #1531: optimize: make curve_fit work with method as callable. - #1560: linalg: fixed mistake in eig_banded documentation. - #1565: ndimage: bug in ndimage.variance - #1457: ndimage: standard_deviation does not work with sequence of indexes - #1562: cluster: segfault in linkage function - #1568: stats: One-sided fisher_exact() returns `p` < 1 for 0 successful attempts - #1575: stats: zscore and zmap handle the axis keyword incorrectly -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From warren.weckesser at enthought.com Sat Feb 11 12:24:51 2012 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Sat, 11 Feb 2012 11:24:51 -0600 Subject: [SciPy-Dev] Sylvester equation solver for complex matrices In-Reply-To: References: Message-ID: On Sat, Feb 11, 2012 at 4:45 AM, Nils Wagner wrote: > Hi all, > > I run a short test on the recent solver for Sylvester > equations. > In case of complex input matrices the method returns > ** On entry to ZTRSYL parameter number 2 had an illegal > value > > Nils > > Thanks, Nils. The new solve_sylvester function didn't handle complex matrices correctly. I have submitted a pull request ( https://github.com/scipy/scipy/pull/155) that should fix this. Warren > import numpy as np > from scipy import linalg as la > np.random.seed(10) > a = np.random.rand(20,20)+1j*np.random.rand(20,20) > b = np.random.rand(20,20)+1j*np.random.rand(20,20) > q = np.random.rand(20,20)+1j*np.random.rand(20,20) > > x = la.solve_sylvester(a,b,q) > res = la.norm(np.dot(a,x)+np.dot(x,b)-q) > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From cgohlke at uci.edu Sat Feb 11 14:55:44 2012 From: cgohlke at uci.edu (Christoph Gohlke) Date: Sat, 11 Feb 2012 11:55:44 -0800 Subject: [SciPy-Dev] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: References: Message-ID: <4F36C7C0.80009@uci.edu> On 2/11/2012 6:11 AM, Ralf Gommers wrote: > Hi all, > > I am pleased to announce the availability of the first release candidate > of SciPy 0.10.1. Please try out this release and report any problems on > the scipy-dev mailing list. If no problems are found, the final release > will be available in one week. > > Sources and binaries can be found at > http://sourceforge.net/projects/scipy/files/scipy/0.10.1rc1/, release > notes are copied below. > > Cheers, > Ralf > > > ========================== > SciPy 0.10.1 Release Notes > ========================== > > .. contents:: > > SciPy 0.10.1 is a bug-fix release with no new features compared to 0.10.0. > > Main changes > ------------ > > The most important changes are:: > > 1. The single precision routines of ``eigs`` and ``eigsh`` in > ``scipy.sparse.linalg`` have been disabled (they internally use double > precision now). > 2. A compatibility issue related to changes in NumPy macros has been > fixed, in > order to make scipy 0.10.1 compile with the upcoming numpy 1.7.0 > release. > > > Other issues fixed > ------------------ > > - #835: stats: nan propagation in stats.distributions > - #1202: io: netcdf segfault > - #1531: optimize: make curve_fit work with method as callable. > - #1560: linalg: fixed mistake in eig_banded documentation. > - #1565: ndimage: bug in ndimage.variance > - #1457: ndimage: standard_deviation does not work with sequence of indexes > - #1562: cluster: segfault in linkage function > - #1568: stats: One-sided fisher_exact() returns `p` < 1 for 0 > successful attempts > - #1575: stats: zscore and zmap handle the axis keyword incorrectly > > > Hi Ralf, looks good here (Windows, msvc9, MKL, numpy 1.6.1). Just one compile error related to the numpy macros: . 
Christoph From charlesr.harris at gmail.com Sat Feb 11 15:21:50 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 11 Feb 2012 13:21:50 -0700 Subject: [SciPy-Dev] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: <4F36C7C0.80009@uci.edu> References: <4F36C7C0.80009@uci.edu> Message-ID: On Sat, Feb 11, 2012 at 12:55 PM, Christoph Gohlke wrote: > > > On 2/11/2012 6:11 AM, Ralf Gommers wrote: > > Hi all, > > > > I am pleased to announce the availability of the first release candidate > > of SciPy 0.10.1. Please try out this release and report any problems on > > the scipy-dev mailing list. If no problems are found, the final release > > will be available in one week. > > > > Sources and binaries can be found at > > http://sourceforge.net/projects/scipy/files/scipy/0.10.1rc1/, release > > notes are copied below. > > > > Cheers, > > Ralf > > > > > > ========================== > > SciPy 0.10.1 Release Notes > > ========================== > > > > .. contents:: > > > > SciPy 0.10.1 is a bug-fix release with no new features compared to > 0.10.0. > > > > Main changes > > ------------ > > > > The most important changes are:: > > > > 1. The single precision routines of ``eigs`` and ``eigsh`` in > > ``scipy.sparse.linalg`` have been disabled (they internally use > double > > precision now). > > 2. A compatibility issue related to changes in NumPy macros has been > > fixed, in > > order to make scipy 0.10.1 compile with the upcoming numpy 1.7.0 > > release. > > > > > > Other issues fixed > > ------------------ > > > > - #835: stats: nan propagation in stats.distributions > > - #1202: io: netcdf segfault > > - #1531: optimize: make curve_fit work with method as callable. > > - #1560: linalg: fixed mistake in eig_banded documentation. > > - #1565: ndimage: bug in ndimage.variance > > - #1457: ndimage: standard_deviation does not work with sequence of > indexes > > - #1562: cluster: segfault in linkage function > > - #1568: stats: One-sided fisher_exact() returns `p` < 1 for 0 > > successful attempts > > - #1575: stats: zscore and zmap handle the axis keyword incorrectly > > > > > > > > Hi Ralf, > > looks good here (Windows, msvc9, MKL, numpy 1.6.1). Just one compile > error related to the numpy macros: > . > > Hi Christoph, I already committed your patch in e0b7cf812840f2b293d9db1e00790dd2edea4cc2. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Sat Feb 11 15:31:26 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sat, 11 Feb 2012 15:31:26 -0500 Subject: [SciPy-Dev] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: <4F36C7C0.80009@uci.edu> References: <4F36C7C0.80009@uci.edu> Message-ID: On Sat, Feb 11, 2012 at 2:55 PM, Christoph Gohlke wrote: > > > On 2/11/2012 6:11 AM, Ralf Gommers wrote: >> Hi all, >> >> I am pleased to announce the availability of the first release candidate >> of SciPy 0.10.1. Please try out this release and report any problems on >> the scipy-dev mailing list. If no problems are found, the final release >> will be available in one week. >> >> Sources and binaries can be found at >> http://sourceforge.net/projects/scipy/files/scipy/0.10.1rc1/, release >> notes are copied below. >> >> Cheers, >> Ralf >> >> >> ========================== >> SciPy 0.10.1 Release Notes >> ========================== >> >> .. contents:: >> >> SciPy 0.10.1 is a bug-fix release with no new features compared to 0.10.0. >> >> Main changes >> ------------ >> >> The most important changes are:: >> >> 1. 
The single precision routines of ``eigs`` and ``eigsh`` in >> ? ? ``scipy.sparse.linalg`` have been disabled (they internally use double >> ? ? precision now). >> 2. A compatibility issue related to changes in NumPy macros has been >> fixed, in >> ? ? order to make scipy 0.10.1 compile with the upcoming numpy 1.7.0 >> release. >> >> >> Other issues fixed >> ------------------ >> >> - #835: stats: nan propagation in stats.distributions >> - #1202: io: netcdf segfault >> - #1531: optimize: make curve_fit work with method as callable. >> - #1560: linalg: fixed mistake in eig_banded documentation. >> - #1565: ndimage: bug in ndimage.variance >> - #1457: ndimage: standard_deviation does not work with sequence of indexes >> - #1562: cluster: segfault in linkage function >> - #1568: stats: One-sided fisher_exact() returns `p` < 1 for 0 >> successful attempts >> - #1575: stats: zscore and zmap handle the axis keyword incorrectly >> >> >> > > Hi Ralf, > > looks good here (Windows, msvc9, MKL, numpy 1.6.1). Just one compile > error related to the numpy macros: > . > > Christoph Not directly to the thread Christoph, Thank you for providing Windows binaries. You just rescued my incomplete ipython install by providing the only pyzmq binaries for Python 2.6 that I could find. Josef > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From emmanuelle.gouillart at normalesup.org Sun Feb 12 05:16:44 2012 From: emmanuelle.gouillart at normalesup.org (Emmanuelle Gouillart) Date: Sun, 12 Feb 2012 11:16:44 +0100 Subject: [SciPy-Dev] Euroscipy 2012 - Brussels - August 23-37 - call for abstracts Message-ID: <20120212101644.GB5452@phare.normalesup.org> We apologize for the inconvenience if you received this e-mail through several mailing-lists. ------------------------------------------------------------- Euroscipy 2012, the 5th European meeting on Python in Science ------------------------------------------------------------- It is our pleasure to announce the conference Euroscipy 2012, that will be held in **Brussels**, **August 23-27**, at the Universit? Libre de Bruxelles (ULB, Solbosch Campus). The EuroSciPy meeting is a cross-disciplinary gathering focused on the use and development of the Python language in scientific research and industry. This event strives to bring together both users and developers of scientific tools, as well as academic research and state of the art industry. Website ======= http://www.euroscipy.org/conference/euroscipy2012 Main topics =========== - Presentations of scientific tools and libraries using the Python language, including but not limited to: - vector and array manipulation - parallel computing - scientific visualization - scientific data flow and persistence - algorithms implemented or exposed in Python - web applications and portals for science and engineering. - Reports on the use of Python in scientific achievements or ongoing projects. - General-purpose Python tools that can be of special interest to the scientific community. Tutorials ========= There will be two tutorial tracks at the conference, an introductory one, to bring up to speed with the Python language as a scientific tool, and an advanced track, during which experts of the field will lecture on specific advanced topics such as advanced use of numpy, paralllel computing, advanced testing... 
Keynote Speaker: David Beazley ============================== This year, we are very happy to welcome David Beazley (http://www.dabeaz.com) as our keynote speaker. David is the original author of SWIG, a software development tool that connects programs written in C and C++ with a variety of high-level programming languages such as Python. He has also authored the acclaimed Python Essential Reference. Important dates =============== Talk submission deadline: Mon Apr 30, 2012 Program announced: end of May Tutorials tracks: Thursday August 23 - Friday August 24, 2012 Conference track: Saturday August 25 - Sunday August 26, 2012 Satellites: Monday August 27 Satellite meetings are yet to be announced. Call for talks and posters ========================== We are soliciting talks and posters that discuss topics related to scientific computing using Python. These include applications, teaching, future development directions, and research. We welcome contributions from the industry as well as the academic world. Indeed, industrial research and development as well academic research face the challenge of mastering IT tools for exploration, modeling and analysis. We look forward to hearing your recent breakthroughs using Python! Submission guidelines ===================== - We solicit proposals in the form of a **one-page long abstract**. - Submissions whose main purpose is to promote a commercial product or service will be refused. - All accepted proposals must be presented at the EuroSciPy conference by at least one author. Abstracts should be detailed enough for the reviewers to appreciate the interest of the work for a wide audience. Examples of abstracts can be found on last year's webpage www.euroscipy.org/track/3992 (talks tab). The one-page long abstracts are for conference planning and selection purposes only. How to submit an abstract ========================= To submit a talk to the EuroScipy conference follow the instructions here: http://www.euroscipy.org/card/euroscipy2012_call_for_contributions Organizers ========== Chairs: - Pierre de Buyl - Didrik Pinte Local organizing committee - Kael Hanson - Nicolas Pettiaux Program committee - Tiziano Zito (Chair) - Pierre de Buyl - Emmanuelle Gouillart - Kael Hanson - Konrad Hinsen - Hans Petter Langtangen - Mike M?ller - Stefan Van Der Walt - Ga?l Varoquaux Tutorials chair: Valentin Haenel General organizing committee - Communication: Emmanuelle Gouillart - Sponsoring: Mike M?ller. - Web site: Nicolas Chauvat. Still have questions? ===================== send an e-mail to org-team at lists.euroscipy.org -- Emmanuelle, for the organizing team From cgohlke at uci.edu Sun Feb 12 15:10:02 2012 From: cgohlke at uci.edu (Christoph Gohlke) Date: Sun, 12 Feb 2012 12:10:02 -0800 Subject: [SciPy-Dev] Numpy-dev and scipy-0.10.1 test errors Message-ID: <4F381C9A.4080000@uci.edu> Hello, while testing msvc9/MKL builds of scipy-0.10.1rc1 and numpy-dev on win-amd64-py2.7 I got 20 numpy test errors and 2 scipy test errors. The full tests results are attached. I have not looked at them in detail. There are no test errors or failures with numpy 1.6.1 and scipy-0.10.1rc1. Christoph -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: scipy-0.10.1rc1.win-amd64-py2.7-test-errors.txt URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... 
Name: numpy-2.0.0.dev-015cada.win-amd64-py2.7-test-errors.txt URL: From ralf.gommers at googlemail.com Sun Feb 12 15:19:31 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 12 Feb 2012 21:19:31 +0100 Subject: [SciPy-Dev] Numpy-dev and scipy-0.10.1 test errors In-Reply-To: <4F381C9A.4080000@uci.edu> References: <4F381C9A.4080000@uci.edu> Message-ID: On Sun, Feb 12, 2012 at 9:10 PM, Christoph Gohlke wrote: > Hello, > > while testing msvc9/MKL builds of scipy-0.10.1rc1 and numpy-dev on > win-amd64-py2.7 I got 20 numpy test errors and 2 scipy test errors. The > full tests results are attached. I have not looked at them in detail. There > are no test errors or failures with numpy 1.6.1 and scipy-0.10.1rc1. Most of the errors are due to https://github.com/numpy/numpy/pull/201, which should be fixed in numpy master. I'm having a look at that. The stats.distributions warnings do look like there's a problem with MSVC that's not there when compiling with MinGW. But that must have been there for 0.10.0 too, so I don't think we should hold up the 0.10.1 release for that. These do like a serious issue, it's the same one that caused MinGW 3.4.5 to stop working: ====================================================================== ERROR: test_datetime_array_str (test_datetime.TestDateTime) ---------------------------------------------------------------------- Traceback (most recent call last): File "X:\Python27-x64\lib\site-packages\numpy\core\tests\test_datetime.py", li ne 514, in test_datetime_array_str formatter={'datetime': lambda x : File "X:\Python27-x64\lib\site-packages\numpy\core\arrayprint.py", line 459, i n array2string separator, prefix, formatter=formatter) File "X:\Python27-x64\lib\site-packages\numpy\core\arrayprint.py", line 259, i n _array2string 'int' : IntegerFormat(data), File "X:\Python27-x64\lib\site-packages\numpy\core\arrayprint.py", line 658, i n __init__ len(str(minimum.reduce(data, skipna=True)))) OSError: Failed to use 'localtime_s' to convert to a local time ====================================================================== ERROR: test_combinations (test_multiarray.TestArgmax) ---------------------------------------------------------------------- Traceback (most recent call last): File "X:\Python27-x64\lib\site-packages\numpy\core\tests\test_multiarray.py", line 979, in test_combinations assert_equal(np.argmax(arr), pos, err_msg="%r"%arr) OSError: Failed to use 'localtime_s' to convert to a local time ====================================================================== ERROR: test_combinations (test_multiarray.TestArgmin) ---------------------------------------------------------------------- Traceback (most recent call last): File "X:\Python27-x64\lib\site-packages\numpy\core\tests\test_multiarray.py", line 1047, in test_combinations assert_equal(np.argmin(arr), pos, err_msg="%r"%arr) OSError: Failed to use 'localtime_s' to convert to a local time Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From edwardyoon at apache.org Mon Feb 13 00:56:19 2012 From: edwardyoon at apache.org (Edward J. Yoon) Date: Mon, 13 Feb 2012 14:56:19 +0900 Subject: [SciPy-Dev] BSP interface in the SciPy. Message-ID: Hi community, My name is Edward, and I'm a committer of Apache Hama project[1] which is a Bulk Synchronous Parallel framework for massive scientific computation on top of Hadoop[2]. 
Today, I just noticed that there's a BSP interface based on BSPLib in the SciPy, and thought maybe we could work together, on supporting SciPy programs to run on existing Hadoop YARN[3] or Hama cluster. I would like to know if anyone would be willing to participate in this project. Thanks! 1. http://incubator.apache.org/hama/ or check more recent 2. http://hadoop.apache.org/ 3. http://developer.yahoo.com/blogs/hadoop/posts/2011/02/mapreduce-nextgen/ -- Best Regards, Edward J. Yoon @eddieyoon From ralf.gommers at googlemail.com Mon Feb 13 01:11:00 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Mon, 13 Feb 2012 07:11:00 +0100 Subject: [SciPy-Dev] BSP interface in the SciPy. In-Reply-To: References: Message-ID: On Mon, Feb 13, 2012 at 6:56 AM, Edward J. Yoon wrote: > Hi community, > > My name is Edward, and I'm a committer of Apache Hama project[1] which > is a Bulk Synchronous Parallel framework for massive scientific > computation on top of Hadoop[2]. > > Today, I just noticed that there's a BSP interface based on BSPLib in > the SciPy, and thought maybe we could work together, on supporting > SciPy programs to run on existing Hadoop YARN[3] or Hama cluster. > To anyone else who can't find this interface: this is actually in Scientific, not SciPy. It looks like the source of Scientific is not even publicly available anymore, all I can find is one non-working sourceforge link. Ralf > I would like to know if anyone would be willing to participate in this > project. > > Thanks! > > 1. http://incubator.apache.org/hama/ or check more recent > 2. http://hadoop.apache.org/ > 3. > http://developer.yahoo.com/blogs/hadoop/posts/2011/02/mapreduce-nextgen/ > > -- > Best Regards, Edward J. Yoon > @eddieyoon > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From edwardyoon at apache.org Mon Feb 13 03:19:28 2012 From: edwardyoon at apache.org (Edward J. Yoon) Date: Mon, 13 Feb 2012 17:19:28 +0900 Subject: [SciPy-Dev] BSP interface in the SciPy. In-Reply-To: References: Message-ID: Hmm, yes. I just looked again, it's my misreading. It looks like a part of Scientific, not a SciPy. But if you have some interested in figuring out whether we can collaborate on high-performance parallel processing, let's continue discuss here :) Thanks! On Mon, Feb 13, 2012 at 3:11 PM, Ralf Gommers wrote: > > > On Mon, Feb 13, 2012 at 6:56 AM, Edward J. Yoon > wrote: >> >> Hi community, >> >> My name is Edward, and I'm a committer of Apache Hama project[1] which >> is a Bulk Synchronous Parallel framework for massive scientific >> computation on top of Hadoop[2]. >> >> Today, I just noticed that there's a BSP interface based on BSPLib in >> the SciPy, and thought maybe we could work together, on supporting >> SciPy programs to run on existing Hadoop YARN[3] or Hama cluster. > > > To anyone else who can't find this interface: this is actually in > Scientific, not SciPy. It looks like the source of Scientific is not even > publicly available anymore, all I can find is one non-working sourceforge > link. > > Ralf > >> >> I would like to know if anyone would be willing to participate in this >> project. >> >> Thanks! >> >> 1. http://incubator.apache.org/hama/ or check more recent >> 2. http://hadoop.apache.org/ >> 3. >> http://developer.yahoo.com/blogs/hadoop/posts/2011/02/mapreduce-nextgen/ >> >> -- >> Best Regards, Edward J. 
Yoon >> @eddieyoon >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Best Regards, Edward J. Yoon @eddieyoon From thomas at kluyver.me.uk Mon Feb 13 05:40:44 2012 From: thomas at kluyver.me.uk (Thomas Kluyver) Date: Mon, 13 Feb 2012 10:40:44 +0000 Subject: [SciPy-Dev] BSP interface in the SciPy. In-Reply-To: References: Message-ID: On 13 February 2012 06:11, Ralf Gommers wrote: > To anyone else who can't find this interface: this is actually in > Scientific, not SciPy. It looks like the source of Scientific is not even > publicly available anymore, all I can find is one non-working sourceforge > link. There are still tarballs here: https://sourcesup.cru.fr/projects/scientific-py/ And a bit of detective work found that the author has put the repository on bitbucket: https://bitbucket.org/khinsen/scientificpython/overview Thomas -------------- next part -------------- An HTML attachment was scrubbed... URL: From paul.anton.letnes at gmail.com Mon Feb 13 11:58:13 2012 From: paul.anton.letnes at gmail.com (Paul Anton Letnes) Date: Mon, 13 Feb 2012 17:58:13 +0100 Subject: [SciPy-Dev] [SciPy-User] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: References: Message-ID: Hi, I screwed up my previous e-mail (using my non-virtualenv, regular python) and I am trying again. These are the results on my machine - OS X 10.7.3, gcc 4.2 from apple, and gfortran 4.6.2 from Homebrew-alt. Paul (scipy-test)i-courant /tmp/paulanto % python -c 'import numpy;import scipy;numpy.test(verbose=1);scipy.test(verbose=1)' Running unit tests for numpy NumPy version 1.6.1 NumPy is installed in /Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy Python version 2.7.2 (default, Oct 9 2011, 18:03:13) [GCC 4.2.1 (Apple Inc. 
build 5666) (dot 3)] nose version 1.1.2 ......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................./Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/core/numeric.py:1920: RuntimeWarning: invalid value encountered in absolute return all(less_equal(absolute(x-y), atol + rtol * absolute(y))) .......................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................K.................................................................................................K......................K...........FFFFFFF...F./Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/core/tests/test_umath_complex.py:364: RuntimeWarning: invalid value encountered in sqrt z = np.sqrt(np.array(np.complex(-np.inf, np.nan))) ........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................S.............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
........................................................................................................................................................................................................................................................................................................................................................................................................................................................ ====================================================================== FAIL: test_umath_complex.TestCsqrt.test_special_values(, 1, inf, inf, inf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/core/tests/test_umath_complex.py", line 578, in check_complex_value assert_equal(f(z1), z2) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 256, in assert_equal return assert_array_equal(actual, desired, err_msg, verbose) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 707, in assert_array_equal verbose=verbose, header='Arrays are not equal') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 605, in assert_array_compare chk_same_position(x_id, y_id, hasval='nan') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 588, in chk_same_position raise AssertionError(msg) AssertionError: Arrays are not equal x and y nan location mismatch: x: array([ nan+infj]) y: array((inf+infj)) ====================================================================== FAIL: test_umath_complex.TestCsqrt.test_special_values(, -1, inf, inf, inf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/core/tests/test_umath_complex.py", line 578, in check_complex_value assert_equal(f(z1), z2) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 256, in assert_equal return assert_array_equal(actual, desired, err_msg, verbose) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 707, in assert_array_equal verbose=verbose, header='Arrays are not equal') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 605, in assert_array_compare chk_same_position(x_id, y_id, hasval='nan') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 588, in chk_same_position raise AssertionError(msg) AssertionError: Arrays are not equal x and y nan location mismatch: x: array([ nan+infj]) y: array((inf+infj)) ====================================================================== FAIL: test_umath_complex.TestCsqrt.test_special_values(, 0.0, inf, inf, inf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File 
"/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/core/tests/test_umath_complex.py", line 578, in check_complex_value assert_equal(f(z1), z2) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 256, in assert_equal return assert_array_equal(actual, desired, err_msg, verbose) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 707, in assert_array_equal verbose=verbose, header='Arrays are not equal') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 605, in assert_array_compare chk_same_position(x_id, y_id, hasval='nan') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 588, in chk_same_position raise AssertionError(msg) AssertionError: Arrays are not equal x and y nan location mismatch: x: array([ nan+infj]) y: array((inf+infj)) ====================================================================== FAIL: test_umath_complex.TestCsqrt.test_special_values(, -0.0, inf, inf, inf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/core/tests/test_umath_complex.py", line 578, in check_complex_value assert_equal(f(z1), z2) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 256, in assert_equal return assert_array_equal(actual, desired, err_msg, verbose) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 707, in assert_array_equal verbose=verbose, header='Arrays are not equal') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 605, in assert_array_compare chk_same_position(x_id, y_id, hasval='nan') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 588, in chk_same_position raise AssertionError(msg) AssertionError: Arrays are not equal x and y nan location mismatch: x: array([ nan+infj]) y: array((inf+infj)) ====================================================================== FAIL: test_umath_complex.TestCsqrt.test_special_values(, inf, inf, inf, inf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/core/tests/test_umath_complex.py", line 578, in check_complex_value assert_equal(f(z1), z2) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 256, in assert_equal return assert_array_equal(actual, desired, err_msg, verbose) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 707, in assert_array_equal verbose=verbose, header='Arrays are not equal') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 605, in assert_array_compare chk_same_position(x_id, y_id, hasval='nan') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 588, in chk_same_position raise AssertionError(msg) 
AssertionError: Arrays are not equal x and y nan location mismatch: x: array([ nan+infj]) y: array((inf+infj)) ====================================================================== FAIL: test_umath_complex.TestCsqrt.test_special_values(, -inf, inf, inf, inf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/core/tests/test_umath_complex.py", line 578, in check_complex_value assert_equal(f(z1), z2) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 256, in assert_equal return assert_array_equal(actual, desired, err_msg, verbose) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 707, in assert_array_equal verbose=verbose, header='Arrays are not equal') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 605, in assert_array_compare chk_same_position(x_id, y_id, hasval='nan') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 588, in chk_same_position raise AssertionError(msg) AssertionError: Arrays are not equal x and y nan location mismatch: x: array([ nan+infj]) y: array((inf+infj)) ====================================================================== FAIL: test_umath_complex.TestCsqrt.test_special_values(, nan, inf, inf, inf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/core/tests/test_umath_complex.py", line 578, in check_complex_value assert_equal(f(z1), z2) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 256, in assert_equal return assert_array_equal(actual, desired, err_msg, verbose) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 707, in assert_array_equal verbose=verbose, header='Arrays are not equal') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 605, in assert_array_compare chk_same_position(x_id, y_id, hasval='nan') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 588, in chk_same_position raise AssertionError(msg) AssertionError: Arrays are not equal x and y nan location mismatch: x: array([ nan+infj]) y: array((inf+infj)) ====================================================================== FAIL: test_umath_complex.TestCsqrt.test_special_values(, -inf, 1, 0.0, inf) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/core/tests/test_umath_complex.py", line 578, in check_complex_value assert_equal(f(z1), z2) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 256, in assert_equal return assert_array_equal(actual, desired, err_msg, verbose) File 
"/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 707, in assert_array_equal verbose=verbose, header='Arrays are not equal') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 605, in assert_array_compare chk_same_position(x_id, y_id, hasval='nan') File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 588, in chk_same_position raise AssertionError(msg) AssertionError: Arrays are not equal x and y nan location mismatch: x: array([ nan+infj]) y: array(infj) ---------------------------------------------------------------------- Ran 3533 tests in 14.446s FAILED (KNOWNFAIL=3, SKIP=1, failures=8) Running unit tests for scipy NumPy version 1.6.1 NumPy is installed in /Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy SciPy version 0.10.1rc1 SciPy is installed in /Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy Python version 2.7.2 (default, Oct 9 2011, 18:03:13) [GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] nose version 1.1.2 ............................................................................................................................................................................................................................K............................................................................................................/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/interpolate/fitpack2.py:674: UserWarning: The coefficients of the spline returned have been computed as the minimal norm least-squares solution of a (numerically) rank deficient system (deficiency=7). If deficiency is large, the results may be inaccurate. Deficiency may strongly depend on the value of eps. warnings.warn(message) ....../Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/interpolate/fitpack2.py:605: UserWarning: The required storage space exceeds the available storage space: nxest or nyest too small, or s too small. The weighted least-squares spline corresponds to the current set of knots. 
warnings.warn(message) ........................K..K.................................................................................................................................................................................................................................................................................................................................................................................................................................................../Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/io/wavfile.py:31: WavFileWarning: Unfamiliar format bytes warnings.warn("Unfamiliar format bytes", WavFileWarning) /Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/io/wavfile.py:121: WavFileWarning: chunk not understood warnings.warn("chunk not understood", WavFileWarning) ....................................................................................F..FF......................................................................................................................................SSSSSS......SSSSSS......SSSS.....................FFF.........................................F....FF.......S............................................................................................................................................................................................................................................................K......................................................................................................................................................................................................SSSSS............S..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................SSSSSSSSSSS.........../Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py:63: UserWarning: Single-precision types in `eigs` and `eighs` are not supported currently. Double precision routines are used instead. 
warnings.warn("Single-precision types in `eigs` and `eighs` " ....F.F.....................F...........F.F..............................................................................................F........................F.........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................K...............................................................K...........................................................................................................................................................KK.............................................................................................................................................................................................................................................................................................................................................................................................................................................K.K.............................................................................................................................................................................................................................................................................................................................................................................................K........K..............SSSSSSS..........................................................................................................................................................S.............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. 
====================================================================== FAIL: test_asum (test_blas.TestFBLAS1Simple) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/lib/blas/tests/test_blas.py", line 58, in test_asum assert_almost_equal(f([3,-4,5]),12) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal raise AssertionError(msg) AssertionError: Arrays are not almost equal to 7 decimals ACTUAL: 0.0 DESIRED: 12 ====================================================================== FAIL: test_dot (test_blas.TestFBLAS1Simple) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/lib/blas/tests/test_blas.py", line 67, in test_dot assert_almost_equal(f([3,-4,5],[2,5,1]),-9) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal raise AssertionError(msg) AssertionError: Arrays are not almost equal to 7 decimals ACTUAL: 0.0 DESIRED: -9 ====================================================================== FAIL: test_nrm2 (test_blas.TestFBLAS1Simple) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/lib/blas/tests/test_blas.py", line 78, in test_nrm2 assert_almost_equal(f([3,-4,5]),math.sqrt(50)) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal raise AssertionError(msg) AssertionError: Arrays are not almost equal to 7 decimals ACTUAL: 0.0 DESIRED: 7.0710678118654755 ====================================================================== FAIL: test_basic.TestNorm.test_overflow ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_basic.py", line 581, in test_overflow assert_almost_equal(norm(a), a) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 452, in assert_almost_equal return assert_array_almost_equal(actual, desired, decimal, err_msg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 800, in assert_array_almost_equal header=('Arrays are not almost equal to %d decimals' % decimal)) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 636, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal to 7 decimals (mismatch 100.0%) x: array(-0.0) y: array([ 1.00000002e+20], dtype=float32) ====================================================================== FAIL: test_basic.TestNorm.test_stable ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_basic.py", 
line 586, in test_stable assert_almost_equal(norm(a) - 1e4, 0.5) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal raise AssertionError(msg) AssertionError: Arrays are not almost equal to 7 decimals ACTUAL: -10000.0 DESIRED: 0.5 ====================================================================== FAIL: test_basic.TestNorm.test_types ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_basic.py", line 568, in test_types assert_allclose(norm(x), np.sqrt(14), rtol=tol) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 1168, in assert_allclose verbose=verbose, header=header) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 636, in assert_array_compare raise AssertionError(msg) AssertionError: Not equal to tolerance rtol=2.38419e-06, atol=0 (mismatch 100.0%) x: array(1.0842021724855044e-19) y: array(3.7416573867739413) ====================================================================== FAIL: test_asum (test_blas.TestFBLAS1Simple) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_blas.py", line 99, in test_asum assert_almost_equal(f([3,-4,5]),12) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal raise AssertionError(msg) AssertionError: Arrays are not almost equal to 7 decimals ACTUAL: 0.0 DESIRED: 12 ====================================================================== FAIL: test_dot (test_blas.TestFBLAS1Simple) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_blas.py", line 109, in test_dot assert_almost_equal(f([3,-4,5],[2,5,1]),-9) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal raise AssertionError(msg) AssertionError: Arrays are not almost equal to 7 decimals ACTUAL: 0.0 DESIRED: -9 ====================================================================== FAIL: test_nrm2 (test_blas.TestFBLAS1Simple) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_blas.py", line 127, in test_nrm2 assert_almost_equal(f([3,-4,5]),math.sqrt(50)) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal raise AssertionError(msg) AssertionError: Arrays are not almost equal to 7 decimals ACTUAL: 0.0 DESIRED: 7.0710678118654755 ====================================================================== FAIL: test_arpack.test_symmetric_modes(True, , 'f', 2, 'LM', None, 0.5, , None, 'normal') ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in 
runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 235, in eval_evec assert_allclose(LHS, RHS, rtol=rtol, atol=atol, err_msg=err) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 1168, in assert_allclose verbose=verbose, header=header) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 636, in assert_array_compare raise AssertionError(msg) AssertionError: Not equal to tolerance rtol=0.00178814, atol=0.000357628 error for eigsh:standard, typ=f, which=LM, sigma=0.5, mattype=aslinearoperator, OPpart=None, mode=normal (mismatch 100.0%) x: array([[ 0.23815642, 0.1763755 ], [-0.10785346, -0.32103487], [ 0.12468303, -0.11230416],... y: array([[ 0.23815642, 0.24814051], [-0.10785347, -0.15634772], [ 0.12468302, 0.05671416],... ====================================================================== FAIL: test_arpack.test_symmetric_modes(True, , 'f', 2, 'LM', None, 0.5, , None, 'cayley') ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 235, in eval_evec assert_allclose(LHS, RHS, rtol=rtol, atol=atol, err_msg=err) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 1168, in assert_allclose verbose=verbose, header=header) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 636, in assert_array_compare raise AssertionError(msg) AssertionError: Not equal to tolerance rtol=0.00178814, atol=0.000357628 error for eigsh:standard, typ=f, which=LM, sigma=0.5, mattype=aslinearoperator, OPpart=None, mode=cayley (mismatch 100.0%) x: array([[ 0.23815693, -0.33630507], [-0.10785286, 0.02168 ], [ 0.12468344, -0.11036437],... y: array([[ 0.23815643, -0.2405392 ], [-0.10785349, 0.14390968], [ 0.12468311, -0.04574991],... ====================================================================== FAIL: test_arpack.test_symmetric_modes(True, , 'f', 2, 'LA', None, 0.5, , None, 'normal') ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 235, in eval_evec assert_allclose(LHS, RHS, rtol=rtol, atol=atol, err_msg=err) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 1168, in assert_allclose verbose=verbose, header=header) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 636, in assert_array_compare raise AssertionError(msg) AssertionError: Not equal to tolerance rtol=0.00178814, atol=0.000357628 error for eigsh:standard, typ=f, which=LA, sigma=0.5, mattype=aslinearoperator, OPpart=None, mode=normal (mismatch 100.0%) x: array([[ 28.80129188, -0.6379945 ], [ 34.79312355, 0.27066791], [-270.23255444, 0.4851834 ],... 
y: array([[ 3.93467650e+03, -6.37994494e-01], [ 3.90913859e+03, 2.70667916e-01], [ -3.62176382e+04, 4.85183382e-01],... ====================================================================== FAIL: test_arpack.test_symmetric_modes(True, , 'f', 2, 'SA', None, 0.5, , None, 'normal') ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 235, in eval_evec assert_allclose(LHS, RHS, rtol=rtol, atol=atol, err_msg=err) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 1168, in assert_allclose verbose=verbose, header=header) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 636, in assert_array_compare raise AssertionError(msg) AssertionError: Not equal to tolerance rtol=0.00178814, atol=0.000357628 error for eigsh:standard, typ=f, which=SA, sigma=0.5, mattype=aslinearoperator, OPpart=None, mode=normal (mismatch 100.0%) x: array([[ 0.26260981, 0.23815559], [-0.09760907, -0.10785484], [ 0.06149647, 0.12468203],... y: array([[ 0.23744165, 0.2381564 ], [-0.13633069, -0.10785359], [ 0.03132561, 0.12468301],... ====================================================================== FAIL: test_arpack.test_symmetric_modes(True, , 'f', 2, 'SA', None, 0.5, , None, 'cayley') ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 235, in eval_evec assert_allclose(LHS, RHS, rtol=rtol, atol=atol, err_msg=err) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 1168, in assert_allclose verbose=verbose, header=header) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 636, in assert_array_compare raise AssertionError(msg) AssertionError: Not equal to tolerance rtol=0.00178814, atol=0.000357628 error for eigsh:standard, typ=f, which=SA, sigma=0.5, mattype=aslinearoperator, OPpart=None, mode=cayley (mismatch 100.0%) x: array([[ 0.29524244, -0.2381569 ], [-0.08169955, 0.10785299], [ 0.06645597, -0.12468332],... y: array([[ 0.24180251, -0.23815646], [-0.14191195, 0.10785349], [ 0.03568392, -0.12468307],... 
====================================================================== FAIL: test_arpack.test_symmetric_modes(True, , 'f', 2, 'SM', None, 0.5, , None, 'buckling') ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 235, in eval_evec assert_allclose(LHS, RHS, rtol=rtol, atol=atol, err_msg=err) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 1168, in assert_allclose verbose=verbose, header=header) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 636, in assert_array_compare raise AssertionError(msg) AssertionError: Not equal to tolerance rtol=0.00178814, atol=0.000357628 error for eigsh:general, typ=f, which=SM, sigma=0.5, mattype=aslinearoperator, OPpart=None, mode=buckling (mismatch 100.0%) x: array([[-0.10940548, 0.01676016], [-0.07154097, 0.4628113 ], [ 0.06895222, 0.49206394],... y: array([[-0.10940547, 0.05459438], [-0.07154103, 0.31407543], [ 0.06895217, 0.37578294],... ====================================================================== FAIL: test_arpack.test_symmetric_modes(True, , 'f', 2, 'SA', None, 0.5, , None, 'cayley') ---------------------------------------------------------------------- Traceback (most recent call last): File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest self.test(*self.arg) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 235, in eval_evec assert_allclose(LHS, RHS, rtol=rtol, atol=atol, err_msg=err) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 1168, in assert_allclose verbose=verbose, header=header) File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 636, in assert_array_compare raise AssertionError(msg) AssertionError: Not equal to tolerance rtol=0.00178814, atol=0.000357628 error for eigsh:general, typ=f, which=SA, sigma=0.5, mattype=aslinearoperator, OPpart=None, mode=cayley (mismatch 100.0%) x: array([[-0.4404992 , -0.01935683], [-0.25650678, -0.11053132], [-0.36893024, -0.13223556],... y: array([[-0.44017013, -0.0193569 ], [-0.25525379, -0.11053158], [-0.36818443, -0.13223571],... ---------------------------------------------------------------------- Ran 5101 tests in 51.423s FAILED (KNOWNFAIL=12, SKIP=42, failures=16) From fperez.net at gmail.com Mon Feb 13 16:55:45 2012 From: fperez.net at gmail.com (Fernando Perez) Date: Mon, 13 Feb 2012 13:55:45 -0800 Subject: [SciPy-Dev] Discussion with Guido van Rossum and (hopefully) core python-dev on scientific Python and Python3 Message-ID: Hi folks, [ I'm broadcasting this widely for maximum reach, but I'd appreciate it if replies can be kept to the *numpy* list, which is sort of the 'base' list for scientific/numerical work. It will make it much easier to organize a coherent set of notes later on. Apology if you're subscribed to all and get it 10 times. 
] As part of the PyData workshop (http://pydataworkshop.eventbrite.com) to be held March 2 and 3 at the Mountain View Google offices, we have scheduled a session for an open discussion with Guido van Rossum and hopefully as many core python-dev members who can make it. We wanted to seize the combined opportunity of the PyData workshop bringing a number of 'scipy people' to Google with the timeline for Python 3.3, the first release after the Python language moratorium, being within sight: http://www.python.org/dev/peps/pep-0398. While a number of scientific Python packages are already available for Python 3 (either in released form or in their master git branches), it's fair to say that there hasn't been a major transition of the scientific community to Python3. Since there is no more development being done on the Python2 series, eventually we will all want to find ways to make this transition, and we think that this is an excellent time to engage the core python development team and consider ideas that would make Python3 generally a more appealing language for scientific work. Guido has made it clear that he doesn't speak for the day-to-day development of Python anymore, so we all should be aware that any ideas that come out of this panel will still need to be discussed with python-dev itself via standard mechanisms before anything is implemented. Nonetheless, the opportunity for a solid face-to-face dialog for brainstorming was too good to pass up. The purpose of this email is then to solicit, from all of our community, ideas for this discussion. In a week or so we'll need to summarize the main points brought up here and make a more concrete agenda out of it; I will also post a summary of the meeting afterwards here. Anything is a valid topic, some points just to get the conversation started: - Extra operators/PEP 225. Here's a summary from the last time we went over this, years ago at Scipy 2008: http://mail.scipy.org/pipermail/numpy-discussion/2008-October/038234.html, and the current status of the document we wrote about it is here: file:///home/fperez/www/site/_build/html/py4science/numpy-pep225/numpy-pep225.html. - Improved syntax/support for rationals or decimal literals? While Python now has both decimals (http://docs.python.org/library/decimal.html) and rationals (http://docs.python.org/library/fractions.html), they're quite clunky to use because they require full constructor calls. Guido has mentioned in previous discussions toying with ideas about support for different kinds of numeric literals... - Using the numpy docstring standard python-wide, and thus having python improve the pathetic state of the stdlib's docstrings? This is an area where our community is light years ahead of the standard library, but we'd all benefit from Python itself improving on this front. I'm toying with the idea of giving a lighting talk at PyConn about this, comparing the great, robust culture and tools of good docstrings across the Scipy ecosystem with the sad, sad state of docstrings in the stdlib. It might spur some movement on that front from the stdlib authors, esp. if the core python-dev team realizes the value and benefit it can bring (at relatively low cost, given how most of the information does exist, it's just in the wrong places). 
But more importantly for us, if there was truly a universal standard for high-quality docstrings across Python projects, building good documentation/help machinery would be a lot easier, as we'd know what to expect and search for (such as rendering them nicely in the ipython notebook, providing high-quality cross-project help search, etc). - Literal syntax for arrays? Sage has been floating a discussion about a literal matrix syntax (https://groups.google.com/forum/#!topic/sage-devel/mzwepqZBHnA). For something like this to go into python in any meaningful way there would have to be core multidimensional arrays in the language, but perhaps it's time to think about a piece of the numpy array itself into Python? This is one of the more 'out there' ideas, but after all, that's the point of a discussion like this, especially considering we'll have both Travis and Guido in one room. - Other syntactic sugar? Sage has "a..b" <=> range(a, b+1), which I actually think is both nice and useful... There's also the question of allowing "a:b:c" notation outside of [], which has come up a few times in conversation over the last few years. Others? - The packaging quagmire? This continues to be a problem, though python3 does have new improvements to distutils. I'm not really up to speed on the situation, to be frank. If we want to bring this up, someone will have to provide a solid reference or volunteer to do it in person. - etc... I'm putting the above just to *start* the discussion, but the real point is for the rest of the community to contribute ideas, so don't be shy. Final note: while I am here commiting to organizing and presenting this at the discussion with Guido (as well as contacting python-dev), I would greatly appreciate help with the task of summarizing this prior to the meeting as I'm pretty badly swamped in the run-in to pydata/pycon. So if anyone is willing to help draft the summary as the date draws closer (we can put it up on a github wiki, gist, whatever), I will be very grateful. I'm sure it will be better than what I'll otherwise do the last night at 2am :) Cheers, f ps - to the obvious question about webcasting the discussion live for remote participation: yes, we looked into it already; no, unfortunately it appears it won't be possible. We'll try to at least have the audio recorded (and possibly video) for posting later on. pps- if you are close to Mountain View and are interested in attending this panel in person, drop me a line at fernando.perez at berkeley.edu. We have a few spots available *for this discussion only* on top of the pydata regular attendance (which is long closed, I'm afraid). But we'll need to provide Google with a list of those attendees in advance. Please indicate if you are a core python committer in your email, as we'll give priority for this overflow pool to core python developers (but will otherwise accommodate as many people as Google lets us). 
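ppps - to make the "clunky constructor calls" point above concrete, this is
what the stdlib requires today for exact decimal and rational arithmetic
(standard decimal/fractions modules; the literal spellings in the comments
are purely hypothetical, just to illustrate the kind of sugar being discussed):

from decimal import Decimal
from fractions import Fraction

# Exact decimal arithmetic: note the string arguments -- Decimal(0.1) would
# already carry binary floating-point error.
total = Decimal('19.99') + Decimal('0.01')      # Decimal('20.00')

# Exact rational arithmetic: a full constructor call for every value.
third = Fraction(1, 3) + Fraction(1, 6)         # Fraction(1, 2)

# A literal syntax might look something like this (NOT real Python):
#   total = 19.99d + 0.01d
#   third = 1/3F + 1/6F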
From stefan at sun.ac.za Mon Feb 13 18:47:28 2012 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Mon, 13 Feb 2012 15:47:28 -0800 Subject: [SciPy-Dev] Numpy-dev and scipy-0.10.1 test errors In-Reply-To: References: <4F381C9A.4080000@uci.edu> Message-ID: On Sun, Feb 12, 2012 at 12:19 PM, Ralf Gommers wrote: > These do like a serious issue, it's the same one that caused MinGW 3.4.5 to > stop working: > > ====================================================================== > ERROR: test_datetime_array_str (test_datetime.TestDateTime) > ---------------------------------------------------------------------- I noticed this one on the buildbot as well: http://buildbot.scipy.org/builders/Windows_XP_x86/builds/1112/steps/shell_1/logs/stdio Those builders are still running, if you want to use them to test; they just have to be triggered by hand. St?fan From scott.sinclair.za at gmail.com Tue Feb 14 07:22:41 2012 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Tue, 14 Feb 2012 14:22:41 +0200 Subject: [SciPy-Dev] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: References: Message-ID: On 11 February 2012 16:11, Ralf Gommers wrote: > I am pleased to announce the availability of the first release candidate of > SciPy 0.10.1. Please try out this release and report any problems on the > scipy-dev mailing list. If no problems are found, the final release will be > available in one week. I just noticed that building with Bento doesn't work from a source distribution. There's a pull request that fixes this at https://github.com/scipy/scipy/pull/158. Since the 0.10.0 release notes advertise Bento build support I wonder whether this commit https://github.com/scipy/scipy/commit/6907a2bffb1484623de3afe130e1a13804dec1b1 and the pull request mentioned above (or similar fix) should both be included in the 0.10.1 release? Cheers, Scott From scott.sinclair.za at gmail.com Tue Feb 14 07:56:22 2012 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Tue, 14 Feb 2012 14:56:22 +0200 Subject: [SciPy-Dev] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: References: Message-ID: On 14 February 2012 14:22, Scott Sinclair wrote: > On 11 February 2012 16:11, Ralf Gommers wrote: >> I am pleased to announce the availability of the first release candidate of >> SciPy 0.10.1. Please try out this release and report any problems on the >> scipy-dev mailing list. If no problems are found, the final release will be >> available in one week. > > I just noticed that building with Bento doesn't work from a source > distribution. There's a pull request that fixes this at > https://github.com/scipy/scipy/pull/158. > > Since the 0.10.0 release notes advertise Bento build support I wonder > whether this commit > https://github.com/scipy/scipy/commit/6907a2bffb1484623de3afe130e1a13804dec1b1 > and the pull request mentioned above (or similar fix) should both be > included in the 0.10.1 release? Hmmm - https://github.com/scipy/scipy/commit/6907a2bffb1484623de3afe130e1a13804dec1b1 isn't actually necessary for 0.10.x, but https://github.com/scipy/scipy/pull/158 is. Cheers, Scott From sturla at molden.no Tue Feb 14 11:05:40 2012 From: sturla at molden.no (Sturla Molden) Date: Tue, 14 Feb 2012 17:05:40 +0100 Subject: [SciPy-Dev] BSP interface in the SciPy. In-Reply-To: References: Message-ID: <4F3A8654.7080707@molden.no> On 13.02.2012 06:56, Edward J. 
Yoon wrote:
> Today, I just noticed that there's a BSP interface based on BSPLib in
> the SciPy, and thought maybe we could work together, on supporting
> SciPy programs to run on existing Hadoop YARN[3] or Hama cluster.

AFAIK, BSP is a coding style, not a particular API.

If you need a barrier for BSP synchronization, this is the simplest
implementation I can think of:

from multiprocessing import Event
from math import ceil, log
from contextlib import contextmanager

def _barrier(b):
    @contextmanager
    def _context(rank):
        b.wait(rank)
        yield
        b.wait(rank)
    return _context

class Barrier(object):

    def __init__(self, numproc):
        # one Event per (sender, receiver) pair
        self._events = [Event() for n in range(numproc**2)]
        self._numproc = numproc
        self.barrier = _barrier(self)

    def wait(self, rank):
        # loop log2(numproc) times, rounding up
        for k in range(int(ceil(log(self._numproc)/log(2)))):
            # send event to process (rank + 2**k) % numproc
            receiver = (rank + 2**k) % self._numproc
            evt = self._events[rank * self._numproc + receiver]
            evt.set()
            # wait for event from process (rank - 2**k) % numproc
            sender = (rank - 2**k) % self._numproc
            evt = self._events[sender * self._numproc + rank]
            evt.wait()
            evt.clear()

Now BSP code could look like this:

barrier = Barrier(numprocs)
for data in container:
    with barrier.barrier(rank):
        ...  # ipc
    ...      # compute

Sturla

From sturla at molden.no  Tue Feb 14 11:49:26 2012
From: sturla at molden.no (Sturla Molden)
Date: Tue, 14 Feb 2012 17:49:26 +0100
Subject: [SciPy-Dev] BSP interface in the SciPy.
In-Reply-To: <4F3A8654.7080707@molden.no>
References: <4F3A8654.7080707@molden.no>
Message-ID: <4F3A9096.9080603@molden.no>

On 14.02.2012 17:05, Sturla Molden wrote:
> If you need a barrier for BSP synchronization, this is the simplest
> implementation I can think of:

Moving the context manager to __call__ and adding a timeout to wait(), it
becomes like this. It's strange that Python does not have a barrier object
in the standard lib. Considering its usefulness to scientific computing, it
could be worth adding to numpy or scipy.

Sturla

from multiprocessing import Event  # or threading.Event
from math import ceil, log
from contextlib import contextmanager
from time import clock

class Barrier(object):

    def __init__(self, numproc):
        self._events = [Event() for n in range(numproc**2)]
        self._numproc = numproc

    @contextmanager
    def __call__(self, rank):
        self.wait(rank, None)
        yield
        self.wait(rank, None)

    def wait(self, rank, *timeout):
        t0 = clock()
        if timeout:
            timeout = timeout[0]
            if (timeout is not None) and (not isinstance(timeout, float)):
                raise ValueError('timeout must be None or a float')
        # loop log2(num_threads) times, rounding up
        for k in range(int(ceil(log(self._numproc)/log(2)))):
            # send event to process (rank + 2**k) % numproc
            receiver = (rank + 2**k) % self._numproc
            evt = self._events[rank * self._numproc + receiver]
            evt.set()
            # wait for event from process (rank - 2**k) % numproc
            sender = (rank - 2**k) % self._numproc
            evt = self._events[sender * self._numproc + rank]
            if timeout:
                t = clock()
                if not evt.wait(max(0.0, timeout - (t - t0))):
                    return False
            else:
                evt.wait()
            evt.clear()
        return True

From pierre.haessig at crans.org  Tue Feb 14 11:53:06 2012
From: pierre.haessig at crans.org (Pierre Haessig)
Date: Tue, 14 Feb 2012 17:53:06 +0100
Subject: [SciPy-Dev] BSP interface in the SciPy.
In-Reply-To: <4F3A8654.7080707@molden.no>
References:
Message-ID: <4F3A9172.6090905@crans.org>

Le 14/02/2012 17:05, Sturla Molden a écrit :
> AFAIK, BSP is a coding style, not a particular API.
Hi Sturla,

I'm not so familiar with parallel processing.
Do you have a short reference on this BSP style ? (oh, there is an international organization supporting it : http://www.bsp-worldwide.org/ !) -- Pierre I was wondering if your Class example comes from a preexisting code or if you just speak "parallel computing" as a mother tongue ;-) ? From robert.kern at gmail.com Tue Feb 14 11:58:17 2012 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 14 Feb 2012 16:58:17 +0000 Subject: [SciPy-Dev] BSP interface in the SciPy. In-Reply-To: <4F3A9172.6090905@crans.org> References: <4F3A8654.7080707@molden.no> <4F3A9172.6090905@crans.org> Message-ID: On Tue, Feb 14, 2012 at 16:53, Pierre Haessig wrote: > Le 14/02/2012 17:05, Sturla Molden a ?crit : >> AFAIK, BSP is a coding style, not a particular API. > Hi Sturla, > I'm not so familiar with parallel processing. > Do you have a short reference on this BSP style ? (oh, there is an > international organization supporting it : http://www.bsp-worldwide.org/ !) http://en.wikipedia.org/wiki/Bulk_synchronous_parallel -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." ? -- Umberto Eco From thomas at kluyver.me.uk Tue Feb 14 11:58:35 2012 From: thomas at kluyver.me.uk (Thomas Kluyver) Date: Tue, 14 Feb 2012 16:58:35 +0000 Subject: [SciPy-Dev] BSP interface in the SciPy. In-Reply-To: <4F3A9096.9080603@molden.no> References: <4F3A8654.7080707@molden.no> <4F3A9096.9080603@molden.no> Message-ID: On 14 February 2012 16:49, Sturla Molden wrote: > It's stange that Python does not have a barrier object in the standard > lib. Considering usefulness to scientific computing it could be worth > adding to numpy or scipy. Python 3.2 has a Barrier class for threading, but seemingly not yet for multiprocessing. I imagine it would be a logical addition, since the other synchronisation types from threading are available for multiprocessing. Thomas From pierre.haessig at crans.org Tue Feb 14 12:42:58 2012 From: pierre.haessig at crans.org (Pierre Haessig) Date: Tue, 14 Feb 2012 18:42:58 +0100 Subject: [SciPy-Dev] BSP interface in the SciPy. In-Reply-To: References: <4F3A8654.7080707@molden.no> <4F3A9172.6090905@crans.org> Message-ID: <4F3A9D22.6010204@crans.org> Le 14/02/2012 17:58, Robert Kern a ?crit : > http://en.wikipedia.org/wiki/Bulk_synchronous_parallel > Fair enough ;-) Thanks ! -- Pierre From sturla at molden.no Tue Feb 14 12:55:25 2012 From: sturla at molden.no (Sturla Molden) Date: Tue, 14 Feb 2012 18:55:25 +0100 Subject: [SciPy-Dev] BSP interface in the SciPy. In-Reply-To: References: <4F3A8654.7080707@molden.no> <4F3A9096.9080603@molden.no> Message-ID: <4F3AA00D.7090903@molden.no> On 14.02.2012 17:58, Thomas Kluyver wrote: > Python 3.2 has a Barrier class for threading, but seemingly not yet > for multiprocessing. I imagine it would be a logical addition, since > the other synchronisation types from threading are available for > multiprocessing. Ok, I am still on 2.7 :-) There are dozens of ways to make a barrier too. The one I used is certainly not the fastest, but it has a combinatoral beauty to it, like a butterfly :-) Sturla From sturla at molden.no Tue Feb 14 13:03:11 2012 From: sturla at molden.no (Sturla Molden) Date: Tue, 14 Feb 2012 19:03:11 +0100 Subject: [SciPy-Dev] BSP interface in the SciPy. 
In-Reply-To: References: <4F3A8654.7080707@molden.no> <4F3A9172.6090905@crans.org> Message-ID: <4F3AA1DF.9070007@molden.no> On 14.02.2012 17:58, Robert Kern wrote: > On Tue, Feb 14, 2012 at 16:53, Pierre Haessig wrote: >> Le 14/02/2012 17:05, Sturla Molden a ?crit : >>> AFAIK, BSP is a coding style, not a particular API. >> Hi Sturla, >> I'm not so familiar with parallel processing. >> Do you have a short reference on this BSP style ? You probably got the links. Short answer: It is a way to avoid deadlocks and livelocks in parallel computing. Computation and ipc are separated in discrete blocks with barrier synch in between. ipc -> barrier -> compute -> barrier -> ipc -> ... But it can often be difficult to fit a problem into a BSP paradigm, and sometimes it yields an inefficient program (the CPUs can spend a significant amount of time idle on the barriers). Sturla From robert.kern at gmail.com Tue Feb 14 13:48:26 2012 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 14 Feb 2012 18:48:26 +0000 Subject: [SciPy-Dev] BSP interface in the SciPy. In-Reply-To: <4F3A9D22.6010204@crans.org> References: <4F3A8654.7080707@molden.no> <4F3A9172.6090905@crans.org> <4F3A9D22.6010204@crans.org> Message-ID: On Tue, Feb 14, 2012 at 17:42, Pierre Haessig wrote: > Le 14/02/2012 17:58, Robert Kern a ?crit : >> http://en.wikipedia.org/wiki/Bulk_synchronous_parallel >> > Fair enough ;-) I apologize. I didn't mean to give such a useless response. I scanned your email too quickly as I was leaving work and thought that you had googled the acronym and only got an unrelated company in the results. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." ? -- Umberto Eco From pav at iki.fi Tue Feb 14 15:19:55 2012 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 14 Feb 2012 21:19:55 +0100 Subject: [SciPy-Dev] [SciPy-User] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: References: Message-ID: Hi, 13.02.2012 17:58, Paul Anton Letnes kirjoitti: > I screwed up my previous e-mail (using my non-virtualenv, regular python) > and I am trying again. These are the results on my machine - > OS X 10.7.3, gcc 4.2 from apple, and gfortran 4.6.2 from Homebrew-alt. The Arpack-related failures you see this time are precision errors, due to doing iterative inverses within an iterative eigenvalue problem solution. So, it's more of a problem with the tests than with ARPACK. If you can check that the following silences them, it would be appreciated: https://github.com/pv/scipy-work/tree/bug/arpack-tol-0101 -- Pauli Virtanen From scipy at samueljohn.de Tue Feb 14 16:54:00 2012 From: scipy at samueljohn.de (Samuel John) Date: Tue, 14 Feb 2012 22:54:00 +0100 Subject: [SciPy-Dev] [SciPy-User] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: <6C9EBB86-59C3-450B-B1E7-1388E18A3099@samueljohn.de> References: <20494732-5882-4B0B-8A99-280D5527A74B@samueljohn.de> <6C9EBB86-59C3-450B-B1E7-1388E18A3099@samueljohn.de> Message-ID: [sorry posting to scipy-users and scipy-dev. I feel this is not a good idea, but this thread is already spanning both lists. Replay on scipy-dev pls] The good news is that using clang and clang++ and gfortran from http://r.research.att.com/tools/ (4.2.4-5666.3) numpy and scipy build and test() fine! Yeeha \o/ Anyone with deeper understanding of the scipy internals want to comment on clang usage? Has anyone experienced problems with scipy and clang? 
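A quick way to compare the two builds is to check what the installed stack was
actually compiled and linked against (plain introspection; the output is
whatever numpy/scipy recorded at build time):

import sys
import numpy
import scipy

print(sys.version)           # shows the compiler used to build Python itself
print(numpy.__version__)
print(scipy.__version__)
numpy.__config__.show()      # BLAS/LAPACK configuration recorded at build time
scipy.__config__.show()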
However, building numpy 1.6.1 and scipy 0.10.1rc1 (or 0.10.0 or head) on OS X 10.7.3 with Xcode 4.2.1 (build 4D502) with llvm-gcc (which is the default, since non-llvm gcc is not available any more with XCode 4.2) leads to a segfault or to a malloc trap: > Running unit tests for scipy > NumPy version 1.6.1 > NumPy is installed in /usr/local/Cellar/python/2.7.2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy > SciPy version 0.10.1rc1 > SciPy is installed in /usr/local/Cellar/python/2.7.2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy > Python version 2.7.2 (default, Feb 14 2012, 22:09:10) [GCC 4.2.1 (Based on Apple Inc. build 5658) (LLVM build 2336.1.00)] > nose version 1.1.2 > ...................................................................................................................................................................................F.FFFPython(13536,0x7fff7c026960) malloc: *** error for object 0x10740b368: incorrect checksum for freed object - object was probably modified after being freed. > *** set a breakpoint in malloc_error_break to debug > Abort trap: 6 As pip install will also use the default llvm-gcc this might be a severe issue! This is already the case right now but does not often show up in daily usage. Perhaps it's possible to set the shell vars CC and CXX during build? Concerning arpack: I'm afraid that the arpack issues prevail in 0.10.1rc1. I beg you to fix these (I am not able to). scipy.test() log at https://gist.github.com/1830780 Samuel From pav at iki.fi Tue Feb 14 17:33:39 2012 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 14 Feb 2012 23:33:39 +0100 Subject: [SciPy-Dev] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: References: <20494732-5882-4B0B-8A99-280D5527A74B@samueljohn.de> <6C9EBB86-59C3-450B-B1E7-1388E18A3099@samueljohn.de> Message-ID: Hi, 14.02.2012 22:54, Samuel John kirjoitti: [clip] > Concerning arpack: I'm afraid that the arpack issues prevail in 0.10.1rc1. > I beg you to fix these (I am not able to). > scipy.test() log at https://gist.github.com/1830780 Thanks. The issues remaining seem to have just to do with the fact that iterative inversion inside iterative eigenvalue problem solver is too inaccurate in single precision (or, the tolerances used in the test are not OK). I think we'll just disable these tests for rc2, like here: https://github.com/pv/scipy-work/tree/bug/arpack-tol-0101 The ARPACK routines themselves run all the time in double precision, and the double precision test suite appears to pass without problems, so everything should be OK. Pauli From scipy at samueljohn.de Tue Feb 14 17:35:31 2012 From: scipy at samueljohn.de (Samuel John) Date: Tue, 14 Feb 2012 23:35:31 +0100 Subject: [SciPy-Dev] [SciPy-User] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: References: Message-ID: <0AACEDA9-9AC2-4B3A-872E-9FEACF7EBEA3@samueljohn.de> On 14.02.2012, at 21:19, Pauli Virtanen wrote: > The Arpack-related failures you see this time are precision errors, due > to doing iterative inverses within an iterative eigenvalue problem > solution. > > So, it's more of a problem with the tests than with ARPACK. If you can > check that the following silences them, it would be appreciated: > > https://github.com/pv/scipy-work/tree/bug/arpack-tol-0101 > I just tested your arpack-tol-0101 branch and indeed scipy.test() does not show any failures nor errors with your changes. The "official 0.10.1rc" still has these (8, I think) arpack failues. 
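For reference, the cases that still fail all exercise single-precision
shift-invert solves, roughly of this shape (a sketch only -- the actual tests
live in scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py):

import numpy as np
from scipy.sparse.linalg import eigsh

np.random.seed(0)
A = np.random.rand(50, 50).astype(np.float32)
A = A + A.T                      # symmetric single-precision test matrix

# shift-invert mode around sigma=0.5, as in the failing test parametrizations
w, v = eigsh(A, k=2, sigma=0.5)
print(w)

# double-precision run of the same problem, for comparison
w_ref, v_ref = eigsh(A.astype(np.float64), k=2, sigma=0.5)
print(w_ref)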
From efiring at hawaii.edu Tue Feb 14 20:14:17 2012 From: efiring at hawaii.edu (Eric Firing) Date: Tue, 14 Feb 2012 15:14:17 -1000 Subject: [SciPy-Dev] Discussion with Guido van Rossum and (hopefully) core python-dev on scientific Python and Python3 In-Reply-To: References: Message-ID: <4F3B06E9.2030205@hawaii.edu> On 02/13/2012 11:55 AM, Fernando Perez wrote: > Hi folks, > > [ I'm broadcasting this widely for maximum reach, but I'd appreciate > it if replies can be kept to the *numpy* list, which is sort of the > 'base' list for scientific/numerical work. It will make it much > easier to organize a coherent set of notes later on. Apology if > you're subscribed to all and get it 10 times. ] > > As part of the PyData workshop (http://pydataworkshop.eventbrite.com) > to be held March 2 and 3 at the Mountain View Google offices, we have > scheduled a session for an open discussion with Guido van Rossum and > hopefully as many core python-dev members who can make it. We wanted > to seize the combined opportunity of the PyData workshop bringing a > number of 'scipy people' to Google with the timeline for Python 3.3, > the first release after the Python language moratorium, being within > sight: http://www.python.org/dev/peps/pep-0398. > > While a number of scientific Python packages are already available for > Python 3 (either in released form or in their master git branches), > it's fair to say that there hasn't been a major transition of the > scientific community to Python3. Since there is no more development > being done on the Python2 series, eventually we will all want to find > ways to make this transition, and we think that this is an excellent > time to engage the core python development team and consider ideas > that would make Python3 generally a more appealing language for > scientific work. Guido has made it clear that he doesn't speak for > the day-to-day development of Python anymore, so we all should be > aware that any ideas that come out of this panel will still need to be > discussed with python-dev itself via standard mechanisms before > anything is implemented. Nonetheless, the opportunity for a solid > face-to-face dialog for brainstorming was too good to pass up. > > The purpose of this email is then to solicit, from all of our > community, ideas for this discussion. In a week or so we'll need to > summarize the main points brought up here and make a more concrete > agenda out of it; I will also post a summary of the meeting afterwards > here. > > Anything is a valid topic, some points just to get the conversation started: Fernando, For me, the biggest wart to remove is the one addressed by PEP 335: http://www.python.org/dev/peps/pep-0335/. (I can't comment on the specifics of that PEP.) Having to choose between abusing the bitwise operators and using the verbose np.logical_or family is painful. Apart from that, keep it simple. Eric > > - Extra operators/PEP 225. Here's a summary from the last time we > went over this, years ago at Scipy 2008: > http://mail.scipy.org/pipermail/numpy-discussion/2008-October/038234.html, > and the current status of the document we wrote about it is here: > file:///home/fperez/www/site/_build/html/py4science/numpy-pep225/numpy-pep225.html. > > - Improved syntax/support for rationals or decimal literals? While > Python now has both decimals > (http://docs.python.org/library/decimal.html) and rationals > (http://docs.python.org/library/fractions.html), they're quite clunky > to use because they require full constructor calls. 
Guido has > mentioned in previous discussions toying with ideas about support for > different kinds of numeric literals... > > - Using the numpy docstring standard python-wide, and thus having > python improve the pathetic state of the stdlib's docstrings? This is > an area where our community is light years ahead of the standard > library, but we'd all benefit from Python itself improving on this > front. I'm toying with the idea of giving a lighting talk at PyConn > about this, comparing the great, robust culture and tools of good > docstrings across the Scipy ecosystem with the sad, sad state of > docstrings in the stdlib. It might spur some movement on that front > from the stdlib authors, esp. if the core python-dev team realizes the > value and benefit it can bring (at relatively low cost, given how most > of the information does exist, it's just in the wrong places). But > more importantly for us, if there was truly a universal standard for > high-quality docstrings across Python projects, building good > documentation/help machinery would be a lot easier, as we'd know what > to expect and search for (such as rendering them nicely in the ipython > notebook, providing high-quality cross-project help search, etc). > > - Literal syntax for arrays? Sage has been floating a discussion > about a literal matrix syntax > (https://groups.google.com/forum/#!topic/sage-devel/mzwepqZBHnA). For > something like this to go into python in any meaningful way there > would have to be core multidimensional arrays in the language, but > perhaps it's time to think about a piece of the numpy array itself > into Python? This is one of the more 'out there' ideas, but after > all, that's the point of a discussion like this, especially > considering we'll have both Travis and Guido in one room. > > - Other syntactic sugar? Sage has "a..b"<=> range(a, b+1), which I > actually think is both nice and useful... There's also the question > of allowing "a:b:c" notation outside of [], which has come up a few > times in conversation over the last few years. Others? > > - The packaging quagmire? This continues to be a problem, though > python3 does have new improvements to distutils. I'm not really up to > speed on the situation, to be frank. If we want to bring this up, > someone will have to provide a solid reference or volunteer to do it > in person. > > - etc... > > I'm putting the above just to *start* the discussion, but the real > point is for the rest of the community to contribute ideas, so don't > be shy. > > Final note: while I am here commiting to organizing and presenting > this at the discussion with Guido (as well as contacting python-dev), > I would greatly appreciate help with the task of summarizing this > prior to the meeting as I'm pretty badly swamped in the run-in to > pydata/pycon. So if anyone is willing to help draft the summary as > the date draws closer (we can put it up on a github wiki, gist, > whatever), I will be very grateful. I'm sure it will be better than > what I'll otherwise do the last night at 2am :) > > Cheers, > > f > > ps - to the obvious question about webcasting the discussion live for > remote participation: yes, we looked into it already; no, > unfortunately it appears it won't be possible. We'll try to at least > have the audio recorded (and possibly video) for posting later on. > > pps- if you are close to Mountain View and are interested in attending > this panel in person, drop me a line at fernando.perez at berkeley.edu. 
> We have a few spots available *for this discussion only* on top of the > pydata regular attendance (which is long closed, I'm afraid). But > we'll need to provide Google with a list of those attendees in > advance. Please indicate if you are a core python committer in your > email, as we'll give priority for this overflow pool to core python > developers (but will otherwise accommodate as many people as Google > lets us). > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From pierre.haessig at crans.org Wed Feb 15 05:10:49 2012 From: pierre.haessig at crans.org (Pierre Haessig) Date: Wed, 15 Feb 2012 11:10:49 +0100 Subject: [SciPy-Dev] BSP interface in the SciPy. In-Reply-To: <4F3AA1DF.9070007@molden.no> References: <4F3A8654.7080707@molden.no> <4F3A9172.6090905@crans.org> <4F3AA1DF.9070007@molden.no> Message-ID: <4F3B84A9.3090105@crans.org> Le 14/02/2012 19:03, Sturla Molden a ?crit : > It is a way to avoid deadlocks and livelocks in parallel computing. > Computation and ipc are separated in discrete blocks with barrier synch > in between. > > ipc -> barrier -> compute -> barrier -> ipc -> ... > > But it can often be difficult to fit a problem into a BSP paradigm, and > sometimes it yields an inefficient program (the CPUs can spend a > significant amount of time idle on the barriers). Ok, I think I got the global idea now. Thanks a lot. As of today, my exploration of parallel computing didn't go further than using Pool.map() from multiprocessing module. I just run the same simulations with a different set of input parameters. I'm guessing it's a classical use case, similar to the use case of Octave's parcellfun. Simple enough but powerful enough ! Best, Pierre From warren.weckesser at enthought.com Wed Feb 15 13:42:04 2012 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Wed, 15 Feb 2012 12:42:04 -0600 Subject: [SciPy-Dev] Sylvester equation solver for complex matrices In-Reply-To: References: Message-ID: On Sat, Feb 11, 2012 at 11:24 AM, Warren Weckesser < warren.weckesser at enthought.com> wrote: > > > On Sat, Feb 11, 2012 at 4:45 AM, Nils Wagner > wrote: > >> Hi all, >> >> I run a short test on the recent solver for Sylvester >> equations. >> In case of complex input matrices the method returns >> ** On entry to ZTRSYL parameter number 2 had an illegal >> value >> >> Nils >> >> > > Thanks, Nils. The new solve_sylvester function didn't handle complex > matrices correctly. I have submitted a pull request ( > https://github.com/scipy/scipy/pull/155) that should fix this. > > Warren > > Nils, the fix has been committed to the master branch. Let us know if you find any other problems. Warren > >> import numpy as np >> from scipy import linalg as la >> np.random.seed(10) >> a = np.random.rand(20,20)+1j*np.random.rand(20,20) >> b = np.random.rand(20,20)+1j*np.random.rand(20,20) >> q = np.random.rand(20,20)+1j*np.random.rand(20,20) >> >> x = la.solve_sylvester(a,b,q) >> res = la.norm(np.dot(a,x)+np.dot(x,b)-q) >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From nwagner at iam.uni-stuttgart.de Wed Feb 15 16:44:56 2012 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 15 Feb 2012 22:44:56 +0100 Subject: [SciPy-Dev] Sylvester equation solver for complex matrices In-Reply-To: References: Message-ID: On Wed, 15 Feb 2012 12:42:04 -0600 Warren Weckesser wrote: > On Sat, Feb 11, 2012 at 11:24 AM, Warren Weckesser < > warren.weckesser at enthought.com> wrote: > >> >> >> On Sat, Feb 11, 2012 at 4:45 AM, Nils Wagner >>> > wrote: >> >>> Hi all, >>> >>> I run a short test on the recent solver for Sylvester >>> equations. >>> In case of complex input matrices the method returns >>> ** On entry to ZTRSYL parameter number 2 had an >>>illegal >>> value >>> >>> Nils >>> >>> >> >> Thanks, Nils. The new solve_sylvester function didn't >>handle complex >> matrices correctly. I have submitted a pull request ( >> https://github.com/scipy/scipy/pull/155) that should fix >>this. >> >> Warren >> >> > > Nils, the fix has been committed to the master branch. > Let us know if you > find any other problems. > > Warren > > > >> >>> import numpy as np >>> from scipy import linalg as la >>> np.random.seed(10) >>> a = np.random.rand(20,20)+1j*np.random.rand(20,20) >>> b = np.random.rand(20,20)+1j*np.random.rand(20,20) >>> q = np.random.rand(20,20)+1j*np.random.rand(20,20) >>> >>> x = la.solve_sylvester(a,b,q) >>> res = la.norm(np.dot(a,x)+np.dot(x,b)-q) >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/scipy-dev >>> >> >> Hi Warren, Great. It works fine for me. BTW, is the Sylvester equation solver part of the next scipy release ? How about Lyapunov and Riccati equation solvers ? Cheers, Nils From warren.weckesser at enthought.com Wed Feb 15 17:15:46 2012 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Wed, 15 Feb 2012 16:15:46 -0600 Subject: [SciPy-Dev] Sylvester equation solver for complex matrices In-Reply-To: References: Message-ID: On Wed, Feb 15, 2012 at 3:44 PM, Nils Wagner wrote: > On Wed, 15 Feb 2012 12:42:04 -0600 > Warren Weckesser wrote: > > On Sat, Feb 11, 2012 at 11:24 AM, Warren Weckesser < > > warren.weckesser at enthought.com> wrote: > > > >> > >> > >> On Sat, Feb 11, 2012 at 4:45 AM, Nils Wagner > >> >> > wrote: > >> > >>> Hi all, > >>> > >>> I run a short test on the recent solver for Sylvester > >>> equations. > >>> In case of complex input matrices the method returns > >>> ** On entry to ZTRSYL parameter number 2 had an > >>>illegal > >>> value > >>> > >>> Nils > >>> > >>> > >> > >> Thanks, Nils. The new solve_sylvester function didn't > >>handle complex > >> matrices correctly. I have submitted a pull request ( > >> https://github.com/scipy/scipy/pull/155) that should fix > >>this. > >> > >> Warren > >> > >> > > > > Nils, the fix has been committed to the master branch. > > Let us know if you > > find any other problems. > > > > Warren > > > > > > > >> > >>> import numpy as np > >>> from scipy import linalg as la > >>> np.random.seed(10) > >>> a = np.random.rand(20,20)+1j*np.random.rand(20,20) > >>> b = np.random.rand(20,20)+1j*np.random.rand(20,20) > >>> q = np.random.rand(20,20)+1j*np.random.rand(20,20) > >>> > >>> x = la.solve_sylvester(a,b,q) > >>> res = la.norm(np.dot(a,x)+np.dot(x,b)-q) > >>> _______________________________________________ > >>> SciPy-Dev mailing list > >>> SciPy-Dev at scipy.org > >>> http://mail.scipy.org/mailman/listinfo/scipy-dev > >>> > >> > >> > Hi Warren, > > > Great. It works fine for me. 
> BTW, is the Sylvester equation solver part of the next > scipy release ? > Yes, it will be in the next release. > How about Lyapunov and Riccati equation solvers ? > Jeff Armstrong, who wrote solve_sylvester(), is also the author of pydare ( http://code.google.com/p/pydare/). My understanding is that he has reimplemented much of this so that it does not rely on GPL code (there was a thread about it last year, involving Jeff, Ralf, Josef and maybe some others), so he probably has more tools that could end up in scipy. Jeff, what's the status of that work? Are you planning on more contributions? I think it would be great to have the Lyapunov and Riccati solvers in scipy. Warren -------------- next part -------------- An HTML attachment was scrubbed... URL: From jba at sdf.lonestar.org Thu Feb 16 09:58:34 2012 From: jba at sdf.lonestar.org (Jeffrey Armstrong) Date: Thu, 16 Feb 2012 14:58:34 +0000 (UTC) Subject: [SciPy-Dev] Sylvester equation solver for complex matrices In-Reply-To: References: Message-ID: On Wed, 15 Feb 2012, Warren Weckesser wrote: >> How about Lyapunov and Riccati equation solvers ? >> > Jeff, what's the status of that work? Are you planning on more > contributions? I think it would be great to have the Lyapunov and Riccati > solvers in scipy. > > Warren > I do have Riccati and Lyapunov solvers, both continuous and discrete, based on my work in pydare. pydare actually contains only discrete versions, and it contains iterative solvers that really aren't necessary anymore since the Schur decomposition in scipy now supports sorting of the eigenvalues. The only GPL code it had referenced (other than itself...) was Slycot, but Slycot was purely optional. I would be happy to clean up and submit my direct algebraic Riccati and Lyapunov solvers to scipy. I never really had a good idea of where they should reside in scipy, though, so let me know and I'll happily create a branch with the code in it. And I'll take a little more care with the complex cases as well this time... -Jeff Jeff Armstrong - jba at sdf.lonestar.org SDF Public Access UNIX System - http://sdf.lonestar.org -------------- next part -------------- _______________________________________________ SciPy-Dev mailing list SciPy-Dev at scipy.org http://mail.scipy.org/mailman/listinfo/scipy-dev From nwagner at iam.uni-stuttgart.de Thu Feb 16 14:45:43 2012 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Thu, 16 Feb 2012 20:45:43 +0100 Subject: [SciPy-Dev] Sylvester equation solver for complex matrices In-Reply-To: References: Message-ID: On Thu, 16 Feb 2012 14:58:34 +0000 (UTC) Jeffrey Armstrong wrote: > On Wed, 15 Feb 2012, Warren Weckesser wrote: > >>> How about Lyapunov and Riccati equation solvers ? >>> >> Jeff, what's the status of that work? Are you planning >>on more >> contributions? I think it would be great to have the >>Lyapunov and Riccati >> solvers in scipy. >> >> Warren >> > > I do have Riccati and Lyapunov solvers, both continuous >and discrete, based on my work in pydare. pydare >actually contains only discrete versions, and it contains >iterative solvers that really aren't necessary anymore >since the Schur decomposition in scipy now supports >sorting of the eigenvalues. The only GPL code it had >referenced (other than itself...) was Slycot, but Slycot >was purely optional. > > I would be happy to clean up and submit my direct >algebraic Riccati and Lyapunov solvers to scipy. 
I never >really had a good idea of where they should reside in >scipy, though, so let me know and I'll happily create a >branch with the code in it. And I'll take a little more >care with the complex cases as well this time... > > -Jeff > > Jeff Armstrong - jba at sdf.lonestar.org > SDF Public Access UNIX System - http://sdf.lonestar.org Hi Jeff, Thank your very much for your work on solvers for matrix equations. I look forward to seeing the Riccati and Lyapunov solvers in scipy. Cheers, Nils From pav at iki.fi Thu Feb 16 15:59:28 2012 From: pav at iki.fi (Pauli Virtanen) Date: Thu, 16 Feb 2012 21:59:28 +0100 Subject: [SciPy-Dev] Sylvester equation solver for complex matrices In-Reply-To: References: Message-ID: Hi, 16.02.2012 15:58, Jeffrey Armstrong kirjoitti: [clip] > I would be happy to clean up and submit my direct algebraic Riccati and > Lyapunov solvers to scipy. I never really had a good idea of where they > should reside in scipy, though, so let me know and I'll happily create a > branch with the code in it. And I'll take a little more care with the > complex cases as well this time... Good question. One suitable place could scipy.linalg, as it seems that these problems can be cast in terms of pure linear algebra without any field-dependent parts. Maybe also scipy.signal --- but I'm don't know the field exactly to say how well they would fit there. Pauli From warren.weckesser at enthought.com Thu Feb 16 16:02:11 2012 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Thu, 16 Feb 2012 15:02:11 -0600 Subject: [SciPy-Dev] Sylvester equation solver for complex matrices In-Reply-To: References: Message-ID: On Thu, Feb 16, 2012 at 2:59 PM, Pauli Virtanen wrote: > Hi, > > 16.02.2012 15:58, Jeffrey Armstrong kirjoitti: > [clip] > > I would be happy to clean up and submit my direct algebraic Riccati and > > Lyapunov solvers to scipy. I never really had a good idea of where they > > should reside in scipy, though, so let me know and I'll happily create a > > branch with the code in it. And I'll take a little more care with the > > complex cases as well this time... > > Good question. One suitable place could scipy.linalg, as it seems that > these problems can be cast in terms of pure linear algebra without any > field-dependent parts. Maybe also scipy.signal --- but I'm don't know > the field exactly to say how well they would fit there. > Pauli > > scipy.linalg works for me. Warren -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Thu Feb 16 16:51:00 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 16 Feb 2012 22:51:00 +0100 Subject: [SciPy-Dev] Numpy-dev and scipy-0.10.1 test errors In-Reply-To: References: <4F381C9A.4080000@uci.edu> Message-ID: 2012/2/14 St?fan van der Walt > On Sun, Feb 12, 2012 at 12:19 PM, Ralf Gommers > wrote: > > These do like a serious issue, it's the same one that caused MinGW 3.4.5 > to > > stop working: > > > > ====================================================================== > > ERROR: test_datetime_array_str (test_datetime.TestDateTime) > > ---------------------------------------------------------------------- > > I noticed this one on the buildbot as well: > > > http://buildbot.scipy.org/builders/Windows_XP_x86/builds/1112/steps/shell_1/logs/stdio > > The datetime ones should be fixed by moving to gcc 4.x, although the MSVC failures suggest something else is wrong there. 
See https://github.com/numpy/numpy/pull/156 and http://projects.scipy.org/numpy/ticket/1909. The overflow issue is http://projects.scipy.org/numpy/ticket/1755. Would be great if someone could look at that one. The last two are new, I'll add them to the list to be solved before the next release. If anyone wants to have a go, please do! Thanks, Ralf ====================================================================== FAIL: test_finfo_repr (test_getlimits.TestRepr) ---------------------------------------------------------------------- Traceback (most recent call last): File "C:\buildbot\numpy\b11\numpy-install25\Lib\site-packages\numpy\core\tests\test_getlimits.py", line 66, in test_finfo_repr assert_equal(repr(np.finfo(np.float32)), expected) File "..\numpy-install25\Lib\site-packages\numpy\testing\utils.py", line 313, in assert_equal AssertionError: Items are not equal: ACTUAL: 'finfo(resolutioRunning unit tests for numpy NumPy version 2.0.0.dev-Unknown NumPy is installed in C:\buildbot\numpy\b11\numpy-install25\Lib\site-packages\numpy Python version 2.5.4 (r254:67916, Dec 23 2008, 15:10:54) [MSC v.1310 32 bit (Intel)] nose version 0.11.2n=1e-006, min=-3.4028235e+038, max=3.4028235e+038, dtype=float32)' DESIRED: 'finfo(resolution=1e-06, min=-3.4028235e+38, max=3.4028235e+38, dtype=float32)' ====================================================================== FAIL: test_complex_arrays (test_io.TestSaveTxt) ---------------------------------------------------------------------- Traceback (most recent call last): File "C:\buildbot\numpy\b11\numpy-install25\Lib\site-packages\numpy\lib\tests\test_io.py", line 291, in test_complex_arrays ' ( +3.142e+00+ +2.718e+00j) ( +3.142e+00+ +2.718e+00j)\n'])) File "..\numpy-install25\Lib\site-packages\numpy\ma\testutils.py", line 93, in assert_equal File "..\numpy-install25\Lib\site-packages\numpy\ma\testutils.py", line 66, in _assert_equal_on_sequences File "..\numpy-install25\Lib\site-packages\numpy\ma\testutils.py", line 97, in assert_equal AssertionError: Items are not equal: item=0 ACTUAL: ' ( +3.142e+000+ +2.718e+000j) ( +3.142e+000+ +2.718e+000j)\n' DESIRED: ' ( +3.142e+00+ +2.718e+00j) ( +3.142e+00+ +2.718e+00j)\n' -------------- next part -------------- An HTML attachment was scrubbed... URL: From jba at sdf.lonestar.org Fri Feb 17 09:58:13 2012 From: jba at sdf.lonestar.org (Jeffrey Armstrong) Date: Fri, 17 Feb 2012 14:58:13 +0000 (UTC) Subject: [SciPy-Dev] Sylvester equation solver for complex matrices In-Reply-To: References: Message-ID: On Thu, 16 Feb 2012, Warren Weckesser wrote: > > scipy.linalg works for me. > I think placing them in scipy.linalg makes sense. However, would placing them outside of basic.py be the best course of action? There'll be 4 new solver functions for the continous and discrete forms of each equation (Lyapunov and Riccati). Should they be referred to in the "Basics" section of the scipy.linalg docs? Someone had mentioned maybe having a "Solvers" section. 
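For reference, the discrete case can be written down directly in a few lines via the Kronecker identity vec(A X A^H) = kron(conj(A), A) vec(X). This is only an illustrative sketch (the function name is made up, and the O(n^6) cost makes it usable for small matrices only), not the Schur-based code that would actually be proposed for scipy:

import numpy as np
from scipy import linalg as la

def _lyap_discrete_dense(a, q):
    # Solve A X A^H - X + Q = 0 by rewriting it as the linear system
    # (I - kron(conj(A), A)) vec(X) = vec(Q).  Only sensible for small n.
    n = a.shape[0]
    lhs = np.eye(n * n) - np.kron(a.conj(), a)
    x = la.solve(lhs, q.flatten('F'))
    return x.reshape((n, n), order='F')

np.random.seed(0)
a = 0.3 * np.random.rand(5, 5)   # spectral radius < 1, so a unique solution exists
q = np.random.rand(5, 5)
x = _lyap_discrete_dense(a, q)
print(np.allclose(np.dot(np.dot(a, x), a.conj().T) - x + q, 0))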
-Jeff Jeff Armstrong - jba at sdf.lonestar.org SDF Public Access UNIX System - http://sdf.lonestar.org -------------- next part -------------- _______________________________________________ SciPy-Dev mailing list SciPy-Dev at scipy.org http://mail.scipy.org/mailman/listinfo/scipy-dev From warren.weckesser at enthought.com Fri Feb 17 10:29:46 2012 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Fri, 17 Feb 2012 09:29:46 -0600 Subject: [SciPy-Dev] Sylvester equation solver for complex matrices In-Reply-To: References: Message-ID: On Fri, Feb 17, 2012 at 8:58 AM, Jeffrey Armstrong wrote: > On Thu, 16 Feb 2012, Warren Weckesser wrote: > >> >> scipy.linalg works for me. >> >> > I think placing them in scipy.linalg makes sense. However, would placing > them outside of basic.py be the best course of action? There'll be 4 new > solver functions for the continous and discrete forms of each equation > (Lyapunov and Riccati). They could go in a private module, say _solvers.py, and be imported into the scipy.linalg namespace in __init__.py. > Should they be referred to in the "Basics" section of the scipy.linalg > docs? Someone had mentioned maybe having a "Solvers" section. > > A separate section on solvers sounds good to me. Warren -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Fri Feb 17 10:46:26 2012 From: charlesr.harris at gmail.com (Charles R Harris) Date: Fri, 17 Feb 2012 08:46:26 -0700 Subject: [SciPy-Dev] Sylvester equation solver for complex matrices In-Reply-To: References: Message-ID: On Fri, Feb 17, 2012 at 8:29 AM, Warren Weckesser < warren.weckesser at enthought.com> wrote: > > > On Fri, Feb 17, 2012 at 8:58 AM, Jeffrey Armstrong wrote: > >> On Thu, 16 Feb 2012, Warren Weckesser wrote: >> >>> >>> scipy.linalg works for me. >>> >>> >> I think placing them in scipy.linalg makes sense. However, would placing >> them outside of basic.py be the best course of action? There'll be 4 new >> solver functions for the continous and discrete forms of each equation >> (Lyapunov and Riccati). > > > > They could go in a private module, say _solvers.py, and be imported into > the scipy.linalg namespace in __init__.py. > > _solvers.py sounds a bit generic, a more descriptive name might be better. > > >> Should they be referred to in the "Basics" section of the scipy.linalg >> docs? Someone had mentioned maybe having a "Solvers" section. >> >> > > > A separate section on solvers sounds good to me. > > Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Sat Feb 18 07:46:35 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 18 Feb 2012 13:46:35 +0100 Subject: [SciPy-Dev] [SciPy-User] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: References: Message-ID: On Mon, Feb 13, 2012 at 5:58 PM, Paul Anton Letnes < paul.anton.letnes at gmail.com> wrote: > Hi, > > I screwed up my previous e-mail (using my non-virtualenv, regular python) > and I am trying again. These are the results on my machine - OS X 10.7.3, > gcc 4.2 from apple, and gfortran 4.6.2 from Homebrew-alt. > > Paul > > Running unit tests for scipy > NumPy version 1.6.1 > NumPy is installed in > /Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy > SciPy version 0.10.1rc1 > SciPy is installed in > /Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy > Python version 2.7.2 (default, Oct 9 2011, 18:03:13) [GCC 4.2.1 (Apple > Inc. 
build 5666) (dot 3)] > nose version 1.1.2 > ............................................................................................................................................................................................................................K............................................................................................................/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/interpolate/fitpack2.py:674: > UserWarning: > The coefficients of the spline returned have been computed as the > minimal norm least-squares solution of a (numerically) rank deficient > system (deficiency=7). If deficiency is large, the results may be > inaccurate. Deficiency may strongly depend on the value of eps. > warnings.warn(message) > ....../Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/interpolate/fitpack2.py:605: > UserWarning: > The required storage space exceeds the available storage space: nxest > or nyest too small, or s too small. > The weighted least-squares spline corresponds to the current set of > knots. > warnings.warn(message) > ........................K..K.................................................................................................................................................................................................................................................................................................................................................................................................................................................../Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/io/wavfile.py:31: > WavFileWarning: Unfamiliar format bytes > warnings.warn("Unfamiliar format bytes", WavFileWarning) > /Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/io/wavfile.py:121: > WavFileWarning: chunk not understood > warnings.warn("chunk not understood", WavFileWarning) > > ....................................................................................F..FF......................................................................................................................................SSSSSS......SSSSSS......SSSS.....................FFF.........................................F....FF.......S............................................................................................................................................................................................................................................................K......................................................................................................................................................................................................SSSSS............S.............................................................................................................................................................................................. 
> ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................SSSSSSSSSSS.........../Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py:63: > UserWarning: Single-precision types in `eigs` and `eighs` are not supported > currently. Double precision routines are used instead. > warnings.warn("Single-precision types in `eigs` and `eighs` " > > ....F.F.....................F...........F.F..............................................................................................F........................F.........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................K...............................................................K................................................................. > > ..........................................................................................KK.............................................................................................................................................................................................................................................................................................................................................................................................................................................K.K.............................................................................................................................................................................................................................................................................................................................................................................................K........K..............SSSSSSS..................................................... 
> > .....................................................................................................S.............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. > ====================================================================== > FAIL: test_asum (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/lib/blas/tests/test_blas.py", > line 58, in test_asum > assert_almost_equal(f([3,-4,5]),12) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: 12 > > ====================================================================== > FAIL: test_dot (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/lib/blas/tests/test_blas.py", > line 67, in test_dot > assert_almost_equal(f([3,-4,5],[2,5,1]),-9) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: -9 > > ====================================================================== > FAIL: test_nrm2 (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/lib/blas/tests/test_blas.py", > line 78, in test_nrm2 > assert_almost_equal(f([3,-4,5]),math.sqrt(50)) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: 7.0710678118654755 > > ====================================================================== > FAIL: test_basic.TestNorm.test_overflow > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", > line 197, in runTest > self.test(*self.arg) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_basic.py", > line 581, in test_overflow > assert_almost_equal(norm(a), a) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 452, in assert_almost_equal > return assert_array_almost_equal(actual, desired, decimal, err_msg) > File > 
"/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 800, in assert_array_almost_equal > header=('Arrays are not almost equal to %d decimals' % decimal)) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 636, in assert_array_compare > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > > (mismatch 100.0%) > x: array(-0.0) > y: array([ 1.00000002e+20], dtype=float32) > > ====================================================================== > FAIL: test_basic.TestNorm.test_stable > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", > line 197, in runTest > self.test(*self.arg) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_basic.py", > line 586, in test_stable > assert_almost_equal(norm(a) - 1e4, 0.5) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: -10000.0 > DESIRED: 0.5 > > ====================================================================== > FAIL: test_basic.TestNorm.test_types > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", > line 197, in runTest > self.test(*self.arg) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_basic.py", > line 568, in test_types > assert_allclose(norm(x), np.sqrt(14), rtol=tol) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 1168, in assert_allclose > verbose=verbose, header=header) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 636, in assert_array_compare > raise AssertionError(msg) > AssertionError: > Not equal to tolerance rtol=2.38419e-06, atol=0 > > (mismatch 100.0%) > x: array(1.0842021724855044e-19) > y: array(3.7416573867739413) > > ====================================================================== > FAIL: test_asum (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_blas.py", > line 99, in test_asum > assert_almost_equal(f([3,-4,5]),12) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: 12 > > ====================================================================== > FAIL: test_dot (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_blas.py", > line 109, in test_dot > assert_almost_equal(f([3,-4,5],[2,5,1]),-9) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 468, in assert_almost_equal > raise 
AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: -9 > > ====================================================================== > FAIL: test_nrm2 (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_blas.py", > line 127, in test_nrm2 > assert_almost_equal(f([3,-4,5]),math.sqrt(50)) > File > "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", > line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: 7.0710678118654755 > > I'm not sure what to make of these. I'm tempted to say that for now only the recommended gfortran is supported. There's too much going wrong on OS X Lion to be able to fix it all for 0.10.1. For 0.11.0 we should attempt to get this fixed, including the llvm-gcc situation. Can you check where that gfortran 4.6.2 actually comes from? Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Sat Feb 18 11:53:46 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 18 Feb 2012 17:53:46 +0100 Subject: [SciPy-Dev] Goodbye maxentropy! Message-ID: Hi, Here's a heads up that I plan to merge https://github.com/scipy/scipy/pull/151 soon. It removes the scipy.maxentropy module, in preparation for the 0.11 release. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Sat Feb 18 16:35:35 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 18 Feb 2012 22:35:35 +0100 Subject: [SciPy-Dev] Deprecating misc.radon Message-ID: Hi all, Would anyone be opposed to deprecating misc.radon for 0.11? It's a small untested and undocumented function that doesn't really fit anywhere. In scikit-image there's a much better-looking radon function, see http://scikits-image.org/docs/dev/auto_examples/plot_radon_transform.html. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From travis at continuum.io Sat Feb 18 17:09:50 2012 From: travis at continuum.io (Travis Oliphant) Date: Sat, 18 Feb 2012 16:09:50 -0600 Subject: [SciPy-Dev] Deprecating misc.radon In-Reply-To: References: Message-ID: <693529B5-D662-42AE-9506-D8F1FAD77862@continuum.io> I'm fine with that if it exists in scikits-image. I'm sure you will point people to that function in the docstring and in the deprecation warning. -Travis On Feb 18, 2012, at 3:35 PM, Ralf Gommers wrote: > Hi all, > > Would anyone be opposed to deprecating misc.radon for 0.11? It's a small untested and undocumented function that doesn't really fit anywhere. In scikit-image there's a much better-looking radon function, see http://scikits-image.org/docs/dev/auto_examples/plot_radon_transform.html. > > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From warren.weckesser at enthought.com Sat Feb 18 17:14:20 2012 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Sat, 18 Feb 2012 16:14:20 -0600 Subject: [SciPy-Dev] Deprecating misc.radon In-Reply-To: References: Message-ID: On Sat, Feb 18, 2012 at 3:35 PM, Ralf Gommers wrote: > Hi all, > > Would anyone be opposed to deprecating misc.radon for 0.11? It's a small > untested and undocumented function that doesn't really fit anywhere. In > scikit-image there's a much better-looking radon function, see > http://scikits-image.org/docs/dev/auto_examples/plot_radon_transform.html. > > Ralf > > That's fine with me. Warren -------------- next part -------------- An HTML attachment was scrubbed... URL: From scott.sinclair.za at gmail.com Mon Feb 20 01:23:24 2012 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Mon, 20 Feb 2012 08:23:24 +0200 Subject: [SciPy-Dev] Bento build - Test failures Message-ID: Hi, I guess this is aimed at David C.. When scipy has been built with Bento/waf there are several test failures related to undefined Fortran symbols (see an example below). Pauli thinks it's related to library link order (see discussion at https://github.com/scipy/scipy/pull/158). It's not yet clear to me how to define the order of the flags at link time. I'm guessing that waf needs to be told about this in the package level bscript? ERROR: Failure: ImportError (/home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/special/_cephes.so: undefined symbol: _gfortran_transfer_character_write) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/scott/.local/lib/python2.7/site-packages/nose/loader.py", line 390, in loadTestsFromName addr.filename, addr.module) File "/home/scott/.local/lib/python2.7/site-packages/nose/importer.py", line 39, in importFromPath return self.importFromDir(dir_path, fqname) File "/home/scott/.local/lib/python2.7/site-packages/nose/importer.py", line 86, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "/home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/integrate/__init__.py", line 50, in from quadrature import * File "/home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/integrate/quadrature.py", line 5, in from scipy.special.orthogonal import p_roots File "/home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/special/__init__.py", line 525, in from _cephes import * ImportError: /home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/special/_cephes.so: undefined symbol: _gfortran_transfer_character_write Cheers, Scott From ralf.gommers at googlemail.com Mon Feb 20 02:51:50 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Mon, 20 Feb 2012 08:51:50 +0100 Subject: [SciPy-Dev] ANN: SciPy 0.10.1 release candidate 2 Message-ID: Hi all, I am pleased to announce the availability of the second release candidate of SciPy 0.10.1. Please try out this release and report any problems on the scipy-dev mailing list. If no new problems are found, the final release will be available in one week. Sources and binaries can be found at http://sourceforge.net/projects/scipy/files/scipy/0.10.1rc2/, release notes are copied below. 
Things that were fixed between RC1 and RC2: - a compile error with MSVC9 against NumPy master - include missing Bento files in source release - some Python 3.x test warnings - an issue with misc.imread when PIL is not installed - Arpack single-precision test failures - cleaned up DeprecationWarnings when Umfpack is not installed Cheers, Ralf ========================== SciPy 0.10.1 Release Notes ========================== .. contents:: SciPy 0.10.1 is a bug-fix release with no new features compared to 0.10.0. Main changes ------------ The most important changes are:: 1. The single precision routines of ``eigs`` and ``eigsh`` in ``scipy.sparse.linalg`` have been disabled (they internally use double precision now). 2. A compatibility issue related to changes in NumPy macros has been fixed, in order to make scipy 0.10.1 compile with the upcoming numpy 1.7.0 release. Other issues fixed ------------------ - #835: stats: nan propagation in stats.distributions - #1202: io: netcdf segfault - #1531: optimize: make curve_fit work with method as callable. - #1560: linalg: fixed mistake in eig_banded documentation. - #1565: ndimage: bug in ndimage.variance - #1457: ndimage: standard_deviation does not work with sequence of indexes - #1562: cluster: segfault in linkage function - #1568: stats: One-sided fisher_exact() returns `p` < 1 for 0 successful attempts - #1575: stats: zscore and zmap handle the axis keyword incorrectly Checksums ========= b119828c64a68794c9562f8228dd7cf9 release/installers/scipy-0.10.1rc2-py2.7-python.org-macosx10.6.dmg 605a30b8a33ff6763261ffde59a38bb9 release/installers/scipy-0.10.1rc2-win32-superpack-python2.5.exe 5a056ed6dbb9abd10bd824a64c47c159 release/installers/scipy-0.10.1rc2-win32-superpack-python2.6.exe 0498bb3f48d0cb251cb9e527b454a50b release/installers/scipy-0.10.1rc2-win32-superpack-python2.7.exe f078ebc55d1b7d832474c8379852062c release/installers/scipy-0.10.1rc2-win32-superpack-python3.1.exe 3ee82aebb2c1d0425fb85c72c6eea80e release/installers/scipy-0.10.1rc2-win32-superpack-python3.2.exe 540ec78cb451bfebbd8f84193fe76581 release/installers/scipy-0.10.1rc2.tar.gz 4813d52623ae63ed492e28bdcecee4c0 release/installers/scipy-0.10.1rc2.zip -------------- next part -------------- An HTML attachment was scrubbed... URL: From cournape at gmail.com Mon Feb 20 03:27:54 2012 From: cournape at gmail.com (David Cournapeau) Date: Mon, 20 Feb 2012 08:27:54 +0000 Subject: [SciPy-Dev] Bento build - Test failures In-Reply-To: References: Message-ID: Le 20 f?vr. 2012 06:23, "Scott Sinclair" a ?crit : > > Hi, > > I guess this is aimed at David C.. > > When scipy has been built with Bento/waf there are several test > failures related to undefined Fortran symbols (see an example below). > Pauli thinks it's related to library link order (see discussion at > https://github.com/scipy/scipy/pull/158). > > It's not yet clear to me how to define the order of the flags at link > time. I'm guessing that waf needs to be told about this in the package > level bscript? Pauli is most likely right. Solving link order dependency is not so simple unless you hardcode things: I am working on a way to add the feature to waf in a generic way. 
David > > ERROR: Failure: ImportError > (/home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/special/_cephes.so: > undefined symbol: _gfortran_transfer_character_write) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/home/scott/.local/lib/python2.7/site-packages/nose/loader.py", > line 390, in loadTestsFromName > addr.filename, addr.module) > File "/home/scott/.local/lib/python2.7/site-packages/nose/importer.py", > line 39, in importFromPath > return self.importFromDir(dir_path, fqname) > File "/home/scott/.local/lib/python2.7/site-packages/nose/importer.py", > line 86, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File "/home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/integrate/__init__.py", > line 50, in > from quadrature import * > File "/home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/integrate/quadrature.py", > line 5, in > from scipy.special.orthogonal import p_roots > File "/home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/special/__init__.py", > line 525, in > from _cephes import * > ImportError: /home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/special/_cephes.so: > undefined symbol: _gfortran_transfer_character_write > > Cheers, > Scott > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From scott.sinclair.za at gmail.com Mon Feb 20 05:46:11 2012 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Mon, 20 Feb 2012 12:46:11 +0200 Subject: [SciPy-Dev] Bento build - Test failures In-Reply-To: References: Message-ID: On 20 February 2012 10:27, David Cournapeau wrote: > > Le 20 f?vr. 2012 06:23, "Scott Sinclair" a > ?crit?: > >> When scipy has been built with Bento/waf there are several test >> failures related to undefined Fortran symbols (see an example below). >> Pauli thinks it's related to library link order (see discussion at >> https://github.com/scipy/scipy/pull/158). >> >> It's not yet clear to me how to define the order of the flags at link >> time. I'm guessing that waf needs to be told about this in the package >> level bscript? > > Pauli is most likely right. Solving link order dependency is not so simple > unless you hardcode things: I am working on a way to add the feature to waf > in a generic way. Thanks. I've opened a new ticket (http://projects.scipy.org/scipy/ticket/1601) so it's not forgotten and will see whether I can understand how to hardcode something in the meantime. It's really refreshing to build Scipy *quickly* with Bento on a multi-core machine! Cheers, Scott From robert.kern at gmail.com Mon Feb 20 05:49:42 2012 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 20 Feb 2012 10:49:42 +0000 Subject: [SciPy-Dev] Bento build - Test failures In-Reply-To: References: Message-ID: On Mon, Feb 20, 2012 at 06:23, Scott Sinclair wrote: > Hi, > > I guess this is aimed at David C.. > > When scipy has been built with Bento/waf there are several test > failures related to undefined Fortran symbols (see an example below). > Pauli thinks it's related to library link order (see discussion at > https://github.com/scipy/scipy/pull/158). > > It's not yet clear to me how to define the order of the flags at link > time. 
I'm guessing that waf needs to be told about this in the package > level bscript? > > ERROR: Failure: ImportError > (/home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/special/_cephes.so: > undefined symbol: _gfortran_transfer_character_write) Out of curiosity, can you post the link line that Bento executed for the _cephes.so module? -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." ? -- Umberto Eco From scott.sinclair.za at gmail.com Mon Feb 20 06:28:41 2012 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Mon, 20 Feb 2012 13:28:41 +0200 Subject: [SciPy-Dev] Bento build - Test failures In-Reply-To: References: Message-ID: On 20 February 2012 12:49, Robert Kern wrote: > On Mon, Feb 20, 2012 at 06:23, Scott Sinclair > wrote: >> Hi, >> >> I guess this is aimed at David C.. >> >> When scipy has been built with Bento/waf there are several test >> failures related to undefined Fortran symbols (see an example below). >> Pauli thinks it's related to library link order (see discussion at >> https://github.com/scipy/scipy/pull/158). >> >> It's not yet clear to me how to define the order of the flags at link >> time. I'm guessing that waf needs to be told about this in the package >> level bscript? >> >> ERROR: Failure: ImportError >> (/home/scott/.virtualenvs/scipy-sandbox/local/lib/python2.7/site-packages/scipy/special/_cephes.so: >> undefined symbol: _gfortran_transfer_character_write) > > Out of curiosity, can you post the link line that Bento executed for > the _cephes.so module? Sure. Here you go: [1080/1080] cshlib: build/scipy/special/_cephesmodule.c.1.o build/scipy/special/amos_wrappers.c.1.o build/scipy/special/specfun_wrappers.c.1.o build/scipy/special/toms_wrappers.c.1.o build/scipy/special/cdf_wrappers.c.1.o build/scipy/special/ufunc_extras.c.1.o -> build/scipy/special/_cephes.so 13:19:47 runner ['/usr/bin/gcc', '-shared', '-Wl,-Bsymbolic-functions', '-pthread', '-Wl,-O1', '-Wl,-Bsymbolic-functions', '-Wl,-Bsymbolic-functions', '-Lrelro', '-L/usr/lib/gcc/x86_64-linux-gnu/4.6.1', '-L/usr/lib/gcc/x86_64-linux-gnu/4.6.1/../../../x86_64-linux-gnu', '-L/usr/lib/gcc/x86_64-linux-gnu/4.6.1/../../../../lib', '-L/lib/x86_64-linux-gnu', '-L/lib/../lib', '-L/usr/lib/x86_64-linux-gnu', '-L/usr/lib/../lib', '-L/usr/lib/gcc/x86_64-linux-gnu/4.6.1/../../..', '-lgfortran', '-lm', '-lquadmath', '-lm', 'scipy/special/_cephesmodule.c.1.o', 'scipy/special/amos_wrappers.c.1.o', 'scipy/special/specfun_wrappers.c.1.o', 'scipy/special/toms_wrappers.c.1.o', 'scipy/special/cdf_wrappers.c.1.o', 'scipy/special/ufunc_extras.c.1.o', '-o', '/home/scott/external_repos/scipy/build/scipy/special/_cephes.so', '-Wl,-Bstatic', '-Lscipy/special', '-Lscipy/special', '-Lscipy/special', '-Lscipy/special', '-Lscipy/special', '-Lscipy/special', '-Lscipy/special', '-lsc_amos', '-lsc_toms', '-lsc_c_misc', '-lsc_cephes', '-lsc_mach', '-lsc_cdf', '-lsc_specfunlib', '-Wl,-Bdynamic', '-L/home/scott/.local/lib/python2.7/site-packages/numpy/core/lib', '-lnpymath', '-lm'] No doubt everything in the runner list gets concatenated to form the command line. 
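One quick way to confirm it really is a link-order problem rather than a missing library (just a diagnostic sketch; the libgfortran soname below is a guess for this gcc 4.6 box and will differ on other systems):

import ctypes
# Pre-load the Fortran runtime with its symbols made globally visible.
# If importing the extension then succeeds, _cephes.so simply wasn't
# linked against libgfortran properly, i.e. the flags/order on the
# link line above are to blame.
ctypes.CDLL("libgfortran.so.3", mode=ctypes.RTLD_GLOBAL)
from scipy.special import _cephes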
Cheers, Scott From ralf.gommers at googlemail.com Mon Feb 20 16:59:38 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Mon, 20 Feb 2012 22:59:38 +0100 Subject: [SciPy-Dev] Numpy-dev and scipy-0.10.1 test errors In-Reply-To: <4F381C9A.4080000@uci.edu> References: <4F381C9A.4080000@uci.edu> Message-ID: On Sun, Feb 12, 2012 at 9:10 PM, Christoph Gohlke wrote: > Hello, > > while testing msvc9/MKL builds of scipy-0.10.1rc1 and numpy-dev on > win-amd64-py2.7 I got 20 numpy test errors and 2 scipy test errors. The > full tests results are attached. I have not looked at them in detail. There > are no test errors or failures with numpy 1.6.1 and scipy-0.10.1rc1. > Christoph, with current master plus PRs 214 and 215 all the numpy test errors you're seeing should be fixed, with the exception of the numpy.maone. Could you please verify that, and have a look at what's going on with the last error (does it just need to be silenced, or is there a problem?). Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From cgohlke at uci.edu Mon Feb 20 18:44:24 2012 From: cgohlke at uci.edu (Christoph Gohlke) Date: Mon, 20 Feb 2012 15:44:24 -0800 Subject: [SciPy-Dev] Numpy-dev and scipy-0.10.1 test errors In-Reply-To: References: <4F381C9A.4080000@uci.edu> Message-ID: <4F42DAD8.4070109@uci.edu> On 2/20/2012 1:59 PM, Ralf Gommers wrote: > > > On Sun, Feb 12, 2012 at 9:10 PM, Christoph Gohlke > wrote: > > Hello, > > while testing msvc9/MKL builds of scipy-0.10.1rc1 and numpy-dev on > win-amd64-py2.7 I got 20 numpy test errors and 2 scipy test errors. > The full tests results are attached. I have not looked at them in > detail. There are no test errors or failures with numpy 1.6.1 and > scipy-0.10.1rc1. > > > Christoph, with current master plus PRs 214 and 215 all the numpy test > errors you're seeing should be fixed, with the exception of the numpy.ma > one. Could you please verify that, and have a look at > what's going on with the last error (does it just need to be silenced, > or is there a problem?). > > Ralf > > Errors are down to 14. The test output is attached. Christoph -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: numpy-2.0.0.dev-0c1c499.win-amd64-py2.7-tests.txt URL: From paul.anton.letnes at gmail.com Tue Feb 21 02:12:03 2012 From: paul.anton.letnes at gmail.com (Paul Anton Letnes) Date: Tue, 21 Feb 2012 08:12:03 +0100 Subject: [SciPy-Dev] [SciPy-User] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: References: Message-ID: <2B4D4FCB-73C2-49C4-BABC-D98C515B51BB@gmail.com> On 18. feb. 2012, at 13:46, Ralf Gommers wrote: > > > On Mon, Feb 13, 2012 at 5:58 PM, Paul Anton Letnes wrote: > Hi, > > I screwed up my previous e-mail (using my non-virtualenv, regular python) and I am trying again. These are the results on my machine - OS X 10.7.3, gcc 4.2 from apple, and gfortran 4.6.2 from Homebrew-alt. > > Paul > > Running unit tests for scipy > NumPy version 1.6.1 > NumPy is installed in /Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy > SciPy version 0.10.1rc1 > SciPy is installed in /Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy > Python version 2.7.2 (default, Oct 9 2011, 18:03:13) [GCC 4.2.1 (Apple Inc. 
build 5666) (dot 3)] > nose version 1.1.2 > ............................................................................................................................................................................................................................K............................................................................................................/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/interpolate/fitpack2.py:674: UserWarning: > The coefficients of the spline returned have been computed as the > minimal norm least-squares solution of a (numerically) rank deficient > system (deficiency=7). If deficiency is large, the results may be > inaccurate. Deficiency may strongly depend on the value of eps. > warnings.warn(message) > ....../Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/interpolate/fitpack2.py:605: UserWarning: > The required storage space exceeds the available storage space: nxest > or nyest too small, or s too small. > The weighted least-squares spline corresponds to the current set of > knots. > warnings.warn(message) > ........................K..K.................................................................................................................................................................................................................................................................................................................................................................................................................................................../Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/io/wavfile.py:31: WavFileWarning: Unfamiliar format bytes > warnings.warn("Unfamiliar format bytes", WavFileWarning) > /Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/io/wavfile.py:121: WavFileWarning: chunk not understood > warnings.warn("chunk not understood", WavFileWarning) > ....................................................................................F..FF......................................................................................................................................SSSSSS......SSSSSS......SSSS.....................FFF.........................................F....FF.......S............................................................................................................................................................................................................................................................K......................................................................................................................................................................................................SSSSS............S.............................................................................................................................................................................................. 
> ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................SSSSSSSSSSS.........../Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py:63: UserWarning: Single-precision types in `eigs` and `eighs` are not supported currently. Double precision routines are used instead. > warnings.warn("Single-precision types in `eigs` and `eighs` " > ....F.F.....................F...........F.F..............................................................................................F........................F.........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................K...............................................................K................................................................. > ..........................................................................................KK.............................................................................................................................................................................................................................................................................................................................................................................................................................................K.K.............................................................................................................................................................................................................................................................................................................................................................................................K........K..............SSSSSSS..................................................... 
> .....................................................................................................S.............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. > ====================================================================== > FAIL: test_asum (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/lib/blas/tests/test_blas.py", line 58, in test_asum > assert_almost_equal(f([3,-4,5]),12) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: 12 > > ====================================================================== > FAIL: test_dot (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/lib/blas/tests/test_blas.py", line 67, in test_dot > assert_almost_equal(f([3,-4,5],[2,5,1]),-9) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: -9 > > ====================================================================== > FAIL: test_nrm2 (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/lib/blas/tests/test_blas.py", line 78, in test_nrm2 > assert_almost_equal(f([3,-4,5]),math.sqrt(50)) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: 7.0710678118654755 > > ====================================================================== > FAIL: test_basic.TestNorm.test_overflow > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest > self.test(*self.arg) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_basic.py", line 581, in test_overflow > assert_almost_equal(norm(a), a) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 452, in assert_almost_equal > return assert_array_almost_equal(actual, desired, decimal, err_msg) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 800, in 
assert_array_almost_equal > header=('Arrays are not almost equal to %d decimals' % decimal)) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 636, in assert_array_compare > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > > (mismatch 100.0%) > x: array(-0.0) > y: array([ 1.00000002e+20], dtype=float32) > > ====================================================================== > FAIL: test_basic.TestNorm.test_stable > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest > self.test(*self.arg) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_basic.py", line 586, in test_stable > assert_almost_equal(norm(a) - 1e4, 0.5) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: -10000.0 > DESIRED: 0.5 > > ====================================================================== > FAIL: test_basic.TestNorm.test_types > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/nose/case.py", line 197, in runTest > self.test(*self.arg) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_basic.py", line 568, in test_types > assert_allclose(norm(x), np.sqrt(14), rtol=tol) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 1168, in assert_allclose > verbose=verbose, header=header) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 636, in assert_array_compare > raise AssertionError(msg) > AssertionError: > Not equal to tolerance rtol=2.38419e-06, atol=0 > > (mismatch 100.0%) > x: array(1.0842021724855044e-19) > y: array(3.7416573867739413) > > ====================================================================== > FAIL: test_asum (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_blas.py", line 99, in test_asum > assert_almost_equal(f([3,-4,5]),12) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: 12 > > ====================================================================== > FAIL: test_dot (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_blas.py", line 109, in test_dot > assert_almost_equal(f([3,-4,5],[2,5,1]),-9) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: -9 > > 
====================================================================== > FAIL: test_nrm2 (test_blas.TestFBLAS1Simple) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/scipy/linalg/tests/test_blas.py", line 127, in test_nrm2 > assert_almost_equal(f([3,-4,5]),math.sqrt(50)) > File "/Users/paulanto/Skrot/src/scipy-test/lib/python2.7/site-packages/numpy/testing/utils.py", line 468, in assert_almost_equal > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal to 7 decimals > ACTUAL: 0.0 > DESIRED: 7.0710678118654755 > > I'm not sure what to make of these. I'm tempted to say that for now only the recommended gfortran is supported. There's too much going wrong on OS X Lion to be able to fix it all for 0.10.1. > > For 0.11.0 we should attempt to get this fixed, including the llvm-gcc situation. > > Can you check where that gfortran 4.6.2 actually comes from? > > Ralf The homebrew formula is here: https://github.com/adamv/homebrew-alt/blob/master/duplicates/gcc.rb and it looks like this line says where to download from: url 'http://ftpmirror.gnu.org/gcc/gcc-4.6.2/gcc-4.6.2.tar.bz2' I'll try to build with the recommended gfortran-4.2 as you suggested. Several people have pointed out the issues haunting gfortran on OS X. Annoying... Paul From paul.anton.letnes at gmail.com Tue Feb 21 02:29:37 2012 From: paul.anton.letnes at gmail.com (Paul Anton Letnes) Date: Tue, 21 Feb 2012 08:29:37 +0100 Subject: [SciPy-Dev] [SciPy-User] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: References: Message-ID: > I'm not sure what to make of these. I'm tempted to say that for now only the recommended gfortran is supported. There's too much going wrong on OS X Lion to be able to fix it all for 0.10.1. > > For 0.11.0 we should attempt to get this fixed, including the llvm-gcc situation. > > Can you check where that gfortran 4.6.2 actually comes from? > > Ralf After rebuilding with the recommended gfortran-4.2 this is the output from the tests. It does look better now. Let me know if you want me to try something else (like verbose=2). Oh, and by the way - this is 0.10.rc2. Cheers Paul Tests: (scipy-test)i-courant ~/Downloads % python -c 'import scipy;scipy.test(verbose=1)' Running unit tests for scipy NumPy version 1.6.1 NumPy is installed in /usr/local/Cellar/python/2.7.2/lib/python2.7/site-packages/numpy SciPy version 0.10.1rc2 SciPy is installed in /Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy Python version 2.7.2 (default, Oct 9 2011, 18:03:13) [GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] nose version 1.1.2 ............................................................................................................................................................................................................................K............................................................................................................/Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy/interpolate/fitpack2.py:674: UserWarning: The coefficients of the spline returned have been computed as the minimal norm least-squares solution of a (numerically) rank deficient system (deficiency=7). If deficiency is large, the results may be inaccurate. Deficiency may strongly depend on the value of eps. 
warnings.warn(message) ....../Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy/interpolate/fitpack2.py:605: UserWarning: The required storage space exceeds the available storage space: nxest or nyest too small, or s too small. The weighted least-squares spline corresponds to the current set of knots. warnings.warn(message) ........................K..K....../usr/local/Cellar/python/2.7.2/lib/python2.7/site-packages/numpy/core/numeric.py:1920: RuntimeWarning: invalid value encountered in absolute return all(less_equal(absolute(x-y), atol + rtol * absolute(y))) ............................................................................................................................................................................................................................................................................................................................................................................................................................................./Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy/io/wavfile.py:31: WavFileWarning: Unfamiliar format bytes warnings.warn("Unfamiliar format bytes", WavFileWarning) /Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy/io/wavfile.py:121: WavFileWarning: chunk not understood warnings.warn("chunk not understood", WavFileWarning) ...............................................................................................................................................................................................................................SSSSSS......SSSSSS......SSSS...............................................................................S............................................................................................................................................................................................................................................................K......................................................................................................................................................................................................SSSSS............S..........................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................SSSSSSSSSSS.........../Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py:63: UserWarning: Single-precision types in `eigs` and `eighs` are not supported currently. Double precision routines are used instead. 
warnings.warn("Single-precision types in `eigs` and `eighs` " ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................K...............................................................K...........................................................................................................................................................KK.............................................................................................................................................................................................................................................................................................................................................................................................................................................K.K.............................................................................................................................................................................................................................................................................................................................................................................................K........K..............SSSSSSS..........................................................................................................................................................S.............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. ---------------------------------------------------------------------- Ran 5101 tests in 53.247s OK (KNOWNFAIL=12, SKIP=42) From ralf.gommers at googlemail.com Tue Feb 21 02:49:11 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Tue, 21 Feb 2012 08:49:11 +0100 Subject: [SciPy-Dev] Numpy-dev and scipy-0.10.1 test errors In-Reply-To: <4F42DAD8.4070109@uci.edu> References: <4F381C9A.4080000@uci.edu> <4F42DAD8.4070109@uci.edu> Message-ID: On Tue, Feb 21, 2012 at 12:44 AM, Christoph Gohlke wrote: > > > On 2/20/2012 1:59 PM, Ralf Gommers wrote: > >> >> >> On Sun, Feb 12, 2012 at 9:10 PM, Christoph Gohlke > > wrote: >> >> Hello, >> >> while testing msvc9/MKL builds of scipy-0.10.1rc1 and numpy-dev on >> win-amd64-py2.7 I got 20 numpy test errors and 2 scipy test errors. 
>> The full tests results are attached. I have not looked at them in >> detail. There are no test errors or failures with numpy 1.6.1 and >> scipy-0.10.1rc1. >> >> >> Christoph, with current master plus PRs 214 and 215 all the numpy test >> errors you're seeing should be fixed, with the exception of the numpy.ma >> one. Could you please verify that, and have a look at >> >> what's going on with the last error (does it just need to be silenced, >> or is there a problem?). >> >> Ralf >> >> >> > Errors are down to 14. The test output is attached. > > I discovered one mistake in the commit that should have fixed the umath RuntimeWarnings. Should be fixed in PR-215 now. The datetime errors should have been fixed by PR-213, which has merged into master just before I sent the previous email. Was that included for your last build? Thanks, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From cgohlke at uci.edu Tue Feb 21 03:59:09 2012 From: cgohlke at uci.edu (Christoph Gohlke) Date: Tue, 21 Feb 2012 00:59:09 -0800 Subject: [SciPy-Dev] Numpy-dev and scipy-0.10.1 test errors In-Reply-To: References: <4F381C9A.4080000@uci.edu> <4F42DAD8.4070109@uci.edu> Message-ID: <4F435CDD.8040405@uci.edu> On 2/20/2012 11:49 PM, Ralf Gommers wrote: > > > On Tue, Feb 21, 2012 at 12:44 AM, Christoph Gohlke > wrote: > > > > On 2/20/2012 1:59 PM, Ralf Gommers wrote: > > > > On Sun, Feb 12, 2012 at 9:10 PM, Christoph Gohlke > > >> wrote: > > Hello, > > while testing msvc9/MKL builds of scipy-0.10.1rc1 and > numpy-dev on > win-amd64-py2.7 I got 20 numpy test errors and 2 scipy test > errors. > The full tests results are attached. I have not looked at > them in > detail. There are no test errors or failures with numpy > 1.6.1 and > scipy-0.10.1rc1. > > > Christoph, with current master plus PRs 214 and 215 all the > numpy test > errors you're seeing should be fixed, with the exception of the > numpy.ma > one. Could you please verify that, and have a > look at > > what's going on with the last error (does it just need to be > silenced, > or is there a problem?). > > Ralf > > > > Errors are down to 14. The test output is attached. > > I discovered one mistake in the commit that should have fixed the umath > RuntimeWarnings. Should be fixed in PR-215 now. Didn't help. _FilterInvalids.setUp and tearDown are never executed. > > The datetime errors should have been fixed by PR-213, which has merged > into master just before I sent the previous email. Was that included for > your last build? You are right, I/git must have pulled the whole datetime_strings.c file, instead of just the changes, from PR214. No more datetime errors. Christoph > > Thanks, > Ralf > > From denis.laxalde at mcgill.ca Tue Feb 21 14:36:10 2012 From: denis.laxalde at mcgill.ca (Denis Laxalde) Date: Tue, 21 Feb 2012 14:36:10 -0500 Subject: [SciPy-Dev] sparse matrices comparison Message-ID: <20120221143610.7c5cdb2e@mcgill.ca> Sparse matrices don't support comparison. E.g.: >>> a = csr_matrix([[1, 2], [0, 3]]) >>> b = a.copy() >>> a == b False >>> a - b <2x2 sparse matrix of type '' with 0 stored elements in Compressed Sparse Row format> This is apparently known (Ticket #639) but I was wondering if it wouldn't be better to raise a NotImplemented error instead returning wrong results. 
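In the meantime an explicit element-wise check is needed. A minimal sketch of such a workaround, continuing the session above and assuming exact (not approximate) equality is what is wanted:

>>> import numpy as np
>>> np.array_equal(a.toarray(), b.toarray())  # densify and compare; fine for small matrices
True
>>> abs(a - b).sum() == 0  # sparse-friendly: the difference has no nonzero entries
True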
-- Denis From ralf.gommers at googlemail.com Tue Feb 21 15:32:03 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Tue, 21 Feb 2012 21:32:03 +0100 Subject: [SciPy-Dev] Numpy-dev and scipy-0.10.1 test errors In-Reply-To: <4F435CDD.8040405@uci.edu> References: <4F381C9A.4080000@uci.edu> <4F42DAD8.4070109@uci.edu> <4F435CDD.8040405@uci.edu> Message-ID: On Tue, Feb 21, 2012 at 9:59 AM, Christoph Gohlke wrote: > > > On 2/20/2012 11:49 PM, Ralf Gommers wrote: > > > > > > On Tue, Feb 21, 2012 at 12:44 AM, Christoph Gohlke > > wrote: > > > > > > > > On 2/20/2012 1:59 PM, Ralf Gommers wrote: > > > > > > > > On Sun, Feb 12, 2012 at 9:10 PM, Christoph Gohlke > > > > >> wrote: > > > > Hello, > > > > while testing msvc9/MKL builds of scipy-0.10.1rc1 and > > numpy-dev on > > win-amd64-py2.7 I got 20 numpy test errors and 2 scipy test > > errors. > > The full tests results are attached. I have not looked at > > them in > > detail. There are no test errors or failures with numpy > > 1.6.1 and > > scipy-0.10.1rc1. > > > > > > Christoph, with current master plus PRs 214 and 215 all the > > numpy test > > errors you're seeing should be fixed, with the exception of the > > numpy.ma > > one. Could you please verify that, and have a > > look at > > > > what's going on with the last error (does it just need to be > > silenced, > > or is there a problem?). > > > > Ralf > > > > > > > > Errors are down to 14. The test output is attached. > > > > I discovered one mistake in the commit that should have fixed the umath > > RuntimeWarnings. Should be fixed in PR-215 now. > > Didn't help. _FilterInvalids.setUp and tearDown are never executed. > Sorry, random multiple inheritance issue. Fixed now. > > The datetime errors should have been fixed by PR-213, which has merged > > into master just before I sent the previous email. Was that included for > > your last build? > > You are right, I/git must have pulled the whole datetime_strings.c file, > instead of just the changes, from PR214. No more datetime errors. > Great. Getting all the MSVC errors out of the way saves an extra beta:) Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Tue Feb 21 17:44:35 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Tue, 21 Feb 2012 23:44:35 +0100 Subject: [SciPy-Dev] [SciPy-User] ANN: SciPy 0.10.1 release candidate 1 In-Reply-To: References: Message-ID: On Tue, Feb 21, 2012 at 8:29 AM, Paul Anton Letnes < paul.anton.letnes at gmail.com> wrote: > > I'm not sure what to make of these. I'm tempted to say that for now only > the recommended gfortran is supported. There's too much going wrong on OS X > Lion to be able to fix it all for 0.10.1. > > > > For 0.11.0 we should attempt to get this fixed, including the llvm-gcc > situation. > > > > Can you check where that gfortran 4.6.2 actually comes from? > > > > Ralf > > After rebuilding with the recommended gfortran-4.2 this is the output from > the tests. It does look better now. Let me know if you want me to try > something else (like verbose=2). Oh, and by the way - this is 0.10.rc2. > > Thanks Paul. I'm seeing the same warnings, so the output is as expected now. 
Ralf > Cheers > Paul > > Tests: > > (scipy-test)i-courant ~/Downloads % python -c 'import > scipy;scipy.test(verbose=1)' > Running unit tests for scipy > NumPy version 1.6.1 > NumPy is installed in > /usr/local/Cellar/python/2.7.2/lib/python2.7/site-packages/numpy > SciPy version 0.10.1rc2 > SciPy is installed in > /Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy > Python version 2.7.2 (default, Oct 9 2011, 18:03:13) [GCC 4.2.1 (Apple > Inc. build 5666) (dot 3)] > nose version 1.1.2 > ............................................................................................................................................................................................................................K............................................................................................................/Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy/interpolate/fitpack2.py:674: > UserWarning: > The coefficients of the spline returned have been computed as the > minimal norm least-squares solution of a (numerically) rank deficient > system (deficiency=7). If deficiency is large, the results may be > inaccurate. Deficiency may strongly depend on the value of eps. > warnings.warn(message) > ....../Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy/interpolate/fitpack2.py:605: > UserWarning: > The required storage space exceeds the available storage space: nxest > or nyest too small, or s too small. > The weighted least-squares spline corresponds to the current set of > knots. > warnings.warn(message) > ........................K..K....../usr/local/Cellar/python/2.7.2/lib/python2.7/site-packages/numpy/core/numeric.py:1920: > RuntimeWarning: invalid value encountered in absolute > return all(less_equal(absolute(x-y), atol + rtol * absolute(y))) > ............................................................................................................................................................................................................................................................................................................................................................................................................................................./Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy/io/wavfile.py:31: > WavFileWarning: Unfamiliar format bytes > warnings.warn("Unfamiliar format bytes", WavFileWarning) > /Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy/io/wavfile.py:121: > WavFileWarning: chunk not understood > warnings.warn("chunk not understood", WavFileWarning) > > 
...............................................................................................................................................................................................................................SSSSSS......SSSSSS......SSSS...............................................................................S............................................................................................................................................................................................................................................................K......................................................................................................................................................................................................SSSSS............S.............................................................................................................................................................................................. > ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................SSSSSSSSSSS.........../Users/paulanto/Downloads/scipy-0.10.1rc2/scipy-test/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py:63: > UserWarning: Single-precision types in `eigs` and `eighs` are not supported > currently. Double precision routines are used instead. > warnings.warn("Single-precision types in `eigs` and `eighs` " > > ............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................K...............................................................K................................................................. 
> > ..........................................................................................KK.............................................................................................................................................................................................................................................................................................................................................................................................................................................K.K.............................................................................................................................................................................................................................................................................................................................................................................................K........K..............SSSSSSS..................................................... > > .....................................................................................................S.............................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................. > ---------------------------------------------------------------------- > Ran 5101 tests in 53.247s > > OK (KNOWNFAIL=12, SKIP=42) > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From david.verelst at gmail.com Wed Feb 22 12:55:36 2012 From: david.verelst at gmail.com (David Verelst) Date: Wed, 22 Feb 2012 18:55:36 +0100 Subject: [SciPy-Dev] Building docs with Sphinx fails for 0.10.x Message-ID: <4F452C18.4070900@gmail.com> Hi All, When checking out the SciPy 0.10.1rc2 release I suddenly realised Sphinx failed on building the documentation. First attempt was with Sphinx v1.0.2, than 1.2 and 1.2predev (latest from their Hg repo). While SciPy 0.9 documentation builds just fine with Sphinx 1.0.2, 1.2 and 1.2predev, SciPy 0.10.0 and 0.10.1rc2 both fail with the following: [snip] File "/usr/local/lib/python2.6/dist-packages/matplotlib/sphinxext/plot_directive.py", line 627, in run fd = open(source_file_name, 'r') IOError: [Errno 2] No such file or directory: u'/home/dave/Downloads/Python/scipy-0.10.1rc2/doc/source/examples/normdiscr_plot1.py' Full backtrace is attached. Anyone familiar with this? Regards, David -------------- next part -------------- A non-text attachment was scrubbed... 
Name: sphinx-scipy-backtrace.log Type: text/x-log Size: 7302 bytes Desc: not available URL: From denis.laxalde at mcgill.ca Wed Feb 22 13:11:50 2012 From: denis.laxalde at mcgill.ca (Denis Laxalde) Date: Wed, 22 Feb 2012 13:11:50 -0500 Subject: [SciPy-Dev] Building docs with Sphinx fails for 0.10.x In-Reply-To: <4F452C18.4070900@gmail.com> References: <4F452C18.4070900@gmail.com> Message-ID: <20120222131150.27d2e312@mcgill.ca> David Verelst wrote: > When checking out the SciPy 0.10.1rc2 release I suddenly realised Sphinx > failed on building the documentation. First attempt was with Sphinx > v1.0.2, than 1.2 and 1.2predev (latest from their Hg repo). While SciPy > 0.9 documentation builds just fine with Sphinx 1.0.2, 1.2 and 1.2predev, > SciPy 0.10.0 and 0.10.1rc2 both fail with the following: > > [snip] > File > "/usr/local/lib/python2.6/dist-packages/matplotlib/sphinxext/plot_directive.py", > line 627, in run > fd = open(source_file_name, 'r') > IOError: [Errno 2] No such file or directory: > u'/home/dave/Downloads/Python/scipy-0.10.1rc2/doc/source/examples/normdiscr_plot1.py' > > Full backtrace is attached. Anyone familiar with this? You can try this: https://github.com/dlaxalde/scipy/commit/bfd94f60fcbc73211485fd32b839ea4f01767b7c -- Denis From josef.pktd at gmail.com Wed Feb 22 13:14:47 2012 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Wed, 22 Feb 2012 13:14:47 -0500 Subject: [SciPy-Dev] Building docs with Sphinx fails for 0.10.x In-Reply-To: <4F452C18.4070900@gmail.com> References: <4F452C18.4070900@gmail.com> Message-ID: On Wed, Feb 22, 2012 at 12:55 PM, David Verelst wrote: > Hi All, > > When checking out the SciPy 0.10.1rc2 release I suddenly realised Sphinx > failed on building the documentation. First attempt was with Sphinx v1.0.2, > than 1.2 and 1.2predev (latest from their Hg repo). While SciPy 0.9 > documentation builds just fine with Sphinx 1.0.2, 1.2 and 1.2predev, SciPy > 0.10.0 and 0.10.1rc2 both fail with the following: > > [snip] > File > "/usr/local/lib/python2.6/dist-packages/matplotlib/sphinxext/plot_directive.py", > line 627, in run > ? ?fd = open(source_file_name, 'r') > IOError: [Errno 2] No such file or directory: > u'/home/dave/Downloads/Python/scipy-0.10.1rc2/doc/source/examples/normdiscr_plot1.py' > > Full backtrace is attached. Anyone familiar with this? This was discussed here https://github.com/scipy/scipy/pull/152#issuecomment-3856617 As far as I know it's a problem looking for a fix. Josef > > Regards, > David > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From jsseabold at gmail.com Wed Feb 22 13:27:39 2012 From: jsseabold at gmail.com (Skipper Seabold) Date: Wed, 22 Feb 2012 13:27:39 -0500 Subject: [SciPy-Dev] Building docs with Sphinx fails for 0.10.x In-Reply-To: References: <4F452C18.4070900@gmail.com> Message-ID: On Wed, Feb 22, 2012 at 1:14 PM, wrote: > On Wed, Feb 22, 2012 at 12:55 PM, David Verelst wrote: >> Hi All, >> >> When checking out the SciPy 0.10.1rc2 release I suddenly realised Sphinx >> failed on building the documentation. First attempt was with Sphinx v1.0.2, >> than 1.2 and 1.2predev (latest from their Hg repo). While SciPy 0.9 >> documentation builds just fine with Sphinx 1.0.2, 1.2 and 1.2predev, SciPy >> 0.10.0 and 0.10.1rc2 both fail with the following: >> >> [snip] >> File >> "/usr/local/lib/python2.6/dist-packages/matplotlib/sphinxext/plot_directive.py", >> line 627, in run >> ? 
fd = open(source_file_name, 'r') >> IOError: [Errno 2] No such file or directory: >> u'/home/dave/Downloads/Python/scipy-0.10.1rc2/doc/source/examples/normdiscr_plot1.py' >> >> Full backtrace is attached. Anyone familiar with this? > > This was discussed here > > https://github.com/scipy/scipy/pull/152#issuecomment-3856617 > > As far as I know it's a problem looking for a fix. > FWIW, I don't have time to look at the details at the moment, but I fixed a bug in mpl's plot_directive about the base directory some time ago if that's the problem. https://github.com/matplotlib/matplotlib/pull/132 Skipper From stefan at sun.ac.za Mon Feb 27 04:36:08 2012 From: stefan at sun.ac.za (Stéfan van der Walt) Date: Mon, 27 Feb 2012 01:36:08 -0800 Subject: [SciPy-Dev] ANN: scikits-image v0.5 In-Reply-To: References: Message-ID: Announcement: scikits-image 0.5 =============================== We're happy to announce the 0.5 release of scikits-image, our image processing toolbox for SciPy. For more information, please visit our website http://scikits-image.org New Features ------------ - Consistent intensity rescaling and improved range conversion. - Random walker segmentation. - Harris corner detection. - Otsu thresholding. - Block views, window views and montage. - Plugin for Christoph Gohlke's "tifffile". - Peak detection. - Improved FreeImage wrappers and meta-data reading. - 8-neighbor and background labelling. ... along with updates to the documentation and website, and a number of bug fixes. Contributors to this release ---------------------------- * Andreas Mueller * Brian Holt * Christoph Gohlke * Emmanuelle Gouillart * Michael Aye * Nelle Varoquaux * Nicolas Pinto * Nicolas Poilvert * Pieter Holtzhausen * Stefan van der Walt * Tony S Yu * Warren Weckesser * Zachary Pincus From gael.varoquaux at normalesup.org Mon Feb 27 04:38:47 2012 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Mon, 27 Feb 2012 10:38:47 +0100 Subject: [SciPy-Dev] ANN: scikits-image v0.5 In-Reply-To: References: Message-ID: <20120227093847.GB28286@phare.normalesup.org> Great job! I had a look at your website/documentation, and it looks really good: it is making it easy to use advanced image processing. Go go skimage! From cimrman3 at ntc.zcu.cz Mon Feb 27 13:39:04 2012 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Mon, 27 Feb 2012 19:39:04 +0100 Subject: [SciPy-Dev] ANN: SfePy 2012.1 Message-ID: <4F4BCDC8.6070202@ntc.zcu.cz> I am pleased to announce release 2012.1 of SfePy. Description ----------- SfePy (simple finite elements in Python) is a software for solving systems of coupled partial differential equations by the finite element method. The code is based on NumPy and SciPy packages. It is distributed under the new BSD license. Home page: http://sfepy.org Downloads, mailing list, wiki: http://code.google.com/p/sfepy/ Git (source) repository, issue tracker: http://github.com/sfepy Highlights of this release -------------------------- - initial version of linearizer of higher order solutions - rewrite variable and evaluate cache history handling - lots of term updates/fixes/simplifications - move web front page to sphinx docs For full release notes see http://docs.sfepy.org/doc/release_notes.html#id1 (rather long and technical). Best regards, Robert Cimrman and Contributors (*) (*) Contributors to this release (alphabetical order): Tom Aldcroft, Vladimír Lukeš, Matyáš
Nov?k, Andre Smit From ralf.gommers at googlemail.com Tue Feb 28 01:15:25 2012 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Tue, 28 Feb 2012 07:15:25 +0100 Subject: [SciPy-Dev] ANN: SciPy 0.10.1 released Message-ID: Hi all, I am pleased to announce the availability of SciPy 0.10.1. This is a maintenance release, with no new features compared to 0.10.0. Sources and binaries can be found at http://sourceforge.net/projects/scipy/files/scipy/0.10.1/, release notes are copied below. Enjoy, The SciPy developers ========================== SciPy 0.10.1 Release Notes ========================== .. contents:: SciPy 0.10.1 is a bug-fix release with no new features compared to 0.10.0. Main changes ------------ The most important changes are:: 1. The single precision routines of ``eigs`` and ``eigsh`` in ``scipy.sparse.linalg`` have been disabled (they internally use double precision now). 2. A compatibility issue related to changes in NumPy macros has been fixed, in order to make scipy 0.10.1 compile with the upcoming numpy 1.7.0 release. Other issues fixed ------------------ - #835: stats: nan propagation in stats.distributions - #1202: io: netcdf segfault - #1531: optimize: make curve_fit work with method as callable. - #1560: linalg: fixed mistake in eig_banded documentation. - #1565: ndimage: bug in ndimage.variance - #1457: ndimage: standard_deviation does not work with sequence of indexes - #1562: cluster: segfault in linkage function - #1568: stats: One-sided fisher_exact() returns `p` < 1 for 0 successful attempts - #1575: stats: zscore and zmap handle the axis keyword incorrectly -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.milgate at gmail.com Tue Feb 28 12:30:32 2012 From: robert.milgate at gmail.com (Robert Milgate) Date: Tue, 28 Feb 2012 12:30:32 -0500 Subject: [SciPy-Dev] Documentation issue Message-ID: Forgive me if this isn't the right path for this comment, I 'm quite new at this. I believe that the documentation for the hamming window implementation ( http://docs.scipy.org/doc/numpy/reference/generated/numpy.hamming.html ) in numpy is incorrect. Someone with more experience in signal conditioning than I should have a look, but I think this is wrong. The function works correctly, but I believe the equation in the documentation is for an interval symmetric about 0, not for the unipolar interval stated in the documentation. The + sign in front of the 0.46cos should be a - for teh interval as described. -------------- next part -------------- An HTML attachment was scrubbed... URL: From nwagner at iam.uni-stuttgart.de Tue Feb 28 13:42:47 2012 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 28 Feb 2012 19:42:47 +0100 Subject: [SciPy-Dev] Possible bug in SLSQP Message-ID: Hi all, I tried slsqp to solve an optimization problem. Traceback (most recent call last): File "test_slsqp.py", line 63, in x_opt=fmin_slsqp(func2,x0,bounds=bounds) File "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/slsqp.py", line 184, in fmin_slsqp full_output=full_output) File "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/slsqp.py", line 309, in _minimize_slsqp raise IndexError('SLSQP Error: If bounds is specified, ' IndexError: SLSQP Error: If bounds is specified, bounds.shape[1] == len(x0) How should I define the box constraints for slsqp ? It seems to be not unique for different optimizers in scipy. Nils -------------- next part -------------- A non-text attachment was scrubbed... 
Name: test_slsqp.py Type: text/x-python Size: 1569 bytes Desc: not available URL: From denis.laxalde at mcgill.ca Tue Feb 28 14:30:53 2012 From: denis.laxalde at mcgill.ca (Denis Laxalde) Date: Tue, 28 Feb 2012 14:30:53 -0500 Subject: [SciPy-Dev] Possible bug in SLSQP In-Reply-To: References: Message-ID: <20120228143053.1ae02c70@mail.laxalde.org> Nils Wagner a ?crit : > I tried slsqp to solve an optimization problem. > > Traceback (most recent call last): > File "test_slsqp.py", line 63, in > x_opt=fmin_slsqp(func2,x0,bounds=bounds) > File > "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/slsqp.py", > line 184, in fmin_slsqp > full_output=full_output) > File > "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/slsqp.py", > line 309, in _minimize_slsqp > raise IndexError('SLSQP Error: If bounds is > specified, ' > IndexError: SLSQP Error: If bounds is specified, > bounds.shape[1] == len(x0) That's a bug. I just pushed a fix in master. Thanks for reporting. -- Denis From nwagner at iam.uni-stuttgart.de Tue Feb 28 14:46:38 2012 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 28 Feb 2012 20:46:38 +0100 Subject: [SciPy-Dev] Possible bug in SLSQP In-Reply-To: <20120228143053.1ae02c70@mail.laxalde.org> References: <20120228143053.1ae02c70@mail.laxalde.org> Message-ID: On Tue, 28 Feb 2012 14:30:53 -0500 Denis Laxalde wrote: > Nils Wagner a ?crit : >> I tried slsqp to solve an optimization problem. >> >> Traceback (most recent call last): >> File "test_slsqp.py", line 63, in >> x_opt=fmin_slsqp(func2,x0,bounds=bounds) >> File >> "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/slsqp.py", >> line 184, in fmin_slsqp >> full_output=full_output) >> File >> "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/slsqp.py", >> line 309, in _minimize_slsqp >> raise IndexError('SLSQP Error: If bounds is >> specified, ' >> IndexError: SLSQP Error: If bounds is specified, >> bounds.shape[1] == len(x0) > > > That's a bug. I just pushed a fix in master. > Thanks for reporting. > > > -- > Denis You are welcome ! Thank you for the bug-fix. Nils From nwagner at iam.uni-stuttgart.de Tue Feb 28 15:01:38 2012 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Tue, 28 Feb 2012 21:01:38 +0100 Subject: [SciPy-Dev] Possible bug in SLSQP In-Reply-To: <20120228143053.1ae02c70@mail.laxalde.org> References: <20120228143053.1ae02c70@mail.laxalde.org> Message-ID: On Tue, 28 Feb 2012 14:30:53 -0500 Denis Laxalde wrote: > Nils Wagner a ?crit : >> I tried slsqp to solve an optimization problem. >> >> Traceback (most recent call last): >> File "test_slsqp.py", line 63, in >> x_opt=fmin_slsqp(func2,x0,bounds=bounds) >> File >> "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/slsqp.py", >> line 184, in fmin_slsqp >> full_output=full_output) >> File >> "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/slsqp.py", >> line 309, in _minimize_slsqp >> raise IndexError('SLSQP Error: If bounds is >> specified, ' >> IndexError: SLSQP Error: If bounds is specified, >> bounds.shape[1] == len(x0) > > > That's a bug. I just pushed a fix in master. > Thanks for reporting. > > > -- > Denis BTW, is this another issue ? 
File "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/lbfgsb.py", line 157, in fmin_l_bfgs_b options=opts, full_output=True) File "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/lbfgsb.py", line 270, in _minimize_lbfgsb isave, dsave) ValueError: failed to initialize intent(inout) array -- input not fortran contiguous Nils From nwagner at iam.uni-stuttgart.de Wed Feb 29 14:54:05 2012 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 29 Feb 2012 20:54:05 +0100 Subject: [SciPy-Dev] scipy.optimize.fmin_l_bfgs_b versus openopt scipy_lbfgsb Message-ID: Hi all, who can shed some light on the different behavior of scipy.optimize and openopt wrt. l_bfgs_b ? Nils >>> scipy.__version__ '0.11.0.dev-491f9db' >>> openopt.__version__ '0.37' python -i test_lbfgsb.py Optimal solution by L_BFGS_B (array([ 2000. , 1100. , 1614.65585984, 1050. , 900. , 1300. , 1100. , 1200. , 1400. , 1000. , 28. , 28. , 73. , 54. , 30. ]), 22.685511313514301, {'warnflag': 0, 'task': 'CONVERGENCE: NORM OF PROJECTED GRADIENT <= PGTOL', 'grad': array([ 8.38440428e-04, 9.89075488e-04, 5.32907052e-06, 3.17612603e-04, 4.14956958e-04, 2.43041143e-03, 1.26121336e-03, 1.87938554e-04, 2.43041143e-03, 1.10297549e-02, -2.24627428e-02, -2.86387802e-02, -5.51217738e-02, -1.89885441e-02, -5.40191181e-01]), 'funcalls': 36}) ------------------------- OpenOpt 0.37 ------------------------- solver: scipy_lbfgsb problem: unnamed type: NLP goal: minimum iter objFunVal 0 2.355e+01 RUNNING THE L-BFGS-B CODE * * * Machine precision = 2.220D-16 N = 15 M = 10 At X0 0 variables are exactly at the bounds Traceback (most recent call last): File "test_lbfgsb.py", line 68, in r = p.solve('scipy_lbfgsb') File "/home/nwagner/local/lib/python2.7/site-packages/openopt-0.37-py2.7.egg/openopt/kernel/baseProblem.py", line 235, in solve return runProbSolver(self, *args, **kwargs) File "/home/nwagner/local/lib/python2.7/site-packages/openopt-0.37-py2.7.egg/openopt/kernel/runProbSolver.py", line 246, in runProbSolver solver(p) File "/home/nwagner/local/lib/python2.7/site-packages/openopt-0.37-py2.7.egg/openopt/solvers/scipy_optim/scipy_lbfgsb_oo.py", line 36, in __solver__ iprint=p.iprint, maxfun=p.maxFunEvals) File "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/lbfgsb.py", line 157, in fmin_l_bfgs_b options=opts, full_output=True) File "/home/nwagner/local/lib64/python2.7/site-packages/scipy/optimize/lbfgsb.py", line 270, in _minimize_lbfgsb isave, dsave) ValueError: failed to initialize intent(inout) array -- input not fortran contiguous -------------- next part -------------- A non-text attachment was scrubbed... Name: test_lbfgsb.py Type: text/x-python Size: 1656 bytes Desc: not available URL: From pierre.haessig at crans.org Wed Feb 29 15:13:47 2012 From: pierre.haessig at crans.org (Pierre Haessig) Date: Wed, 29 Feb 2012 21:13:47 +0100 Subject: [SciPy-Dev] Documentation issue In-Reply-To: References: Message-ID: <4F4E86FB.2050305@crans.org> Hi, Le 28/02/2012 18:30, Robert Milgate a ?crit : > I believe that the documentation for the hamming window implementation > (http://docs.scipy.org/doc/numpy/reference/generated/numpy.hamming.html ) in > numpy is incorrect. Someone with more experience in signal > conditioning than I should have a look, but I think this is wrong. > > The function works correctly, but I believe the equation in the > documentation is for an interval symmetric about 0, not for the > unipolar interval stated in the documentation. 
The + sign in front of > the 0.46cos should be a - for the interval as described. I feel you're right ! Now, regarding your question about "the right path" for your comment : On every page of Scipy & Numpy docs, there is an "Edit page" link which leads to a wiki-style editor where everybody can contribute to the documentation. http://docs.scipy.org/numpy/Front%20Page/ The only requirement for editing is to have an *activated account*, that is : 1) create an account (see the Front Page above) in the wiki 2) send an email here with the username you've chosen 3) "somebody" with the appropriate rights will give you the edit rights Best, Pierre PS : I've already flipped the sign ;-) -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 900 bytes Desc: OpenPGP digital signature URL:
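For reference on the hamming window point above, the corrected sign can be checked numerically against numpy's own output. A small sketch (the window length M = 12 is an arbitrary example):

>>> import numpy as np
>>> M = 12
>>> n = np.arange(M)
>>> w = 0.54 - 0.46 * np.cos(2 * np.pi * n / (M - 1))  # minus sign, interval n = 0 .. M-1
>>> np.allclose(np.hamming(M), w)
True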