From amcmorl at gmail.com Thu Jul 1 07:41:09 2010 From: amcmorl at gmail.com (Angus McMorland) Date: Thu, 1 Jul 2010 07:41:09 -0400 Subject: [SciPy-Dev] scipy.misc.doccer: docstrings and role in relation to docstring editor Message-ID: Hi all, I've just found the scipy.misc.doccer object in the docstring editor wiki. Actually I was led to it when I looked at scipy.ndimage.docdict, which ostensibly needs a docstring according to the wiki. Only scipy.ndimage and scipy.ndimage.filter have these docdict objects. What's the story behind them? Is there any way to conveniently use them in conjunction with the docstring editor; should we remove them altogether; can we at least remove them from the count on the docstring wiki? Angus. -- AJC McMorland Post-doctoral research fellow Neurobiology, University of Pittsburgh From ralf.gommers at googlemail.com Thu Jul 1 11:59:31 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 1 Jul 2010 23:59:31 +0800 Subject: [SciPy-Dev] scipy.misc.doccer: docstrings and role in relation to docstring editor In-Reply-To: References: Message-ID: On Thu, Jul 1, 2010 at 7:41 PM, Angus McMorland wrote: > Hi all, > > I've just found the scipy.misc.doccer object in the docstring editor > wiki. Actually I was led to it when I looked at scipy.ndimage.docdict, > which ostensibly needs a docstring according to the wiki. Only > scipy.ndimage and scipy.ndimage.filter have these docdict objects. > It's also used in io.matlab and and stats.distributions. > > What's the story behind them? It provides a convenient way to share parts of docstrings between different objects by string substitution. > Is there any way to conveniently use > them in conjunction with the docstring editor; No, and that's probably not going to happen any time soon since it's a lot of work (would need AST parsing to figure instead of importing modules and just looking at the docstrings if I remember correctly). 
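The substitution mechanism Ralf describes boils down to %-style string formatting of a docstring template against a shared fragment dictionary. A minimal sketch, without importing scipy and with an invented fragment — the real scipy.misc.doccer.docformat additionally re-indents each fragment to match its placeholder, which this sketch skips:

```python
# Sketch of the docdict mechanism: shared docstring fragments are
# substituted into %(name)s placeholders.  This mirrors what
# scipy.misc.doccer.docformat does, minus its re-indentation logic.
# The 'mode' fragment text below is invented for illustration.
docdict = {
    'mode': ("mode : str, optional\n"
             "    How array borders are handled."),
}

def docformat(docstring, docdict):
    """Fill %(name)s placeholders in a docstring from a fragment dict."""
    return docstring % docdict

template = """Apply a filter to the input array.

Parameters
----------
%(mode)s
"""

filled = docformat(template, docdict)
print(filled)
```

Several functions can then share the 'mode' fragment by carrying the same placeholder in their docstrings, which is why the docdict objects show up as module-level attributes in scipy.ndimage and friends.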
> should we remove them > altogether; can we at least remove them from the count on the > docstring wiki? > It would be good to remove/hide the docstrings that use doccer in the wiki. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From d.l.goldsmith at gmail.com Thu Jul 1 13:36:09 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Thu, 1 Jul 2010 10:36:09 -0700 Subject: [SciPy-Dev] Reminder: Summer Marathon Skypecon @ Fri Jul 2 12pm EDT Message-ID: Any takers? DG ---------- Forwarded message ---------- From: Google Calendar Date: Thu, Jul 1, 2010 at 9:03 AM Subject: Reminder: Summer Marathon Skypecon @ Fri Jul 2 9am - 10am ( d.l.goldsmith at gmail.com) To: David Goldsmith more details ? Summer Marathon Skypecon *When* Fri Jul 2 9am ? 10am Pacific Time *Where* Skype (map ) *Calendar* d.l.goldsmith at gmail.com *Who* ? David Goldsmith - organizer ? jh at physics.ucf.edu Going? *Yes- Maybe- No * more options ? Invitation from Google Calendar You are receiving this email at the account d.l.goldsmith at gmail.com because you set a reminder for this event on the calendar d.l.goldsmith at gmail.com. You can change your reminders for specific events in the event details page in https://www.google.com/calendar/. -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben.root at ou.edu Thu Jul 1 15:58:30 2010 From: ben.root at ou.edu (Benjamin Root) Date: Thu, 1 Jul 2010 14:58:30 -0500 Subject: [SciPy-Dev] Reminder: Summer Marathon Skypecon @ Fri Jul 2 12pm EDT In-Reply-To: References: Message-ID: Is it going to be at Noon or at 9am? 
I also heard of some people talking about 9pm, but maybe that was a different call? Ben Root On Thu, Jul 1, 2010 at 12:36 PM, David Goldsmith wrote: > Any takers? > > DG > > ---------- Forwarded message ---------- > From: Google Calendar > Date: Thu, Jul 1, 2010 at 9:03 AM > Subject: Reminder: Summer Marathon Skypecon @ Fri Jul 2 9am - 10am ( > d.l.goldsmith at gmail.com) > To: David Goldsmith > > > more details ? > Summer Marathon Skypecon > *When* > Fri Jul 2 9am ? 10am Pacific Time > *Where* > Skype (map ) > *Calendar* > d.l.goldsmith at gmail.com > *Who* > ? > David Goldsmith - organizer > ? > jh at physics.ucf.edu > > Going? *Yes- > Maybe- > No > * more options ? > > Invitation from Google Calendar > > You are receiving this email at the account d.l.goldsmith at gmail.combecause you set a reminder for this event on the calendar > d.l.goldsmith at gmail.com. > > You can change your reminders for specific events in the event details page > in https://www.google.com/calendar/. > > > > -- > Mathematician: noun, someone who disavows certainty when their uncertainty > set is non-empty, even if that set has measure zero. > > Hope: noun, that delusive spirit which escaped Pandora's jar and, with her > lies, prevents mankind from committing a general suicide. (As interpreted > by Robert Graves) > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Thu Jul 1 16:09:35 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Thu, 1 Jul 2010 14:09:35 -0600 Subject: [SciPy-Dev] Reminder: Summer Marathon Skypecon @ Fri Jul 2 12pm EDT In-Reply-To: References: Message-ID: On Thu, Jul 1, 2010 at 1:58 PM, Benjamin Root wrote: > Is it going to be at Noon or at 9am? I also heard of some people talking > about 9pm, but maybe that was a different call? 
> > I thought it was strange that it was 9am Pacific time. I thought David was in easter time zone and often it shows me the time zone I am in. David, Does it show I accepted the invite, Google seems to be acting a little strange and doesn't add it to my calendar? Vincent > Ben Root > > On Thu, Jul 1, 2010 at 12:36 PM, David Goldsmith wrote: > >> Any takers? >> >> DG >> >> ---------- Forwarded message ---------- >> From: Google Calendar >> Date: Thu, Jul 1, 2010 at 9:03 AM >> Subject: Reminder: Summer Marathon Skypecon @ Fri Jul 2 9am - 10am ( >> d.l.goldsmith at gmail.com) >> To: David Goldsmith >> >> >> more details ? >> Summer Marathon Skypecon >> *When* >> Fri Jul 2 9am ? 10am Pacific Time >> *Where* >> Skype (map ) >> *Calendar* >> d.l.goldsmith at gmail.com >> *Who* >> ? >> David Goldsmith - organizer >> ? >> jh at physics.ucf.edu >> >> Going? *Yes- >> Maybe- >> No >> * more options ? >> >> Invitation from Google Calendar >> >> You are receiving this email at the account d.l.goldsmith at gmail.combecause you set a reminder for this event on the calendar >> d.l.goldsmith at gmail.com. >> >> You can change your reminders for specific events in the event details >> page in https://www.google.com/calendar/. >> >> >> >> -- >> Mathematician: noun, someone who disavows certainty when their uncertainty >> set is non-empty, even if that set has measure zero. >> >> Hope: noun, that delusive spirit which escaped Pandora's jar and, with her >> lies, prevents mankind from committing a general suicide. 
(As interpreted >> by Robert Graves) >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > *Vincent Davis 720-301-3003 * vincent at vincentdavis.net my blog | LinkedIn -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben.root at ou.edu Thu Jul 1 16:39:56 2010 From: ben.root at ou.edu (Benjamin Root) Date: Thu, 1 Jul 2010 15:39:56 -0500 Subject: [SciPy-Dev] Reminder: Summer Marathon Skypecon @ Fri Jul 2 12pm EDT In-Reply-To: References: Message-ID: On Thu, Jul 1, 2010 at 3:09 PM, Vincent Davis wrote: > > David, Does it show I accepted the invite, Google seems to be acting a > little strange and doesn't add it to my calendar? > > Vincent > > Same here.... clicking "yes" did not add the event to my calender like it usually does. Ben -------------- next part -------------- An HTML attachment was scrubbed... URL: From d.l.goldsmith at gmail.com Thu Jul 1 16:52:59 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Thu, 1 Jul 2010 13:52:59 -0700 Subject: [SciPy-Dev] Reminder: Summer Marathon Skypecon @ Fri Jul 2 12pm EDT In-Reply-To: References: Message-ID: 12 noon EDT (9am PDT, which is where I am), 4pm (?) GMT. On Thu, Jul 1, 2010 at 1:39 PM, Benjamin Root wrote: > > On Thu, Jul 1, 2010 at 3:09 PM, Vincent Davis wrote: > >> >> David, Does it show I accepted the invite, >> > Google seems to be acting a little strange and doesn't add it to my >> calendar? >> >> Vincent >> >> > Same here.... clicking "yes" did not add the event to my calender like it > usually does. > Probably because I just forwarded my personal reminder, not an actual Google "invitation" to an event. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From vincent at vincentdavis.net Thu Jul 1 17:00:27 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Thu, 1 Jul 2010 15:00:27 -0600 Subject: [SciPy-Dev] Reminder: Summer Marathon Skypecon @ Fri Jul 2 12pm EDT In-Reply-To: References: Message-ID: On Thu, Jul 1, 2010 at 2:52 PM, David Goldsmith wrote: > 12 noon EDT (9am PDT, which is where I am), 4pm (?) GMT. > > On Thu, Jul 1, 2010 at 1:39 PM, Benjamin Root wrote: >> >> On Thu, Jul 1, 2010 at 3:09 PM, Vincent Davis >> wrote: >>> >>> David, Does it show I accepted the invite, >>> >>> Google seems to be acting a little strange and doesn't add it to my >>> calendar? >>> Vincent >> >> Same here.... clicking "yes" did not add the event to my calender like it >> usually does. > > Probably because I just forwarded my personal reminder, not an actual Google > "invitation" to an event. I kinda thought that after reading the fine print. Can you add me to you calendar and have it send me a reminder, Might be a nice way for you to keep track of how will be attending. Vincent > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From d.l.goldsmith at gmail.com Thu Jul 1 17:13:53 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Thu, 1 Jul 2010 14:13:53 -0700 Subject: [SciPy-Dev] Reminder: Summer Marathon Skypecon @ Fri Jul 2 12pm EDT In-Reply-To: References: Message-ID: sure. DG On Thu, Jul 1, 2010 at 2:00 PM, Vincent Davis wrote: > On Thu, Jul 1, 2010 at 2:52 PM, David Goldsmith > wrote: > > 12 noon EDT (9am PDT, which is where I am), 4pm (?) GMT. > > > > On Thu, Jul 1, 2010 at 1:39 PM, Benjamin Root wrote: > >> > >> On Thu, Jul 1, 2010 at 3:09 PM, Vincent Davis > > >> wrote: > >>> > >>> David, Does it show I accepted the invite, > >>> > >>> Google seems to be acting a little strange and doesn't add it to my > >>> calendar? > >>> Vincent > >> > >> Same here.... 
clicking "yes" did not add the event to my calender like > it > >> usually does. > > > > Probably because I just forwarded my personal reminder, not an actual > Google > > "invitation" to an event. > > I kinda thought that after reading the fine print. Can you add me to > you calendar and have it send me a reminder, Might be a nice way for > you to keep track of how will be attending. > Vincent > > > > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... URL: From david at silveregg.co.jp Thu Jul 1 22:34:20 2010 From: david at silveregg.co.jp (David) Date: Fri, 02 Jul 2010 11:34:20 +0900 Subject: [SciPy-Dev] [ANN] Bento (ex-toydist) 0.0.3 Message-ID: <4C2D502C.2040909@silveregg.co.jp> Hi, I am pleased to announce the release 0.0.3 for Bento, the pythonic packaging solution. Wherease the 0.0.2 release was mostly about getting the simplest-still-useful subset of distutils features, this new release adds quite a few significant features: - Add hooks to customize arbitrary stages in bento (there is a hackish example which shows how to use waf to build a simple C extension). The API for this is still in flux, though - Parallel and reliable build of C extensions through yaku build library. 
- One file distribution: no need for your users to install any new packages, just include one single file into your package to build with bento - Improved documentation - 2.4 -> 2.7 support, tested on linux/windows/mac os x You can download bento on github: http://github.com/cournape/Bento cheers, David From gael.varoquaux at normalesup.org Fri Jul 2 04:08:35 2010 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Fri, 2 Jul 2010 10:08:35 +0200 Subject: [SciPy-Dev] Ticket 1057: Connect components: request for review Message-ID: <20100702080835.GA5918@phare.normalesup.org> Robert Cimrman and I just finished a small patch adding a connect component labeling algorithm for sparse matrices: http://projects.scipy.org/scipy/ticket/1057 Robert is planning on checking it in on Monday, but it would be great to have additional external feedback. Does anyone want to review it? Its quite short. Cheers, Ga?l From jsseabold at gmail.com Fri Jul 2 13:50:46 2010 From: jsseabold at gmail.com (Skipper Seabold) Date: Fri, 2 Jul 2010 12:50:46 -0500 Subject: [SciPy-Dev] Warnings raised (from fit in scipy.stats) In-Reply-To: References: Message-ID: On Fri, Jun 11, 2010 at 12:34 PM, Skipper Seabold wrote: > On Fri, Jun 11, 2010 at 1:07 PM, ? wrote: >> On Fri, Jun 11, 2010 at 12:45 PM, Skipper Seabold wrote: >>> Since the raising of warning behavior has been changed (I believe), I >>> have been running into a lot of warnings in my code when say I do >>> something like >>> >>> In [120]: from scipy import stats >>> >>> In [121]: y = [-45, -3, 1, 0, 1, 3] >>> >>> In [122]: v = stats.norm.pdf(y)/stats.norm.cdf(y) >>> Warning: invalid value encountered in divide >>> >>> Sometimes, this is useful to know. ?Sometimes, though, it's very >>> disturbing when it's encountered in some kind of iteration or >>> optimization. ?I have been using numpy.clip to get around this in my >>> own code, but when it's buried a bit deeper, it's not quite so simple. >>> >>> Take this example. 
>>> >>> In [123]: import numpy as np >>> >>> In [124]: np.random.seed(12345) >>> >>> In [125]: B = 6.0 >>> >>> In [126]: x = np.random.exponential(scale=B, size=5000) >>> >>> In [127]: from scipy.stats import expon >>> >>> In [128]: expon.fit(x) >>> >>> >>> >>> Out[128]: (0.21874043533906118, 5.7122829778172939) >>> >>> The fit is achieved by fmin (as far as I know, since disp=0 in the >>> rv_continuous.fit...), but there are a number of warnings emitted. ?Is >>> there any middle ground to be had in these type of situations via >>> context management perhaps? >>> >>> Should I file a ticket? >> >> Which numpy scipy versions are you using? >> > > Numpy > > '2.0.0.dev8417' > > Scipy > > '0.9.0.dev6447' > >> I don't get any warning with the first example. (numpy 1.4.0) >> (I cannot run the second example because I have a scipy revision with >> a broken fit() method) >> >> I don't think wrapping functions/methods to turn off warnings is a >> good option. (many of them are in inner loops for example for random >> number generation) >> > > Granted I haven't looked too much into the details of the warnings > context manager other than using some toy examples once or twice, but > if you could just suppress them for when the solver is called within a > function/method then this would do the trick (at least for the ones I > have been running into, mostly to do with fitting like this or with > maximum likelihood). > > Skipper > Replying to myself here. For noiseless tests (ie., no floating point warnings) just do import numpy as np np.seterr(all='ignore') Skipper From jh at physics.ucf.edu Fri Jul 2 14:31:08 2010 From: jh at physics.ucf.edu (Joe Harrington) Date: Fri, 02 Jul 2010 14:31:08 -0400 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! Message-ID: Dear SciPy users and developers, For the past two summers, the SciPy Documentation Project has run a concerted effort to write docstrings for all the NumPy objects. 
This has been very successful, with over 75 writers contributing. Nearly every "important" object now has a reviewable docstring, and we finally have someone working on doc wiki programming again! We accomplished this by writing 3000-5000 words a week, as tabulated on the NumPy doc wiki's stats page. Sometimes we wrote much more. This worked because 1) many volunteered and 2) there was a paid coordinator who managed the workflow, motivated the community, and chased down loose ends. This year we have taken on the SciPy docs, which are bigger and more technical. A handful of very dedicated people has taken up the challenge, and David Goldsmith is once again our able coordinator. However, at the rate we are going it will take something like a decade to finish. We're writing just 700-1200 words a week. I have had to question whether it is worthwhile paying a coordinator for the effort of just a few volunteers, as these people are fairly focused already and there just are not very many of them. I would hate to pull support for the coordinator now, but in the end this project lives or dies by the willingness of its users - US - to contribute our time. Each of us should consider the cost - thousands of dollars a year for many of us with big projects - of commercial numerical software. Extracting (or even requesting) payment in exchange for the right to use SciPy isn't our model, but we still need the labor. In the end, all labor is either volunteer or paid. So, this is an appeal for all knowledgeable SciPy users, including current code developers, to consider writing a little bit of documentation over this summer. For those in a position to assign people to work on docs or to pay people who do, now is the time to step forward as well. Specifically, we need to get the number of words per week up into the 3000 range to make a paid coordinator worthwhile (and indeed, to get anywhere in a reasonable time). 
Short of a significant pickup in participation and words produced, documenting SciPy will once again be a 100% volunteer effort. Please visit http://docs.scipy.org/numpy/ to learn about editing docs, then sign up, request edit rights, and visit http://docs.scipy.org/scipy/Milestones/ to get connected to the work going on. We meet on Skype every Friday at noon EDT. Contact David Goldsmith (d.l.goldsmith at gmail.com, d.l.goldsmith on Skype) to join those conversations. Email discussion of doc issues happens on scipy-dev. Thanks, --jh-- From matthew.brett at gmail.com Fri Jul 2 14:40:01 2010 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 2 Jul 2010 14:40:01 -0400 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: Hi, > This year we have taken on the SciPy docs, which are bigger and more > technical. ?A handful of very dedicated people has taken up the > challenge, and David Goldsmith is once again our able coordinator. > However, at the rate we are going it will take something like a decade > to finish. ?We're writing just 700-1200 words a week. ?I have had to > question whether it is worthwhile paying a coordinator for the effort > of just a few volunteers, as these people are fairly focused already > and there just are not very many of them. I wonder whether there is any other approach that we can explore to help generate more volunteer work? Do you think it is mainly the difference between scipy and numpy that explains the drop-off? Or something else? To the extent that it is the technical differences - do you think there would be any point in trying to establish something like nominated experts or want-to-find-out type experts who will offer to advise on particular parts of scipy - even if they don't themselves write the docstrings? Or anything else that might help? 
Best, Matthew From jh at physics.ucf.edu Fri Jul 2 15:14:10 2010 From: jh at physics.ucf.edu (Joe Harrington) Date: Fri, 02 Jul 2010 15:14:10 -0400 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: (message from Matthew Brett on Fri, 2 Jul 2010 14:40:01 -0400) References: Message-ID: Replying just to scipy-dev, to respect the other lists. > > This year we have taken on the SciPy docs, which are bigger and more > > technical. ?A handful of very dedicated people has taken up the > > challenge, and David Goldsmith is once again our able coordinator. > > However, at the rate we are going it will take something like a decade > > to finish. ?We're writing just 700-1200 words a week. ?I have had to > > question whether it is worthwhile paying a coordinator for the effort > > of just a few volunteers, as these people are fairly focused already > > and there just are not very many of them. > > I wonder whether there is any other approach that we can explore to > help generate more volunteer work? Do you think it is mainly the > difference between scipy and numpy that explains the drop-off? Or > something else? To the extent that it is the technical differences > - do you think there would be any point in trying to establish > something like nominated experts or want-to-find-out type experts who > will offer to advise on particular parts of scipy - even if they don't > themselves write the docstrings? Or anything else that might help? We already looked for topical experts. We have a few; David can comment more. In the end what we need are rank-and-file writers, people who will take something on, learn about it, and write about it. Yes, SciPy is more technical, but we've all dealt with harder tasks than documenting SciPy. Right now, we have just 1-3 people producing over 100 words each week, and nobody has written 1000 words in one week. Last year, we routinely had 1000+ words from several writers each week. 
We had weeks that produced more words than we have produced this entire summer. Whether we gain the productivity by having many volunteers work a little or by having a few (like Ralf Gommers last year) who dedicate a large amount of time, there's a critical level of productivity we need to achieve to make it worthwhile spending money on it. I want to reiterate that I do very much appreciate the effort of those who *are* participating, and that the wiki and project will continue regardless of whether we have a paid coordinator. --jh-- From matthew.brett at gmail.com Fri Jul 2 15:17:02 2010 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 2 Jul 2010 15:17:02 -0400 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: Hi, > Right now, we have just 1-3 people producing over 100 words each week, > and nobody has written 1000 words in one week. ?Last year, we > routinely had 1000+ words from several writers each week. ?We had > weeks that produced more words than we have produced this entire > summer. > > Whether we gain the productivity by having many volunteers work a > little or by having a few (like Ralf Gommers last year) who dedicate a > large amount of time, there's a critical level of productivity we need > to achieve to make it worthwhile spending money on it. > > I want to reiterate that I do very much appreciate the effort of those > who *are* participating, and that the wiki and project will continue > regardless of whether we have a paid coordinator. > > --jh-- > From matthew.brett at gmail.com Fri Jul 2 15:19:31 2010 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 2 Jul 2010 15:19:31 -0400 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: Hi, (sorry - hit send accidentally) >> Right now, we have just 1-3 people producing over 100 words each week, >> and nobody has written 1000 words in one week. 
?Last year, we >> routinely had 1000+ words from several writers each week. ?We had >> weeks that produced more words than we have produced this entire >> summer. Right - yes - but I was wondering whether it was clear to you what the explanation was, and whether there was any way of a) solving the problems you've identified or b) brainstorming maybe on list for what could be done differently. Maybe you've already tried that? Sorry for the noise if so... Best, Matthew From vincent at vincentdavis.net Fri Jul 2 15:25:40 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Fri, 2 Jul 2010 13:25:40 -0600 Subject: [SciPy-Dev] stats._support.adm() dos not seem to like string values Message-ID: >>> y array([['1', '2', 't'], ['1', '3', '3']], dtype='|S8') >>> s.adm(y, 'x[1]=='2'') Traceback (most recent call last): File "", line 1, in invalid syntax: , line 1, pos 18 The documentation is very lacking, My question is should the ability to handle string be added/fixed or should the docs specify numbers only. Also I think the criterion should be improved accept "x[1]" or in the above example "y[1]" really it could accept anything as the "x" has not real meaning in the criterion Suggestion/comments Vincent From jh at physics.ucf.edu Fri Jul 2 15:26:23 2010 From: jh at physics.ucf.edu (Joe Harrington) Date: Fri, 02 Jul 2010 15:26:23 -0400 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: (message from Matthew Brett on Fri, 2 Jul 2010 15:19:31 -0400) References: Message-ID: > >> Right now, we have just 1-3 people producing over 100 words each week, > >> and nobody has written 1000 words in one week. ?Last year, we > >> routinely had 1000+ words from several writers each week. ?We had > >> weeks that produced more words than we have produced this entire > >> summer. 
> > Right - yes - but I was wondering whether it was clear to you what the > explanation was, and whether there was any way of a) solving the > problems you've identified or b) brainstorming maybe on list for what > could be done differently. Maybe you've already tried that? Sorry > for the noise if so... Well, David and I brainstorm a lot about it, offline. Mostly we write appeals for help and in the past two years they have been very effective. This year has been different, which may just be random or might have to do with the job at hand. It would be good to get input from others working on the docs this year. Thoughts, anyone? --jh-- From emmanuelle.gouillart at normalesup.org Fri Jul 2 15:28:20 2010 From: emmanuelle.gouillart at normalesup.org (Emmanuelle Gouillart) Date: Fri, 2 Jul 2010 21:28:20 +0200 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: <20100702192820.GC26710@phare.normalesup.org> Hi all, On Fri, Jul 02, 2010 at 02:40:01PM -0400, Matthew Brett wrote: [need for volunteers] > I wonder whether there is any other approach that we can explore to > help generate more volunteer work? Do you think it is mainly the > difference between scipy and numpy that explains the drop-off? Or > something else? To the extent that it is the technical differences > - do you think there would be any point in trying to establish > something like nominated experts or want-to-find-out type experts who > will offer to advise on particular parts of scipy - even if they don't > themselves write the docstrings? Or anything else that might help? I'm pretty sure that the observed lack of momentum is due to the increased technicity of Scipy, as compared to Numpy. I myself feel that there is only a small fraction of Scipy docstrings I could contribute to (that is, of course, docstrings of the functions that I use). 
This is not quite true, because it's always possible to add examples to a docstring even if one is not familiar with the function, but we can't expect people to work on stuff they never use... Therefore I think the paid coordinator has a very important "Public Relation" task, which consists in identifying who uses which parts of Scipy and building a network of contributors. I would advocate writing personally to people posting on the mailing-list, asking if they could contribute to previously-identified functions, or if a student of them could, etc. Of course, this is a quite laborious work, but documenting Scipy won't be a mass effort like Numpy's documentation. I could try to do some more work on the documentation of ndimage.morphology (on which I have worked a lot in the Spring) and ndimage.measurements, but this may happen only in ten days: we're busy organizing Euroscipy at the moment! Cheers, Emmanuelle From vincent at vincentdavis.net Fri Jul 2 15:46:38 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Fri, 2 Jul 2010 13:46:38 -0600 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: <20100702192820.GC26710@phare.normalesup.org> References: <20100702192820.GC26710@phare.normalesup.org> Message-ID: On Fri, Jul 2, 2010 at 1:28 PM, Emmanuelle Gouillart wrote: > > Hi all, > > On Fri, Jul 02, 2010 at 02:40:01PM -0400, Matthew Brett wrote: > > [need for volunteers] > >> I wonder whether there is any other approach that we can explore to >> help generate more volunteer work? ? ?Do you think it is mainly the >> difference between scipy and numpy that explains the drop-off? ? Or >> something else? ? ?To the extent that it is the technical differences >> - do you think there would be any point in trying to establish >> something like nominated experts or want-to-find-out type experts who >> will offer to advise on particular parts of scipy - even if they don't >> themselves write the docstrings? ? Or anything else that might help? 
> > I'm pretty sure that the observed lack of momentum is due to the > increased technicity of Scipy, as compared to Numpy. > > I myself feel that there is only a small fraction of Scipy docstrings I > could contribute to (that is, of course, docstrings of the functions that > I use). This is not quite true, because it's always possible to add > examples to a docstring even if one is not familiar with the function, > but we can't expect people to work on stuff they never use... There are actually a lot of functions with very lacking docs, and I think we should consider (and let it be known that we consider) small improvements/examples as being worth contributing, updating/committing the docstrings with these small improvements. For example, I am looking at stats._support; I am sure it is a seldom-used corner of scipy, but the docs are nearly non-existent. More generally, is it possible to add something to every scipy-user email that asks/reminds users to contribute to the docs? Something added to the " _______________________________________________ SciPy-User mailing list SciPy-User at scipy.org http://mail.scipy.org/mailman/listinfo/scipy-user " I think it might be worth seeing if we can get 100 people to write 100 lines. Vincent > > Therefore I think the paid coordinator has a very important "Public > Relation" task, which consists in identifying who uses which parts of > Scipy and building a network of contributors. I would advocate writing > personally to people posting on the mailing-list, asking if they could > contribute to previously-identified functions, or if a student of them > could, etc. Of course, this is a quite laborious work, but documenting > Scipy won't be a mass effort like Numpy's documentation. > > I could try to do some more work on the documentation of > ndimage.morphology (on which I have worked a lot in the Spring) and > ndimage.measurements, but this may happen only in ten days: we're busy > organizing Euroscipy at the moment!
> > Cheers, > > Emmanuelle > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From amcmorl at gmail.com Fri Jul 2 16:28:18 2010 From: amcmorl at gmail.com (Angus McMorland) Date: Fri, 2 Jul 2010 16:28:18 -0400 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: <20100702192820.GC26710@phare.normalesup.org> Message-ID: On 2 July 2010 15:46, Vincent Davis wrote: > On Fri, Jul 2, 2010 at 1:28 PM, Emmanuelle Gouillart > wrote: >> I'm pretty sure that the observed lack of momentum is due to the >> increased technicity of Scipy, as compared to Numpy. >> >> I myself feel that there is only a small fraction of Scipy docstrings I >> could contribute to (that is, of course, docstrings of the functions that >> I use). This is not quite true, because it's always possible to add >> examples to a docstring even if one is not familiar with the function, >> but we can't expect people to work on stuff they never use... > > There are actually a lot of functions that have very lacking docs and > I think we should consider (and let it be known) small > improvements/examples as being worth contributing and > updating/commiting the docstrings with these small improvements. > For example I am looking at stats._support ?I am sure it is a seldom > used corner of scipy but the docs are nearly non-existant. This has basically already been said, but I'm going to put it in slightly different words: there are currently 2610 entries in the "Needs Editing" category on the docstring wiki. It seems, from a bit of random clicking on function names, that many of these have _some_ documentation already, just not in the correct docstring standard format. People with no more technical knowledge than a solid understanding of the English language (sorry, all you other-language speakers; care to start translation efforts?) 
could contribute greatly by going through those docstrings and updating the 'Parameters' and 'Returns' sections to the correct format, and, for bonus marks, by adding an obvious example. Such information is, I would guess, the most pressing need for people looking at docstrings for usage information anyway. To summarize, the specialist-knowledge barrier to contribution is probably a lot lower than people might think. Perhaps we should put the word out beyond scipy-dev, at least to scipy-users, which may have more rank-and-file-worthy contributors. Rank-and-file contributor, Angus. -- AJC McMorland Post-doctoral research fellow Neurobiology, University of Pittsburgh From josef.pktd at gmail.com Fri Jul 2 17:22:45 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Fri, 2 Jul 2010 17:22:45 -0400 Subject: [SciPy-Dev] stats._support.adm() does not seem to like string values In-Reply-To: References: Message-ID: On Fri, Jul 2, 2010 at 3:25 PM, Vincent Davis wrote: >>>> y > array([['1', '2', 't'], >        ['1', '3', '3']], >       dtype='|S8') >>>> s.adm(y, 'x[1]=='2'') > Traceback (most recent call last): >  File "", line 1, in > invalid syntax: , line 1, pos 18 That's invalid syntax in your string: >>> stats._support.adm(y, "x[1]=='2'") array([['1', '2', 't']], dtype='|S8') Use single quotes inside double quotes, or the other way around. Josef > > The documentation is very lacking. My question is: should the ability > to handle strings be added/fixed, or should the docs specify numbers > only. 
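Josef's quoting fix can be tried in plain Python, independent of scipy. The `eval()` call below only mimics what a criterion string appears to express (comparing one row's second element); whether `adm()` literally evaluates the string this way is an assumption, not something shown in the thread:

```python
# A sample row like those in Vincent's array, as plain Python strings.
row = ['1', '2', 't']

# Josef's corrected criterion: single quotes nested inside double quotes.
criterion = "x[1]=='2'"
# The same criterion the other way around: double quotes inside single.
alternative = 'x[1]=="2"'

# Evaluating with the row bound to `x` shows both spellings are valid
# syntax and give the same result; the original 'x[1]=='2'' is not,
# because the inner quote terminates the string literal early.
assert eval(criterion, {"x": row}) is True
assert eval(alternative, {"x": row}) is True
```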
> > Also, I think the criterion should be improved to accept "x[1]", or in the > above example "y[1]"; really it could accept anything, as the "x" has > no real meaning in the criterion. > > Suggestions/comments? > > Vincent > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From vincent at vincentdavis.net Fri Jul 2 17:27:14 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Fri, 2 Jul 2010 15:27:14 -0600 Subject: [SciPy-Dev] stats._support.adm() does not seem to like string values In-Reply-To: References: Message-ID: On Fri, Jul 2, 2010 at 3:22 PM, wrote: > On Fri, Jul 2, 2010 at 3:25 PM, Vincent Davis wrote: >>>>> y >> array([['1', '2', 't'], >>        ['1', '3', '3']], >>       dtype='|S8') >>>>> s.adm(y, 'x[1]=='2'') >> Traceback (most recent call last): >>  File "", line 1, in >> invalid syntax: , line 1, pos 18 > > That's invalid syntax in your string > >>>> stats._support.adm(y, "x[1]=='2'") > array([['1', '2', 't']], >       dtype='|S8') > > Use single quotes inside double quotes, or the other way around. Rather obvious mistake; I don't care for how the criterion is formatted, so I let my bias get in the way. Vincent > > Josef > > >> >> The documentation is very lacking. My question is: should the ability >> to handle strings be added/fixed, or should the docs specify numbers >> only. 
>> >> Also, I think the criterion should be improved to accept "x[1]", or in the >> above example "y[1]"; really it could accept anything, as the "x" has >> no real meaning in the criterion. >> >> Suggestions/comments? >> >> Vincent >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From vincent at vincentdavis.net Fri Jul 2 19:11:29 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Fri, 2 Jul 2010 17:11:29 -0600 Subject: [SciPy-Dev] Make stats._support functions more visible Message-ID: There are several useful functions in stats._support. I don't think I would have found them even if I had been looking for such tools, and I think it would be good for these to be more visible. Primarily I am thinking of abut() adm() colex() unique() linexand() collapse() Actually, most/all of these probably belong in numpy. I don't think they duplicate any np functions. Vincent From stefan at sun.ac.za Sat Jul 3 04:04:28 2010 From: stefan at sun.ac.za (Stéfan van der Walt) Date: Sat, 3 Jul 2010 03:04:28 -0500 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: On 2 July 2010 14:14, Joe Harrington wrote: >> I wonder whether there is any other approach that we can explore to >> help generate more volunteer work? Do you think it is mainly the >> difference between scipy and numpy that explains the drop-off? Or >> something else? To the extent that it is the technical differences >> - do you think there would be any point in trying to establish >> something like nominated experts or want-to-find-out type experts who >> will offer to advise on particular parts of scipy - even if they don't >> themselves write the docstrings? Or anything else that might help? 
> > We already looked for topical experts. ?We have a few; David can > comment more. ?In the end what we need are rank-and-file writers, > people who will take something on, learn about it, and write about it. > Yes, SciPy is more technical, but we've all dealt with harder tasks > than documenting SciPy. All the posts I have seen talk about achieving higher word counts, covering more functions, going bigger and better. While that's certainly what we want, such requests may be intimidating to new contributors. My feeling is that we should identify a small handful of functions to focus on. That way, we may only document 10 functions a week, but at least those will get done. Emanuelle's suggestion to target specific writers also seems sensible. Regards St?fan From josh.holbrook at gmail.com Sat Jul 3 04:08:53 2010 From: josh.holbrook at gmail.com (Joshua Holbrook) Date: Sat, 3 Jul 2010 00:08:53 -0800 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: My own reasons for hesitating have more to do with knowing that any documentation I write will likely have poor style. I tend to write in a very informal, conversational manner. That said, I'll try to do my part as I use parts of scipy, since having unprofessional documentation is probably better than having no documentation. --Josh 2010/7/3 St?fan van der Walt : > On 2 July 2010 14:14, Joe Harrington wrote: >>> I wonder whether there is any other approach that we can explore to >>> help generate more volunteer work? ? ?Do you think it is mainly the >>> difference between scipy and numpy that explains the drop-off? ? Or >>> something else? ? ?To the extent that it is the technical differences >>> - do you think there would be any point in trying to establish >>> something like nominated experts or want-to-find-out type experts who >>> will offer to advise on particular parts of scipy - even if they don't >>> themselves write the docstrings? ? Or anything else that might help? 
>> >> We already looked for topical experts. ?We have a few; David can >> comment more. ?In the end what we need are rank-and-file writers, >> people who will take something on, learn about it, and write about it. >> Yes, SciPy is more technical, but we've all dealt with harder tasks >> than documenting SciPy. > > All the posts I have seen talk about achieving higher word counts, > covering more functions, going bigger and better. ?While that's > certainly what we want, such requests may be intimidating to new > contributors. > > My feeling is that we should identify a small handful of functions to > focus on. ?That way, we may only document 10 functions a week, but at > least those will get done. ?Emanuelle's suggestion to target specific > writers also seems sensible. > > Regards > St?fan > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From stefan at sun.ac.za Sat Jul 3 04:18:58 2010 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Sat, 3 Jul 2010 03:18:58 -0500 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: Hi Josh On 3 July 2010 03:08, Joshua Holbrook wrote: > My own reasons for hesitating have more to do with knowing that any > documentation I write will likely have poor style. I tend to write in > a very informal, conversational manner. One of the books that helped me in this regard was The Elements of Style, by Strunk & White http://en.wikipedia.org/wiki/The_Elements_of_Style Maybe it is old fashioned, but I enjoy its concise approach. 
Regards Stéfan From ralf.gommers at googlemail.com Sat Jul 3 04:41:36 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 3 Jul 2010 16:41:36 +0800 Subject: [SciPy-Dev] optimize.fmin_ncg bug Message-ID: There has been an intermittent test failure in scipy.optimize for a long time, only on Windows: ====================================================================== ERROR: line-search Newton conjugate gradient optimization routine ---------------------------------------------------------------------- Traceback (most recent call last): File "C:\Python26\lib\site-packages\scipy\optimize\tests\test_optimize.py", line 115, in test_ncg retall=False) File "C:\Python26\lib\site-packages\scipy\optimize\optimize.py", line 1096, in fmin_ncg alphak, fc, gc, old_fval = line_search_BFGS(f,xk,pk,gfk,old_fval) File "C:\Python26\lib\site-packages\scipy\optimize\optimize.py", line 600, in line_search_BFGS phi_a2 = f(*((xk+alpha2*pk,)+args)) File "C:\Python26\lib\site-packages\scipy\optimize\optimize.py", line 103, in function_wrapper return function(x, *args) File "C:\Python26\lib\site-packages\scipy\optimize\tests\test_optimize.py", line 41, in func raise RuntimeError, "too many iterations in optimization routine" RuntimeError: too many iterations in optimization routine ---------------------------------------------------------------------- I finally managed to track it down to line 1076 of optimize.py: if curv == 0.0: This should be replaced with "if curv < eps:" with eps a suitably small number. Now my question is: how small is suitably small? 1e-10 seems to work, but maybe someone who knows what the algorithm does can suggest a number that's not just a wild guess. Looking at the rest of optimize.py, there's quite a bit of comparing with zero going on, which doesn't look right. Sometimes there is even a comment like "maybe this slipped below machine precision". Comparing floating point numbers with zero like that is just a bad idea and should be fixed. 
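Ralf's complaint can be illustrated in plain Python (no scipy needed). The `near_zero` helper and the 1e-17 value below are illustrative only, not what was (or should be) committed to optimize.py:

```python
import sys

def near_zero(x, eps=sys.float_info.epsilon):
    """Tolerance-based test that replaces a bare ``x == 0.0``."""
    return abs(x) < eps

# A curvature value that has "slipped below machine precision":
curv = 1e-17
print(curv == 0.0)      # False -- the exact comparison says "nonzero"
print(near_zero(curv))  # True  -- below machine epsilon (~2.2e-16)
```

The open question in the email remains: `sys.float_info.epsilon` is only a reasonable default, and the right threshold for the line search depends on the scale of the problem.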
Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben.root at ou.edu Sat Jul 3 10:24:50 2010 From: ben.root at ou.edu (Benjamin Root) Date: Sat, 3 Jul 2010 09:24:50 -0500 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: Joshua, In addition to the very technical writing for individual functions, we also need documentation that is accessible to newcomers. Many modules do not implement any functions themselves, but act as a grouping module (for example, scipy.io). These modules could definitely use good, up-to-date, summary narratives. Even some modules further down the stack can still benefit from good summaries. To everyone, if you do join the documentation efforts to contribute little bits of writing, it is a common courtesy to notify any others who might also be working on a particular document. The current system does not automatically notify authors of any changes, so it is hard to know if any changes have been made. General rule of thumb is to notify authors who have made changes to the doc within the last 3 months (I believe). I really hope to see you all soon in the marathon! Ben Root On Sat, Jul 3, 2010 at 3:08 AM, Joshua Holbrook wrote: > My own reasons for hesitating have more to do with knowing that any > documentation I write will likely have poor style. I tend to write in > a very informal, conversational manner. > > That said, I'll try to do my part as I use parts of scipy, since > having unprofessional documentation is probably better than having no > documentation. > > --Josh > > 2010/7/3 St?fan van der Walt : > > On 2 July 2010 14:14, Joe Harrington wrote: > >>> I wonder whether there is any other approach that we can explore to > >>> help generate more volunteer work? Do you think it is mainly the > >>> difference between scipy and numpy that explains the drop-off? Or > >>> something else? 
To the extent that it is the technical differences > >>> - do you think there would be any point in trying to establish > >>> something like nominated experts or want-to-find-out type experts who > >>> will offer to advise on particular parts of scipy - even if they don't > >>> themselves write the docstrings? Or anything else that might help? > >> > >> We already looked for topical experts. We have a few; David can > >> comment more. In the end what we need are rank-and-file writers, > >> people who will take something on, learn about it, and write about it. > >> Yes, SciPy is more technical, but we've all dealt with harder tasks > >> than documenting SciPy. > > > > All the posts I have seen talk about achieving higher word counts, > > covering more functions, going bigger and better. While that's > > certainly what we want, such requests may be intimidating to new > > contributors. > > > > My feeling is that we should identify a small handful of functions to > > focus on. That way, we may only document 10 functions a week, but at > > least those will get done. Emanuelle's suggestion to target specific > > writers also seems sensible. > > > > Regards > > St?fan > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From d.l.goldsmith at gmail.com Sat Jul 3 11:31:40 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Sat, 3 Jul 2010 08:31:40 -0700 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! 
In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 7:24 AM, Benjamin Root wrote: > Joshua, > > In addition to the very technical writing for individual functions, we also > need documentation that is accessible to newcomers. Many modules do not > implement any functions themselves, but act as a grouping module (for > example, scipy.io). These modules could definitely use good, up-to-date, > summary narratives. Even some modules further down the stack can still > benefit from good summaries. > > To everyone, if you do join the documentation efforts to contribute little > bits of writing, it is a common courtesy to notify any others who might also > be working on a particular document. The current system does not > automatically notify authors of any changes, so it is hard to know if any > changes have been made. General rule of thumb is to notify authors who have > made changes to the doc within the last 3 months (I believe). > Actually, all you have to do (have used in both senses of the word, i.e., are "required" to do and all that is necessary to do) is click on the log link when viewing a docstring in the Wiki, note the person who worked on it last, how long ago that was, and then as Ben says, if it was in the last three months, give or take, contact them and ask them if it would be ok if you worked on it. As far as informal style: yes, some doc is better than no doc, and reviewers and/or users will eventually come along and say how they think it should be improved - please don't let any insecurities be a deterrent: every version is saved; worse comes to worse, someone comes along, doesn't like your changes and reverts, but at least you tried and you can't "break" anything. :-) More later (but I like the dialogue that's started in my silence). DG > > I really hope to see you all soon in the marathon! 
> > Ben Root > > > On Sat, Jul 3, 2010 at 3:08 AM, Joshua Holbrook wrote: > >> My own reasons for hesitating have more to do with knowing that any >> documentation I write will likely have poor style. I tend to write in >> a very informal, conversational manner. >> >> That said, I'll try to do my part as I use parts of scipy, since >> having unprofessional documentation is probably better than having no >> documentation. >> >> --Josh >> >> 2010/7/3 St?fan van der Walt : >> > On 2 July 2010 14:14, Joe Harrington wrote: >> >>> I wonder whether there is any other approach that we can explore to >> >>> help generate more volunteer work? Do you think it is mainly the >> >>> difference between scipy and numpy that explains the drop-off? Or >> >>> something else? To the extent that it is the technical differences >> >>> - do you think there would be any point in trying to establish >> >>> something like nominated experts or want-to-find-out type experts who >> >>> will offer to advise on particular parts of scipy - even if they don't >> >>> themselves write the docstrings? Or anything else that might help? >> >> >> >> We already looked for topical experts. We have a few; David can >> >> comment more. In the end what we need are rank-and-file writers, >> >> people who will take something on, learn about it, and write about it. >> >> Yes, SciPy is more technical, but we've all dealt with harder tasks >> >> than documenting SciPy. >> > >> > All the posts I have seen talk about achieving higher word counts, >> > covering more functions, going bigger and better. While that's >> > certainly what we want, such requests may be intimidating to new >> > contributors. >> > >> > My feeling is that we should identify a small handful of functions to >> > focus on. That way, we may only document 10 functions a week, but at >> > least those will get done. Emanuelle's suggestion to target specific >> > writers also seems sensible. 
>> > >> > Regards >> > St?fan >> > _______________________________________________ >> > SciPy-Dev mailing list >> > SciPy-Dev at scipy.org >> > http://mail.scipy.org/mailman/listinfo/scipy-dev >> > >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsouthey at gmail.com Sat Jul 3 12:13:29 2010 From: bsouthey at gmail.com (Bruce Southey) Date: Sat, 3 Jul 2010 11:13:29 -0500 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 10:31 AM, David Goldsmith wrote: > On Sat, Jul 3, 2010 at 7:24 AM, Benjamin Root wrote: >> >> Joshua, >> >> In addition to the very technical writing for individual functions, we >> also need documentation that is accessible to newcomers.? Many modules do >> not implement any functions themselves, but act as a grouping module (for >> example, scipy.io).? These modules could definitely use good, up-to-date, >> summary narratives.? Even some modules further down the stack can still >> benefit from good summaries. What would really benefit is a solid example for different types. There are so many inconsistent styles in the numpy documentation - I know I wrote some. You can always see that when there are different definitions (when present) for input that are array-like. 
There should be one single definition for these common inputs and outputs that should be immutable by the writer and should automatically appear. >> >> To everyone, if you do join the documentation efforts to contribute little >> bits of writing, it is a common courtesy to notify any others who might also >> be working on a particular document. The current system does not >> automatically notify authors of any changes, so it is hard to know if any >> changes have been made. General rule of thumb is to notify authors who have >> made changes to the doc within the last 3 months (I believe). > > Actually, all you have to do (have used in both senses of the word, i.e., > are "required" to do and all that is necessary to do) is click on the log > link when viewing a docstring in the Wiki, note the person who worked on it > last, how long ago that was, and then, as Ben says, if it was in the last > three months, give or take, contact them and ask them if it would be ok if > you worked on it. And where is the email or notification button? The more hoops to jump through, the fewer people will jump. Where do you see the changes made? It should be a diff. Why does the second person have greater authority to override the original author? Why should I have to go back and check that the later revisions are correct? > > As far as informal style: yes, some doc is better than no doc, and reviewers > and/or users will eventually come along and say how they think it should be > improved - please don't let any insecurities be a deterrent: every version > is saved; worst comes to worst, someone comes along, doesn't like your > changes and reverts, but at least you tried and you can't "break" anything. > :-) > > More later (but I like the dialogue that's started in my silence). > > DG No, bad documentation is worthless, because either someone does not revisit it (they assume it is done) or they think the other person knows more, so they don't touch it. 
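To make the style discussion concrete, here is a minimal sketch of a docstring following the NumPy documentation standard (Parameters/Returns/Examples sections); the function itself is invented purely for illustration:

```python
def clip_below(values, threshold):
    """
    Replace every element smaller than `threshold` with `threshold`.

    Parameters
    ----------
    values : list of float
        Input values. (A real SciPy docstring would typically say
        ``array_like`` here, which is exactly the kind of common
        definition discussed above.)
    threshold : float
        Lower bound applied to each element.

    Returns
    -------
    clipped : list of float
        Copy of `values` with no element below `threshold`.

    Examples
    --------
    >>> clip_below([1.0, -2.0, 3.0], 0.0)
    [1.0, 0.0, 3.0]
    """
    return [max(v, threshold) for v in values]
```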
>> >> I really hope to see you all soon in the marathon! >> >> Ben Root >> >> On Sat, Jul 3, 2010 at 3:08 AM, Joshua Holbrook >> wrote: >>> >>> My own reasons for hesitating have more to do with knowing that any >>> documentation I write will likely have poor style. I tend to write in >>> a very informal, conversational manner. Just read a few of the numpy ones like cumsum and similar. Then just copy that style. >>> >>> That said, I'll try to do my part as I use parts of scipy, since >>> having unprofessional documentation is probably better than having no >>> documentation. Just follow a good docstring from a similar type of function. But also add a variety of examples that show the usage especially 2-d cases and weird cases as these also can act for doctests. >>> >>> --Josh >>> >>> 2010/7/3 St?fan van der Walt : >>> > On 2 July 2010 14:14, Joe Harrington wrote: >>> >>> I wonder whether there is any other approach that we can explore to >>> >>> help generate more volunteer work? ? ?Do you think it is mainly the >>> >>> difference between scipy and numpy that explains the drop-off? ? Or >>> >>> something else? ? ?To the extent that it is the technical differences >>> >>> - do you think there would be any point in trying to establish >>> >>> something like nominated experts or want-to-find-out type experts who >>> >>> will offer to advise on particular parts of scipy - even if they >>> >>> don't >>> >>> themselves write the docstrings? ? Or anything else that might help? >>> >> >>> >> We already looked for topical experts. ?We have a few; David can >>> >> comment more. ?In the end what we need are rank-and-file writers, >>> >> people who will take something on, learn about it, and write about it. >>> >> Yes, SciPy is more technical, but we've all dealt with harder tasks >>> >> than documenting SciPy. >>> > >>> > All the posts I have seen talk about achieving higher word counts, >>> > covering more functions, going bigger and better. 
?While that's >>> > certainly what we want, such requests may be intimidating to new >>> > contributors. >>> > >>> > My feeling is that we should identify a small handful of functions to >>> > focus on. ?That way, we may only document 10 functions a week, but at >>> > least those will get done. ?Emanuelle's suggestion to target specific >>> > writers also seems sensible. >>> > >>> > Regards >>> > St?fan >>> > _______________________________________________ >>> > SciPy-Dev mailing list >>> > SciPy-Dev at scipy.org >>> > http://mail.scipy.org/mailman/listinfo/scipy-dev >>> > >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > > > > -- > Mathematician: noun, someone who disavows certainty when their uncertainty > set is non-empty, even if that set has measure zero. > > Hope: noun, that delusive spirit which escaped Pandora's jar and, with her > lies, prevents mankind from committing a general suicide. ?(As interpreted > by Robert Graves) > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > Bruce From josef.pktd at gmail.com Sat Jul 3 12:29:55 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sat, 3 Jul 2010 12:29:55 -0400 Subject: [SciPy-Dev] ltisys test crash Message-ID: Ralf, Any reason not to do http://projects.scipy.org/scipy/changeset/6484 also in trunk ? I still got the crash when I tested a new scipy install yesterday. Josef From ralf.gommers at googlemail.com Sat Jul 3 12:34:10 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 4 Jul 2010 00:34:10 +0800 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! 
In-Reply-To: References: Message-ID: On Sun, Jul 4, 2010 at 12:13 AM, Bruce Southey wrote: > On Sat, Jul 3, 2010 at 10:31 AM, David Goldsmith > wrote: > > On Sat, Jul 3, 2010 at 7:24 AM, Benjamin Root wrote: > >> > >> Joshua, > >> > >> In addition to the very technical writing for individual functions, we > >> also need documentation that is accessible to newcomers. Many modules > do > >> not implement any functions themselves, but act as a grouping module > (for > >> example, scipy.io). These modules could definitely use good, > up-to-date, > >> summary narratives. Even some modules further down the stack can still > >> benefit from good summaries. > > What would really benefit is a solid example for different types. > There are so many inconsistent styles in the numpy documentation - I > know I wrote some. You can always see that when there are different > definitions (when present) for input that are array-like. There should > be one single definition for these common input and output that should > be immutable by the writer and should automatically appear. > That would be nice but I don't see how that's technically feasible. To see whether the input is ndarray or array_like you just have to look at the code. And you'll always end up with some minor inconsistencies with many writers, but that's what the review/proof stages are for. > > >> > >> To everyone, if you do join the documentation efforts to contribute > little > >> bits of writing, it is a common courtesy to notify any others who might > also > >> be working on a particular document. The current system does not > >> automatically notify authors of any changes, so it is hard to know if > any > >> changes have been made. General rule of thumb is to notify authors who > have > >> made changes to the doc within the last 3 months (I believe). 
> > > > Actually, all you have to do (have used in both senses of the word, i.e., > > are "required" to do and all that is necessary to do) is click on the log > > link when viewing a docstring in the Wiki, note the person who worked on > it > > last, how long ago that was, and then as Ben says, if it was in the last > > three months, give or take, contact them and ask them if it would be ok > if > > you worked on it. > > And where is the email or notification button? The more loops to jump > through the less people will jump. > Good point. > > Where do you see the changes made? It should be a diff. > Under Log you have access to all diffs for a docstring. > > Why does the second person have greater authority to override the > original author? > Why should I have to go back and check that the later revisions are > correct? > > If the first author brings a docstring to "needs review" status there's no need for a second author to work on it. If it's still in "being written", it wasn't finished in the first place. And later edits possibly overwriting earlier ones is just the nature of wikis. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Sat Jul 3 12:40:30 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 4 Jul 2010 00:40:30 +0800 Subject: [SciPy-Dev] ltisys test crash In-Reply-To: References: Message-ID: On Sun, Jul 4, 2010 at 12:29 AM, wrote: > Ralf, > > Any reason not to do http://projects.scipy.org/scipy/changeset/6484 > also in trunk ? > > I still got the crash when I tested a new scipy install yesterday. > > Sorry about that. Fixed in r6586. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben.root at ou.edu Sat Jul 3 12:48:47 2010 From: ben.root at ou.edu (Benjamin Root) Date: Sat, 3 Jul 2010 11:48:47 -0500 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! 
In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 11:34 AM, Ralf Gommers wrote: > > Why does the second person have greater authority to override the >> original author? >> Why should I have to go back and check that the later revisions are >> correct? >> >> If the first author brings a docstring to "needs review" status there's no > need for a second author to work on it. If it's still in "being written", it > wasn't finished in the first place. And later edits possibly overwriting > earlier ones is just the nature of wikis. > > True, except that in normal wikis, I can set a watch that would notify me of changes to a particular page. This is not the case with our documentation editor. Ben -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Sat Jul 3 12:51:53 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 10:51:53 -0600 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 10:13 AM, Bruce Southey wrote: > On Sat, Jul 3, 2010 at 10:31 AM, David Goldsmith > wrote: >> On Sat, Jul 3, 2010 at 7:24 AM, Benjamin Root wrote: >>> >>> Joshua, >>> >>> In addition to the very technical writing for individual functions, we >>> also need documentation that is accessible to newcomers.? Many modules do >>> not implement any functions themselves, but act as a grouping module (for >>> example, scipy.io).? These modules could definitely use good, up-to-date, >>> summary narratives.? Even some modules further down the stack can still >>> benefit from good summaries. > > What would really benefit is a solid example for different types. > There are so many inconsistent styles in the numpy documentation - I > know I wrote some. You can always see that when there are different > definitions (when present) for input that are array-like. 
There should > be one single definition for these common input and output that should > be immutable by the writer and should automatically appear. I have just started making a long list of doc examples. This page http://docs.scipy.org/numpy/Questions+Answers/ is a great place to get started, but all you (I) really want is a cheat sheet with a fairly long list of style examples for each section, in the order the sections should appear, and without all the noise from the discussions related to the style. This was also going to serve as an rst guide. If others are interested in this we should probably add a wiki page for this. Vincent > >>> >>> To everyone, if you do join the documentation efforts to contribute little >>> bits of writing, it is a common courtesy to notify any others who might also >>> be working on a particular document. The current system does not >>> automatically notify authors of any changes, so it is hard to know if any >>> changes have been made. General rule of thumb is to notify authors who have >>> made changes to the doc within the last 3 months (I believe). >> >> Actually, all you have to do (have used in both senses of the word, i.e., >> are "required" to do and all that is necessary to do) is click on the log >> link when viewing a docstring in the Wiki, note the person who worked on it >> last, how long ago that was, and then as Ben says, if it was in the last >> three months, give or take, contact them and ask them if it would be ok if >> you worked on it. > > And where is the email or notification button? The more loops to jump > through the less people will jump. > > Where do you see the changes made? It should be a diff. > > Why does the second person have greater authority to override the > original author? > Why should I have to go back and check that the later revisions are correct?
> > >> >> As far as informal style: yes, some doc is better than no doc, and reviewers >> and/or users will eventually come along and say how they think it should be >> improved - please don't let any insecurities be a deterrent: every version >> is saved; worst comes to worst, someone comes along, doesn't like your >> changes and reverts, but at least you tried and you can't "break" anything. >> :-) >> >> More later (but I like the dialogue that's started in my silence). >> >> DG > > No, bad documentation is worthless, because either someone does not > revisit it because they assume it is done, or they think that the other person > knows more so they don't touch it. > >>> >>> I really hope to see you all soon in the marathon! >>> >>> Ben Root >>> >>> On Sat, Jul 3, 2010 at 3:08 AM, Joshua Holbrook >>> wrote: >>>> >>>> My own reasons for hesitating have more to do with knowing that any >>>> documentation I write will likely have poor style. I tend to write in >>>> a very informal, conversational manner. > > Just read a few of the numpy ones like cumsum and similar. Then just > copy that style. > >>>> >>>> That said, I'll try to do my part as I use parts of scipy, since >>>> having unprofessional documentation is probably better than having no >>>> documentation. > > Just follow a good docstring from a similar type of function. But > also add a variety of examples that show the usage, especially 2-d > cases and weird cases, as these can also act as doctests. > >>>> >>>> --Josh >>>> >>>> 2010/7/3 Stéfan van der Walt : >>>> > On 2 July 2010 14:14, Joe Harrington wrote: >>>> >>> I wonder whether there is any other approach that we can explore to >>>> >>> help generate more volunteer work? Do you think it is mainly the >>>> >>> difference between scipy and numpy that explains the drop-off? Or >>>> >>> something else?
To the extent that it is the technical differences >>>> >>> - do you think there would be any point in trying to establish >>>> >>> something like nominated experts or want-to-find-out type experts who >>>> >>> will offer to advise on particular parts of scipy - even if they >>>> >>> don't >>>> >>> themselves write the docstrings? Or anything else that might help? >>>> >> >>>> >> We already looked for topical experts. We have a few; David can >>>> >> comment more. In the end what we need are rank-and-file writers, >>>> >> people who will take something on, learn about it, and write about it. >>>> >> Yes, SciPy is more technical, but we've all dealt with harder tasks >>>> >> than documenting SciPy. >>>> > >>>> > All the posts I have seen talk about achieving higher word counts, >>>> > covering more functions, going bigger and better. While that's >>>> > certainly what we want, such requests may be intimidating to new >>>> > contributors. >>>> > >>>> > My feeling is that we should identify a small handful of functions to >>>> > focus on. That way, we may only document 10 functions a week, but at >>>> > least those will get done. Emanuelle's suggestion to target specific >>>> > writers also seems sensible. >>>> > >>>> > Regards >>>> > Stéfan >>>> > _______________________________________________ >>>> > SciPy-Dev mailing list >>>> > SciPy-Dev at scipy.org >>>> > http://mail.scipy.org/mailman/listinfo/scipy-dev >>>> > >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at scipy.org >>>> http://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/scipy-dev >>> >> >> >> >> -- >> Mathematician: noun, someone who disavows certainty when their uncertainty >> set is non-empty, even if that set has measure zero.
>> >> Hope: noun, that delusive spirit which escaped Pandora's jar and, with her >> lies, prevents mankind from committing a general suicide. ?(As interpreted >> by Robert Graves) >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > Bruce > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From ben.root at ou.edu Sat Jul 3 13:02:31 2010 From: ben.root at ou.edu (Benjamin Root) Date: Sat, 3 Jul 2010 12:02:31 -0500 Subject: [SciPy-Dev] Makes stats._support functions more visible In-Reply-To: References: Message-ID: unique() is in numpy. As for the rest of them, these functions appear to be very specific to the stats package. In particular, unique() in the support package is a very restricted version of the more generic unique() in numpy. Some of the other functions state that they are like other functions in numpy, but... different... I think they should stay right where they are. Ben Root On Fri, Jul 2, 2010 at 6:11 PM, Vincent Davis wrote: > There are several useful functions in stats._support I don't think I > would have found them if I had been looking for such tools. I think > it would be good for these to be more visible. Primarily I am thinking > of > abut() > adm() > colex() > unique() > linexand() > collapse() > > Actually most/all of these probably belong in numpy. I don't think > they duplicate any np functions. > > Vincent > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Sat Jul 3 13:20:31 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 11:20:31 -0600 Subject: [SciPy-Dev] Updates? 
Ticket #835 "nan propagation in stats.distribution" Message-ID: Are there any updates on this ticket? Ticket #835 (new defect) http://projects.scipy.org/scipy/ticket/835 I am kinda looking for reasons to get more familiar with stats.distributions and thought this looked straightforward. Vincent From vincent at vincentdavis.net Sat Jul 3 13:22:46 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 11:22:46 -0600 Subject: [SciPy-Dev] function that are duplicates of numpy, such as scipy.special.basic.log In-Reply-To: References: <4C298032.5000608@silveregg.co.jp> Message-ID: Just to note on the list I did file a ticket Ticket #1216 http://projects.scipy.org/scipy/ticket/1216 Vincent On Tue, Jun 29, 2010 at 3:06 AM, Pauli Virtanen wrote: > Tue, 29 Jun 2010 00:10:23 -0500, Benjamin Root wrote: >> So, wouldn't that be justification enough to begin a deprecation process >> for it? These are the sort of edges that need to get polished in >> scipy/numpy. > > Yes. Please file bug tickets listing stuff that is duplicated or looks > like it should be deprecated. (For example, one ticket per submodule > should be ok.) > > -- > Pauli Virtanen > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From vincent at vincentdavis.net Sat Jul 3 13:26:32 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 11:26:32 -0600 Subject: [SciPy-Dev] Is imresize() defined twice in scipy.misc.pilutil.py In-Reply-To: References: <4C2943A7.9050704@enthought.com> Message-ID: Posted a ticket for this. Ticket #1221 (new defect) http://projects.scipy.org/scipy/ticket/1221 Vincent On Tue, Jun 29, 2010 at 2:59 AM, Pauli Virtanen wrote: > Mon, 28 Jun 2010 19:51:51 -0500, Warren Weckesser wrote: >> Vincent Davis wrote: >>> Is imresize defined twice in scipy.misc.pilutil.py or am I missing >>> something?
>> >> Yup, and they were both added by travo about eight years ago (r521 and >> r545). Hey Travis, what's up with that? > > I guess something like that would be hard to remember :) > > The two functions appear to have slightly different functionality. > Nevertheless, perhaps it's just best to remove the first one (which is > overridden by the second one). This looks pretty much like legacy stuff. > > -- > Pauli Virtanen > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From vincent at vincentdavis.net Sat Jul 3 13:36:00 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 11:36:00 -0600 Subject: [SciPy-Dev] {7} My Tickets doesn't work for me Message-ID: in scipy, maybe numpy also the view tickets option {7} My Tickets Doesn't work for me or I don't know how to use it. "This report demonstrates the use of the automatically set $USER dynamic variable, replaced with the username of the logged in user when executed." "No matches found." Is it just me :-) Vincent From charlesr.harris at gmail.com Sat Jul 3 13:59:51 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 3 Jul 2010 11:59:51 -0600 Subject: [SciPy-Dev] {7} My Tickets doesn't work for me In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 11:36 AM, Vincent Davis wrote: > in scipy, maybe numpy also the view tickets option > {7} My Tickets > Doesn't work for me or I don't know how to use it. > "This report demonstrates the use of the automatically set $USER > dynamic variable, replaced with the username of the logged in user > when executed." > "No matches found." > > Is it just me :-) > > You need to be logged in and then it shows tickets assigned to you. Works for me on numpy trac, I haven't tried scipy. Chuck -------------- next part -------------- An HTML attachment was scrubbed...
URL: From vincent at vincentdavis.net Sat Jul 3 14:04:32 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 12:04:32 -0600 Subject: [SciPy-Dev] {7} My Tickets doesn't work for me In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 11:59 AM, Charles R Harris wrote> > > On Sat, Jul 3, 2010 at 11:36 AM, Vincent Davis > wrote: >> >> in scipy, maybe numpy also the view tickets option >> {7} My Tickets >> ?Doesn't work for me or I don't know how to use it. >> "This report demonstrates the use of the automatically set $USER >> dynamic variable, replaced with the username of the logged in user >> when executed." >> "No matches found." >> >> Is it just me :-) >> > > You need to be logged in and then it shows tickets assigned to you. Works > for me on numpy trac, I haven't tried scipy. Works for numpy for me also, I am logged in Vincent > > Chuck > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From charlesr.harris at gmail.com Sat Jul 3 14:14:12 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 3 Jul 2010 12:14:12 -0600 Subject: [SciPy-Dev] {7} My Tickets doesn't work for me In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 12:04 PM, Vincent Davis wrote: > On Sat, Jul 3, 2010 at 11:59 AM, Charles R Harris > wrote> > > > > On Sat, Jul 3, 2010 at 11:36 AM, Vincent Davis > > > wrote: > >> > >> in scipy, maybe numpy also the view tickets option > >> {7} My Tickets > >> Doesn't work for me or I don't know how to use it. > >> "This report demonstrates the use of the automatically set $USER > >> dynamic variable, replaced with the username of the logged in user > >> when executed." > >> "No matches found." > >> > >> Is it just me :-) > >> > > > > You need to be logged in and then it shows tickets assigned to you. Works > > for me on numpy trac, I haven't tried scipy. 
> > Works for numpy for me also, I am logged in > Vincent > And now I have tested on scipy and it worked for me there also, I even found a ticket that I should probably close ;) Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Sat Jul 3 14:28:30 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 12:28:30 -0600 Subject: [SciPy-Dev] {7} My Tickets doesn't work for me In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 12:14 PM, Charles R Harris wrote: > > > On Sat, Jul 3, 2010 at 12:04 PM, Vincent Davis > wrote: >> >> On Sat, Jul 3, 2010 at 11:59 AM, Charles R Harris >> wrote> >> > >> > On Sat, Jul 3, 2010 at 11:36 AM, Vincent Davis >> > >> > wrote: >> >> >> >> in scipy, maybe numpy also the view tickets option >> >> {7} My Tickets >> >> ?Doesn't work for me or I don't know how to use it. >> >> "This report demonstrates the use of the automatically set $USER >> >> dynamic variable, replaced with the username of the logged in user >> >> when executed." >> >> "No matches found." >> >> >> >> Is it just me :-) >> >> >> > >> > You need to be logged in and then it shows tickets assigned to you. >> > Works >> > for me on numpy trac, I haven't tried scipy. >> >> Works for numpy for me also, I am logged in >> Vincent > > And now I have tested on scipy and it worked for me there also, I even found > a ticket that I should probably close ;) Can you "own" a ticket, I can't. I tried again, logged out back in shows me logged in, opened one of my tickets it shows my name ... See attached images for proof :-) Thanks Vincent > > Chuck > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- A non-text attachment was scrubbed... 
Name: MyTickets.jpg Type: image/jpeg Size: 51266 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: example_ticket.jpg Type: image/jpeg Size: 64714 bytes Desc: not available URL: From charlesr.harris at gmail.com Sat Jul 3 15:01:50 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 3 Jul 2010 13:01:50 -0600 Subject: [SciPy-Dev] {7} My Tickets doesn't work for me In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 12:28 PM, Vincent Davis wrote: > On Sat, Jul 3, 2010 at 12:14 PM, Charles R Harris > wrote: > > > > > > On Sat, Jul 3, 2010 at 12:04 PM, Vincent Davis > > > wrote: > >> > >> On Sat, Jul 3, 2010 at 11:59 AM, Charles R Harris > >> wrote> > >> > > >> > On Sat, Jul 3, 2010 at 11:36 AM, Vincent Davis > >> > > >> > wrote: > >> >> > >> >> in scipy, maybe numpy also the view tickets option > >> >> {7} My Tickets > >> >> Doesn't work for me or I don't know how to use it. > >> >> "This report demonstrates the use of the automatically set $USER > >> >> dynamic variable, replaced with the username of the logged in user > >> >> when executed." > >> >> "No matches found." > >> >> > >> >> Is it just me :-) > >> >> > >> > > >> > You need to be logged in and then it shows tickets assigned to you. > >> > Works > >> > for me on numpy trac, I haven't tried scipy. > >> > >> Works for numpy for me also, I am logged in > >> Vincent > > > > And now I have tested on scipy and it worked for me there also, I even > found > > a ticket that I should probably close ;) > > Can you "own" a ticket, I can't. I tried again, logged out back in > shows me logged in, opened one of my tickets it shows my name ... > See attached images for proof :-) > > To own a ticket you need to go down to the bottom of the page and reassign it to yourself. Owning isn't the same as submitting. Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From amcmorl at gmail.com Sat Jul 3 17:04:48 2010 From: amcmorl at gmail.com (Angus McMorland) Date: Sat, 3 Jul 2010 17:04:48 -0400 Subject: [SciPy-Dev] ndimage.labeled_comprehension default parameter Message-ID: Hi, I'm trying to write the docstring for ndimage.labeled_comprehension. Can anyone tell me what the `default` parameter does? I can't see a case where it isn't over-written by the do_map internal function. Thanks, Angus. -- AJC McMorland Post-doctoral research fellow Neurobiology, University of Pittsburgh From amcmorl at gmail.com Sat Jul 3 17:13:32 2010 From: amcmorl at gmail.com (Angus McMorland) Date: Sat, 3 Jul 2010 17:13:32 -0400 Subject: [SciPy-Dev] ndimage._nd_image Message-ID: Hi, Can we remove the ndimage._nd_image entries from the docstring wiki? They're functions that are exposed in other ndimage modules. There are 22 of them including the module. Perhaps just marking as Unimportant by someone who has the relevant permissions. I think anything to whittle down the list of functions appearing to need documenting makes the list more approachable by new volunteers. Thanks, Angus. 
-- AJC McMorland Post-doctoral research fellow Neurobiology, University of Pittsburgh From vincent at vincentdavis.net Sat Jul 3 18:27:29 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 16:27:29 -0600 Subject: [SciPy-Dev] {7} My Tickets doesn't work for me In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 1:01 PM, Charles R Harris wrote: > > > On Sat, Jul 3, 2010 at 12:28 PM, Vincent Davis > wrote: >> >> On Sat, Jul 3, 2010 at 12:14 PM, Charles R Harris >> wrote: >> > >> > >> > On Sat, Jul 3, 2010 at 12:04 PM, Vincent Davis >> > >> > wrote: >> >> >> >> On Sat, Jul 3, 2010 at 11:59 AM, Charles R Harris >> >> wrote> >> >> > >> >> > On Sat, Jul 3, 2010 at 11:36 AM, Vincent Davis >> >> > >> >> > wrote: >> >> >> >> >> >> in scipy, maybe numpy also the view tickets option >> >> >> {7} My Tickets >> >> >> ?Doesn't work for me or I don't know how to use it. >> >> >> "This report demonstrates the use of the automatically set $USER >> >> >> dynamic variable, replaced with the username of the logged in user >> >> >> when executed." >> >> >> "No matches found." >> >> >> >> >> >> Is it just me :-) >> >> >> >> >> > >> >> > You need to be logged in and then it shows tickets assigned to you. >> >> > Works >> >> > for me on numpy trac, I haven't tried scipy. >> >> >> >> Works for numpy for me also, I am logged in >> >> Vincent >> > >> > And now I have tested on scipy and it worked for me there also, I even >> > found >> > a ticket that I should probably close ;) >> >> Can you "own" a ticket, I can't. I tried again, logged out back in >> shows me logged in, opened one of my tickets it shows my name ... >> See attached images for proof :-) >> > > To own a ticket you need to go down to the bottom of the page and reassign > it to yourself. Owning isn't the same as submitting. I have no "own" option. 
I was thinking "My Tickets" was the same as the numpy "{9} Tickets I've created" I can just search for my username Thanks Vincent > > Chuck > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From sonoflilit at gmail.com Sat Jul 3 18:29:10 2010 From: sonoflilit at gmail.com (Aur Saraf) Date: Sun, 4 Jul 2010 01:29:10 +0300 Subject: [SciPy-Dev] Copyright of the Optimization on Ubuntu screenshot In-Reply-To: References: Message-ID: Hello, I am writing an "Introduction to Python" for a local non-profit magazine, and would like to touch SciPy, and the best way I have found that fits my space limitations is to include the excellent screenshot from the SciPy homepage, that is worth much more than 1000 words. Who can I talk to about licensing it? This is sent to the entire list because I couldn't find a "contact" link on the website. Thanks, -- Aur PS I am not subscribed to the list, so I would appreciate it very much if every reply would be CCd to my email personally. Thank you very much. From vincent at vincentdavis.net Sat Jul 3 18:34:40 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 16:34:40 -0600 Subject: [SciPy-Dev] Is imresize() defined twice in scipy.misc.pilutil.py In-Reply-To: References: <4C2943A7.9050704@enthought.com> Message-ID: On Sat, Jul 3, 2010 at 11:26 AM, Vincent Davis wrote: > Posted a ticket for this. > Ticket #1221 (new defect) > http://projects.scipy.org/scipy/ticket/1221 > > Vincent > > On Tue, Jun 29, 2010 at 2:59 AM, Pauli Virtanen wrote: >> Mon, 28 Jun 2010 19:51:51 -0500, Warren Weckesser wrote: >>> Vincent Davis wrote: >>>> Is imresize defined twice in ?scipy.misc.pilutil.py or am i missing >>>> something? >>> >>> Yup, and they were both added by travo about eight years ago (r521 and >>> r545). ?Hey Travis, what's up with that? 
>> >> I guess something like that would be hard to remember :) >> >> The two functions appear to have slightly different functionality. >> Nevertheless, perhaps it's just best to remove the first one (which is >> overridden by the second one). This looks pretty much like legacy stuff. >> >> -- >> Pauli Virtanen >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > I had assumed nothing had been done on this, I guess I was wrong. Josef noticed that this had been fixed. So this ticket can be closed. http://projects.scipy.org/scipy/ticket/1221 Vincent From vincent at vincentdavis.net Sat Jul 3 19:29:51 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 17:29:51 -0600 Subject: [SciPy-Dev] imresize does not work with size = int Message-ID: I might be doing something wrong but.. This is after the recent removal of the duplicate imresize proposed fix at bottom. 
>>> imresize(x, 2, mode='F') Traceback (most recent call last): File "", line 1, in File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/misc/pilutil.py", line 313, in imresize imnew = im.resize(size, resample=func[interp]) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/PIL/Image.py", line 1302, in resize im = self.im.resize(size, resample) TypeError: must be 2-item sequence, not float THIS WORKS >>> imresize(x, 2., mode='F') array([[-0.654, -0.654, -0.227, 0.201, 0.141, 0.081], [-0.654, -0.654, -0.227, 0.201, 0.141, 0.081], [ 0.242, 0.242, -0.254, -0.751, -0.806, -0.86 ], [ 1.139, 1.139, -0.282, -1.703, -1.752, -1.801], [ 0.216, 0.216, -0.818, -1.851, -1.591, -1.332], [-0.708, -0.708, -1.354, -1.999, -1.431, -0.862]], dtype=float32) AND THIS >>> imresize(x, (100,100), mode='F') array([[-0.654, -0.654, -0.654, ..., 0.081, 0.081, 0.081], [-0.654, -0.654, -0.654, ..., 0.081, 0.081, 0.081], [-0.654, -0.654, -0.654, ..., 0.081, 0.081, 0.081], ..., [-0.708, -0.708, -0.708, ..., -0.862, -0.862, -0.862], [-0.708, -0.708, -0.708, ..., -0.862, -0.862, -0.862], [-0.708, -0.708, -0.708, ..., -0.862, -0.862, -0.862]], dtype=float32) I think the problem is in line 307 size = size / 100.0 I think size needs to be a tuple looking at line 309 which is what is used for size=float size = (array(im.size)*size).astype(int) I think the fix is percent = size / 100.0 size = (array(im.size)*percent).astype(int) Vincent From vincent at vincentdavis.net Sat Jul 3 19:59:48 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 17:59:48 -0600 Subject: [SciPy-Dev] imresize does not work with size = int In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 5:29 PM, Vincent Davis wrote: > I might be doing something wrong but.. > This is after the recent removal of the duplicate imresize > proposed fix at bottom. 
> >>>> imresize(x, 2, mode='F') > Traceback (most recent call last): > File "", line 1, in > File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/misc/pilutil.py", > line 313, in imresize > imnew = im.resize(size, resample=func[interp]) > File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/PIL/Image.py", > line 1302, in resize > im = self.im.resize(size, resample) > TypeError: must be 2-item sequence, not float > > THIS WORKS > >>>> imresize(x, 2., mode='F') > array([[-0.654, -0.654, -0.227, 0.201, 0.141, 0.081], > [-0.654, -0.654, -0.227, 0.201, 0.141, 0.081], > [ 0.242, 0.242, -0.254, -0.751, -0.806, -0.86 ], > [ 1.139, 1.139, -0.282, -1.703, -1.752, -1.801], > [ 0.216, 0.216, -0.818, -1.851, -1.591, -1.332], > [-0.708, -0.708, -1.354, -1.999, -1.431, -0.862]], dtype=float32) > > AND THIS > >>>> imresize(x, (100,100), mode='F') > array([[-0.654, -0.654, -0.654, ..., 0.081, 0.081, 0.081], > [-0.654, -0.654, -0.654, ..., 0.081, 0.081, 0.081], > [-0.654, -0.654, -0.654, ..., 0.081, 0.081, 0.081], > ..., > [-0.708, -0.708, -0.708, ..., -0.862, -0.862, -0.862], > [-0.708, -0.708, -0.708, ..., -0.862, -0.862, -0.862], > [-0.708, -0.708, -0.708, ..., -0.862, -0.862, -0.862]], dtype=float32) > > I think the problem is in line 307 > size = size / 100.0 > I think size needs to be a tuple > looking at line 309 which is what is used for size=float > size = (array(im.size)*size).astype(int) > > I think the fix is > percent = size / 100.0 > size = (array(im.size)*percent).astype(int) Applying that change seems to fix it. The following two operations should get the same results and do.
>>> imresize(x, 200, mode = 'F') array([[ 0.68348145, 0.68348145, 1.43379724, 2.18411303, 1.42521787, 0.66632271], [ 0.68348145, 0.68348145, 1.43379724, 2.18411303, 1.42521787, 0.66632271], [-0.41771227, -0.41771227, -0.07694912, 0.26381403, 0.07136334, -0.12108734], [-1.518906 , -1.518906 , -1.58769548, -1.65648496, -1.28249121, -0.90849739], [-0.43312848, -0.43312848, -0.3485043 , -0.26388013, -0.3611995 , -0.45851886], [ 0.65264904, 0.65264904, 0.89068687, 1.12872469, 0.56009215, -0.00854035]], dtype=float32) >>> imresize(x, 2., mode = 'F') array([[ 0.68348145, 0.68348145, 1.43379724, 2.18411303, 1.42521787, 0.66632271], [ 0.68348145, 0.68348145, 1.43379724, 2.18411303, 1.42521787, 0.66632271], [-0.41771227, -0.41771227, -0.07694912, 0.26381403, 0.07136334, -0.12108734], [-1.518906 , -1.518906 , -1.58769548, -1.65648496, -1.28249121, -0.90849739], [-0.43312848, -0.43312848, -0.3485043 , -0.26388013, -0.3611995 , -0.45851886], [ 0.65264904, 0.65264904, 0.89068687, 1.12872469, 0.56009215, -0.00854035]], dtype=float32) > > Vincent > From bsouthey at gmail.com Sat Jul 3 22:03:05 2010 From: bsouthey at gmail.com (Bruce Southey) Date: Sat, 3 Jul 2010 21:03:05 -0500 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 11:34 AM, Ralf Gommers wrote: > > > On Sun, Jul 4, 2010 at 12:13 AM, Bruce Southey wrote: >> >> On Sat, Jul 3, 2010 at 10:31 AM, David Goldsmith >> wrote: >> > On Sat, Jul 3, 2010 at 7:24 AM, Benjamin Root wrote: >> >> >> >> Joshua, >> >> >> >> In addition to the very technical writing for individual functions, we >> >> also need documentation that is accessible to newcomers.? Many modules >> >> do >> >> not implement any functions themselves, but act as a grouping module >> >> (for >> >> example, scipy.io).? These modules could definitely use good, >> >> up-to-date, >> >> summary narratives.? Even some modules further down the stack can still >> >> benefit from good summaries. 
>> >> What would really benefit is a solid example for different types. >> There are so many inconsistent styles in the numpy documentation - I >> know I wrote some. You can always see that when there are different >> definitions (when present) for input that are array-like. There should >> be one single definition for these common input and output that should >> be immutable by the writer and should automatically appear. > > That would be nice but I don't see how that's technically feasible. To see > whether the input is ndarray or array_like you just have to look at the > code. And you'll always end up with some minor inconsistencies with many > writers, but that's what the review/proof stages are for. >> >> >> >> >> To everyone, if you do join the documentation efforts to contribute >> >> little >> >> bits of writing, it is a common courtesy to notify any others who might >> >> also >> >> be working on a particular document.? The current system does not >> >> automatically notify authors of any changes, so it is hard to know if >> >> any >> >> changes have been made.? General rule of thumb is to notify authors who >> >> have >> >> made changes to the doc within the last 3 months (I believe). >> > >> > Actually, all you have to do (have used in both senses of the word, >> > i.e., >> > are "required" to do and all that is necessary to do) is click on the >> > log >> > link when viewing a docstring in the Wiki, note the person who worked on >> > it >> > last, how long ago that was, and then as Ben says, if it was in the last >> > three months, give or take, contact them and ask them if it would be ok >> > if >> > you worked on it. >> >> And where is the email or notification button? The more loops to jump >> through the less people will jump. > > Good point. >> >> Where do you see the changes made? It should be a diff. > > Under Log you have access to all diffs for a docstring. 
(Using the word 'diff' was perhaps too limiting - probably more like track changes in word processors is more appropriate terminology.) Thanks for pointing that out, because I had to search a few examples to find one. Otherwise all you get is a 'diff to svn'. An example is: http://docs.scipy.org/scipy/docs/scipy.stats.distributions.rv_continuous/log/ Yet you cannot tell what was changed and why beyond a trivial comment (that is often not very correct). There is no link to the discussion so you do not know what version the discussion applies to. >> >> Why does the second person have greater authority to override the >> original author? >> Why should I have to go back and check that the later revisions are >> correct? >> > If the first author brings a docstring to "needs review" status there's no > need for a second author to work on it. If it's still in "being written", it > wasn't finished in the first place. And later edits possibly overwriting > earlier ones is just the nature of wikis. > > > Cheers, > Ralf Bruce From bsouthey at gmail.com Sat Jul 3 22:12:49 2010 From: bsouthey at gmail.com (Bruce Southey) Date: Sat, 3 Jul 2010 21:12:49 -0500 Subject: [SciPy-Dev] Makes stats._support functions more visible In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 12:02 PM, Benjamin Root wrote: > unique() is in numpy. As for the rest of them, these functions appear to be > very specific to the stats package. In particular, unique() in the support > package is a very restricted version of the more generic unique() in numpy. > Some of the other functions state that they are like other functions in > numpy, but... different... > > I think they should stay right where they are. > > Ben Root > > > On Fri, Jul 2, 2010 at 6:11 PM, Vincent Davis > wrote: >> >> There are several useful functions in stats._support I don't think I >> would have found them if I had been looking for such tools. I think >> it would be good for these to be more visible.
Primarily I am thinking >> of >> abut() >> adm() >> colex() >> unique() >> linexand() >> collapse() >> >> Actually most/all of these probably belong in numpy. I don't think >> they duplicate any np functions. >> >> Vincent >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > Actually for the most part the stats functions need lots of attention because these have not kept pace with numpy. Most handle only very specific inputs or perhaps can be one-liners (trimmed ones are an example of that). So these would not be considered by numpy if the functions are not sufficiently general. So what you have to do is write new code for these functions. Then submit a patch to scipy and perhaps an enhancement patch to numpy. Bruce From ralf.gommers at googlemail.com Sat Jul 3 23:52:59 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 4 Jul 2010 11:52:59 +0800 Subject: [SciPy-Dev] ndimage._nd_image In-Reply-To: References: Message-ID: On Sun, Jul 4, 2010 at 5:13 AM, Angus McMorland wrote: > Hi, > > Can we remove the ndimage._nd_image entries from the docstring wiki? > They're functions that are exposed in other ndimage modules. > There are 22 of them including the module. Perhaps just marking as > Unimportant by someone who has the relevant permissions. > I think anything to whittle down the list of functions appearing to > need documenting makes the list more approachable by new volunteers. > > Done. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at googlemail.com Sun Jul 4 00:01:13 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 4 Jul 2010 12:01:13 +0800 Subject: [SciPy-Dev] {7} My Tickets doesn't work for me In-Reply-To: References: Message-ID: On Sun, Jul 4, 2010 at 6:27 AM, Vincent Davis wrote: > On Sat, Jul 3, 2010 at 1:01 PM, Charles R Harris > wrote: > > > > > > On Sat, Jul 3, 2010 at 12:28 PM, Vincent Davis > wrote: > >> > >> Can you "own" a ticket, I can't. I tried again, logged out back in > >> shows me logged in, opened one of my tickets it shows my name ... > >> See attached images for proof :-) > >> > > > > To own a ticket you need to go down to the bottom of the page and > reassign > > it to yourself. Owning isn't the same as submitting. > > I have no "own" option. > I was thinking "My Tickets" was the same as the numpy "{9} Tickets I've > created" > I can just search for my username > > If you register for Trac you don't get rights to own/reassign/close tickets. Normally only people with commit access can do that. You're right that it would be useful for scipy trac to be the same as numpy trac. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefan at sun.ac.za Sun Jul 4 00:24:47 2010 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Sat, 3 Jul 2010 23:24:47 -0500 Subject: [SciPy-Dev] {7} My Tickets doesn't work for me In-Reply-To: References: Message-ID: On 3 July 2010 23:01, Ralf Gommers wrote: > If you register for Trac you don't get rights to own/reassign/close tickets. > Normally only people with commit access can do that. > > You're right that it would be useful for scipy trac to be the same as numpy > trac. It should be fixed now. 
Regards Stéfan From vincent at vincentdavis.net Sun Jul 4 00:28:09 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 3 Jul 2010 22:28:09 -0600 Subject: [SciPy-Dev] {7} My Tickets doesn't work for me In-Reply-To: References: Message-ID: 2010/7/3 Stéfan van der Walt : > On 3 July 2010 23:01, Ralf Gommers wrote: >> If you register for Trac you don't get rights to own/reassign/close tickets. >> Normally only people with commit access can do that. >> >> You're right that it would be useful for scipy trac to be the same as numpy >> trac. > > It should be fixed now. Great Thanks Vincent > > Regards > Stéfan > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From ralf.gommers at googlemail.com Sun Jul 4 00:58:46 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 4 Jul 2010 12:58:46 +0800 Subject: [SciPy-Dev] optimize.fmin_ncg bug In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 4:41 PM, Ralf Gommers wrote: > There has been an intermittent test failure in scipy.optimize for a long > time, only on Windows: > > ====================================================================== > ERROR: line-search Newton conjugate gradient optimization routine > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "C:\Python26\lib\site-packages\scipy\optimize\tests\test_optimize.py", line > 115, in test_ncg > retall=False) > File "C:\Python26\lib\site-packages\scipy\optimize\optimize.py", line > 1096, in fmin_ncg > alphak, fc, gc, old_fval = line_search_BFGS(f,xk,pk,gfk,old_fval) > File "C:\Python26\lib\site-packages\scipy\optimize\optimize.py", line > 600, in line_search_BFGS > phi_a2 = f(*((xk+alpha2*pk,)+args)) > File "C:\Python26\lib\site-packages\scipy\optimize\optimize.py", line > 103, in function_wrapper > return function(x, *args) > File >
"C:\Python26\lib\site-packages\scipy\optimize\tests\test_optimize.py", line > 41, in func > raise RuntimeError, "too many iterations in optimization routine" > RuntimeError: too many iterations in optimization routine > > ---------------------------------------------------------------------- > > I finally managed to track it down to line 1076 of optimize.py: > if curv == 0.0: > > This should be replaced with "if curv < eps:" with eps a suitably small > number. Now my question is, how small is suitably small? 1e-10 seems to > work, but maybe someone who known what the algorithm does can suggest a > number that's not just a wild guess. > > Looking at the rest of optimize.py, there's quite a bit of comparing with > zero going on which doesn't look right. Sometimes even with a comment like > "maybe this slipped below machine precision". Comparing floating point > numbers with zero like that is just a bad idea and should be fixed. > > OK, looked at it in some more detail. I propose to change the comparison with 0.0 to: if 0 <= curv <= 3*numpy.finfo(numpy.float64).eps: By printing the value of 'curv' at each iteration before line 1076 we see the problem, the code only checks for exactly 0.0 or <0.0 while the loop approaches 0.0 to about +2*float64.eps and then runs away till we get Inf/Nans in 'pk': .......................0.00399999953806 1.25810055332e-05 1.3020931042e-07 1.98642696237e-10 6.89328152584e-12 3.75054053064e-16 1.17020918838e-15 3.65518790716e-15 1.522193813e-14 3.49498275889e-14 1.00753995671e-13 8.85621735118e-14 6.45602832318e-13 7.71651859561e-13 1.26993734725e-12 1.99182104181e-12 3.589092817e-12 3.31289828857e-12 7.33740017942e-12 1.1035546662e-11 1.41992354262e-11 3.51353735966e-11 4.29576509375e-11 6.83301887956e-11 1.10152323757e-10 2.02779978823e-10 2.58793635812e-10 4.28945979394e-10 7.10409808322e-10 1.17270345986e-09 1.92876849893e-09 3.53828040702e-09 4.92294232525e-09 7.1957586196e-09 1.18388196318e-08 1.70747389794e-08 3.72806974206e-08 
5.35293003171e-08 9.33825403925e-08 1.22352272763e-07 2.12909037336e-07 3.07803338116e-07 4.02456362066e-07 7.00986627566e-07 9.6196850915e-07 1.36818301207e-06 2.13258348427e-06 3.28792400662e-06 5.01214989681e-06 7.55113686018e-06 1.12377188955e-05 1.72677931327e-05 2.32111890424e-05 3.3315332674e-05 4.71713470554e-05 6.59126173456e-05 9.09613197183e-05 0.000124130248531 0.000174603954122 0.000220134787641 0.000294134065199 0.000391904614904 0.000521995990713 0.000642371889147 0.000979855604735 0.00131625176769 0.00184480132165 0.00236723563377 0.00336587176004 0.00405386458411 0.00641503610926 0.00829947277392 0.0131341969672 0.0195918691409 0.0246436359513 0.0347262025799 0.0506396319829 0.0699390838321 0.0967920110714 0.129810713109 0.190348188533 0.238461606854 0.392229499575 0.524126282795 0.784084315019 0.931385310094 1.46067729788 2.03440185395 2.82231409859 4.01402864384 4.9533933324 7.49622545276 10.4700586476 13.5314471454 18.1905146597 24.3003828933 32.2523989905 42.5248543154 55.6998127601 72.4850733504 96.2032800258 106.571255441 161.149692744 215.008049236 272.723774241 310.370850848 431.849384449 377.245150044 611.993566609 478.678751989 2454.80128358 921.429944883 4068.83043858 -1503.78719623 E............. With the fix the result is: .......................0.00399999953806 1.25810055332e-05 1.3020931042e-07 1.98642696237e-10 6.89328152584e-12 3.75054053064e-16 .............. It's ticket http://projects.scipy.org/scipy/ticket/1150 I will commit this today in the 0.8.x branch unless someone disagrees. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben.root at ou.edu Sun Jul 4 00:59:17 2010 From: ben.root at ou.edu (Benjamin Root) Date: Sat, 3 Jul 2010 23:59:17 -0500 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! 
In-Reply-To: References: Message-ID: On Sat, Jul 3, 2010 at 9:03 PM, Bruce Southey wrote: > On Sat, Jul 3, 2010 at 11:34 AM, Ralf Gommers > wrote: > > > > > > On Sun, Jul 4, 2010 at 12:13 AM, Bruce Southey > wrote: > >> > >> On Sat, Jul 3, 2010 at 10:31 AM, David Goldsmith > >> wrote: > >> > On Sat, Jul 3, 2010 at 7:24 AM, Benjamin Root > wrote: > >> >> > >> >> Joshua, > >> >> > >> >> In addition to the very technical writing for individual functions, > we > >> >> also need documentation that is accessible to newcomers. Many > modules > >> >> do > >> >> not implement any functions themselves, but act as a grouping module > >> >> (for > >> >> example, scipy.io). These modules could definitely use good, > >> >> up-to-date, > >> >> summary narratives. Even some modules further down the stack can > still > >> >> benefit from good summaries. > >> > >> What would really benefit is a solid example for different types. > >> There are so many inconsistent styles in the numpy documentation - I > >> know I wrote some. You can always see that when there are different > >> definitions (when present) for input that are array-like. There should > >> be one single definition for these common input and output that should > >> be immutable by the writer and should automatically appear. > > > > That would be nice but I don't see how that's technically feasible. To > see > > whether the input is ndarray or array_like you just have to look at the > > code. And you'll always end up with some minor inconsistencies with many > > writers, but that's what the review/proof stages are for. > >> > >> >> > >> >> To everyone, if you do join the documentation efforts to contribute > >> >> little > >> >> bits of writing, it is a common courtesy to notify any others who > might > >> >> also > >> >> be working on a particular document. The current system does not > >> >> automatically notify authors of any changes, so it is hard to know if > >> >> any > >> >> changes have been made. 
General rule of thumb is to notify authors > who > >> >> have > >> >> made changes to the doc within the last 3 months (I believe). > >> > > >> > Actually, all you have to do (have used in both senses of the word, > >> > i.e., > >> > are "required" to do and all that is necessary to do) is click on the > >> > log > >> > link when viewing a docstring in the Wiki, note the person who worked > on > >> > it > >> > last, how long ago that was, and then as Ben says, if it was in the > last > >> > three months, give or take, contact them and ask them if it would be > ok > >> > if > >> > you worked on it. > >> > >> And where is the email or notification button? The more loops to jump > >> through the less people will jump. > > > > Good point. > >> > >> Where do you see the changes made? It should be a diff. > > > > Under Log you have access to all diffs for a docstring. > > (Using the word 'diff' was perhaps too limiting - probably more like > track changes in word processors is more appropriate terminology.) > Thanks for pointing that out because I had to search a few examples to > find an example. Otherwise all you get is a 'diff to svn'. > An example is: > > http://docs.scipy.org/scipy/docs/scipy.stats.distributions.rv_continuous/log/ > > Yet you can not tell what was changed and why beyond a trivial comment > (that is often not very correct). There is no link to the discussion > so you do not know what version the discussion applies to. > > At that link, you can select two versions and see the difference between those two revisions by clicking on the "Differences" button. This is much like viewing the history on wikipedia. And, if you want to see what the docs looked like for a particular version, you can click on the date time for that revision on the left I hope this helps. Ben Root -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From thouis at broadinstitute.org Sun Jul 4 14:02:48 2010 From: thouis at broadinstitute.org (Thouis (Ray) Jones) Date: Sun, 4 Jul 2010 14:02:48 -0400 Subject: [SciPy-Dev] ndimage.labeled_comprehension default parameter In-Reply-To: References: Message-ID: (Third try, hopefully from the correct email address this time...) On Sat, Jul 3, 2010 at 17:04, Angus McMorland wrote: > I'm trying to write the docstring for ndimage.labeled_comprehension. > Can anyone tell me what the `default` parameter does? > I can't see a case where it isn't over-written by the do_map internal function. The intent was for that value to be returned for any values in "labels" that don't actually occur in "input", but it might be a bug. I'll try to verify it early this week (my home machine doesn't have an up-to-date install of scipy). Best, Ray Jones From amcmorl at gmail.com Sun Jul 4 14:36:35 2010 From: amcmorl at gmail.com (Angus McMorland) Date: Sun, 4 Jul 2010 14:36:35 -0400 Subject: [SciPy-Dev] ndimage.labeled_comprehension default parameter In-Reply-To: References: Message-ID: On 4 July 2010 14:02, Thouis (Ray) Jones wrote: > (Third try, hopefully from the correct email address this time...) > > On Sat, Jul 3, 2010 at 17:04, Angus McMorland wrote: >> I'm trying to write the docstring for ndimage.labeled_comprehension. >> Can anyone tell me what the `default` parameter does? >> I can't see a case where it isn't over-written by the do_map internal function. > > The intent was for that value to be returned for any values in > "labels" that don't actually occur in "input", but it might be a bug. > I'll try to verify it early this week (my home machine doesn't have an > up-to-date install of scipy). Thanks for the pointer- I've checked it here, and the routine works as advertised. (Well, actually, I'm presuming you mean when a value of `index` does not exist in `labels`.) Supplying 'non-occurring' labels was not a case I had tried. Thanks, Angus. 
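[The behaviour discussed above is easy to exercise directly; here is a minimal sketch, with toy arrays and a -1.0 default made up for illustration, showing `default` being returned for an `index` entry that never occurs in `labels`:]

```python
import numpy as np
from scipy import ndimage

# Toy image and label array; label 3 is requested in `index` below but
# never occurs in `labels`, so its slot should receive `default` (-1.0).
img = np.array([[1.0, 2.0, 0.0],
                [0.0, 3.0, 0.0],
                [0.0, 0.0, 4.0]])
labels = np.array([[1, 1, 0],
                   [0, 1, 0],
                   [0, 0, 2]])

# Mean of `img` over each requested label; label 1 -> (1+2+3)/3 = 2.0,
# label 2 -> 4.0, label 3 absent -> default.
result = ndimage.labeled_comprehension(img, labels, [1, 2, 3],
                                       np.mean, float, -1.0)
print(result)  # expected: [ 2.  4. -1.]
```

[With a recent scipy this returns the per-label means for labels 1 and 2 and the default value for the absent label 3, matching the intent described in the thread.]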
> Best, > Ray Jones > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- AJC McMorland Post-doctoral research fellow Neurobiology, University of Pittsburgh From ralf.gommers at googlemail.com Sun Jul 4 19:34:01 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Mon, 5 Jul 2010 07:34:01 +0800 Subject: [SciPy-Dev] ANN: scipy 0.8.0 release candidate 1 Message-ID: I'm pleased to announce the availability of the first release candidate of SciPy 0.8.0. Please try it out and report any problems on the scipy-dev mailing list. SciPy is a package of tools for science and engineering for Python. It includes modules for statistics, optimization, integration, linear algebra, Fourier transforms, signal and image processing, ODE solvers, and more. This release candidate comes almost one and a half years after the 0.7.0 release and contains many new features, numerous bug-fixes, improved test coverage, and better documentation. Please note that SciPy 0.8.0rc1 requires Python 2.4-2.6 and NumPy 1.4.1 or greater. For more information, please see the release notes: http://sourceforge.net/projects/scipy/files/scipy/0.8.0rc1/NOTES.txt/view You can download the release from here: https://sourceforge.net/projects/scipy/ Python 2.5/2.6 binaries for Windows and OS X are available, as well as source tarballs for other platforms and the documentation in pdf form. Thank you to everybody who contributed to this release. Enjoy, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Sun Jul 4 23:00:04 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sun, 4 Jul 2010 21:00:04 -0600 Subject: [SciPy-Dev] pydocweb enhancements Message-ID: I assume that mostly these questions would be answered by Pauli but I thought I would share my thoughts. 
As far as I can tell additional pages have been added manually, is this right? If so it would be nice to add pages from within the admin or maybe by specified user-groups. Then make these pages accessible from a drop-down menu that lists the pages. If this makes sense I think I know the basics of how to do this and can plan to work on it this week (actually now). Any other thoughts? Vincent From d.l.goldsmith at gmail.com Mon Jul 5 01:10:42 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Sun, 4 Jul 2010 22:10:42 -0700 Subject: [SciPy-Dev] pydocweb enhancements In-Reply-To: References: Message-ID: On Sun, Jul 4, 2010 at 8:00 PM, Vincent Davis wrote: > I assume that mostly these questions would be answered by Pauli but I > thought I would share my thoughts. > > As far as I can tell additional pages have been added manually, is this > right? > If so it would be nice to add pages from within the admin or maybe > by specified user-groups. Do you mean exclusively? This is antithetical to the whole idea of a wiki as I understand it: I thought the idea of a wiki was that anyone w/ an account can add acceptable content, not just an "admin and/or specified user groups". What's your rationale for this? DG > Then make these pages accessible from a > drop-down menu that lists the pages. > If this makes sense I think I know the basics of how to do this and > can plan to work on it this week (actually now). > > Any other thoughts? > > Vincent > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From vincent at vincentdavis.net Mon Jul 5 09:39:00 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Mon, 5 Jul 2010 07:39:00 -0600 Subject: [SciPy-Dev] pydocweb enhancements In-Reply-To: References: Message-ID: On Sun, Jul 4, 2010 at 11:10 PM, David Goldsmith wrote: > On Sun, Jul 4, 2010 at 8:00 PM, Vincent Davis > wrote: >> >> I assume that mostly these questions would be answered by Pauli but I >> thought I would share my thoughts. >> >> As far as I can tell additional pages have been added manually, is this >> right? >> If so it would be nice to add pages from within the admin or maybe >> by specified user-groups. > > Do you mean exclusively? This is antithetical to the whole idea of a wiki > as I understand it: I thought the idea of a wiki was that anyone w/ an > account can add acceptable content, not just an "admin and/or specified user > groups". What's your rationale for this? I am not the admin, nor do I want to decide who gets to do what :-) My primary interest is the ability to add pages and control/fix the links in the navbar to be appropriate to the "site" without having to manually edit the template. Vincent > > DG > >> >> Then make these pages accessible from a >> drop-down menu that lists the pages. >> If this makes sense I think I know the basics of how to do this and >> can plan to work on it this week (actually now). >> >> Any other thoughts? >> >> Vincent >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > -- > Mathematician: noun, someone who disavows certainty when their uncertainty > set is non-empty, even if that set has measure zero. > > Hope: noun, that delusive spirit which escaped Pandora's jar and, with her > lies, prevents mankind from committing a general suicide.
(As interpreted > by Robert Graves) > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From pav at iki.fi Mon Jul 5 10:28:10 2010 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 5 Jul 2010 14:28:10 +0000 (UTC) Subject: [SciPy-Dev] pydocweb enhancements References: Message-ID: Mon, 05 Jul 2010 07:39:00 -0600, Vincent Davis wrote: [clip] > I am not the admin, nor do I want to decide who gets to do what :-) My > primary interest is the ability to add pages and control/fix the links > in the navbar to be appropriate to the "site" without having to manually > edit the template. In my opinion, it's OK to restrict navbar modifications to site admins. (navbar = I guess Vincent's talking about the bar with the Wiki, Milestones, Docstrings, etc. links) That part, once set up, doesn't really change often, and the admin panel is the right place to tweak that. -- Pauli Virtanen From ralf.gommers at googlemail.com Mon Jul 5 10:39:31 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Mon, 5 Jul 2010 22:39:31 +0800 Subject: [SciPy-Dev] pydocweb enhancements In-Reply-To: References: Message-ID: On Mon, Jul 5, 2010 at 10:28 PM, Pauli Virtanen wrote: > Mon, 05 Jul 2010 07:39:00 -0600, Vincent Davis wrote: > [clip] > > I am not the admin, nor do I want to decide who gets to do what :-) My > > primary interest is the ability to add pages and control/fix the links > > in the navbar to be appropriate to the "site" without having to manually > > edit the template. > > In my opinion, it's OK to restrict navbar modifications to site admins. > (navbar = I guess Vincent's talking about the bar with the Wiki, > Milestones, Docstrings, etc. links) > > If I had to guess what triggered his interest, it's that the scipy Milestones link points at the numpy version yet again... Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From vincent at vincentdavis.net Mon Jul 5 10:58:09 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Mon, 5 Jul 2010 08:58:09 -0600 Subject: [SciPy-Dev] pydocweb enhancements In-Reply-To: References: Message-ID: On Mon, Jul 5, 2010 at 8:39 AM, Ralf Gommers wrote: > > > On Mon, Jul 5, 2010 at 10:28 PM, Pauli Virtanen wrote: >> >> Mon, 05 Jul 2010 07:39:00 -0600, Vincent Davis wrote: >> [clip] >> > I am not the admin, nor do I want to decide who gets to do what :-) My >> > primary interest is the ability to add pages and control/fix the links >> > in the navbar to be appropriate to the "site" without having to manually >> > edit the template. >> >> In my opinion, it's OK to restrict navbar modifications to site admins. >> (navbar = I guess Vincent's talking about the bar with the Wiki, >> Milestones, Docstrings, etc. links) Yes that is what I was talking about. >> > If I had to guess what triggered his interest, it's that the scipy > Milestones link points at the numpy version yet again... I don't think it has ever been different :-), but yes @ Pauli, It is then correct that the Milestone pages for example have been manually added. Is it possible that I can get the template and view files (are there others I want?) that have been changed for the actual running wiki mainly to see how the additional pages have been added. Like the milestone pages. 
Vincent > > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From pav at iki.fi Mon Jul 5 11:03:22 2010 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 5 Jul 2010 15:03:22 +0000 (UTC) Subject: [SciPy-Dev] pydocweb enhancements References: Message-ID: Mon, 05 Jul 2010 08:58:09 -0600, Vincent Davis wrote: [clip] > Is it possible that I can get the template and view files (are there > others I want) that have been changed for the actual running wiki mainly > to see how the additional pages have been added. Like the milestone > pages. The template (base.html) addition (the second line below) runs like this:
        <li><a href="...">Changes</a></li>
        <li><a href="...">Milestones</a></li>
        <li><a href="...">Search</a></li>

The page the Milestone link points to is just an ordinary wiki page in the wiki subsystem -- those can be created by anyone. -- Pauli Virtanen From ralf.gommers at googlemail.com Mon Jul 5 11:07:23 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Mon, 5 Jul 2010 23:07:23 +0800 Subject: [SciPy-Dev] pydocweb enhancements In-Reply-To: References: Message-ID: On Mon, Jul 5, 2010 at 11:03 PM, Pauli Virtanen wrote: > Mon, 05 Jul 2010 08:58:09 -0600, Vincent Davis wrote: > [clip] > > Is it possible that I can get the template and view files (are there > > others I want) that have been changed for the actual running wiki mainly > > to see how the additional pages have been added. Like the milestone > > pages. > > The template (base.html) addition (the second line below) > runs like this: > >
> <li><a href="...">Changes</a></li>
> <li><a href="...">Milestones</a></li>
> <li><a href="...">Search</a></li>
>
> The page the Milestone link points to is just an ordinary > wiki page in the wiki subsystem -- those can be created by anyone. > > While we're at it, I just saw that scipy -> numpy cross links are incorrect. For example on http://docs.scipy.org/scipy/docs/scipy.signal.signaltools.unique_roots/ the numpy.unique link points to: http://docs.scipy.org/numpy/scipy/docs/numpy.lib.arraysetops.unique/#numpy-unique instead of (note the extra /scipy above): http://docs.scipy.org/numpy/docs/numpy.lib.arraysetops.unique/#numpy-unique Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Mon Jul 5 11:46:24 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Mon, 5 Jul 2010 09:46:24 -0600 Subject: [SciPy-Dev] pydocweb enhancements In-Reply-To: References: Message-ID: On Mon, Jul 5, 2010 at 9:07 AM, Ralf Gommers wrote: > > > On Mon, Jul 5, 2010 at 11:03 PM, Pauli Virtanen wrote: >> >> Mon, 05 Jul 2010 08:58:09 -0600, Vincent Davis wrote: >> [clip] >> > Is it possible that I can get the template and view files (are there >> > others I want) that have been changed for the actual running wiki mainly >> > to see how the additional pages have been added. Like the milestone >> > pages. >> >> The template (base.html) addition (the second line below) >> runs like this: >>
>>         <li><a href="...">Changes</a></li>
>>         <li><a href="...">Milestones</a></li>
>>         <li><a href="...">Search</a></li>

OK

>> >> The page the Milestone link points to is just an ordinary >> wiki page in the wiki subsystem -- those can be created by anyone. Once you find the link to do so. Seems kinda hidden. The only way I can find it is edit>edit help>Create new wiki page. >> > While we're at it, I just saw that scipy -> numpy cross links are incorrect. > For example on > http://docs.scipy.org/scipy/docs/scipy.signal.signaltools.unique_roots/ the > numpy.unique link points to: > http://docs.scipy.org/numpy/scipy/docs/numpy.lib.arraysetops.unique/#numpy-unique > instead of (note the extra /scipy above): > http://docs.scipy.org/numpy/docs/numpy.lib.arraysetops.unique/#numpy-unique This link is just using the rst `numpy.unique`; wouldn't this need to be a real http: link as the numpy docs are not within the scipy docs? Vincent
> The file needs to be placed, pydocweb/docweb/admin.py > It's a small and incremental addition. > Users of the Wiki don't need to do this, correct? DG > > Vincent > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Mon Jul 5 23:33:08 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Mon, 5 Jul 2010 21:33:08 -0600 Subject: [SciPy-Dev] improved admin for pydocweb In-Reply-To: References: Message-ID: On Mon, Jul 5, 2010 at 8:14 PM, David Goldsmith wrote: > On Mon, Jul 5, 2010 at 4:34 PM, Vincent Davis > wrote: >> >> Small addition but adding the following file adds a few utilities for the >> admin. >> The file needs to be placed, pydocweb/docweb/admin.py >> It's a small and incremental addition. > > Users of the Wiki don't need to do this, correct? This is an admin-only addition; it makes it easier to manage and keep track of pages. It needs to be installed by someone who has access to the files and can restart the Django server app. > > DG >> >> Vincent >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > > > > -- > Mathematician: noun, someone who disavows certainty when their uncertainty > set is non-empty, even if that set has measure zero. > > Hope: noun, that delusive spirit which escaped Pandora's jar and, with her > lies, prevents mankind from committing a general suicide.
(As interpreted > by Robert Graves) > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From scott.sinclair.za at gmail.com Tue Jul 6 02:07:33 2010 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Tue, 6 Jul 2010 08:07:33 +0200 Subject: [SciPy-Dev] improved admin for pydocweb In-Reply-To: References: Message-ID: > On 6 July 2010 01:34, Vincent Davis wrote: > Small addition but adding the following file adds a few utilities for the admin. > The file needs to be placed, pydocweb/docweb/admin.py > It's a small and incremental addition. It's more likely to be applied if you supply it as a patch. Cheers, Scott From d.l.goldsmith at gmail.com Tue Jul 6 02:48:13 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Mon, 5 Jul 2010 23:48:13 -0700 Subject: [SciPy-Dev] improved admin for pydocweb In-Reply-To: References: Message-ID: On Mon, Jul 5, 2010 at 11:07 PM, Scott Sinclair wrote: > > On 6 July 2010 01:34, Vincent Davis wrote: > > Small addition but adding the following file adds a few utilities for the > admin. > > The file needs to be placed, pydocweb/docweb/admin.py > > It's a small and incremental addition. > > It's more likely to be applied if you supply it as a patch. > And I think only Pauli "cares." Posting here might make people think that they need to be concerned about it to use the Wiki: you can keep anything having to do w/ pydocweb--unless it significantly affects user functionality, like implementation of the two-review system--between you and him. DG > > Cheers, > Scott > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. 
Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Tue Jul 6 04:27:00 2010 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 6 Jul 2010 08:27:00 +0000 (UTC) Subject: [SciPy-Dev] improved admin for pydocweb References: Message-ID: Tue, 06 Jul 2010 08:07:33 +0200, Scott Sinclair wrote: >>On 6 July 2010 01:34, Vincent Davis wrote: >> Small addition but adding the following file adds a few utilities for >> the admin. The file needs to be placed, pydocweb/docweb/admin.py It a >> small and incremental addition. > > It's more likely to be applied if you supply it as a patch. And even more likely as a git branch. But anyway, we should keep this discussion off the Numpy/Scipy lists -- it's of limited interest to most people. Instead, use this: http://groups.google.com/group/pydocweb-discuss -- Pauli Virtanen From vincent at vincentdavis.net Tue Jul 6 09:22:35 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Tue, 6 Jul 2010 07:22:35 -0600 Subject: [SciPy-Dev] improved admin for pydocweb In-Reply-To: References: Message-ID: On Tue, Jul 6, 2010 at 2:27 AM, Pauli Virtanen wrote: > Tue, 06 Jul 2010 08:07:33 +0200, Scott Sinclair wrote: > >>>On 6 July 2010 01:34, Vincent Davis wrote: >>> Small addition but adding the following file adds a few utilities for >>> the admin. The file needs to be placed, pydocweb/docweb/admin.py It a >>> small and incremental addition. >> >> It's more likely to be applied if you supply it as a patch. > > And even more likely as a git branch. > > But anyway, we should keep this discussion off the Numpy/Scipy lists -- > it's of limited interest to most people. 
Instead, use this: > > http://groups.google.com/group/pydocweb-discuss Ok, I hear all of you :-) > > -- > Pauli Virtanen > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From matthew.brett at gmail.com Tue Jul 6 09:55:10 2010 From: matthew.brett at gmail.com (Matthew Brett) Date: Tue, 6 Jul 2010 09:55:10 -0400 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: Hi, > Actually, all you have to do (have used in both senses of the word, i.e., > are "required" to do and all that is necessary to do) is click on the log > link when viewing a docstring in the Wiki, note the person who worked on it > last, how long ago that was, and then as Ben says, if it was in the last > three months, give or take, contact them and ask them if it would be ok if > you worked on it. Sorry to come back late to this thread. I realize I may not know enough to say something sensible here - but is it possible to summarize that one major barrier to people just starting is that it can be hard to know where to start and whether you've missed some extra guidelines? I enjoyed the doc sprints in the past. I wonder whether it would be a good idea to - say - every week identify this week's top starter docstrings - docstrings that we're expecting anyone could edit. Then maybe have a time every day or at least every week when those of us who can, join an IM channel to help out anyone (including ourselves) editing docs. Maybe a little mention each week of the best new docstring - or highlight a good old one if there aren't enough new ones (or the person judging did all the writing!) - with some feedback on what's good about it. Sorry if you've thought of these...
See you, Matthew From matthew.brett at gmail.com Tue Jul 6 11:29:50 2010 From: matthew.brett at gmail.com (Matthew Brett) Date: Tue, 6 Jul 2010 11:29:50 -0400 Subject: [SciPy-Dev] Documenting distributions, advice? Message-ID: Hi, I was just playing with the Gamma distribution (scipy.stats.gamma) and I found it hard to work out how the distribution related to the standard sources on the topic. I think I'm getting there now, but I'm wondering whether there is anywhere some advice on documenting scipy.stats.distributions? For example, it looks as if there is only the 'extradoc' docstring addition for any specific distribution - is that right? Thanks a lot, Matthew From jsseabold at gmail.com Tue Jul 6 11:42:22 2010 From: jsseabold at gmail.com (Skipper Seabold) Date: Tue, 6 Jul 2010 11:42:22 -0400 Subject: [SciPy-Dev] Documenting distributions, advice? In-Reply-To: References: Message-ID: On Tue, Jul 6, 2010 at 11:29 AM, Matthew Brett wrote: > Hi, > > I was just playing with the Gamma distribution (scipy.stats.gamma) and > I found it hard to work out how the distribution related to the > standard sources on the topic. I think I'm getting there now, but > I'm wondering whether there is anywhere some advice on documenting > scipy.stats.distributions? For example, it looks as if there is only > the 'extradoc' docstring addition for any specific distribution - is > that right? > > Thanks a lot, > > Matthew After spending the last few days with the distributions trying to fix one thing, I am also wondering about this and other questions with respect to the distributions (more to come later...).
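For the gamma case Matthew raises, the mapping to the textbook shape/scale form can be written out explicitly. This is my reading of the standardized-form convention used by the distributions (worth double-checking against the source before relying on it):

```latex
% scipy.stats.gamma uses a single shape parameter a in standardized form,
% and, like the other distributions, applies loc and scale afterwards.
\[
  f(x;\, a) = \frac{x^{a-1} e^{-x}}{\Gamma(a)}, \qquad x \ge 0,
\]
\[
  \mathrm{pdf}(x;\, a, \mathrm{loc}, \mathrm{scale})
    = \frac{1}{\mathrm{scale}}\,
      f\!\left(\frac{x - \mathrm{loc}}{\mathrm{scale}};\, a\right),
\]
% so gamma with shape a and scale theta corresponds to the usual
% shape/scale parameterization Gamma(k, theta) with k = a.
```

So to compare against a source that uses the shape/scale pair (k, theta), one would set k = a and pass the scale keyword for theta; that may be the kind of note Skipper suggests collecting in a Notes section.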
At the risk of telling you things you already know, I usually start here: Then click on say 'continuous distributions' link towards the top to see the LaTeX math http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/stats/continuous.rst/#continuous-random-variables Then I go back and forth between Wikipedia to back out what our parameters mean since they are often not standard (with Wikipedia being standard, of course...). I am willing to get involved in the docs marathon with the stats stuff. Josef, Ralf, others, I know you have worked on this issue much more than I. Any thoughts for streamlining things and making the distributions a little easier to work with? I, for one, am willing to give up some of the "autodoc'ing" (and do the work that this means) if it would improve anything from a user or developer standpoint. Maybe we can add a notes section, since there is a note not to use extradoc (?), for noting things like differing parameterizations of the distributions in the literature. Skipper From d.l.goldsmith at gmail.com Tue Jul 6 12:08:35 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Tue, 6 Jul 2010 09:08:35 -0700 Subject: [SciPy-Dev] Documenting distributions, advice? In-Reply-To: References: Message-ID: At Josef's request, I marked all the distributions unimportant - he asked that they not be worked on for the time being. Josef, would you still prefer people to leave these alone for now? DG On Tue, Jul 6, 2010 at 8:42 AM, Skipper Seabold wrote: > On Tue, Jul 6, 2010 at 11:29 AM, Matthew Brett > wrote: > > Hi, > > > > I was just playing with the Gamma distribution (sicpy.stats.gamma) and > > I found it hard to work out how the distribution related to the > > standard sources on the topic. I think I'm getting there now, but > > I'm wondering whether there is anywhere some advice on documenting > > scipy.stats.distributions? 
For example, it looks as if there is only > > the 'extradoc' docstring addition for any specific distribution - is > > that right? > > > > Thanks a lot, > > > > Matthew > > After spending the last few days with the distributions trying to fix > one thing, I am also wondering about this and other questions with > respect to the distributions (more to come later...). > > At the risk of telling you things you already know, I usually start here: > > > > Then click on say 'continuous distributions' link towards the top to > see the LaTeX math > > > http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/stats/continuous.rst/#continuous-random-variables > > Then I go back and forth between Wikipedia to back out what our > parameters mean since they are often not standard (with Wikipedia > being standard, of course...). > > I am willing to get involved in the docs marathon with the stats > stuff. Josef, Ralf, others, I know you have worked on this issue much > more than I. Any thoughts for streamlining things and making the > distributions a little easier to work with? I, for one, am willing to > give up some of the "autodoc'ing" (and do the work that this means) if > it would improve anything from a user or developer standpoint. Maybe > we can add a notes section, since there is a note not to use extradoc > (?), for noting things like differing parameterizations of the > distributions in the literature. > > Skipper > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at googlemail.com Tue Jul 6 13:58:31 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 7 Jul 2010 01:58:31 +0800 Subject: [SciPy-Dev] Documenting distributions, advice? In-Reply-To: References: Message-ID: On Tue, Jul 6, 2010 at 11:42 PM, Skipper Seabold wrote: > On Tue, Jul 6, 2010 at 11:29 AM, Matthew Brett > wrote: > > Hi, > > > > I was just playing with the Gamma distribution (sicpy.stats.gamma) and > > I found it hard to work out how the distribution related to the > > standard sources on the topic. I think I'm getting there now, but > > I'm wondering whether there is anywhere some advice on documenting > > scipy.stats.distributions? For example, it looks as if there is only > > the 'extradoc' docstring addition for any specific distribution - is > > that right? > > > > Thanks a lot, > > > > Matthew > > After spending the last few days with the distributions trying to fix > one thing, I am also wondering about this and other questions with > respect to the distributions (more to come later...). > > At the risk of telling you things you already know, I usually start here: > > > > Then click on say 'continuous distributions' link towards the top to > see the LaTeX math > > > http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/stats/continuous.rst/#continuous-random-variables > > Then I go back and forth between Wikipedia to back out what our > parameters mean since they are often not standard (with Wikipedia > being standard, of course...). > > I am willing to get involved in the docs marathon with the stats > stuff. Josef, Ralf, others, I know you have worked on this issue much > more than I. Any thoughts for streamlining things and making the > distributions a little easier to work with? I, for one, am willing to > give up some of the "autodoc'ing" (and do the work that this means) if > it would improve anything from a user or developer standpoint. 
Maybe > we can add a notes section, since there is a note not to use extradoc > (?), for noting things like differing parameterizations of the > distributions in the literature. > > After the recent changes to distribution docstrings this kind of thing is easy to achieve. To add a notes section you would do (for gamma_gen for example): """A gamma function continuous random variable. %(before_notes)s Notes ----- This is some note specific to gamma which explains bla bla.... %(example)s """ In the same way you can add a custom example, references, change a parameter description, etc. And no need to give up "autodoc'ing" for the standard parts. The parts of the default docstring available for string substitution are available in distributions.docdict. Note that extradoc is still available for backwards compatibility reasons and should not be used anymore. Eventually everything in extradoc should migrate over to the new system. If you write custom docstrings like this, just check they are processed without error in the interpreter, then commit to svn (or submit a git branch / patch). The doc wiki is not aware of those string substitutions, just as it wasn't aware of extradoc before, and is therefore not very useful in this case. In the wiki distributions are marked as unimportant, which is fine. If you find any glitches or things to improve please let me know. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Tue Jul 6 14:31:50 2010 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 6 Jul 2010 18:31:50 +0000 (UTC) Subject: [SciPy-Dev] Documenting distributions, advice? References: Message-ID: Tue, 06 Jul 2010 09:08:35 -0700, David Goldsmith wrote: > At Josef's request, I marked all the distributions unimportant - he > asked that they not be worked on for the time being. Josef, would you > still prefer people to leave these alone for now? I think this only concerns the doc editor. 
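Going back to Ralf's %(before_notes)s template above: the substitution scheme can be sketched in plain Python. This is a simplified toy stand-in for scipy.misc.doccer's docdict machinery, not the actual scipy code, and the names below (filldoc, gamma_pdf, the docdict entries) are illustrative:

```python
# Toy sketch of the %(name)s docstring substitution described above.
# Shared docstring fragments live in one dictionary...
docdict = {
    'before_notes': "Parameters\n----------\nx : array_like\n    Quantiles.",
    'example': "Examples\n--------\n>>> rv = gamma(3.0)",
}

def filldoc(docdict):
    """Decorator that fills %(name)s placeholders from docdict."""
    def decorate(func):
        func.__doc__ = func.__doc__ % docdict
        return func
    return decorate

# ...and each object's docstring only writes the parts specific to it.
@filldoc(docdict)
def gamma_pdf(x, a):
    """A gamma continuous random variable (toy stand-in).

    %(before_notes)s

    Notes
    -----
    Some note specific to gamma.

    %(example)s
    """
    return None  # evaluating the pdf is not the point of this sketch

print('Quantiles' in gamma_pdf.__doc__)  # True: shared text was filled in
```

The substitution happens once, at definition time, which is why the doc wiki only ever sees the already-expanded result rather than the template.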
The docstrings can of course be worked on, if one just edits them directly in the sources, and e.g., submits patches or Git branches. Here we just have to fall back to ye olde way of doing things at the moment. -- Pauli Virtanen From d.l.goldsmith at gmail.com Tue Jul 6 14:59:41 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Tue, 6 Jul 2010 11:59:41 -0700 Subject: [SciPy-Dev] Documenting distributions, advice? In-Reply-To: References: Message-ID: On Tue, Jul 6, 2010 at 11:31 AM, Pauli Virtanen wrote: > Tue, 06 Jul 2010 09:08:35 -0700, David Goldsmith wrote: > > At Josef's request, I marked all the distributions unimportant - he > > asked that they not be worked on for the time being. Josef, would you > > still prefer people to leave these alone for now? > > I think this only concerns the doc editor. > Here's the thread: http://article.gmane.org/gmane.comp.python.scientific.devel/13298/match=josef+may+25+2010 At best, IMO, it is unclear what Josef meant vis-a-vis not editing the distribution docstrings in the Wiki vs. at all; I would ask that people be conservative in interpreting his request, i.e., refrain from touching them, 'til he has had a chance to clarify his intent. DG > > The docstrings can of course be worked on, if one just edits them > directly in the sources, and e.g., submits patches or Git branches. Here > we just have to fall back to ye olde way of doing things at the moment. > > -- > Pauli Virtanen > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at googlemail.com Tue Jul 6 15:24:39 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 7 Jul 2010 03:24:39 +0800 Subject: [SciPy-Dev] Documenting distributions, advice? In-Reply-To: References: Message-ID: On Wed, Jul 7, 2010 at 2:59 AM, David Goldsmith wrote: > On Tue, Jul 6, 2010 at 11:31 AM, Pauli Virtanen wrote: > >> Tue, 06 Jul 2010 09:08:35 -0700, David Goldsmith wrote: >> > At Josef's request, I marked all the distributions unimportant - he >> > asked that they not be worked on for the time being. Josef, would you >> > still prefer people to leave these alone for now? >> >> I think this only concerns the doc editor. >> > > Here's the thread: > > > http://article.gmane.org/gmane.comp.python.scientific.devel/13298/match=josef+may+25+2010 > > At best, IMO, it is unclear what Josef meant vis-a-vis not editing the > distribution docstrings in the Wiki vs. at all; I would ask that people be > conservative in interpreting his request, i.e., refrain from touching them, > 'til he has had a chance to clarify his intent. > > Pauli got it right, and the intent is clear. Improvements are welcome of course. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Tue Jul 6 16:21:47 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Tue, 6 Jul 2010 16:21:47 -0400 Subject: [SciPy-Dev] Documenting distributions, advice? In-Reply-To: References: Message-ID: On Tue, Jul 6, 2010 at 3:24 PM, Ralf Gommers wrote: > > > On Wed, Jul 7, 2010 at 2:59 AM, David Goldsmith > wrote: >> >> On Tue, Jul 6, 2010 at 11:31 AM, Pauli Virtanen wrote: >>> >>> Tue, 06 Jul 2010 09:08:35 -0700, David Goldsmith wrote: >>> > At Josef's request, I marked all the distributions unimportant - he >>> > asked that they not be worked on for the time being. ?Josef, would you >>> > still prefer people to leave these alone for now? >>> >>> I think this only concerns the doc editor. 
>> Here's the thread: >> >> >> http://article.gmane.org/gmane.comp.python.scientific.devel/13298/match=josef+may+25+2010 >> >> At best, IMO, it is unclear what Josef meant vis-a-vis not editing the >> distribution docstrings in the Wiki vs. at all; I would ask that people be >> conservative in interpreting his request, i.e., refrain from touching them, >> 'til he has had a chance to clarify his intent. >> > Pauli got it right, and the intent is clear. Improvements are welcome of > course. > > Cheers, > Ralf (Sorry for any delays, I'm in a beach and family time zone) Essentially, I'm not a fan of docstrings and examples that don't add much information to the generic docs but require maintenance (e.g. redoing all examples if implementation details or precision change). Although, now that several users/developers are looking at stats.distributions, maybe that won't be my problem anymore. For general explanations of the distributions, I think expanding on the formulas/definitions in scipy.stats.tutorial (splitting up the pages) would be more useful than making the distribution class docstring very long (argument by Pierre). Note I'm usually just using >>> np.source(stats.distributions.gamma_gen) >>> print stats.gamma.extradoc The rest is just repetition that I never read. "print stats.gamma.__doc__" is too much noise. Feel free to add any information that makes it more easily accessible, especially if Ralf thinks that the template system works without too much trouble. David, I think we should stay with "unimportant" for now, mainly because the basic docstrings are there, and I don't think the distributions should be advertised as interesting edits (compared to all the other work that is necessary in the scipy docs), and I would like to move slowly until a pattern of enhanced distributions docs is established.
Josef > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From d.l.goldsmith at gmail.com Tue Jul 6 17:04:23 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Tue, 6 Jul 2010 14:04:23 -0700 Subject: [SciPy-Dev] Documenting distributions, advice? In-Reply-To: References: Message-ID: On Tue, Jul 6, 2010 at 1:21 PM, wrote: > On Tue, Jul 6, 2010 at 3:24 PM, Ralf Gommers > wrote: > > > > > > On Wed, Jul 7, 2010 at 2:59 AM, David Goldsmith > > > wrote: > >> > >> On Tue, Jul 6, 2010 at 11:31 AM, Pauli Virtanen wrote: > >>> > >>> Tue, 06 Jul 2010 09:08:35 -0700, David Goldsmith wrote: > >>> > At Josef's request, I marked all the distributions unimportant - he > >>> > asked that they not be worked on for the time being. Josef, would > you > >>> > still prefer people to leave these alone for now? > >>> > >>> I think this only concerns the doc editor. > >> > >> Here's the thread: > >> > >> > >> > http://article.gmane.org/gmane.comp.python.scientific.devel/13298/match=josef+may+25+2010 > >> > >> At best, IMO, it is unclear what Josef meant vis-a-vis not editing the > >> distribution docstrings in the Wiki vs. at all; I would ask that people > be > >> conservative in interpreting his request, i.e., refrain from touching > them, > >> 'til he has had a chance to clarify his intent. > >> > > Pauli got it right, and the intent is clear. Improvements are welcome of > > course. > > > > Cheers, > > Ralf > > (Sorry for any delays, I'm in a beach and family time zone) > > Essentially, I'm not a fan of docstrings and examples that don't add > much information to the generic docs but require maintenance. (e.g > redoing all examples if implementation detals or precision changes) > Although, now that several users/developers are looking at > stats.distributions maybe that won't be my problem anymore. 
> > For general explanations to the distributions, I think expanding on > the formulas/definitions in scipy.stats.tutorial (splitting up the > pages) would be more useful than making the distribution class > docstring very long (argument by Pierre). > > Note I'm usually just using > >>> np.source(stats.distributions.gamma_gen) > >>> print stats.gamma.extradoc > > The rest is just repetition that I never read. "print > stats.gamma.__doc__" is too much noise. > > Feel free to add any information that makes it more easily accessible, > especially if Ralf thinks that the template system works without too > much problem. > > David, > I think we should stay with "unimportant" for now, mainly because the > basic docstrings are there, and I don't think the distributions should > be advertised as interesting edits (compared to all the other work > that is necessary in the scipy docs), and I would like to move slowly > until a pattern of enhanced distributions docs is established. > > Josef > OK, perhaps we don't understand the issues the same way, but I'm still getting mixed signals; Ralf, on the other hand, seems to feel he and Pauli clearly understand what you have in mind, so I'll happily defer to them. DG -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsouthey at gmail.com Tue Jul 6 17:27:09 2010 From: bsouthey at gmail.com (Bruce Southey) Date: Tue, 06 Jul 2010 16:27:09 -0500 Subject: [SciPy-Dev] scipy error compiling csr_wrap under Python 2.7 Message-ID: <4C339FAD.6010505@gmail.com> Hi, I failed to get the scipy 0.8 rc1 and SVN to build under Python2.7 with numpy '2.0.0.dev8469' and gcc version 4.4.4 20100503 (Red Hat 4.4.4-2) (GCC) . But scipy does compile with Python 2.6. 
building 'scipy.sparse.sparsetools._csr' extension compiling C++ sources C compiler: g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -fPIC compile options: '-I/usr/local/lib/python2.7/site-packages/numpy/core/include -I/usr/local/include/python2.7 -c' g++: scipy/sparse/sparsetools/csr_wrap.cxx scipy/sparse/sparsetools/csr_wrap.cxx: In function 'int require_size(PyArrayObject*, npy_intp*, int)': scipy/sparse/sparsetools/csr_wrap.cxx:2910: error: expected ')' before 'PRIdPTR' scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: spurious trailing '%' in format scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: too many arguments for format scipy/sparse/sparsetools/csr_wrap.cxx:2917: error: expected ')' before 'PRIdPTR' scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: spurious trailing '%' in format scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: too many arguments for format scipy/sparse/sparsetools/csr_wrap.cxx: In function 'int require_size(PyArrayObject*, npy_intp*, int)': scipy/sparse/sparsetools/csr_wrap.cxx:2910: error: expected ')' before 'PRIdPTR' scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: spurious trailing '%' in format scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: too many arguments for format scipy/sparse/sparsetools/csr_wrap.cxx:2917: error: expected ')' before 'PRIdPTR' scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: spurious trailing '%' in format scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: too many arguments for format error: Command "g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -I/usr/local/lib/python2.7/site-packages/numpy/core/include -I/usr/local/include/python2.7 -c scipy/sparse/sparsetools/csr_wrap.cxx -o build/temp.linux-x86_64-2.7/scipy/sparse/sparsetools/csr_wrap.o" failed with exit status 1 The error is possibly related to the variable 'NPY_INTP_FMT' (defined in numpy/core/include/numpy/ndarraytypes.h).
A grep shows these files: scipy/sparse/sparsetools/coo_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", size[i]); scipy/sparse/sparsetools/coo_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", array_size(ary,i)); scipy/sparse/sparsetools/bsr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", size[i]); scipy/sparse/sparsetools/bsr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", array_size(ary,i)); scipy/sparse/sparsetools/numpy.i: sprintf(s,"%" NPY_INTP_FMT ",", size[i]); scipy/sparse/sparsetools/numpy.i: sprintf(s,"%" NPY_INTP_FMT ",", array_size(ary,i)); scipy/sparse/sparsetools/csr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", size[i]); scipy/sparse/sparsetools/csr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", array_size(ary,i)); scipy/sparse/sparsetools/csc_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", size[i]); scipy/sparse/sparsetools/csc_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", array_size(ary,i)); scipy/sparse/sparsetools/dia_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", size[i]); scipy/sparse/sparsetools/dia_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", array_size(ary,i)); Note that the scipy/sparse/sparsetools/numpy.i contains tabs in some places instead of spaces. The ndarraytypes.h is present in /usr/local/lib/python2.7/site-packages/numpy/core/include/numpy/ Bruce From josef.pktd at gmail.com Tue Jul 6 18:14:52 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Tue, 6 Jul 2010 18:14:52 -0400 Subject: [SciPy-Dev] Documenting distributions, advice? In-Reply-To: References: Message-ID: On Tue, Jul 6, 2010 at 2:31 PM, Pauli Virtanen wrote: > Tue, 06 Jul 2010 09:08:35 -0700, David Goldsmith wrote: >> At Josef's request, I marked all the distributions unimportant - he >> asked that they not be worked on for the time being. ?Josef, would you >> still prefer people to leave these alone for now? > > I think this only concerns the doc editor. > > The docstrings can of course be worked on, if one just edits them > directly in the sources, and e.g., submits patches or Git branches. 
Here > we just have to fall back to ye olde way of doing things at the moment. As far as I understand, only the initial creation of the class docstring of a distribution needs to be committed to source, once a distribution has the docstring, then I think anyone could edit in the doc editor. http://docs.scipy.org/scipy/docs/scipy.stats.distributions.maxwell_gen/ The template itself and the generic docstring can only be edited in the source. I don't remember this part when the templating system was created: I think, because all public methods, pdf, cdf, sf,..., are inherited, they cannot be edited with version specific information. Josef > > -- > Pauli Virtanen > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From jjstickel at vcn.com Wed Jul 7 16:42:04 2010 From: jjstickel at vcn.com (Jonathan Stickel) Date: Wed, 07 Jul 2010 14:42:04 -0600 Subject: [SciPy-Dev] scikits contribution? Message-ID: <4C34E69C.9010708@vcn.com> Some time ago, I offered to contribute (on the scipy-user list) some code to smooth 1-D data by regularization: http://mail.scipy.org/pipermail/scipy-user/2010-February/024351.html Someone suggested that scikits might be the right place: http://mail.scipy.org/pipermail/scipy-user/2010-February/024408.html So I am finally looking into scikits, and I am not sure how to proceed. My code consists of several functions in a single .py file. It seems overkill to create a new scikit for just one file, but I do not see an existing scikit that matches. 'Optimization' would be the closest; in core scipy I would put it in 'interpolate'. So, what is the minimum that I need to do to create a scikit and upload my code? Any suggestions for the name of the scikit (interpolate, data_smoothing)? Please know that I am just starting to learn python, being a convert from matlab/octave. 
Although I have become fairly proficient using numpy/scipy in ipython, I do not know much about python internals, setuptools, etc. Thanks, Jonathan From charlesr.harris at gmail.com Wed Jul 7 23:45:24 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 7 Jul 2010 21:45:24 -0600 Subject: [SciPy-Dev] splev Message-ID: Hi All, I opened ticket #1223 because the values returned by splev are not zero for arguments outside of the interval of definition. Because splev evaluates b-splines, which all have compact support, I think zero is the correct value for such arguments. However, I also find that some tests assume that splev extrapolates the interpolating polynomials defined in the first and last spans, which is what splev currently does. Consequently I thought it worth bringing the topic up on the list for discussion before making the commit that changes the behavior. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From aarchiba at physics.mcgill.ca Thu Jul 8 00:06:18 2010 From: aarchiba at physics.mcgill.ca (Anne Archibald) Date: Thu, 8 Jul 2010 00:06:18 -0400 Subject: [SciPy-Dev] splev In-Reply-To: References: Message-ID: On 7 July 2010 23:45, Charles R Harris wrote: > Hi All, > > I opened ticket #1223 because the values returned by splev are not zero for > arguments outside of the interval of definition. Because splev evaluates > b-splines, which all have compact support, I think zero is the correct value > for such arguments. However, I also find that some tests assume that splev > extrapolates the interpolating polynomials defined in the first and last > spans, which is what splev currently does.? Consequently I thought it worth > bringing the topic up on the list for discussion before making the commit > that changes the behavior. Please do not make this change. The extrapolation you get now is ugly, as polynomial extrapolation always is, but a discontinuity is considerably uglier. 
Remember that the arguments and range are both floating-point, so that roundoff error can easily move an argument outside the range; if extrapolation is used this is unimportant, but if the spline flatly drops to zero this will produce wildly wrong answers. Anne > Chuck > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From m.boumans at gmx.net Thu Jul 8 00:09:49 2010 From: m.boumans at gmx.net (bowie_22) Date: Thu, 8 Jul 2010 04:09:49 +0000 (UTC) Subject: [SciPy-Dev] Problems with documentation of scipy.interpolate.dfitpack Message-ID: Hello together, I tried to contribute to the documentation of scipy. I found out that the whole scipy.interpolate.dfitpack package seems to be a little bit different. It is not possible to view the code. You get 404 File not found error. Is there no source code available for dfitpack? If so, how is the documentation handled? Regs Marcus From charlesr.harris at gmail.com Thu Jul 8 00:19:06 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Wed, 7 Jul 2010 22:19:06 -0600 Subject: [SciPy-Dev] splev In-Reply-To: References: Message-ID: On Wed, Jul 7, 2010 at 10:06 PM, Anne Archibald wrote: > On 7 July 2010 23:45, Charles R Harris wrote: > > Hi All, > > > > I opened ticket #1223 because the values returned by splev are not zero > for > > arguments outside of the interval of definition. Because splev evaluates > > b-splines, which all have compact support, I think zero is the correct > value > > for such arguments. However, I also find that some tests assume that > splev > > extrapolates the interpolating polynomials defined in the first and last > > spans, which is what splev currently does. Consequently I thought it > worth > > bringing the topic up on the list for discussion before making the commit > > that changes the behavior. > > Please do not make this change. 
The extrapolation you get now is ugly, > as polynomial extrapolation always is, but a discontinuity is > considerably uglier. Remember that the arguments and range are both > floating-point, so that roundoff error can easily move an argument > outside the range; if extrapolation is used this is unimportant, but > if the spline flatly drops to zero this will produce wildly wrong > answers. > > Well, for my purposes extrapolation really screws things up. Note that the b-splines at the ends are in fact c_{-1}, i.e., discontinuous, and the next b-spline in has a discontinuous 1'st derivative, so on and so forth. The discontinuities are mathematically correct, b-splines aren't polynomials. Maybe we can add a keyword? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Thu Jul 8 01:18:41 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Wed, 7 Jul 2010 23:18:41 -0600 Subject: [SciPy-Dev] Problems with documentation of scipy.interpolate.dfitpack In-Reply-To: References: Message-ID: On Wed, Jul 7, 2010 at 10:09 PM, bowie_22 wrote: > Hello together, > > I tried to contribute to the documentation of scipy. > I found out that the whole > > scipy.interpolate.dfitpack > > package seems to be a little bit different. > > It is not possible to view the code. > You get 404 File not found error. > > Is there no source code available for dfitpack? > If so, how is the documentation handled? I think the first line in the doc editor says it all. "This module 'dfitpack' is auto-generated with f2py (version:2_8473). 
Functions:" I think this is the source http://projects.scipy.org/scipy/browser/trunk/scipy/interpolate/src/fitpack.pyf Vincent > > Regs > > Marcus > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From josh.holbrook at gmail.com Thu Jul 8 02:02:47 2010 From: josh.holbrook at gmail.com (Joshua Holbrook) Date: Wed, 7 Jul 2010 22:02:47 -0800 Subject: [SciPy-Dev] splev In-Reply-To: References: Message-ID: On Wed, Jul 7, 2010 at 8:19 PM, Charles R Harris wrote: > > > On Wed, Jul 7, 2010 at 10:06 PM, Anne Archibald > wrote: >> >> On 7 July 2010 23:45, Charles R Harris wrote: >> > Hi All, >> > >> > I opened ticket #1223 because the values returned by splev are not zero >> > for >> > arguments outside of the interval of definition. Because splev evaluates >> > b-splines, which all have compact support, I think zero is the correct >> > value >> > for such arguments. However, I also find that some tests assume that >> > splev >> > extrapolates the interpolating polynomials defined in the first and last >> > spans, which is what splev currently does.? Consequently I thought it >> > worth >> > bringing the topic up on the list for discussion before making the >> > commit >> > that changes the behavior. >> >> Please do not make this change. The extrapolation you get now is ugly, >> as polynomial extrapolation always is, but a discontinuity is >> considerably uglier. Remember that the arguments and range are both >> floating-point, so that roundoff error can easily move an argument >> outside the range; if extrapolation is used this is unimportant, but >> if the spline flatly drops to zero this will produce wildly wrong >> answers. >> > > Well, for my purposes extrapolation really screws things up. Note that the > b-splines at the ends are in fact c_{-1}, i.e., discontinuous, and the next > b-spline in has a discontinuous 1'st derivative, so on and so forth. 
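For readers following along, the behavior at issue is easy to reproduce. A minimal sketch with splrep/splev, interpolating x**2 on [0, 5]: the interpolant reproduces x**2, so extending the end polynomial span continues it smoothly, while the compact-support reading would return 0 outside the interval.

```python
import numpy as np
from scipy.interpolate import splrep, splev

# Interpolating cubic spline through samples of x**2 on [0, 5].
x = np.linspace(0.0, 5.0, 11)
tck = splrep(x, x**2, s=0)

# Inside the interval of definition the spline reproduces the data.
inside = float(splev(2.5, tck))    # ~6.25, since 2.5 is a data point

# Outside the interval splev extrapolates the end polynomial span
# instead of returning 0, the value compact support would suggest.
outside = float(splev(6.0, tck))   # ~36, not 0
print(inside, outside)
```

The keyword proposed later in the thread would let callers pick between the two conventions without changing existing code.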
The > discontinuities are mathematically correct, b-splines aren't polynomials. > Maybe we can add a keyword? > > Chuck > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > I personally agree with Anne. While discontinuities may be technically correct, I think the current behavior is more forgiving. If this is noted in the docs, I say things are g2g. Chuck: Is checking the interval yourself prohibitive? --Josh From ralf.gommers at googlemail.com Thu Jul 8 06:58:36 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 8 Jul 2010 18:58:36 +0800 Subject: [SciPy-Dev] Documenting distributions, advice? In-Reply-To: References: Message-ID: On Wed, Jul 7, 2010 at 6:14 AM, wrote: > On Tue, Jul 6, 2010 at 2:31 PM, Pauli Virtanen wrote: > > Tue, 06 Jul 2010 09:08:35 -0700, David Goldsmith wrote: > >> At Josef's request, I marked all the distributions unimportant - he > >> asked that they not be worked on for the time being. Josef, would you > >> still prefer people to leave these alone for now? > > > > I think this only concerns the doc editor. > > > > The docstrings can of course be worked on, if one just edits them > > directly in the sources, and e.g., submits patches or Git branches. Here > > we just have to fall back to ye olde way of doing things at the moment. > > As far as I understand, only the initial creation of the class docstring of > a > distribution needs to be committed to source, once a distribution has > the docstring, then I think anyone could edit in the doc editor. > > http://docs.scipy.org/scipy/docs/scipy.stats.distributions.maxwell_gen/ > Good point, I hadn't actually realized this. The maxwell docstring shows the expanded version and maxwell_gen the version with %()s's. The downside is still that you can only see the final expanded result after the _gen version has been committed to svn. 
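For illustration, the %()s templating Ralf describes is ordinary Python %-formatting against a dictionary of shared fragments. A toy sketch (fragment names and contents here are invented, not scipy's actual docdict):

```python
# Shared docstring fragments live in one dict and get spliced into
# per-distribution templates via %-formatting with mapping keys.
docdict = {
    'shape_params': "a : float\n    Shape parameter.",
    'notes': "The probability density is defined for x >= 0.",
}

template = """A Maxwell-like distribution (toy example).

Parameters
----------
%(shape_params)s

Notes
-----
%(notes)s
"""

expanded = template % docdict
print(expanded)
```

scipy.misc.doccer wraps this pattern with helpers such as docformat, and the expansion happens at import time, which is why the editor only shows the final result once the _gen version is committed.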
Therefore a git branch may still be the way to go. > > The template itself and the generic docstring can only be edited in the > source. > > Correct. > I don't remember this part when the templating system was created: I think, > because all public methods, pdf, cdf, sf,..., are inherited, they cannot be > edited with version specific information. > > If you mean distname.pdf.__doc__ you're right, but you can certainly customize the pdf/cdf/etc info in distname.__doc__. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Thu Jul 8 07:09:45 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 8 Jul 2010 19:09:45 +0800 Subject: [SciPy-Dev] Documenting distributions, advice? In-Reply-To: References: Message-ID: On Wed, Jul 7, 2010 at 4:21 AM, wrote: > > Note I'm usually just using > >>> np.source(stats.distributions.gamma_gen) > >>> print stats.gamma.extradoc > > The rest is just repetition that I never read. "print > stats.gamma.__doc__" is too much noise. > > Well, if the SNR improves you may change that habit:) If things are unclear, like in the case of gamma that this thread started with, the docstring is the only obvious place to clarify them. Users are just not going to guess that the html/pdf docs contain a tutorial with essential info missing from the docstring itself. > Feel free to add any information that makes it more easily accessible, > especially if Ralf thinks that the template system works without too > much problem. > I haven't seen problems yet. If there are any they'll be ironed out quickly enough once the system is exercised a bit more. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at googlemail.com Thu Jul 8 07:33:47 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 8 Jul 2010 19:33:47 +0800 Subject: [SciPy-Dev] scipy error compiling csr_wrap under Python 2.7 In-Reply-To: <4C339FAD.6010505@gmail.com> References: <4C339FAD.6010505@gmail.com> Message-ID: On Wed, Jul 7, 2010 at 5:27 AM, Bruce Southey wrote: > Hi, > I failed to get the scipy 0.8 rc1 and SVN to build under Python2.7 with > numpy '2.0.0.dev8469' and gcc version 4.4.4 20100503 (Red Hat 4.4.4-2) > (GCC) . But scipy does compile with Python 2.6. > > If this is easy to fix on linux (and someone comes up with that fix) it makes sense to do so. On OS X the problems are more serious and it looks like they require changes to numpy.distutils first, so it's not going to work with numpy 1.4.1 in any case. So full support for 2.7 on all platforms is not going to happen for the 0.8.0 release. This is http://projects.scipy.org/scipy/ticket/1180 by the way. Ralf building 'scipy.sparse.sparsetools._csr' extension > compiling C++ sources > C compiler: g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv > -O3 -Wall -fPIC > > compile options: > '-I/usr/local/lib/python2.7/site-packages/numpy/core/include > -I/usr/local/include/python2.7 -c' > g++: scipy/sparse/sparsetools/csr_wrap.cxx > scipy/sparse/sparsetools/csr_wrap.cxx: In function 'int > require_size(PyArrayObject*, npy_intp*, int)': > scipy/sparse/sparsetools/csr_wrap.cxx:2910: error: expected ')' before > 'PRIdPTR' > scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: spurious trailing > '%' in format > scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: too many arguments > for format > scipy/sparse/sparsetools/csr_wrap.cxx:2917: error: expected ')' before > 'PRIdPTR' > scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: spurious trailing > '%'
in format > scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: too many arguments > for format > scipy/sparse/sparsetools/csr_wrap.cxx: In function 'int > require_size(PyArrayObject*, npy_intp*, int)': > scipy/sparse/sparsetools/csr_wrap.cxx:2910: error: expected ')' before > 'PRIdPTR' > scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: spurious trailing > '%' in format > scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: too many arguments > for format > scipy/sparse/sparsetools/csr_wrap.cxx:2917: error: expected ')' before > 'PRIdPTR' > scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: spurious trailing > '%' in format > scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: too many arguments > for format > error: Command "g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g > -fwrapv -O3 -Wall -fPIC > -I/usr/local/lib/python2.7/site-packages/numpy/core/include > -I/usr/local/include/python2.7 -c scipy/sparse/sparsetools/csr_wrap.cxx > -o build/temp.linux-x86_64-2.7/scipy/sparse/sparsetools/csr_wrap.o" > failed with exit status 1 > > The error is possibly related to the variable 'NPY_INTP_FMT' (defined in > numpy/core/include/numpy/ndarraytypes.h).
A grep shows these files: > scipy/sparse/sparsetools/coo_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > size[i]); > scipy/sparse/sparsetools/coo_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > scipy/sparse/sparsetools/bsr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > size[i]); > scipy/sparse/sparsetools/bsr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > scipy/sparse/sparsetools/numpy.i: sprintf(s,"%" NPY_INTP_FMT ",", size[i]); > scipy/sparse/sparsetools/numpy.i: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > scipy/sparse/sparsetools/csr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > size[i]); > scipy/sparse/sparsetools/csr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > scipy/sparse/sparsetools/csc_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > size[i]); > scipy/sparse/sparsetools/csc_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > scipy/sparse/sparsetools/dia_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > size[i]); > scipy/sparse/sparsetools/dia_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > > Note that the scipy/sparse/sparsetools/numpy.i contains tabs in some > places instead of spaces. > > The ndarraytypes.h is present in > /usr/local/lib/python2.7/site-packages/numpy/core/include/numpy/ > > Bruce > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Thu Jul 8 08:17:20 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 8 Jul 2010 20:17:20 +0800 Subject: [SciPy-Dev] arccosh/arctanh test precision Message-ID: The arccosh test precision is set a bit too high for real input - it's kept at the default level which is 5*eps. Setting it to 5e-14 solves the issue on 32-bit Windows, which is OK I think. 
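The check behind these numbers boils down to a worst-case relative-error comparison. Schematically (a simplified sketch, not the actual harness in special/tests/test_data.py):

```python
import numpy as np

def check_rdiff(computed, expected, rtol):
    """Fail when the worst-case relative error exceeds rtol."""
    computed = np.atleast_1d(np.asarray(computed, dtype=float))
    expected = np.atleast_1d(np.asarray(expected, dtype=float))
    rdiff = np.max(np.abs(computed - expected) / np.abs(expected))
    assert rdiff <= rtol, "Max |rdiff|: %g" % rdiff

eps = np.finfo(float).eps   # ~2.22e-16
# The default tolerance of 5*eps (~1.11e-15) is tighter than the
# 2.44e-14 relative error observed on 32-bit Windows, hence the
# failure; the proposed 5e-14 accommodates it.
check_rdiff(1.0 + 2.44e-14, 1.0, 5e-14)   # passes
```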
On 64-bit Windows the results are further off (see below) which could be considered an actual problem. The arctanh result also looks bad. So I propose to change the 32-bit test precision only. For the curious, it's line 34 of special/tests/test_data.py. Is the above fine? Should I open a ticket for 64-bit? Thanks, Ralf 32-bit-Windows: ======================================== FAIL: test_data.test_boost(,) ---------------------------------------------------------------------- <...> Max |adiff|: 1.77636e-15 Max |rdiff|: 2.44233e-14 64-bit Windows (from the binaries by Christoph Gohlke, built with msvc9): ======================================== FAIL: test_data.test_boost(,) ---------------------------------------------------------------------- <...> AssertionError: Max |adiff|: 1.77636e-15 Max |rdiff|: 1.09352e-13 <...> ====================================================================== FAIL: test_data.test_boost(,) ---------------------------------------------------------------------- <...> AssertionError: Max |adiff|: 6.39488e-12 Max |rdiff|: 1.01982e-12 <...> -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Thu Jul 8 08:24:25 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Thu, 8 Jul 2010 08:24:25 -0400 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: <4C34E69C.9010708@vcn.com> References: <4C34E69C.9010708@vcn.com> Message-ID: On Wed, Jul 7, 2010 at 4:42 PM, Jonathan Stickel wrote: > Some time ago, I offered to contribute (on the scipy-user list) some > code to smooth 1-D data by regularization: > > http://mail.scipy.org/pipermail/scipy-user/2010-February/024351.html > > Someone suggested that scikits might be the right place: > > http://mail.scipy.org/pipermail/scipy-user/2010-February/024408.html > > So I am finally looking into scikits, and I am not sure how to proceed. > ?My code consists of several functions in a single .py file. 
It seems > overkill to create a new scikit for just one file, but I do not see an > existing scikit that matches. 'Optimization' would be the closest; in > core scipy I would put it in 'interpolate'. > > So, what is the minimum that I need to do to create a scikit and upload > my code? Any suggestions for the name of the scikit (interpolate, > data_smoothing)? The easiest to get started is to copy the setup structure from another scikit. I think the template scikit in scikits svn is a bit out of date, the last time I looked. If you think your model could form the basis for enhancing the smoother or noisy interpolation category in scipy, then a scikits would be the best way, as we discussed. If you want to add it to an existing scikits, then statsmodels would be a possibility. Although statsmodels is more oriented towards multivariate approaches, I think a smoother category, together with some non-parametric methods, e.g. the existing kernel regression, would be an appropriate fit. There is a need for smoothers in gam, Generalized Additive Models, but that one is not cleaned up yet. And I think there will be more applications where it would be useful to share the cross-validation code as far as possible. Josef > > Please know that I am just starting to learn python, being a convert > from matlab/octave. Although I have become fairly proficient using > numpy/scipy in ipython, I do not know much about python internals, > setuptools, etc.
> > Thanks, > Jonathan > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From charlesr.harris at gmail.com Thu Jul 8 08:59:33 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Thu, 8 Jul 2010 06:59:33 -0600 Subject: [SciPy-Dev] splev In-Reply-To: References: Message-ID: On Thu, Jul 8, 2010 at 12:02 AM, Joshua Holbrook wrote: > On Wed, Jul 7, 2010 at 8:19 PM, Charles R Harris > wrote: > > > > > > On Wed, Jul 7, 2010 at 10:06 PM, Anne Archibald < > aarchiba at physics.mcgill.ca> > > wrote: > >> > >> On 7 July 2010 23:45, Charles R Harris > wrote: > >> > Hi All, > >> > > >> > I opened ticket #1223 because the values returned by splev are not > zero > >> > for > >> > arguments outside of the interval of definition. Because splev > evaluates > >> > b-splines, which all have compact support, I think zero is the correct > >> > value > >> > for such arguments. However, I also find that some tests assume that > >> > splev > >> > extrapolates the interpolating polynomials defined in the first and > last > >> > spans, which is what splev currently does. Consequently I thought it > >> > worth > >> > bringing the topic up on the list for discussion before making the > >> > commit > >> > that changes the behavior. > >> > >> Please do not make this change. The extrapolation you get now is ugly, > >> as polynomial extrapolation always is, but a discontinuity is > >> considerably uglier. Remember that the arguments and range are both > >> floating-point, so that roundoff error can easily move an argument > >> outside the range; if extrapolation is used this is unimportant, but > >> if the spline flatly drops to zero this will produce wildly wrong > >> answers. > >> > > > > Well, for my purposes extrapolation really screws things up. 
Note that > the > > b-splines at the ends are in fact c_{-1}, i.e., discontinuous, and the > next > > b-spline in has a discontinuous 1'st derivative, so on and so forth. The > > discontinuities are mathematically correct, b-splines aren't polynomials. > > Maybe we can add a keyword? > > > > Chuck > > > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > > I personally agree with Anne. While discontinuities may be technically > correct, I think the current behavior is more forgiving. If this is > noted in the docs, I say things are g2g. > > Chuck: Is checking the interval yourself prohibitive? > > It's inconvenient and time consuming. I'm leaning towards adding a keyword so that the user can choose one or the other behavior, say extrapolation=True, which will not impact current usage. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Thu Jul 8 09:05:11 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Thu, 8 Jul 2010 07:05:11 -0600 Subject: [SciPy-Dev] scipy error compiling csr_wrap under Python 2.7 In-Reply-To: References: <4C339FAD.6010505@gmail.com> Message-ID: On Thu, Jul 8, 2010 at 5:33 AM, Ralf Gommers wrote: > > > On Wed, Jul 7, 2010 at 5:27 AM, Bruce Southey wrote: >> >> Hi, >> I failed to get the scipy 0.8 rc1 and SVN to build under Python2.7 with >> numpy '2.0.0.dev8469' and gcc version 4.4.4 20100503 (Red Hat 4.4.4-2) >> (GCC) . But scipy does compile with Python 2.6. >> > If this is easy to fix on linux (and someone comes up with that fix) it > makes sense to do so. On OS X the problems are more serious and it looks > like they require changes to numpy.distutils first, so it's not going to > work with numpy 1.4.1 in any case. So full support for 2.7 on all platforms > is not going to happen for the 0.8.0 release. 
Not that I know what I am doing but I do have recent numpy and scipy installed on oxs running py2.7. I did use numscon and installed 64bit, I get 4 errors and 2 failures. Lots of warnings, "Warning: invalid value encountered in absolute" not sure what those are about. >>> scipy.test() Running unit tests for scipy NumPy version 2.0.0.dev NumPy is installed in /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy SciPy version 0.9.0.dev SciPy is installed in /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy Python version 2.7rc2+ (trunk, Jun 27 2010, 21:30:09) [GCC 4.2.1 (Apple Inc. build 5659)] nose version 0.11.3 ====================================================================== ERROR: test_decomp.test_lapack_misaligned(, (array([[ 1.734e-255, 8.189e-217, 4.025e-178, 1.903e-139, 9.344e-101, 4.422e-062, 2.169e-023, 1.028e+016, 5.035e+054, 2.388e+093], [ 1.169e+132, 5.549e+170, 2.713e+209, 1.290e+248, 6.299e+286, -9.272e-292, -4.524e-253, -2.155e-214, -1.050e-175, -5.007e-137], [ -2.438e-098, -1.163e-059, -5.659e-021, -2.703e+018, -1.314e+057, -6.281e+095, -3.049e+134, -1.459e+173, -7.078e+211, -3.391e+250], [ -1.643e+289, 3.524e-294, 1.734e-255, 8.189e-217, 4.025e-178, 1.903e-139, 9.344e-101, 4.422e-062, 2.169e-023, 1.028e+016], [ 5.035e+054, 2.388e+093, 1.169e+132, 5.549e+170, 2.713e+209, 1.290e+248, 6.299e+286, -9.272e-292, -4.524e-253, -2.155e-214], [ -1.050e-175, -5.007e-137, -2.438e-098, -1.163e-059, -5.659e-021, -2.703e+018, -1.314e+057, -6.281e+095, -3.049e+134, -1.459e+173], [ -7.078e+211, -3.391e+250, -1.643e+289, 3.524e-294, 1.734e-255, 8.189e-217, 4.025e-178, 1.903e-139, 9.344e-101, 4.422e-062], [ 2.169e-023, 1.028e+016, 5.035e+054, 2.388e+093, 1.169e+132, 5.549e+170, 2.713e+209, 1.290e+248, 6.299e+286, -9.272e-292], [ -4.524e-253, -2.155e-214, -1.050e-175, -5.007e-137, -2.438e-098, -1.163e-059, -5.659e-021, -2.703e+018, -1.314e+057, -6.281e+095], [ -3.049e+134, -1.459e+173, -7.078e+211, 
-3.391e+250, -1.643e+289, 3.524e-294, 1.734e-255, 8.189e-217, 4.025e-178, 1.903e-139]]), array([ NaN, NaN, NaN, NaN, NaN, -3.264e+62, 6.682e+23, 9.710e-15, -1.988e-53, -2.520e-94])), {'overwrite_a': True, 'overwrite_b': True}) test_decomp.test_lapack_misaligned(, (array([[ 1.734e-255, 8.189e-217, 4.025e-178, 1.903e-139, 9.344e-101, ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/nose/case.py", line 186, in runTest self.test(*self.arg) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/linalg/tests/test_decomp.py", line 1074, in check_lapack_misaligned func(*a,**kwargs) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/linalg/basic.py", line 49, in solve a1, b1 = map(asarray_chkfinite,(a,b)) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/lib/function_base.py", line 528, in asarray_chkfinite "array must not contain infs or NaNs") ValueError: array must not contain infs or NaNs ====================================================================== ERROR: test_complex_nonsymmetric_modes (test_arpack.TestEigenComplexNonSymmetric) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 267, in test_complex_nonsymmetric_modes self.eval_evec(m,typ,k,which) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 248, in eval_evec eval,evec=eigen(a,k,which=which) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 397, in eigen params.iterate() File 
"/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 222, in iterate raise RuntimeError("Error info=%d in arpack" % self.info) RuntimeError: Error info=-8 in arpack ====================================================================== ERROR: test_nonsymmetric_modes (test_arpack.TestEigenNonSymmetric) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 204, in test_nonsymmetric_modes self.eval_evec(m,typ,k,which) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 186, in eval_evec eval,evec=eigen(a,k,which=which,**kwds) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 397, in eigen params.iterate() File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 222, in iterate raise RuntimeError("Error info=%d in arpack" % self.info) RuntimeError: Error info=-8 in arpack ====================================================================== ERROR: test_starting_vector (test_arpack.TestEigenNonSymmetric) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 214, in test_starting_vector self.eval_evec(self.symmetric[0],typ,k,which='LM',v0=v0) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 186, in eval_evec eval,evec=eigen(a,k,which=which,**kwds) File 
"/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 397, in eigen params.iterate() File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 222, in iterate raise RuntimeError("Error info=%d in arpack" % self.info) RuntimeError: Error info=-8 in arpack ====================================================================== FAIL: test_x_stride (test_fblas.TestCgemv) ---------------------------------------------------------------------- Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/linalg/tests/test_fblas.py", line 348, in test_x_stride assert_array_almost_equal(desired_y,y) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/testing/utils.py", line 774, in assert_array_almost_equal header='Arrays are not almost equal') File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/testing/utils.py", line 618, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 33.3333333333%) x: array([ 2.58077621 -2.58077621j, -15.25132084+17.25131989j, 16.81481171-12.81481171j], dtype=complex64) y: array([ 2.58077621 -2.58077621j, -15.25132084+17.25132179j, 16.81481171-12.81481171j], dtype=complex64) >> raise AssertionError('\nArrays are not almost equal\n\n(mismatch 33.3333333333%)\n x: array([ 2.58077621 -2.58077621j, -15.25132084+17.25131989j,\n 16.81481171-12.81481171j], dtype=complex64)\n y: array([ 2.58077621 -2.58077621j, -15.25132084+17.25132179j,\n 16.81481171-12.81481171j], dtype=complex64)') ====================================================================== FAIL: test_complex_symmetric_modes (test_arpack.TestEigenComplexSymmetric) ---------------------------------------------------------------------- Traceback (most recent 
call last): File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 156, in test_complex_symmetric_modes self.eval_evec(self.symmetric[0],typ,k,which) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_arpack.py", line 145, in eval_evec assert_array_almost_equal(eval,exact_eval,decimal=_ndigits[typ]) File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/testing/utils.py", line 774, in assert_array_almost_equal header='Arrays are not almost equal') File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/testing/utils.py", line 618, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 100.0%) x: array([ 1.07188725 +6.23436023e-08j, 4.91291142 -3.25412906e-08j], dtype=complex64) y: array([ 5.+0.j, 6.+0.j], dtype=complex64) >> raise AssertionError('\nArrays are not almost equal\n\n(mismatch 100.0%)\n x: array([ 1.07188725 +6.23436023e-08j, 4.91291142 -3.25412906e-08j], dtype=complex64)\n y: array([ 5.+0.j, 6.+0.j], dtype=complex64)') ---------------------------------------------------------------------- Ran 4612 tests in 153.836s FAILED (KNOWNFAIL=11, SKIP=35, errors=4, failures=2) Vincent > > This is http://projects.scipy.org/scipy/ticket/1180 by the way. > > Ralf > > >> building 'scipy.sparse.sparsetools._csr' extension >> compiling C++ sources >> C compiler: g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv >> -O3 -Wall -fPIC >> >> compile options: >> '-I/usr/local/lib/python2.7/site-packages/numpy/core/include >> -I/usr/local/include/python2.7 -c' >> g++: scipy/sparse/sparsetools/csr_wrap.cxx >> scipy/sparse/sparsetools/csr_wrap.cxx: In function ?int >> require_size(PyArrayObject*, npy_intp*, int)?: >> scipy/sparse/sparsetools/csr_wrap.cxx:2910: error: expected ?)? 
before >> ?PRIdPTR? >> scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: spurious trailing >> ?%? in format >> scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: too many arguments >> for format >> scipy/sparse/sparsetools/csr_wrap.cxx:2917: error: expected ?)? before >> ?PRIdPTR? >> scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: spurious trailing >> ?%? in format >> scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: too many arguments >> for format >> scipy/sparse/sparsetools/csr_wrap.cxx: In function ?int >> require_size(PyArrayObject*, npy_intp*, int)?: >> scipy/sparse/sparsetools/csr_wrap.cxx:2910: error: expected ?)? before >> ?PRIdPTR? >> scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: spurious trailing >> ?%? in format >> scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: too many arguments >> for format >> scipy/sparse/sparsetools/csr_wrap.cxx:2917: error: expected ?)? before >> ?PRIdPTR? >> scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: spurious trailing >> ?%? in format >> scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: too many arguments >> for format >> error: Command "g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g >> -fwrapv -O3 -Wall -fPIC >> -I/usr/local/lib/python2.7/site-packages/numpy/core/include >> -I/usr/local/include/python2.7 -c scipy/sparse/sparsetools/csr_wrap.cxx >> -o build/temp.linux-x86_64-2.7/scipy/sparse/sparsetools/csr_wrap.o" >> failed with exit status 1 >> >> The error is possibly related the variable 'NPY_INTP_FMT' (defined in >> numpy/core/include/numpy/ndarraytypes.h). 
A grep shows these files: >> scipy/sparse/sparsetools/coo_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", >> size[i]); >> scipy/sparse/sparsetools/coo_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", >> array_size(ary,i)); >> scipy/sparse/sparsetools/bsr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", >> size[i]); >> scipy/sparse/sparsetools/bsr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", >> array_size(ary,i)); >> scipy/sparse/sparsetools/numpy.i: sprintf(s,"%" NPY_INTP_FMT ",", >> size[i]); >> scipy/sparse/sparsetools/numpy.i: sprintf(s,"%" NPY_INTP_FMT ",", >> array_size(ary,i)); >> scipy/sparse/sparsetools/csr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", >> size[i]); >> scipy/sparse/sparsetools/csr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", >> array_size(ary,i)); >> scipy/sparse/sparsetools/csc_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", >> size[i]); >> scipy/sparse/sparsetools/csc_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", >> array_size(ary,i)); >> scipy/sparse/sparsetools/dia_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", >> size[i]); >> scipy/sparse/sparsetools/dia_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", >> array_size(ary,i)); >> >> Note that the scipy/sparse/sparsetools/numpy.i contains tabs in some >> places instead of spaces. 
>> >> The ndarraytypes.h is present in >> /usr/local/lib/python2.7/site-packages/numpy/core/include/numpy/ >> >> Bruce >> >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From ralf.gommers at googlemail.com Thu Jul 8 09:40:07 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 8 Jul 2010 21:40:07 +0800 Subject: [SciPy-Dev] scipy error compiling csr_wrap under Python 2.7 In-Reply-To: References: <4C339FAD.6010505@gmail.com> Message-ID: On Thu, Jul 8, 2010 at 9:05 PM, Vincent Davis wrote: > On Thu, Jul 8, 2010 at 5:33 AM, Ralf Gommers > wrote: > > > > > > On Wed, Jul 7, 2010 at 5:27 AM, Bruce Southey > wrote: > >> > >> Hi, > >> I failed to get the scipy 0.8 rc1 and SVN to build under Python2.7 with > >> numpy '2.0.0.dev8469' and gcc version 4.4.4 20100503 (Red Hat 4.4.4-2) > >> (GCC) . But scipy does compile with Python 2.6. > >> > > If this is easy to fix on linux (and someone comes up with that fix) it > > makes sense to do so. On OS X the problems are more serious and it looks > > like they require changes to numpy.distutils first, so it's not going to > > work with numpy 1.4.1 in any case. So full support for 2.7 on all > platforms > > is not going to happen for the 0.8.0 release. > > Not that I know what I am doing but I do have recent numpy and scipy > installed on oxs running py2.7. I did use numscon and installed 64bit, > I get 4 errors and 2 failures. Lots of warnings, "Warning: invalid > value encountered in absolute" not sure what those are about. > > Those errors are all known, so all seems fine. How did you install Python 2.7? I see your python says it's compiled with gcc 4.2, while I remember my scipy builds picking up 4.0 by default. 
Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Thu Jul 8 09:50:25 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Thu, 8 Jul 2010 07:50:25 -0600 Subject: [SciPy-Dev] scipy error compiling csr_wrap under Python 2.7 In-Reply-To: References: <4C339FAD.6010505@gmail.com> Message-ID: On Thu, Jul 8, 2010 at 7:40 AM, Ralf Gommers wrote: > > > On Thu, Jul 8, 2010 at 9:05 PM, Vincent Davis > wrote: >> >> On Thu, Jul 8, 2010 at 5:33 AM, Ralf Gommers >> wrote: >> > >> > >> > On Wed, Jul 7, 2010 at 5:27 AM, Bruce Southey >> > wrote: >> >> >> >> Hi, >> >> I failed to get the scipy 0.8 rc1 and SVN to build under Python2.7 with >> >> numpy '2.0.0.dev8469' and gcc version 4.4.4 20100503 (Red Hat 4.4.4-2) >> >> (GCC) . But scipy does compile with Python 2.6. >> >> >> > If this is easy to fix on linux (and someone comes up with that fix) it >> > makes sense to do so. On OS X the problems are more serious and it looks >> > like they require changes to numpy.distutils first, so it's not going to >> > work with numpy 1.4.1 in any case. So full support for 2.7 on all >> > platforms >> > is not going to happen for the 0.8.0 release. >> >> Not that I know what I am doing but I do have recent numpy and scipy >> installed on oxs running py2.7. I did use numscon and installed 64bit, >> I get 4 errors and 2 failures. Lots of warnings, "Warning: invalid >> value encountered in absolute" not sure what those are about. >> > Those errors are all known, so all seems fine. How did you install Python > 2.7? I see your python says it's compiled with gcc 4.2, while I remember my > scipy builds picking up 4.0 by default. Well it was not skill or knowledge :-) pure blind trial and error but it seems simple now. for python 2.7 from source ./configure --with-universal-archs=64-bit --enable-universalsdk=/Developer/SDKs/MacOSX10.5.sdk --enable-framework then make, make install I have been taking notes so I can repeat it. 
More details here (numpy, scipy) http://vincentdavis.net/installing-python By the way what are "Warning: invalid value encountered in absolute" warnings I tried to find out more but didn't find much. Vincent > > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From ralf.gommers at googlemail.com Thu Jul 8 11:34:51 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 8 Jul 2010 23:34:51 +0800 Subject: [SciPy-Dev] scipy error compiling csr_wrap under Python 2.7 In-Reply-To: References: <4C339FAD.6010505@gmail.com> Message-ID: On Thu, Jul 8, 2010 at 9:50 PM, Vincent Davis wrote: > On Thu, Jul 8, 2010 at 7:40 AM, Ralf Gommers > wrote: > > > > > > On Thu, Jul 8, 2010 at 9:05 PM, Vincent Davis > > wrote: > >> > >> On Thu, Jul 8, 2010 at 5:33 AM, Ralf Gommers > >> wrote: > >> > > >> > > >> > On Wed, Jul 7, 2010 at 5:27 AM, Bruce Southey > >> > wrote: > >> >> > >> >> Hi, > >> >> I failed to get the scipy 0.8 rc1 and SVN to build under Python2.7 > with > >> >> numpy '2.0.0.dev8469' and gcc version 4.4.4 20100503 (Red Hat > 4.4.4-2) > >> >> (GCC) . But scipy does compile with Python 2.6. > >> >> > >> > If this is easy to fix on linux (and someone comes up with that fix) > it > >> > makes sense to do so. On OS X the problems are more serious and it > looks > >> > like they require changes to numpy.distutils first, so it's not going > to > >> > work with numpy 1.4.1 in any case. So full support for 2.7 on all > >> > platforms > >> > is not going to happen for the 0.8.0 release. > >> > >> Not that I know what I am doing but I do have recent numpy and scipy > >> installed on oxs running py2.7. I did use numscon and installed 64bit, > >> I get 4 errors and 2 failures. Lots of warnings, "Warning: invalid > >> value encountered in absolute" not sure what those are about. > >> > > Those errors are all known, so all seems fine. 
How did you install Python > > 2.7? I see your python says it's compiled with gcc 4.2, while I remember > my > > scipy builds picking up 4.0 by default. > > Well it was not skill or knowledge :-) pure blind trial and error but > it seems simple now. > for python 2.7 from source > ./configure --with-universal-archs=64-bit > --enable-universalsdk=/Developer/SDKs/MacOSX10.5.sdk > --enable-framework > then make, make install > > I have been taking notes so I can repeat it. More details here (numpy, > scipy) > http://vincentdavis.net/installing-python > Interesting. When I try the same build command for numpy with either of the > 2.7 binaries from python.org (both built with gcc-4.0), numscons aborts > after failing this test: > # We check declaration AND type because that's how distutils does it. > if config.CheckDeclaration('PY_LONG_LONG', includes = '#include <Python.h>\n'): > st = config.CheckTypeSize('PY_LONG_LONG', > includes = '#include <Python.h>\n') > assert not st == 0 > > And setup.py install fails because it complains about missing x86_64 > architecture. > By the way what are "Warning: invalid value encountered in absolute" > warnings I tried to find out more but didn't find much. > Vincent > > Those are from ufuncs operating on masked arrays - can't remember why it was > hard to fix, but it's a known (and annoying) issue. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsouthey at gmail.com Thu Jul 8 11:55:25 2010 From: bsouthey at gmail.com (Bruce Southey) Date: Thu, 08 Jul 2010 10:55:25 -0500 Subject: [SciPy-Dev] scipy error compiling csr_wrap under Python 2.7 In-Reply-To: References: <4C339FAD.6010505@gmail.com> Message-ID: <4C35F4ED.5030605@gmail.com> On 07/08/2010 06:33 AM, Ralf Gommers wrote: > > > On Wed, Jul 7, 2010 at 5:27 AM, Bruce Southey > wrote: > > Hi, > I failed to get the scipy 0.8 rc1 and SVN to build under Python2.7 > with > numpy '2.0.0.dev8469' and gcc version 4.4.4 20100503 (Red Hat 4.4.4-2) > (GCC) .
But scipy does compile with Python 2.6. > > If this is easy to fix on linux (and someone comes up with that fix) > it makes sense to do so. On OS X the problems are more serious and it > looks like they require changes to numpy.distutils first, so it's not > going to work with numpy 1.4.1 in any case. So full support for 2.7 on > all platforms is not going to happen for the 0.8.0 release. > > This is http://projects.scipy.org/scipy/ticket/1180 by the way. > > Ralf > > > building 'scipy.sparse.sparsetools._csr' extension > compiling C++ sources > C compiler: g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g > -fwrapv > -O3 -Wall -fPIC > > compile options: > '-I/usr/local/lib/python2.7/site-packages/numpy/core/include > -I/usr/local/include/python2.7 -c' > g++: scipy/sparse/sparsetools/csr_wrap.cxx > scipy/sparse/sparsetools/csr_wrap.cxx: In function 'int > require_size(PyArrayObject*, npy_intp*, int)': > scipy/sparse/sparsetools/csr_wrap.cxx:2910: error: expected ')' before > 'PRIdPTR' > scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: spurious trailing > '%' in format > scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: too many > arguments > for format > scipy/sparse/sparsetools/csr_wrap.cxx:2917: error: expected ')' before > 'PRIdPTR' > scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: spurious trailing > '%'
in format > scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: too many > arguments > for format > scipy/sparse/sparsetools/csr_wrap.cxx: In function 'int > require_size(PyArrayObject*, npy_intp*, int)': > scipy/sparse/sparsetools/csr_wrap.cxx:2910: error: expected ')' before > 'PRIdPTR' > scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: spurious trailing > '%' in format > scipy/sparse/sparsetools/csr_wrap.cxx:2910: warning: too many > arguments > for format > scipy/sparse/sparsetools/csr_wrap.cxx:2917: error: expected ')' before > 'PRIdPTR' > scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: spurious trailing > '%' > in format > scipy/sparse/sparsetools/csr_wrap.cxx:2917: warning: too many > arguments > for format > error: Command "g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g > -fwrapv -O3 -Wall -fPIC > -I/usr/local/lib/python2.7/site-packages/numpy/core/include > -I/usr/local/include/python2.7 -c > scipy/sparse/sparsetools/csr_wrap.cxx > -o build/temp.linux-x86_64-2.7/scipy/sparse/sparsetools/csr_wrap.o" > failed with exit status 1 > > The error is possibly related to the variable 'NPY_INTP_FMT' (defined in > numpy/core/include/numpy/ndarraytypes.h). A grep shows these files: > scipy/sparse/sparsetools/coo_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > size[i]); > scipy/sparse/sparsetools/coo_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > scipy/sparse/sparsetools/bsr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > size[i]); > scipy/sparse/sparsetools/bsr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > scipy/sparse/sparsetools/numpy.i: sprintf(s,"%" NPY_INTP_FMT ",", > size[i]); > scipy/sparse/sparsetools/numpy.i: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > scipy/sparse/sparsetools/csr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > size[i]); > scipy/sparse/sparsetools/csr_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > scipy/sparse/sparsetools/csc_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > size[i]); > scipy/sparse/sparsetools/csc_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > scipy/sparse/sparsetools/dia_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > size[i]); > scipy/sparse/sparsetools/dia_wrap.cxx: sprintf(s,"%" NPY_INTP_FMT ",", > array_size(ary,i)); > > Note that the scipy/sparse/sparsetools/numpy.i contains tabs in some > places instead of spaces.
> > The ndarraytypes.h is present in > /usr/local/lib/python2.7/site-packages/numpy/core/include/numpy/ > > Bruce > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > Okay, this specific numpy error is not new, it just keeps returning! Aug 2008: http://www.mail-archive.com/numpy-discussion at scipy.org/msg11614.html June 2009: http://mail.scipy.org/pipermail/scipy-user/2009-June/021436.html Scipy builds when I apply David's hack: http://mail.scipy.org/pipermail/scipy-user/2009-June/021438.html However, I do not know the specific cause. Bruce -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Thu Jul 8 12:08:11 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Thu, 8 Jul 2010 10:08:11 -0600 Subject: [SciPy-Dev] scipy error compiling csr_wrap under Python 2.7 In-Reply-To: References: <4C339FAD.6010505@gmail.com> Message-ID: On Thu, Jul 8, 2010 at 9:34 AM, Ralf Gommers wrote: > > > On Thu, Jul 8, 2010 at 9:50 PM, Vincent Davis > wrote: >> >> On Thu, Jul 8, 2010 at 7:40 AM, Ralf Gommers >> wrote: >> > >> > >> > On Thu, Jul 8, 2010 at 9:05 PM, Vincent Davis >> > wrote: >> >> >> >> On Thu, Jul 8, 2010 at 5:33 AM, Ralf Gommers >> >> wrote: >> >> > >> >> > >> >> > On Wed, Jul 7, 2010 at 5:27 AM, Bruce Southey >> >> > wrote: >> >> >> >> >> >> Hi, >> >> >> I failed to get the scipy 0.8 rc1 and SVN to build under Python2.7 >> >> >> with >> >> >> numpy '2.0.0.dev8469' and gcc version 4.4.4 20100503 (Red Hat >> >> >> 4.4.4-2) >> >> >> (GCC) . But scipy does compile with Python 2.6. >> >> >> >> >> > If this is easy to fix on linux (and someone comes up with that fix) >> >> > it >> >> > makes sense to do so.
On OS X the problems are more serious and it >> >> > looks >> >> > like they require changes to numpy.distutils first, so it's not going >> >> > to >> >> > work with numpy 1.4.1 in any case. So full support for 2.7 on all >> >> > platforms >> >> > is not going to happen for the 0.8.0 release. >> >> >> >> Not that I know what I am doing but I do have recent numpy and scipy >> >> installed on oxs running py2.7. I did use numscon and installed 64bit, >> >> I get 4 errors and 2 failures. Lots of warnings, "Warning: invalid >> >> value encountered in absolute" not sure what those are about. >> >> >> > Those errors are all known, so all seems fine. How did you install >> > Python >> > 2.7? I see your python says it's compiled with gcc 4.2, while I remember >> > my >> > scipy builds picking up 4.0 by default. >> >> Well it was not skill or knowledge :-) pure blind trial and error but >> it seems simple now. >> for python 2.7 from source >> ./configure --with-universal-archs=64-bit >> --enable-universalsdk=/Developer/SDKs/MacOSX10.5.sdk >> --enable-framework >> then make, make install >> >> I have been taking notes so I can repeat it. More details here (numpy, >> scipy) >> http://vincentdavis.net/installing-python > > Interesting. When I try the same build command for numpy with either of the > 2.7 binaries from python.org (both built with gcc-4.0), numscons aborts > after failing this test: > # We check declaration AND type because that's how distutils does it. > if config.CheckDeclaration('PY_LONG_LONG', includes = '#include <Python.h>\n'): > st = config.CheckTypeSize('PY_LONG_LONG', > includes = '#include <Python.h>\n') > assert not st == 0 > > And setup.py install fails because it complains about missing x86_64 > architecture. Just ran the first part again, seems to work for me, see full output attached.
LDFLAGS="-arch x86_64" FFLAGS="-arch x86_64" py27 setupscons.py scons btw I just noticed that if I copy-paste this from the webpage the quotes are wrong, MacBookPro-new-2:numpy vmd$ py27 Python 2.7rc2+ (trunk, Jun 27 2010, 21:30:09) [GCC 4.2.1 (Apple Inc. build 5659)] on darwin Type "help", "copyright", "credits" or "license" for more information. Vincent > >> >> By the way what are "Warning: invalid value encountered in absolute" >> warnings I tried to find out more but didn't find much. >> Vincent >> > Those are from ufuncs operating on masked arrays - can't remember why it was > hard to fix, but it's a known (and annoying) issue. > > Cheers, > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- A non-text attachment was scrubbed... Name: py27_numpy_2dev_osx Type: application/octet-stream Size: 35270 bytes Desc: not available URL: From dwf at cs.toronto.edu Thu Jul 8 13:25:39 2010 From: dwf at cs.toronto.edu (David Warde-Farley) Date: Thu, 8 Jul 2010 13:25:39 -0400 Subject: [SciPy-Dev] splev In-Reply-To: References: Message-ID: On 2010-07-08, at 8:59 AM, Charles R Harris wrote: > It's inconvenient and time consuming. I'm leaning towards adding a keyword so that the user can choose one or the other behavior, say extrapolation=True, which will not impact current usage. This sounds like a good compromise, +1.
David From aarchiba at physics.mcgill.ca Thu Jul 8 13:51:45 2010 From: aarchiba at physics.mcgill.ca (Anne Archibald) Date: Thu, 8 Jul 2010 13:51:45 -0400 Subject: [SciPy-Dev] splev In-Reply-To: References: Message-ID: On 8 July 2010 08:59, Charles R Harris wrote: > > > On Thu, Jul 8, 2010 at 12:02 AM, Joshua Holbrook > wrote: >> >> On Wed, Jul 7, 2010 at 8:19 PM, Charles R Harris >> wrote: >> > >> > >> > On Wed, Jul 7, 2010 at 10:06 PM, Anne Archibald >> > >> > wrote: >> >> >> >> On 7 July 2010 23:45, Charles R Harris >> >> wrote: >> >> > Hi All, >> >> > >> >> > I opened ticket #1223 because the values returned by splev are not >> >> > zero >> >> > for >> >> > arguments outside of the interval of definition. Because splev >> >> > evaluates >> >> > b-splines, which all have compact support, I think zero is the >> >> > correct >> >> > value >> >> > for such arguments. However, I also find that some tests assume that >> >> > splev >> >> > extrapolates the interpolating polynomials defined in the first and >> >> > last >> >> > spans, which is what splev currently does. Consequently I thought it >> >> > worth >> >> > bringing the topic up on the list for discussion before making the >> >> > commit >> >> > that changes the behavior. >> >> >> >> Please do not make this change. The extrapolation you get now is ugly, >> >> as polynomial extrapolation always is, but a discontinuity is >> >> considerably uglier. Remember that the arguments and range are both >> >> floating-point, so that roundoff error can easily move an argument >> >> outside the range; if extrapolation is used this is unimportant, but >> >> if the spline flatly drops to zero this will produce wildly wrong >> >> answers. >> >> >> > >> > Well, for my purposes extrapolation really screws things up. Note that >> > the >> > b-splines at the ends are in fact c_{-1}, i.e., discontinuous, and the >> > next >> > b-spline in has a discontinuous 1'st derivative, so on and so forth. 
The >> > discontinuities are mathematically correct, b-splines aren't >> > polynomials. >> > Maybe we can add a keyword? >> > >> > Chuck >> > >> > >> > _______________________________________________ >> > SciPy-Dev mailing list >> > SciPy-Dev at scipy.org >> > http://mail.scipy.org/mailman/listinfo/scipy-dev >> > >> > >> >> I personally agree with Anne. While discontinuities may be technically >> correct, I think the current behavior is more forgiving. If this is >> noted in the docs, I say things are g2g. >> >> Chuck: Is checking the interval yourself prohibitive? >> > > It's inconvenient and time consuming. I'm leaning towards adding a keyword > so that the user can choose one or the other behavior, say > extrapolation=True, which will not impact current usage. If you're going to do this it may make sense to offer a third option that raises an exception when outside the interval. There are again floating-point issues, but there are certainly times when it would be useful to find out right away that one had supplied bogus values. 
Anne > Chuck > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From charlesr.harris at gmail.com Thu Jul 8 14:26:03 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Thu, 8 Jul 2010 12:26:03 -0600 Subject: [SciPy-Dev] splev In-Reply-To: References: Message-ID: On Thu, Jul 8, 2010 at 11:51 AM, Anne Archibald wrote: > On 8 July 2010 08:59, Charles R Harris wrote: > > > > > > On Thu, Jul 8, 2010 at 12:02 AM, Joshua Holbrook < > josh.holbrook at gmail.com> > > wrote: > >> > >> On Wed, Jul 7, 2010 at 8:19 PM, Charles R Harris > >> wrote: > >> > > >> > > >> > On Wed, Jul 7, 2010 at 10:06 PM, Anne Archibald > >> > > >> > wrote: > >> >> > >> >> On 7 July 2010 23:45, Charles R Harris > >> >> wrote: > >> >> > Hi All, > >> >> > > >> >> > I opened ticket #1223 because the values returned by splev are not > >> >> > zero > >> >> > for > >> >> > arguments outside of the interval of definition. Because splev > >> >> > evaluates > >> >> > b-splines, which all have compact support, I think zero is the > >> >> > correct > >> >> > value > >> >> > for such arguments. However, I also find that some tests assume > that > >> >> > splev > >> >> > extrapolates the interpolating polynomials defined in the first and > >> >> > last > >> >> > spans, which is what splev currently does. Consequently I thought > it > >> >> > worth > >> >> > bringing the topic up on the list for discussion before making the > >> >> > commit > >> >> > that changes the behavior. > >> >> > >> >> Please do not make this change. The extrapolation you get now is > ugly, > >> >> as polynomial extrapolation always is, but a discontinuity is > >> >> considerably uglier. 
Remember that the arguments and range are both > >> >> floating-point, so that roundoff error can easily move an argument > >> >> outside the range; if extrapolation is used this is unimportant, but > >> >> if the spline flatly drops to zero this will produce wildly wrong > >> >> answers. > >> >> > >> > > >> > Well, for my purposes extrapolation really screws things up. Note that > >> > the > >> > b-splines at the ends are in fact c_{-1}, i.e., discontinuous, and the > >> > next > >> > b-spline in has a discontinuous 1'st derivative, so on and so forth. > The > >> > discontinuities are mathematically correct, b-splines aren't > >> > polynomials. > >> > Maybe we can add a keyword? > >> > > >> > Chuck > >> > > >> > > >> > _______________________________________________ > >> > SciPy-Dev mailing list > >> > SciPy-Dev at scipy.org > >> > http://mail.scipy.org/mailman/listinfo/scipy-dev > >> > > >> > > >> > >> I personally agree with Anne. While discontinuities may be technically > >> correct, I think the current behavior is more forgiving. If this is > >> noted in the docs, I say things are g2g. > >> > >> Chuck: Is checking the interval yourself prohibitive? > >> > > > > It's inconvenient and time consuming. I'm leaning towards adding a > keyword > > so that the user can choose one or the other behavior, say > > extrapolation=True, which will not impact current usage. > > If you're going to do this it may make sense to offer a third option > that raises an exception when outside the interval. There are again > floating-point issues, but there are certainly times when it would be > useful to find out right away that one had supplied bogus values. > > I've got the extrapolation keyword thing done and could just let it take values 0, 1, 2, so it shouldn't be too difficult to add the check, it can be part of the error return. Can you think of a better word than extrapolation? I should probably add this to the splder function also. 
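The three behaviors under discussion — extrapolate, return zero outside the interval, or raise — can be sketched in pure Python. This is a hypothetical wrapper around any spline evaluator, not the actual splev signature; the flag name and its 0/1/2 values follow Chuck's suggestion above:

```python
import numpy as np

def eval_with_ext(spline_eval, x, t_min, t_max, ext=0):
    """Evaluate a spline-like callable with controlled boundary behavior.

    ext=0 -> extrapolate (what splev does today)
    ext=1 -> return 0.0 outside [t_min, t_max] (compact-support view)
    ext=2 -> raise ValueError for any out-of-interval argument
    """
    x = np.atleast_1d(np.asarray(x, dtype=float))
    outside = (x < t_min) | (x > t_max)
    if ext == 2 and outside.any():
        raise ValueError("x outside of interval [%g, %g]" % (t_min, t_max))
    y = np.asarray(spline_eval(x), dtype=float).copy()
    if ext == 1:
        # Zero out the out-of-interval points after evaluation.
        y[outside] = 0.0
    return y
```

For example, with a stand-in evaluator `f = lambda z: z ** 2` on the interval [0, 1], `eval_with_ext(f, [0.5, 3.0], 0.0, 1.0, ext=1)` gives `[0.25, 0.0]`, while `ext=0` gives the extrapolated `[0.25, 9.0]` and `ext=2` raises.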
Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From d.l.goldsmith at gmail.com Thu Jul 8 19:28:20 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Thu, 8 Jul 2010 16:28:20 -0700 Subject: [SciPy-Dev] Reminder: Summer Marathon Skypecon @ Fri Jul 9 9am - 10am Message-ID: Skype ID d.l.goldsmith -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Fri Jul 9 03:33:43 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Fri, 9 Jul 2010 03:33:43 -0400 Subject: [SciPy-Dev] documenting fortran wrapped functions, was: Problems with documentation of scipy.interpolate.dfitpack Message-ID: On Thu, Jul 8, 2010 at 1:18 AM, Vincent Davis wrote: > On Wed, Jul 7, 2010 at 10:09 PM, bowie_22 wrote: >> Hello together, >> >> I tried to contribute to the documentation of scipy. >> I found out that the whole >> >> scipy.interpolate.dfitpack >> >> package seems to be a little bit different. >> >> It is not possible to view the code. >> You get 404 File not found error. >> >> Is there no source code available for dfitpack? >> If so, how is the documentation handled? > > I think the first line in the doc editor says it all. > "This module 'dfitpack' is auto-generated with f2py (version:2_8473). > Functions:" > I think this is the source > http://projects.scipy.org/scipy/browser/trunk/scipy/interpolate/src/fitpack.pyf > > Vincent Many of the fortran-based functions are underdocumented, e.g. lapack/blas in linalg and most functions in special, even though the fortran files often contain quite extensive documentation. But it seems that in some cases, like dfitpack, some documentation is autogenerated. Just a question because I don't have the time to look into this: Do we need to mark f2py wrapped functions as "unimportant", or can we make sure every f2py wrapped function has an editable docstring?
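The substitution mechanism behind the docdict objects mentioned earlier in the thread can be sketched in a few lines. This is a simplified illustration of the idea — common parameter descriptions live in one dictionary and are filled into many docstrings — not the actual scipy.misc.doccer API, which has more entry points:

```python
def docformat(docstring, docdict):
    """Fill %(name)s placeholders in a docstring from a shared dict."""
    return docstring % docdict

# One shared description, reused by every filter's docstring.
docdict = {
    "input": "input : array_like\n    The input array.",
}

def some_filter(input):
    pass

some_filter.__doc__ = docformat(
    "Apply a filter.\n\nParameters\n----------\n%(input)s", docdict)
```

`some_filter` is a hypothetical stand-in; in scipy.ndimage the same `%(input)s` entry would be shared across all the filter functions, which is why the wiki sees a docdict object rather than an ordinary docstring.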
Josef > >> >> Regs >> >> Marcus >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From pav at iki.fi Fri Jul 9 04:06:58 2010 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 9 Jul 2010 08:06:58 +0000 (UTC) Subject: [SciPy-Dev] arccosh/arctanh test precision References: Message-ID: Thu, 08 Jul 2010 20:17:20 +0800, Ralf Gommers wrote: > The arccosh test precision is set a bit too high for real input - it's > kept at the default level which is 5*eps. Setting it to 5e-14 solves the > issue on 32-bit Windows, which is OK I think. On 64-bit Windows the > results are further off (see below) which could be considered an actual > problem. The arctanh result also looks bad. So I propose to change the > 32-bit test precision only. For the curious, it's line 34 of > special/tests/test_data.py. > > Is the above fine? Should I open a ticket for 64-bit? I think bumping the test precisions is OK, at least as long as the relative error is below 1e-11. -- Pauli Virtanen From d.l.goldsmith at gmail.com Fri Jul 9 11:32:15 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Fri, 9 Jul 2010 08:32:15 -0700 Subject: [SciPy-Dev] Anyone have any doc issues to Skype about? Message-ID: -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Fri Jul 9 11:38:11 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Fri, 9 Jul 2010 09:38:11 -0600 Subject: [SciPy-Dev] Anyone have any doc issues to Skype about? In-Reply-To: References: Message-ID: I am slowly getting up to speed on pyocweb. If anyone wants to discuss suggestions or wishlist.... I am available. 
Vincent On Fri, Jul 9, 2010 at 9:32 AM, David Goldsmith wrote: > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From matthew.brett at gmail.com Fri Jul 9 12:44:42 2010 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 9 Jul 2010 12:44:42 -0400 Subject: [SciPy-Dev] SciPy docs: volunteers needed now! In-Reply-To: References: Message-ID: Hi, >> Actually, all you have to do (have used in both senses of the word, i.e., >> are "required" to do and all that is necessary to do) is click on the log >> link when viewing a docstring in the Wiki, note the person who worked on it >> last, how long ago that was, and then as Ben says, if it was in the last >> three months, give or take, contact them and ask them if it would be ok if >> you worked on it. > > Sorry to come back late to this thread. Partly in response to an off-list mail - I'm resending this one. I'd like to ask whether you (I guess particularly Joe, and David G) think more discussion on this topic is a good idea - or whether you'd rather be left to get on with things. I think that might be the same question as, do you think you have a sustainable long-term solution to the problem of reduced community input on the documentation project? See you, Matthew From d.l.goldsmith at gmail.com Fri Jul 9 21:48:02 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Fri, 9 Jul 2010 18:48:02 -0700 Subject: [SciPy-Dev] Function of the :mod: wiki function Message-ID: In docstrings like that for numpy.fft, which in edit mode begins: Discrete Fourier Transform (:mod:`numpy.fft`) ================================= what's the function of the :mod:? All it seems to do on being rendered by the Wiki is generate a link back to its own document. DG -------------- next part -------------- An HTML attachment was scrubbed...
URL: From kwgoodman at gmail.com Fri Jul 9 22:41:09 2010 From: kwgoodman at gmail.com (Keith Goodman) Date: Fri, 9 Jul 2010 19:41:09 -0700 Subject: [SciPy-Dev] Function of the :mod: wiki function In-Reply-To: References: Message-ID: On Fri, Jul 9, 2010 at 6:48 PM, David Goldsmith wrote: > In docstrings like that for numpy.fft, which in edit mode begins: > > Discrete Fourier Transform (:mod:`numpy.fft`) > ================================= > > what's the function of the :mod:?? All it seems to do on being rendered by > the Wiki is generate a link back to it's own document. Yep, that's all it does. mod stands for module. So it is a link to the numpy.fft module where the link text will be the name of the module. You can change the link text :mod:`yourword `. Besides mod there is func for function and meth for method and a bunch of others I don't remember. From d.l.goldsmith at gmail.com Sat Jul 10 05:00:35 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Sat, 10 Jul 2010 02:00:35 -0700 Subject: [SciPy-Dev] Function of the :mod: wiki function In-Reply-To: References: Message-ID: On Fri, Jul 9, 2010 at 7:41 PM, Keith Goodman wrote: > On Fri, Jul 9, 2010 at 6:48 PM, David Goldsmith > wrote: > > In docstrings like that for numpy.fft, which in edit mode begins: > > > > Discrete Fourier Transform (:mod:`numpy.fft`) > > ================================= > > > > what's the function of the :mod:? All it seems to do on being rendered > by > > the Wiki is generate a link back to it's own document. > > Yep, that's all it does. mod stands for module. So it is a link to the > numpy.fft module where the link text will be the name of the module. > You can change the link text :mod:`yourword `. Ah, that's what I was missing; still, what's the point of having a circular reference, even by another name, or is it only a circular reference in the Wiki 'cause the docstring is all the Wiki pulls for any object? 
DG > Besides mod > there is func for function and meth for method and a bunch of others I > don't remember. > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... URL: From kwgoodman at gmail.com Sat Jul 10 09:51:42 2010 From: kwgoodman at gmail.com (Keith Goodman) Date: Sat, 10 Jul 2010 06:51:42 -0700 Subject: [SciPy-Dev] Function of the :mod: wiki function In-Reply-To: References: Message-ID: On Sat, Jul 10, 2010 at 2:00 AM, David Goldsmith wrote: > On Fri, Jul 9, 2010 at 7:41 PM, Keith Goodman wrote: >> >> On Fri, Jul 9, 2010 at 6:48 PM, David Goldsmith >> wrote: >> > In docstrings like that for numpy.fft, which in edit mode begins: >> > >> > Discrete Fourier Transform (:mod:`numpy.fft`) >> > ================================= >> > >> > what's the function of the :mod:?? All it seems to do on being rendered >> > by >> > the Wiki is generate a link back to it's own document. >> >> Yep, that's all it does. mod stands for module. So it is a link to the >> numpy.fft module where the link text will be the name of the module. >> You can change the link text :mod:`yourword `. > > Ah, that's what I was missing; still, what's the point of having a circular > reference, even by another name, or is it only a circular reference in the > Wiki 'cause the docstring is all the Wiki pulls for any object? Oh, I didn't realize about the circular part. Yeah, that seems odd. Doesn't add anything. 
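For reference, the two forms of the role under discussion, including the custom-link-text form Keith describes (Sphinx reST; in the fully rendered reference guide :mod: produces a cross-reference to the module page rather than a self-link):

```rst
Discrete Fourier Transform (:mod:`numpy.fft`)
=============================================

See :mod:`numpy.fft` for the default link text, or
:mod:`the FFT module <numpy.fft>` to supply custom text.
```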
From jason-sage at creativetrax.com Sat Jul 10 13:07:21 2010 From: jason-sage at creativetrax.com (Jason Grout) Date: Sat, 10 Jul 2010 10:07:21 -0700 Subject: [SciPy-Dev] Function of the :mod: wiki function In-Reply-To: References: Message-ID: <4C38A8C9.4080702@creativetrax.com> On 7/9/10 6:48 PM, David Goldsmith wrote: > In docstrings like that for numpy.fft, which in edit mode begins: > > Discrete Fourier Transform (:mod:`numpy.fft`) > ================================= > > what's the function of the :mod:? All it seems to do on being > rendered by the Wiki is generate a link back to it's own document. > For reference, here is a list of things like :mod: http://sphinx.pocoo.org/markup/inline.html (you probably already knew that, though) Jason From charlesr.harris at gmail.com Sat Jul 10 13:31:44 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 10 Jul 2010 11:31:44 -0600 Subject: [SciPy-Dev] f2py, the fortran integer type, and npy_intp Message-ID: I note that some c programs try to call f2py generated fortran interfaces using the npy_intp type for the array dimension. There are two problems here, first the prototype for the generated wrapper uses int causing compiler warnings on a 64 bit os, and second, the fortran subroutines themselves use integer types. So some questions. What precision are fortran integers? Should we be using npy_intp for array dimensions at all? Is there any transparent way to have 64 bit support? Example code (from scipy/interpolate/src/__fitpack.h) void SPLDER(double*,int*,double*,int*,int*,double*,double*,int*,int*,double*,int*); ^^^^ void SPLEV(double*,int*,double*,int*,double*,double*,int*,int*,int*); ^^^^ static char doc_spl_[] = " [y,ier] = _spl_(x,nu,t,c,k,e)"; static PyObject *fitpack_spl_(PyObject *dummy, PyObject *args) { int n, nu, ier, k, e=0; npy_intp m; ... if (nu) { SPLDER(t, &n, c, &k, &nu, x, y, &m, &e, wrk, &ier); ^^ } else { SPLEV(t, &n, c, &k, x, y, &m, &e, &ier); ^^ } ... 
} (from scipy/interpolate/src/fitpack.pyf subroutine splev(t,n,c,k,x,y,m,e,ier) ! y = splev(t,c,k,x,[e]) real*8 dimension(n),intent(in) :: t integer intent(hide),depend(t) :: n=len(t) real*8 dimension(n),depend(n,k),check(len(c)==n),intent(in) :: c integer :: k real*8 dimension(m),intent(in) :: x real*8 dimension(m),depend(m),intent(out) :: y integer intent(hide),depend(x) :: m=len(x) integer check(0<=e && e<=2) :: e=0 integer intent(hide) :: ier end subroutine splev (from scipy/interpolate/fitpack/splev.f) subroutine splev(t,n,c,k,x,y,m,e,ier) ... c ..scalar arguments.. integer n, k, m, e, ier ... Note also that the checks generated by f2py -- and they are generated -- don't seem to work. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From kwmsmith at gmail.com Sat Jul 10 14:30:23 2010 From: kwmsmith at gmail.com (Kurt Smith) Date: Sat, 10 Jul 2010 13:30:23 -0500 Subject: [SciPy-Dev] f2py, the fortran integer type, and npy_intp In-Reply-To: References: Message-ID: On Sat, Jul 10, 2010 at 12:31 PM, Charles R Harris wrote: > I note that some c programs try to call f2py generated fortran interfaces > using the npy_intp type for the array dimension. There are two problems > here, first the prototype for the generated wrapper uses int causing > compiler warnings on a 64 bit os, and second, the fortran subroutines > themselves use integer types. So some questions. What precision are fortran > integers? Should we be using npy_intp for array dimensions at all? Is there > any transparent way to have 64 bit support? I can comment on some of the above (not the f2py-specific bits, though). 1) "What precision are fortran integers?" According to a somewhat authoritative interlanguage programming site [0], a fortran 'integer' corresponds to a C 'int'. This can be relied upon, but other fortran type declarations cannot (e.g. 'integer*8' != a C 'long' on all platforms, with all compilers). 
If sizeof(npy_intp) == sizeof(int), then everything will be fine, but it's a near certainty there are platforms where this doesn't hold, and may lead to problems if array sizes are large. [0] http://www.math.utah.edu/software/c-with-fortran.html 2) "Should we be using npy_intp for array dimensions at all?" To interoperate with PEP-3118 buffers, the short answer is "yes." As I'm sure you know, the PyArrayObject struct uses npy_intp's for the dimensions and the strides. Further, the PEP-3118 buffer protocol calls for Py_ssize_t (which maps to npy_intp inside numpy) for the "shape", "stride" and "suboffset" arrays inside the Py_buffer struct. As I understand it, this is primarily required for when "suboffset" is >=0 in a specific dimension, since it indicates the buffer is arranged in a non-contiguous-memory layout and pointer arithmetic is required to access the array elements. Here's the relevant PEP-3118 section: """ suboffsets address of a Py_ssize_t * variable that will be filled with a pointer to an array of Py_ssize_t of length *ndims. If these suboffset numbers are >=0, then the value stored along the indicated dimension is a pointer and the suboffset value dictates how many bytes to add to the pointer after de-referencing. A suboffset value that it negative indicates that no de-referencing should occur (striding in a contiguous memory block). If all suboffsets are negative (i.e. no de-referencing is needed, then this must be NULL (the default value). If this is not requested by the caller (PyBUF_INDIRECT is not set), then this should be set to NULL or an PyExc_BufferError raised if this is not possible. 
For clarity, here is a function that returns a pointer to the element in an N-D array pointed to by an N-dimensional index when there are both non-NULL strides and suboffsets: void *get_item_pointer(int ndim, void *buf, Py_ssize_t *strides, Py_ssize_t *suboffsets, Py_ssize_t *indices) { char *pointer = (char*)buf; int i; for (i = 0; i < ndim; i++) { pointer += strides[i] * indices[i]; if (suboffsets[i] >=0 ) { pointer = *((char**)pointer) + suboffsets[i]; } } return (void*)pointer; } Notice the suboffset is added "after" the dereferencing occurs. Thus slicing in the ith dimension would add to the suboffsets in the (i-1)st dimension. Slicing in the first dimension would change the location of the starting pointer directly (i.e. buf would be modified). """ Hope this gives you some info to go on. Kurt > > Example code > > (from scipy/interpolate/src/__fitpack.h) > > void > SPLDER(double*,int*,double*,int*,int*,double*,double*,int*,int*,double*,int*); > ^^^^ > void SPLEV(double*,int*,double*,int*,double*,double*,int*,int*,int*); > ^^^^ > > static char doc_spl_[] = " [y,ier] = _spl_(x,nu,t,c,k,e)"; > static PyObject *fitpack_spl_(PyObject *dummy, PyObject *args) > { > int n, nu, ier, k, e=0; > npy_intp m; > > ... > > if (nu) { > SPLDER(t, &n, c, &k, &nu, x, y, &m, &e, wrk, &ier); > ^^ > } > else { > SPLEV(t, &n, c, &k, x, y, &m, &e, &ier); > > ^^ > } > ... > } > > (from scipy/interpolate/src/fitpack.pyf) > > subroutine splev(t,n,c,k,x,y,m,e,ier) > ! y = splev(t,c,k,x,[e]) > real*8 dimension(n),intent(in) :: t > integer intent(hide),depend(t) :: n=len(t) > real*8 dimension(n),depend(n,k),check(len(c)==n),intent(in) :: c > integer :: k > real*8 dimension(m),intent(in) :: x >
real*8 dimension(m),depend(m),intent(out) :: y > integer intent(hide),depend(x) :: m=len(x) > integer check(0<=e && e<=2) :: e=0 > integer intent(hide) :: ier > end subroutine splev > > (from scipy/interpolate/fitpack/splev.f) > > subroutine splev(t,n,c,k,x,y,m,e,ier) > > ... > > c ..scalar arguments.. > integer n, k, m, e, ier > > ... > > > Note also that the checks generated by f2py -- and they are generated -- > don't seem to work. > > Chuck > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From charlesr.harris at gmail.com Sat Jul 10 15:32:44 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 10 Jul 2010 13:32:44 -0600 Subject: [SciPy-Dev] f2py, the fortran integer type, and npy_intp In-Reply-To: References: Message-ID: On Sat, Jul 10, 2010 at 12:30 PM, Kurt Smith wrote: > On Sat, Jul 10, 2010 at 12:31 PM, Charles R Harris > wrote: > > I note that some c programs try to call f2py generated fortran > > interfaces > > using the npy_intp type for the array dimension. There are two problems > > here, first the prototype for the generated wrapper uses int causing > > compiler warnings on a 64 bit os, and second, the fortran subroutines > > themselves use integer types. So some questions. What precision are > > fortran > > integers? Should we be using npy_intp for array dimensions at all? Is > > there > > any transparent way to have 64 bit support? > > I can comment on some of the above (not the f2py-specific bits, though). > > 1) "What precision are fortran integers?" > > According to a somewhat authoritative interlanguage programming site > [0], a fortran 'integer' corresponds to a C 'int'. This can be relied > upon, but other fortran type declarations cannot (e.g. 'integer*8' != > a C 'long' on all platforms, with all compilers).
If sizeof(npy_intp) > == sizeof(int), then everything will be fine, but it's a near > certainty there are platforms where this doesn't hold, and may lead to > problems if array sizes are large. > > [0] http://www.math.utah.edu/software/c-with-fortran.html > > Since 'int' is 32 bits with gcc and msvc regardless of whether the operating system is 32 or 64 bits, we are going to have a problem with all the fortran code in scipy that uses integer types for array indexing. That is the case for all the code in fitpack, for instance. The only way I see to get around that is to have two fortran libraries, one modified for 64 bits and another for 32 bits. How does your cythonized fortran deal with this? > 2) "Should we be using npy_intp for array dimensions at all?" > > To interoperate with PEP-3118 buffers, the short answer is "yes." > > Well, yes. The question is what to do with all that fortran code that doesn't support it. > As I'm sure you know, the PyArrayObject struct uses npy_intp's for the > dimensions and the strides. Further, the PEP-3118 buffer protocol > calls for Py_ssize_t (which maps to npy_intp inside numpy) for the > "shape", "stride" and "suboffset" arrays inside the Py_buffer struct. > As I understand it, this is primarily required for when "suboffset" is > >=0 in a specific dimension, since it indicates the buffer is arranged > in a non-contiguous-memory layout and pointer arithmetic is required > to access the array elements. > > Here's the relevant PEP-3118 section: > > """ > suboffsets > > address of a Py_ssize_t * variable that will be filled with a > pointer to an array of Py_ssize_t of length *ndims. If these suboffset > numbers are >=0, then the value stored along the indicated dimension > is a pointer and the suboffset value dictates how many bytes to add to > the pointer after de-referencing. A suboffset value that it negative > indicates that no de-referencing should occur (striding in a > contiguous memory block). 
If all suboffsets are negative (i.e. no > de-referencing is needed, then this must be NULL (the default value). > If this is not requested by the caller (PyBUF_INDIRECT is not set), > then this should be set to NULL or an PyExc_BufferError raised if this > is not possible. > > For clarity, here is a function that returns a pointer to the > element in an N-D array pointed to by an N-dimesional index when there > are both non-NULL strides and suboffsets: > > void *get_item_pointer(int ndim, void *buf, Py_ssize_t *strides, > Py_ssize_t *suboffsets, Py_ssize_t *indices) { > char *pointer = (char*)buf; > int i; > for (i = 0; i < ndim; i++) { > pointer += strides[i] * indices[i]; > if (suboffsets[i] >=0 ) { > pointer = *((char**)pointer) + suboffsets[i]; > } > } > return (void*)pointer; > } > > Notice the suboffset is added "after" the dereferencing occurs. > Thus slicing in the ith dimension would add to the suboffsets in the > (i-1)st dimension. Slicing in the first dimension would change the > location of the starting pointer directly (i.e. buf would be > modified). > """ > > Hope this gives you some info to go on. > > Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From d.l.goldsmith at gmail.com Sat Jul 10 16:38:46 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Sat, 10 Jul 2010 13:38:46 -0700 Subject: [SciPy-Dev] Function of the :mod: wiki function In-Reply-To: <4C38A8C9.4080702@creativetrax.com> References: <4C38A8C9.4080702@creativetrax.com> Message-ID: On Sat, Jul 10, 2010 at 10:07 AM, Jason Grout wrote: > On 7/9/10 6:48 PM, David Goldsmith wrote: > > In docstrings like that for numpy.fft, which in edit mode begins: > > > > Discrete Fourier Transform (:mod:`numpy.fft`) > > ================================= > > > > what's the function of the :mod:? All it seems to do on being > > rendered by the Wiki is generate a link back to it's own document. 
> > > > For reference, here is a list of things like :mod: > > http://sphinx.pocoo.org/markup/inline.html > > (you probably already knew that, though) > No, I didn't thanks. That page indicates these things are to be used for *cross*-referencing, i.e., what we're using single backticks for; my guess is that someone used the :mod: before all our mark-up policies were settled. So, I'm removing it; if someone later explains to me why it should be there, I'll put it back in. DG > Jason > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Sat Jul 10 16:49:01 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 10 Jul 2010 14:49:01 -0600 Subject: [SciPy-Dev] f2py, the fortran integer type, and npy_intp In-Reply-To: References: Message-ID: On Sat, Jul 10, 2010 at 12:30 PM, Kurt Smith wrote: > On Sat, Jul 10, 2010 at 12:31 PM, Charles R Harris > wrote: > > I note that some c programs try to call f2py generated fortran interfaces > > using the npy_intp type for the array dimension. There are two problems > > here, first the prototype for the generated wrapper uses int causing > > compiler warnings on a 64 bit os, and second, the fortran subroutines > > themselves use integer types. So some questions. What precision are > fortran > > integers? Should we be using npy_intp for array dimensions at all? Is > there > > any transparent way to have 64 bit support? > > I can comment on some of the above (not the f2py-specific bits, though). 
> > 1) "What precision are fortran integers?" > > According to a somewhat authoritative interlanguage programming site > [0], a fortran 'integer' corresponds to a C 'int'. This can be relied > upon, but other fortran type declarations cannot (e.g. 'integer*8' != > a C 'long' on all platforms, with all compilers). If sizeof(npy_intp) > == sizeof(int), then everything will be fine, but it's a near > certainty there are platforms where this doesn't hold, and may lead to > problems if array sizes are large. > > Looks like most FORTRANs allow the size of integer to be set by a compiler flag. But hooking it up to work with f2py and C looks like a significant problem involving the build infrastructure among other things. I think for the time being we should just avoid using npy_intp as a type passed to FORTRAN subroutines. This will limit the size of one dimensional arrays that can be passed to the subroutines to 2GB but I don't see any easy way around it. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From d.l.goldsmith at gmail.com Sat Jul 10 16:58:27 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Sat, 10 Jul 2010 13:58:27 -0700 Subject: [SciPy-Dev] Opinions re: duplication of content Message-ID: numpy.fft has a lengthy, referenced "Background" section on DFTs in general; opinions, please, re: duplicating vs. referencing that content in scipy.fftpack.basic. (If someone explicitly downloads/installs only scipy, do they have full local access to all numpy doc?) Thanks! DG -------------- next part -------------- An HTML attachment was scrubbed...
URL: From kwmsmith at gmail.com Sat Jul 10 16:59:15 2010 From: kwmsmith at gmail.com (Kurt Smith) Date: Sat, 10 Jul 2010 15:59:15 -0500 Subject: [SciPy-Dev] f2py, the fortran integer type, and npy_intp In-Reply-To: References: Message-ID: On Sat, Jul 10, 2010 at 2:32 PM, Charles R Harris wrote: > > > On Sat, Jul 10, 2010 at 12:30 PM, Kurt Smith wrote: >> >> On Sat, Jul 10, 2010 at 12:31 PM, Charles R Harris >> wrote: >> > I note that some c programs try to call f2py generated fortran >> > interfaces >> > using the npy_intp type for the array dimension. There are two problems >> > here, first the prototype for the generated wrapper uses int causing >> > compiler warnings on a 64 bit os, and second, the fortran subroutines >> > themselves use integer types. So some questions. What precision are >> > fortran >> > integers? Should we be using npy_intp for array dimensions at all? Is >> > there >> > any transparent way to have 64 bit support? >> >> I can comment on some of the above (not the f2py-specific bits, though). >> >> 1) "What precision are fortran integers?" >> >> According to a somewhat authoritative interlanguage programming site >> [0], a fortran 'integer' corresponds to a C 'int'. ?This can be relied >> upon, but other fortran type declarations cannot (e.g. 'integer*8' != >> a C 'long' on all platforms, with all compilers). ?If sizeof(npy_intp) >> == sizeof(int), then everything will be fine, but it's a near >> certainty there are platforms where this doesn't hold, and may lead to >> problems if array sizes are large. >> >> [0] http://www.math.utah.edu/software/c-with-fortran.html >> > > Since 'int' is 32 bits with gcc and msvc regardless of whether the operating > system is 32 or 64 bits, we are going to have a problem with all the fortran > code in scipy that uses integer types for array indexing. That is the case > for all the code in fitpack, for instance. 
The only way I see to get around > that is to have two fortran libraries, one modified for 64 bits and another > for 32 bits. How does your cythonized fortran deal with this? There's a compatibility wrapper written in fortran. All index arguments are passed to the wrapper as npy_intp's, and there's a check in the wrapper to ensure the sizes line up before calling the wrapped subroutine/function. If the user wants to pass an array that requires a dimension >= 2**31 (about 2 billion) elements and the code has 'integer' dimensions, then the user should use compile flags to modify the default integer size. Kurt > > >> >> 2) "Should we be using npy_intp for array dimensions at all?" >> >> To interoperate with PEP-3118 buffers, the short answer is "yes." >> > > Well, yes. The question is what to do with all that fortran code that > doesn't support it. > >> >> As I'm sure you know, the PyArrayObject struct uses npy_intp's for the >> dimensions and the strides. Further, the PEP-3118 buffer protocol >> calls for Py_ssize_t (which maps to npy_intp inside numpy) for the >> "shape", "stride" and "suboffset" arrays inside the Py_buffer struct. >> As I understand it, this is primarily required for when "suboffset" is >> >=0 in a specific dimension, since it indicates the buffer is arranged >> in a non-contiguous-memory layout and pointer arithmetic is required >> to access the array elements. >> >> Here's the relevant PEP-3118 section: >> >> """ >> suboffsets >> >> address of a Py_ssize_t * variable that will be filled with a >> pointer to an array of Py_ssize_t of length *ndims. If these suboffset >> numbers are >=0, then the value stored along the indicated dimension >> is a pointer and the suboffset value dictates how many bytes to add to >> the pointer after de-referencing. A suboffset value that is negative >> indicates that no de-referencing should occur (striding in a >> contiguous memory block). If all suboffsets are negative (i.e.
no >> de-referencing is needed, then this must be NULL (the default value). >> If this is not requested by the caller (PyBUF_INDIRECT is not set), >> then this should be set to NULL or an PyExc_BufferError raised if this >> is not possible. >> >> For clarity, here is a function that returns a pointer to the >> element in an N-D array pointed to by an N-dimensional index when there >> are both non-NULL strides and suboffsets: >> >> void *get_item_pointer(int ndim, void *buf, Py_ssize_t *strides, >> Py_ssize_t *suboffsets, Py_ssize_t *indices) { >> char *pointer = (char*)buf; >> int i; >> for (i = 0; i < ndim; i++) { >> pointer += strides[i] * indices[i]; >> if (suboffsets[i] >=0 ) { >> pointer = *((char**)pointer) + suboffsets[i]; >> } >> } >> return (void*)pointer; >> } >> >> Notice the suboffset is added "after" the dereferencing occurs. >> Thus slicing in the ith dimension would add to the suboffsets in the >> (i-1)st dimension. Slicing in the first dimension would change the >> location of the starting pointer directly (i.e. buf would be >> modified). >> """ >> >> Hope this gives you some info to go on.
>> > > Chuck > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From josef.pktd at gmail.com Sat Jul 10 17:22:20 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sat, 10 Jul 2010 17:22:20 -0400 Subject: [SciPy-Dev] Function of the :mod: wiki function In-Reply-To: References: <4C38A8C9.4080702@creativetrax.com> Message-ID: On Sat, Jul 10, 2010 at 4:38 PM, David Goldsmith wrote: > On Sat, Jul 10, 2010 at 10:07 AM, Jason Grout > wrote: >> >> On 7/9/10 6:48 PM, David Goldsmith wrote: >> > In docstrings like that for numpy.fft, which in edit mode begins: >> > >> > Discrete Fourier Transform (:mod:`numpy.fft`) >> > ================================= >> > >> > what's the function of the :mod:? ?All it seems to do on being >> > rendered by the Wiki is generate a link back to it's own document. >> > >> >> For reference, here is a list of things like :mod: >> >> http://sphinx.pocoo.org/markup/inline.html >> >> (you probably already knew that, though) > > No, I didn't thanks.? That page indicates these things are to be used for > *cross*-referencing, i.e., what we're using single backticks for; my guess > is that someone used the :mod: before all our mark-up policies were > settled.? So, I'm removing it; if someone later explains to me why it should > be there, I'll put it back in. don't remove them from the package titles, I think you are destroying the module index http://docs.scipy.org/doc/scipy/reference/modindex.html Josef > > DG > >> >> Jason >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > -- > Mathematician: noun, someone who disavows certainty when their uncertainty > set is non-empty, even if that set has measure zero. 
> > Hope: noun, that delusive spirit which escaped Pandora's jar and, with her > lies, prevents mankind from committing a general suicide.? (As interpreted > by Robert Graves) > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From josef.pktd at gmail.com Sat Jul 10 17:27:41 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sat, 10 Jul 2010 17:27:41 -0400 Subject: [SciPy-Dev] Function of the :mod: wiki function In-Reply-To: References: <4C38A8C9.4080702@creativetrax.com> Message-ID: On Sat, Jul 10, 2010 at 5:22 PM, wrote: > On Sat, Jul 10, 2010 at 4:38 PM, David Goldsmith > wrote: >> On Sat, Jul 10, 2010 at 10:07 AM, Jason Grout >> wrote: >>> >>> On 7/9/10 6:48 PM, David Goldsmith wrote: >>> > In docstrings like that for numpy.fft, which in edit mode begins: >>> > >>> > Discrete Fourier Transform (:mod:`numpy.fft`) >>> > ================================= >>> > >>> > what's the function of the :mod:? ?All it seems to do on being >>> > rendered by the Wiki is generate a link back to it's own document. >>> > >>> >>> For reference, here is a list of things like :mod: >>> >>> http://sphinx.pocoo.org/markup/inline.html >>> >>> (you probably already knew that, though) >> >> No, I didn't thanks.? That page indicates these things are to be used for >> *cross*-referencing, i.e., what we're using single backticks for; my guess >> is that someone used the :mod: before all our mark-up policies were >> settled.? So, I'm removing it; if someone later explains to me why it should >> be there, I'll put it back in. 
> > don't remove them from the package titles, I think you are destroying > the module index > > http://docs.scipy.org/doc/scipy/reference/modindex.html If you are only talking about numpy, then the module index looks different (numpy.doc.xxx seem to be automodules) http://docs.scipy.org/doc/numpy/modindex.html I thought only about the scipy docs, and I don't know the structure of the numpy docs very well. Josef > > Josef > > >> >> DG >> >>> >>> Jason >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> >> >> -- >> Mathematician: noun, someone who disavows certainty when their uncertainty >> set is non-empty, even if that set has measure zero. >> >> Hope: noun, that delusive spirit which escaped Pandora's jar and, with her >> lies, prevents mankind from committing a general suicide.? (As interpreted >> by Robert Graves) >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > From sturla at molden.no Sat Jul 10 17:49:44 2010 From: sturla at molden.no (Sturla Molden) Date: Sat, 10 Jul 2010 23:49:44 +0200 Subject: [SciPy-Dev] f2py, the fortran integer type, and npy_intp In-Reply-To: References: Message-ID: <4C38EAF8.4040705@molden.no> Charles R Harris skrev: > I note that some c programs try to call f2py generated fortran > interfaces using the npy_intp type for the array dimension. There are > two problems here, first the prototype for the generated wrapper uses > int causing compiler warnings on a 64 bit os, and second, the fortran > subroutines themselves use integer types. So some questions. What > precision are fortran integers? Precision is determined by the kind number, which is implementation dependent. In other words, "you cannot know". The only portable solution to this is to use Fortran 2003 C bindings. 
The Fortran compiler knows its own internal details. Sturla From d.l.goldsmith at gmail.com Sat Jul 10 18:21:53 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Sat, 10 Jul 2010 15:21:53 -0700 Subject: [SciPy-Dev] Function of the :mod: wiki function In-Reply-To: References: <4C38A8C9.4080702@creativetrax.com> Message-ID: On Sat, Jul 10, 2010 at 2:22 PM, wrote: > On Sat, Jul 10, 2010 at 4:38 PM, David Goldsmith > wrote: > > On Sat, Jul 10, 2010 at 10:07 AM, Jason Grout < > jason-sage at creativetrax.com> > > wrote: > >> > >> On 7/9/10 6:48 PM, David Goldsmith wrote: > >> > In docstrings like that for numpy.fft, which in edit mode begins: > >> > > >> > Discrete Fourier Transform (:mod:`numpy.fft`) > >> > ================================= > >> > > >> > what's the function of the :mod:? All it seems to do on being > >> > rendered by the Wiki is generate a link back to its own document. > >> > > >> > >> For reference, here is a list of things like :mod: > >> > >> http://sphinx.pocoo.org/markup/inline.html > >> > >> (you probably already knew that, though) > > > > No, I didn't, thanks. That page indicates these things are to be used for > > *cross*-referencing, i.e., what we're using single backticks for; my > guess > > is that someone used the :mod: before all our mark-up policies were > > settled. So, I'm removing it; if someone later explains to me why it > should > > be there, I'll put it back in. > > don't remove them from the package titles, I think you are destroying > the module index > > http://docs.scipy.org/doc/scipy/reference/modindex.html > Ah, OK, this is the info I was looking for: they do serve a (non-circular) function I was unaware of. Thanks, I'll leave them in; is there any way to remove their link nature in the Wiki so that others don't make the same mistake again in the future (i.e., think that they're creating a redundant circular reference and thus remove them)?
DG > Josef > > > > > > DG > > > >> > >> Jason > >> > >> _______________________________________________ > >> SciPy-Dev mailing list > >> SciPy-Dev at scipy.org > >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > > > > -- > > Mathematician: noun, someone who disavows certainty when their > uncertainty > > set is non-empty, even if that set has measure zero. > > > > Hope: noun, that delusive spirit which escaped Pandora's jar and, with > her > > lies, prevents mankind from committing a general suicide. (As > interpreted > > by Robert Graves) > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Sat Jul 10 18:31:10 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 10 Jul 2010 16:31:10 -0600 Subject: [SciPy-Dev] f2py, the fortran integer type, and npy_intp In-Reply-To: <4C38EAF8.4040705@molden.no> References: <4C38EAF8.4040705@molden.no> Message-ID: On Sat, Jul 10, 2010 at 3:49 PM, Sturla Molden wrote: > Charles R Harris skrev: > > I note that some c programs try to call f2py generated fortran > > interfaces using the npy_intp type for the array dimension. There are > > two problems here, first the prototype for the generated wrapper uses > > int causing compiler warnings on a 64 bit os, and second, the fortran > > subroutines themselves use integer types. 
So some questions. What > > precision are fortran integers? Precision is determined by the kind number, which is implementation > dependent. In other words, "you cannot know". The only portable solution > to this is to use Fortran 2003 C bindings. The Fortran compiler knows > its own internal details. > > I don't think the bindings help; we need to have the default integers in the existing pre-FORTRAN77 code in scipy compiled as Py_ssize_t, and then f2py needs to be modified to generate appropriate Python bindings. That's a lot of work. Even with the bindings I think one would need to have a script to rewrite the FORTRAN code since the c-type corresponding to Py_ssize_t isn't fixed. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From sturla at molden.no Sat Jul 10 19:33:12 2010 From: sturla at molden.no (Sturla Molden) Date: Sun, 11 Jul 2010 01:33:12 +0200 Subject: [SciPy-Dev] f2py, the fortran integer type, and npy_intp In-Reply-To: References: <4C38EAF8.4040705@molden.no> Message-ID: <4C390338.7080005@molden.no> Charles R Harris skrev: > > I don't think the bindings help; we need to have the default > integers in the existing pre-FORTRAN77 code in scipy compiled as > Py_ssize_t, and then f2py needs to be modified to generate appropriate > Python bindings. That's a lot of work. Even with the bindings I think > one would need to have a script to rewrite the FORTRAN code since the > c-type corresponding to Py_ssize_t isn't fixed. We should also beware that the problem applies to real numbers as well as integers. You cannot rely on standard mappings from C float to REAL and C double to DOUBLE PRECISION. With pre-Fortran 90, there is no way of controlling this portably. With Fortran 90 and later, we have the standard methods selected_real_kind and selected_int_kind that return (compiler-dependent) "kind" numbers, which can be used to declare reals and integers with specific precisions.
  integer, parameter :: single = selected_real_kind(p=6, r=37)
  integer, parameter :: double = selected_real_kind(p=13)

  integer, parameter :: npy_int8 = selected_int_kind(2)
  integer, parameter :: npy_int16 = selected_int_kind(4)
  integer, parameter :: npy_int32 = selected_int_kind(9)
  integer, parameter :: npy_int64 = selected_int_kind(18)

Now we can declare an npy_int32 like this:

  integer(kind=npy_int32) :: i

Still we have no idea what npy_intp would map to. We can do this in Fortran 2003:

  use, intrinsic :: iso_c_binding
  integer, parameter :: npy_intp = c_intptr_t
  integer(kind=npy_intp) :: i

Real numbers and other integers are also easier:

  integer, parameter :: float = c_float
  integer, parameter :: double = c_double
  integer, parameter :: npy_int32 = c_int32_t

Such declarations can be put in a module, and subsequently imported into Fortran 90. It might be that f2c is the only cure for old Fortran code. The other option is to write a Fortran 2003 wrapper to interface with C. This wrapper will then call the old Fortran code. We then need to declare the old Fortran routine (as an interface block) to Fortran 2003. The Fortran compiler is then smart enough to do the correct conversions, including making a local copy of an array if that is needed. Wasn't Kurt Smith working on this for a GSoC project?
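The prototype mismatch this thread is about is easy to see from Python. A quick illustrative check of the widths involved (this snippet is an editor's sketch, not code from the thread):

```python
import ctypes
import numpy as np

# f2py maps a default Fortran INTEGER to a C int, while NumPy array
# dimensions use npy_intp, which is pointer-sized (same as Py_ssize_t).
print("C int:      %d bytes" % ctypes.sizeof(ctypes.c_int))      # usually 4
print("npy_intp:   %d bytes" % np.dtype(np.intp).itemsize)       # 8 on 64-bit
print("Py_ssize_t: %d bytes" % ctypes.sizeof(ctypes.c_ssize_t))  # 8 on 64-bit
```

On a 64-bit OS the first size differs from the other two, which is exactly the warning-producing mismatch between the generated wrapper's int arguments and npy_intp.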
Sturla From josef.pktd at gmail.com Sat Jul 10 19:33:46 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sat, 10 Jul 2010 19:33:46 -0400 Subject: [SciPy-Dev] Function of the :mod: wiki function In-Reply-To: References: <4C38A8C9.4080702@creativetrax.com> Message-ID: On Sat, Jul 10, 2010 at 6:21 PM, David Goldsmith wrote: > On Sat, Jul 10, 2010 at 2:22 PM, wrote: >> >> On Sat, Jul 10, 2010 at 4:38 PM, David Goldsmith >> wrote: >> > On Sat, Jul 10, 2010 at 10:07 AM, Jason Grout >> > >> > wrote: >> >> >> >> On 7/9/10 6:48 PM, David Goldsmith wrote: >> >> > In docstrings like that for numpy.fft, which in edit mode begins: >> >> > >> >> > Discrete Fourier Transform (:mod:`numpy.fft`) >> >> > ================================= >> >> > >> >> > what's the function of the :mod:? ?All it seems to do on being >> >> > rendered by the Wiki is generate a link back to it's own document. >> >> > >> >> >> >> For reference, here is a list of things like :mod: >> >> >> >> http://sphinx.pocoo.org/markup/inline.html >> >> >> >> (you probably already knew that, though) >> > >> > No, I didn't thanks.? That page indicates these things are to be used >> > for >> > *cross*-referencing, i.e., what we're using single backticks for; my >> > guess >> > is that someone used the :mod: before all our mark-up policies were >> > settled.? So, I'm removing it; if someone later explains to me why it >> > should >> > be there, I'll put it back in. >> >> don't remove them from the package titles, I think you are destroying >> the module index >> >> http://docs.scipy.org/doc/scipy/reference/modindex.html > > Ah, OK, this is the info I was looking for: they do serve a (non-circular) > function I was unaware of.? Thanks, I'll leave them in; is there any way to > remove their link nature in the Wiki so that others don't make the same > mistake again in the future, (i.e., think that they're creating a redundant > circular reference and thus remove them)? 
Sorry, my mistake, I mixed up the link and the link target .. module:: directive indicates the module target and causes the entry in the module index :mod: is the link and semantic markup. I still like the full path rendering that :mod: produces in the title. Josef > > DG > > >> >> Josef >> >> >> > >> > DG >> > >> >> >> >> Jason >> >> >> >> _______________________________________________ >> >> SciPy-Dev mailing list >> >> SciPy-Dev at scipy.org >> >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > >> > >> > >> > -- >> > Mathematician: noun, someone who disavows certainty when their >> > uncertainty >> > set is non-empty, even if that set has measure zero. >> > >> > Hope: noun, that delusive spirit which escaped Pandora's jar and, with >> > her >> > lies, prevents mankind from committing a general suicide.? (As >> > interpreted >> > by Robert Graves) >> > >> > _______________________________________________ >> > SciPy-Dev mailing list >> > SciPy-Dev at scipy.org >> > http://mail.scipy.org/mailman/listinfo/scipy-dev >> > >> > >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > -- > Mathematician: noun, someone who disavows certainty when their uncertainty > set is non-empty, even if that set has measure zero. > > Hope: noun, that delusive spirit which escaped Pandora's jar and, with her > lies, prevents mankind from committing a general suicide.? 
(As interpreted > by Robert Graves) > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From d.l.goldsmith at gmail.com Sat Jul 10 21:55:04 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Sat, 10 Jul 2010 18:55:04 -0700 Subject: [SciPy-Dev] Function of the :mod: wiki function In-Reply-To: References: <4C38A8C9.4080702@creativetrax.com> Message-ID: On Sat, Jul 10, 2010 at 4:33 PM, wrote: > On Sat, Jul 10, 2010 at 6:21 PM, David Goldsmith > wrote: > > On Sat, Jul 10, 2010 at 2:22 PM, wrote: > >> > >> On Sat, Jul 10, 2010 at 4:38 PM, David Goldsmith > >> wrote: > >> > On Sat, Jul 10, 2010 at 10:07 AM, Jason Grout > >> > > >> > wrote: > >> >> > >> >> On 7/9/10 6:48 PM, David Goldsmith wrote: > >> >> > In docstrings like that for numpy.fft, which in edit mode begins: > >> >> > > >> >> > Discrete Fourier Transform (:mod:`numpy.fft`) > >> >> > ================================= > >> >> > > >> >> > what's the function of the :mod:? All it seems to do on being > >> >> > rendered by the Wiki is generate a link back to it's own document. > >> >> > > >> >> > >> >> For reference, here is a list of things like :mod: > >> >> > >> >> http://sphinx.pocoo.org/markup/inline.html > >> >> > >> >> (you probably already knew that, though) > >> > > >> > No, I didn't thanks. That page indicates these things are to be used > >> > for > >> > *cross*-referencing, i.e., what we're using single backticks for; my > >> > guess > >> > is that someone used the :mod: before all our mark-up policies were > >> > settled. So, I'm removing it; if someone later explains to me why it > >> > should > >> > be there, I'll put it back in. 
> >> > >> don't remove them from the package titles, I think you are destroying > >> the module index > >> > >> http://docs.scipy.org/doc/scipy/reference/modindex.html > > > > Ah, OK, this is the info I was looking for: they do serve a > (non-circular) > > function I was unaware of. Thanks, I'll leave them in; is there any way > to > > remove their link nature in the Wiki so that others don't make the same > > mistake again in the future, (i.e., think that they're creating a > redundant > > circular reference and thus remove them)? > > Sorry, my mistake, I mixed up the link and the link target > .. module:: directive indicates the module target and causes the > entry in the module index > > :mod: is the link and semantic markup. > > I still like the full path rendering that :mod: produces in the title. > But it's to itself - what's the point? DG -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Sun Jul 11 04:59:47 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 11 Jul 2010 16:59:47 +0800 Subject: [SciPy-Dev] Opinions re: duplication of content In-Reply-To: References: Message-ID: On Sun, Jul 11, 2010 at 4:58 AM, David Goldsmith wrote: > numpy.fft has a lengthy, referenced "Background" section on DFT's in > general; opinions, please, re: duplicating vs. referencing that content in > scipy.fftpack.basic. The content in numpy.fft looks fine to me, and I don't see it duplicated in scipy.fftpack.basic either in svn or the wiki. Can you be more precise? (If someone explicitly downloads/installs only scipy, do they have full > local access to all numpy doc?) Thanks! Scipy doesn't work without numpy, so yes. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From tom.grydeland at gmail.com Sun Jul 11 10:32:31 2010 From: tom.grydeland at gmail.com (Tom Grydeland) Date: Sun, 11 Jul 2010 16:32:31 +0200 Subject: [SciPy-Dev] Opinions re: duplication of content In-Reply-To: References: Message-ID: <1EF82521-9105-42D2-96EC-47C91BE03519@gmail.com> On 10. juli 2010, at 22.58, David Goldsmith wrote: > numpy.fft has a lengthy, referenced "Background" section on DFT's in > general; opinions, please, re: duplicating vs. referencing that > content in scipy.fftpack.basic. To me, referencing it is the best -- only one copy to keep current. > DG -- //Tom Grydeland "You cannot perceive beauty but with a serene mind" -H. D. Thoreau From josef.pktd at gmail.com Sun Jul 11 10:45:55 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sun, 11 Jul 2010 10:45:55 -0400 Subject: [SciPy-Dev] Opinions re: duplication of content In-Reply-To: <1EF82521-9105-42D2-96EC-47C91BE03519@gmail.com> References: <1EF82521-9105-42D2-96EC-47C91BE03519@gmail.com> Message-ID: On Sun, Jul 11, 2010 at 10:32 AM, Tom Grydeland wrote: > > > On 10. juli 2010, at 22.58, David Goldsmith > wrote: > >> numpy.fft has a lengthy, referenced "Background" section on DFT's in >> general; opinions, please, re: duplicating vs. referencing that >> content in scipy.fftpack.basic. > > To me, referencing it is the best -- only one copy to keep current. We had a brief discussion on the mailing list a while ago about the background docs for fftpack, repeating a few paragraphs doesn't sound a lot of duplication. Some (very incomplete) information is in http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/fftpack.rst/#fftpack put there to rescue the Latex rendered formulas, because they had too much markup for the docstrings. Josef > >> DG > > -- > //Tom Grydeland > ?"You cannot perceive beauty but with a serene mind" -H. D. 
Thoreau > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From ralf.gommers at googlemail.com Sun Jul 11 11:24:20 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 11 Jul 2010 23:24:20 +0800 Subject: [SciPy-Dev] ANN: scipy 0.8.0 release candidate 2 Message-ID: I'm pleased to announce the availability of the second release candidate of SciPy 0.8.0. The only changes compared to rc1 are the fixed test failures for special.arccosh/arctanh on Windows, and correct version numbering of the documentation. If no more problems are reported, the final release will be available this Wednesday. SciPy is a package of tools for science and engineering for Python. It includes modules for statistics, optimization, integration, linear algebra, Fourier transforms, signal and image processing, ODE solvers, and more. This release candidate release comes almost one and a half year after the 0.7.0 release and contains many new features, numerous bug-fixes, improved test coverage, and better documentation. Please note that SciPy 0.8.0rc2 requires Python 2.4-2.6 and NumPy 1.4.1 or greater. For more information, please see the release notes: http://sourceforge.net/projects/scipy/files/scipy/0.8.0rc2/NOTES.txt/view You can download the release from here: https://sourceforge.net/projects/scipy/ Python 2.5/2.6 binaries for Windows and OS X are available, as well as source tarballs for other platforms and the documentation in pdf form. Thank you to everybody who contributed to this release. Enjoy, Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From d.l.goldsmith at gmail.com Sun Jul 11 13:10:58 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Sun, 11 Jul 2010 10:10:58 -0700 Subject: [SciPy-Dev] Opinions re: duplication of content In-Reply-To: References: <1EF82521-9105-42D2-96EC-47C91BE03519@gmail.com> Message-ID: On Sun, Jul 11, 2010 at 7:45 AM, wrote: > On Sun, Jul 11, 2010 at 10:32 AM, Tom Grydeland > wrote: > > > > > > On 10. juli 2010, at 22.58, David Goldsmith > > wrote: > > > >> numpy.fft has a lengthy, referenced "Background" section on DFT's in > >> general; opinions, please, re: duplicating vs. referencing that > >> content in scipy.fftpack.basic. > > > > To me, referencing it is the best -- only one copy to keep current. > > We had a brief discussion on the mailing list a while ago about the > background docs for fftpack, repeating a few paragraphs doesn't sound > a lot of duplication. > > Some (very incomplete) information is in > http://docs.scipy.org/scipy/docs/scipy-docs/tutorial/fftpack.rst/#fftpack > put there to rescue the Latex rendered formulas, because they had too > much markup for the docstrings. > OK, thanks guys, a couple paragraphs (of a nature that won't need to be updated - as it is, the content under discussion is background, i.e., mathematical, not computational, and thus is based on stuff a century to two old) and then referral. DG > > Josef > > > > >> DG > > > > -- > > //Tom Grydeland > > "You cannot perceive beauty but with a serene mind" -H. D. Thoreau > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. 
Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... URL: From aisaac at american.edu Sun Jul 11 17:08:42 2010 From: aisaac at american.edu (Alan G Isaac) Date: Sun, 11 Jul 2010 17:08:42 -0400 Subject: [SciPy-Dev] ANN: scipy 0.8.0 release candidate 2 In-Reply-To: References: Message-ID: <4C3A32DA.4010508@american.edu> Still have the directory removal error and now have two warnings. But no SciPy test errors. Alan Isaac Python 2.6.5 (r265:79096, Mar 19 2010, 21:48:26) [MSC v.1500 32 bit (Intel)] on win32 Type "help", "copyright", "credits" or "license" for more information. >>> import scipy as sp >>> sp.test() Running unit tests for scipy NumPy version 1.4.1 NumPy is installed in C:\Python26\lib\site-packages\numpy SciPy version 0.8.0rc2 SciPy is installed in C:\Python26\lib\site-packages\scipy Python version 2.6.5 (r265:79096, Mar 19 2010, 21:48:26) [MSC v.1500 32 bit (Intel)] nose version 0.11.0 C:\Python26\lib\site-packages\scipy\io\matlab\mio5.py:90: RuntimeWarning: __builtin__.file size changed, may indicate binary incompatibility from mio5_utils import VarReader5 ................................................................................ ....................................................................C:\Python26\ lib\site-packages\scipy\cluster\vq.py:570: UserWarning: One of the clusters is e mpty. Re-run kmean with a different initialization. warnings.warn("One of the clusters is empty. " ..................................................K............................. [snip] ................................................................................ 
.........error removing c:\users\alanis~1\appdata\local\temp\tmp67htl4cat_test: c:\users\alanis~1\appdata\local\temp\tmp67htl4cat_test: The directory is not empty ................................................................................ .................. ---------------------------------------------------------------------- Ran 4415 tests in 56.028s OK (KNOWNFAIL=13, SKIP=39) >>> From scott.sinclair.za at gmail.com Mon Jul 12 03:59:04 2010 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Mon, 12 Jul 2010 09:59:04 +0200 Subject: [SciPy-Dev] ANN: scipy 0.8.0 release candidate 2 In-Reply-To: References: Message-ID: >On 11 July 2010 17:24, Ralf Gommers wrote: > I'm pleased to announce the availability of the second release candidate of > SciPy 0.8.0. If no more problems are reported, the final release will be > available this Wednesday. I've just noticed this http://projects.scipy.org/scipy/ticket/1228 Cheers, Scott From gael.varoquaux at normalesup.org Mon Jul 12 04:35:18 2010 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Mon, 12 Jul 2010 10:35:18 +0200 Subject: [SciPy-Dev] Modifying the trac wiki Message-ID: <20100712083518.GD14504@phare.normalesup.org> Hi there, I don't believe I can modify the wiki pages on the scipy trac. I'd like to modify the following page: http://www.scipy.org/scipy/scikits/wiki/MachineLearning That describes the scikit learn. Could somebody empty the content (not the title) and add a reference to the current page for the scikit learn: http://scikit-learn.sourceforge.net/ simply stating that this is the scikit to do machine learning. Other option would be to give me wiki edition rights, but that might be harder :). 
Cheers, Ga?l -- Gael Varoquaux Research Fellow, INRIA Laboratoire de Neuro-Imagerie Assistee par Ordinateur NeuroSpin/CEA Saclay , Bat 145, 91191 Gif-sur-Yvette France Phone: ++ 33-1-69-08-78-35 Mobile: ++ 33-6-28-25-64-62 http://gael-varoquaux.info From kwmsmith at gmail.com Mon Jul 12 12:10:04 2010 From: kwmsmith at gmail.com (Kurt Smith) Date: Mon, 12 Jul 2010 11:10:04 -0500 Subject: [SciPy-Dev] f2py, the fortran integer type, and npy_intp In-Reply-To: <4C390338.7080005@molden.no> References: <4C38EAF8.4040705@molden.no> <4C390338.7080005@molden.no> Message-ID: On Sat, Jul 10, 2010 at 6:33 PM, Sturla Molden wrote: > Charles R Harris skrev: >> >> I don't think the bindings don't help, we need to have the default >> integers in the existing pre-FORTRAN77 code in scipy compiled as >> Py_ssize_d, and then f2py needs to be modified to generate appropriate >> Python bindings. That's a lot of work. Even with the bindings I think >> one would need to have a script to rewrite the FORTRAN code since the >> c-type corresponding to Py_ssize_d isn't fixed. > We should also beware that the problem applies to real numbers as well > as integer. You cannot rely on standard mappings from C float to REAL > and C double to DOUBLE PRECISION. > > With pre-Fortran 90, there is no way of controlling this portably. With > Fortran 90 and later, we have the standard methods selected_real_kind > and selected_int_kind that returns (compiler dependent) "kind" numbers, > which can be used to declare real and integers with specific precitions. 
>
>   integer, parameter :: single = selected_real_kind(p=6, r=37)
>   integer, parameter :: double = selected_real_kind(p=13)
>
>   integer, parameter :: npy_int8 = selected_int_kind(2)
>   integer, parameter :: npy_int16 = selected_int_kind(4)
>   integer, parameter :: npy_int32 = selected_int_kind(9)
>   integer, parameter :: npy_int64 = selected_int_kind(18)
>
> Now we can declare an npy_int32 like this:
>
>   integer(kind=npy_int32) :: i
>
> Still we have no idea what npy_intp would map to. We can do this in
> Fortran 2003:
>
>   use, intrinsic :: iso_c_binding
>   integer, parameter :: npy_intp = c_intptr_t

Trouble is that 'c_intptr_t' isn't defined for some Fortran compilers. So you first have to find which one of 'int', 'long int' or 'long long int' corresponds to 'npy_intp' in C code, then declare a parameter in Fortran:

  integer, parameter :: fort_npy_intp = c_long

for example. Not a big deal.

>
>   integer(kind=npy_intp) :: i
>
> Real numbers and other integers are also easier:
>
>   integer, parameter :: float = c_float
>   integer, parameter :: double = c_double
>   integer, parameter :: npy_int32 = c_int32_t
>
> Such declarations can be put in a module, and subsequently imported
> into Fortran 90.
>
> It might be that f2c is the only cure for old Fortran code. The other
> option is to write a Fortran 2003 wrapper to interface with C. This
> wrapper will then call the old Fortran code. We then need to declare
> the old Fortran routine (as an interface block) to Fortran 2003. The
> Fortran compiler is then smart enough to do the correct conversions,
> including making a local copy of an array if that is needed.

Umm, unless I misunderstand you, this isn't how many Fortran compilers behave (ifort, gfortran & g95). Try the following:
Try the following: subroutine outer() integer(kind=8) :: arr(10,10) call inner(arr) contains subroutine inner(arr) integer(kind=4), intent(inout) :: arr(:,:) arr = 1 end subroutine end subroutine Compiling with gfortran 4.4.4: call inner(arr) 1 Error: Type mismatch in argument 'arr' at (1); passed INTEGER(8) to INTEGER(4) But perhaps I misunderstand you. > > Wasn't Kurt Smith working on this for a GSOC project? Yes :-) It is coming along, and being actively developed. Many niggling issues to work out. We're at near-parity with f2py at the moment, with a few enhancements, but no 'pyf' interface modules (yet). http://fwrap.sourceforge.net/ http://fortrancython.wordpress.com/ Kurt From sturla at molden.no Mon Jul 12 12:38:04 2010 From: sturla at molden.no (Sturla Molden) Date: Mon, 12 Jul 2010 18:38:04 +0200 Subject: [SciPy-Dev] f2py, the fortran integer type, and npy_intp In-Reply-To: References: <4C38EAF8.4040705@molden.no> <4C390338.7080005@molden.no> Message-ID: <4C3B44EC.1050700@molden.no> Den 12.07.2010 18:10, skrev Kurt Smith: > 1 > Error: Type mismatch in argument 'arr' at (1); passed INTEGER(8) to INTEGER(4) > > But perhaps I misunderstand you. > > Damn... I really thought the compiler would do the conversion :( Ok, another suggestion: Write a small Fortran 2003 routine that returns the size of (compiler dependent) Fortran types to C, and do the conversion manually in C. Or start a Herculean effort and convert the old Fortran codes. Anyone feel like writing a FORTRAN IV compiler in Python? ;) Sturla From josh.holbrook at gmail.com Mon Jul 12 15:55:29 2010 From: josh.holbrook at gmail.com (Joshua Holbrook) Date: Mon, 12 Jul 2010 11:55:29 -0800 Subject: [SciPy-Dev] Request for Edit Permissions Message-ID: Sorry if I'm polluting the mailing list--I accidentally closed the instructions page for this. Anyways: I'd like to be able to edit the scipy documentation--in particular, I wanted to clarify the use of scipy.interpolate.lagrange. Can I get permission? 
:D --Josh From d.l.goldsmith at gmail.com Mon Jul 12 17:23:36 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Mon, 12 Jul 2010 14:23:36 -0700 Subject: [SciPy-Dev] Request for Edit Permissions In-Reply-To: References: Message-ID: On Mon, Jul 12, 2010 at 12:55 PM, Joshua Holbrook wrote: > Sorry if I'm polluting the mailing list--I accidentally closed the > instructions page for this. > Not polluting, this is the place to make such a request. DG Anyways: I'd like to be able to edit the scipy documentation--in > particular, I wanted to clarify the use of scipy.interpolate.lagrange. > Can I get permission? :D > > --Josh > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From josef.pktd at gmail.com Mon Jul 12 17:31:34 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Mon, 12 Jul 2010 17:31:34 -0400 Subject: [SciPy-Dev] chi-square test for a contingency (R x C) table In-Reply-To: References: <4C05DDF3.9010206@enthought.com> <4C06B8FB.8080806@gmail.com> <4C07ADC1.6040504@enthought.com> <4C0932FC.2020108@gmail.com> <4C0D0993.1080206@gmail.com> <4C0D2213.7020302@gmail.com> <4C19AB7E.1000405@enthought.com> <4C1A2C85.7090001@gmail.com> <4C1A3411.1000806@enthought.com> <4C1A3FD8.3050801@gmail.com> <4C1A5476.1040407@gmail.com> <4C1AC126.1080308@wartburg.edu> <4C1B7EA8.4090109@gmail.com> <4C1C253C.5050604@enthought.com> <4C1CC58F.6090102@enthought.com> Message-ID: On Sat, Jun 19, 2010 at 9:58 AM, wrote: > On Sat, Jun 19, 2010 at 9:26 AM, Warren Weckesser > wrote: >> josef.pktd at gmail.com wrote: >>> >>> >>> Forget any merging of the functions. >>> >>> Statistical functions should also be defined by their purpose, we are >>> not creating universal f_tests and t_tests. Unless someone is >>> proposing the merge and unify various t_tests, ... ? >>> misquoting: "The user's hypothesis is totally irrelevant ..." ??? >>> >>> Testing for goodness-of-fit is a completely different use case, with >>> different extensions, e.g. power discrepancy. What if I have a 2d >>> array and want to check goodness-of-fit along each axis, which might >>> be useful once group-by extensions to bincount handle more than 1d >>> weights. >> >> >> So you are anticipating something like this (where `obs` is, say, 2D): >> >> ?>>> chisquare_fit(table, axis=-1) >> >> Then the result would also be 2D, with the last axis having length 2 and >> holding the (chi2, p) values? 
> > I haven't looked at this closely yet, but I would think it would be a > standard reduce by one axis, usually we would return one array for the > test statistic and one array for the p-values (both same dimension > equal to one less than the original) > > chisquare_fit(table, axis=-1) as equivalent to [chisquare(table[k]) > for k in range(table.shape[0])] for 2d > and apply_along_axis for nd > > This would be easy to extend but I don't know how much the need is for > this currently. > > e.g. if we have a sample by geographic region or groups, we might want > to test whether the distribution is uniform or normal in each group. > (continuous distributions would require binning first) > >> >>> Or if we extend it to multivariate distributions, then the >>> default might be uniform for each column (and not independence.) >>> This is a standard test for distributions, and should not be mixed >>> with contingency tables >>> >>> >> >> Could you elaborate on this use case? I don't know enough about it to >> be able to decide if this is something that could be implemented right >> away, or if it is something that might not happen for years, if ever. > > During this thread, I started to think of contingency tables just as an > nd discrete distribution, where we can have functions for the > multivariate distributions, marginal pdf, conditional pdf, ... and > some tests on it. > Independence in this case would be just one hypothesis. > Also, the chisquare independence test conditions on the margin totals, > this might be the most common case, but not necessarily the only > chisquare hypothesis we might test. (I'm not too clear on all the > contingency table stuff.) > > multivariate distributions are only on my wish list, and it will > require some work to go beyond pdf, loglike and rvs. > multivariate discrete (contingency tables without the statistics) and > multivariate normal and some others would be the first candidates.
> (copulas would be another multivariate distribution wish) > > I don't know what would be the ETA (expected time of arrival) for these. > > > I like your current implementation, because it's right to the point > and easy to explain and use. And it looks forward compatible to > extended functionality that we might think of. > > Josef > >> >> >>> contingency tables are a different case, which I never use, and where >>> I would go with whatever statisticians prefer. But I think, going by >>> null hypothesis makes functions for statistical tests much cleaner >>> (easier to categorize, explain, find) than one-stop statistics (at >>> least for functions and not methods in classes) as is the current >>> tradition of scipy.stats. >>> >>> "fit" in your function name is very misleading chisquare_fit, because >>> your function doesn't do any fitting. If a rename is desired, I would >>> call it chisquare_gof, but I use a similar name for the actual gof >>> test based on the sample data, with automatic binning. >>> Fitting the distribution parameters raises other issues which I don't >>> think should be mixed with the basic chisquare-test >>> >>> >> >> Yes, I agree. ?I only used "fit" to distinguish it from "ind". ?I didn't >> want to use "oneway" and "nway", because those names might lead one to >> think that "oneway" is the n=1 case of "nway", but it is not. >> >> >> Warren >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > another reference http://www.mathworks.com/access/helpdesk/help/toolbox/stats/crosstab.html found when I was looking for something different and I never used it. 
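The axis-wise reduction discussed above can be sketched in a few lines of NumPy. The function name chisquare_gof and the uniform null are assumptions for illustration, not SciPy API; it is the broadcasting equivalent of [chisquare(table[k]) for k in range(table.shape[0])]:

```python
import numpy as np

def chisquare_gof(table, axis=-1):
    """Chi-square goodness-of-fit statistic against a uniform expected
    distribution, reduced along `axis` (hypothetical helper)."""
    table = np.asarray(table, dtype=float)
    n = table.shape[axis]
    # expected count per cell under a uniform distribution along `axis`
    expected = np.expand_dims(table.sum(axis=axis) / n, axis)
    return ((table - expected) ** 2 / expected).sum(axis=axis)

table = np.array([[10., 10., 10., 10.],
                  [16.,  8.,  8.,  8.]])
stat = chisquare_gof(table, axis=-1)   # one statistic per row
```

The companion p-value array would follow from scipy.stats.chi2.sf(stat, n - 1), giving the two same-shape return arrays (statistic, p-values) described above.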
Josef From gael.varoquaux at normalesup.org Mon Jul 12 17:39:42 2010 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Mon, 12 Jul 2010 23:39:42 +0200 Subject: [SciPy-Dev] Request for Edit Permissions In-Reply-To: References: Message-ID: <20100712213942.GB23568@phare.normalesup.org> On Mon, Jul 12, 2010 at 11:55:29AM -0800, Joshua Holbrook wrote: > Sorry if I'm polluting the mailing list--I accidentally closed the > instructions page for this. > Anyways: I'd like to be able to edit the scipy documentation--in > particular, I wanted to clarify the use of scipy.interpolate.lagrange. > Can I get permission? :D This should be done. Ga?l From gael.varoquaux at normalesup.org Mon Jul 12 17:40:17 2010 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Mon, 12 Jul 2010 23:40:17 +0200 Subject: [SciPy-Dev] Request for Edit Permissions In-Reply-To: <20100712213942.GB23568@phare.normalesup.org> References: <20100712213942.GB23568@phare.normalesup.org> Message-ID: <20100712214017.GC23568@phare.normalesup.org> On Mon, Jul 12, 2010 at 11:39:42PM +0200, Gael Varoquaux wrote: > On Mon, Jul 12, 2010 at 11:55:29AM -0800, Joshua Holbrook wrote: > > Sorry if I'm polluting the mailing list--I accidentally closed the > > instructions page for this. > > Anyways: I'd like to be able to edit the scipy documentation--in > > particular, I wanted to clarify the use of scipy.interpolate.lagrange. > > Can I get permission? :D > This should be done. And I forgot to thank you for your interest. We need people to improve the docs! Ga?l From pav at iki.fi Mon Jul 12 18:08:22 2010 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 13 Jul 2010 00:08:22 +0200 Subject: [SciPy-Dev] A zombie Scikits Trac? [was: Modifying the trac wiki] Message-ID: <1278972502.2403.2.camel@talisman> Mon, 12 Jul 2010 10:35:18 +0200, Gael Varoquaux wrote: > I don't believe I can modify the wiki pages on the scipy trac. 
I'd > like to modify the following page: > > http://www.scipy.org/scipy/scikits/wiki/MachineLearning Uhh, that's the old scikits Trac. There's another one here: http://projects.scipy.org/scikits/ Does someone (Stefan?) remember why the old one wasn't deactivated? Is there a reason not to deactivate it? Pauli From charlesr.harris at gmail.com Mon Jul 12 20:24:45 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Mon, 12 Jul 2010 18:24:45 -0600 Subject: [SciPy-Dev] build problem in scipy/interpolate Message-ID: Hi All, I notice that if I touch a fortran file in scipy/interpolate/fitpack, that file along with all the others in the directory are recompiled, but the library they are part of is untouched. This is annoying as I have to delete the whole build directory to get a modified library. I assume this is a dependency error someplace but I don't know where to look. Anyone know? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From cgohlke at uci.edu Mon Jul 12 21:11:53 2010 From: cgohlke at uci.edu (Christoph Gohlke) Date: Mon, 12 Jul 2010 18:11:53 -0700 Subject: [SciPy-Dev] memory corruption when running scipy 0.8 test suite Message-ID: <4C3BBD59.2020300@uci.edu> Dear SciPy developers, I am trying to fix the errors and failures reported in the thread "[SciPy-User] many test failures on windows 64" . Most of the issues seem specific for the msvc9/MKL build of scipy 0.8. I am now down to 1 error and 6 failures (from 16 errors and 9 failures). The following failure in ndimage does not appear when ndimage.test() is run out of context of scipy.test() FAIL: extrema 3 ---------------------------------------------------------------------- Traceback (most recent call last): File "C:\Python26\lib\site-packages\scipy\ndimage\tests\test_ndimage.py", line 3149, in test_extrema03 self.failUnless(numpy.all(output1[2] == output4)) AssertionError The output1[2] array contains a NaN in the first position. 
If I disable the following dsyevr related tests in scipy.lib.lapack.tests.test_esv.py, all ndimage tests pass. Could this be a memory corruption issue in MKL? Besides the ndimage failure, scipy.lib.lapack seems to work and passes all tests. Also, this artifact only happens or surfaces on the 64-bit build. Index: test_esv.py =================================================================== --- test_esv.py (revision 6598) +++ test_esv.py (working copy) @@ -91,17 +91,17 @@ def test_ssyevr(self): self._test_base('ssyevr', 'F') - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") - def test_dsyevr(self): - self._test_base('dsyevr', 'F') +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") +# def test_dsyevr(self): +# self._test_base('dsyevr', 'F') @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") def test_ssyevr_ranges(self): self._test_syevr_ranges('ssyevr', 'F') - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") - def test_dsyevr_ranges(self): - self._test_syevr_ranges('dsyevr', 'F') +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") +# def test_dsyevr_ranges(self): +# self._test_syevr_ranges('dsyevr', 'F') # Clapack tests @dec.skipif(CLAPACK_IS_EMPTY or not FUNCS_CLAPACK["ssyev"], I checked the flapack_esv.pyf.src code but could not find anything obvious. I am linking against mkl_lapack95_lp64, mkl_blas95_lp64, mkl_intel_lp64, mkl_intel_thread, and mkl_core, MKL version 10.2.5.1. Thank you, Christoph From sturla at molden.no Tue Jul 13 01:45:40 2010 From: sturla at molden.no (Sturla Molden) Date: Tue, 13 Jul 2010 07:45:40 +0200 Subject: [SciPy-Dev] Re FFTs in SciPy (and NumPy) Message-ID: <4C3BFD84.2070901@molden.no> There has been some discussion on FFTPACK lately. Problems with FFTPACK seem to be: - Written in old Fortran 77. - Imprecise for single precision. - Can sometimes be very slow, depending on input size. - Can only handle a few small prime factors {2,3,4,5} efficiently. 
- How to control integer size portably for 64-bit and f2py? - Only 1D transforms. There is another FFT library in cwplib from the Colorado School of Mines. It has some very nice properties: - Written in C, not Fortran, and the FFT-part of the library is just one single C file. - License is BSD it seems. - Quite fast (see benchmarks in http://www.fftw.org/fftw-paper.pdf and http://www.fftw.org/speed/) - Can handle a larger set of prime factors than FFTPACK {2,3,4,5,7,8,9,11,13,16}. - 1D and 2D inplace transforms. The problem with cwplib is that it does not allow arbitrary sized FFTs. But it will be fast whenever FFTPACK is fast (it seems FFTPACK is only competitive for the N=2**k case). It will also be fast for other factorizations where FFTPACK will fall back to O(N**2). How to handle arbitrary sized arrays with cwplib? One possibility is using Bluestein's FFT algorithm, which just requires a few lines of Python. This was even suggested to avoid the O(N**2) fallback in FFTPACK here a while ago. If we are going to do that for FFTPACK, cwplib's FFT will actually be more flexible with respect to array size. All in all I think it is worth looking at. The C code is in the "Seismic U*nx" download from CWP, which is huge, but contains a lot of nice C library functions for scientific computing apart from FFTs as well (all of which are BSD licensed it seems). http://www.cwp.mines.edu/cwpcodes/index.html Regards, Sturla From sturla at molden.no Tue Jul 13 02:02:54 2010 From: sturla at molden.no (Sturla Molden) Date: Tue, 13 Jul 2010 08:02:54 +0200 Subject: [SciPy-Dev] Re FFTs in SciPy (and NumPy) In-Reply-To: <4C3BFD84.2070901@molden.no> References: <4C3BFD84.2070901@molden.no> Message-ID: <4C3C018E.6050200@molden.no> Since the code appears to be BSD licensed I can attach it here, in case you don't want to download 20 MB just to see one C file. Sturla -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... 
Name: LEGAL_STATEMENT URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: pfafft.c URL: From sturla at molden.no Tue Jul 13 02:16:24 2010 From: sturla at molden.no (Sturla Molden) Date: Tue, 13 Jul 2010 08:16:24 +0200 Subject: [SciPy-Dev] Re FFTs in SciPy (and NumPy) In-Reply-To: <4C3C018E.6050200@molden.no> References: <4C3BFD84.2070901@molden.no> <4C3C018E.6050200@molden.no> Message-ID: <4C3C04B8.4050601@molden.no> I should have included these files as well. Sturla Sturla Molden wrote: > Since the code appears to be BSD licensed I can attach it here, in > case you don't want to download 20 MB just to see one C file. > > Sturla > ------------------------------------------------------------------------ > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: dpfafft.c URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: cwp.h URL: From stefan at sun.ac.za Tue Jul 13 03:44:55 2010 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Tue, 13 Jul 2010 09:44:55 +0200 Subject: [SciPy-Dev] A zombie Scikits Trac? [was: Modifying the trac wiki] In-Reply-To: <1278972502.2403.2.camel@talisman> References: <1278972502.2403.2.camel@talisman> Message-ID: 2010/7/13 Pauli Virtanen : > Mon, 12 Jul 2010 10:35:18 +0200, Gael Varoquaux wrote: >> I don't believe I can modify the wiki pages on the scipy trac. I'd >> like to modify the following page: >> >> http://www.scipy.org/scipy/scikits/wiki/MachineLearning > > Uhh, that's the old scikits Trac. There's another one here: > > http://projects.scipy.org/scikits/ > > Does someone (Stefan?) remember why the old one wasn't deactivated? Is > there a reason not to deactivate it? 
I see no reason not to; I think it was simply left up to verify that information was transferred correctly between instances. I don't have access to the www.scipy.org server, though, so I hope Aaron can disable it. Regards Stéfan From ralf.gommers at googlemail.com Tue Jul 13 08:26:39 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Tue, 13 Jul 2010 20:26:39 +0800 Subject: [SciPy-Dev] memory corruption when running scipy 0.8 test suite In-Reply-To: <4C3BBD59.2020300@uci.edu> References: <4C3BBD59.2020300@uci.edu> Message-ID: Hi Christoph, On Tue, Jul 13, 2010 at 9:11 AM, Christoph Gohlke wrote: > Dear SciPy developers, > > I am trying to fix the errors and failures reported in the thread > "[SciPy-User] many test failures on windows 64" > . Most > of the issues seem specific for the msvc9/MKL build of scipy 0.8. I am > now down to 1 error and 6 failures (from 16 errors and 9 failures). > Thanks for working on this! Can you please share the fixes you have already (git branch or patches)? I have to make another release candidate which fixes the Rbf issue Scott Sinclair noticed and would like to incorporate these changes. For the ndimage issue I'm not sure what the answer is, but skipping the dsyevr tests on 64-bit Windows only should at least allow you to build binaries that pass all tests, right? Cheers, Ralf > > The following failure in ndimage does not appear when ndimage.test() is > run out of context of scipy.test() > > FAIL: extrema 3 > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "C:\Python26\lib\site-packages\scipy\ndimage\tests\test_ndimage.py", > line 3149, in test_extrema03 > self.failUnless(numpy.all(output1[2] == output4)) > AssertionError > > > The output1[2] array contains a NaN in the first position. 
Besides the ndimage failure, > scipy.lib.lapack seems to work and passes all tests. Also, this artifact > only happens or surfaces on the 64-bit build. > > > Index: test_esv.py > =================================================================== > --- test_esv.py (revision 6598) > +++ test_esv.py (working copy) > @@ -91,17 +91,17 @@ > def test_ssyevr(self): > self._test_base('ssyevr', 'F') > > - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") > - def test_dsyevr(self): > - self._test_base('dsyevr', 'F') > +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") > +# def test_dsyevr(self): > +# self._test_base('dsyevr', 'F') > > @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") > def test_ssyevr_ranges(self): > self._test_syevr_ranges('ssyevr', 'F') > > - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") > - def test_dsyevr_ranges(self): > - self._test_syevr_ranges('dsyevr', 'F') > +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") > +# def test_dsyevr_ranges(self): > +# self._test_syevr_ranges('dsyevr', 'F') > > # Clapack tests > @dec.skipif(CLAPACK_IS_EMPTY or not FUNCS_CLAPACK["ssyev"], > > > I checked the flapack_esv.pyf.src code but could not find anything > obvious. I am linking against mkl_lapack95_lp64, mkl_blas95_lp64, > mkl_intel_lp64, mkl_intel_thread, and mkl_core, MKL version 10.2.5.1. > > Thank you, > > Christoph > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From charlesr.harris at gmail.com Tue Jul 13 08:54:00 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 13 Jul 2010 06:54:00 -0600 Subject: [SciPy-Dev] Re FFTs in SciPy (and NumPy) In-Reply-To: <4C3BFD84.2070901@molden.no> References: <4C3BFD84.2070901@molden.no> Message-ID: On Mon, Jul 12, 2010 at 11:45 PM, Sturla Molden wrote: > There has been some discussion on FFTPACK lately. Problems with FFTPACK > seems to be: > > - Written in old Fortran 77. > - Unprecise for single precision. > - Can sometimes be very slow, depending on input size. > - Can only handle a few small prime factors {2,3,4,5} efficiently. > - How to control integer size portably for 64-bit and f2py? > - Only 1D transforms. > > There is another FFT library in cwplib from Colorado Mining School. It > has some very nice properties: > > - Written in C, not Fortran, and the FFT-part of the library is just one > single C file. > - License is BSD it seems. > - Quite fast (see benchmarks in http://www.fftw.org/fftw-paper.pdf and > http://www.fftw.org/speed/) > - Can handle larger set of prime factors than FFTPACK > {2,3,4,5,7,8,9,11,13,16}. > - 1D and 2D inplace transforms. > > The problem with cwplib is that it does not allow arbitrary sized FFTs. > But til will be fast whenever FFTPACK is fast (it seems FFTPACK is only > competitive for N=2**k case). It will also be fast for other > factorizations where FFTPACK will fallback to O(N**2). > > How to handle arbitrary sized arrays with cwplib? One possibility is > using Bluestein's FFT algorithm, which just requires a few lines og > Python. This was even suggested to avoid O(N**2) fallback in FFTPACK > here a while ago. If we are going to do that for FFTPACK, the cwplib's > FFT will actually be more flexible with respect to array size. > > All in all I think it is worth looking at. 
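[Editorial note: Sturla's remark that Bluestein's algorithm "just requires a few lines of Python" can be made concrete. The sketch below is an illustration only, not code from FFTPACK or cwplib; it re-expresses a DFT of arbitrary length n as a chirp convolution evaluated with power-of-two transforms from numpy.fft, giving O(n log n) for any n, prime lengths included.]

```python
import numpy as np

def bluestein_fft(x):
    """DFT of arbitrary length n via Bluestein's chirp-z trick.

    Uses nk = (n**2 + k**2 - (k - n)**2) / 2 to turn the DFT into a
    convolution, which is evaluated with power-of-two FFTs.
    """
    x = np.asarray(x, dtype=complex)
    n = x.size
    if n == 0:
        return x
    k = np.arange(n)
    # chirp[k] = exp(-i*pi*k^2/n)
    chirp = np.exp(-1j * np.pi * k * k / n)
    # convolution length: a power of two >= 2n - 1
    m = 1 << (2 * n - 2).bit_length()
    a = np.zeros(m, dtype=complex)
    a[:n] = x * chirp
    # b holds conj(chirp) at circular offsets 0..n-1 and -(n-1)..-1
    b = np.zeros(m, dtype=complex)
    b[:n] = np.conj(chirp)
    b[m - n + 1:] = np.conj(chirp[1:][::-1])
    # circular convolution via two forward FFTs and one inverse FFT
    conv = np.fft.ifft(np.fft.fft(a) * np.fft.fft(b))
    return conv[:n] * chirp
```

The result agrees with np.fft.fft for any length, including primes where a naive mixed-radix implementation would fall back to O(n**2).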
> > The C code is in the "Seismic U*nx" download from CWP, which is huge, > but contains a lot of nice C library functions for scientific computing > apart from FFTs as well (all of which are BSD licensed it seems). > http://www.cwp.mines.edu/cwpcodes/index.html > > Looks interesting. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Tue Jul 13 09:27:21 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Tue, 13 Jul 2010 21:27:21 +0800 Subject: [SciPy-Dev] ANN: scipy 0.8.0 release candidate 2 In-Reply-To: References: Message-ID: On Mon, Jul 12, 2010 at 3:59 PM, Scott Sinclair wrote: > >On 11 July 2010 17:24, Ralf Gommers wrote: > > I'm pleased to announce the availability of the second release candidate > of > > SciPy 0.8.0. If no more problems are reported, the final release will be > > available this Wednesday. > > I've just noticed this http://projects.scipy.org/scipy/ticket/1228 > > Good catch, and thanks for the patch. Applied in r6599 and r6601. A new RC will follow soon. Looking at the change that introduced the bug, there are (yet again) no tests for the changes. One test for function=callable is the minimum I'd think, added in r6602/03. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Tue Jul 13 09:40:51 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Tue, 13 Jul 2010 21:40:51 +0800 Subject: [SciPy-Dev] ANN: scipy 0.8.0 release candidate 2 In-Reply-To: <4C3A32DA.4010508@american.edu> References: <4C3A32DA.4010508@american.edu> Message-ID: On Mon, Jul 12, 2010 at 5:08 AM, Alan G Isaac wrote: > Still have the directory removal error and now have two warnings. > For the cluster warning, looks like that's a test with random input and I can't reproduce it. Can you find a fixed seed and make it reproducible? The matlab warning is just random noise, not an actual binary incompatibility. 
Maybe better to just filter it out in the 0.8.x branch. Also can't reproduce the dir removal message. Can you run the tests with verbose=2 and see which test it is? Thanks, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Tue Jul 13 10:15:07 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Tue, 13 Jul 2010 22:15:07 +0800 Subject: [SciPy-Dev] documenting fortran wrapped functions, was:Problems with documentation of scipy.interpolate.dfitpack In-Reply-To: References: Message-ID: On Fri, Jul 9, 2010 at 3:33 PM, wrote: > On Thu, Jul 8, 2010 at 1:18 AM, Vincent Davis > wrote: > > On Wed, Jul 7, 2010 at 10:09 PM, bowie_22 wrote: > >> Hello together, > >> > >> I tried to contribute to the documentation of scipy. > >> I found out that the whole > >> > >> scipy.interpolate.dfitpack > >> > >> package seems to be a little bit different. > >> > >> It is not possible to view the code. > >> You get 404 File not found error. > >> > >> Is there no source code available for dfitpack? > >> If so, how is the documentation handled? > > > > I think the first line in the doc editor says it all. > > "This module 'dfitpack' is auto-generated with f2py (version:2_8473). > > Functions:" > > I think this is the source > > > http://projects.scipy.org/scipy/browser/trunk/scipy/interpolate/src/fitpack.pyf > > > > Vincent > > Many of the fortran based functions are underdocumented, e.g. > lapack/blas in linalg, most functions in special, often the fortran > files contain quite extensive documentation. But it seems in some > cases like dfitpack, some documentation is autogenerated. > > Just a question because I don't have the time to look into this: > Do we need to mark f2py wrapped functions as "unimportant", or can we > make sure every f2py wrapped function has an editable docstring? > > I looked at this a while ago after writing docs for fftpack_lite. 
The problem I ran into is that the add_newdoc function only works for C code, not Fortran code. Not sure if this can be easily fixed or not. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Tue Jul 13 10:23:25 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 13 Jul 2010 08:23:25 -0600 Subject: [SciPy-Dev] documenting fortran wrapped functions, was:Problems with documentation of scipy.interpolate.dfitpack In-Reply-To: References: Message-ID: On Tue, Jul 13, 2010 at 8:15 AM, Ralf Gommers wrote: > > > On Fri, Jul 9, 2010 at 3:33 PM, wrote: > >> On Thu, Jul 8, 2010 at 1:18 AM, Vincent Davis >> wrote: >> > On Wed, Jul 7, 2010 at 10:09 PM, bowie_22 wrote: >> >> Hello together, >> >> >> >> I tried to contribute to the documentation of scipy. >> >> I found out that the whole >> >> >> >> scipy.interpolate.dfitpack >> >> >> >> package seems to be a little bit different. >> >> >> >> It is not possible to view the code. >> >> You get 404 File not found error. >> >> >> >> Is there no source code available for dfitpack? >> >> If so, how is the documentation handled? >> > >> > I think the first line in the doc editor says it all. >> > "This module 'dfitpack' is auto-generated with f2py (version:2_8473). >> > Functions:" >> > I think this is the source >> > >> http://projects.scipy.org/scipy/browser/trunk/scipy/interpolate/src/fitpack.pyf >> > >> > Vincent >> >> Many of the fortran based functions are underdocumented, e.g. >> lapack/blas in linalg, most functions in special, often the fortran >> files contain quite extensive documentation. But it seems in some >> cases like dfitpack, some documentation are autogenerated. >> >> Just a question because I don't have the time to look into this: >> Do we need to mark f2py wrapped functions as "unimportant", or can we >> make sure every f2py wrapped function has a editable docstring? 
>> >> I looked at this a while ago after writing docs for fftpack_lite. The > problem I ran into is that the add_newdoc function only works for C code, > not Fortran code. Not sure if this can be easily fixed or not. > > I've been thinking of redoing parts of that code in Cython. Not that there looks to be a lot of time for that, but the basic functions are small and the driver functions should be a lot cleaner with the base functions in C. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From aisaac at american.edu Tue Jul 13 10:54:48 2010 From: aisaac at american.edu (Alan G Isaac) Date: Tue, 13 Jul 2010 10:54:48 -0400 Subject: [SciPy-Dev] ANN: scipy 0.8.0 release candidate 2 In-Reply-To: References: <4C3A32DA.4010508@american.edu> Message-ID: <4C3C7E38.30706@american.edu> On 7/13/2010 9:40 AM, Ralf Gommers wrote: > Also can't reproduce the dir removal message. Can you run the tests with > verbose=2 and see which test it is? The error is reported on line 4366, with the following context. test_create_catalog (test_catalog.TestGetCatalog) ... Running unit tests for scipy NumPy version 1.4.1 NumPy is installed in c:\Python26\lib\site-packages\numpy SciPy version 0.8.0rc2 SciPy is installed in c:\Python26\lib\site-packages\scipy Python version 2.6.5 (r265:79096, Mar 19 2010, 21:48:26) [MSC v.1500 32 bit (Intel)] nose version 0.11.0 error removing c:\users\private\appdata\local\temp\tmpia8s7rcat_test: c:\users\private\appdata\local\temp\tmpia8s7rcat_test: The directory is not empty The output is long, so I'll send it off list. Alan From charlesr.harris at gmail.com Tue Jul 13 12:42:48 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 13 Jul 2010 10:42:48 -0600 Subject: [SciPy-Dev] openNURBS inclusion in interpolation Message-ID: As a follow-on to Sturla's suggestion of CWP for the standard FFT library, I'm looking into openNURBS for inclusion in the interpolation packages. Thoughts? 
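[Editorial note: for context on what a NURBS library would bring to scipy.interpolate: a NURBS curve is a rational weighting of ordinary B-spline basis functions. The sketch below is a hypothetical pure-numpy illustration, not the openNURBS API (which is a C++ library); the names `bspline_basis` and `nurbs_point` are invented here for the example.]

```python
import numpy as np

def bspline_basis(i, p, t, knots):
    """Cox-de Boor recursion: the i-th degree-p B-spline basis at t."""
    if p == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = 0.0
    denom = knots[i + p] - knots[i]
    if denom > 0:
        left = (t - knots[i]) / denom * bspline_basis(i, p - 1, t, knots)
    right = 0.0
    denom = knots[i + p + 1] - knots[i + 1]
    if denom > 0:
        right = ((knots[i + p + 1] - t) / denom
                 * bspline_basis(i + 1, p - 1, t, knots))
    return left + right

def nurbs_point(t, degree, knots, ctrl, weights):
    """Evaluate a NURBS curve point: weighted rational B-spline combination."""
    n = len(ctrl)
    basis = np.array([bspline_basis(i, degree, t, knots) for i in range(n)])
    wb = basis * np.asarray(weights)
    return (wb[:, None] * np.asarray(ctrl)).sum(axis=0) / wb.sum()
```

With degree 2, knot vector [0,0,0,1,1,1], control points (1,0), (1,1), (0,1) and weights (1, 1/sqrt(2), 1), this traces an exact quarter circle, which polynomial splines cannot represent and which is the usual argument for rational (NURBS) interpolation.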
Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From vitali at hkl.hms.harvard.edu Tue Jul 13 14:55:46 2010 From: vitali at hkl.hms.harvard.edu (Vitali V) Date: Tue, 13 Jul 2010 18:55:46 +0000 (UTC) Subject: [SciPy-Dev] =?utf-8?q?scipy_error_compiling_csr=5Fwrap_under_Pyth?= =?utf-8?q?on_2=2E7?= References: <4C339FAD.6010505@gmail.com> Message-ID: Bruce Southey gmail.com> writes: > > Hi, > I failed to get the scipy 0.8 rc1 and SVN to build under Python2.7 with > numpy '2.0.0.dev8469' and gcc version 4.4.4 20100503 (Red Hat 4.4.4-2) > (GCC) . But scipy does compile with Python 2.6. > > building 'scipy.sparse.sparsetools._csr' extension > compiling C++ sources > C compiler: g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv > -O3 -Wall -fPIC > .... Bruce, I've got precisely the same problem when attempting to compile Scipy under centos-5-x86_64. I've spent 2 days already trying to figure out what the problem is... Any ideas/clues on the solution?.. Any help is highly appreciated! Vitali From cgohlke at uci.edu Tue Jul 13 18:52:28 2010 From: cgohlke at uci.edu (Christoph Gohlke) Date: Tue, 13 Jul 2010 15:52:28 -0700 Subject: [SciPy-Dev] memory corruption when running scipy 0.8 test suite In-Reply-To: References: <4C3BBD59.2020300@uci.edu> Message-ID: <4C3CEE2C.8000307@uci.edu> Hi Ralf, On 7/13/2010 5:26 AM, Ralf Gommers wrote: > Hi Christoph, > > On Tue, Jul 13, 2010 at 9:11 AM, Christoph Gohlke > wrote: > > Dear SciPy developers, > > I am trying to fix the errors and failures reported in the thread > "[SciPy-User] many test failures on windows 64" > . Most > of the issues seem specific for the msvc9/MKL build of scipy 0.8. I am > now down to 1 error and 6 failures (from 16 errors and 9 failures). > > > Thanks for working on this! Can you please share the fixes you have > already (git branch or patches)? I have to make another release > candidate which fixes the Rbf issue Scott Sinclair noticed and would > like to incorporate these changes. 
I have to make another release > candidate which fixes the Rbf issue Scott Sinclair noticed and would > like to incorporate these changes. I have opened tickets and left comments on NumPy and SciPy trac: http://projects.scipy.org/scipy/ticket/1229 "Patch for two test failures with msvc9 build" http://projects.scipy.org/scipy/ticket/1225 "Test errors in scipy.sparse when using MSVC/MKL build" The sparse.linalg.spsolve function works correctly when linking SuperLU against CBLAS instead of MKL. Nevertheless, one error and two failures remain for the scipy.sparse tests. http://projects.scipy.org/numpy/ticket/1539 "MSVC specific TypeError when using double, longdouble in numpy.dot" This is fixed in numpy trunk. http://projects.scipy.org/scipy/ticket/678 "scipy.test failure with mkl/cdft" The scipy.odr test failures also appear on linux platforms when scipy is linked with MKL. http://projects.scipy.org/scipy/ticket/1210 "crash during sparse matrix slicing with Python 2.7" This crash still persists with Python 2.7 final. > > For the ndimage issue I'm not sure what the answer is, but skipping the > dsyevr tests on 64-bit Windows only should at least allow you to build > binaries that pass all tests, right? All ndimage tests would pass. One error and five failures would still remain for the whole scipy test suite. I am not suggesting to disable the dsyevr test in the final release. Hiding the failure could create false confidence. > The following failure in ndimage does not appear when ndimage.test() is > run out of context of scipy.test() > > FAIL: extrema 3 > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "C:\Python26\lib\site-packages\scipy\ndimage\tests\test_ndimage.py", > line 3149, in test_extrema03 > self.failUnless(numpy.all(output1[2] == output4)) > AssertionError > > > The output1[2] array contains a NaN in the first position. 
If I disable > the following dsyevr related tests in > scipy.lib.lapack.tests.test_esv.py > , all ndimage tests pass. > Could this > be a memory corruption issue in MKL? Besides the ndimage failure, > scipy.lib.lapack seems to work and passes all tests. Also, this artifact > only happens or surfaces on the 64-bit build. > > > Index: test_esv.py > =================================================================== > --- test_esv.py (revision 6598) > +++ test_esv.py (working copy) > @@ -91,17 +91,17 @@ > def test_ssyevr(self): > self._test_base('ssyevr', 'F') > > - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") > - def test_dsyevr(self): > - self._test_base('dsyevr', 'F') > +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") > +# def test_dsyevr(self): > +# self._test_base('dsyevr', 'F') > > @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") > def test_ssyevr_ranges(self): > self._test_syevr_ranges('ssyevr', 'F') > > - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") > - def test_dsyevr_ranges(self): > - self._test_syevr_ranges('dsyevr', 'F') > +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack test") > +# def test_dsyevr_ranges(self): > +# self._test_syevr_ranges('dsyevr', 'F') > > # Clapack tests > @dec.skipif(CLAPACK_IS_EMPTY or not FUNCS_CLAPACK["ssyev"], > > > I checked the flapack_esv.pyf.src code but could not find anything > obvious. I am linking against mkl_lapack95_lp64, mkl_blas95_lp64, > mkl_intel_lp64, mkl_intel_thread, and mkl_core, MKL version 10.2.5.1. 
> This is another weird test failure that is apparently dependent on the context in which the test is run: ====================================================================== FAIL: Real-valued Bessel domains ---------------------------------------------------------------------- Traceback (most recent call last): File "X:\Python26\lib\site-packages\scipy\special\tests\test_basic.py", line 1691, in test_ticket_854 assert not isnan(airye(-1)[2:4]).any(), airye(-1) AssertionError: (nan, nan, nan, nan) This failure is specific to the 32 bit build and only appears when I run 'python -c"import scipy;scipy.test()"', but not in 'python -c"import scipy.special;scipy.special.test()"'. In contrast to the ndimage failure, this test fails for scipy.test() even if I remove all tests but the scipy special tests. -- Christoph From charlesr.harris at gmail.com Tue Jul 13 23:19:18 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Tue, 13 Jul 2010 21:19:18 -0600 Subject: [SciPy-Dev] memory corruption when running scipy 0.8 test suite In-Reply-To: <4C3CEE2C.8000307@uci.edu> References: <4C3BBD59.2020300@uci.edu> <4C3CEE2C.8000307@uci.edu> Message-ID: On Tue, Jul 13, 2010 at 4:52 PM, Christoph Gohlke wrote: > Hi Ralph, > > On 7/13/2010 5:26 AM, Ralf Gommers wrote: > > Hi Christoph, > > > > On Tue, Jul 13, 2010 at 9:11 AM, Christoph Gohlke > > wrote: > > > > Dear SciPy developers, > > > > I am trying to fix the errors and failures reported in the thread > > "[SciPy-User] many test failures on windows 64" > > . > Most > > of the issues seem specific for the msvc9/MKL build of scipy 0.8. I > am > > now down to 1 error and 6 failures (from 16 errors and 9 failures). > > > > > > Thanks for working on this! Can you please share the fixes you have > > already (git branch or patches)? I have to make another release > > candidate which fixes the Rbf issue Scott Sinclair noticed and would > > like to incorporate these changes. 
> > > I have opened tickets and left comments on NumPy and SciPy trac: > > http://projects.scipy.org/scipy/ticket/1229 > "Patch for two test failures with msvc9 build" > > > http://projects.scipy.org/scipy/ticket/1225 > "Test errors in scipy.sparse when using MSVC/MKL build" > The sparse.linalg.spsolve function works correctly when linking SuperLU > against CBLAS instead of MKL. Nevertheless, one error and two failures > remain for the scipy.sparse tests. > > > http://projects.scipy.org/numpy/ticket/1539 > "MSVC specific TypeError when using double, longdouble in numpy.dot" > This is fixed in numpy trunk. > > > http://projects.scipy.org/scipy/ticket/678 > "scipy.test failure with mkl/cdft" > The scipy.odr test failures also appear on linux platforms when scipy is > linked with MKL. > > > http://projects.scipy.org/scipy/ticket/1210 > "crash during sparse matrix slicing with Python 2.7" > This crash still persists with Python 2.7 final. > > > > > > For the ndimage issue I'm not sure what the answer is, but skipping the > > dsyevr tests on 64-bit Windows only should at least allow you to build > > binaries that pass all tests, right? > > All ndimage tests would pass. One error and five failures would still > remain for the whole scipy test suite. > > I am not suggesting to disable the dsyevr test in the final release. > Hiding the failure could create false confidence. > > > > > The following failure in ndimage does not appear when ndimage.test() > is > > run out of context of scipy.test() > > > > FAIL: extrema 3 > > > ---------------------------------------------------------------------- > > Traceback (most recent call last): > > File > > "C:\Python26\lib\site-packages\scipy\ndimage\tests\test_ndimage.py", > > line 3149, in test_extrema03 > > self.failUnless(numpy.all(output1[2] == output4)) > > AssertionError > > > > > > The output1[2] array contains a NaN in the first position. 
If I > disable > > the following dsyevr related tests in > > scipy.lib.lapack.tests.test_esv.py > > , all ndimage tests pass. > > Could this > > be a memory corruption issue in MKL? Besides the ndimage failure, > > scipy.lib.lapack seems to work and passes all tests. Also, this > artifact > > only happens or surfaces on the 64-bit build. > > > > > > Index: test_esv.py > > =================================================================== > > --- test_esv.py (revision 6598) > > +++ test_esv.py (working copy) > > @@ -91,17 +91,17 @@ > > def test_ssyevr(self): > > self._test_base('ssyevr', 'F') > > > > - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack > test") > > - def test_dsyevr(self): > > - self._test_base('dsyevr', 'F') > > +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack > test") > > +# def test_dsyevr(self): > > +# self._test_base('dsyevr', 'F') > > > > @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack > test") > > def test_ssyevr_ranges(self): > > self._test_syevr_ranges('ssyevr', 'F') > > > > - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack > test") > > - def test_dsyevr_ranges(self): > > - self._test_syevr_ranges('dsyevr', 'F') > > +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip flapack > test") > > +# def test_dsyevr_ranges(self): > > +# self._test_syevr_ranges('dsyevr', 'F') > > > > # Clapack tests > > @dec.skipif(CLAPACK_IS_EMPTY or not FUNCS_CLAPACK["ssyev"], > > > > > > I checked the flapack_esv.pyf.src code but could not find anything > > obvious. I am linking against mkl_lapack95_lp64, mkl_blas95_lp64, > > mkl_intel_lp64, mkl_intel_thread, and mkl_core, MKL version 10.2.5.1. 
> > > > > > This is another weird test failure that is apparently dependent on the > context in which the test is run: > > ====================================================================== > FAIL: Real-valued Bessel domains > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "X:\Python26\lib\site-packages\scipy\special\tests\test_basic.py", line > 1691, in test_ticket_854 > assert not isnan(airye(-1)[2:4]).any(), airye(-1) > AssertionError: (nan, nan, nan, nan) > > > This failure is specific to the 32 bit build and only appears when I run > 'python -c"import scipy;scipy.test()"', but not in 'python -c"import > scipy.special;scipy.special.test()"'. In contrast to the ndimage > failure, this test fails for scipy.test() even if I remove all tests but > the scipy special tests. > > That's strange. Is the same function getting imported in both cases? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From cgohlke at uci.edu Wed Jul 14 05:25:36 2010 From: cgohlke at uci.edu (Christoph Gohlke) Date: Wed, 14 Jul 2010 02:25:36 -0700 Subject: [SciPy-Dev] memory corruption when running scipy 0.8 test suite In-Reply-To: References: <4C3BBD59.2020300@uci.edu> <4C3CEE2C.8000307@uci.edu> Message-ID: <4C3D8290.9060906@uci.edu> On 7/13/2010 8:19 PM, Charles R Harris wrote: > > > On Tue, Jul 13, 2010 at 4:52 PM, Christoph Gohlke > wrote: > > Hi Ralph, > > On 7/13/2010 5:26 AM, Ralf Gommers wrote: > > Hi Christoph, > > > > On Tue, Jul 13, 2010 at 9:11 AM, Christoph Gohlke > > >> wrote: > > > > Dear SciPy developers, > > > > I am trying to fix the errors and failures reported in the thread > > "[SciPy-User] many test failures on windows 64" > > > . Most > > of the issues seem specific for the msvc9/MKL build of scipy > 0.8. I am > > now down to 1 error and 6 failures (from 16 errors and 9 > failures). > > > > > > Thanks for working on this! 
Can you please share the fixes you have > > already (git branch or patches)? I have to make another release > > candidate which fixes the Rbf issue Scott Sinclair noticed and would > > like to incorporate these changes. > > > I have opened tickets and left comments on NumPy and SciPy trac: > > http://projects.scipy.org/scipy/ticket/1229 > "Patch for two test failures with msvc9 build" > > > http://projects.scipy.org/scipy/ticket/1225 > "Test errors in scipy.sparse when using MSVC/MKL build" > The sparse.linalg.spsolve function works correctly when linking SuperLU > against CBLAS instead of MKL. Nevertheless, one error and two failures > remain for the scipy.sparse tests. > > > http://projects.scipy.org/numpy/ticket/1539 > "MSVC specific TypeError when using double, longdouble in numpy.dot" > This is fixed in numpy trunk. > > > http://projects.scipy.org/scipy/ticket/678 > "scipy.test failure with mkl/cdft" > The scipy.odr test failures also appear on linux platforms when scipy is > linked with MKL. > > > http://projects.scipy.org/scipy/ticket/1210 > "crash during sparse matrix slicing with Python 2.7" > This crash still persists with Python 2.7 final. > > > > > > For the ndimage issue I'm not sure what the answer is, but > skipping the > > dsyevr tests on 64-bit Windows only should at least allow you to build > > binaries that pass all tests, right? > > All ndimage tests would pass. One error and five failures would still > remain for the whole scipy test suite. > > I am not suggesting to disable the dsyevr test in the final release. > Hiding the failure could create false confidence. 
> > > > > The following failure in ndimage does not appear when > ndimage.test() is > > run out of context of scipy.test() > > > > FAIL: extrema 3 > > > ---------------------------------------------------------------------- > > Traceback (most recent call last): > > File > > "C:\Python26\lib\site-packages\scipy\ndimage\tests\test_ndimage.py", > > line 3149, in test_extrema03 > > self.failUnless(numpy.all(output1[2] == output4)) > > AssertionError > > > > > > The output1[2] array contains a NaN in the first position. If > I disable > > the following dsyevr related tests in > > scipy.lib.lapack.tests.test_esv.py > > > , all ndimage tests pass. > > Could this > > be a memory corruption issue in MKL? Besides the ndimage failure, > > scipy.lib.lapack seems to work and passes all tests. Also, > this artifact > > only happens or surfaces on the 64-bit build. > > > > > > Index: test_esv.py > > > =================================================================== > > --- test_esv.py (revision 6598) > > +++ test_esv.py (working copy) > > @@ -91,17 +91,17 @@ > > def test_ssyevr(self): > > self._test_base('ssyevr', 'F') > > > > - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip > flapack test") > > - def test_dsyevr(self): > > - self._test_base('dsyevr', 'F') > > +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip > flapack test") > > +# def test_dsyevr(self): > > +# self._test_base('dsyevr', 'F') > > > > @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip > flapack test") > > def test_ssyevr_ranges(self): > > self._test_syevr_ranges('ssyevr', 'F') > > > > - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip > flapack test") > > - def test_dsyevr_ranges(self): > > - self._test_syevr_ranges('dsyevr', 'F') > > +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip > flapack test") > > +# def test_dsyevr_ranges(self): > > +# self._test_syevr_ranges('dsyevr', 'F') > > > > # Clapack tests > > @dec.skipif(CLAPACK_IS_EMPTY or not FUNCS_CLAPACK["ssyev"], > > > > > > I checked the 
flapack_esv.pyf.src code but could not find anything > > obvious. I am linking against mkl_lapack95_lp64, mkl_blas95_lp64, > > mkl_intel_lp64, mkl_intel_thread, and mkl_core, MKL version > 10.2.5.1. > > > > > > This is another weird test failure that is apparently dependent on the > context in which the test is run: > > ====================================================================== > FAIL: Real-valued Bessel domains > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "X:\Python26\lib\site-packages\scipy\special\tests\test_basic.py", line > 1691, in test_ticket_854 > assert not isnan(airye(-1)[2:4]).any(), airye(-1) > AssertionError: (nan, nan, nan, nan) > > > This failure is specific to the 32 bit build and only appears when I run > 'python -c"import scipy;scipy.test()"', but not in 'python -c"import > scipy.special;scipy.special.test()"'. In contrast to the ndimage > failure, this test fails for scipy.test() even if I remove all tests but > the scipy special tests. > > > That's strange. Is the same function getting imported in both cases? AFAICT the airye functions are the same in both cases (same type, repr, and __doc__). I reran the tests with the python -v switch and noticed that besides numpy and scipy, the matplotlib, PIL, sympy, zope, scikits, and enthought packages are imported (why?). Maybe there's some monkeypatching or name shadowing going on? Anyway, I could not reproduce this failure on a clean installation of just python, numpy, scipy, and nose. The test_twodiags (test_linsolve.TestLinsolve) error and the test_linsolve.TestSplu.test_spilu_smoketest failure also disappear on this clean installation.
-- Christoph From pav at iki.fi Wed Jul 14 06:44:59 2010 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 14 Jul 2010 10:44:59 +0000 (UTC) Subject: [SciPy-Dev] memory corruption when running scipy 0.8 test suite References: <4C3BBD59.2020300@uci.edu> <4C3CEE2C.8000307@uci.edu> <4C3D8290.9060906@uci.edu> Message-ID: Wed, 14 Jul 2010 02:25:36 -0700, Christoph Gohlke wrote: [clip] > AFAICT the airye functions are the same in both cases (same type, repr, > and __doc__). I reran the tests with the python -v switch and noticed > that besides numpy and scipy, the matplotlib, PIL, sympy, zope, scikits, > and entought packages are imported (why?). Maybe there's some > monkeypatching or name shadowing going on? If you install stuff as eggs, some of them may always be imported on Python startup. (I guess this is so for namespace hook-ful packages.) > Anyway, I could not reproduce > this failure on a clean installation of just python, numpy, scipy, and > nose. The test_twodiags (test_linsolve.TestLinsolve) error and the > test_linsolve.TestSplu.test_spilu_smoketest failure also disappear on > this clean installation. Do you mean that if you start from scratch, and install only the numpy +scipy packages you built for 64-bit Windows, all problems apart from the MKL ones disappear? Do the other packages you had installed use the same C runtime as the ones you built, and were they compiled with the same compilers? 
-- Pauli Virtanen From cgohlke at uci.edu Wed Jul 14 06:47:07 2010 From: cgohlke at uci.edu (Christoph Gohlke) Date: Wed, 14 Jul 2010 03:47:07 -0700 Subject: [SciPy-Dev] memory corruption when running scipy 0.8 test suite In-Reply-To: <4C3D8290.9060906@uci.edu> References: <4C3BBD59.2020300@uci.edu> <4C3CEE2C.8000307@uci.edu> <4C3D8290.9060906@uci.edu> Message-ID: <4C3D95AB.5070007@uci.edu> On 7/14/2010 2:25 AM, Christoph Gohlke wrote: > > > On 7/13/2010 8:19 PM, Charles R Harris wrote: >> >> >> On Tue, Jul 13, 2010 at 4:52 PM, Christoph Gohlke> > wrote: >> >> Hi Ralph, >> >> On 7/13/2010 5:26 AM, Ralf Gommers wrote: >> > Hi Christoph, >> > >> > On Tue, Jul 13, 2010 at 9:11 AM, Christoph Gohlke> >> > >> wrote: >> > >> > Dear SciPy developers, >> > >> > I am trying to fix the errors and failures reported in the thread >> > "[SciPy-User] many test failures on windows 64" >> > >> . Most >> > of the issues seem specific for the msvc9/MKL build of scipy >> 0.8. I am >> > now down to 1 error and 6 failures (from 16 errors and 9 >> failures). >> > >> > >> > Thanks for working on this! Can you please share the fixes you have >> > already (git branch or patches)? I have to make another release >> > candidate which fixes the Rbf issue Scott Sinclair noticed and would >> > like to incorporate these changes. >> >> >> I have opened tickets and left comments on NumPy and SciPy trac: >> >> http://projects.scipy.org/scipy/ticket/1229 >> "Patch for two test failures with msvc9 build" >> >> >> http://projects.scipy.org/scipy/ticket/1225 >> "Test errors in scipy.sparse when using MSVC/MKL build" >> The sparse.linalg.spsolve function works correctly when linking SuperLU >> against CBLAS instead of MKL. Nevertheless, one error and two failures >> remain for the scipy.sparse tests. >> >> >> http://projects.scipy.org/numpy/ticket/1539 >> "MSVC specific TypeError when using double, longdouble in numpy.dot" >> This is fixed in numpy trunk. 
>> >> >> http://projects.scipy.org/scipy/ticket/678 >> "scipy.test failure with mkl/cdft" >> The scipy.odr test failures also appear on linux platforms when scipy is >> linked with MKL. >> >> >> http://projects.scipy.org/scipy/ticket/1210 >> "crash during sparse matrix slicing with Python 2.7" >> This crash still persists with Python 2.7 final. >> >> >> > >> > For the ndimage issue I'm not sure what the answer is, but >> skipping the >> > dsyevr tests on 64-bit Windows only should at least allow you to build >> > binaries that pass all tests, right? >> >> All ndimage tests would pass. One error and five failures would still >> remain for the whole scipy test suite. >> >> I am not suggesting to disable the dsyevr test in the final release. >> Hiding the failure could create false confidence. >> >> >> >> > The following failure in ndimage does not appear when >> ndimage.test() is >> > run out of context of scipy.test() >> > >> > FAIL: extrema 3 >> > >> ---------------------------------------------------------------------- >> > Traceback (most recent call last): >> > File >> > "C:\Python26\lib\site-packages\scipy\ndimage\tests\test_ndimage.py", >> > line 3149, in test_extrema03 >> > self.failUnless(numpy.all(output1[2] == output4)) >> > AssertionError >> > >> > >> > The output1[2] array contains a NaN in the first position. If >> I disable >> > the following dsyevr related tests in >> > scipy.lib.lapack.tests.test_esv.py >> >> > , all ndimage tests pass. >> > Could this >> > be a memory corruption issue in MKL? Besides the ndimage failure, >> > scipy.lib.lapack seems to work and passes all tests. Also, >> this artifact >> > only happens or surfaces on the 64-bit build. 
>> > >> > >> > Index: test_esv.py >> > >> =================================================================== >> > --- test_esv.py (revision 6598) >> > +++ test_esv.py (working copy) >> > @@ -91,17 +91,17 @@ >> > def test_ssyevr(self): >> > self._test_base('ssyevr', 'F') >> > >> > - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip >> flapack test") >> > - def test_dsyevr(self): >> > - self._test_base('dsyevr', 'F') >> > +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip >> flapack test") >> > +# def test_dsyevr(self): >> > +# self._test_base('dsyevr', 'F') >> > >> > @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip >> flapack test") >> > def test_ssyevr_ranges(self): >> > self._test_syevr_ranges('ssyevr', 'F') >> > >> > - @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip >> flapack test") >> > - def test_dsyevr_ranges(self): >> > - self._test_syevr_ranges('dsyevr', 'F') >> > +# @dec.skipif(FLAPACK_IS_EMPTY, "Flapack empty, skip >> flapack test") >> > +# def test_dsyevr_ranges(self): >> > +# self._test_syevr_ranges('dsyevr', 'F') >> > >> > # Clapack tests >> > @dec.skipif(CLAPACK_IS_EMPTY or not FUNCS_CLAPACK["ssyev"], >> > >> > >> > I checked the flapack_esv.pyf.src code but could not find anything >> > obvious. I am linking against mkl_lapack95_lp64, mkl_blas95_lp64, >> > mkl_intel_lp64, mkl_intel_thread, and mkl_core, MKL version >> 10.2.5.1. 
>> > >> >> >> >> This is another weird test failure that is apparently dependent on the >> context in which the test is run: >> >> ====================================================================== >> FAIL: Real-valued Bessel domains >> ---------------------------------------------------------------------- >> Traceback (most recent call last): >> File >> "X:\Python26\lib\site-packages\scipy\special\tests\test_basic.py", line >> 1691, in test_ticket_854 >> assert not isnan(airye(-1)[2:4]).any(), airye(-1) >> AssertionError: (nan, nan, nan, nan) >> >> >> This failure is specific to the 32 bit build and only appears when I run >> 'python -c"import scipy;scipy.test()"', but not in 'python -c"import >> scipy.special;scipy.special.test()"'. In contrast to the ndimage >> failure, this test fails for scipy.test() even if I remove all tests but >> the scipy special tests. >> >> >> That's strange. Is the same function getting imported in both cases? > > > AFAICT the airye functions are the same in both cases (same type, repr, > and __doc__). I reran the tests with the python -v switch and noticed > that besides numpy and scipy, the matplotlib, PIL, sympy, zope, scikits, > and entought packages are imported (why?). Maybe there's some > monkeypatching or name shadowing going on? Anyway, I could not reproduce > this failure on a clean installation of just python, numpy, scipy, and > nose. The test_twodiags (test_linsolve.TestLinsolve) error and the > test_linsolve.TestSplu.test_spilu_smoketest failure also disappear on > this clean installation. 
> Finally, the "Real-valued Bessel domains", test_twodiags (test_linsolve.TestLinsolve), and test_linsolve.TestSplu.test_spilu_smoketest errors and failures disappear when I run the scipy tests with the -O flag, which tells Python to optimize generated bytecode: python -O -c"import scipy;scipy.test()" -> 4 failures python -c"import scipy;scipy.test()" -> 1 error, 6 failures This is reproducible on different computers, 32 & 64 bit installations, and EPD 6.2.2. The -O flag (or PYTHONOPTIMIZE=x environment variable) was the default on my minimal test installation but not on the development system. In the process of deleting *.pyo files and reinstalling other MKL dependent packages, the mysterious 64-bit ndimage/dsyevr failure also vanished. -- Christoph From pav at iki.fi Wed Jul 14 06:54:31 2010 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 14 Jul 2010 10:54:31 +0000 (UTC) Subject: [SciPy-Dev] memory corruption when running scipy 0.8 test suite References: <4C3BBD59.2020300@uci.edu> <4C3CEE2C.8000307@uci.edu> <4C3D8290.9060906@uci.edu> <4C3D95AB.5070007@uci.edu> Message-ID: Wed, 14 Jul 2010 03:47:07 -0700, Christoph Gohlke wrote: [clip] > Finally, the "Real-valued Bessel domains", test_twodiags > (test_linsolve.TestLinsolve), and > test_linsolve.TestSplu.test_spilu_smoketest errors and failures > disappear when I run the scipy tests with the -O flag, which tells > Python to optimize generated bytecode: > > python -O -c"import scipy;scipy.test()" -> 4 failures > > python -c "import scipy;scipy.test()" -> 1 error, 6 failures It also tells Python to remove all of the "assert" statements, which effectively removes some of the tests. Recommendation for using "assert" is maybe not the best of ideas in Nose, and we should probably avoid that... The airye stuff comes from Amos, and the nan failures either indicate issues with compilation of the Fortran codes, or (less likely) in the wrappers. 
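The effect Pauli describes is easy to reproduce without running the full suite: the optimize argument of the built-in compile() mirrors what the -O flag does to generated bytecode, namely stripping bare "assert" statements entirely. A small self-contained demonstration:

```python
# Bare "assert" statements are stripped when Python optimizes bytecode
# (python -O, or compile(..., optimize=1)), so tests written as plain
# asserts can silently pass -- which is why scipy.test() reported fewer
# failures under -O.
failing = "assert False, 'this test should fail'"

# Normal compilation: the assert is kept and raises.
try:
    exec(compile(failing, "<test>", "exec", optimize=0))
    kept = False
except AssertionError:
    kept = True

# Optimized compilation: the assert is compiled away; nothing is raised.
exec(compile(failing, "<test>", "exec", optimize=1))

print(kept)  # True: without optimization the assert fired
```

Helper functions such as numpy.testing's assert_array_almost_equal are ordinary function calls and therefore survive -O, which is one reason to prefer them over bare asserts in test suites.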
[clip] > In the process of deleting *.pyo files and reinstalling other MKL > dependent packages, the mysterious 64-bit ndimage/dsyevr failure also > vanished. So SuperLU still fails with MKL, but MKL causes no other additional failures? Could you re-paste the errors+failures that you see in a "clean" installation? Thanks, Pauli From ralf.gommers at googlemail.com Wed Jul 14 09:59:33 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 14 Jul 2010 21:59:33 +0800 Subject: [SciPy-Dev] ANN: scipy 0.8.0 release candidate 2 In-Reply-To: <4C3C7E38.30706@american.edu> References: <4C3A32DA.4010508@american.edu> <4C3C7E38.30706@american.edu> Message-ID: On Tue, Jul 13, 2010 at 10:54 PM, Alan G Isaac wrote: > On 7/13/2010 9:40 AM, Ralf Gommers wrote: > > Also can't reproduce the dir removal message. Can you run the tests with > > verbose=2 and see which test it is? > > > The error is reported on line 4366, > with the following context. > > test_create_catalog (test_catalog.TestGetCatalog) ... Running unit > tests for scipy > NumPy version 1.4.1 > NumPy is installed in c:\Python26\lib\site-packages\numpy > SciPy version 0.8.0rc2 > SciPy is installed in c:\Python26\lib\site-packages\scipy > Python version 2.6.5 (r265:79096, Mar 19 2010, 21:48:26) [MSC > v.1500 32 bit (Intel)] > nose version 0.11.0 > error removing > c:\users\private\appdata\local\temp\tmpia8s7rcat_test: > c:\users\private\appdata\local\temp\tmpia8s7rcat_test: The > directory is not empty > > The output is long, so I'll send it off list. > > Looks like this has something to do with either the permissions on the folder being set wrong, in which case it can't be removed even though it's empty, or with Python wanting a case-sensitive filename, which is impossible to get on Windows except for when using the win32api extension. In either case, it's not easy to solve. And since it's just some empty folders sticking around in a temp dir I'm going to just ignore the issue for now. 
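For anyone who does want to work around the stray-temp-directory problem, the usual recipe on Windows is to clear read-only bits in an onerror handler and retry briefly, since a transient handle held by an indexer or virus scanner can keep a directory "non-empty" for a moment. A generic sketch, not scipy's actual cleanup code:

```python
import os
import shutil
import stat
import tempfile
import time


def rmtree_robust(path, retries=3, delay=0.1):
    """Remove a tree, clearing read-only bits and retrying on failure."""
    def _on_error(func, target, exc_info):
        # Clear the read-only bit and retry the failed operation once.
        os.chmod(target, stat.S_IWRITE)
        func(target)

    for attempt in range(retries):
        try:
            shutil.rmtree(path, onerror=_on_error)
            return
        except OSError:
            if attempt == retries - 1:
                raise
            time.sleep(delay)


tmp = tempfile.mkdtemp(suffix="cat_test")
open(os.path.join(tmp, "dummy.py"), "w").close()
rmtree_robust(tmp)
print(os.path.exists(tmp))  # False
```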
Some other bug reports on the same issue: http://bitbucket.org/pv/textext/issue/25 http://bitten.edgewall.org/ticket/253 Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From jjstickel at vcn.com Wed Jul 14 13:36:57 2010 From: jjstickel at vcn.com (Jonathan Stickel) Date: Wed, 14 Jul 2010 11:36:57 -0600 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: References: Message-ID: <4C3DF5B9.4000807@vcn.com> On 7/8/10 07:05 , scipy-dev-request at scipy.org wrote: > Date: Thu, 8 Jul 2010 08:24:25 -0400 > From:josef.pktd at gmail.com > Subject: Re: [SciPy-Dev] scikits contribution? > To: SciPy Developers List > Message-ID: > > Content-Type: text/plain; charset=ISO-8859-1 > > On Wed, Jul 7, 2010 at 4:42 PM, Jonathan Stickel wrote: >> > Some time ago, I offered to contribute (on the scipy-user list) some >> > code to smooth 1-D data by regularization: >> > >> > http://mail.scipy.org/pipermail/scipy-user/2010-February/024351.html >> > >> > Someone suggested that scikits might be the right place: >> > >> > http://mail.scipy.org/pipermail/scipy-user/2010-February/024408.html >> > >> > So I am finally looking into scikits, and I am not sure how to proceed. >> > ?My code consists of several functions in a single .py file. ?It seems >> > overkill to create a new scikit for just one file, but I do not see an >> > existing scikit that matches. ?'Optimization' would be the closest; in >> > core scipy I would put it in 'interpolate'. >> > >> > So, what is the minimum that I need to do to create a scikit and upload >> > my code? ?Any suggestions for the name of the scikit (interpolate, >> > data_smoothing)? > > The easiest to get started is to copy the setup structure from another scikit. > I think the template scikit in scikits svn is a bit out of date, the > last time I looked. 
> > If you think your model could form the basis for enhancing the > smoother or noisy interpolation category in scipy, then a scikits > would be the best way, as we discussed. > > If you want to add it to an existing scikits, then statsmodels would > be a possibility. > Although statsmodels is more oriented towards multivariate approaches, > I think a smoother category, together with some non-parametric > methods, e.g. the existing kernel regression, would be an appropriate > fit. There is a need for smoothers in gam, Generalized Additive > Models, but that one is not cleaned up yet. > > And I think there will be more applications where it would be useful > to share the cross-validation code as far as possible. > > Josef > >> > >> > Please know that I am just starting to learn python, being a convert >> > from matlab/octave. ?Although I have become fairly proficient using >> > numpy/scipy in ipython, I do not know much about python internals, >> > setuptools, etc. >> > OK, I created a scikit named "datasmooth" and included my current code. It seems to install OK with "python setup install" and import correctly. However, I am not able to commit to the svn repository. I registered on the scikits wiki, but I guess there is something else I need to do? Thanks, Jonathan P.S. Please cc me in your reply since I receive list emails in digest form. From josef.pktd at gmail.com Wed Jul 14 13:57:49 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Wed, 14 Jul 2010 13:57:49 -0400 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: <4C3DF5B9.4000807@vcn.com> References: <4C3DF5B9.4000807@vcn.com> Message-ID: On Wed, Jul 14, 2010 at 1:36 PM, Jonathan Stickel wrote: > On 7/8/10 07:05 , scipy-dev-request at scipy.org wrote: >> Date: Thu, 8 Jul 2010 08:24:25 -0400 >> From:josef.pktd at gmail.com >> Subject: Re: [SciPy-Dev] scikits contribution? >> To: SciPy Developers List >> Message-ID: >> ? ? ? 
>> Content-Type: text/plain; charset=ISO-8859-1 >> >> On Wed, Jul 7, 2010 at 4:42 PM, Jonathan Stickel ?wrote: >>> > ?Some time ago, I offered to contribute (on the scipy-user list) some >>> > ?code to smooth 1-D data by regularization: >>> > >>> > ?http://mail.scipy.org/pipermail/scipy-user/2010-February/024351.html >>> > >>> > ?Someone suggested that scikits might be the right place: >>> > >>> > ?http://mail.scipy.org/pipermail/scipy-user/2010-February/024408.html >>> > >>> > ?So I am finally looking into scikits, and I am not sure how to proceed. >>> > ??My code consists of several functions in a single .py file. ?It seems >>> > ?overkill to create a new scikit for just one file, but I do not see an >>> > ?existing scikit that matches. ?'Optimization' would be the closest; in >>> > ?core scipy I would put it in 'interpolate'. >>> > >>> > ?So, what is the minimum that I need to do to create a scikit and upload >>> > ?my code? ?Any suggestions for the name of the scikit (interpolate, >>> > ?data_smoothing)? >> >> The easiest to get started is to copy the setup structure from another scikit. >> I think the template scikit in scikits svn is a bit out of date, the >> last time I looked. >> >> If you think your model could form the basis for enhancing the >> smoother or noisy interpolation category in scipy, then a scikits >> would be the best way, as we discussed. >> >> If you want to add it to an existing scikits, then statsmodels would >> be a possibility. >> Although statsmodels is more oriented towards multivariate approaches, >> I think a smoother category, together with some non-parametric >> methods, e.g. the existing kernel regression, would be an appropriate >> fit. There is a need for smoothers in gam, Generalized Additive >> Models, but that one is not cleaned up yet. >> >> And I think there will be more applications where it would be useful >> to share the cross-validation code as far as possible. 
>> >> Josef >> >>> > >>> > ?Please know that I am just starting to learn python, being a convert >>> > ?from matlab/octave. ?Although I have become fairly proficient using >>> > ?numpy/scipy in ipython, I do not know much about python internals, >>> > ?setuptools, etc. >>> > > > OK, I created a scikit named "datasmooth" and included my current code. > ?It seems to install OK with "python setup install" and import > correctly. ?However, I am not able to commit to the svn repository. ?I > registered on the scikits wiki, but I guess there is something else I > need to do? Sharing the code would be much easier if you pick your favorite decentralized revision control system, git, bazaar or mercurial, and host it at the corresponding website. It would also avoid any permission questions. I don't know who handles setup and administration of http://projects.scipy.org/scikits. Josef > > Thanks, > Jonathan > > P.S. > Please cc me in your reply since I receive list emails in digest form. > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From cgohlke at uci.edu Wed Jul 14 14:33:36 2010 From: cgohlke at uci.edu (Christoph Gohlke) Date: Wed, 14 Jul 2010 11:33:36 -0700 Subject: [SciPy-Dev] memory corruption when running scipy 0.8 test suite In-Reply-To: References: <4C3BBD59.2020300@uci.edu> <4C3CEE2C.8000307@uci.edu> <4C3D8290.9060906@uci.edu> <4C3D95AB.5070007@uci.edu> Message-ID: <4C3E0300.8050800@uci.edu> On 7/14/2010 3:54 AM, Pauli Virtanen wrote: > Wed, 14 Jul 2010 03:47:07 -0700, Christoph Gohlke wrote: > [clip] >> Finally, the "Real-valued Bessel domains", test_twodiags >> (test_linsolve.TestLinsolve), and >> test_linsolve.TestSplu.test_spilu_smoketest errors and failures >> disappear when I run the scipy tests with the -O flag, which tells >> Python to optimize generated bytecode: >> >> python -O -c"import scipy;scipy.test()" -> 4 failures >> >> python -c "import 
scipy;scipy.test()" -> 1 error, 6 failures > > It also tells Python to remove all of the "assert" statements, which > effectively removes some of the tests. I forgot that. > > Recommendation for using "assert" is maybe not the best of ideas in Nose, > and we should probably avoid that... > > The airye stuff comes from Amos, and the nan failures either indicate > issues with compilation of the Fortran codes, or (less likely) in the > wrappers. > > [clip] >> In the process of deleting *.pyo files and reinstalling other MKL >> dependent packages, the mysterious 64-bit ndimage/dsyevr failure also >> vanished. > > So SuperLU still fails with MKL, but MKL causes no other additional > failures? Unfortunately, now the ndimage/dsyevr failure reappeared even on a clean installation. The ODR and QMR failures below might also be due to MKL. Would you be interested in a patch for linking SuperLU against CBLAS? > > Could you re-paste the errors+failures that you see in a "clean" > installation? > Sure, sorry for the confusion. This is the output of a scipy.test() run without PYTHONOPTIMIZE on a clean minimal 32-bit installation. Numpy and scipy include the latest fixes from svn and SuperLU is linked against CBLAS: Running unit tests for scipy NumPy version 1.4.1 NumPy is installed in X:\python26\lib\site-packages\numpy SciPy version 0.8.0.dev6609 SciPy is installed in X:\python26\lib\site-packages\scipy Python version 2.6.5 (r265:79096, Mar 19 2010, 21:48:26) [MSC v.1500 32 bit (Intel)] nose version 0.11.3 ................................................................................ ................................................................................ ......................................K......................................... ................................................................................ ....K..K........................................................................ 
................................................................................ ................................................................................ ................................................................................ ................................................................................ ................................................................................ ................................................................................ .........................................SSSSSS......SSSSSS......SSSS........... ....................................................S........................... ................................................................................ ................................................................................ ...................K............................................................ ................................................................................ ................................................................................ ................................................................................ ................................................................................ ................................................................................ ................................................................................ ..........................................................FFF................... ................................................................................ ................................................................................ ................................................................................ ...............................................................SSSSSSSSSSS.E..F. .........K.........F............................................................ .................................................................K.............. 
.................................................K.............................. ................................................................................ ...........................................KK................................... ................................................................................ ................................................................................ ................................................................................ ................................................................................ ....................................K.K......................................... ................................................................................ ................................................................................ ................................................................................ ................................................................................ ..........K........K.........SSSSSS............................................. ................................................................................ ................................................................................ ................................................................................ ................................................................................ ................................................................................ ...................................................................S............ ................................................................................ ................................................................................ ................................................................................ ................................................................................ ................................................................................ 
.............................................................error removing c:\users\gohlke\appdata\local\temp\tmpaqkoypcat_test: c:\users\gohlke\appdata\local\temp\tmpaqkoypcat_test: The directory is not empty ................................................................................ .................. ====================================================================== ERROR: test_twodiags (test_linsolve.TestLinsolve) ---------------------------------------------------------------------- Traceback (most recent call last): File "X:\python26\lib\site-packages\scipy\sparse\linalg\dsolve\tests\test_linsolve.py", line 39, in test_twodiags assert( norm(b - Asp*x) < 10 * cond_A * eps ) File "X:\python26\lib\site-packages\scipy\linalg\misc.py", line 9, in norm return np.linalg.norm(np.asarray_chkfinite(a), ord=ord) File "X:\python26\lib\site-packages\numpy\lib\function_base.py", line 586, in asarray_chkfinite raise ValueError, "array must not contain infs or NaNs" ValueError: array must not contain infs or NaNs ====================================================================== FAIL: test_lorentz (test_odr.TestODR) ---------------------------------------------------------------------- Traceback (most recent call last): File "X:\python26\lib\site-packages\scipy\odr\tests\test_odr.py", line 292, in test_lorentz 3.7798193600109009e+00]), File "X:\python26\lib\site-packages\numpy\testing\utils.py", line 765, in assert_array_almost_equal header='Arrays are not almost equal') File "X:\python26\lib\site-packages\numpy\testing\utils.py", line 609, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 100.0%) x: array([ 1.00000000e+03, 1.00000000e-01, 3.80000000e+00]) y: array([ 1.43067808e+03, 1.33905090e-01, 3.77981936e+00]) ====================================================================== FAIL: test_multi (test_odr.TestODR) ---------------------------------------------------------------------- Traceback (most
recent call last): File "X:\python26\lib\site-packages\scipy\odr\tests\test_odr.py", line 188, in test_multi 0.5101147161764654, 0.5173902330489161]), File "X:\python26\lib\site-packages\numpy\testing\utils.py", line 765, in assert_array_almost_equal header='Arrays are not almost equal') File "X:\python26\lib\site-packages\numpy\testing\utils.py", line 609, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 100.0%) x: array([ 4. , 2. , 7. , 0.4, 0.5]) y: array([ 4.37998803, 2.43330576, 8.00288459, 0.51011472, 0.51739023]) ====================================================================== FAIL: test_pearson (test_odr.TestODR) ---------------------------------------------------------------------- Traceback (most recent call last): File "X:\python26\lib\site-packages\scipy\odr\tests\test_odr.py", line 235, in test_pearson np.array([ 5.4767400299231674, -0.4796082367610305]), File "X:\python26\lib\site-packages\numpy\testing\utils.py", line 765, in assert_array_almost_equal header='Arrays are not almost equal') File "X:\python26\lib\site-packages\numpy\testing\utils.py", line 609, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 100.0%) x: array([ 1., 1.]) y: array([ 5.47674003, -0.47960824]) ====================================================================== FAIL: test_linsolve.TestSplu.test_spilu_smoketest ---------------------------------------------------------------------- Traceback (most recent call last): File "X:\python26\lib\site-packages\nose\case.py", line 186, in runTest self.test(*self.arg) File "X:\python26\lib\site-packages\scipy\sparse\linalg\dsolve\tests\test_linsolve.py", line 63, in test_spilu_smoketest assert abs(x - r).max() < 1e-2 AssertionError ====================================================================== FAIL: Check that QMR works with left and right preconditioners 
---------------------------------------------------------------------- Traceback (most recent call last): File "X:\python26\lib\site-packages\scipy\sparse\linalg\isolve\tests\test_iterative.py", line 178, in test_leftright_precond assert_equal(info,0) File "X:\python26\lib\site-packages\numpy\testing\utils.py", line 309, in assert_equal raise AssertionError(msg) AssertionError: Items are not equal: ACTUAL: 1 DESIRED: 0 ---------------------------------------------------------------------- Ran 4399 tests in 36.788s FAILED (KNOWNFAIL=13, SKIP=35, errors=1, failures=5) On a clean 64-bit installation I also get the ndimage/dsyevr memory corruption (?) failure: ====================================================================== FAIL: extrema 3 ---------------------------------------------------------------------- Traceback (most recent call last): File "X:\python26-x64\lib\site-packages\scipy\ndimage\tests\test_ndimage.py", line 3149, in test_extrema03 self.failUnless(numpy.all(output1[2] == output4)) AssertionError And on my 32-bit development installation I still get the airye failure, but only when run in the context of scipy.test(): ====================================================================== FAIL: Real-valued Bessel domains ---------------------------------------------------------------------- Traceback (most recent call last): File "X:\python26\lib\site-packages\scipy\special\tests\test_basic.py", line 1691, in test_ticket_854 assert not isnan(airye(-1)[2:4]).any(), airye(-1) AssertionError: (nan, nan, nan, nan) Thank you for all the help. This work should also help improve the scipy package in EPD, which shows most of the test failures I see. -- Christoph From jjstickel at vcn.com Wed Jul 14 16:59:17 2010 From: jjstickel at vcn.com (Jonathan Stickel) Date: Wed, 14 Jul 2010 14:59:17 -0600 Subject: [SciPy-Dev] scikits contribution? 
In-Reply-To: References: <4C3DF5B9.4000807@vcn.com> Message-ID: <4C3E2525.8030107@vcn.com> On 7/14/10 12:01 , josef.pktd at gmail.com wrote: > forgot to CC > > Josef > > ---------- Forwarded message ---------- > From: > Date: Wed, Jul 14, 2010 at 1:57 PM > Subject: Re: [SciPy-Dev] scikits contribution? > To: SciPy Developers List > > > On Wed, Jul 14, 2010 at 1:36 PM, Jonathan Stickel wrote: >> On 7/8/10 07:05 , scipy-dev-request at scipy.org wrote: >>> Date: Thu, 8 Jul 2010 08:24:25 -0400 >>> From: josef.pktd at gmail.com >>> Subject: Re: [SciPy-Dev] scikits contribution? >>> To: SciPy Developers List >>> Message-ID: >>> >>> Content-Type: text/plain; charset=ISO-8859-1 >>> >>> On Wed, Jul 7, 2010 at 4:42 PM, Jonathan Stickel wrote: >>>>> Some time ago, I offered to contribute (on the scipy-user list) some >>>>> code to smooth 1-D data by regularization: >>>>> >>>>> http://mail.scipy.org/pipermail/scipy-user/2010-February/024351.html >>>>> >>>>> Someone suggested that scikits might be the right place: >>>>> >>>>> http://mail.scipy.org/pipermail/scipy-user/2010-February/024408.html >>>>> >>>>> So I am finally looking into scikits, and I am not sure how to proceed. >>>>> My code consists of several functions in a single .py file. It seems >>>>> overkill to create a new scikit for just one file, but I do not see an >>>>> existing scikit that matches. 'Optimization' would be the closest; in >>>>> core scipy I would put it in 'interpolate'. >>>>> >>>>> So, what is the minimum that I need to do to create a scikit and upload >>>>> my code? Any suggestions for the name of the scikit (interpolate, >>>>> data_smoothing)? >>> >>> The easiest to get started is to copy the setup structure from another scikit. >>> I think the template scikit in scikits svn is a bit out of date, the >>> last time I looked.
>>> >>> If you think your model could form the basis for enhancing the >>> smoother or noisy interpolation category in scipy, then a scikits >>> would be the best way, as we discussed. >>> >>> If you want to add it to an existing scikits, then statsmodels would >>> be a possibility. >>> Although statsmodels is more oriented towards multivariate approaches, >>> I think a smoother category, together with some non-parametric >>> methods, e.g. the existing kernel regression, would be an appropriate >>> fit. There is a need for smoothers in gam, Generalized Additive >>> Models, but that one is not cleaned up yet. >>> >>> And I think there will be more applications where it would be useful >>> to share the cross-validation code as far as possible. >>> >>> Josef >>> >>>>> >>>>> Please know that I am just starting to learn python, being a convert >>>>> from matlab/octave. Although I have become fairly proficient using >>>>> numpy/scipy in ipython, I do not know much about python internals, >>>>> setuptools, etc. >>>>> >> >> OK, I created a scikit named "datasmooth" and included my current code. >> It seems to install OK with "python setup.py install" and import >> correctly. However, I am not able to commit to the svn repository. I >> registered on the scikits wiki, but I guess there is something else I >> need to do? > > Sharing the code would be much easier if you pick your favorite > decentralized revision control system, git, bazaar or mercurial, and > host it at the corresponding website. > It would also avoid any permission questions. I don't know who handles > setup and administration of http://projects.scipy.org/scikits. > It seems strange to me that each scikit would host its own development sources in separate locations! I'd prefer to use the existing SVN tree for my small contribution, if possible. I did register at http://projects.scipy.org/scikits with the username "jjstickel".
The wiki page indicates that this is all that is needed to edit the wiki, but I do not see a "Edit this page" link. So it does seem that someone needs to give me permission. I also need permission for commit access to svn.scipy.org/svn/scikits/. If someone wants to see some code before giving out permissions willy nilly, I can do that of course! Just let me know. Thanks, Jonathan >> P.S. >> Please cc me in your reply since I receive list emails in digest form. From millman at berkeley.edu Wed Jul 14 17:50:09 2010 From: millman at berkeley.edu (Jarrod Millman) Date: Wed, 14 Jul 2010 14:50:09 -0700 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: <4C3E2525.8030107@vcn.com> References: <4C3DF5B9.4000807@vcn.com> <4C3E2525.8030107@vcn.com> Message-ID: On Wed, Jul 14, 2010 at 1:59 PM, Jonathan Stickel wrote: > I did register at http://projects.scipy.org/scikits with the username > "jjstickel". The wiki page indicates that this is all that is needed to > edit the wiki, but I do not see a "Edit this page" link. So it does > seem that someone needs to give me permission. I also need permission > for commit access to svn.scipy.org/svn/scikits/. Hey Jonathan, Thanks for contributing a new scikit! Unfortunately, it is a bit tedious to create accounts on the existing subversion system and there is no way to separate developer accounts between the various scikits. If you really want to use subversion, I would create a sourceforge project. For example, that is where the scikits.learn project is hosted: http://sourceforge.net/projects/scikit-learn Otherwise, I would recommend using github or one of the other dvcs hosting sites. In the near future (over the next month or so) it looks increasingly likely that we will be moving numpy and possibly scipy to git/github, so the existing scikits subversion repository will probably get moved over and broken up as well.
The official scikits registry is here: http://scikits.appspot.com/ Any package named "scikits.xyz" on the Python Packaging Index will be included on the scikits.appspot.com website automatically. So there is no advantage to having commit access to the subversion site and there are several disadvantages such as it is slow, there isn't an obvious way to get account access, and it may disappear soon (assuming we move to github). Best, Jarrod From jjstickel at vcn.com Wed Jul 14 18:01:07 2010 From: jjstickel at vcn.com (Jonathan Stickel) Date: Wed, 14 Jul 2010 16:01:07 -0600 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: References: Message-ID: <4C3E33A3.6080000@vcn.com> On 7/14/10 15:50 , scipy-dev-request at scipy.org wrote: > Date: Wed, 14 Jul 2010 14:50:09 -0700 > From: Jarrod Millman > Subject: Re: [SciPy-Dev] scikits contribution? > To: SciPy Developers List > Message-ID: > > Content-Type: text/plain; charset=ISO-8859-1 > > On Wed, Jul 14, 2010 at 1:59 PM, Jonathan Stickel wrote: >> > I did register at http://projects.scipy.org/scikits with the username >> > "jjstickel". The wiki page indicates that this is all that is needed to >> > edit the wiki, but I do not see a "Edit this page" link. So it does >> > seem that someone needs to give me permission. I also need permission >> > for commit access to svn.scipy.org/svn/scikits/. > Hey Jonathan, > > Thanks for contributing a new scikit! > > Unfortunately, it is a bit tedious to create accounts on the existing > subversion system and there is no way to separate developer accounts > between the various scikits. If you really want to use subversion, I > would create a sourceforge project. For example, that is where the > scikits.learn project is hosted: > http://sourceforge.net/projects/scikit-learn > Otherwise, I would recommend using github or one of the other dvcs > hosting sites.
> > In the near future (over the next month or so) it looks increasingly > likely that we will be moving numpy and possibly scipy to git/github, > so the existing scikits subversion repository will probably get moved > over and broken up as well. > > The official scikits registry is here: http://scikits.appspot.com/ > Any package named "scikits.xyz" on the Python Packaging Index will be > included on the scikits.appspot.com website automatically. So there > is no advantage to having commit access to the subversion site and > there are several disadvantages such as it is slow, there isn't an > obvious way to get account access, and it may disappear soon (assuming > we move to github). > OK, I'll check out github, or, if the learning curve is high, I may fall back to sourceforge since I have more experience with that interface. Can you elaborate on "Any package named "scikits.xyz" on the Python Packaging Index will be included on the scikits.appspot.com website automatically"? Where is the Python Packaging Index and how do I get a scikit to show up there once I make the sources available? I have to say this has become a lot more involved than I initially expected. Simpler procedures and better "howto" documentation would be helpful for sideline contributors like myself. Maybe this will improve after the move to github? Thanks, Jonathan From jsseabold at gmail.com Wed Jul 14 18:06:19 2010 From: jsseabold at gmail.com (Skipper Seabold) Date: Wed, 14 Jul 2010 18:06:19 -0400 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: <4C3E33A3.6080000@vcn.com> References: <4C3E33A3.6080000@vcn.com> Message-ID: On Wed, Jul 14, 2010 at 6:01 PM, Jonathan Stickel wrote: > On 7/14/10 15:50 , scipy-dev-request at scipy.org wrote: >> [Jarrod Millman's 14:50:09 message quoted in full; trimmed] > > OK, I'll check out github, or, if the learning curve is high, I may fall > back to sourceforge since I have more experience with that interface.
> > Can you elaborate on "Any package named "scikits.xyz" on the Python > Packaging Index will be included on the scikits.appspot.com website > automatically"? Where is the Python Packaging Index and how do I get a http://pypi.python.org/pypi > scikit to show up there once I make the sources available? See below. > > I have to say this has become a lot more involved than I initially > expected. Simpler procedures and better "howto" documentation would be > helpful for sideline contributors like myself. Maybe this will improve > after the move to github? > I think most all of your questions are answered here. http://www.scipy.org/scipy/scikits/wiki/ScikitsForDevelopers Skipper From jsseabold at gmail.com Wed Jul 14 18:08:28 2010 From: jsseabold at gmail.com (Skipper Seabold) Date: Wed, 14 Jul 2010 18:08:28 -0400 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: References: <4C3E33A3.6080000@vcn.com> Message-ID: On Wed, Jul 14, 2010 at 6:06 PM, Skipper Seabold wrote: > [previous message, including the full quoted thread, trimmed] > I think most all of your questions are answered here. > > http://www.scipy.org/scipy/scikits/wiki/ScikitsForDevelopers > Up to date link: http://projects.scipy.org/scikits/wiki/ScikitsForDevelopers Skipper From millman at berkeley.edu Wed Jul 14 18:11:55 2010 From: millman at berkeley.edu (Jarrod Millman) Date: Wed, 14 Jul 2010 15:11:55 -0700 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: <4C3E33A3.6080000@vcn.com> References: <4C3E33A3.6080000@vcn.com> Message-ID: On Wed, Jul 14, 2010 at 3:01 PM, Jonathan Stickel wrote: > I have to say this has become a lot more involved than I initially expected. > Simpler procedures and better "howto" documentation would be helpful for > sideline contributors like myself. Maybe this will improve after the move > to github? Absolutely. No one has really stepped up to own the scikits overall project/structure per se. Some of the documentation is out-of-date and it has changed some from what it was originally conceived. Hopefully, over the next year someone will step forward and help with this aspect of the project. If you are considering moving to git/github, but haven't used it yet, you may want to wait a week or so. We will be adding instructions on how to use it for numpy, which may be helpful to you as well. Best, Jarrod From millman at berkeley.edu Wed Jul 14 18:14:32 2010 From: millman at berkeley.edu (Jarrod Millman) Date: Wed, 14 Jul 2010 15:14:32 -0700 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: References: <4C3E33A3.6080000@vcn.com> Message-ID: On Wed, Jul 14, 2010 at 3:08 PM, Skipper Seabold wrote: > Up to date link: http://projects.scipy.org/scikits/wiki/ScikitsForDevelopers Unfortunately, that link is outdated. It says they all use a common svn repo, which is no longer true and is no longer recommended. It also doesn't say anything about PyPI or the scikits.appspot.com site.
Jarrod From millman at berkeley.edu Wed Jul 14 18:15:57 2010 From: millman at berkeley.edu (Jarrod Millman) Date: Wed, 14 Jul 2010 15:15:57 -0700 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: References: <4C3E33A3.6080000@vcn.com> Message-ID: On Wed, Jul 14, 2010 at 3:14 PM, Jarrod Millman wrote: > Unfortunately, that link is outdated. It says they all use a common > svn repo, which is no longer true and is no longer recommended. It > also doesn't say anything about PyPI or the scikits.appspot.com site. Actually, it does mention PyPI: > python setup.py register sdist bdist_egg upload -s Will register to PyPI, the python package index and upload a gpg signed (-s) source (sdist) and binary (bdist) distribution of your project. From josef.pktd at gmail.com Wed Jul 14 18:16:28 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Wed, 14 Jul 2010 18:16:28 -0400 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: References: <4C3E33A3.6080000@vcn.com> Message-ID: On Wed, Jul 14, 2010 at 6:06 PM, Skipper Seabold wrote: > [earlier thread quoted in full; trimmed] > I think most all of your questions are answered here. > > http://www.scipy.org/scipy/scikits/wiki/ScikitsForDevelopers It's a bit outdated, since now hosting with a decentralized revision control system is preferred. Also nothing about sphinx docs, neither in the template (at least when I checked and looked for a template last year). Basic hosting instructions or links to instructions with alternatives on github, launchpad, mercurial? Instructions for creating an account and uploading to pypi are also missing. It took me quite a bit of time last year to create the scikits infrastructure. So improvements to and updating of the scikits docs would be very useful. Josef > > Skipper > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From thomas.robitaille at gmail.com Wed Jul 14 20:40:22 2010 From: thomas.robitaille at gmail.com (Thomas Robitaille) Date: Wed, 14 Jul 2010 20:40:22 -0400 Subject: [SciPy-Dev] potential scikits package Message-ID: Hello, Recently I have developed a package to read IDL save files into Python, at http://idlsave.sourceforge.net/. The code is now very robust, and works for many different IDL save files of varying complexity (including structures). Since this is something that is likely to be most useful for scientists, I was wondering whether it is a good candidate to make into a scikits package. In the long run it is likely this package (if re-licensed with a BSD licence) might be considered for inclusion in scipy? I noticed functions in scipy to read in matlab files, so this is along the same lines. Thanks in advance for any advice, Thomas -------------- next part -------------- An HTML attachment was scrubbed...
URL: From fperez.net at gmail.com Wed Jul 14 21:15:25 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Wed, 14 Jul 2010 18:15:25 -0700 Subject: [SciPy-Dev] potential scikits package In-Reply-To: References: Message-ID: On Wed, Jul 14, 2010 at 5:40 PM, Thomas Robitaille wrote: > In the long run it is likely this package (if re-licensed with a BSD > licence) might be considered for inclusion in scipy? I noticed functions in > scipy to read in matlab files, so this is along the same lines. This looks like a great contribution, and I suspect you'd get even more feedback if you put in the email subject 'IDL file reader', as many people who care about IDL may accidentally skip this message :) Regards, f From josh.holbrook at gmail.com Wed Jul 14 21:42:32 2010 From: josh.holbrook at gmail.com (Joshua Holbrook) Date: Wed, 14 Jul 2010 17:42:32 -0800 Subject: [SciPy-Dev] potential scikits package In-Reply-To: References: Message-ID: On Wed, Jul 14, 2010 at 5:15 PM, Fernando Perez wrote: > On Wed, Jul 14, 2010 at 5:40 PM, Thomas Robitaille > wrote: >> In the long run it is likely this package (if re-licensed with a BSD >> licence) might be considered for inclusion in scipy? I noticed functions in >> scipy to read in matlab files, so this is along the same lines. > > This looks like a great contribution, and I suspect you'd get even > more feedback if you put in the email subject 'IDL file reader', as > many people who care about IDL may accidentally skip this message :) > > Regards, > > f > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > There's also discussion regarding astronomy-specific libraries. Astronomers use IDL right? They'd probably be really interested.
--Josh From matthew.brett at gmail.com Wed Jul 14 21:45:09 2010 From: matthew.brett at gmail.com (Matthew Brett) Date: Wed, 14 Jul 2010 18:45:09 -0700 Subject: [SciPy-Dev] potential scikits package In-Reply-To: References: Message-ID: Hi. > Recently I have developed a package to read IDL save files into Python, > at http://idlsave.sourceforge.net/. The code is now very robust, and works > for many different IDL save files of varying complexity (including > structures). Since this is something that is likely to be most useful for > scientists, I was wondering whether it is a good candidate to make into a > scikits package. In the long run it is likely this package (if re-licensed > with a BSD licence) might be considered for inclusion in scipy? I noticed > functions in scipy to read in matlab files, so this is along the same lines. I would have thought that would be very good to go in scipy.io directly. I'm happy to help ... See you, Matthew From thomas.robitaille at gmail.com Wed Jul 14 21:50:51 2010 From: thomas.robitaille at gmail.com (Thomas Robitaille) Date: Wed, 14 Jul 2010 21:50:51 -0400 Subject: [SciPy-Dev] potential scikits package In-Reply-To: References: Message-ID: >> wrote: >>> In the long run it is likely this package (if re-licensed with a BSD >>> licence) might be considered for inclusion in scipy? I noticed functions in >>> scipy to read in matlab files, so this is along the same lines. >> >> This looks like a great contribution, and I suspect you'd get even >> more feedback if you put in the email subject 'IDL file reader', as >> many people who care about IDL may accidentally skip this message :) >> >> Regards, >> >> f >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > > There's also discussion regarding astronomy-specific libraries. > Astronomers use IDL right? They'd probably be really interested.
I actually wrote this for astronomers originally, and advertised it on the astropy mailing list, but I think this could be useful for other fields too, so decided it might be a good idea to merge it into a general scientific library rather than an astronomy library. Cheers, Thomas > > --Josh > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From fperez.net at gmail.com Wed Jul 14 21:52:41 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Wed, 14 Jul 2010 18:52:41 -0700 Subject: [SciPy-Dev] potential scikits package In-Reply-To: References: Message-ID: On Wed, Jul 14, 2010 at 6:45 PM, Matthew Brett wrote: > I would have thought that would be very good to go in scipy.io > directly. Completely agreed, and it would round up nicely scipy.io. Cheers, f From josh.holbrook at gmail.com Wed Jul 14 22:01:41 2010 From: josh.holbrook at gmail.com (Joshua Holbrook) Date: Wed, 14 Jul 2010 18:01:41 -0800 Subject: [SciPy-Dev] potential scikits package In-Reply-To: References: Message-ID: On Wed, Jul 14, 2010 at 5:52 PM, Fernando Perez wrote: > On Wed, Jul 14, 2010 at 6:45 PM, Matthew Brett wrote: >> I would have thought that would be very good to go in scipy.io >> directly. > > Completely agreed, and it would round up nicely scipy.io. > > Cheers, > > f > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > Speaking of scipy.io: I noticed the other day that the .mat module doesn't read the new hdf5-based format. Has anyone opened them with a more standard hdf5 library? How was it? 
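The "new hdf5-based format" asked about here is MATLAB's v7.3 .mat, which is a plain HDF5 container, so a generic HDF5 library can indeed open it. A rough sketch using h5py follows; the file and variable names are made up for illustration, and real v7.3 files additionally carry MATLAB-specific attributes and store arrays transposed, which this sketch does not reproduce:

```python
import os
import tempfile

import h5py          # generic HDF5 reader, independent of scipy.io.matlab
import numpy as np

# Fake a tiny "v7.3-style" file: an HDF5 container holding one dataset.
path = os.path.join(tempfile.mkdtemp(), "demo.mat")
with h5py.File(path, "w") as f:
    f.create_dataset("A", data=np.arange(6.0).reshape(2, 3))

# Any HDF5 tool can now list and read back the variables.
with h5py.File(path, "r") as f:
    names = list(f.keys())
    A = f["A"][()]   # materialize the dataset as a NumPy array

print(names, A.shape)
```

The same open/list/read pattern applies to a genuine v7.3 file, modulo the transposition and MATLAB_class attributes mentioned above.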
--Josh From gokhansever at gmail.com Thu Jul 15 00:52:04 2010 From: gokhansever at gmail.com (Gökhan Sever) Date: Wed, 14 Jul 2010 23:52:04 -0500 Subject: [SciPy-Dev] potential scikits package In-Reply-To: References: Message-ID: On Wed, Jul 14, 2010 at 8:50 PM, Thomas Robitaille < thomas.robitaille at gmail.com> wrote: > I actually wrote this for astronomers originally, and advertised it on the > astropy mailing list, but I think this could be useful for other fields > too... > Indeed they do -- in atmospheric sciences. I have been using it to verify and compare IDL and Python produced identical cloud model results. Placing this package in scipy.io is a good idea to combine basic external data interfacing options in one main place along with other functions. -- Gökhan -------------- next part -------------- An HTML attachment was scrubbed... URL: From cgohlke at uci.edu Thu Jul 15 03:43:10 2010 From: cgohlke at uci.edu (Christoph Gohlke) Date: Thu, 15 Jul 2010 00:43:10 -0700 Subject: [SciPy-Dev] memory corruption when running scipy 0.8 test suite In-Reply-To: <4C3E0300.8050800@uci.edu> References: <4C3BBD59.2020300@uci.edu> <4C3CEE2C.8000307@uci.edu> <4C3D8290.9060906@uci.edu> <4C3D95AB.5070007@uci.edu> <4C3E0300.8050800@uci.edu> Message-ID: <4C3EBC0E.4030402@uci.edu> On 7/14/2010 11:33 AM, Christoph Gohlke wrote:
>
> Running unit tests for scipy
> NumPy version 1.4.1
> NumPy is installed in X:\python26\lib\site-packages\numpy
> SciPy version 0.8.0.dev6609
> SciPy is installed in X:\python26\lib\site-packages\scipy
> Python version 2.6.5 (r265:79096, Mar 19 2010, 21:48:26) [MSC v.1500 32 bit (Intel)]
> nose version 0.11.3
> [several dozen quoted 80-column lines of nose progress output trimmed: mostly '.', with 'K' (known failure) and 'S' (skip) markers, plus the 'FFF', 'E', and 'F' entries for the errors and failures reported earlier in the thread]
> ................................................................................ > ................................................................................ > .............................................................error > removing c:\u > sers\gohlke\appdata\local\temp\tmpaqkoypcat_test: > c:\users\gohlke\appdata\local\ > temp\tmpaqkoypcat_test: The directory is not empty > ................................................................................ > .................. > ====================================================================== > ERROR: test_twodiags (test_linsolve.TestLinsolve) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "X:\python26\lib\site-packages\scipy\sparse\linalg\dsolve\tests\test_linsolve.py", > line 39, in test_twodiags > assert( norm(b - Asp*x)< 10 * cond_A * eps ) > File "X:\python26\lib\site-packages\scipy\linalg\misc.py", line 9, in > norm > return np.linalg.norm(np.asarray_chkfinite(a), ord=ord) > File "X:\python26\lib\site-packages\numpy\lib\function_base.py", line > 586, inasarray_chkfinite > raise ValueError, "array must not contain infs or NaNs" > ValueError: array must not contain infs or NaNs > > ====================================================================== > FAIL: test_lorentz (test_odr.TestODR) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "X:\python26\lib\site-packages\scipy\odr\tests\test_odr.py", > line 292, in > test_lorentz > 3.7798193600109009e+00]), > File "X:\python26\lib\site-packages\numpy\testing\utils.py", line > 765, in assert_array_almost_equal > header='Arrays are not almost equal') > File "X:\python26\lib\site-packages\numpy\testing\utils.py", line > 609, in assert_array_compare > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal > > (mismatch 100.0%) > x: array([ 1.00000000e+03, 1.00000000e-01, 3.80000000e+00]) > y: 
array([ 1.43067808e+03, 1.33905090e-01, 3.77981936e+00]) > > ====================================================================== > FAIL: test_multi (test_odr.TestODR) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "X:\python26\lib\site-packages\scipy\odr\tests\test_odr.py", > line 188, in test_multi > 0.5101147161764654, 0.5173902330489161]), > File "X:\python26\lib\site-packages\numpy\testing\utils.py", line > 765, in assert_array_almost_equal > header='Arrays are not almost equal') > File "X:\python26\lib\site-packages\numpy\testing\utils.py", line > 609, in assert_array_compare > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal > > (mismatch 100.0%) > x: array([ 4. , 2. , 7. , 0.4, 0.5]) > y: array([ 4.37998803, 2.43330576, 8.00288459, 0.51011472, > 0.51739023]) > > ====================================================================== > FAIL: test_pearson (test_odr.TestODR) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "X:\python26\lib\site-packages\scipy\odr\tests\test_odr.py", > line 235, in test_pearson > np.array([ 5.4767400299231674, -0.4796082367610305]), > File "X:\python26\lib\site-packages\numpy\testing\utils.py", line > 765, in assert_array_almost_equal > header='Arrays are not almost equal') > File "X:\python26\lib\site-packages\numpy\testing\utils.py", line > 609, in assert_array_compare > raise AssertionError(msg) > AssertionError: > Arrays are not almost equal > > (mismatch 100.0%) > x: array([ 1., 1.]) > y: array([ 5.47674003, -0.47960824]) > > ====================================================================== > FAIL: test_linsolve.TestSplu.test_spilu_smoketest > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "X:\python26\lib\site-packages\nose\case.py", line 186, in runTest > self.test(*self.arg) > File > 
"X:\python26\lib\site-packages\scipy\sparse\linalg\dsolve\tests\test_linsolve.py", > line 63, in test_spilu_smoketest > assert abs(x - r).max()< 1e-2 > AssertionError > > ====================================================================== > FAIL: Check that QMR works with left and right preconditioners > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "X:\python26\lib\site-packages\scipy\sparse\linalg\isolve\tests\test_iterative.py", > line 178, in test_leftright_precond > assert_equal(info,0) > File "X:\python26\lib\site-packages\numpy\testing\utils.py", line > 309, in assert_equal > raise AssertionError(msg) > AssertionError: > Items are not equal: > ACTUAL: 1 > DESIRED: 0 > > ---------------------------------------------------------------------- > Ran 4399 tests in 36.788s > > FAILED (KNOWNFAIL=13, SKIP=35, errors=1, failures=5) > > > > On a clean 64-bit installation I also get the ndimage/dsyevr memory > corruption (?) 
failure: > > ====================================================================== > FAIL: extrema 3 > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "X:\python26-x64\lib\site-packages\scipy\ndimage\tests\test_ndimage.py", > line 3149, in test_extrema03 > self.failUnless(numpy.all(output1[2] == output4)) > AssertionError > > > > And on my 32-bit development installation I still get the airye failure, > but only when run in the context of scipy.test(): > > ====================================================================== > FAIL: Real-valued Bessel domains > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "X:\python26\lib\site-packages\scipy\special\tests\test_basic.py", line > 1691, in test_ticket_854 > assert not isnan(airye(-1)[2:4]).any(), airye(-1) > AssertionError: (nan, nan, nan, nan) > > I opened a ticket for this artifact at http://projects.scipy.org/scipy/ticket/1233. All scipy.sparse.linalg tests now pass after reworking the patch to link SuperLU with CBLAS instead of MKL. The patch is attached to http://projects.scipy.org/scipy/ticket/1225. -- Christoph From ralf.gommers at googlemail.com Thu Jul 15 11:27:45 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 15 Jul 2010 23:27:45 +0800 Subject: [SciPy-Dev] ANN: scipy 0.8.0 release candidate 3 Message-ID: I'm pleased to announce the availability of the third release candidate of SciPy 0.8.0. The only changes compared to rc2 are a fix for a regression in interpolate.Rbf and some fixes for failures on 64-bit Windows. If no more problems are reported, the final release will be available in one week. SciPy is a package of tools for science and engineering for Python. It includes modules for statistics, optimization, integration, linear algebra, Fourier transforms, signal and image processing, ODE solvers, and more. 
This release candidate comes one and a half years after the 0.7.0 release and contains many new features, numerous bug-fixes, improved test coverage, and better documentation. Please note that SciPy 0.8.0rc3 requires Python 2.4-2.6 and NumPy 1.4.1 or greater. For more information, please see the release notes: http://sourceforge.net/projects/scipy/files/scipy/0.8.0rc3/NOTES.txt/view You can download the release from here: https://sourceforge.net/projects/scipy/ Python 2.5/2.6 binaries for Windows and OS X are available, as well as source tarballs for other platforms and the documentation in PDF form. Thank you to everybody who contributed to this release. Enjoy, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Thu Jul 15 11:34:57 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 15 Jul 2010 23:34:57 +0800 Subject: [SciPy-Dev] better signature for stats.linregress Message-ID: The current linregress signature is (*args), which is not very informative, as pointed out in http://projects.scipy.org/scipy/ticket/1164. I propose to change it to (x, y=None), which doesn't break any code. Changes and tests are here: http://github.com/rgommers/scipy/commits/linregress-signature. Does this look OK? Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Thu Jul 15 11:53:50 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 15 Jul 2010 23:53:50 +0800 Subject: [SciPy-Dev] closing (invalid) tickets without follow-up Message-ID: Hi, For tickets that seem to be invalid and/or don't have enough info to be useful, how long do we wait before closing them? Running into the same ones all the time is quite annoying.
For example: http://projects.scipy.org/scipy/ticket/926 (14 months old, close to zero info) http://projects.scipy.org/scipy/ticket/1168 (likely invalid, no follow up) http://projects.scipy.org/scipy/ticket/1073 (same) http://projects.scipy.org/scipy/ticket/1188 (same) The ticket submitter should really get emails with comments, but I'm not sure how easy this is to do with Trac. Thanks, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Thu Jul 15 12:37:06 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Thu, 15 Jul 2010 12:37:06 -0400 Subject: [SciPy-Dev] better signature for stats.linregress In-Reply-To: References: Message-ID: On Thu, Jul 15, 2010 at 11:34 AM, Ralf Gommers wrote: > The current linregress signature is (*args) which is not very informative, > as pointed out in http://projects.scipy.org/scipy/ticket/1164. I propose to > change it to (x, y=None), which doesn't break any code. Changes and tests > are here: http://github.com/rgommers/scipy/commits/linregress-signature. > > Does this look OK? looks good to me 2381 + if x.shape[0] == 2: 2382 + y = x[1, :] 2383 + x = x[0, :] 2384 + elif x.shape[1] == 2: 2385 + y = x[:, 1] 2386 + x = x[:, 0] could use tuple unpacking instead, works with arrays x,y = x and x,y = x.T Thanks, Josef > > Ralf > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From pav at iki.fi Thu Jul 15 12:50:35 2010 From: pav at iki.fi (Pauli Virtanen) Date: Thu, 15 Jul 2010 16:50:35 +0000 (UTC) Subject: [SciPy-Dev] closing (invalid) tickets without follow-up References: Message-ID: Thu, 15 Jul 2010 23:53:50 +0800, Ralf Gommers wrote: > For tickets that seem to be invalid and/or don't have enough info to be > useful, how long do we wait before closing them? Running into the same > ones all the time is quite annoying. 
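[Josef's tuple-unpacking suggestion for the linregress patch above can be sketched like this; split_xy is a hypothetical helper for illustration, not the actual patch:]

```python
import numpy as np


def split_xy(arr):
    """Split a single 2xN or Nx2 array argument into x and y,
    using tuple unpacking instead of explicit row/column indexing."""
    arr = np.asarray(arr)
    if arr.ndim == 2 and arr.shape[0] == 2:
        x, y = arr        # iterating a 2xN array yields its two rows
    elif arr.ndim == 2 and arr.shape[1] == 2:
        x, y = arr.T      # transpose an Nx2 array, then unpack the rows
    else:
        raise ValueError("expected a 2xN or Nx2 array")
    return x, y
```

[A 2x2 input is ambiguous; like the quoted patch, the row-wise branch wins in that case.]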
For example: > http://projects.scipy.org/scipy/ticket/926 (14 months old, close to zero > info) > http://projects.scipy.org/scipy/ticket/1168 (likely invalid, no follow > up) http://projects.scipy.org/scipy/ticket/1073 (same) > http://projects.scipy.org/scipy/ticket/1188 (same) > > The ticket submitter should really get emails with comments, but I'm not > sure how easy this is to do with Trac. We can perhaps set them to the "needs_info" status, and decide that it means "No further developer attention worthwhile, until some feedback from the reporter (or someone else) received." Would this help? The "needs_info" state is already there. I think the ticket submitter (and all commenters) do already get emails on comments -- at least I receive them. -- Pauli Virtanen From nwagner at iam.uni-stuttgart.de Thu Jul 15 13:01:44 2010 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Thu, 15 Jul 2010 19:01:44 +0200 Subject: [SciPy-Dev] closing (invalid) tickets without follow-up In-Reply-To: References: Message-ID: On Thu, 15 Jul 2010 16:50:35 +0000 (UTC) Pauli Virtanen wrote: > Thu, 15 Jul 2010 23:53:50 +0800, Ralf Gommers wrote: >> For tickets that seem to be invalid and/or don't have >>enough info to be >> useful, how long do we wait before closing them? Running >>into the same >> ones all the time is quite annoying. For example: >> http://projects.scipy.org/scipy/ticket/926 (14 months >>old, close to zero >> info) >> http://projects.scipy.org/scipy/ticket/1168 (likely >>invalid, no follow >> up) http://projects.scipy.org/scipy/ticket/1073 (same) >> http://projects.scipy.org/scipy/ticket/1188 (same) >> >> The ticket submitter should really get emails with >>comments, but I'm not >> sure how easy this is to do with Trac. > > We can perhaps set them to the "needs_info" status, and >decide that it > means > > "No further developer attention worthwhile, until some >feedback > from the reporter (or someone else) received." > > Would this help? 
The "needs_info" state is already >there. > > I think the ticket submitter (and all commenters) do >already get emails > on comments -- at least I receive them. > > -- > Pauli Virtanen > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev Hi all, is there someone with available capacity to resolve http://projects.scipy.org/numpy/ticket/937 Thanks in advance. Nils From d.l.goldsmith at gmail.com Thu Jul 15 13:11:49 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Thu, 15 Jul 2010 10:11:49 -0700 Subject: [SciPy-Dev] Skypecon tomorrow: important agenda item! Message-ID: Hi, folks. Great news! Vincent Davis has been getting up to speed on pydocweb and has almost finished implementing the infrastructure to support the separate-technical-and-presentation-review process! It's still undergoing development iteration at the "alpha" stage and hopefully will be ready for pre-deployment beta testing soon (sorry if I'm overreaching, Vincent), but there's finally light at the end of the tunnel. Accordingly, it's time to start talking about a formal process for reviewer recruitment and QA/QC, so I'd like to make this the primary agenda item for our next Skypecon. Obviously, this is of high importance and in all likelihood interests many who have otherwise been ambivalent about this year's Skypecons, so I'd like to be accommodating: if you'd like to participate but absolutely, positively can't do so tomorrow at noon EDT, please let me know ASAP some times--plural--that would be good for you - I'll try to accommodate as many people as I can. Thanks! DG -------------- next part -------------- An HTML attachment was scrubbed... URL: From jjstickel at vcn.com Thu Jul 15 15:15:16 2010 From: jjstickel at vcn.com (Jonathan Stickel) Date: Thu, 15 Jul 2010 13:15:16 -0600 Subject: [SciPy-Dev] scikits contribution? 
In-Reply-To: References: Message-ID: <4C3F5E44.2000105@vcn.com> On 7/14/10 18:40 , scipy-dev-request at scipy.org wrote: > Date: Wed, 14 Jul 2010 15:15:57 -0700 > From: Jarrod Millman > Subject: Re: [SciPy-Dev] scikits contribution? > To: SciPy Developers List > Message-ID: > > Content-Type: text/plain; charset=ISO-8859-1 > > On Wed, Jul 14, 2010 at 3:14 PM, Jarrod Millman wrote: >> > Unfortunately, that link is out dated. ?It says the all use a common >> > svn repo, which is no longer true and is no longer recommended. ?It >> > also doesn't say anything about PyPI or the scikits.appspot.com site. > Actually, it does mention PyPI: > >> > python setup.py register sdist bdist_egg upload -s > Will register to PyPI, the python package index and upload a gpg > signed (-s) source (sdist) and binary (bdist) distribution of your > project. > Thanks for all the responses. I found github quite easy to use and with good site documentation. Here is the source repository: http://github.com/jjstickel/scikit-datasmooth/ When I attempted to register and upload to PyPI (after creating a gpg key), I got an error: $ python setup.py register sdist bdist_egg upload -s running register running egg_info running build_src ... 
running upload gpg --detach-sign -a dist/scikits.datasmooth-0.5.tar.gz You need a passphrase to unlock the secret key for user: "Jonathan Stickel " 2048-bit RSA key, ID 30D37C66, created 2010-07-15 Traceback (most recent call last): File "setup.py", line 53, in 'Topic :: Scientific/Engineering']) File "/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site-packages/numpy/distutils/core.py", line 186, in setup return old_setup(**new_attr) File "/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/distutils/core.py", line 152, in setup dist.run_commands() File "/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/distutils/dist.py", line 987, in run_commands self.run_command(cmd) File "/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/distutils/dist.py", line 1007, in run_command cmd_obj.run() File "/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/distutils/command/upload.py", line 57, in run self.upload_file(command, pyversion, filename) File "/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/distutils/command/upload.py", line 133, in upload_file value = value[1] IndexError: tuple index out of range It looks like the index was created OK, but no source or binary package was uploaded. Any help on this? Thanks, Jonathan From d.l.goldsmith at gmail.com Thu Jul 15 16:41:20 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Thu, 15 Jul 2010 13:41:20 -0700 Subject: [SciPy-Dev] Is scipy.org down? Message-ID: -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From matthieu.brucher at gmail.com Thu Jul 15 16:43:36 2010 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Thu, 15 Jul 2010 22:43:36 +0200 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: <4C3F5E44.2000105@vcn.com> References: <4C3F5E44.2000105@vcn.com> Message-ID: > It looks like the index was created OK, but no source or binary package > was uploaded. Any help on this? It couldn't be uploaded: "You need a passphrase to unlock the secret key for". You may have a passphrase with your ssh key :| Matthieu -- Information System Engineer, Ph.D. Blog: http://matt.eifelle.com LinkedIn: http://www.linkedin.com/in/matthieubrucher From d.l.goldsmith at gmail.com Thu Jul 15 16:57:10 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Thu, 15 Jul 2010 13:57:10 -0700 Subject: [SciPy-Dev] Is scipy.org down? In-Reply-To: References: Message-ID: Ok, back up now. DG -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben.root at ou.edu Thu Jul 15 16:31:37 2010 From: ben.root at ou.edu (Benjamin Root) Date: Thu, 15 Jul 2010 15:31:37 -0500 Subject: [SciPy-Dev] int division in signaltools.py Message-ID: Hi, I was peeking through signaltools.py today, and I noticed several spots where integer division was being intended, but not explicitly done with '//'. While most of the cases seem pretty benign, there were a couple of spots where unverified input was being used for the division. I am including a patch for all the spots I believe integer division was intended. This change does not appear to affect the nose tests, but I am still wary of whatever implications there are with this change. Also, as a side note, I noticed that the hilbert2() function has a double call to "x = asarray(x)". This is *not* covered by the attached patch, though. Ben Root -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: intdiv.patch Type: application/octet-stream Size: 2295 bytes Desc: not available URL: From pav at iki.fi Thu Jul 15 16:02:52 2010 From: pav at iki.fi (Pauli Virtanen) Date: Thu, 15 Jul 2010 20:02:52 +0000 (UTC) Subject: [SciPy-Dev] scikits contribution? References: <4C3F5E44.2000105@vcn.com> Message-ID: Thu, 15 Jul 2010 13:15:16 -0600, Jonathan Stickel wrote: [clip] > $ python setup.py register sdist bdist_egg upload -s [clip: distutils error] > "/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/ python2.6/distutils/command/upload.py", > line 133, in upload_file > > IndexError: tuple index out of range > > > It looks like the index was created OK, but no source or binary package > was uploaded. Any help on this? I think I remember seeing this too. This is some distutils bug, I think... My "fix" was just to edit the "distutils/command/upload.py" and replace

    value = value[1]

on line 133 by the obvious

    if len(value) > 1:
        value = value[1]
    else:
        value = ""

and hope for the best. Or maybe I ran "upload" on a separate line from the "sdist" and "bdist". This is maybe not how it should go. You can also upload the tarballs on the PyPI website. -- Pauli Virtanen From vincent at vincentdavis.net Thu Jul 15 17:14:29 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Thu, 15 Jul 2010 15:14:29 -0600 Subject: [SciPy-Dev] Is scipy.org down? In-Reply-To: References: Message-ID: All good for me Vincent On Thu, Jul 15, 2010 at 2:41 PM, David Goldsmith wrote: > > > -- > Mathematician: noun, someone who disavows certainty when their uncertainty > set is non-empty, even if that set has measure zero. > > Hope: noun, that delusive spirit which escaped Pandora's jar and, with her > lies, prevents mankind from committing a general suicide. (As interpreted
(As interpreted > by Robert Graves) > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From vincent at vincentdavis.net Thu Jul 15 17:15:23 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Thu, 15 Jul 2010 15:15:23 -0600 Subject: [SciPy-Dev] [SciPy-User] ANN: scipy 0.8.0 release candidate 3 In-Reply-To: References: Message-ID: Looks good for me now running py2.7 on oxs compiled 64bit Ran 4407 tests in 119.221s OK (KNOWNFAIL=19, SKIP=35) Vincent On Thu, Jul 15, 2010 at 9:27 AM, Ralf Gommers wrote: > I'm pleased to announce the availability of the third release candidate of > SciPy 0.8.0. The only changes compared to rc2 are a fix for a regression in > interpolate.Rbf and some fixes for failures on 64-bit Windows. If no more > problems are reported, the final release will be available in one week. > > SciPy is a package of tools for science and engineering for Python. It > includes modules for statistics, optimization, integration, linear algebra, > Fourier transforms, signal and image processing, ODE solvers, and more. > > This release candidate release comes one and a half year after the 0.7.0 > release and contains many new features, numerous bug-fixes, improved test > coverage, and better documentation. ?Please note that SciPy 0.8.0rc3 > requires Python 2.4-2.6 and NumPy 1.4.1 or greater. > > For more information, please see the release notes: > http://sourceforge.net/projects/scipy/files/scipy/0.8.0rc3/NOTES.txt/view > > You can download the release from here: > https://sourceforge.net/projects/scipy/ > Python 2.5/2.6 binaries for Windows and OS X are available, as well as > source tarballs for other platforms and the documentation in pdf form. > > Thank you to everybody who contributed to this release. 
> > Enjoy, > Ralf > _______________________________________________ > SciPy-User mailing list > SciPy-User at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-user > > From jjstickel at vcn.com Thu Jul 15 17:47:50 2010 From: jjstickel at vcn.com (Jonathan Stickel) Date: Thu, 15 Jul 2010 15:47:50 -0600 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: References: Message-ID: <4C3F8206.9090408@vcn.com> On 7/15/10 15:14 , scipy-dev-request at scipy.org wrote: > Date: Thu, 15 Jul 2010 20:02:52 +0000 (UTC) > From: Pauli Virtanen > Subject: Re: [SciPy-Dev] scikits contribution? > To: scipy-dev at scipy.org > Jonathan Stickel wrote: [clip] >> > $ python setup.py register sdist bdist_egg upload -s > [clip: distutils error] >> > "/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/ > python2.6/distutils/command/upload.py", >> > line 133, in upload_file >> > >> > IndexError: tuple index out of range >> > >> > >> > It looks like the index was created OK, but no source or binary package >> > was uploaded. Any help on this? > I think I remember seeing this too. This is some distutils bug, I think... > > My "fix" was just to edit the "distutils/command/upload.py" and replace > > value = value[1] > > on line 133 by the obvious > > if len(value)> 1: > value = value[1] > else: > value = "" > > and hope for the best. Or maybe I ran "upload" on a separate line from > the "sdist" and "bdist". > > This is maybe not how it should go. > > You can also upload the tarballs on the PyPi website. Ah, this seems to be it. I uploaded the egg using the website. OK, now that I have the sources on git and an egg on pypi, would some of you like to try out the scikit and let me know what I did wrong? 
Thanks, Jonathan From d.l.goldsmith at gmail.com Thu Jul 15 17:52:17 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Thu, 15 Jul 2010 14:52:17 -0700 Subject: [SciPy-Dev] New Wiki page Message-ID: http://docs.scipy.org/numpy/Reviewer Standards and Recruitment Discussion/ DG -------------- next part -------------- An HTML attachment was scrubbed... URL: From dwf at cs.toronto.edu Thu Jul 15 23:01:12 2010 From: dwf at cs.toronto.edu (David Warde-Farley) Date: Thu, 15 Jul 2010 23:01:12 -0400 Subject: [SciPy-Dev] potential scikits package In-Reply-To: References: Message-ID: <20100716030112.GA26827@ravage> On Wed, Jul 14, 2010 at 06:45:09PM -0700, Matthew Brett wrote: > Hi. > > > Recently I have developed a package to read IDL save files into Python, > > at?http://idlsave.sourceforge.net/. The code is now very robust, and works > > for many different IDL save files of varying complexity (including > > structures). Since this is something that is likely to be most useful for > > scientists, I was wondering whether it is a good candidate to make into a > > scikits package. In the long run it is likely this package?(if re-licensed > > with a BSD licence)?might be considered for inclusion in scipy? I noticed > > functions in scipy to read in matlab files, so this is along the same lines. > > I would have thought that would be very good to go in scipy.io > directly. I'm happy to help ... Agreed, though I think the purpose of starting it as a scikit would be to make sure everything (including documentation, hint hint ;) is up to snuff, satisfactory, etc. 
David From robert.kern at gmail.com Fri Jul 16 00:29:25 2010 From: robert.kern at gmail.com (Robert Kern) Date: Thu, 15 Jul 2010 23:29:25 -0500 Subject: [SciPy-Dev] potential scikits package In-Reply-To: <20100716030112.GA26827@ravage> References: <20100716030112.GA26827@ravage> Message-ID: On Thu, Jul 15, 2010 at 22:01, David Warde-Farley wrote: > On Wed, Jul 14, 2010 at 06:45:09PM -0700, Matthew Brett wrote: >> Hi. >> >> > Recently I have developed a package to read IDL save files into Python, >> > at?http://idlsave.sourceforge.net/. The code is now very robust, and works >> > for many different IDL save files of varying complexity (including >> > structures). Since this is something that is likely to be most useful for >> > scientists, I was wondering whether it is a good candidate to make into a >> > scikits package. In the long run it is likely this package?(if re-licensed >> > with a BSD licence)?might be considered for inclusion in scipy? I noticed >> > functions in scipy to read in matlab files, so this is along the same lines. >> >> I would have thought that would be very good to go in scipy.io >> directly. ? I'm happy to help ... > > Agreed, though I think the purpose of starting it as a scikit would be to > make sure everything (including documentation, hint hint ;) is up to snuff, > satisfactory, etc. It already exists as a package in its own right. There is no point in going through the rigamarole of turning it into a scikit if it is ultimately targeted for inclusion in scipy. Just do whatever needs to be done in its current form before migrating it into scipy.io. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." ? 
-- Umberto Eco From josef.pktd at gmail.com Fri Jul 16 01:02:18 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Fri, 16 Jul 2010 01:02:18 -0400 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: <4C3F8206.9090408@vcn.com> References: <4C3F8206.9090408@vcn.com> Message-ID: On Thu, Jul 15, 2010 at 5:47 PM, Jonathan Stickel wrote: > On 7/15/10 15:14 , scipy-dev-request at scipy.org wrote: >> Date: Thu, 15 Jul 2010 20:02:52 +0000 (UTC) > > From: Pauli Virtanen > > Subject: Re: [SciPy-Dev] scikits contribution? > > To: scipy-dev at scipy.org >> Jonathan Stickel wrote: [clip] >>> > $ python setup.py register sdist bdist_egg upload -s >> [clip: distutils error] >>> > "/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/ >> python2.6/distutils/command/upload.py", >>> > line 133, in upload_file >>> > >>> > IndexError: tuple index out of range >>> > >>> > >>> > It looks like the index was created OK, but no source or binary package >>> > was uploaded. Any help on this? >> I think I remember seeing this too. This is some distutils bug, I think... >> >> My "fix" was just to edit the "distutils/command/upload.py" and replace >> >> value = value[1] >> >> on line 133 by the obvious >> >> if len(value) > 1: >> value = value[1] >> else: >> value = "" >> >> and hope for the best. Or maybe I ran "upload" on a separate line from >> the "sdist" and "bdist". >> >> This is maybe not how it should go. >> >> You can also upload the tarballs on the PyPi website. > > Ah, this seems to be it. I uploaded the egg using the website. It would be better to upload also the sdist, Since scikits.datasmooth is pure python, an egg doesn't really have an advantage and is specific to a python version, py2.6 only. (I'm on py25) Why do you have the code duplication in two modules and the conditional import in __init__.py ? 
try: from regularsmooth import * except ImportError as error: What is the import error that might occur with regularsmooth ? Also an example to quickly try out the scikit would be very useful. (And tests and docs are not included yet.) Thanks, Josef > > OK, now that I have the sources on git and an egg on pypi, would some of > you like to try out the scikit and let me know what I did wrong? > > Thanks, > Jonathan > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From jjstickel at vcn.com Fri Jul 16 13:19:16 2010 From: jjstickel at vcn.com (Jonathan Stickel) Date: Fri, 16 Jul 2010 11:19:16 -0600 Subject: [SciPy-Dev] scikits contribution? In-Reply-To: References: Message-ID: <4C409494.20206@vcn.com> On 7/16/10 11:00 , scipy-dev-request at scipy.org wrote: > Date: Fri, 16 Jul 2010 01:02:18 -0400 From: josef.pktd at gmail.com > Subject: Re: [SciPy-Dev] scikits contribution? To: SciPy Developers List > Message-ID: > > Content-Type: text/plain; charset=ISO-8859-1 On Thu, Jul 15, 2010 at > 5:47 PM, Jonathan Stickel wrote: >> > On 7/15/10 15:14 , scipy-dev-request at scipy.org wrote: >>> >> Date: Thu, 15 Jul 2010 20:02:52 +0000 (UTC) >> > > From: Pauli Virtanen >> > > Subject: Re: [SciPy-Dev] scikits contribution? >> > > To: scipy-dev at scipy.org >>> >> Jonathan Stickel wrote: [clip] >>>>> >>> > $ python setup.py register sdist bdist_egg upload -s >>> >> [clip: distutils error] >>>>> >>> > "/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/ >>> >> python2.6/distutils/command/upload.py", >>>>> >>> > line 133, in upload_file >>>>> >>> > >>>>> >>> > IndexError: tuple index out of range >>>>> >>> > >>>>> >>> > >>>>> >>> > It looks like the index was created OK, but no source or binary package >>>>> >>> > was uploaded. Any help on this? >>> >> I think I remember seeing this too. This is some distutils bug, I think... 
>>> >> >>> >> My "fix" was just to edit the "distutils/command/upload.py" and replace >>> >> >>> >> value = value[1] >>> >> >>> >> on line 133 by the obvious >>> >> >>> >> if len(value) > 1: >>> >> value = value[1] >>> >> else: >>> >> value = "" >>> >> >>> >> and hope for the best. Or maybe I ran "upload" on a separate line from >>> >> the "sdist" and "bdist". >>> >> >>> >> This is maybe not how it should go. >>> >> >>> >> You can also upload the tarballs on the PyPi website. >> > >> > Ah, this seems to be it. I uploaded the egg using the website. > It would be better to upload also the sdist, > Since scikits.datasmooth is pure python, an egg doesn't really have an > advantage and is specific to a python version, py2.6 only. > (I'm on py25) > > Why do you have the code duplication in two modules and the > conditional import in __init__.py ? > try: > from regularsmooth import * > except ImportError as error: > > What is the import error that might occur with regularsmooth ? > > Also an example to quickly try out the scikit would be very useful. > (And tests and docs are not included yet.) > Thanks for the feedback. I was confused what was the "sdist", but I think I figured it out and uploaded it. I will get to examples, tests, and docs later when I have time (this will require more learning for me). The functions are well documented and the primary use function "smooth_data" shows an example. There are two primary implementations: "smooth_data" that includes cross-validation, and "smooth_data_constr" which will take constraints but does not include cross-validation. The latter requires the module "cvxopt". If this is not available, I wanted to allow a user to still use the unconstrained smoother. Maybe there is a better way to do this than my "try/except" hack. Suggestions are welcome. 
Jonathan From jjstickel at vcn.com Fri Jul 16 14:09:56 2010 From: jjstickel at vcn.com (Jonathan Stickel) Date: Fri, 16 Jul 2010 12:09:56 -0600 Subject: [SciPy-Dev] scikits datasmooth (was: scikits contribution?) In-Reply-To: References: <4C409494.20206@vcn.com> Message-ID: <4C40A074.60404@vcn.com> On 7/16/10 11:55 , josef.pktd at gmail.com wrote: >>> It would be better to upload also the sdist, >>> >> Since scikits.datasmooth is pure python, an egg doesn't really have an >>> >> advantage and is specific to a python version, py2.6 only. >>> >> (I'm on py25) >>> >> >>> >> Why do you have the code duplication in two modules and the >>> >> conditional import in __init__.py ? >>> >> try: >>> >> from regularsmooth import * >>> >> except ImportError as error: >>> >> >>> >> What is the import error that might occur with regularsmooth ? >>> >> >>> >> Also an example to quickly try out the scikit would be very useful. >>> >> (And tests and docs are not included yet.) >>> >> >> > >> > Thanks for the feedback. I was confused what was the "sdist", but I think I >> > figured it out and uploaded it. >> > >> > I will get to examples, tests, and docs later when I have time (this will >> > require more learning for me). The functions are well documented and the >> > primary use function "smooth_data" shows an example. >> > >> > There are two primary implementations: "smooth_data" that includes >> > cross-validation, and "smooth_data_constr" which will take constraints but >> > does not include cross-validation. The later requires the module "cvxopt". >> > If this is not available, I wanted to allow a user to still use the >> > unconstrained smoother. Maybe there is a better way to do this than my >> > "try/except" hack. Suggestions are welcome. > Try/except are fine for this, but it wasn't very informative. You > could also try/except import cvxopt, or add a note that the full > version requires cvxopt. Maybe it's noted somewhere in your code, I > haven't looked that carefully. 
Kind of noted in the README. I am thinking of putting the try/except in the script itself rather than in __init__.py, but I will get to it later. > > Is it possible to replace cvxopt with scipy optimizers? But this is > not necessary for a scikit. > I needed a quadratic programming (qp) implementation, and I couldn't find any qp in scipy. I found the one in cvxopt to be quite good. Jonathan From cgohlke at uci.edu Fri Jul 16 15:39:58 2010 From: cgohlke at uci.edu (Christoph Gohlke) Date: Fri, 16 Jul 2010 12:39:58 -0700 Subject: [SciPy-Dev] Two potential 64 bit issues Message-ID: <4C40B58E.1090409@uci.edu> Hello, I came across two potential 64 bit issues in scipy 0.8 while trying to track down an elusive scipy.test failure on win-amd64 (http://mail.scipy.org/pipermail/scipy-dev/2010-July/015237.html). Unfortunately none of these two issues turns out to be related to any test failure so I report them in a new thread. The first issue is described in http://projects.scipy.org/scipy/ticket/1103. In scipy/integrate/__quadpack.h a npy_intp variable is parsed as int in the PyArg_ParseTuple function. Shouldn't it be parsed as Py_ssize_t? Unfortunately that would work only on Python 2.5+. Are there any plans to drop support for Python 2.4? Second, in scipy/sparse/linalg/dsolve/_superlu_utils.c a memory pointer is first cast to a long and then stored as PyInt, which value is later passed to free(). At least the cast to long will cause trouble on win-amd64 where long is 32 bit. I opened a ticket at http://projects.scipy.org/scipy/ticket/1236. Thank you. 
-- Christoph From derek at astro.physik.uni-goettingen.de Fri Jul 16 16:50:31 2010 From: derek at astro.physik.uni-goettingen.de (Derek Homeier) Date: Fri, 16 Jul 2010 22:50:31 +0200 Subject: [SciPy-Dev] [SciPy-User] ANN: scipy 0.8.0 release candidate 3 In-Reply-To: References: Message-ID: <4498D491-50AB-4578-BDAF-D91477AE91DA@astro.physik.uni-goettingen.de> On 15.07.2010, at 11:15PM, Vincent Davis wrote: > Looks good for me now running py2.7 on oxs compiled 64bit > > Ran 4407 tests in 119.221s > OK (KNOWNFAIL=19, SKIP=35) > > Notably, I never got trouble-free builds with 2.7 - both the numpy and scipy test suites fail with a bus error - I already tried compiling with gcc 4.2 instead of 4.0, but to no avail. That said, numpy and scipy, as well as matplotlib and a number of other packages do run in everyday operations without any failures so far. With python2.5/2.6 I've found only two problems on MacOS X 10.5/PPC (already present in rc1 and rc2, sorry I did not get around to report them earlier): I. ====================================================================== ERROR: test_wavfile.test_read_1 ---------------------------------------------------------------------- Traceback (most recent call last): File "/sw/lib/python2.6/site-packages/nose/case.py", line 186, in runTest self.test(*self.arg) File "/sw/lib/python2.6/site-packages/scipy/io/tests/test_wavfile.py", line 14, in test_read_1 rate, data = wavfile.read(datafile('test-44100-le-1ch-4bytes.wav')) File "/sw/lib/python2.6/site-packages/scipy/io/wavfile.py", line 124, in read return rate, data UnboundLocalError: local variable 'data' referenced before assignment I noted there already is a fix for big-endian architectures in http://projects.scipy.org/scipy/changeset/5575 though I don't quite understand how it is supposed to work since _big_endian = False is still hardcoded. 
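[Editorial note on the endianness point above: `struct` format strings without a prefix use the host's native byte order, while `'<'` and `'>'` force little- and big-endian "standard size" (exactly 4 bytes for `I`). RIFF/WAV chunk sizes are little-endian on disk, so an explicit `'<I'` reads correctly on both PPC and x86. A minimal illustration:]

```python
import struct

# The same four bytes interpreted under explicit byte orders:
data = b'\x2c\x00\x00\x00'  # the value 44 stored little-endian
assert struct.unpack('<I', data)[0] == 44           # little-endian read
assert struct.unpack('>I', data)[0] == 0x2c000000   # big-endian read
# A bare 'I' would use the platform's native order and size/alignment,
# which is exactly the kind of host dependence seen in wavfile.py.
```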
test-8000-le-2ch-1byteu.wav loads without errors anyway, but even when setting _big_endian = True manually, both files fail with a /sw/lib/python2.6/site-packages/scipy/io/wavfile.py in read(file) 119 else: 120 warnings.warn("chunk not understood", WavFileWarning) --> 121 size = struct.unpack('I',fid.read(4))[0] 122 bytes = fid.read(size) 123 fid.close() error: unpack requires a string argument of length 4 - sorry I can't be of more help with this. II. ====================================================================== FAIL: test_data.test_boost(,) ---------------------------------------------------------------------- Traceback (most recent call last): File "/sw/lib/python2.6/site-packages/nose/case.py", line 186, in runTest self.test(*self.arg) File "/sw/lib/python2.6/site-packages/scipy/special/tests/test_data.py", line 205, in _test_factory test.check(dtype=dtype) File "/sw/lib/python2.6/site-packages/scipy/special/tests/testutils.py", line 223, in check assert False, "\n".join(msg) AssertionError: Max |adiff|: 5.56451e-80 Max |rdiff|: 6.01399e-12 Bad results for the following points (in output 0): -55.25 => 1.281342652143248e-73 != 1.2813426521356121e-73 (rdiff 5.9592794108831318e-12) -55.0625 => 9.8673260749254454e-73 != 9.8673260748719804e-73 (rdiff 5.4183854671083142e-12) -55.015625 => 4.736069454765531e-72 != 4.7360694547400013e-72 (rdiff 5.3904770710359471e-12) [remaining results skipped, all with 4.6e-12 < rdiff < 6.1e-12] I'd say just another case of too strict accuracy requirements. 
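[For context on the rdiff figures quoted above: they are relative errors compared against the test's `rtol`; a minimal sketch of that comparison, using one of the reported data points:]

```python
def rdiff(computed, expected):
    """Relative difference, as used in the tolerance checks."""
    return abs(computed - expected) / abs(expected)

# The first reported point: rdiff is about 5.96e-12, so it fails a
# strict rtol of 5e-12 but would pass a loosened rtol of 1e-11.
r = rdiff(1.281342652143248e-73, 1.2813426521356121e-73)
assert r > 5e-12
assert r < 1e-11
```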
scipy.test('full') in addition produces the following failure on PPC: ====================================================================== FAIL: test_iv_cephes_vs_amos_mass_test (test_basic.TestBessel) ---------------------------------------------------------------------- Traceback (most recent call last): File "/sw/lib/python2.6/site-packages/scipy/special/tests/test_basic.py", line 1621, in test_iv_cephes_vs_amos_mass_test assert dc[k] < 1e-9, (v[k], x[k], iv(v[k], x[k]), iv(v[k], x[k]+0j)) AssertionError: (-1.685450377317264, 1.2676128629915242, 4.748141012443594e-05, (4.7481418802688318e-05+0j)) Both standard and full test pass on i386 and x86_64, but I noticed the following strange behaviour: when running the test suite twice in a row, the second run produces these failures: ====================================================================== FAIL: test_mio.test_mat4_3d(, , , {'a': array([[[ 0, 1, 2, 3], ---------------------------------------------------------------------- Traceback (most recent call last): File "/sw/lib/python2.6/site-packages/nose/case.py", line 186, in runTest self.test(*self.arg) File "/sw/lib/python2.6/site-packages/numpy/testing/utils.py", line 973, in assert_raises return nose.tools.assert_raises(*args,**kwargs) AssertionError: DeprecationWarning not raised ====================================================================== FAIL: test_00_deprecation_warning (test_basic.TestSolveHBanded) ---------------------------------------------------------------------- Traceback (most recent call last): File "/sw/lib/python2.6/site-packages/scipy/linalg/tests/test_basic.py", line 261, in test_00_deprecation_warning assert_raises(DeprecationWarning, solveh_banded, ab, b) File "/sw/lib/python2.6/site-packages/numpy/testing/utils.py", line 973, in assert_raises return nose.tools.assert_raises(*args,**kwargs) AssertionError: DeprecationWarning not raised ====================================================================== FAIL: Regression test 
for #651: better handling of badly conditioned ---------------------------------------------------------------------- Traceback (most recent call last): File "/sw/lib/python2.6/site-packages/scipy/signal/tests/test_filter_design.py", line 34, in test_bad_filter raise AssertionError("tf2zpk did not warn about bad "\ AssertionError: tf2zpk did not warn about bad coefficients Cheers, Derek From josh.holbrook at gmail.com Fri Jul 16 20:42:22 2010 From: josh.holbrook at gmail.com (Joshua Holbrook) Date: Fri, 16 Jul 2010 16:42:22 -0800 Subject: [SciPy-Dev] Typo in license file Message-ID: Hey guys, Not a big deal but under clause (c): c. Neither the name of the Enthought nor the names of its contributors should probably be: c. Neither the name of Enthought nor the names of its contributors That is, there's an extra "the." Like I said, not a big deal, but I did notice. --Josh From charlesr.harris at gmail.com Fri Jul 16 22:37:49 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Fri, 16 Jul 2010 20:37:49 -0600 Subject: [SciPy-Dev] Required Python Version Message-ID: Hi All, Is there a good reason we shouldn't raise the required python version up to 2.5? It was released on September 19th 2006 and I think enough time has gone by to consider moving the version number up. Doing so will give us access to the "n" format letter for PyArg_ParseTuple*, allow use of the with statement, and provide the new absolute/relative imports with the "from __future__ import absolute_import" statement. Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at googlemail.com Fri Jul 16 23:53:07 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 17 Jul 2010 11:53:07 +0800 Subject: [SciPy-Dev] closing (invalid) tickets without follow-up In-Reply-To: References: Message-ID: On Fri, Jul 16, 2010 at 12:50 AM, Pauli Virtanen wrote: > Thu, 15 Jul 2010 23:53:50 +0800, Ralf Gommers wrote: > > For tickets that seem to be invalid and/or don't have enough info to be > > useful, how long do we wait before closing them? Running into the same > > ones all the time is quite annoying. For example: > > http://projects.scipy.org/scipy/ticket/926 (14 months old, close to zero > > info) > > http://projects.scipy.org/scipy/ticket/1168 (likely invalid, no follow > > up) http://projects.scipy.org/scipy/ticket/1073 (same) > > http://projects.scipy.org/scipy/ticket/1188 (same) > > > > The ticket submitter should really get emails with comments, but I'm not > > sure how easy this is to do with Trac. > > We can perhaps set them to the "needs_info" status, and decide that it > means > > "No further developer attention worthwhile, until some feedback > from the reporter (or someone else) received." > > Would this help? The "needs_info" state is already there. > > That helps to some extent. If that doesn't get us any responses I think those tickets should still be closed after say 6 months. > I think the ticket submitter (and all commenters) do already get emails > on comments -- at least I receive them. > I had the impression it wasn't always working, but I could be wrong. Trac has definitely been working better lately. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at googlemail.com Sat Jul 17 00:30:45 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 17 Jul 2010 12:30:45 +0800 Subject: [SciPy-Dev] better signature for stats.linregress In-Reply-To: References: Message-ID: On Fri, Jul 16, 2010 at 12:37 AM, wrote: > On Thu, Jul 15, 2010 at 11:34 AM, Ralf Gommers > wrote: > > The current linregress signature is (*args) which is not very > informative, > > as pointed out in http://projects.scipy.org/scipy/ticket/1164. I propose > to > > change it to (x, y=None), which doesn't break any code. Changes and tests > > are here: http://github.com/rgommers/scipy/commits/linregress-signature. > > > > Does this look OK? > > looks good to me > > > > 2381 + if x.shape[0] == 2: > 2382 + y = x[1, :] > 2383 + x = x[0, :] > 2384 + elif x.shape[1] == 2: > 2385 + y = x[:, 1] > 2386 + x = x[:, 0] > > could use tuple unpacking instead, works with arrays > x,y = x > and > x,y = x.T > > That's a bit cleaner indeed. Committed in r6612. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Sat Jul 17 00:52:26 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 17 Jul 2010 12:52:26 +0800 Subject: [SciPy-Dev] Typo in license file In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 8:42 AM, Joshua Holbrook wrote: > Hey guys, > > Not a big deal but under clause (c): > > c. Neither the name of the Enthought nor the names of its contributors > > should probably be: > > c. Neither the name of Enthought nor the names of its contributors > > That is, there's an extra "the." Like I said, not a big deal, but I did > notice. > > I think there's also a bit missing, it should be something like: "The names of Enthought, the SciPy Developers or any contributors may not be used ..." Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at googlemail.com Sat Jul 17 04:42:47 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 17 Jul 2010 16:42:47 +0800 Subject: [SciPy-Dev] replacing integrate.cumtrapz with numpy.trapz Message-ID: integrate.cumtrapz and numpy.trapz do exactly the same thing and the code is very similar but not identical, as pointed out in http://projects.scipy.org/scipy/ticket/720. Assuming numpy.trapz is not going anywhere, can we replace the scipy version with it? For a small bonus, the numpy version is about 10% faster (tested for several array shapes): >>> a = np.arange(1e4).reshape(500, 20) >>> %timeit np.trapz(a, axis=1) 10000 loops, best of 3: 182 us per loop >>> %timeit sp.integrate.cumtrapz(a, axis=1) 1000 loops, best of 3: 209 us per loop Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Sat Jul 17 05:38:17 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 17 Jul 2010 17:38:17 +0800 Subject: [SciPy-Dev] signal.freqs/freqz plot keyword Message-ID: freqs/freqz have a keyword plot=None, which if True does this: if not plot is None: plot(w, h) while plot is not even defined. This keyword should be removed or the implementation should be something like: if plot: try: import matplotlib.pyplot as plt plt.plot(w, h) plt.show() except ImportError: warnings.warn("`plot` is True, but can't import matplotlib.pyplot.") Removal makes more sense I think, since scipy does not depend on matplotlib. And why provide a plot keyword in these functions and not in many others where it would also make sense? It's ticket http://projects.scipy.org/scipy/ticket/896 Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From cgohlke at uci.edu Sat Jul 17 05:48:24 2010 From: cgohlke at uci.edu (Christoph Gohlke) Date: Sat, 17 Jul 2010 02:48:24 -0700 Subject: [SciPy-Dev] memory corruption when running scipy 0.8 test suite In-Reply-To: <4C3BBD59.2020300@uci.edu> References: <4C3BBD59.2020300@uci.edu> Message-ID: <4C417C68.9080809@uci.edu> On 7/12/2010 6:11 PM, Christoph Gohlke wrote: > Dear SciPy developers, > > I am trying to fix the errors and failures reported in the thread > "[SciPy-User] many test failures on windows 64" > . Most > of the issues seem specific for the msvc9/MKL build of scipy 0.8. I am > now down to 1 error and 6 failures (from 16 errors and 9 failures). > > > The following failure in ndimage does not appear when ndimage.test() is > run out of context of scipy.test() > > FAIL: extrema 3 > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "C:\Python26\lib\site-packages\scipy\ndimage\tests\test_ndimage.py", > line 3149, in test_extrema03 > self.failUnless(numpy.all(output1[2] == output4)) > AssertionError > > > The output1[2] array contains a NaN in the first position. If I disable > the following dsyevr related tests in > scipy.lib.lapack.tests.test_esv.py, all ndimage tests pass. Could this > be a memory corruption issue in MKL? Besides the ndimage failure, > scipy.lib.lapack seems to work and passes all tests. Also, this artifact > only happens or surfaces on the 64-bit build. 
> I could reduce this failure to the following: In [1]: import numpy In [2]: from scipy.lib.lapack.flapack import dsyevr In [3]: numpy.mod(1., 1) Out[3]: 0.0 In [4]: dsyevr(numpy.array([[1,2,3],[2,2,3],[3,3,6]])) Out[4]: (array([-0.66992434, 0.48769389, 9.18223045]), array([[ 0.87881028, -0.25679224, 0.40218185], [-0.432995 , -0.78333614, 0.44598186], [-0.20051889, 0.56607617, 0.79959361]]), 0) In [5]: numpy.mod(1., 1) Out[5]: nan # <--- wrong In [6]: numpy.mod(1., 1) Out[6]: 0.0 The first call to numpy.mod() after calling dsyevr() returns a wrong result (NaN). The output of dsyevr is correct. Numpy.mod returns the correct result when using a slightly different input matrix for dsyevr, e.g. [[1,2,3],[2,1,3],[3,3,6]]. -- Christoph From cournape at gmail.com Sat Jul 17 08:06:42 2010 From: cournape at gmail.com (David Cournapeau) Date: Sat, 17 Jul 2010 14:06:42 +0200 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 4:37 AM, Charles R Harris wrote: > Hi All, > > Is there a good reason we shouldn't raise the required python version up to > 2.5? Yes, a lot of "enterprise ready" distributions still use python 2.4 (RHEL, Centos). David From ralf.gommers at googlemail.com Sat Jul 17 08:25:28 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 17 Jul 2010 20:25:28 +0800 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 8:06 PM, David Cournapeau wrote: > On Sat, Jul 17, 2010 at 4:37 AM, Charles R Harris > wrote: > > Hi All, > > > > Is there a good reason we shouldn't raise the required python version up > to > > 2.5? > > Yes, a lot of "enterprise ready" distributions still use python 2.4 > (RHEL, Centos). > > That's not a convincing argument for an infinite amount of time. People who value "enterprise ready" meaning they run ancient stuff should be perfectly fine with numpy 1.4 + scipy 0.8 for a long time from now. 
And otherwise they can upgrade python itself quite easily. 2.4 doesn't even get security updates anymore. We now support python 2.4-2.6, for the next versions we'll add 2.7, 3.1 and 3.2. Supporting more versions has a cost. And it's clear that the amount of people running 2.4 from svn is at or close to zero, because recent syntax errors for 2.4 have gone unnoticed for a long period. I think it's about time to drop 2.4. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Sat Jul 17 08:57:18 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 17 Jul 2010 06:57:18 -0600 Subject: [SciPy-Dev] replacing integrate.cumtrapz with numpy.trapz In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 2:42 AM, Ralf Gommers wrote: > integrate.cumtrapz and numpy.trapz do exactly the same thing and the code > is very similar but not identical, as pointed out in > http://projects.scipy.org/scipy/ticket/720. Assuming numpy.trapz is not > going anywhere, can we replace the scipy version with it? > > For a small bonus, the numpy version is about 10% faster (tested for > several array shapes): > >>> a = np.arange(1e4).reshape(500, 20) > >>> %timeit np.trapz(a, axis=1) > 10000 loops, best of 3: 182 us per loop > >>> %timeit sp.integrate.cumtrapz(a, axis=1) > 1000 loops, best of 3: 209 us per loop > > Replacement seems reasonable to me. We should try to prune duplicate functionality with numpy taking precedence. Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From charlesr.harris at gmail.com Sat Jul 17 09:10:40 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 17 Jul 2010 07:10:40 -0600 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 6:25 AM, Ralf Gommers wrote: > > > On Sat, Jul 17, 2010 at 8:06 PM, David Cournapeau wrote: > >> On Sat, Jul 17, 2010 at 4:37 AM, Charles R Harris >> wrote: >> > Hi All, >> > >> > Is there a good reason we shouldn't raise the required python version up >> to >> > 2.5? >> >> Yes, a lot of "enterprise ready" distributions still use python 2.4 >> (RHEL, Centos). >> >> That's not a convincing argument for an infinite amount of time. People > who value "enterprise ready" meaning they run ancient stuff should be > perfectly fine with numpy 1.4 + scipy 0.8 for a long time from now. And > otherwise they can upgrade python itself quite easily. 2.4 doesn't even get > security updates anymore. > > That was my thought. Also I wanted to merge Christoph's 3.1 fixes for ndimage without cherry picking the changes. Having the "n" format and absolute/relative imports would also clean up the code needed to support both 2.x and 3.x > We now support python 2.4-2.6, for the next versions we'll add 2.7, 3.1 and > 3.2. Supporting more versions has a cost. And it's clear that the amount of > people running 2.4 from svn is at or close to zero, because recent syntax > errors for 2.4 have gone unnoticed for a long period. I think it's about > time to drop 2.4. > > Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at googlemail.com Sat Jul 17 09:29:53 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 17 Jul 2010 21:29:53 +0800 Subject: [SciPy-Dev] [SciPy-User] ANN: scipy 0.8.0 release candidate 3 In-Reply-To: <4498D491-50AB-4578-BDAF-D91477AE91DA@astro.physik.uni-goettingen.de> References: <4498D491-50AB-4578-BDAF-D91477AE91DA@astro.physik.uni-goettingen.de> Message-ID: On Sat, Jul 17, 2010 at 4:50 AM, Derek Homeier < derek at astro.physik.uni-goettingen.de> wrote: > On 15.07.2010, at 11:15PM, Vincent Davis wrote: > > > Looks good for me now running py2.7 on oxs compiled 64bit > > > > Ran 4407 tests in 119.221s > > OK (KNOWNFAIL=19, SKIP=35) > > > > > > Notably, I never got trouble-free builds with 2.7 - both the numpy and > scipy test suites fail > with a bus error - I already tried compiling with gcc 4.2 instead of 4.0, > but to no avail. > Yes, there are issues with 2.7. Compiling against numpy 1.4.1 doesn't work, against trunk also has some issues on OS X. As Vincent pointed out, with a 64-bit Python 2.7 built with gcc-4.2 (python.org binaries are gcc-4.0) it does work without problems. > That said, numpy and scipy, as well as matplotlib and a number of other > packages do > run in everyday operations without any failures so far. > > With python2.5/2.6 I've found only two problems on MacOS X 10.5/PPC > (already present in > rc1 and rc2, sorry I did not get around to report them earlier): > > I. 
> ====================================================================== > ERROR: test_wavfile.test_read_1 > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/sw/lib/python2.6/site-packages/nose/case.py", line 186, in runTest > self.test(*self.arg) > File "/sw/lib/python2.6/site-packages/scipy/io/tests/test_wavfile.py", > line 14, in test_read_1 > rate, data = wavfile.read(datafile('test-44100-le-1ch-4bytes.wav')) > File "/sw/lib/python2.6/site-packages/scipy/io/wavfile.py", line 124, in > read > return rate, data > UnboundLocalError: local variable 'data' referenced before assignment > > I noted there already is a fix for big-endian architectures in > http://projects.scipy.org/scipy/changeset/5575 > > though I don't quite understand how it is supposed to work since > _big_endian = False is still hardcoded. > Further down _big_endian is declared as global, so not as hardcoded as it looks. But I don't think it ever worked, and that changeset didn't have tests: http://projects.scipy.org/scipy/ticket/873 I asked on the list a while ago if someone had time to fix this, but got no response. And since it's not a regression I'm afraid it'll have to wait till the next release. > > test-8000-le-2ch-1byteu.wav loads without errors anyway, > but even when setting _big_endian = True manually, both files fail with a > /sw/lib/python2.6/site-packages/scipy/io/wavfile.py in read(file) > 119 else: > 120 warnings.warn("chunk not understood", WavFileWarning) > --> 121 size = struct.unpack('I',fid.read(4))[0] > 122 bytes = fid.read(size) > 123 fid.close() > > error: unpack requires a string argument of length 4 > > - sorry I can't be of more help with this. > > II. 
> ====================================================================== > FAIL: test_data.test_boost(,) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/sw/lib/python2.6/site-packages/nose/case.py", line 186, in runTest > self.test(*self.arg) > File "/sw/lib/python2.6/site-packages/scipy/special/tests/test_data.py", > line 205, in _test_factory > test.check(dtype=dtype) > File "/sw/lib/python2.6/site-packages/scipy/special/tests/testutils.py", > line 223, in check > assert False, "\n".join(msg) > AssertionError: > Max |adiff|: 5.56451e-80 > Max |rdiff|: 6.01399e-12 > Bad results for the following points (in output 0): > -55.25 => 1.281342652143248e-73 != > 1.2813426521356121e-73 (rdiff 5.9592794108831318e-12) > -55.0625 => 9.8673260749254454e-73 != > 9.8673260748719804e-73 (rdiff 5.4183854671083142e-12) > -55.015625 => 4.736069454765531e-72 != > 4.7360694547400013e-72 (rdiff 5.3904770710359471e-12) > > [remaining results skipped, all with 4.6e-12 < rdiff < 6.1e-12] > I'd say just another case of too strict accuracy requirements. > > These should have been fixed by r6520: - self.check_cephes_vs_amos(iv, iv, rtol=1e-12, atol=1e-305) + self.check_cephes_vs_amos(iv, iv, rtol=5e-9, atol=1e-305) data(gammaincinv, 'gamma_inv_big_data_ipp-gamma_inv_big_data', - (0,1), 2, rtol=5e-12), + (0,1), 2, rtol=1e-11), Can you check if you have these changes, and with what accuracy the tests pass? 
scipy.test('full') in addition produces the following failure on PPC: > > ====================================================================== > FAIL: test_iv_cephes_vs_amos_mass_test (test_basic.TestBessel) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/sw/lib/python2.6/site-packages/scipy/special/tests/test_basic.py", > line 1621, in test_iv_cephes_vs_amos_mass_test > assert dc[k] < 1e-9, (v[k], x[k], iv(v[k], x[k]), iv(v[k], x[k]+0j)) > AssertionError: (-1.685450377317264, 1.2676128629915242, > 4.748141012443594e-05, (4.7481418802688318e-05+0j)) > > Both standard and full test pass on i386 and x86_64, but I noticed the > following strange behaviour: > when running the test suite twice in a row, the second run produces these > failures: > This is because warnings are only raised once from the same code, so the check if they're raised a second time fails. So no problem. Cheers, Ralf > > ====================================================================== > FAIL: test_mio.test_mat4_3d(, > , 0x4d542b0>, {'a': array([[[ 0, 1, 2, 3], > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/sw/lib/python2.6/site-packages/nose/case.py", line 186, in runTest > self.test(*self.arg) > File "/sw/lib/python2.6/site-packages/numpy/testing/utils.py", line 973, > in assert_raises > return nose.tools.assert_raises(*args,**kwargs) > AssertionError: DeprecationWarning not raised > > ====================================================================== > FAIL: test_00_deprecation_warning (test_basic.TestSolveHBanded) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File "/sw/lib/python2.6/site-packages/scipy/linalg/tests/test_basic.py", > line 261, in test_00_deprecation_warning > assert_raises(DeprecationWarning, solveh_banded, ab, b) > File 
"/sw/lib/python2.6/site-packages/numpy/testing/utils.py", line 973, > in assert_raises > return nose.tools.assert_raises(*args,**kwargs) > AssertionError: DeprecationWarning not raised > > ====================================================================== > FAIL: Regression test for #651: better handling of badly conditioned > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/sw/lib/python2.6/site-packages/scipy/signal/tests/test_filter_design.py", > line 34, in test_bad_filter > raise AssertionError("tf2zpk did not warn about bad "\ > AssertionError: tf2zpk did not warn about bad coefficients > > Cheers, > Derek > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Sat Jul 17 09:38:42 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 17 Jul 2010 21:38:42 +0800 Subject: [SciPy-Dev] replacing integrate.cumtrapz with numpy.trapz In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 8:57 PM, Charles R Harris wrote: > > > On Sat, Jul 17, 2010 at 2:42 AM, Ralf Gommers > wrote: > >> integrate.cumtrapz and numpy.trapz do exactly the same thing and the code >> is very similar but not identical, as pointed out in >> http://projects.scipy.org/scipy/ticket/720. Assuming numpy.trapz is not >> going anywhere, can we replace the scipy version with it? >> >> For a small bonus, the numpy version is about 10% faster (tested for >> several array shapes): >> >>> a = np.arange(1e4).reshape(500, 20) >> >>> %timeit np.trapz(a, axis=1) >> 10000 loops, best of 3: 182 us per loop >> >>> %timeit sp.integrate.cumtrapz(a, axis=1) >> 1000 loops, best of 3: 209 us per loop >> >> > Replacement seems reasonable to me. 
We should try to prune duplicate > functionality with numpy taking precedence. > > Just to double check, that also means deprecating the name cumtrapz right? Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Sat Jul 17 09:48:19 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 17 Jul 2010 07:48:19 -0600 Subject: [SciPy-Dev] replacing integrate.cumtrapz with numpy.trapz In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 7:38 AM, Ralf Gommers wrote: > > > On Sat, Jul 17, 2010 at 8:57 PM, Charles R Harris < > charlesr.harris at gmail.com> wrote: > >> >> >> On Sat, Jul 17, 2010 at 2:42 AM, Ralf Gommers < >> ralf.gommers at googlemail.com> wrote: >> >>> integrate.cumtrapz and numpy.trapz do exactly the same thing and the code >>> is very similar but not identical, as pointed out in >>> http://projects.scipy.org/scipy/ticket/720. Assuming numpy.trapz is not >>> going anywhere, can we replace the scipy version with it? >>> >>> For a small bonus, the numpy version is about 10% faster (tested for >>> several array shapes): >>> >>> a = np.arange(1e4).reshape(500, 20) >>> >>> %timeit np.trapz(a, axis=1) >>> 10000 loops, best of 3: 182 us per loop >>> >>> %timeit sp.integrate.cumtrapz(a, axis=1) >>> 1000 loops, best of 3: 209 us per loop >>> >>> >> Replacement seems reasonable to me. We should try to prune duplicate >> functionality with numpy taking precedence. >> >> Just to double check, that also means deprecating the name cumtrapz right? > > Yes, I think so. And maybe a ticket to remove it later although that means having versions in trac for future releases. Maybe we should have a "deprecated" component so we can use such tickets. Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
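[Mechanically, deprecating a name like this usually means keeping a thin alias that warns before delegating to the surviving implementation. A generic sketch — the names here are illustrative, not SciPy's actual code:

```python
import warnings

def _trapezoid_total(y):
    # Stand-in implementation: trapezoid rule with unit spacing.
    return sum((y[i] + y[i + 1]) / 2.0 for i in range(len(y) - 1))

def old_name(y):
    """Hypothetical deprecated alias, kept around for one release cycle."""
    warnings.warn("old_name is deprecated; use the numpy equivalent instead",
                  DeprecationWarning, stacklevel=2)
    return _trapezoid_total(y)
```

Because Python reports a given warning only once per location by default, tests for such aliases need `warnings.simplefilter("always")` inside a `catch_warnings` block; that same default is what makes "DeprecationWarning not raised" failures appear when the test suite is run twice in a row.]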
URL: From ralf.gommers at googlemail.com Sat Jul 17 09:52:22 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 17 Jul 2010 21:52:22 +0800 Subject: [SciPy-Dev] replacing integrate.cumtrapz with numpy.trapz In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 9:48 PM, Charles R Harris wrote: > > > On Sat, Jul 17, 2010 at 7:38 AM, Ralf Gommers > wrote: > >> >> >> On Sat, Jul 17, 2010 at 8:57 PM, Charles R Harris < >> charlesr.harris at gmail.com> wrote: >> >>> >>> >>> On Sat, Jul 17, 2010 at 2:42 AM, Ralf Gommers < >>> ralf.gommers at googlemail.com> wrote: >>> >>>> integrate.cumtrapz and numpy.trapz do exactly the same thing and the >>>> code is very similar but not identical, as pointed out in >>>> http://projects.scipy.org/scipy/ticket/720. Assuming numpy.trapz is not >>>> going anywhere, can we replace the scipy version with it? >>>> >>>> For a small bonus, the numpy version is about 10% faster (tested for >>>> several array shapes): >>>> >>> a = np.arange(1e4).reshape(500, 20) >>>> >>> %timeit np.trapz(a, axis=1) >>>> 10000 loops, best of 3: 182 us per loop >>>> >>> %timeit sp.integrate.cumtrapz(a, axis=1) >>>> 1000 loops, best of 3: 209 us per loop >>>> >>>> >>> Replacement seems reasonable to me. We should try to prune duplicate >>> functionality with numpy taking precedence. >>> >>> Just to double check, that also means deprecating the name cumtrapz >> right? >> >> > Yes, I think so. And maybe a ticket to remove it later although that means > having versions in trac for future releases. Maybe we should have a > "deprecated" component so we can use such tickets. > > I was thinking about adding "check all deprecations" as part of the release process (at the beginning). Then there's less chance to forget it, and less noise in Trac. Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jjstickel at vcn.com Sat Jul 17 09:52:53 2010 From: jjstickel at vcn.com (Jonathan Stickel) Date: Sat, 17 Jul 2010 07:52:53 -0600 Subject: [SciPy-Dev] replacing integrate.cumtrapz with numpy.trapz In-Reply-To: References: Message-ID: <4C41B5B5.40209@vcn.com> On 7/17/10 07:38 , Ralf Gommers wrote: > > > On Sat, Jul 17, 2010 at 8:57 PM, Charles R Harris > > wrote: > > > > On Sat, Jul 17, 2010 at 2:42 AM, Ralf Gommers > > > wrote: > > integrate.cumtrapz and numpy.trapz do exactly the same thing and > the code is very similar but not identical, as pointed out in > http://projects.scipy.org/scipy/ticket/720. Assuming numpy.trapz > is not going anywhere, can we replace the scipy version with it? > > For a small bonus, the numpy version is about 10% faster (tested > for several array shapes): > >>> a = np.arange(1e4).reshape(500, 20) > >>> %timeit np.trapz(a, axis=1) > 10000 loops, best of 3: 182 us per loop > >>> %timeit sp.integrate.cumtrapz(a, axis=1) > 1000 loops, best of 3: 209 us per loop > > > Replacement seems reasonable to me. We should try to prune duplicate > functionality with numpy taking precedence. > > Just to double check, that also means deprecating the name cumtrapz right? > Uh, you do realize that trapz and cumtrapz provide different functionality, right? In [5]: import scipy.integrate as igt In [8]: a = np.arange(10) In [9]: igt.cumtrapz(a) Out[9]: array([ 0.5, 2. , 4.5, 8. , 12.5, 18. , 24.5, 32. , 40.5]) In [10]: np.trapz(a) Out[10]: 40.5 In [11]: igt.trapz(a) Out[11]: 40.5 This is numpy-1.4.1 and scipy-0.7.2. Jonathan From njs at pobox.com Sat Jul 17 10:13:56 2010 From: njs at pobox.com (Nathaniel Smith) Date: Sat, 17 Jul 2010 07:13:56 -0700 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 5:25 AM, Ralf Gommers wrote: > On Sat, Jul 17, 2010 at 8:06 PM, David Cournapeau > wrote: >> Yes, a lot of "enterprise ready" ?distributions still use python 2.4 >> (RHEL, Centos). 
>> > That's not a convincing argument for an infinite amount of time. People who > value "enterprise ready" meaning they run ancient stuff should be perfectly > fine with numpy 1.4 + scipy 0.8 for a long time from now. And otherwise they > can upgrade python itself quite easily. 2.4 doesn't even get security > updates anymore. Your argument makes sense to me (and this decision doesn't affect me either way), but it isn't actually an infinite amount of time -- RHEL6, which ships python 2.6, is coming out in a matter of months. Someone should probably poll the -users list in any case -- people running from SVN are not necessarily a representative sample of the user base :-) -- Nathaniel From rmay31 at gmail.com Sat Jul 17 10:09:48 2010 From: rmay31 at gmail.com (Ryan May) Date: Sat, 17 Jul 2010 09:09:48 -0500 Subject: [SciPy-Dev] signal.freqs/freqz plot keyword In-Reply-To: References: Message-ID: <76138E52-A30A-4E18-B7D1-13FD9A738BF3@gmail.com> On Jul 17, 2010, at 4:38, Ralf Gommers wrote: > freqs/freqz have a keyword plot=None, which if True does this: > if not plot is None: > plot(w, h) > while plot is not even defined. If it's not None, it should be a callable passed in, which lets you define your own plotting function. Actually using matplotlib's plot produces unexpected results, you end up plotting the real part of the complex transfer function, not the magnitude. This actually bit me recently and should probably be documented. > keyword should be removed or the implementation should be something like: > if plot: > try: > import matplotlib.pyplot as plt > plt.plot(w, h) > plt.show() > except ImportError: > warnings.warn("`plot` is True, but can't import matplotlib.pyplot.") No, you can use any plotting library right now. > Removal makes more sense I think, since scipy does not depend on matplotlib. And why provide a plot keyword in these functions and not in many others where it would also make sense?
It's mimicking MATLAB functionality, which will plot the transfer function if you call freqz and don't save the results. Ryan From mellerf at netvision.net.il Sat Jul 17 11:10:36 2010 From: mellerf at netvision.net.il (Yosef Meller) Date: Sat, 17 Jul 2010 18:10:36 +0300 Subject: [SciPy-Dev] ANN: scipy 0.8.0 release candidate 3 In-Reply-To: References: Message-ID: <4C41C7EC.8030601@netvision.net.il> Looks good on Ubuntu Lucid 64 bit, py2.6, self-built numpy 1.4.1: Ran 4407 tests in 59.976s OK (KNOWNFAIL=13, SKIP=34) On 15/07/10 18:27, Ralf Gommers wrote: > I'm pleased to announce the availability of the third release candidate > of SciPy 0.8.0. The only changes compared to rc2 are a fix for a > regression in interpolate.Rbf and some fixes for failures on 64-bit > Windows. If no more problems are reported, the final release will be > available in one week. > > SciPy is a package of tools for science and engineering for Python. It > includes modules for statistics, optimization, integration, linear > algebra, Fourier transforms, signal and image processing, ODE solvers, > and more. > > This release candidate release comes one and a half year after the 0.7.0 > release and contains many new features, numerous bug-fixes, improved > test coverage, and better documentation. Please note that SciPy > 0.8.0rc3 requires Python 2.4-2.6 and NumPy 1.4.1 or greater. > > For more information, please see the release notes: > http://sourceforge.net/projects/scipy/files/scipy/0.8.0rc3/NOTES.txt/view > > You can download the release from here: > https://sourceforge.net/projects/scipy/ > Python 2.5/2.6 binaries for Windows and OS X are available, as well as > source tarballs for other platforms and the documentation in pdf form. > > Thank you to everybody who contributed to this release. 
> > Enjoy, > Ralf > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From aisaac at american.edu Sat Jul 17 11:17:16 2010 From: aisaac at american.edu (Alan G Isaac) Date: Sat, 17 Jul 2010 11:17:16 -0400 Subject: [SciPy-Dev] replacing integrate.cumtrapz with numpy.trapz In-Reply-To: References: Message-ID: <4C41C97C.8070001@american.edu> On Sat, Jul 17, 2010 at 2:42 AM, Ralf Gommers wrote: > For a small bonus, the numpy version is about 10% faster > (tested for several array shapes): > >>> a = np.arange(1e4).reshape(500, 20) > >>> %timeit np.trapz(a, axis=1) > 10000 loops, best of 3: 182 us per loop > >>> %timeit sp.integrate.cumtrapz(a, axis=1) > 1000 loops, best of 3: 209 us per loop That reminds me of a question: numpy.trapz returns add.reduce(d * (y[slice1]+y[slice2])/2.0,axis) Isn't that equivalent to but slightly less efficient than (d * (y[slice1]+y[slice2])).sum(axis=axis)/2.0 ? 
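[The equivalence Alan asks about is easy to confirm numerically: the two expressions compute the same value and differ only in whether the division by two happens per element or once on the reduced result. A sketch, with `d` and `y` standing in for the arrays in trapz's source:

```python
import numpy as np

x = np.linspace(0.0, 2.0, 9)
y = np.linspace(0.0, 1.0, 9) ** 2        # arbitrary sample values
d = np.diff(x)                            # interval widths

a = np.add.reduce(d * (y[1:] + y[:-1]) / 2.0, axis=0)   # numpy.trapz's form
b = (d * (y[1:] + y[:-1])).sum(axis=0) / 2.0            # the proposed variant
# a and b agree to rounding; the variant performs one division instead of n,
# which is where any slight efficiency difference would come from
```

]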
Alan Isaac From brian.lee.hawthorne at gmail.com Sat Jul 17 11:19:30 2010 From: brian.lee.hawthorne at gmail.com (Brian Hawthorne) Date: Sat, 17 Jul 2010 08:19:30 -0700 Subject: [SciPy-Dev] (no subject) Message-ID: http://sites.google.com/site/fdgy754g/ljie4j From ralf.gommers at googlemail.com Sat Jul 17 11:52:30 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 17 Jul 2010 23:52:30 +0800 Subject: [SciPy-Dev] replacing integrate.cumtrapz with numpy.trapz In-Reply-To: <4C41B5B5.40209@vcn.com> References: <4C41B5B5.40209@vcn.com> Message-ID: On Sat, Jul 17, 2010 at 9:52 PM, Jonathan Stickel wrote: > On 7/17/10 07:38 , Ralf Gommers wrote: > >> >> >> On Sat, Jul 17, 2010 at 8:57 PM, Charles R Harris >> > wrote: >> >> >> >> On Sat, Jul 17, 2010 at 2:42 AM, Ralf Gommers >> > >> >> wrote: >> >> integrate.cumtrapz and numpy.trapz do exactly the same thing and >> the code is very similar but not identical, as pointed out in >> http://projects.scipy.org/scipy/ticket/720. Assuming numpy.trapz >> is not going anywhere, can we replace the scipy version with it? >> >> For a small bonus, the numpy version is about 10% faster (tested >> for several array shapes): >> >>> a = np.arange(1e4).reshape(500, 20) >> >>> %timeit np.trapz(a, axis=1) >> 10000 loops, best of 3: 182 us per loop >> >>> %timeit sp.integrate.cumtrapz(a, axis=1) >> 1000 loops, best of 3: 209 us per loop >> >> >> Replacement seems reasonable to me. We should try to prune duplicate >> functionality with numpy taking precedence. >> >> Just to double check, that also means deprecating the name cumtrapz right? >> >> > Uh, you do realize that trapz and cumtrapz provide different functionality, > right? > > Actually no, sorry for the noise. Now I'm not sure what the point of that ticket was. Maybe to replace part of the code in cumtrapz with trapz? Or the submitter made the same mistake as I just did. 
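[The distinction Jonathan points out below: trapz returns a single total, while cumtrapz returns the running integral at each interior point. The two are related through a cumulative sum of the same per-interval trapezoid areas — a plain-numpy sketch reproducing his session:

```python
import numpy as np

a = np.arange(10.0)
areas = (a[1:] + a[:-1]) / 2.0   # per-interval trapezoid areas, unit spacing
cum = np.cumsum(areas)           # what cumtrapz returns: the running integral
total = areas.sum()              # what trapz returns: just the final value
# cum == [0.5, 2., 4.5, 8., 12.5, 18., 24.5, 32., 40.5] and total == 40.5,
# matching the In/Out session quoted in this thread
```

]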
Ralf > In [5]: import scipy.integrate as igt > > In [8]: a = np.arange(10) > > In [9]: igt.cumtrapz(a) > Out[9]: array([ 0.5, 2. , 4.5, 8. , 12.5, 18. , 24.5, 32. , > 40.5]) > > In [10]: np.trapz(a) > Out[10]: 40.5 > > In [11]: igt.trapz(a) > Out[11]: 40.5 > > This is numpy-1.4.1 and scipy-0.7.2. > > Jonathan > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ceball at gmail.com Sat Jul 17 11:49:15 2010 From: ceball at gmail.com (Chris Ball) Date: Sat, 17 Jul 2010 15:49:15 +0000 (UTC) Subject: [SciPy-Dev] Getting Scipy's weave to work reliably on Windows Message-ID: Hi, While testing Scipy's weave on several different Windows installations, I came across some problems with spaces in paths that often prevent weave from working. I can see a change that could probably get weave working on most Windows installations, but it is a quick hack. Someone knowledgeable about distutils (and numpy.distutils?) might be able to help me fix this properly. Below I describe three common problems with weave on Windows, in the hope that this information helps others, or allows someone to suggest how to fix the spaces-in-paths problem properly. I think there are three common problems that stop weave from working on Windows. The first is not having a C compiler. Both Python(x,y) and EPD provide a C compiler that seems to work fine, which is great! The second problem is that if weave is installed to a location with a space in the path, linking fails. There is already a scipy bug report about this (http://projects.scipy.org/scipy/ticket/809). I've just commented on that report, saying the problem appears to be with distutils, and there is already a Python bug report about it (http://bugs.python.org/issue4508). Maybe someone could close this scipy bug, or link it to the Python one somehow? In any case, when using Python(x,y) or EPD, this bug will not show up if the default installation locations are accepted. So, that's also good news! 
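[On the quoting side: failures like these typically come from building a single command string in which an unquoted path splits at each space. Passing an argument list instead, so no shell parsing happens, sidesteps the problem entirely. A minimal demonstration with a made-up path — this is not weave's or distutils' actual call chain:

```python
import subprocess
import sys

path_with_spaces = "C:/Documents and Settings/Jane Doe/weave"  # hypothetical

# In list form, each element reaches the child process as one argv entry,
# spaces and all, with no manual quoting:
p = subprocess.Popen(
    [sys.executable, "-c", "import sys; sys.stdout.write(sys.argv[1])",
     path_with_spaces],
    stdout=subprocess.PIPE)
out = p.communicate()[0].decode()
# out equals path_with_spaces; joining the same pieces into one unquoted
# shell string is what makes the path fall apart at the spaces
```

]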
The third problem is that if the Windows user name has a space in it (which in my experience is quite common), compilation fails. Weave uses the user name to create a path for its "intermediate" and "compiled" files. When the compilation command is issued, the path with the space in it is also not quoted. Presumably that is another error in distutils (or numpy.distutils)? Unfortunately I wasn't able to pinpoint what function is failing to quote strings properly, because I couldn't figure out the chain that leads to the compiler being called. However, I can avoid the problem by removing spaces from the user name in weave itself (catalog.py): def whoami(): """return a string identifying the user.""" return (os.environ.get("USER") or os.environ.get("USERNAME") or "unknown").replace(" ","") (where I have added .replace(" ","") to the existing code). I realize this isn't the right solution, so if someone could help to guide me to the point where quoting should occur, that would be very helpful. Otherwise, is there any chance of applying a hack like this so weave can work reliably on Windows? Thanks, Chris From ben.root at ou.edu Sat Jul 17 13:49:42 2010 From: ben.root at ou.edu (Benjamin Root) Date: Sat, 17 Jul 2010 12:49:42 -0500 Subject: [SciPy-Dev] Typo in license file In-Reply-To: References: Message-ID: On Fri, Jul 16, 2010 at 11:52 PM, Ralf Gommers wrote: > > > On Sat, Jul 17, 2010 at 8:42 AM, Joshua Holbrook wrote: > >> Hey guys, >> >> Not a big deal but under clause (c): >> >> c. Neither the name of the Enthought nor the names of its contributors >> >> should probably be: >> >> c. Neither the name of Enthought nor the names of its contributors >> >> That is, there's an extra "the." Like I said, not a big deal, but I did >> notice. >> >> I think there's also a bit missing, it should be something like: > "The names of Enthought, the SciPy Developers or any contributors may not > be used ..." > > Ralf > > I agree. 
While at first it might seem redundant, the original phrasing can be interpreted to mean only the contributors from Enthought. This way, any contributor to SciPy is covered by the clause. Ben Root -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben.root at ou.edu Sat Jul 17 14:32:02 2010 From: ben.root at ou.edu (Benjamin Root) Date: Sat, 17 Jul 2010 13:32:02 -0500 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: FWIW, NOAA has recently (within the past year, I believe) gotten approval to *upgrade* to RHEL5. And because of IT policies, or because they need to run programs on servers out of their control, many of the users can not personally update their version of Python away from 2.4. I am sure there are other government agencies that are in the same boat. Ben Root On Sat, Jul 17, 2010 at 9:13 AM, Nathaniel Smith wrote: > On Sat, Jul 17, 2010 at 5:25 AM, Ralf Gommers > wrote: > > On Sat, Jul 17, 2010 at 8:06 PM, David Cournapeau > > wrote: > >> Yes, a lot of "enterprise ready" distributions still use python 2.4 > >> (RHEL, Centos). > >> > > That's not a convincing argument for an infinite amount of time. People > who > > value "enterprise ready" meaning they run ancient stuff should be > perfectly > > fine with numpy 1.4 + scipy 0.8 for a long time from now. And otherwise > they > > can upgrade python itself quite easily. 2.4 doesn't even get security > > updates anymore. > > Your argument makes sense to me (and this decision doesn't affect me > either way), but it isn't actually an infinite amount of time -- > RHEL6, which ships python 2.6, is coming out in a matter of months. 
> > Someone should probably poll the -users list in any case -- people > running from SVN are not necessarily a representative sample of the > user base :-) > > -- Nathaniel > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Sat Jul 17 14:46:02 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 17 Jul 2010 12:46:02 -0600 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 12:32 PM, Benjamin Root wrote: > FWIW, NOAA has recently (within the past year, I believe) gotten approval > to *upgrade* to RHEL5. And because of IT policies, or because they need to > run programs on servers out of their control, many of the users can not > personally update their version of Python away from 2.4. > > I am sure there are other government agencies that are in the same boat. > > But will NumPy 1.5 and SciPy 0.8 suffice for their needs? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From efiring at hawaii.edu Sat Jul 17 14:46:58 2010 From: efiring at hawaii.edu (Eric Firing) Date: Sat, 17 Jul 2010 08:46:58 -1000 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: <4C41FAA2.7080400@hawaii.edu> On 07/17/2010 08:32 AM, Benjamin Root wrote: > FWIW, NOAA has recently (within the past year, I believe) gotten > approval to *upgrade* to RHEL5. And because of IT policies, or because > they need to run programs on servers out of their control, many of the > users can not personally update their version of Python away from 2.4. But they *can* update their numpy, scipy, mpl? Eric > > I am sure there are other government agencies that are in the same boat. 
> > Ben Root > > > > On Sat, Jul 17, 2010 at 9:13 AM, Nathaniel Smith > wrote: > > On Sat, Jul 17, 2010 at 5:25 AM, Ralf Gommers > > > wrote: > > On Sat, Jul 17, 2010 at 8:06 PM, David Cournapeau > > > > wrote: > >> Yes, a lot of "enterprise ready" distributions still use python 2.4 > >> (RHEL, Centos). > >> > > That's not a convincing argument for an infinite amount of time. > People who > > value "enterprise ready" meaning they run ancient stuff should be > perfectly > > fine with numpy 1.4 + scipy 0.8 for a long time from now. And > otherwise they > > can upgrade python itself quite easily. 2.4 doesn't even get security > > updates anymore. > > Your argument makes sense to me (and this decision doesn't affect me > either way), but it isn't actually an infinite amount of time -- > RHEL6, which ships python 2.6, is coming out in a matter of months. > > Someone should probably poll the -users list in any case -- people > running from SVN are not necessarily a representative sample of the > user base :-) > > -- Nathaniel > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From josh.holbrook at gmail.com Sat Jul 17 14:51:16 2010 From: josh.holbrook at gmail.com (Joshua Holbrook) Date: Sat, 17 Jul 2010 10:51:16 -0800 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 10:32 AM, Benjamin Root wrote: > FWIW, NOAA has recently (within the past year, I believe) gotten approval to > *upgrade* to RHEL5.? And because of IT policies, or because they need to run > programs on servers out of their control, many of the users can not > personally update their version of Python away from 2.4. > > I am sure there are other government agencies that are in the same boat. 
> > Ben Root > > > > On Sat, Jul 17, 2010 at 9:13 AM, Nathaniel Smith wrote: >> >> On Sat, Jul 17, 2010 at 5:25 AM, Ralf Gommers >> wrote: >> > On Sat, Jul 17, 2010 at 8:06 PM, David Cournapeau >> > wrote: >> >> Yes, a lot of "enterprise ready" distributions still use python 2.4 >> >> (RHEL, Centos). >> >> >> > That's not a convincing argument for an infinite amount of time. People >> > who >> > value "enterprise ready" meaning they run ancient stuff should be >> > perfectly >> > fine with numpy 1.4 + scipy 0.8 for a long time from now. And otherwise >> > they >> > can upgrade python itself quite easily. 2.4 doesn't even get security >> > updates anymore. >> >> Your argument makes sense to me (and this decision doesn't affect me >> either way), but it isn't actually an infinite amount of time -- >> RHEL6, which ships python 2.6, is coming out in a matter of months. >> >> Someone should probably poll the -users list in any case -- people >> running from SVN are not necessarily a representative sample of the >> user base :-) >> >> -- Nathaniel >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > I was just using some incredibly stale government computers yesterday, and they were running python 2.5. I'd support dropping 2.4, especially if scipy is going to start supporting even more python versions. In addition, users requiring 2.4 support can use older versions of numpy, scipy, etc. As long as these versions are made available, I think everything would be okay. 
--Josh From ben.root at ou.edu Sat Jul 17 14:57:07 2010 From: ben.root at ou.edu (Benjamin Root) Date: Sat, 17 Jul 2010 13:57:07 -0500 Subject: [SciPy-Dev] Required Python Version In-Reply-To: <4C41FAA2.7080400@hawaii.edu> References: <4C41FAA2.7080400@hawaii.edu> Message-ID: On Sat, Jul 17, 2010 at 1:46 PM, Eric Firing wrote: > On 07/17/2010 08:32 AM, Benjamin Root wrote: > > FWIW, NOAA has recently (within the past year, I believe) gotten > > approval to *upgrade* to RHEL5. And because of IT policies, or because > > they need to run programs on servers out of their control, many of the > > users can not personally update their version of Python away from 2.4. > > But they *can* update their numpy, scipy, mpl? > > Eric > Eric and Charles, You would have to ask a NOAA employee about the details of their IT setup and their needs, I merely share a building with people in the NWS. Do we have any NOAA people on SciPy-dev list? I know we have some for the mpl list. Ben Root -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben.root at ou.edu Sat Jul 17 15:10:26 2010 From: ben.root at ou.edu (Benjamin Root) Date: Sat, 17 Jul 2010 14:10:26 -0500 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 1:51 PM, Joshua Holbrook wrote: > On Sat, Jul 17, 2010 at 10:32 AM, Benjamin Root wrote: > > FWIW, NOAA has recently (within the past year, I believe) gotten approval > to > > *upgrade* to RHEL5. And because of IT policies, or because they need to > run > > programs on servers out of their control, many of the users can not > > personally update their version of Python away from 2.4. > > > > I am sure there are other government agencies that are in the same boat. 
> > > > Ben Root > > > > > > > > On Sat, Jul 17, 2010 at 9:13 AM, Nathaniel Smith wrote: > >> > >> On Sat, Jul 17, 2010 at 5:25 AM, Ralf Gommers > >> wrote: > >> > On Sat, Jul 17, 2010 at 8:06 PM, David Cournapeau > > >> > wrote: > >> >> Yes, a lot of "enterprise ready" distributions still use python 2.4 > >> >> (RHEL, Centos). > >> >> > >> > That's not a convincing argument for an infinite amount of time. > People > >> > who > >> > value "enterprise ready" meaning they run ancient stuff should be > >> > perfectly > >> > fine with numpy 1.4 + scipy 0.8 for a long time from now. And > otherwise > >> > they > >> > can upgrade python itself quite easily. 2.4 doesn't even get security > >> > updates anymore. > >> > >> Your argument makes sense to me (and this decision doesn't affect me > >> either way), but it isn't actually an infinite amount of time -- > >> RHEL6, which ships python 2.6, is coming out in a matter of months. > >> > >> Someone should probably poll the -users list in any case -- people > >> running from SVN are not necessarily a representative sample of the > >> user base :-) > >> > >> -- Nathaniel > >> _______________________________________________ > >> SciPy-Dev mailing list > >> SciPy-Dev at scipy.org > >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > I was just using some incredistale government computers yesterday, and > they were running python 2.5. I'd support dropping 2.4, especially if > scipy is going to start supporting even more python versions. In > addition, users requiring 2.4 support can use older versions of numpy, > scipy, etc. As long as these versions are made available, I think > everything would be okay. 
> --Josh > > And I have seen a few independently administered government machines running Fedora, but that doesn't necessarily reflect the IT policy of government servers. What would probably be important to find out is if the features/enhancements that would be made available by dropping support for 2.4 outweigh leaving behind a group of (dedicated) users. In addition, are there features/enhancements that are planned that don't require dropping 2.4, but that these users would also want/need? Note, I am all for moving ahead and eliminating cruft, however, the meteorological community is at a critical point right now and is starting to realize the value of Python. I have just found out that at the next annual meeting of the American Meteorological Society, there will be a special symposium on the applications of Python to the field of meteorology. I wouldn't want to alienate a field that is largely in the government sector, so this decision isn't to be taken lightly. Ben Root -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Sat Jul 17 15:11:46 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 17 Jul 2010 13:11:46 -0600 Subject: [SciPy-Dev] PyDocWeb "the doc editor" and github transition Message-ID: I have not seen anything about the move to github and how this affects pydocweb or did I just miss it. I am willing to adapt pydocweb to work with github but not sure what the desired integration would be. This should not be too difficult but should/needs to be part of the plan and documented. My initial thought is to create a branch for the doc editor. My only thought is to allow the doc editor (and those administering it) to maintain a github repo/branch and then allow those that are in control of the numpy/scipy source to merge/pull the docs with the doc repo/branch. 
(tell me what to do and I will work on it) Vincent From pav at iki.fi Sat Jul 17 15:42:42 2010 From: pav at iki.fi (Pauli Virtanen) Date: Sat, 17 Jul 2010 19:42:42 +0000 (UTC) Subject: [SciPy-Dev] Required Python Version References: Message-ID: Sat, 17 Jul 2010 14:10:26 -0500, Benjamin Root wrote: [clip] > And I have seen a few independently administered government machines > running Fedora, that doesn't necessarily reflect the IT policy of > government servers. What would probably be important to find out is if > the features/enhancements that would be made available by dropping > support for 2.4 outweigh leaving behind a group of (dedicated) users. > In addition, are there features/enhancements that are planned that don't > require dropping 2.4, but that these users would also want/need? I do not think supporting 2.4 has a large impact on maintenance burden. The differences to Python > 2.4 are small, at least if compared to Python 3. I would not consider the "with" statement or absolute imports very important at this point. We already went through the trouble to get things working on Python 3, and deciding to drop Python 2.4 to make the transition easier is no additional help at this point. -- Pauli Virtanen From pav at iki.fi Sat Jul 17 15:47:42 2010 From: pav at iki.fi (Pauli Virtanen) Date: Sat, 17 Jul 2010 19:47:42 +0000 (UTC) Subject: [SciPy-Dev] PyDocWeb "the doc editor" and github transition References: Message-ID: Sat, 17 Jul 2010 13:11:46 -0600, Vincent Davis wrote: > I have not seen anything about the move to github and how this effects > pydocweb or did I just miss it. I am willing to adapt pydocweb to work > with github but not sure what the desired integration would be. This > should not be to difficult but should/needs to be part of the plan and > documented. You do not need to make any changes to the doc editor because of the github move -- only to its configuration. 
The documentation editor app itself does not know or care what the version control system is. The required change is essentially replacing svn checkout ... svn update by git clone ... git pull ... in a shell script, and clearing up the checkout directories. I can take care of this on the server machine, once the move to git has been made. -- Pauli Virtanen From charlesr.harris at gmail.com Sat Jul 17 15:55:52 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sat, 17 Jul 2010 13:55:52 -0600 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 1:42 PM, Pauli Virtanen wrote: > Sat, 17 Jul 2010 14:10:26 -0500, Benjamin Root wrote: > [clip] > > And I have seen a few independently administered government machines > > running Fedora, that doesn't necessarily reflect the IT policy of > > government servers. What would probably be important to find out is if > > the features/enhancements that would be made available by dropping > > support for 2.4 outweigh leaving behind a group of (dedicated) users. > > In addition, are there features/enhancements that are planned that don't > > require dropping 2.4, but that these users would also want/need? > > I do not think supporting 2.4 has a large impact on maintenance burden. > The differences to Python > 2.4 are small, at least if compared to Python > 3. > > I would not consider the "with" statement or absolute imports very > important at this point. We already went through the trouble to get > things working on Python 3, and deciding to drop Python 2.4 to make the > transition easier is no additional help at this point. > > How to parse Py_ssize_t values? I think the 'n' format code is quite useful for supporting 64 bits in a portable way and it isn't available in 2.4. Chuck -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From vincent at vincentdavis.net Sat Jul 17 16:58:03 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 17 Jul 2010 14:58:03 -0600 Subject: [SciPy-Dev] [Numpy-discussion] PyDocWeb "the doc editor" and github transition In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 1:47 PM, Pauli Virtanen wrote: > Sat, 17 Jul 2010 13:11:46 -0600, Vincent Davis wrote: >> I have not seen anything about the move to github and how this effects >> pydocweb or did I just miss it. I am willing to adapt pydocweb to work >> with github but not sure what the desired integration would be. This >> should not be to difficult but should/needs to be part of the plan and >> documented. > > You do not need to make any changes to the doc editor because of the > github move -- only to its configuration. The documentation editor app > itself does not know or care what the version control system is. > > The required change is essentially replacing > > svn checkout ... > svn update > > by > > git clone ... > git pull ... > > in a shell script, and clearing up the checkout directories. I can take > care of this on the server machine, once the move to git has been made. I was thinking it was that simple. Should it clone/pull from the master branch? 
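In script terms, the svn-to-git substitution Pauli describes amounts to something like the following (a Python sketch standing in for the real shell script; the repository URL, checkout path, and branch name are hypothetical placeholders):

```python
import os

def checkout_commands(vcs, url, dest, branch="master"):
    """Return the commands an update script would run to refresh the
    doc editor's source checkout. A sketch only -- the real script is
    plain shell, and `url`/`dest` here are hypothetical placeholders.
    """
    if os.path.isdir(dest):
        # Existing checkout: just bring it up to date.
        if vcs == "svn":
            return [["svn", "update", dest]]
        return [["git", "-C", dest, "pull", "origin", branch]]
    # Fresh checkout: clone (or check out) the requested source.
    if vcs == "svn":
        return [["svn", "checkout", url, dest]]
    return [["git", "clone", "-b", branch, url, dest]]
```

Only the configured commands change between the two systems; the editor keeps reading files out of the checkout directory either way.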
Thanks Vincent > > -- > Pauli Virtanen > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > http://mail.scipy.org/mailman/listinfo/numpy-discussion > From cournape at gmail.com Sat Jul 17 17:33:14 2010 From: cournape at gmail.com (David Cournapeau) Date: Sat, 17 Jul 2010 23:33:14 +0200 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: On Sat, Jul 17, 2010 at 2:25 PM, Ralf Gommers wrote: > > > On Sat, Jul 17, 2010 at 8:06 PM, David Cournapeau > wrote: >> >> On Sat, Jul 17, 2010 at 4:37 AM, Charles R Harris >> wrote: >> > Hi All, >> > >> > Is there a good reason we shouldn't raise the required python version up >> > to >> > 2.5? >> >> Yes, a lot of "enterprise ready" distributions still use python 2.4 >> (RHEL, Centos). >> > That's not a convincing argument for an infinite amount of time. People who > value "enterprise ready" meaning they run ancient stuff should be perfectly > fine with numpy 1.4 + scipy 0.8 for a long time from now. That's not true in my experience. 2 years ago I used those "ancient" stuff. > And otherwise they > can upgrade python itself quite easily. that's not true at all. True, you can update python itself easily, but updating things like pygtk or pyqt are near impossible without the support of IT staff (because you generally lacks X11 headers, or gcc is too old). > 2.4 doesn't even get security > updates anymore. That's not exactly true - python developers will not give security updates, but I think RH may still do so. > We now support python 2.4-2.6, for the next versions we'll add 2.7, 3.1 and > 3.2. Supporting more versions has a cost. And it's clear that the amount of > people running 2.4 from svn is at or close to zero, because recent syntax > errors for 2.4 have gone unnoticed for a long period. People running svn are a small proportion of our userbase. And the 2.4 support cost is fairly minimal IMO. 
Certainly, 2.4 support is more important than 3.x at this point. David From derek at astro.physik.uni-goettingen.de Sat Jul 17 18:30:47 2010 From: derek at astro.physik.uni-goettingen.de (Derek Homeier) Date: Sun, 18 Jul 2010 00:30:47 +0200 Subject: [SciPy-Dev] [SciPy-User] ANN: scipy 0.8.0 release candidate 3 In-Reply-To: References: <4498D491-50AB-4578-BDAF-D91477AE91DA@astro.physik.uni-goettingen.de> Message-ID: Hi Ralf, > > Notably, I never got trouble-free builds with 2.7 - both the numpy > and scipy test suites fail > with a bus error - I already tried compiling with gcc 4.2 instead of > 4.0, but to no avail. > > Yes, there are issues with 2.7. Compiling against numpy 1.4.1 > doesn't work, against trunk also has some issues on OS X. As Vincent > pointed out, with a 64-bit Python 2.7 built with gcc-4.2 (python.org > binaries are gcc-4.0) it does work without problems. well, as I said I did try the gcc-4.2 build, but probably missed that Vincent was compiling numpy trunk - I probably should try again on my 10.6 64-bit installation. Anyway, compiling both numpy 1.4.1 and scipy 0.8.0rc3 works fine both on i386/ppc 32-bit and x86_64, using the fink build system. And it is running as well, so far, except for those tests - btw. numpy.test(verbose=5) crashes at test_multiarray.TestIO.test_ascii ... Bus error and scipy.test() at Testing that kmeans2 init methods work. ... Bus error > > These should have been fixed by r6520: > - self.check_cephes_vs_amos(iv, iv, rtol=1e-12, atol=1e-305) > + self.check_cephes_vs_amos(iv, iv, rtol=5e-9, atol=1e-305) > > data(gammaincinv, 'gamma_inv_big_data_ipp- gamma_inv_big_data', > - (0,1), 2, rtol=5e-12), > + (0,1), 2, rtol=1e-11), > > Can you check if you have these changes, and with what accuracy the > tests pass? > I have (rc3 tarball), but it's a different couple of tests. 'gamma_inv_big_data_ipp-gamma_inv_big_data' actually still passes with rtol=2.1e-12. 
These are the minimal accuracy changes needed: --- /sw/lib/python2.6/site-packages/scipy/special/tests/test_data.py 2010-07-11 17:25:24.000000000 +0200 +++ test_data.py 2010-07-17 21:13:07.000000000 +0200 @@ -85,7 +85,7 @@ data(gamma, 'test_gamma_data_ipp-near_1', 0, 1), data(gamma, 'test_gamma_data_ipp-near_2', 0, 1), data(gamma, 'test_gamma_data_ipp-near_m10', 0, 1), - data(gamma, 'test_gamma_data_ipp-near_m55', 0, 1), + data(gamma, 'test_gamma_data_ipp-near_m55', 0, 1, rtol=7e-12), data(gamma, 'test_gamma_data_ipp-near_0', 0j, 1, rtol=2e-9), data(gamma, 'test_gamma_data_ipp-near_1', 0j, 1, rtol=2e-9), data(gamma, 'test_gamma_data_ipp-near_2', 0j, 1, rtol=2e-9), --- /sw/lib/python2.6/site-packages/scipy/special/tests/test_basic.py 2010-07-11 17:25:24.000000000 +0200 +++ test_basic.py 2010-07-17 21:28:07.000000000 +0200 @@ -1618,7 +1618,7 @@ # Most error apparently comes from AMOS and not our implementation; # there are some problems near integer orders there - assert dc[k] < 1e-9, (v[k], x[k], iv(v[k], x[k]), iv(v[k], x[k]+0j)) + assert dc[k] < 1.9e-7, (v[k], x[k], iv(v[k], x[k]), iv(v[k], x[k]+0j)) def test_kv_cephes_vs_amos(self): #self.check_cephes_vs_amos(kv, kn, rtol=1e-9, atol=1e-305) Of course nearly 2e-7 seems a quite high tolerance - don't know if that's acceptable. > > Both standard and full test pass on i386 and x86_64, but I noticed > the following strange behaviour: > when running the test suite twice in a row, the second run produces > these failures: > > This is because warnings are only raised once from the same code, so > the check if they're raised a second time fails. So no problem. > Ah, that makes sense, thanks! 
Cheers, Derek From vincent at vincentdavis.net Sat Jul 17 18:44:17 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Sat, 17 Jul 2010 16:44:17 -0600 Subject: [SciPy-Dev] [SciPy-User] ANN: scipy 0.8.0 release candidate 3 In-Reply-To: References: <4498D491-50AB-4578-BDAF-D91477AE91DA@astro.physik.uni-goettingen.de> Message-ID: On Sat, Jul 17, 2010 at 4:30 PM, Derek Homeier wrote: > Hi Ralf, > >> >> Notably, I never got trouble-free builds with 2.7 - both the numpy >> and scipy test suites fail >> with a bus error - I already tried compiling with gcc 4.2 instead of >> 4.0, but to no avail. >> >> Yes, there are issues with 2.7. Compiling against numpy 1.4.1 >> doesn't work, against trunk also has some issues on OS X. As Vincent >> pointed out, with a 64-bit Python 2.7 built with gcc-4.2 (python.org >> binaries are gcc-4.0) it does work without problems. > > well, as I said I did try the gcc-4.2 build, but probably missed that > Vincent was compiling numpy trunk - > I probably should try again on my 10.6 64-bit installation. For your information Python version 2.7.0+ (release27-maint:82653, Jul 8 2010, 14:45:18) [GCC 4.2.1 (Apple Inc. build 5659)] Build commands used python 2.7 ./configure --with-universal-archs=64-bit --enable-universalsdk=/Developer/SDKs/MacOSX10.5.sdk --enable-framework for numpy and scipy I used LDFLAGS="-arch x86_64" FFLAGS="-arch x86_64" py27 setupscons.py scons Vincent > Anyway, compiling both numpy 1.4.1 and scipy 0.8.0rc3 works fine both > on i386/ppc 32-bit and x86_64, > using the fink build system. And it is running as well, so far, except > for those tests - btw. > numpy.test(verbose=5) crashes at > test_multiarray.TestIO.test_ascii ... Bus error > > and scipy.test() at > Testing that kmeans2 init methods work. ... Bus error > >> >> These should have been fixed by r6520: >> - self.check_cephes_vs_amos(iv, iv, rtol=1e-12, atol=1e-305) >> + self.check_cephes_vs_amos(iv, iv, rtol=5e-9, atol=1e-305) >> >>
data(gammaincinv, 'gamma_inv_big_data_ipp- >> gamma_inv_big_data', >> - (0,1), 2, rtol=5e-12), >> + (0,1), 2, rtol=1e-11), >> >> Can you check if you have these changes, and with what accuracy the >> tests pass? >> > I have (rc3 tarball), but it's a different couple of tests. > 'gamma_inv_big_data_ipp-gamma_inv_big_data' actually still passes with > rtol=2.1e-12. > > These are the minimal accuracy changes needed: > > --- /sw/lib/python2.6/site-packages/scipy/special/tests/test_data.py > 2010-07-11 17:25:24.000000000 +0200 > +++ test_data.py 2010-07-17 21:13:07.000000000 +0200 > @@ -85,7 +85,7 @@ > data(gamma, 'test_gamma_data_ipp-near_1', 0, 1), > data(gamma, 'test_gamma_data_ipp-near_2', 0, 1), > data(gamma, 'test_gamma_data_ipp-near_m10', 0, 1), > - data(gamma, 'test_gamma_data_ipp-near_m55', 0, 1), > + data(gamma, 'test_gamma_data_ipp-near_m55', 0, 1, rtol=7e-12), > data(gamma, 'test_gamma_data_ipp-near_0', 0j, 1, rtol=2e-9), > data(gamma, 'test_gamma_data_ipp-near_1', 0j, 1, rtol=2e-9), > data(gamma, 'test_gamma_data_ipp-near_2', 0j, 1, rtol=2e-9), > --- /sw/lib/python2.6/site-packages/scipy/special/tests/test_basic.py > 2010-07-11 17:25:24.000000000 +0200 > +++ test_basic.py 2010-07-17 21:28:07.000000000 +0200 > @@ -1618,7 +1618,7 @@ > > # Most error apparently comes from AMOS and not our > implementation; > # there are some problems near integer orders there > - assert dc[k] < 1e-9, (v[k], x[k], iv(v[k], x[k]), iv(v[k], > x[k]+0j)) > + assert dc[k] < 1.9e-7, (v[k], x[k], iv(v[k], x[k]), iv(v[k], > x[k]+0j)) > > def test_kv_cephes_vs_amos(self): > #self.check_cephes_vs_amos(kv, kn, rtol=1e-9, atol=1e-305) > > Of course nearly 2e-7 seems a quite high tolerance - don't know if > that's acceptable. 
> >> Both standard and full test pass on i386 and x86_64, but I noticed >> the following strange behaviour: >> when running the test suite twice in a row, the second run produces >> these failures: >> >> This is because warnings are only raised once from the same code, so >> the check if they're raised a second time fails. So no problem. >> > > Ah, that makes sense, thanks! > > Cheers, > Derek > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From matthieu.brucher at gmail.com Sun Jul 18 02:21:36 2010 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Sun, 18 Jul 2010 08:21:36 +0200 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: 2010/7/17 David Cournapeau : > On Sat, Jul 17, 2010 at 2:25 PM, Ralf Gommers > wrote: >> >> >> On Sat, Jul 17, 2010 at 8:06 PM, David Cournapeau >> wrote: >>> >>> On Sat, Jul 17, 2010 at 4:37 AM, Charles R Harris >>> wrote: >>> > Hi All, >>> > >>> > Is there a good reason we shouldn't raise the required python version up >>> > to >>> > 2.5? >>> >>> Yes, a lot of "enterprise ready" distributions still use python 2.4 >>> (RHEL, Centos). >>> >> That's not a convincing argument for an infinite amount of time. People who >> value "enterprise ready" meaning they run ancient stuff should be perfectly >> fine with numpy 1.4 + scipy 0.8 for a long time from now. > > That's not true in my experience. 2 years ago I used those "ancient" stuff. > >> And otherwise they >> can upgrade python itself quite easily. > > that's not true at all. True, you can update python itself easily, but > updating things like pygtk or pyqt are near impossible without the > support of IT staff (because you generally lacks X11 headers, or gcc > is too old). > >> 2.4 doesn't even get security >> updates anymore. 
> > That's not exactly true - python developers will not give security > updates, but I think RH may still do so. > >> We now support python 2.4-2.6, for the next versions we'll add 2.7, 3.1 and >> 3.2. Supporting more versions has a cost. And it's clear that the amount of >> people running 2.4 from svn is at or close to zero, because recent syntax >> errors for 2.4 have gone unnoticed for a long period. > > People running svn are a small proportion of our userbase. And the 2.4 > support cost is fairly minimal IMO. Certainly, 2.4 support is more > important than 3.x at this point. Hi, I cannot agree more with David. We are still running RHEL 4 on several hundred computers, and we still are heavily using SLES 10.1 which comes with 2.4 (IIRC). Migrating these computers is a lengthy task which is only done every few years (few = 4-5 years for RHEL, we are starting to jump to 5.2 which has... 2.4). Matthieu -- Information System Engineer, Ph.D. Blog: http://matt.eifelle.com LinkedIn: http://www.linkedin.com/in/matthieubrucher From ralf.gommers at googlemail.com Sun Jul 18 06:44:03 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 18 Jul 2010 18:44:03 +0800 Subject: [SciPy-Dev] signal.freqs/freqz plot keyword In-Reply-To: <76138E52-A30A-4E18-B7D1-13FD9A738BF3@gmail.com> References: <76138E52-A30A-4E18-B7D1-13FD9A738BF3@gmail.com> Message-ID: On Sat, Jul 17, 2010 at 10:09 PM, Ryan May wrote: > On Jul 17, 2010, at 4:38, Ralf Gommers > wrote: > > > freqs/freqz have a keyword plot=None, which if True does this: > > if not plot is None: > > plot(w, h) > > while plot is not even defined. > > If it's not none, it should be a callable passed in, which lets you define > your own plotting function. Actually using matplotlib's plot produces > unexpected results, you end up plotting the real part of the complex > transfer function, not the magnitude. This actually bit me recently and > should probably be documented. > Ah, that makes sense. 
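Ryan's point about plotting the real part can be checked without scipy at all; here is a minimal sketch that evaluates a transfer function directly (the two-tap averaging filter is just an illustrative choice, not anything taken from scipy.signal):

```python
import cmath
import math

def freq_response(b, a, w):
    """H(e^{jw}) for transfer-function coefficients b (numerator) and
    a (denominator), evaluated at one frequency w in radians/sample."""
    z = cmath.exp(-1j * w)
    num = sum(bk * z ** k for k, bk in enumerate(b))
    den = sum(ak * z ** k for k, ak in enumerate(a))
    return num / den

# Two-tap moving average, evaluated at a quarter of the sampling rate.
h = freq_response([0.5, 0.5], [1.0], math.pi / 2)   # h = 0.5 - 0.5j
print(h.real)   # 0.5     -- what a naive plot callable would show
print(abs(h))   # ~0.707  -- the magnitude response one usually wants
```

Since h is complex, a callable passed as `plot` has to take abs(h) (and usually the phase) itself; handing matplotlib's plot in directly silently discards the imaginary part.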
Would indeed have been clearer with documentation. I'll add it. > > keyword should be removed or the implementation should be something like: > > if plot: > > try: > > import matplotlib.pyplot as plt > > plt.plot(w, h) > > plt.show() > > except ImportError: > > warnings.warn("`plot` is True, but can't import > matplotlib.pyplot.") > > No, you can use any plotting library right now. > > > > Removal makes more sense I think, since scipy does not depend on > matplotlib. And why provide a plot keyword in these functions and not in > many others where it would also make sense? > > It's mimicking MATLAB functionality, which will plot the transfer function > if you call freqz and don't save the results. > > Looks like it's been part of the signature since the functions were added, so never mind removing it. But I still find it very inconsistent and odd-looking. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
> > These are the minimal accuracy changes needed: > > --- /sw/lib/python2.6/site-packages/scipy/special/tests/test_data.py > 2010-07-11 17:25:24.000000000 +0200 > +++ test_data.py 2010-07-17 21:13:07.000000000 +0200 > @@ -85,7 +85,7 @@ > data(gamma, 'test_gamma_data_ipp-near_1', 0, 1), > data(gamma, 'test_gamma_data_ipp-near_2', 0, 1), > data(gamma, 'test_gamma_data_ipp-near_m10', 0, 1), > - data(gamma, 'test_gamma_data_ipp-near_m55', 0, 1), > + data(gamma, 'test_gamma_data_ipp-near_m55', 0, 1, rtol=7e-12), > data(gamma, 'test_gamma_data_ipp-near_0', 0j, 1, rtol=2e-9), > data(gamma, 'test_gamma_data_ipp-near_1', 0j, 1, rtol=2e-9), > data(gamma, 'test_gamma_data_ipp-near_2', 0j, 1, rtol=2e-9), > --- /sw/lib/python2.6/site-packages/scipy/special/tests/test_basic.py > 2010-07-11 17:25:24.000000000 +0200 > +++ test_basic.py 2010-07-17 21:28:07.000000000 +0200 > @@ -1618,7 +1618,7 @@ > > # Most error apparently comes from AMOS and not our > implementation; > # there are some problems near integer orders there > - assert dc[k] < 1e-9, (v[k], x[k], iv(v[k], x[k]), iv(v[k], > x[k]+0j)) > + assert dc[k] < 1.9e-7, (v[k], x[k], iv(v[k], x[k]), iv(v[k], > x[k]+0j)) > > def test_kv_cephes_vs_amos(self): > #self.check_cephes_vs_amos(kv, kn, rtol=1e-9, atol=1e-305) > > Of course nearly 2e-7 seems a quite high tolerance - don't know if > that's acceptable. > > That does seem a little high. I guess Pauli has to decide whether this qualifies as a bug or not. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ralf.gommers at googlemail.com Sun Jul 18 10:40:32 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 18 Jul 2010 22:40:32 +0800 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: On Sun, Jul 18, 2010 at 2:21 PM, Matthieu Brucher < matthieu.brucher at gmail.com> wrote: > 2010/7/17 David Cournapeau : > > On Sat, Jul 17, 2010 at 2:25 PM, Ralf Gommers > > wrote: > > > >> We now support python 2.4-2.6, for the next versions we'll add 2.7, 3.1 > and > >> 3.2. Supporting more versions has a cost. And it's clear that the amount > of > >> people running 2.4 from svn is at or close to zero, because recent > syntax > >> errors for 2.4 have gone unnoticed for a long period. > > > > People running svn are a small proportion of our userbase. And the 2.4 > > support cost is fairly minimal IMO. > I am not the best judge of how much the cost is, but this thread started because of a patch that couldn't go in, and Charles' question on Py_ssize_t is as yet unanswered. Certainly, 2.4 support is more > > important than 3.x at this point. > > That's debatable. 2.4 users have perfectly good releases right now (plus numpy 1.5 / scipy 0.8 coming up), 3.x users have nothing. > I cannot agree more with David. We are still running RHEL 4 on several > hundred computers, and we still are heavily using SLES 10.1 which > comes with 2.4 (IIRC). Migrating these computers is a lengthy task > which is only done every few years (few = 4-5 years for RHEL, we are > starting to jump to 5.2 which has... 2.4). > > So is your point you're still running 2.4 in 4 years' time, therefore support for it cannot be dropped for 4 more years? Or when does it become okay for you to do so? Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From cournape at gmail.com Sun Jul 18 12:33:26 2010 From: cournape at gmail.com (David Cournapeau) Date: Sun, 18 Jul 2010 18:33:26 +0200 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: On Sun, Jul 18, 2010 at 4:40 PM, Ralf Gommers wrote: > > > On Sun, Jul 18, 2010 at 2:21 PM, Matthieu Brucher > wrote: >> >> 2010/7/17 David Cournapeau : >> > On Sat, Jul 17, 2010 at 2:25 PM, Ralf Gommers >> > wrote: >> > >> >> We now support python 2.4-2.6, for the next versions we'll add 2.7, 3.1 >> >> and >> >> 3.2. Supporting more versions has a cost. And it's clear that the >> >> amount of >> >> people running 2.4 from svn is at or close to zero, because recent >> >> syntax >> >> errors for 2.4 have gone unnoticed for a long period. >> > >> > People running svn are a small proportion of our userbase. And the 2.4 >> > support cost is fairly minimal IMO. > > I am not the best judge of how much the cost is, but this thread started > because of a patch that couldn't go in, and Charles question on Py_ssize_t > is as yet unanswered. I don't see why we could not add a support function to parse those values. We need similar functions anyway to deal with size_t and co correctly. > That's debatable. 2.4 users have perfectly good releases right now (plus > numpy 1/5 / scipy 0.8 coming up), 3.x users have nothing. I meant that we have more 2.4 users than 3.x. It is really a cost/benefit analysis: the advantages of 2.5 seem rather trivial to me compared to losing support for quite common platforms. David From rcsqtc at iqac.csic.es Mon Jul 19 05:35:24 2010 From: rcsqtc at iqac.csic.es (Ramon Crehuet) Date: Mon, 19 Jul 2010 11:35:24 +0200 Subject: [SciPy-Dev] f2py: questions on array arguments Message-ID: <4C441C5C.3020205@iqac.csic.es> An HTML attachment was scrubbed... 
URL: From sturla at molden.no Mon Jul 19 09:02:50 2010 From: sturla at molden.no (Sturla Molden) Date: Mon, 19 Jul 2010 15:02:50 +0200 Subject: [SciPy-Dev] f2py: questions on array arguments In-Reply-To: <4C441C5C.3020205@iqac.csic.es> References: <4C441C5C.3020205@iqac.csic.es> Message-ID: <4C444CFA.9070809@molden.no> Ramon Crehuet skrev: > > 1. The following fortran function: > > function f2(x,y) > implicit none > real,intent(in):: x,y > real, dimension(3):: f2 > f2(1)=x+y**2 > f2(2)=sin(x*y) > f2(3)=2*x-y > end function f2 > > gives a segmentation fault when called from python if it is not in a > fortran module. If it is contained in a fortran module, it works fine > and returns an array. That makes sense because fortran modules > automatically generate an interface. However, I don't see that reflected > in the .pyf files generated by f2py. So, is there a way to "correct" the > function outside the module to work with f2py? Wrap it in a subroutine f2wrap(x,y,output) and call that. C and Fortran interop (which is what f2py does) can only go so far. C has no concept of functions returning arrays. The Fortran function must know where to write the return values, but how can you provide that information from C? So Fortran and C have different calling conventions here. And the reason for this working inside a module is that f2py must write a Fortran wrapper for the module. With modules there are name mangling and stuff going on that are not interoperable with C. But then you have a situation where f2 is called from Fortran, so it will not segfault. > > 2. I have read in several posts that automatic arrays do not work with > f2py. So that something like: > > real function trace(m) > real, dimension(:,:), intent(in) :: m > > Has to be converted into: > > real function trace2(m,n) > integer, intent(in) :: n > !f2py integer,intent(hide),depend(m) :: n=shape(m,0) > real, dimension(n,n), intent(in) :: m > Here you have a C vs. Fortran incompatibility again. 
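From the Python side, the effect of the intent(hide) line in the trace2 signature can be sketched in pure Python (the function names below are hypothetical; the real work is of course done by the compiled f2py-generated wrapper):

```python
def _trace2_core(m, n):
    """Stand-in for the compiled Fortran trace2(m, n): n arrives explicitly."""
    return sum(m[i][i] for i in range(n))

def trace2(m):
    """Stand-in for the f2py-generated wrapper: n is hidden from the caller
    and computed as shape(m, 0), mirroring the .pyf directive."""
    n = len(m)
    if any(len(row) != n for row in m):
        raise ValueError("m must be square (n x n)")
    return _trace2_core(m, n)

print(trace2([[1.0, 2.0], [3.0, 4.0]]))   # 5.0 -- the caller never passes n
```

The Fortran routine still receives n; it has simply been moved out of the Python-visible signature.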
An assumed-shape array like "m" in "trace" is actually a dope vector, i.e. a C struct not different from NumPy's PyArrayObject (though the binary layout is compiler dependent, and incompatible with NumPy). So we cannot pass a C pointer to "trace" and hope that it will work. But if we do know the C definition of the Fortran compiler's dope vector, we can fill in the fields from a PyArrayObject and pass a pointer to the struct (there is a C library that attempts to do that, but I don't recall the name). But in "trace2", the shape of "m" is explicit. Here Fortran will assume m is just a C pointer to the first element. Fortran 77 compilers usually treat explicit and assumed-shape arrays differently for interoperability with C and compatibility with other F77 compilers. So in "trace2", we can pass a C pointer to the first element of "m", whereas in "trace" we cannot. This difference is not required by the standard though. A Fortran 90 compiler could use only dope vectors if it wanted to. (There are other non-standard ways of interfacing with C such as Cray pointers.) So this is a bit of murky territory. With Fortran 2003 we finally have a portable Fortran to C interface. Not all Fortran compilers implement this; gfortran does, whereas my favourite compiler Absoft does not. Sturla From sturla at molden.no Mon Jul 19 10:07:56 2010 From: sturla at molden.no (Sturla Molden) Date: Mon, 19 Jul 2010 16:07:56 +0200 Subject: [SciPy-Dev] f2py: questions on array arguments In-Reply-To: <4C441C5C.3020205@iqac.csic.es> References: <4C441C5C.3020205@iqac.csic.es> Message-ID: <4C445C3C.8080803@molden.no> Ramon Crehuet skrev: > real function trace(m) > !f2py integer,depend(m) :: n=shape(m,0) > !f2py real, dimension(n,n), intent(in) :: m > real, dimension(:,:), intent(in) :: m > > But it does not work. Is there a workaround to avoid passing the > dimension of the matrix as a fortran argument? 
If you know your compiler's "dope vector" (known for most compilers, or just ask the vendor), we could implement an ndarray to Fortran 90 array converter in C, and pass a pointer to a dope vector struct. If you don't know this, you will need a Fortran wrapper/proxy function. There are several ways that could work: Either: - Pass in all information about the array (shape and strides), then pass on a Fortran pointer or a slice. Or: - A C utility function could expose the PyArrayObject fields to Fortran. - The Fortran compiler might support C structures as an extension (e.g. Absoft). - Fortran 2003 ISO C bindings: C structures are supported. After the ndarray's meta-information is obtained, from any of those three, you can: - Use non-standard Cray pointers to access the buffer. - Use Fortran 2003 ISO C bindings to access the buffer: c_f_pointer converts a C pointer to a Fortran pointer. There is actually a Cython-related project called 'fwrap' that works somewhat like this. But if you have a Fortran contiguous array, or can create one, f2py will mean much less work. Sturla From sturla at molden.no Mon Jul 19 10:12:06 2010 From: sturla at molden.no (Sturla Molden) Date: Mon, 19 Jul 2010 16:12:06 +0200 Subject: [SciPy-Dev] f2py: questions on array arguments In-Reply-To: <4C444CFA.9070809@molden.no> References: <4C441C5C.3020205@iqac.csic.es> <4C444CFA.9070809@molden.no> Message-ID: <4C445D36.5020900@molden.no> Sturla Molden skrev: > m is just a C pointer to the first element. Fortran 77 compilers usually > treat explicit and assumed-shape arrays differently I meant Fortran 90 compilers, of course. S. From dwf at cs.toronto.edu Mon Jul 19 13:09:45 2010 From: dwf at cs.toronto.edu (David Warde-Farley) Date: Mon, 19 Jul 2010 13:09:45 -0400 Subject: [SciPy-Dev] Required Python Version In-Reply-To: References: Message-ID: <32DB2FE1-40A8-48FF-BB42-792B529D464C@cs.toronto.edu> On 2010-07-18, at 12:33 PM, David Cournapeau wrote: > I meant that we have more 2.4 users than 3.x. 
It is really a > cost/benefit analysis: the advantages of 2.5 seem rather trivial to me > compared to losing support for quite common platforms. I had this discussion with Fernando, Brian and Gael last year in Pasadena, and they came to the conclusion that while there may be people still running crufty versions of Python, it's dubious that the same users would be installing bleeding edge IPython... The relative difficulty in building NumPy and SciPy makes me wonder whether people on these 2.4 platforms are even building from more recent tarballs, never mind SVN. What's the most recent version of NumPy or SciPy that's packaged for RHEL by the vendor? David From pav at iki.fi Mon Jul 19 13:18:22 2010 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 19 Jul 2010 17:18:22 +0000 (UTC) Subject: [SciPy-Dev] [SciPy-User] ANN: scipy 0.8.0 release candidate 3 References: <4498D491-50AB-4578-BDAF-D91477AE91DA@astro.physik.uni-goettingen.de> Message-ID: Sun, 18 Jul 2010 22:22:43 +0800, Ralf Gommers wrote: [clip] > > assert dc[k] < 1e-9, (v[k], x[k], iv(v[k], x[k]), iv(v[k], > > x[k]+0j)) > > AssertionError: (-1.685450377317264, 1.2676128629915242, > > 4.748141012443594e-05, (4.7481418802688318e-05+0j)) > > > > relative error <~ 2e-7 > > That does seem a little high. I guess Pauli has to decide whether this > qualifies as a bug or not. Works for me. No idea how to fix this, since it probably is compiler specific. On the other hand, that's pretty close to a zero of Iv, where there is cancellation error (since v < 0 we use a reflection identity), so it's probably a difficult spot anyway. Maybe the threshold could be made platform-dependent -- I wouldn't like to bump it if it fails only on a single platform/compiler combination. 
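For concreteness, the ~2e-7 figure can be reproduced from the two values in the assertion message; a quick sketch of the relative-error arithmetic (and of why the proposed 1.9e-7 bound just covers it):

```python
def rel_err(approx, exact):
    """Relative error, as used in the scipy.special tolerance checks."""
    return abs(approx - exact) / abs(exact)

# The two values from the AssertionError above: iv(v, x) vs. iv(v, x + 0j).
real_path = 4.748141012443594e-05
complex_path = 4.7481418802688318e-05

err = rel_err(real_path, complex_path)
print(err)   # ~1.8e-07: fails the old dc[k] < 1e-9 bound, passes < 1.9e-7
```

Both results are tiny because the point sits near a zero of Iv, so even a small absolute discrepancy from the cancellation in the reflection identity shows up as a large relative error.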
Pauli From derek at astro.physik.uni-goettingen.de Mon Jul 19 13:51:35 2010 From: derek at astro.physik.uni-goettingen.de (Derek Homeier) Date: Mon, 19 Jul 2010 19:51:35 +0200 Subject: [SciPy-Dev] [SciPy-User] ANN: scipy 0.8.0 release candidate 3 In-Reply-To: References: <4498D491-50AB-4578-BDAF-D91477AE91DA@astro.physik.uni-goettingen.de> Message-ID: On 18.07.2010, at 12:44AM, Vincent Davis wrote: >> well, as I said I did try the gcc-4.2 build, but probably missed that >> Vincent was compiling numpy trunk - >> I probably should try again on my 10.6 64-bit installation. > For you information > Python version 2.7.0+ (release27-maint:82653, Jul 8 2010, 14:45:18) > [GCC 4.2.1 (Apple Inc. build 5659)] > > Build commands used > python 2.7 > ./configure --with-universal-archs=64-bit > --enable-universalsdk=/Developer/SDKs/MacOSX10.5.sdk > --enable-framework > > for numpy and scipy I used > LDFLAGS="-arch x86_64" FFLAGS="-arch x86_64" py27 setupscons.py scons thanks for the pointer, but the crucial point was in fact, as Ralf mentioned, using the numpy-1.4.1 release - after I upgraded to numpy-2.0.0.dev8509 from svn, both numpy and scipy tests passed simply building with the defaults (which amount to 64bit compile on my OS X 10.6 installation anyway, I believe). Cheers, Derek From cournape at gmail.com Mon Jul 19 13:54:38 2010 From: cournape at gmail.com (David Cournapeau) Date: Mon, 19 Jul 2010 19:54:38 +0200 Subject: [SciPy-Dev] Required Python Version In-Reply-To: <32DB2FE1-40A8-48FF-BB42-792B529D464C@cs.toronto.edu> References: <32DB2FE1-40A8-48FF-BB42-792B529D464C@cs.toronto.edu> Message-ID: On Mon, Jul 19, 2010 at 7:09 PM, David Warde-Farley wrote: > > I had this discussion with Fernando, Brian and Gael last year in Pasadena, and they came to the conclusion that while there may be people still running crufty versions of Python, it's dubious that the same users would be installing bleeding edge IPython... 
The relative difficulty in building NumPy and SciPy makes me wonder whether people on these 2.4 platforms are even building from more recent tarballs, never mind SVN.

Building numpy and scipy is difficult, but it is nothing compared to having to build pygtk/pyqt, etc... which is necessary if you build your own python. A lot of clusters and co run on RHEL/Centos, for example. Building numpy/scipy by yourself is doable, a whole python toolset not so much.

> What's the most recent version of NumPy or SciPy that's packaged for RHEL by the vendor?

They are ancient IIRC - when I worked in a company which used RHEL, it was useless, I compiled atlas/numpy/scipy by myself. Knowing how many people depend on python 2.4 support is not possible, so we are a bit in the dark. But again, as mentioned by Pauli already, supporting 2.4 instead of 2.5 is not that difficult - the cost is really low. And as a data point, a lot of heavily used libraries also support python 2.4.

David

From njs at pobox.com Mon Jul 19 15:50:04 2010
From: njs at pobox.com (Nathaniel Smith)
Date: Mon, 19 Jul 2010 12:50:04 -0700
Subject: [SciPy-Dev] Required Python Version
In-Reply-To: References: <32DB2FE1-40A8-48FF-BB42-792B529D464C@cs.toronto.edu>
Message-ID: 

On Mon, Jul 19, 2010 at 10:54 AM, David Cournapeau wrote:
> Building numpy and scipy is difficult, but it is nothing compared to
> having to build pygtk/pyqt, etc... which is necessary if you build
> your own python.

I'm no longer in this position (our cluster runs Debian stable instead of RHEL, so the release of Lenny was the magic moment for me), but I can affirm that I've had this problem too -- installing python is really easy, getting scipy/numpy built from source not hard at all (I think easy_install Just Worked, even if I probably gave up something in terms of using the fastest possible math libraries), but the graphics toolkits start putting you in dependency hell and it's a huge pain.
-- Nathaniel

From 14772361 at sun.ac.za Mon Jul 19 16:27:57 2010
From: 14772361 at sun.ac.za (Hamman, RA, Mr <14772361@sun.ac.za>)
Date: Mon, 19 Jul 2010 22:27:57 +0200
Subject: [SciPy-Dev] Introducing myself...
Message-ID: <63F358F0ED08BD40AADA9D735420310F029AF3CF387A@STBEVS08.stb.sun.ac.za>

Hi all,

I thought I'd introduce myself. I'm a mechanical engineering master's student at Stellenbosch University, in South Africa. I work in the optimization world and am interested in multi-physics problems and the use of nonlinear approximations in place of complex engineering analyses.

I use scipy all the time for my studies and I'm interested in getting involved and helping out where I can. I thought I'd lurk a bit, and then jump in. I look forward to learning from all of you and contributing where I can.

Regards,
Richard

-------------- next part --------------
A non-text attachment was scrubbed...
Name: winmail.dat
Type: application/ms-tnef
Size: 2930 bytes
Desc: not available
URL: 

From josh.holbrook at gmail.com Mon Jul 19 16:39:43 2010
From: josh.holbrook at gmail.com (Joshua Holbrook)
Date: Mon, 19 Jul 2010 12:39:43 -0800
Subject: [SciPy-Dev] Introducing myself...
In-Reply-To: <63F358F0ED08BD40AADA9D735420310F029AF3CF387A@STBEVS08.stb.sun.ac.za>
References: <63F358F0ED08BD40AADA9D735420310F029AF3CF387A@STBEVS08.stb.sun.ac.za>
Message-ID: 

On Mon, Jul 19, 2010 at 12:27 PM, Hamman, RA, Mr <14772361 at sun.ac.za> <14772361 at sun.ac.za> wrote:
> Hi all,
>
> I thought I'd introduce myself. I'm a mechanical engineering master's
> student at Stellenbosch University, in South Africa. I work in the
> optimization world and am interested in multi-physics problems and the
> use of nonlinear approximations in place of complex engineering
> analyses.
>
> I use scipy all the time for my studies and I'm interested in getting
> involved and helping out where I can. I thought I'd lurk a bit, and then
> jump in.
> I look forward to learning from all of you and contributing
> where I can.
>
> Regards,
> Richard
>
> _______________________________________________
> SciPy-Dev mailing list
> SciPy-Dev at scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-dev
>

Do you know Stefan, then? I'm also an ME masters student, but on the complete opposite side of the globe. XD

--Josh

From d.l.goldsmith at gmail.com Mon Jul 19 16:50:00 2010
From: d.l.goldsmith at gmail.com (David Goldsmith)
Date: Mon, 19 Jul 2010 13:50:00 -0700
Subject: [SciPy-Dev] Introducing myself...
In-Reply-To: <63F358F0ED08BD40AADA9D735420310F029AF3CF387A@STBEVS08.stb.sun.ac.za>
References: <63F358F0ED08BD40AADA9D735420310F029AF3CF387A@STBEVS08.stb.sun.ac.za>
Message-ID: 

Hi, Richard, and thanks! Right now where we need the most help is in documenting SciPy; if this interests you, please visit docs.scipy.org/numpy (yes, that's numpy, but everything there is applicable to editing the SciPy docs as well) to get oriented, then register an account as instructed there and post your username back here and one of us will get you editing permissions.

Thanks again,

David Goldsmith
Olympia, WA

On Mon, Jul 19, 2010 at 1:27 PM, Hamman, RA, Mr <14772361 at sun.ac.za> <14772361 at sun.ac.za> wrote:
> Hi all,
>
> I thought I'd introduce myself. I'm a mechanical engineering master's
> student at Stellenbosch University, in South Africa. I work in the
> optimization world and am interested in multi-physics problems and the
> use of nonlinear approximations in place of complex engineering
> analyses.
>
> I use scipy all the time for my studies and I'm interested in getting
> involved and helping out where I can. I thought I'd lurk a bit, and then
> jump in. I look forward to learning from all of you and contributing
> where I can.
> > Regards, > Richard > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... URL: From thouis at broadinstitute.org Mon Jul 19 17:04:57 2010 From: thouis at broadinstitute.org (Thouis (Ray) Jones) Date: Mon, 19 Jul 2010 17:04:57 -0400 Subject: [SciPy-Dev] small patch to scipy.ndimage.measurements Message-ID: Lee Kamentsky noticed that ndimage.measurements has a too strict test in _stats() and _select() that pushes it off the fast path when the label matrix is a uint instead of an int. I've filed a patch with ticket 1242 (which I can't get to right now, for some reason). I include the patch below, as well: Index: measurements.py =================================================================== --- measurements.py (revision 6630) +++ measurements.py (working copy) @@ -298,7 +298,7 @@ # remap labels to unique integers if necessary, or if the largest # label is larger than the number of values. - if ((not numpy.issubdtype(labels.dtype, numpy.int)) or + if ((not numpy.issubdtype(labels.dtype, (numpy.int, numpy.uint))) or (labels.min() < 0) or (labels.max() > labels.size)): unique_labels, new_labels = numpy.unique1d(labels, return_inverse=True) @@ -465,7 +465,7 @@ # remap labels to unique integers if necessary, or if the largest # label is larger than the number of values. 
- if ((not numpy.issubdtype(labels.dtype, numpy.int)) or + if ((not numpy.issubdtype(labels.dtype, (numpy.int, numpy.uint))) or (labels.min() < 0) or (labels.max() > labels.size)): # remap labels, and indexes unique_labels, labels = numpy.unique1d(labels, return_inverse=True) From d.l.goldsmith at gmail.com Mon Jul 19 20:03:23 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Mon, 19 Jul 2010 17:03:23 -0700 Subject: [SciPy-Dev] New Wiki page Message-ID: Hi, all! As per a suggestion, I've added a "Who's Who"of "permission granters, etc." for our Wikiland to the doc wiki. If you have any manner of administrative capability on the Wiki, please visit the page and, if you please, edit the table to include yourself and your capabilities. Thanks! DG -------------- next part -------------- An HTML attachment was scrubbed... URL: From ben.root at ou.edu Mon Jul 19 20:13:46 2010 From: ben.root at ou.edu (Benjamin Root) Date: Mon, 19 Jul 2010 19:13:46 -0500 Subject: [SciPy-Dev] New Wiki page In-Reply-To: References: Message-ID: On Mon, Jul 19, 2010 at 7:03 PM, David Goldsmith wrote: > Hi, all! As per a suggestion, I've added a "Who's Who"of "permission granters, etc." for our Wikiland to the doc wiki. If you > have any manner of administrative capability on the Wiki, please visit the > page and, if you please, edit the table to include yourself and your > capabilities. Thanks! > > DG > > David, Just to be clear, this is *only* for those who can grant permissions to others, right? Ben Root -------------- next part -------------- An HTML attachment was scrubbed... URL: From d.l.goldsmith at gmail.com Mon Jul 19 20:58:28 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Mon, 19 Jul 2010 17:58:28 -0700 Subject: [SciPy-Dev] New Wiki page In-Reply-To: References: Message-ID: On Mon, Jul 19, 2010 at 5:13 PM, Benjamin Root wrote: > On Mon, Jul 19, 2010 at 7:03 PM, David Goldsmith wrote: > >> Hi, all! 
As per a suggestion, I've added a "Who's Who"of "permission granters, etc." for our Wikiland to the doc wiki. If you >> have any manner of administrative capability on the Wiki, please visit the >> page and, if you please, edit the table to include yourself and your >> capabilities. Thanks! >> >> DG >> >> > David, > > Just to be clear, this is *only* for those who can grant permissions to > others, right? > Visit the page (it's open to everybody): if you fall into any of the displayed categories, please add yourself. (For your convenience, the categories are: Edit permissions granter, Review permissions granter, Proof permissions granter, Merger, Superuser, and pydocweb admin. DG > > Ben Root > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -- Mathematician: noun, someone who disavows certainty when their uncertainty set is non-empty, even if that set has measure zero. Hope: noun, that delusive spirit which escaped Pandora's jar and, with her lies, prevents mankind from committing a general suicide. (As interpreted by Robert Graves) -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsouthey at gmail.com Tue Jul 20 10:20:27 2010 From: bsouthey at gmail.com (Bruce Southey) Date: Tue, 20 Jul 2010 09:20:27 -0500 Subject: [SciPy-Dev] Python2.4 test failures with scipy SVN and rc3 Message-ID: <4C45B0AB.9010900@gmail.com> Hi, There are a few tests failing with Python2.4 under Linux 64-bit: $ python2.4 Python 2.4.5 (#1, Oct 6 2008, 09:54:35) [GCC 4.3.2 20080917 (Red Hat 4.3.2-4)] on linux2 Type "help", "copyright", "credits" or "license" for more information. 
>>> import numpy as np >>> import scipy as sp >>> np.__version__ '2.0.0.dev8391' >>> sp.__version__ '0.9.0.dev6630' One is due to use of the 'functools' module introduced in Python2.5 so I added the same workaround as scipy.io.arff http://projects.scipy.org/scipy/ticket/1244 Three are the same error in 'scipy/io/netcdf.py' file: http://projects.scipy.org/scipy/ticket/1243 Can the test 'test_decomp.test_lapack_misaligned' be set to known as this is ticket 1152 http://projects.scipy.org/scipy/ticket/1152 I do not get these errors under Python2.6. Bruce ====================================================================== ERROR: Failure: ImportError (No module named functools) ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/nose/loader.py", line 363, in loadTestsFromName module = self.importer.importFromPath( File "/usr/local/lib/python2.4/site-packages/nose/importer.py", line 39, in importFromPath return self.importFromDir(dir_path, fqname) File "/usr/local/lib/python2.4/site-packages/nose/importer.py", line 84, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "/usr/local/lib/python2.4/site-packages/scipy/io/matlab/tests/test_mio.py", line 11, in ? 
from functools import partial ImportError: No module named functools ====================================================================== ERROR: test suite ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/nose/suite.py", line 154, in run self.setUp() File "/usr/local/lib/python2.4/site-packages/nose/suite.py", line 180, in setUp if not self: File "/usr/local/lib/python2.4/site-packages/nose/suite.py", line 65, in __nonzero__ test = self.test_generator.next() File "/usr/local/lib/python2.4/site-packages/nose/loader.py", line 221, in generate for test in g(): File "/usr/local/lib/python2.4/site-packages/scipy/io/tests/test_netcdf.py", line 52, in test_read_write_files f = netcdf_file('simple.nc') File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 182, in __init__ self._read() File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 411, in _read self._read_var_array() File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 451, in _read_var_array (name, dimensions, shape, attributes, File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 533, in _read_var dimname = self._dims[dimid] TypeError: list indices must be integers ====================================================================== ERROR: test suite ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/nose/suite.py", line 154, in run self.setUp() File "/usr/local/lib/python2.4/site-packages/nose/suite.py", line 180, in setUp if not self: File "/usr/local/lib/python2.4/site-packages/nose/suite.py", line 65, in __nonzero__ test = self.test_generator.next() File "/usr/local/lib/python2.4/site-packages/nose/loader.py", line 221, in generate for test in g(): File "/usr/local/lib/python2.4/site-packages/scipy/io/tests/test_netcdf.py", line 91, in 
test_read_write_sio f2 = netcdf_file(eg_sio2) File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 182, in __init__ self._read() File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 411, in _read self._read_var_array() File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 451, in _read_var_array (name, dimensions, shape, attributes, File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 533, in _read_var dimname = self._dims[dimid] TypeError: list indices must be integers ====================================================================== ERROR: test_netcdf.test_read_example_data ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/nose/case.py", line 182, in runTest self.test(*self.arg) File "/usr/local/lib/python2.4/site-packages/scipy/io/tests/test_netcdf.py", line 119, in test_read_example_data f = netcdf_file(fname, 'r') File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 182, in __init__ self._read() File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 411, in _read self._read_var_array() File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 451, in _read_var_array (name, dimensions, shape, attributes, File "/usr/local/lib/python2.4/site-packages/scipy/io/netcdf.py", line 533, in _read_var dimname = self._dims[dimid] TypeError: list indices must be integers ====================================================================== ERROR: test_decomp.test_lapack_misaligned(, (array([[ 1.734e-255, 8.189e-217, 4.025e-178, 1.903e-139, 9.344e-101, ---------------------------------------------------------------------- Traceback (most recent call last): File "/usr/local/lib/python2.4/site-packages/nose/case.py", line 182, in runTest self.test(*self.arg) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/tests/test_decomp.py", line 1074, in 
check_lapack_misaligned func(*a,**kwargs) File "/usr/local/lib/python2.4/site-packages/scipy/linalg/basic.py", line 49, in solve a1, b1 = map(asarray_chkfinite,(a,b)) File "/usr/local/lib/python2.4/site-packages/numpy/lib/function_base.py", line 527, in asarray_chkfinite raise ValueError( ValueError: array must not contain infs or NaNs ---------------------------------------------------------------------- Ran 4472 tests in 79.999s FAILED (KNOWNFAIL=11, SKIP=40, errors=5) -------------- next part -------------- An HTML attachment was scrubbed... URL: From d.l.goldsmith at gmail.com Tue Jul 20 15:13:03 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Tue, 20 Jul 2010 12:13:03 -0700 Subject: [SciPy-Dev] ***Reviewer recruitment*** Message-ID: OK, my apologies, I "buried" this very important news in an email w/ the vague subject "Skypecon tomorrow: important agenda item!": Vincent Davis, to whom we owe much gratitude, is nearing completion of adding the two-review infrastructure to the doc Wiki! Accordingly, it is time to start actively recruiting technical and presentation reviewers. As announced previously (in an email different from the one referred to above) I have added a "Reviewer Standards and Recruitment Discussion" (RSRD) page to the Wiki, and have since started it off w/ some comments on reviewer standards proffered by Joe H.; if you have any opinions on reviewer recruitment--be they on standards, procedures, or "other"--please voice them. To get the ball rolling: it has been suggested in the past (I forget by whom specifically) that we proceed by issuing on the lists (numpy-discussion, scipy-user, scipy-dev, and astropy) a formal recruitment announcement w/ the qualifications expected and desired of both types of reviewers, as well as the application procedure and a description of how reviewers will be selected. Joe has furnished us w/ some of the first on the RSRD page, but presently we are lacking specification of the latter two. 
I could simply draft such a recruitment announcement and put it up here for discussion, but there's (at least) one problem w/ that scenario: IMO, review and acceptance of applicants should be by a committee of at least two, preferably three "prominent" community members who, either by choice or due to, say, having been significant contributors of text to the docstrings, will not themselves be reviewer candidates, and at this time, no such committee exists (nor is there any expressed consensus w/ this applicant evaluation model). So, ideally, this email is a request that we begin a discussion and elaboration of my proposed applicant evaluation model, or at a minimum, a request for volunteers to step forward to be on a "Reviewer Applicant Review and Acceptance Committee." Thanks for your time and consideration, David Goldsmith Olympia, WA -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Wed Jul 21 10:19:09 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 21 Jul 2010 22:19:09 +0800 Subject: [SciPy-Dev] Python2.4 test failures with scipy SVN and rc3 In-Reply-To: <4C45B0AB.9010900@gmail.com> References: <4C45B0AB.9010900@gmail.com> Message-ID: On Tue, Jul 20, 2010 at 10:20 PM, Bruce Southey wrote: > Hi, > There are a few tests failing with Python2.4 under Linux 64-bit: > $ python2.4 > Python 2.4.5 (#1, Oct 6 2008, 09:54:35) > [GCC 4.3.2 20080917 (Red Hat 4.3.2-4)] on linux2 > Type "help", "copyright", "credits" or "license" for more information. > >>> import numpy as np > >>> import scipy as sp > >>> np.__version__ > '2.0.0.dev8391' > >>> sp.__version__ > '0.9.0.dev6630' > > One is due to use of the 'functools' module introduced in Python2.5 so I > added the same workaround as scipy.io.arff > http://projects.scipy.org/scipy/ticket/1244 > > Applied in trunk in r6632. Will do the same for 0.8.0. 
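(Editorial aside: the Python 2.4 workaround referred to above boils down to falling back to a hand-rolled partial when the functools import fails. A minimal sketch is shown below; it is hypothetical and may differ in detail from the actual scipy.io.arff code.)

```python
# Hypothetical sketch of a functools.partial fallback for Python 2.4;
# on Python >= 2.5 the real functools.partial is used unchanged.
try:
    from functools import partial
except ImportError:
    # Python < 2.5 has no functools: emulate the subset needed.
    def partial(func, *args, **kwargs):
        def wrapper(*more_args, **more_kwargs):
            merged = kwargs.copy()
            merged.update(more_kwargs)
            return func(*(args + more_args), **merged)
        return wrapper

def add(a, b, c):
    return a + b + c

add_one_two = partial(add, 1, 2)
print(add_one_two(4))  # 7
```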
> Three are the same error in 'scipy/io/netcdf.py' file: > http://projects.scipy.org/scipy/ticket/1243 > No idea. There are a bunch of other tickets for netcdf, it could use a makeover. > > Can the test 'test_decomp.test_lapack_misaligned' be set to known as this > is ticket 1152 > http://projects.scipy.org/scipy/ticket/1152 > Marked as known in r6633. Thanks for testing, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsouthey at gmail.com Wed Jul 21 11:10:41 2010 From: bsouthey at gmail.com (Bruce Southey) Date: Wed, 21 Jul 2010 10:10:41 -0500 Subject: [SciPy-Dev] Python2.4 test failures with scipy SVN and rc3 In-Reply-To: References: <4C45B0AB.9010900@gmail.com> Message-ID: <4C470DF1.5080603@gmail.com> On 07/21/2010 09:19 AM, Ralf Gommers wrote: > > > On Tue, Jul 20, 2010 at 10:20 PM, Bruce Southey > wrote: > > Hi, > There are a few tests failing with Python2.4 under Linux 64-bit: > $ python2.4 > Python 2.4.5 (#1, Oct 6 2008, 09:54:35) > [GCC 4.3.2 20080917 (Red Hat 4.3.2-4)] on linux2 > Type "help", "copyright", "credits" or "license" for more information. > >>> import numpy as np > >>> import scipy as sp > >>> np.__version__ > '2.0.0.dev8391' > >>> sp.__version__ > '0.9.0.dev6630' > > One is due to use of the 'functools' module introduced in > Python2.5 so I added the same workaround as scipy.io.arff > http://projects.scipy.org/scipy/ticket/1244 > > Applied in trunk in r6632. Will do the same for 0.8.0. > > Three are the same error in 'scipy/io/netcdf.py' file: > http://projects.scipy.org/scipy/ticket/1243 > > > No idea. There are a bunch of other tickets for netcdf, it could use a > makeover. > > > Can the test 'test_decomp.test_lapack_misaligned' be set to known > as this is ticket 1152 > http://projects.scipy.org/scipy/ticket/1152 > > > Marked as known in r6633. 
> > Thanks for testing, > Ralf > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > I tracked down the netcdf error is one of indexing with Python2.4 with numpy integers. So while int64 is okay (as it should be) but most of the other integer types fail. I have added this to the the ticket. >>> import numpy as np >>> alist=['1','4','6'] >>> alist[np.int64(0)] '1' >>> alist[np.int32(0)] Traceback (most recent call last): File "", line 1, in ? TypeError: list indices must be integers >>> alist[np.int8(0)] Traceback (most recent call last): File "", line 1, in ? TypeError: list indices must be integers >>> alist[np.int16(0)] Traceback (most recent call last): File "", line 1, in ? TypeError: list indices must be integers Bruce -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsouthey at gmail.com Wed Jul 21 12:30:47 2010 From: bsouthey at gmail.com (Bruce Southey) Date: Wed, 21 Jul 2010 11:30:47 -0500 Subject: [SciPy-Dev] Python2.4 test failures with scipy SVN and rc3 In-Reply-To: <4C470DF1.5080603@gmail.com> References: <4C45B0AB.9010900@gmail.com> <4C470DF1.5080603@gmail.com> Message-ID: On Wed, Jul 21, 2010 at 10:10 AM, Bruce Southey wrote: > On 07/21/2010 09:19 AM, Ralf Gommers wrote: > > On Tue, Jul 20, 2010 at 10:20 PM, Bruce Southey wrote: >> >> Hi, >> There are a few tests failing with Python2.4 under Linux 64-bit: >> $ python2.4 >> Python 2.4.5 (#1, Oct? 6 2008, 09:54:35) >> [GCC 4.3.2 20080917 (Red Hat 4.3.2-4)] on linux2 >> Type "help", "copyright", "credits" or "license" for more information. 
>> >>> import numpy as np >> >>> import scipy as sp >> >>> np.__version__ >> '2.0.0.dev8391' >> >>> sp.__version__ >> '0.9.0.dev6630' >> >> One is due to use of the 'functools' module introduced in Python2.5 so I >> added the same workaround as scipy.io.arff >> http://projects.scipy.org/scipy/ticket/1244 >> > Applied in trunk in r6632. Will do the same for 0.8.0. > >> >> Three are the same error in 'scipy/io/netcdf.py' file: >> http://projects.scipy.org/scipy/ticket/1243 > > No idea. There are a bunch of other tickets for netcdf, it could use a > makeover. >> >> Can the test 'test_decomp.test_lapack_misaligned' be set to known as this >> is ticket 1152 >> http://projects.scipy.org/scipy/ticket/1152 > > Marked as known in r6633. > > Thanks for testing, > Ralf > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > I tracked down the netcdf error is one of indexing with Python2.4 with numpy > integers.? So while int64 is okay (as it should be) but most of the other > integer types fail. I have added this to the the ticket. > >>>> import numpy as np >>>> alist=['1','4','6'] >>>> alist[np.int64(0)] > '1' >>>> alist[np.int32(0)] > Traceback (most recent call last): > ? File "", line 1, in ? > TypeError: list indices must be integers >>>> alist[np.int8(0)] > Traceback (most recent call last): > ? File "", line 1, in ? > TypeError: list indices must be integers >>>> alist[np.int16(0)] > Traceback (most recent call last): > ? File "", line 1, in ? > TypeError: list indices must be integers > > > Bruce I have just found numpy ticket 911: http://projects.scipy.org/numpy/ticket/911 While I do not know netcdf, I think that the generic int type has been hardcoded to the int32 type. I made a patch (attached to the ticket) that I know is not the correct solution to address the _unpack_int function. 
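(Editorial aside: one possible shape for such a fix, sketched below under the assumption that the helper names mirror scipy.io.netcdf's _pack_int/_unpack_int, is to route the 32-bit header integers through the struct module, which always returns a plain Python int and therefore sidesteps the Python 2.4 list-indexing problem.)

```python
import struct

def unpack_int32(data):
    # netCDF classic files store header integers as 32-bit big-endian;
    # struct.unpack returns a plain Python int, not a numpy scalar.
    return struct.unpack('>i', data[:4])[0]

def pack_int32(value):
    return struct.pack('>i', value)

dims = ['time', 'lat', 'lon']
dimid = unpack_int32(pack_int32(1))
print(dims[dimid])  # 'lat': plain ints always work as list indices
```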
The correct fix is to create _unpack_int and _pack_int to use the system default dtype of int as well as having int32 versions. However, I do not know if that will overcome the Python2.4 indexing issue. Bruce From nwagner at iam.uni-stuttgart.de Wed Jul 21 15:43:16 2010 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Wed, 21 Jul 2010 21:43:16 +0200 Subject: [SciPy-Dev] scipy.test() failures '0.9.0.dev6635' Message-ID: ====================================================================== FAIL: test_mio.test_mat4_3d(, , , {'a': array([[[ 0, 1, 2, 3], ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/nwagner/local/lib64/python2.6/site-packages/nose-0.11.2.dev-py2.6.egg/nose/case.py", line 18 3, in runTest self.test(*self.arg) File "/home/nwagner/local/lib64/python2.6/site-packages/numpy/testing/utils.py", line 982, in assert_rai ses return nose.tools.assert_raises(*args,**kwargs) AssertionError: DeprecationWarning not raised if hasattr(excClass,'__name__'): 'DeprecationWarning' = excClass.'unittest' else: 'DeprecationWarning' = str(excClass) >> raise self.failureException, "%s not raised" % 'DeprecationWarning' ====================================================================== FAIL: test_00_deprecation_warning (test_basic.TestSolveHBanded) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/nwagner/local/lib64/python2.6/site-packages/scipy/linalg/tests/test_basic.py", line 262, in test_00_deprecation_warning assert_raises(DeprecationWarning, solveh_banded, ab, b) File "/home/nwagner/local/lib64/python2.6/site-packages/numpy/testing/utils.py", line 982, in assert_rai ses return nose.tools.assert_raises(*args,**kwargs) AssertionError: DeprecationWarning not raised if hasattr(excClass,'__name__'): 'DeprecationWarning' = excClass.'unittest' else: 'DeprecationWarning' = str(excClass) >> raise self.failureException, "%s not 
raised" % 'DeprecationWarning' ====================================================================== FAIL: line-search Newton conjugate gradient optimization routine ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/nwagner/local/lib64/python2.6/site-packages/scipy/optimize/tests/test_optimize.py", line 177 , in test_ncg assert self.gradcalls == 18, self.gradcalls # 0.8.0 AssertionError: 16 # Ensure that function call counts are 'known good'; these are from # Scipy 0.7.0. Don't allow them to increase. assert self.funccalls == 7, self.funccalls >> assert self.gradcalls == 18, self.gradcalls # 0.8.0 #assert self.gradcalls == 22, self.gradcalls # 0.7.0 ---------------------------------------------------------------------- Ran 4604 tests in 82.164s FAILED (KNOWNFAIL=12, SKIP=28, failures=3) From warren.weckesser at enthought.com Wed Jul 21 17:13:34 2010 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Wed, 21 Jul 2010 16:13:34 -0500 Subject: [SciPy-Dev] scipy.test() failures '0.9.0.dev6635' In-Reply-To: References: Message-ID: <4C4762FE.5010807@enthought.com> Nils Wagner wrote: > ====================================================================== > FAIL: test_mio.test_mat4_3d( 'exceptions.DeprecationWarning'>, object at 0x3081680 > >> , , {'a': array([[[ 0, 1, 2, 3], >> > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/home/nwagner/local/lib64/python2.6/site-packages/nose-0.11.2.dev-py2.6.egg/nose/case.py", > line 18 > 3, in runTest > self.test(*self.arg) > File > "/home/nwagner/local/lib64/python2.6/site-packages/numpy/testing/utils.py", > line 982, in assert_rai > ses > return nose.tools.assert_raises(*args,**kwargs) > AssertionError: DeprecationWarning not raised > if hasattr(excClass,'__name__'): 'DeprecationWarning' > = excClass.'unittest' > else: 'DeprecationWarning' = str(excClass) > >>> raise 
self.failureException, "%s not raised" % 'DeprecationWarning'

These two errors are failures to raise a deprecation warning. Does anyone else get these?

I wrote the test test_00_deprecation_warning in test_basic.py, one of the linalg test modules. In SciPy 0.8 and currently in the trunk, the function solveh_banded is supposed to raise a deprecation warning, because in the final release of 0.9, its return value will be changed.

The tests for solveh_banded work for me, but they might be taking advantage of behavior of nose that is not reliable. In effect, test_00_deprecation_warning assumes that it will execute the first call of solveh_banded, so the deprecation warning will occur. All the other test functions include the line warnings.simplefilter('ignore', category=DeprecationWarning) so they ignore the warning, even if it occurs.

Is there a way to reset whatever internal flag the warnings module has that indicates that a warning has already been issued once? I see that the documentation for the warnings module describes a 'catch_warnings' context manager, but we need to be compatible with Python 2.4.
The 'with' statement appears in 2.5. Warren P.S. It looks like handling of DeprecationWarnings will require a little more work with Python 2.7, where they are ignored by default. From pav at iki.fi Wed Jul 21 17:19:34 2010 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 21 Jul 2010 21:19:34 +0000 (UTC) Subject: [SciPy-Dev] scipy.test() failures '0.9.0.dev6635' References: Message-ID: Wed, 21 Jul 2010 21:43:16 +0200, Nils Wagner wrote: [clip] > ====================================================================== > FAIL: line-search Newton conjugate gradient optimization routine > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/home/nwagner/local/lib64/python2.6/site-packages/scipy/optimize/tests/test_optimize.py", > line 177, in test_ncg > assert self.gradcalls == 18, self.gradcalls # 0.8.0 > AssertionError: 16 > # Ensure that function call counts are 'known good'; > these are from > # Scipy 0.7.0. Don't allow them to increase. assert self.funccalls > == 7, self.funccalls >>> assert self.gradcalls == 18, self.gradcalls # 0.8.0 > #assert self.gradcalls == 22, self.gradcalls # 0.7.0 It seems that the details of how these optimization algorithms perform are sensitive to numerical noise. On my compiler/platform combination, the above does take the 18 iterations. I guess I'll need to pull this test back, or make it more relaxed.
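A relaxed version of that assertion might look like the following (a hypothetical sketch, not the actual test_optimize.py code): keep the exact check for the stable function-call count, and only bound the gradient-call count that drifts with the compiler and platform.

```python
# Hypothetical stand-in for the optimizer bookkeeping in test_ncg;
# the real test tracks self.funccalls / self.gradcalls.
class CallCounts:
    funccalls = 7
    gradcalls = 16   # 18 on some compiler/platform combinations

counts = CallCounts()

# The function-call count is stable across platforms: check it exactly.
assert counts.funccalls == 7, counts.funccalls

# The gradient-call count is sensitive to numerical noise: only bound it
# by the worst count observed on a known-good build.
assert 0 < counts.gradcalls <= 18, counts.gradcalls
```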
-- Pauli Virtanen From warren.weckesser at enthought.com Wed Jul 21 17:22:46 2010 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Wed, 21 Jul 2010 16:22:46 -0500 Subject: [SciPy-Dev] scipy.test() failures '0.9.0.dev6635' In-Reply-To: <4C4762FE.5010807@enthought.com> References: <4C4762FE.5010807@enthought.com> Message-ID: <4C476526.4010509@enthought.com> Warren Weckesser wrote: > Nils Wagner wrote: > >> ====================================================================== >> FAIL: test_mio.test_mat4_3d(> 'exceptions.DeprecationWarning'>, > object at 0x3081680 >> >> >>> , , {'a': array([[[ 0, 1, 2, 3], >>> >>> >> ---------------------------------------------------------------------- >> Traceback (most recent call last): >> File >> "/home/nwagner/local/lib64/python2.6/site-packages/nose-0.11.2.dev-py2.6.egg/nose/case.py", >> line 18 >> 3, in runTest >> self.test(*self.arg) >> File >> "/home/nwagner/local/lib64/python2.6/site-packages/numpy/testing/utils.py", >> line 982, in assert_rai >> ses >> return nose.tools.assert_raises(*args,**kwargs) >> AssertionError: DeprecationWarning not raised >> if hasattr(excClass,'__name__'): 'DeprecationWarning' >> = excClass.'unittest' >> else: 'DeprecationWarning' = str(excClass) >> >> >>>> raise self.failureException, "%s not raised" % 'DeprecationWarning' >>>> >>>> >> ====================================================================== >> FAIL: test_00_deprecation_warning >> (test_basic.TestSolveHBanded) >> ---------------------------------------------------------------------- >> Traceback (most recent call last): >> File >> "/home/nwagner/local/lib64/python2.6/site-packages/scipy/linalg/tests/test_basic.py", >> line 262, in >> test_00_deprecation_warning >> assert_raises(DeprecationWarning, solveh_banded, ab, >> b) >> File >> "/home/nwagner/local/lib64/python2.6/site-packages/numpy/testing/utils.py", >> line 982, in assert_rai >> ses >> return nose.tools.assert_raises(*args,**kwargs) >> AssertionError: 
DeprecationWarning not raised >> if hasattr(excClass,'__name__'): 'DeprecationWarning' >> = excClass.'unittest' >> else: 'DeprecationWarning' = str(excClass) >> >> >>>> raise self.failureException, "%s not raised" % 'DeprecationWarning' >>>> >>>> >> >> > > These two errors are failures to raise a deprecation warning. > Does anyone else get these? > > I wrote the test test_00_deprecation_warning in test_basic.py, > one of the linalg test modules. In SciPy 0.8 and currently > in the trunk, the function solveh_banded is supposed to > raise a deprecation warning, because in the final release > of 0.9, its return value will be changed. > > The tests for solveh_banded work for me, but they might > be taking advantage of behavior of nose that is not > reliable. In effect, test_00_deprecation_warning assumes > that will execute the first call of solveh_banded, so > the deprecation warning will occur. All the other test > functions include the line > warnings.simplefilter('ignore', category=DeprecationWarning) > so they ignore the warning, even if it occurs. > > Is there a way to reset whatever internal flag the > warnings module has that indicates that a warning has > already been issued once? > Partially answering my own question: in the test where I want the warning to occur, I could use a simplefilter to always issue deprecation warnings: warnings.simplefilter('always', category=DeprecationWarning) Warren > I see that the documentation for the warnings module > describes a 'catch_warnings' context manager, but > we need to be compatible with Python 2.4. The 'with' > statement appears in 2.5. > > Warren > > > P.S. It looks like handling of DeprecationWarnings will > require a little more work with Python 2.7, where they > are ignored by default. 
> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From warren.weckesser at enthought.com Wed Jul 21 17:52:07 2010 From: warren.weckesser at enthought.com (Warren Weckesser) Date: Wed, 21 Jul 2010 16:52:07 -0500 Subject: [SciPy-Dev] scipy.test() failures '0.9.0.dev6635' In-Reply-To: <4C476526.4010509@enthought.com> References: <4C4762FE.5010807@enthought.com> <4C476526.4010509@enthought.com> Message-ID: <4C476C07.6060603@enthought.com> Sorry for talking to myself as I learn more about the warnings module... Warren Weckesser wrote: > Warren Weckesser wrote: > >> Nils Wagner wrote: >> >> >>> ====================================================================== >>> FAIL: test_mio.test_mat4_3d(>> 'exceptions.DeprecationWarning'>, >> object at 0x3081680 >>> >>> >>> >>>> , , {'a': array([[[ 0, 1, 2, 3], >>>> >>>> >>>> >>> ---------------------------------------------------------------------- >>> Traceback (most recent call last): >>> File >>> "/home/nwagner/local/lib64/python2.6/site-packages/nose-0.11.2.dev-py2.6.egg/nose/case.py", >>> line 18 >>> 3, in runTest >>> self.test(*self.arg) >>> File >>> "/home/nwagner/local/lib64/python2.6/site-packages/numpy/testing/utils.py", >>> line 982, in assert_rai >>> ses >>> return nose.tools.assert_raises(*args,**kwargs) >>> AssertionError: DeprecationWarning not raised >>> if hasattr(excClass,'__name__'): 'DeprecationWarning' >>> = excClass.'unittest' >>> else: 'DeprecationWarning' = str(excClass) >>> >>> >>> >>>>> raise self.failureException, "%s not raised" % 'DeprecationWarning' >>>>> >>>>> >>>>> >>> ====================================================================== >>> FAIL: test_00_deprecation_warning >>> (test_basic.TestSolveHBanded) >>> ---------------------------------------------------------------------- >>> Traceback (most recent call last): >>> File >>> 
"/home/nwagner/local/lib64/python2.6/site-packages/scipy/linalg/tests/test_basic.py", >>> line 262, in >>> test_00_deprecation_warning >>> assert_raises(DeprecationWarning, solveh_banded, ab, >>> b) >>> File >>> "/home/nwagner/local/lib64/python2.6/site-packages/numpy/testing/utils.py", >>> line 982, in assert_rai >>> ses >>> return nose.tools.assert_raises(*args,**kwargs) >>> AssertionError: DeprecationWarning not raised >>> if hasattr(excClass,'__name__'): 'DeprecationWarning' >>> = excClass.'unittest' >>> else: 'DeprecationWarning' = str(excClass) >>> >>> >>> >>>>> raise self.failureException, "%s not raised" % 'DeprecationWarning' >>>>> >>>>> >>>>> >>> >>> >>> >> These two errors are failures to raise a deprecation warning. >> Does anyone else get these? >> >> I wrote the test test_00_deprecation_warning in test_basic.py, >> one of the linalg test modules. In SciPy 0.8 and currently >> in the trunk, the function solveh_banded is supposed to >> raise a deprecation warning, because in the final release >> of 0.9, its return value will be changed. >> >> The tests for solveh_banded work for me, but they might >> be taking advantage of behavior of nose that is not >> reliable. In effect, test_00_deprecation_warning assumes >> that will execute the first call of solveh_banded, so >> the deprecation warning will occur. All the other test >> functions include the line >> warnings.simplefilter('ignore', category=DeprecationWarning) >> so they ignore the warning, even if it occurs. >> >> Is there a way to reset whatever internal flag the >> warnings module has that indicates that a warning has >> already been issued once? >> >> > > Partially answering my own question: in the test > where I want the warning to occur, I could use a > simplefilter to always issue deprecation warnings: > > warnings.simplefilter('always', category=DeprecationWarning) > > Nope, that doesn't do what I expected. Now I'll be quiet until I have a better idea of how warnings *really* work. 
Warren > Warren > > > >> I see that the documentation for the warnings module >> describes a 'catch_warnings' context manager, but >> we need to be compatible with Python 2.4. The 'with' >> statement appears in 2.5. >> >> Warren >> >> >> P.S. It looks like handling of DeprecationWarnings will >> require a little more work with Python 2.7, where they >> are ignored by default. >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From josef.pktd at gmail.com Wed Jul 21 18:02:06 2010 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Thu, 22 Jul 2010 00:02:06 +0200 Subject: [SciPy-Dev] scipy.test() failures '0.9.0.dev6635' In-Reply-To: <4C476C07.6060603@enthought.com> References: <4C4762FE.5010807@enthought.com> <4C476526.4010509@enthought.com> <4C476C07.6060603@enthought.com> Message-ID: On Wed, Jul 21, 2010 at 11:52 PM, Warren Weckesser wrote: > Sorry for talking to myself as I learn more > about the warnings module... > > Warren Weckesser wrote: >> Warren Weckesser wrote: >> >>> Nils Wagner wrote: >>> >>> >>>> ====================================================================== >>>> FAIL: test_mio.test_mat4_3d(>>> 'exceptions.DeprecationWarning'>, >>> object at 0x3081680 >>>> >>>> >>>> >>>>> , , {'a': array([[[ 0, ?1, ?2, ?3], >>>>> >>>>> >>>>> >>>> ---------------------------------------------------------------------- >>>> Traceback (most recent call last): >>>> ? ?File >>>> "/home/nwagner/local/lib64/python2.6/site-packages/nose-0.11.2.dev-py2.6.egg/nose/case.py", >>>> line 18 >>>> 3, in runTest >>>> ? ? ?self.test(*self.arg) >>>> ? ?File >>>> "/home/nwagner/local/lib64/python2.6/site-packages/numpy/testing/utils.py", >>>> line 982, in assert_rai >>>> ses >>>> ? ? 
?return nose.tools.assert_raises(*args,**kwargs) >>>> AssertionError: DeprecationWarning not raised >>>> ? ? ?if hasattr(excClass,'__name__'): 'DeprecationWarning' >>>> = excClass.'unittest' >>>> ? ? ?else: 'DeprecationWarning' = str(excClass) >>>> >>>> >>>> >>>>>> ?raise self.failureException, "%s not raised" % 'DeprecationWarning' >>>>>> >>>>>> >>>>>> >>>> ====================================================================== >>>> FAIL: test_00_deprecation_warning >>>> (test_basic.TestSolveHBanded) >>>> ---------------------------------------------------------------------- >>>> Traceback (most recent call last): >>>> ? ?File >>>> "/home/nwagner/local/lib64/python2.6/site-packages/scipy/linalg/tests/test_basic.py", >>>> line 262, in >>>> test_00_deprecation_warning >>>> ? ? ?assert_raises(DeprecationWarning, solveh_banded, ab, >>>> b) >>>> ? ?File >>>> "/home/nwagner/local/lib64/python2.6/site-packages/numpy/testing/utils.py", >>>> line 982, in assert_rai >>>> ses >>>> ? ? ?return nose.tools.assert_raises(*args,**kwargs) >>>> AssertionError: DeprecationWarning not raised >>>> ? ? ?if hasattr(excClass,'__name__'): 'DeprecationWarning' >>>> = excClass.'unittest' >>>> ? ? ?else: 'DeprecationWarning' = str(excClass) >>>> >>>> >>>> >>>>>> ?raise self.failureException, "%s not raised" % 'DeprecationWarning' >>>>>> >>>>>> >>>>>> >>>> >>>> >>>> >>> These two errors are failures to raise a deprecation warning. >>> Does anyone else get these? >>> >>> I wrote the test test_00_deprecation_warning in test_basic.py, >>> one of the linalg test modules. ?In SciPy 0.8 and currently >>> in the trunk, the function solveh_banded is supposed to >>> raise a deprecation warning, because in the final release >>> of 0.9, its return value will be changed. >>> >>> The tests for solveh_banded work for me, but they might >>> be taking advantage of behavior of nose that is not >>> reliable. 
?In effect, test_00_deprecation_warning assumes >>> that will execute the first call of solveh_banded, so >>> the deprecation warning will occur. ?All the other test >>> functions include the line >>> ? ? warnings.simplefilter('ignore', category=DeprecationWarning) >>> so they ignore the warning, even if it occurs. >>> >>> Is there a way to reset whatever internal flag the >>> warnings module has that indicates that a warning has >>> already been issued once? >>> >>> >> >> Partially answering my own question: ?in the test >> where I want the warning to occur, I could use a >> simplefilter to always issue deprecation warnings: >> >> ? warnings.simplefilter('always', category=DeprecationWarning) >> >> > > Nope, that doesn't do what I expected. ?Now I'll be quiet > until I have a better idea of how warnings *really* > work. Keep going even if it's a monologue. I find negative results very instructive and have my struggle with the details of warnings still ahead of me. Josef > > Warren > > >> Warren >> >> >> >>> I see that the documentation for the warnings module >>> describes a 'catch_warnings' context manager, but >>> we need to be compatible with Python 2.4. ?The 'with' >>> statement appears in 2.5. >>> >>> Warren >>> >>> >>> P.S. It looks like handling of DeprecationWarnings will >>> require a little more work with Python 2.7, where they >>> are ignored by default. 
>>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From fabian.pedregosa at inria.fr Thu Jul 22 08:13:14 2010 From: fabian.pedregosa at inria.fr (Fabian Pedregosa) Date: Thu, 22 Jul 2010 14:13:14 +0200 Subject: [SciPy-Dev] [PATCH] Scipy and python2.7 Message-ID: <20100722141314.hjq3677lmsg8kg08@fseoane.net> Hi all. With some minor changes I could install scipy using python2.7. The only thing that failed was the c++ swig wrappers in scipy.sparse.sparsetools. I believe this is because python is compiled now using gcc and not g++, and also default flags have changed. The problem seems to be that numpy/ndarraytypes.h import inttypes.h, but the macros in this file are not accessibly from C++ unless you define the variable __STDC_FORMAT_MACROS. My solution was to define this variable in the setup.py script. This makes it build fine and test are OK too (ubuntu linux 64 bit). I attach the (git) patch. ~Fabian -------------- next part -------------- A non-text attachment was scrubbed... Name: 0001-FIX-define-macro-to-access-C99-extensions-from-C.patch Type: text/x-patch Size: 1194 bytes Desc: not available URL: From bsouthey at gmail.com Fri Jul 23 10:07:35 2010 From: bsouthey at gmail.com (Bruce Southey) Date: Fri, 23 Jul 2010 09:07:35 -0500 Subject: [SciPy-Dev] [PATCH] Scipy and python2.7 In-Reply-To: <20100722141314.hjq3677lmsg8kg08@fseoane.net> References: <20100722141314.hjq3677lmsg8kg08@fseoane.net> Message-ID: <4C49A227.5090105@gmail.com> On 07/22/2010 07:13 AM, Fabian Pedregosa wrote: > Hi all. 
> > With some minor changes I could install scipy using python2.7. The > only thing that failed was the c++ swig wrappers in > scipy.sparse.sparsetools. I believe this is because python is compiled > now using gcc and not g++, and also default flags have changed. > > The problem seems to be that numpy/ndarraytypes.h import inttypes.h, > but the macros in this file are not accessibly from C++ unless you > define the variable __STDC_FORMAT_MACROS. My solution was to define > this variable in the setup.py script. This makes it build fine and > test are OK too (ubuntu linux 64 bit). I attach the (git) patch. > > ~Fabian > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > Hi, Thanks for the patch as it works for Python version 2.4, 2.5, 2.6 and 2.7 on my 64-bit Linux system. Since you did the work, can you attach it to ticket 1180? http://projects.scipy.org/scipy/ticket/1180 Thanks Bruce -------------- next part -------------- An HTML attachment was scrubbed... URL: From jh at physics.ucf.edu Fri Jul 23 15:01:06 2010 From: jh at physics.ucf.edu (Joe Harrington) Date: Fri, 23 Jul 2010 15:01:06 -0400 Subject: [SciPy-Dev] reviewers needed for NumPy Message-ID: Hi folks, We are (finally) about to begin reviewing and proofing the NumPy docstrings! This is the final step in producing professional-level docs for NumPy. What we need now are people willing to review docs. There are two types of reviewers: Technical reviewers should be developers or *very* experienced NumPy users. Technical review entails checking the source code (it's available on a click in the doc wiki) and reading the doc to ensure that the signature and description are both correct and complete. Presentation reviewers need to be modestly experienced with NumPy, and should have some experience either in technical writing or as educators. 
Their job is to make sure the docstring is understandable to the target audience (one level below the expected user of that item), including appropriate examples and references. Review entails reading each page, checking that it meets the review standards, and either approving it or saying how it doesn't meet them. All this takes place on the doc wiki, so the mechanics are easy. Please post a message on scipy-dev if you are interested in becoming a reviewer, or if you have questions about reviewing. As a volunteer reviewer, you can put as much or as little time into this as you like. Thanks! --jh-- for the SciPy Documentation Project team From ralf.gommers at googlemail.com Sat Jul 24 04:27:54 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 24 Jul 2010 16:27:54 +0800 Subject: [SciPy-Dev] [PATCH] Scipy and python2.7 In-Reply-To: <20100722141314.hjq3677lmsg8kg08@fseoane.net> References: <20100722141314.hjq3677lmsg8kg08@fseoane.net> Message-ID: Hi Fabian, On Thu, Jul 22, 2010 at 8:13 PM, Fabian Pedregosa wrote: > Hi all. > > With some minor changes I could install scipy using python2.7. The only > thing that failed was the c++ swig wrappers in scipy.sparse.sparsetools. I > believe this is because python is compiled now using gcc and not g++, and > also default flags have changed. > > The problem seems to be that numpy/ndarraytypes.h import inttypes.h, but > the macros in this file are not accessibly from C++ unless you define the > variable __STDC_FORMAT_MACROS. My solution was to define this variable in > the setup.py script. This makes it build fine and test are OK too (ubuntu > linux 64 bit). I attach the (git) patch. > > That seems to fix the problem, good catch. I'm wondering if this really belongs in setup.py. numpyconfig.h already defines __STDC_FORMAT_MACROS, so maybe including that in sparsetools.i before arrayobject.h would be cleaner? If it goes in setup.py it should also go in the SConscript file. 
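Either way, the setup.py route amounts to passing the macro through the Extension's `define_macros` argument; a generic illustration with hypothetical names (the real change is Fabian's patch on ticket #1180):

```python
try:
    from setuptools import Extension    # preferred on modern Pythons
except ImportError:
    from distutils.core import Extension

# Defining __STDC_FORMAT_MACROS when the C++ wrapper is compiled makes
# the C99 format-string macros in <inttypes.h> visible to it.
ext = Extension(
    'sparsetools_demo',                           # hypothetical name
    sources=['sparsetools_demo.cxx'],             # hypothetical source
    define_macros=[('__STDC_FORMAT_MACROS', 1)],
)
print(ext.define_macros)  # -> [('__STDC_FORMAT_MACROS', 1)]
```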
Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From fabian.pedregosa at inria.fr Sun Jul 25 08:45:00 2010 From: fabian.pedregosa at inria.fr (Fabian Pedregosa) Date: Sun, 25 Jul 2010 14:45:00 +0200 Subject: [SciPy-Dev] [PATCH] Scipy and python2.7 In-Reply-To: References: <20100722141314.hjq3677lmsg8kg08@fseoane.net> Message-ID: <4C4C31CC.10702@inria.fr> Ralf Gommers wrote: > Hi Fabian, > > On Thu, Jul 22, 2010 at 8:13 PM, Fabian Pedregosa > > wrote: > > Hi all. > > With some minor changes I could install scipy using python2.7. The > only thing that failed was the c++ swig wrappers in > scipy.sparse.sparsetools. I believe this is because python is > compiled now using gcc and not g++, and also default flags have changed. > > The problem seems to be that numpy/ndarraytypes.h import inttypes.h, > but the macros in this file are not accessibly from C++ unless you > define the variable __STDC_FORMAT_MACROS. My solution was to define > this variable in the setup.py script. This makes it build fine and > test are OK too (ubuntu linux 64 bit). I attach the (git) patch. > > > That seems to fix the problem, good catch. I'm wondering if this really > belongs in setup.py. numpyconfig.h already defines __STDC_FORMAT_MACROS, > so maybe including that in sparsetools.i before arrayobject.h would be > cleaner? Thanks for the review. I tried to include numpyconfig.h in sparsetools.i as you suggest but for some reason it didn't solve the problem (hand-editing the generated swig wrapper and placing that file at start made it work, though). Maybe someone with more swig/C/C++ experience knows why. > > If it goes in setup.py it should also go in the SConscript file. I updated the patch in issue #1180. 
Cheers, Fabian > > Cheers, > Ralf > > > ------------------------------------------------------------------------ > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From thomas.robitaille at gmail.com Sun Jul 25 09:10:36 2010 From: thomas.robitaille at gmail.com (Thomas Robitaille) Date: Sun, 25 Jul 2010 09:10:36 -0400 Subject: [SciPy-Dev] IDL .sav file reader for scipy.io Message-ID: Hi everyone, A couple of weeks ago, I wrote a message to this list ('potential scikits package') asking whether a package I have developed to read IDL .sav files would be useful to have as a scikits package, but several people suggested it should go into scipy.io directly. Since then, I've been working with Matthew Brett to get the package into a suitable form for inclusion in scipy.io (adding unit tests, docstrings, style changes, ...). We now think it's almost ready to be included. Would anyone else have time to look over the package to review it? The latest version is available here: http://idlsave.svn.sourceforge.net/viewvc/idlsave/trunk/idlsave/ and I have also attached a (small) tarball to this email. Thanks, Tom -------------- next part -------------- A non-text attachment was scrubbed... 
Name: IDLSave-0.9.7.tar.gz Type: application/x-gzip Size: 14104 bytes Desc: not available URL: From pav at iki.fi Sun Jul 25 11:35:11 2010 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 25 Jul 2010 15:35:11 +0000 (UTC) Subject: [SciPy-Dev] N-dimensional interpolation Message-ID: Hi all, I took the Qhull by the horns, and wrote a straightforward `griddata` implementation for working in N-D: http://github.com/pv/scipy-work/tree/qhull http://github.com/pv/scipy-work/blob/qhull/scipy/interpolate/qhull.pyx Sample (4-D): ------------------------------------------------------------- import numpy as np import matplotlib.pyplot as plt from scipy.interpolate import griddata def f(x, y, z, u): return np.cos(x + np.cos(y + np.cos(z + np.cos(u)))) points = np.random.rand(2000, 4) values = f(*points.T) p = np.linspace(0, 1, 1500) z = griddata(points, values, np.c_[p, p, p, p], method='linear') plt.plot(p, z, '-', p, f(p, p, p, p), '-') plt.show() ------------------------------------------------------------- It performs the N-D Delaunay tesselation with Qhull, and after that does linear barycentric interpolation on the simplex containing each point. (The nearest-neighbor "interpolation" is, on the other hand, implemented using scipy.spatial.KDTree.) I ended up writing some custom simplex-walking code for locating the simplex containing the points -- this is much faster than a brute-force search. However, it is slightly different from what `qh_findbestfacet` does. [If you're a computational geometry expert, see below...] Speed-wise, Qhull appears to lose in 2-D to `scikits.delaunay` by a factor of about 10 in triangulation in my tests; this, however, includes the time taken by computing the barycentric coordinate transforms. Interpolation, however, seems to be faster. *** I think this would be a nice addition to `scipy.optimize`. I'd like to get it in for 0.9. 
Later on, we can specialize the `griddata` function to perform better in low dimensions (ndim=2, 3) and add more interpolation methods there; for example natural neighbors, interpolating splines, etc. In ndim > 3, the `griddata` is however already feature-equivalent to Octave and MATLAB. Comments? Pauli PS. A question for computational geometry experts: `qh_findbestfacet` finds the facet on the lower convex hull whose oriented hyperplane distance to the point lifted on the paraboloid is maximal --- however, this does not always give the simplex containing the point in the projected Delaunay tesselation. There's a counterexample in qhull.pyx:_find_simplex docstring. On the other hand, Qhull documentation claims that this is the way to locate the Delaunay simplex containing a point. What gives? It's clear, however, that if a simplex contains the point, then the hyperplane distance is positive. So currently, I just walk around positive-distance simplices checking the inside condition for each and hoping for the best. If nothing is found, I just fall back to brute force search. This seems to work well in practice. However, if a point is outside the triangulation (but inside the bounding box), I have to fall back to a brute-force search. Is there a better option here? 
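The linear barycentric step described above can be sketched for a single 2-D triangle like this (a toy, numpy-free illustration with made-up vertex data, not the Cython code in qhull.pyx):

```python
# Vertices of one Delaunay simplex (a triangle) and the data values
# attached to them -- both made up for illustration.
verts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
vals = [1.0, 3.0, 5.0]

def bary_interp(x, y):
    # Solve for the barycentric coordinates (b1, b2, b3) of (x, y) by
    # Cramer's rule on the 2x2 coordinate-transform system, then use
    # them as interpolation weights on the vertex values.
    (x1, y1), (x2, y2), (x3, y3) = verts
    det = (x1 - x3) * (y2 - y3) - (x2 - x3) * (y1 - y3)
    b1 = ((x - x3) * (y2 - y3) - (x2 - x3) * (y - y3)) / det
    b2 = ((x1 - x3) * (y - y3) - (x - x3) * (y1 - y3)) / det
    b3 = 1.0 - b1 - b2
    return b1 * vals[0] + b2 * vals[1] + b3 * vals[2]

# Midpoint of the edge joining the vertices with values 1 and 3:
print(bary_interp(0.5, 0.0))  # -> 2.0
```

Inside the containing simplex all three weights lie in [0, 1], which is exactly the inside condition the simplex-walking code checks.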
*** Another sample (2-D, object-oriented interface): ------------------------------------------------------------- import time import numpy as np import matplotlib.pyplot as plt from scipy.interpolate import LinearNDInterpolator # some random data x = np.array([(0,0), (0,1), (1, 1), (1, 0)], dtype=np.double) np.random.seed(0) xr = np.random.rand(200*200, 2).astype(np.double) x = np.r_[xr, x] y = np.arange(x.shape[0], dtype=np.double) # evaluate on a grid nn = 500 xx = np.linspace(-0.1, 1.1, nn) yy = np.linspace(-0.1, 1.1, nn) xx, yy = np.broadcast_arrays(xx[:,None], yy[None,:]) xi = np.array([xx, yy]).transpose(1,2,0).copy() start = time.time() ip = LinearNDInterpolator(x, y) print "Triangulation", time.time() - start start = time.time() zi = ip(xi) print "Interpolation", time.time() - start from scikits.delaunay import Triangulation start = time.time() tri2 = Triangulation(x[:,0], x[:,1]) print "scikits.delaunay triangulation", time.time() - start start = time.time() ip2 = tri2.linear_interpolator(y) zi2 = ip2[-0.1:1.1:500j,-0.1:1.1:500j].T print "scikits.delaunay interpolation", time.time() - start print "rel-difference", np.nanmax(abs(zi - zi2))/np.nanmax(np.abs(zi)) plt.figure() plt.imshow(zi) plt.clim(y.min(), y.max()) plt.colorbar() plt.figure() plt.imshow(zi2) plt.clim(y.min(), y.max()) plt.colorbar() plt.show() ------------------------------------------------------------- -- Pauli Virtanen From pav at iki.fi Sun Jul 25 11:58:59 2010 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 25 Jul 2010 15:58:59 +0000 (UTC) Subject: [SciPy-Dev] IDL .sav file reader for scipy.io References: Message-ID: Sun, 25 Jul 2010 09:10:36 -0400, Thomas Robitaille wrote: > A couple of weeks ago, I wrote a message to this list ('potential > scikits package') asking whether a package I have developed to read IDL > .sav files would be useful to have as a scikits package, but several > people suggested it should go into scipy.io directly. +1 Looks good to me. 
Has tests and seems to work, so it should go in for 0.9 IMHO. Some comments: - verbose=False should be the default; a function in a library shouldn't print anything unless explicitly asked to. - What's the motivation for the `python_dict` option? - What's the motivation for the `uncompressed_file_name` option? For working with very large files? -- Pauli Virtanen From thomas.robitaille at gmail.com Sun Jul 25 12:16:53 2010 From: thomas.robitaille at gmail.com (Thomas Robitaille) Date: Sun, 25 Jul 2010 12:16:53 -0400 Subject: [SciPy-Dev] IDL .sav file reader for scipy.io In-Reply-To: References: Message-ID: On Jul 25, 2010, at 11:58 AM, Pauli Virtanen wrote: > Sun, 25 Jul 2010 09:10:36 -0400, Thomas Robitaille wrote: >> A couple of weeks ago, I wrote a message to this list ('potential >> scikits package') asking whether a package I have developed to read IDL >> .sav files would be useful to have as a scikits package, but several >> people suggested it should go into scipy.io directly. > > +1 > > Looks good to me. Has tests and seems to work, so it should go in for 0.9 > IMHO. Thanks for looking over it! > Some comments: > > - verbose=False should be the default; a function in a library shouldn't > print anything unless explicitly asked to. Ok - I will change that when including it in scipy.io (I decided to leave it as it is in IDLSave since that has been the default since the start). > - What's the motivation for the `python_dict` option? I added that in case someone wanted a proper dictionary rather than the overloaded dictionary type with attribute access. It probably won't get used much, but I just put it there in case. Do you think it should be removed? > - What's the motivation for the `uncompressed_file_name` option? > For working with very large files? 
Yes - in some cases it might be desirable to not have to read from the compressed file each time, and this can also be used if the uncompressed file should be expanded to a scratch disk rather than whatever tempfile provides (again, for large files). Again, this probably won't get used much, but it was more there in case anyone needs it. Cheers, Tom > > -- > Pauli Virtanen > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From charlesr.harris at gmail.com Sun Jul 25 12:45:23 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 25 Jul 2010 10:45:23 -0600 Subject: [SciPy-Dev] N-dimensional interpolation In-Reply-To: References: Message-ID: On Sun, Jul 25, 2010 at 9:35 AM, Pauli Virtanen wrote: > Hi all, > > I took the Qhull by the horns, and wrote a straightforward `griddata` > implementation for working in N-D: > > http://github.com/pv/scipy-work/tree/qhull > http://github.com/pv/scipy-work/blob/qhull/scipy/interpolate/qhull.pyx > > Sample (4-D): > ------------------------------------------------------------- > import numpy as np > import matplotlib.pyplot as plt > from scipy.interpolate import griddata > > def f(x, y, z, u): > return np.cos(x + np.cos(y + np.cos(z + np.cos(u)))) > > points = np.random.rand(2000, 4) > values = f(*points.T) > > p = np.linspace(0, 1, 1500) > z = griddata(points, values, np.c_[p, p, p, p], method='linear') > > plt.plot(p, z, '-', p, f(p, p, p, p), '-') > plt.show() > ------------------------------------------------------------- > > It performs the N-D Delaunay tesselation with Qhull, and after that does > linear barycentric interpolation on the simplex containing each point. > (The nearest-neighbor "interpolation" is, on the other hand, implemented > using scipy.spatial.KDTree.) 
> > I ended up writing some custom simplex-walking code for locating the > simplex containing the points -- this is much faster than a brute-force > search. However, it is slightly different from what `qh_findbestfacet` > does. [If you're a computational geometry expert, see below...] > > Speed-wise, Qhull appears to lose in 2-D to `scikits.delaunay` by a > factor of about 10 in triangulation in my tests; this, however, includes > the time taken by computing the barycentric coordinate transforms. > Interpolation, however, seems to be faster. > > *** > > I think this would be a nice addition to `scipy.optimize`. I'd like to > get it in for 0.9. > > Do you mean scipy.interpolate? > Later on, we can specialize the `griddata` function to perform better in > low dimensions (ndim=2, 3) and add more interpolation methods there; for > example natural neighbors, interpolating splines, etc. In ndim > 3, the > `griddata` is however already feature-equivalent to Octave and MATLAB. > > Comments? > > I'm not a computational geometry expert, but I expect the 2-D performance or scikits.delaunay takes advantage of a special algorithm for that case (I recall googling for that back when). Perhaps it can be adapted to the higher dimension case or adapted for searching. Just random thoughts here, I have no actual experience with these sorts of problems. I certainly support getting added interpolation functionality into scipy and barycentric interpolation has the advantage of not introducing weird artifacts. I like it for drawing contours also. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant at enthought.com Sun Jul 25 15:25:42 2010 From: oliphant at enthought.com (Travis Oliphant) Date: Sun, 25 Jul 2010 14:25:42 -0500 Subject: [SciPy-Dev] N-dimensional interpolation In-Reply-To: References: Message-ID: Fantastic. Excellent work. This is something we have been missing for a long time. 
Travis -- (mobile phone of) Travis Oliphant Enthought, Inc. 1-512-536-1057 http://www.enthought.com On Jul 25, 2010, at 10:35 AM, Pauli Virtanen wrote: > Hi all, > > I took the Qhull by the horns, and wrote a straightforward `griddata` > implementation for working in N-D: > > http://github.com/pv/scipy-work/tree/qhull > http://github.com/pv/scipy-work/blob/qhull/scipy/interpolate/qhull.pyx > > Sample (4-D): > ------------------------------------------------------------- > import numpy as np > import matplotlib.pyplot as plt > from scipy.interpolate import griddata > > def f(x, y, z, u): > return np.cos(x + np.cos(y + np.cos(z + np.cos(u)))) > > points = np.random.rand(2000, 4) > values = f(*points.T) > > p = np.linspace(0, 1, 1500) > z = griddata(points, values, np.c_[p, p, p, p], method='linear') > > plt.plot(p, z, '-', p, f(p, p, p, p), '-') > plt.show() > ------------------------------------------------------------- > > It performs the N-D Delaunay tesselation with Qhull, and after that does > linear barycentric interpolation on the simplex containing each point. > (The nearest-neighbor "interpolation" is, on the other hand, implemented > using scipy.spatial.KDTree.) > > I ended up writing some custom simplex-walking code for locating the > simplex containing the points -- this is much faster than a brute-force > search. However, it is slightly different from what `qh_findbestfacet` > does. [If you're a computational geometry expert, see below...] > > Speed-wise, Qhull appears to lose in 2-D to `scikits.delaunay` by a > factor of about 10 in triangulation in my tests; this, however, includes > the time taken by computing the barycentric coordinate transforms. > Interpolation, however, seems to be faster. > > *** > > I think this would be a nice addition to `scipy.optimize`. I'd like to > get it in for 0.9. 
> > Later on, we can specialize the `griddata` function to perform better in > low dimensions (ndim=2, 3) and add more interpolation methods there; for > example natural neighbors, interpolating splines, etc. In ndim > 3, the > `griddata` is however already feature-equivalent to Octave and MATLAB. > > Comments? > > Pauli > > > PS. A question for computational geometry experts: > > `qh_findbestfacet` finds the facet on the lower convex hull whose > oriented hyperplane distance to the point lifted on the paraboloid is > maximal --- however, this does not always give the simplex containing the > point in the projected Delaunay tesselation. There's a counterexample in > qhull.pyx:_find_simplex docstring. > > On the other hand, Qhull documentation claims that this is the way to > locate the Delaunay simplex containing a point. What gives? > > It's clear, however, that if a simplex contains the point, then the > hyperplane distance is positive. So currently, I just walk around > positive-distance simplices checking the inside condition for each and > hoping for the best. If nothing is found, I just fall back to brute force > search. > > This seems to work well in practice. However, if a point is outside the > triangulation (but inside the bounding box), I have to fall back to a > brute-force search. Is there a better option here? 
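The "inside condition" referred to above reduces to a sign check on barycentric coordinates: a point lies in a simplex exactly when all of its barycentric coordinates are non-negative. A 2-D illustration of that test (not the code from qhull.pyx):

```python
# Barycentric coordinates of p with respect to a triangle; they always
# sum to 1, and p is inside the triangle iff all three are >= 0.

def barycentric(tri, p):
    (x1, y1), (x2, y2), (x3, y3) = tri
    px, py = p
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    l1 = ((y2 - y3) * (px - x3) + (x3 - x2) * (py - y3)) / det
    l2 = ((y3 - y1) * (px - x3) + (x1 - x3) * (py - y3)) / det
    return l1, l2, 1.0 - l1 - l2

def inside(tri, p):
    return all(l >= 0 for l in barycentric(tri, p))

tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
print(inside(tri, (0.25, 0.25)))  # True: all coordinates non-negative
print(inside(tri, (0.8, 0.8)))    # False: one coordinate is negative
```

The same coordinates double as the interpolation weights, which is why linear barycentric interpolation comes almost for free once the containing simplex is found.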
> > *** > > Another sample (2-D, object-oriented interface): > ------------------------------------------------------------- > import time > import numpy as np > import matplotlib.pyplot as plt > from scipy.interpolate import LinearNDInterpolator > > # some random data > x = np.array([(0,0), (0,1), (1, 1), (1, 0)], dtype=np.double) > np.random.seed(0) > xr = np.random.rand(200*200, 2).astype(np.double) > x = np.r_[xr, x] > y = np.arange(x.shape[0], dtype=np.double) > > # evaluate on a grid > nn = 500 > xx = np.linspace(-0.1, 1.1, nn) > yy = np.linspace(-0.1, 1.1, nn) > xx, yy = np.broadcast_arrays(xx[:,None], yy[None,:]) > > > xi = np.array([xx, yy]).transpose(1,2,0).copy() > start = time.time() > ip = LinearNDInterpolator(x, y) > print "Triangulation", time.time() - start > start = time.time() > zi = ip(xi) > print "Interpolation", time.time() - start > > from scikits.delaunay import Triangulation > > start = time.time() > tri2 = Triangulation(x[:,0], x[:,1]) > print "scikits.delaunay triangulation", time.time() - start > > start = time.time() > ip2 = tri2.linear_interpolator(y) > zi2 = ip2[-0.1:1.1:500j,-0.1:1.1:500j].T > print "scikits.delaunay interpolation", time.time() - start > > print "rel-difference", np.nanmax(abs(zi - zi2))/np.nanmax(np.abs(zi)) > > plt.figure() > plt.imshow(zi) > plt.clim(y.min(), y.max()) > plt.colorbar() > plt.figure() > plt.imshow(zi2) > plt.clim(y.min(), y.max()) > plt.colorbar() > plt.show() > ------------------------------------------------------------- > > -- > Pauli Virtanen > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From pav at iki.fi Sun Jul 25 17:52:24 2010 From: pav at iki.fi (Pauli Virtanen) Date: Sun, 25 Jul 2010 21:52:24 +0000 (UTC) Subject: [SciPy-Dev] N-dimensional interpolation References: Message-ID: Sun, 25 Jul 2010 10:45:23 -0600, Charles R Harris wrote: > Do you mean scipy.interpolate? Yep. 
[clip] > I'm not a computational geometry expert, but I expect the 2-D > performance or scikits.delaunay takes advantage of a special algorithm > for that case (I recall googling for that back when). Perhaps it can be > adapted to the higher dimension case or adapted for searching. Just > random thoughts here, I have no actual experience with these sorts of > problems. Optimizing the time spent in Qhull is probably very difficult, so I think we'll just have to live with that part. As far as I understand, the search algorithm in scikits.delaunay walks the triangles by hopping to a neighbour chosen so that the target point is on positive side of the corresponding ridge. That generalizes to N-D easily. Randomizing the interpolation points causes a 20x hit on the current search algorithm, whereas scikits.delaunay gets away with a 4x performance drop with the same input, so there could be some room for improvement here. -- Pauli Virtanen From pav at iki.fi Mon Jul 26 17:48:31 2010 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 26 Jul 2010 21:48:31 +0000 (UTC) Subject: [SciPy-Dev] N-dimensional interpolation References: Message-ID: Sun, 25 Jul 2010 21:52:24 +0000, Pauli Virtanen wrote: [clip] > As far as I understand, the search algorithm in scikits.delaunay walks > the triangles by hopping to a neighbour chosen so that the target point > is on positive side of the corresponding ridge. That generalizes to N-D > easily. This turned out to work quite well. A hybrid algorithm combining the two (first search a simplex with a positive plane distance, then continue with directed search) seems to work even better. 
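The neighbour-hopping walk described above needs only an orientation test per ridge: hop across any ridge whose supporting hyperplane puts the target on the far side, and stop when every test passes. A toy 2-D version over a hand-built two-triangle mesh (illustrative only, not the actual qhull.pyx implementation):

```python
# Directed walk over a tiny two-triangle mesh: from the current
# triangle, hop across any edge whose supporting line puts the target
# point outside; when no edge test fails, the point is inside.

def orient(a, b, p):
    # > 0 if p lies to the left of the directed edge a -> b
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

# two CCW triangles sharing the edge (1,0)-(0,1)
tris = {0: [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)],
        1: [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]}
# neighbours[t][i] = triangle across edge (v[i], v[(i+1) % 3]);
# None means that edge lies on the convex hull
neighbours = {0: [None, 1, None],
              1: [None, None, 0]}

def locate(start, p):
    t = start
    while True:
        vs = tris[t]
        for i in range(3):
            a, b = vs[i], vs[(i + 1) % 3]
            if orient(a, b, p) < 0:      # p is outside this edge
                t = neighbours[t][i]
                if t is None:            # walked off the triangulation
                    return None
                break
        else:
            return t                     # all edge tests passed: inside

print(locate(0, (0.8, 0.8)))  # -> 1 (one hop across the shared edge)
print(locate(1, (0.2, 0.2)))  # -> 0
```

In N-D the edge test becomes a hyperplane-side test for each ridge of the simplex, but the control flow is the same.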
So now it should be quite robust, and I'm happy with the timings, too:

-- qhull triangulate
2.19968700409
-- qhull interpolate (meshgrid)
0.651250123978
-- qhull interpolate (random order)
22.201515913
-- scikits.delaunay triangulate
0.925380945206
-- scikits.delaunay linear_interpolate (meshgrid)
4.93817210197
-- scikits.delaunay nn_interpolate (meshgrid)
7.32182908058
-- scikits.delaunay nn_interpolate (random order)
45.4793360233
-- abs-diff-max 8.91013769433e-08
-- rel-diff-max 7.27796350818e-12

Qhull is roughly 2-3x slower in forming the triangulation in 2-D than
scikits.delaunay, but that's acceptable. For some reason, the new
interpolation code is faster than the linear interpolation in
scikits.delaunay.

I'll probably try to split the code so that the Qhull geometry parts
end up in scipy.spatial, and the interpolation routines in
scipy.interpolate. I'll still need to test the Qhull triangulation
against some pathological cases..

	Pauli

---------------------------------------------------------------------
import qhull
import numpy as np
import sys
import time

np.random.seed(1234)

x = np.random.rand(240*240, 2).astype(np.double)
y = np.arange(x.shape[0], dtype=np.double)

nn = 500
xx = np.linspace(-0.1, 1.1, nn)
yy = np.linspace(-0.1, 1.1, nn)
xx, yy = np.broadcast_arrays(xx[:,None], yy[None,:])
xi = np.array([xx, yy]).transpose(1,2,0).copy()

# permuted order
xix = xi.reshape(nn*nn, 2)
p = np.random.permutation(nn*nn)
p2 = p.copy()
p2[p] = np.arange(nn*nn)
xix = xix[p]

# process!
print "-- qhull triangulate"
start = time.time()
ip = qhull.LinearNDInterpolator(x, y)
print time.time() - start

print "-- qhull interpolate (meshgrid)"
start = time.time()
zi = ip(xi)
print time.time() - start

print "-- qhull interpolate (random order)"
start = time.time()
zi = ip(xix)[p2].reshape(zi.shape)
print time.time() - start

from scikits.delaunay import Triangulation

print "-- scikits.delaunay triangulate"
start = time.time()
tri2 = Triangulation(x[:,0], x[:,1])
print time.time() - start

print "-- scikits.delaunay linear_interpolate (meshgrid)"
start = time.time()
ip2 = tri2.linear_interpolator(y)
zi2 = ip2[-0.1:1.1:(nn*1j),-0.1:1.1:(nn*1j)].T
print time.time() - start

print "-- scikits.delaunay nn_interpolate (meshgrid)"
start = time.time()
ip3 = tri2.nn_interpolator(y)
zi3 = ip3(xi[:,:,0].ravel(), xi[:,:,1].ravel()).reshape(zi.shape)
print time.time() - start

print "-- scikits.delaunay nn_interpolate (random order)"
start = time.time()
zi3 = ip3(xix[:,0], xix[:,1])[p2].reshape(zi.shape)
print time.time() - start

print "-- abs-diff-max", np.nanmax(abs(zi-zi2))
print "-- rel-diff-max", np.nanmax(abs(zi-zi2)/abs(zi))

From ralf.gommers at googlemail.com  Tue Jul 27 11:20:08 2010
From: ralf.gommers at googlemail.com (Ralf Gommers)
Date: Tue, 27 Jul 2010 23:20:08 +0800
Subject: [SciPy-Dev] [PATCH] Scipy and python2.7
In-Reply-To: <4C4C31CC.10702@inria.fr>
References: <20100722141314.hjq3677lmsg8kg08@fseoane.net>
	<4C4C31CC.10702@inria.fr>
Message-ID: 

On Sun, Jul 25, 2010 at 8:45 PM, Fabian Pedregosa wrote:

> Ralf Gommers wrote:
> > Hi Fabian,
> >
> > On Thu, Jul 22, 2010 at 8:13 PM, Fabian Pedregosa
> > > wrote:
> >
> > Hi all.
> >
> > With some minor changes I could install scipy using python2.7. The
> > only thing that failed was the c++ swig wrappers in
> > scipy.sparse.sparsetools. I believe this is because python is
> > compiled now using gcc and not g++, and also default flags have
> > changed.
> > > > The problem seems to be that numpy/ndarraytypes.h import inttypes.h, > > but the macros in this file are not accessibly from C++ unless you > > define the variable __STDC_FORMAT_MACROS. My solution was to define > > this variable in the setup.py script. This makes it build fine and > > test are OK too (ubuntu linux 64 bit). I attach the (git) patch. > > > > > > That seems to fix the problem, good catch. I'm wondering if this really > > belongs in setup.py. numpyconfig.h already defines __STDC_FORMAT_MACROS, > > so maybe including that in sparsetools.i before arrayobject.h would be > > cleaner? > > Thanks for the review. I tried to include numpyconfig.h in sparsetools.i > as you suggest but for some reason it didn't solve the problem > (hand-editing the generated swig wrapper and placing that file at start > made it work, though). Maybe someone with more swig/C/C++ experience > knows why. > > Hmm, no idea. But admittedly I don't know too much about SWIG or C++. > > > > If it goes in setup.py it should also go in the SConscript file. > > I updated the patch in issue #1180. > > Thanks, looks good. This is an important patch, so unless someone objects it will go in in the next few days. Cheers, Ralf Cheers, > > Fabian > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Tue Jul 27 12:49:23 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 28 Jul 2010 00:49:23 +0800 Subject: [SciPy-Dev] ANN: SciPy 0.8.0 Message-ID: I'm pleased to announce the release of SciPy 0.8.0. SciPy is a package of tools for science and engineering for Python. It includes modules for statistics, optimization, integration, linear algebra, Fourier transforms, signal and image processing, ODE solvers, and more. This release comes one and a half year after the 0.7.0 release and contains many new features, numerous bug-fixes, improved test coverage, and better documentation. 
Please note that SciPy 0.8.0 requires Python 2.4-2.6 and NumPy 1.4.1 or greater. For more information, please see the release notes at the end of this email. You can download the release from here: https://sourceforge.net/projects/scipy/ Python 2.5/2.6 binaries for Windows and OS X are available, as well as source tarballs for other platforms and the documentation in pdf form. Thank you to everybody who contributed to this release. Enjoy, The SciPy developers ========================= SciPy 0.8.0 Release Notes ========================= .. contents:: SciPy 0.8.0 is the culmination of 17 months of hard work. It contains many new features, numerous bug-fixes, improved test coverage and better documentation. There have been a number of deprecations and API changes in this release, which are documented below. All users are encouraged to upgrade to this release, as there are a large number of bug-fixes and optimizations. Moreover, our development attention will now shift to bug-fix releases on the 0.8.x branch, and on adding new features on the development trunk. This release requires Python 2.4 - 2.6 and NumPy 1.4.1 or greater. Please note that SciPy is still considered to have "Beta" status, as we work toward a SciPy 1.0.0 release. The 1.0.0 release will mark a major milestone in the development of SciPy, after which changing the package structure or API will be much more difficult. Whilst these pre-1.0 releases are considered to have "Beta" status, we are committed to making them as bug-free as possible. However, until the 1.0 release, we are aggressively reviewing and refining the functionality, organization, and interface. This is being done in an effort to make the package as coherent, intuitive, and useful as possible. To achieve this, we need help from the community of users. Specifically, we need feedback regarding all aspects of the project - everything - from which algorithms we implement, to details about our function's call signatures. 
Python 3 ======== Python 3 compatibility is planned and is currently technically feasible, since Numpy has been ported. However, since the Python 3 compatible Numpy 1.5 has not been released yet, support for Python 3 in Scipy is not yet included in Scipy 0.8. SciPy 0.9, planned for fall 2010, will very likely include experimental support for Python 3. Major documentation improvements ================================ SciPy documentation is greatly improved. Deprecated features =================== Swapping inputs for correlation functions (scipy.signal) -------------------------------------------------------- Concern correlate, correlate2d, convolve and convolve2d. If the second input is larger than the first input, the inputs are swapped before calling the underlying computation routine. This behavior is deprecated, and will be removed in scipy 0.9.0. Obsolete code deprecated (scipy.misc) ------------------------------------- The modules `helpmod`, `ppimport` and `pexec` from `scipy.misc` are deprecated. They will be removed from SciPy in version 0.9. Additional deprecations ----------------------- * linalg: The function `solveh_banded` currently returns a tuple containing the Cholesky factorization and the solution to the linear system. In SciPy 0.9, the return value will be just the solution. * The function `constants.codata.find` will generate a DeprecationWarning. In Scipy version 0.8.0, the keyword argument 'disp' was added to the function, with the default value 'True'. In 0.9.0, the default will be 'False'. * The `qshape` keyword argument of `signal.chirp` is deprecated. Use the argument `vertex_zero` instead. * Passing the coefficients of a polynomial as the argument `f0` to `signal.chirp` is deprecated. Use the function `signal.sweep_poly` instead. * The `io.recaster` module has been deprecated and will be removed in 0.9.0. 
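The staged default flip described for `constants.codata.find` above follows a common transition pattern: warn while the old default is still in effect, so that callers can start passing the keyword explicitly before the default changes. A generic sketch of that pattern with a hypothetical `find` function (not scipy's actual code):

```python
import warnings

# Hypothetical staged default change: while the keyword is omitted, a
# DeprecationWarning announces the upcoming flip and the old default
# (print-and-return-None) is kept for one more release.

def find(sub=None, disp=None):
    if disp is None:
        warnings.warn("the default of disp will change from True to "
                      "False in the next release; pass disp explicitly",
                      DeprecationWarning)
        disp = True  # old default, kept during the transition
    keys = ["alpha particle mass", "electron mass"]  # stand-in data
    if disp:
        for k in keys:
            print(k)
        return None
    return keys

keys = find(disp=False)  # explicit keyword: no warning, returns the list
print(keys)
```

The data and function body here are purely illustrative; only the warn-then-flip scheduling mirrors the deprecation notes above.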
New features
============

DCT support (scipy.fftpack)
---------------------------

New real transforms have been added, namely dct and idct for the
Discrete Cosine Transform; types I, II and III are available.

Single precision support for fft functions (scipy.fftpack)
----------------------------------------------------------

fft functions can now handle single precision inputs as well: fft(x)
will return a single precision array if x is single precision. At the
moment, for FFT sizes that are not composites of 2, 3, and 5, the
transform is computed internally in double precision to avoid rounding
error in FFTPACK.

Correlation functions now implement the usual definition (scipy.signal)
-----------------------------------------------------------------------

The outputs should now correspond to their matlab and R counterparts,
and do what most people expect if the old_behavior=False argument is
passed:

* correlate, convolve and their 2d counterparts do not swap their
  inputs depending on their relative shape anymore;

* correlation functions now conjugate their second argument while
  computing the sliding sum-products, which corresponds to the usual
  definition of correlation.

Additions and modifications to LTI functions (scipy.signal)
-----------------------------------------------------------

* The functions `impulse2` and `step2` were added to `scipy.signal`.
  They use the function `scipy.signal.lsim2` to compute the impulse and
  step response of a system, respectively.

* The function `scipy.signal.lsim2` was changed to pass any additional
  keyword arguments to the ODE solver.

Improved waveform generators (scipy.signal)
-------------------------------------------

Several improvements to the `chirp` function in `scipy.signal` were
made:

* The waveform generated when `method="logarithmic"` was corrected; it
  now generates a waveform that is also known as an "exponential" or
  "geometric" chirp. (See http://en.wikipedia.org/wiki/Chirp.)

* A new `chirp` method, "hyperbolic", was added.
* Instead of the keyword `qshape`, `chirp` now uses the keyword
  `vertex_zero`, a boolean.

* `chirp` no longer handles an arbitrary polynomial. This functionality
  has been moved to a new function, `sweep_poly`.

A new function, `sweep_poly`, was added.

New functions and other changes in scipy.linalg
-----------------------------------------------

The functions `cho_solve_banded`, `circulant`, `companion`, `hadamard`
and `leslie` were added to `scipy.linalg`.

The function `block_diag` was enhanced to accept scalar and 1D
arguments, along with the usual 2D arguments.

New function and changes in scipy.optimize
------------------------------------------

The `curve_fit` function has been added; it takes a function and uses
non-linear least squares to fit that to the provided data.

The `leastsq` and `fsolve` functions now return an array of size one
instead of a scalar when solving for a single parameter.

New sparse least squares solver
-------------------------------

The `lsqr` function was added to `scipy.sparse`. This routine finds a
least-squares solution to a large, sparse, linear system of equations.

ARPACK-based sparse SVD
-----------------------

A naive implementation of SVD for sparse matrices is available in
scipy.sparse.linalg.eigen.arpack. It is based on using a symmetric
eigensolver on A^H A, and as such may not be very precise.

Alternative behavior available for `scipy.constants.find`
---------------------------------------------------------

The keyword argument `disp` was added to the function
`scipy.constants.find`, with the default value `True`. When `disp` is
`True`, the behavior is the same as in Scipy version 0.7. When `False`,
the function returns the list of keys instead of printing them. (In
SciPy version 0.9, the default will be reversed.)

Incomplete sparse LU decompositions
-----------------------------------

Scipy now wraps SuperLU version 4.0, which supports incomplete sparse
LU decompositions. These can be accessed via
`scipy.sparse.linalg.spilu`.
Upgrade to SuperLU 4.0 also fixes some known bugs. Faster matlab file reader and default behavior change ------------------------------------------------------ We've rewritten the matlab file reader in Cython and it should now read matlab files at around the same speed that Matlab does. The reader reads matlab named and anonymous functions, but it can't write them. Until scipy 0.8.0 we have returned arrays of matlab structs as numpy object arrays, where the objects have attributes named for the struct fields. As of 0.8.0, we return matlab structs as numpy structured arrays. You can get the older behavior by using the optional ``struct_as_record=False`` keyword argument to `scipy.io.loadmat` and friends. There is an inconsistency in the matlab file writer, in that it writes numpy 1D arrays as column vectors in matlab 5 files, and row vectors in matlab 4 files. We will change this in the next version, so both write row vectors. There is a `FutureWarning` when calling the writer to warn of this change; for now we suggest using the ``oned_as='row'`` keyword argument to `scipy.io.savemat` and friends. Faster evaluation of orthogonal polynomials ------------------------------------------- Values of orthogonal polynomials can be evaluated with new vectorized functions in `scipy.special`: `eval_legendre`, `eval_chebyt`, `eval_chebyu`, `eval_chebyc`, `eval_chebys`, `eval_jacobi`, `eval_laguerre`, `eval_genlaguerre`, `eval_hermite`, `eval_hermitenorm`, `eval_gegenbauer`, `eval_sh_legendre`, `eval_sh_chebyt`, `eval_sh_chebyu`, `eval_sh_jacobi`. This is faster than constructing the full coefficient representation of the polynomials, which was previously the only available way. Note that the previous orthogonal polynomial routines will now also invoke this feature, when possible. Lambert W function ------------------ `scipy.special.lambertw` can now be used for evaluating the Lambert W function. 
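The Lambert W function is the inverse of w*exp(w), and its defining equation is easy to check with a few Newton iterations. The sketch below covers real non-negative arguments only; it is an illustration of the definition, not scipy's implementation, which supports complex arguments and other branches:

```python
import math

# Solve w * exp(w) = x for the principal branch (x >= 0) by Newton's
# method on f(w) = w*exp(w) - x, with f'(w) = exp(w)*(w + 1).

def lambertw0(x, tol=1e-12):
    w = math.log(1.0 + x)  # reasonable starting guess for x >= 0
    for _ in range(50):
        ew = math.exp(w)
        step = (w * ew - x) / (ew * (w + 1.0))
        w -= step
        if abs(step) < tol:
            break
    return w

w = lambertw0(1.0)
print(w)                # ~0.567143, the omega constant
print(w * math.exp(w))  # ~1.0, recovering the input
```

Evaluating the inverse relation like this is a handy sanity check for any Lambert W result: w returned for input x should satisfy w*exp(w) == x to rounding error.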
Improved hypergeometric 2F1 function ------------------------------------ Implementation of `scipy.special.hyp2f1` for real parameters was revised. The new version should produce accurate values for all real parameters. More flexible interface for Radial basis function interpolation --------------------------------------------------------------- The `scipy.interpolate.Rbf` class now accepts a callable as input for the "function" argument, in addition to the built-in radial basis functions which can be selected with a string argument. Removed features ================ scipy.stsci: the package was removed The module `scipy.misc.limits` was removed. scipy.io -------- The IO code in both NumPy and SciPy is being extensively reworked. NumPy will be where basic code for reading and writing NumPy arrays is located, while SciPy will house file readers and writers for various data formats (data, audio, video, images, matlab, etc.). Several functions in `scipy.io` are removed in the 0.8.0 release including: `npfile`, `save`, `load`, `create_module`, `create_shelf`, `objload`, `objsave`, `fopen`, `read_array`, `write_array`, `fread`, `fwrite`, `bswap`, `packbits`, `unpackbits`, and `convert_objectarray`. Some of these functions have been replaced by NumPy's raw reading and writing capabilities, memory-mapping capabilities, or array methods. Others have been moved from SciPy to NumPy, since basic array reading and writing capability is now handled by NumPy. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Wed Jul 28 06:36:12 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 28 Jul 2010 18:36:12 +0800 Subject: [SciPy-Dev] empty input to spline from UnivariateSpline Message-ID: Hi, In ticket http://projects.scipy.org/scipy/ticket/1014 there is a request to return an empty array instead of getting an error when an empty array is passed to the spline object generated by UnivariateSpline. 
This is easy to fix, but is it the desired behavior? The submitter wants an empty array, maybe other users would want an exception. Change this or not? Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Wed Jul 28 06:51:02 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 28 Jul 2010 18:51:02 +0800 Subject: [SciPy-Dev] Typo in license file In-Reply-To: References: Message-ID: On Sun, Jul 18, 2010 at 1:49 AM, Benjamin Root wrote: > On Fri, Jul 16, 2010 at 11:52 PM, Ralf Gommers < > ralf.gommers at googlemail.com> wrote: > >> >> >> On Sat, Jul 17, 2010 at 8:42 AM, Joshua Holbrook > > wrote: >> >>> Hey guys, >>> >>> Not a big deal but under clause (c): >>> >>> c. Neither the name of the Enthought nor the names of its contributors >>> >>> should probably be: >>> >>> c. Neither the name of Enthought nor the names of its contributors >>> >>> That is, there's an extra "the." Like I said, not a big deal, but I did >>> notice. >>> >>> I think there's also a bit missing, it should be something like: >> "The names of Enthought, the SciPy Developers or any contributors may not >> be used ..." >> >> Ralf >> >> > I agree. While at first it might seem redundant, the original phrasing can > be interpreted as to mean only the contributors from Enthought. This way, > any contributors to SciPy is covered by the clause. > > So can this be changed? It seems to make things more precise, but since the license is quite important I don't want to change it without an explicit OK from an authoritative voice (steering committee member). Proposed diff: - c. Neither the name of the Enthought nor the names of its contributors - may be used to endorse or promote products derived from this software + c. The names of Enthought, the SciPy Developers or any contributors may + not be used to endorse or promote products derived from this software without specific prior written permission. 
Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Wed Jul 28 06:55:24 2010 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 28 Jul 2010 10:55:24 +0000 (UTC) Subject: [SciPy-Dev] empty input to spline from UnivariateSpline References: Message-ID: Wed, 28 Jul 2010 18:36:12 +0800, Ralf Gommers wrote: > In ticket http://projects.scipy.org/scipy/ticket/1014 there is a request > to return an empty array instead of getting an error when an empty array > is passed to the spline object generated by UnivariateSpline. This is > easy to fix, but is it the desired behavior? The submitter wants an > empty array, maybe other users would want an exception. Change this or > not? x = UnivariateSpline(...) x([]) == np.array([]) is the correct behavior, IMO. The user is asking for the interpolant evaluated at 0 input points -- so the correct result is the interpolant at 0 input points, i.e., an empty array. That's what we do with sin, cos, and other ufuncs, and since the interpolant is also essentially a function, it should behave in the same way. -- Pauli Virtanen From ralf.gommers at googlemail.com Wed Jul 28 09:59:39 2010 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 28 Jul 2010 21:59:39 +0800 Subject: [SciPy-Dev] empty input to spline from UnivariateSpline In-Reply-To: References: Message-ID: On Wed, Jul 28, 2010 at 6:55 PM, Pauli Virtanen wrote: > Wed, 28 Jul 2010 18:36:12 +0800, Ralf Gommers wrote: > > In ticket http://projects.scipy.org/scipy/ticket/1014 there is a request > > to return an empty array instead of getting an error when an empty array > > is passed to the spline object generated by UnivariateSpline. This is > > easy to fix, but is it the desired behavior? The submitter wants an > > empty array, maybe other users would want an exception. Change this or > > not? > > x = UnivariateSpline(...) > x([]) == np.array([]) > > is the correct behavior, IMO. 
The user is asking for the interpolant
> evaluated at 0 input points -- so the correct result is the interpolant
> at 0 input points, i.e., an empty array.
> 
> That's what we do with sin, cos, and other ufuncs, and since the
> interpolant is also essentially a function, it should behave in the same
> way.
> 
Makes sense. Done in r6642.

Cheers,
Ralf
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From d.l.goldsmith at gmail.com  Wed Jul 28 16:46:56 2010
From: d.l.goldsmith at gmail.com (David Goldsmith)
Date: Wed, 28 Jul 2010 13:46:56 -0700
Subject: [SciPy-Dev] I'm getting a "OperationalError: database is locked"
Message-ID: 

when trying to file a new scipy ticket; I am logged in.

DG
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From thomas.robitaille at gmail.com  Wed Jul 28 17:32:28 2010
From: thomas.robitaille at gmail.com (Thomas Robitaille)
Date: Wed, 28 Jul 2010 17:32:28 -0400
Subject: [SciPy-Dev] severe bug in scipy.integrate.quadrature?
Message-ID: 

Hi,

I think there is a mistake in scipy.integrate.quadrature that can
result in false convergence of the integral. The code in question is
the following:

    err = 100.
    val = err
    n = 1
    vfunc = vectorize1(func, args, vec_func=vec_func)
    while (err > tol) and (n < maxiter):
        newval = fixed_quad(vfunc, a, b, (), n)[0]
        err = abs(newval-val)
        val = newval
        n = n + 1

The error inside the loop is the absolute error, so it looks like the
tolerance to specify is the absolute tolerance, not relative. However,
the original err is set to 100, so if I compute an integral that has a
result of 1e20, I might want a tolerance of say 1e15, in which case the
loop will be skipped, because err is not greater than tol. In its
present form, I think that err should be initially set to infinity.
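The skipped-loop failure described above disappears if `err` starts at infinity, since the first comparison can then never be satisfied by a large tolerance. A runnable sketch of the same loop structure with that one change, using simple trapezoid refinement in place of `fixed_quad` so the example is self-contained:

```python
# The convergence loop from quadrature(), with err initialised to
# infinity instead of 100, so no tolerance can skip the loop outright.
# Trapezoid refinement stands in for fixed_quad; the loop structure,
# not the quadrature rule, is the point here.

def converge(func, a, b, tol=1e-8, maxiter=25):
    err = float('inf')  # was 100.0: silently "converged" for tol >= 100
    val = 0.0
    n = 1
    while err > tol and n < maxiter:
        m = 2 ** n
        h = (b - a) / float(m)
        newval = h * (0.5 * (func(a) + func(b)) +
                      sum(func(a + i * h) for i in range(1, m)))
        err = abs(newval - val)
        val = newval
        n += 1
    return val

print(converge(lambda x: x ** 3, 0.0, 1.0))  # close to the exact 0.25
```

With the original `err = 100.` and, say, `tol=1e15`, this loop body would never execute and the initial `val` would be returned unconverged; starting at infinity forces at least one refinement step.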
In addition, I think that the tolerance to specify should be the
relative one, not the absolute, because most of the time, I know I want
my integrals calculated to a precision of say 1e-5, but I don't know in
advance the answer! I think a relative tolerance is what users expect
at any rate.

Can any experts comment on this?

Thanks,

Thomas

From thomas.robitaille at gmail.com  Wed Jul 28 17:49:55 2010
From: thomas.robitaille at gmail.com (Thomas Robitaille)
Date: Wed, 28 Jul 2010 17:49:55 -0400
Subject: [SciPy-Dev] Incorrect results from scipy.integrate.romberg
Message-ID: <57126FB7-DC6F-4953-B625-DA5FCA4A1CDA@gmail.com>

Hi,

I think there is some kind of bug with the romberg integration
function. In the following code, romberg integration gives an incorrect
answer that is too large by a factor of two:

    from scipy.integrate import quad, romberg

    def powerlaw(x):
        return x**3

    print "%12.3e" % quad(powerlaw, 1e1, 1e5)[0]  # gives 2.500e+19
    print "%12.3e" % romberg(powerlaw, 1e1, 1e5)  # gives 5.000e+19

The correct answer is 2.5e19, but romberg gives twice that.

Also, as in my previous email, it seems the tol= argument in romberg()
is the absolute tolerance, not relative. Would it not be a good idea to
be able to specify a relative tolerance?

Cheers,

Thomas

From pav at iki.fi  Wed Jul 28 18:51:00 2010
From: pav at iki.fi (Pauli Virtanen)
Date: Wed, 28 Jul 2010 22:51:00 +0000 (UTC)
Subject: [SciPy-Dev] Incorrect results from scipy.integrate.romberg
References: <57126FB7-DC6F-4953-B625-DA5FCA4A1CDA@gmail.com>
Message-ID: 

Wed, 28 Jul 2010 17:49:55 -0400, Thomas Robitaille wrote:
> Hi,
> 
> I think there is some kind of bug with the romberg integration function.
> In the following code, romberg integration gives an incorrect answer
> that is too large by a factor of two:
>
>     from scipy.integrate import quad, romberg
>
>     def powerlaw(x):
>         return x**3
>
>     print "%12.3e" % quad(powerlaw, 1e1, 1e5)[0]  # gives 2.500e+19
>     print "%12.3e" % romberg(powerlaw, 1e1, 1e5)  # gives 5.000e+19
>
> The correct answer is 2.5e19, but romberg gives twice that.
>
> Also, as in my previous email, it seems the tol= argument in romberg()
> is the absolute tolerance, not relative. Would it not be a good idea to
> be able to specify a relative tolerance?

Yes, fix'd. Also patches (that include tests) are accepted in the future :) -- Pauli Virtanen From pav at iki.fi Wed Jul 28 18:51:04 2010 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 28 Jul 2010 22:51:04 +0000 (UTC) Subject: [SciPy-Dev] severe bug in scipy.integrate.quadrature? References: Message-ID: Wed, 28 Jul 2010 17:32:28 -0400, Thomas Robitaille wrote: [clip: scipy.integrate.quadrature] > The error inside the loop is the absolute error, so it looks like the > tolerance to specify is the absolute tolerance, not relative. However, > the original err is set to 100, so if I compute an integral that has a > result of 1e20, I might want a tolerance of say 1e15, in which case the > loop will be skipped, because err is not greater than tol. In its > present form, I think that err should be initially set to infinity. Yep, it's wrong. Fixed.
-- Pauli Virtanen From d.l.goldsmith at gmail.com Thu Jul 29 01:09:43 2010 From: d.l.goldsmith at gmail.com (David Goldsmith) Date: Wed, 28 Jul 2010 22:09:43 -0700 Subject: [SciPy-Dev] Good-bye, sort of Message-ID: July 31 is my last day as paid NumPy/SciPy docs editor/marathon coordinator; I'm moving on to a programming data analyst position for the Marine Waters Data Management System Project of the Environmental Assessment Program in the Washington State Department of Ecology; unfortunately, they need me to program in MATLAB, but I plan on continuing to use NumPy, etc. for my fractals, etc. I also intend to continue to contribute to this community as a volunteer as my time permits, but you probably will be seeing less of me on the lists. Also, I won't be hosting a Marathon Skypecon this Friday (but of course others are free to organize one if they want to.) Thanks for all your help and support, David Goldsmith Olympia, WA -------------- next part -------------- An HTML attachment was scrubbed... URL: From millman at berkeley.edu Thu Jul 29 02:56:13 2010 From: millman at berkeley.edu (Jarrod Millman) Date: Wed, 28 Jul 2010 23:56:13 -0700 Subject: [SciPy-Dev] Good-bye, sort of In-Reply-To: References: Message-ID: Hello David, Good luck with your new position! Thanks for your work on the NumPy/SciPy documentation. Best, Jarrod From nwagner at iam.uni-stuttgart.de Thu Jul 29 06:02:00 2010 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Thu, 29 Jul 2010 12:02:00 +0200 Subject: [SciPy-Dev] numpy svn broken NPY_COPY_PYOBJECT_PTR Message-ID: Hi all, I have installed the most recent svn version of numpy. 
However if I try to import numpy it fails with

    >>> import numpy
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/__init__.py", line 136, in <module>
        import add_newdocs
      File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/add_newdocs.py", line 9, in <module>
        from numpy.lib import add_newdoc
      File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/lib/__init__.py", line 4, in <module>
        from type_check import *
      File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/lib/type_check.py", line 8, in <module>
        import numpy.core.numeric as _nx
      File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/core/__init__.py", line 5, in <module>
        import multiarray
    ImportError: /data/home/nwagner/local/lib/python2.5/site-packages/numpy/core/multiarray.so: undefined symbol: NPY_COPY_PYOBJECT_PTR

I cannot install the most recent svn version of scipy

    python setup.py install --prefix=$HOME/local
    Traceback (most recent call last):
      File "setup.py", line 160, in <module>
        setup_package()
      File "setup.py", line 127, in setup_package
        from numpy.distutils.core import setup
      File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/__init__.py", line 136, in <module>
        import add_newdocs
      File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/add_newdocs.py", line 9, in <module>
        from numpy.lib import add_newdoc
      File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/lib/__init__.py", line 4, in <module>
        from type_check import *
      File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/lib/type_check.py", line 8, in <module>
        import numpy.core.numeric as _nx
      File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/core/__init__.py", line 5, in <module>
        import multiarray
    ImportError: /data/home/nwagner/local/lib/python2.5/site-packages/numpy/core/multiarray.so: undefined symbol: NPY_COPY_PYOBJECT_PTR

Any idea ?
Nils From enzomich at gmail.com Thu Jul 29 07:47:05 2010 From: enzomich at gmail.com (Enzo Michelangeli) Date: Thu, 29 Jul 2010 19:47:05 +0800 Subject: [SciPy-Dev] Proposed enhancement: pure Python implementation of LP simplex method (two-phase) Message-ID: <005F84F1B0F84701849CAA1AC58081DA@EMLT> I just submitted the following enhancement ticket: http://projects.scipy.org/scipy/ticket/1252 Enzo From scott.sinclair.za at gmail.com Thu Jul 29 08:13:01 2010 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Thu, 29 Jul 2010 14:13:01 +0200 Subject: [SciPy-Dev] numpy svn broken NPY_COPY_PYOBJECT_PTR In-Reply-To: References: Message-ID:

> On 29 July 2010 12:02, Nils Wagner wrote:
> I have installed the most recent svn version of numpy.
> However if I try to import numpy it fails with
>
>>>> import numpy
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/__init__.py", line 136, in <module>
>     import add_newdocs
>   File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/add_newdocs.py", line 9, in <module>
>     from numpy.lib import add_newdoc
>   File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/lib/__init__.py", line 4, in <module>
>     from type_check import *
>   File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/lib/type_check.py", line 8, in <module>
>     import numpy.core.numeric as _nx
>   File "/data/home/nwagner/local/lib/python2.5/site-packages/numpy/core/__init__.py", line 5, in <module>
>     import multiarray
> ImportError: /data/home/nwagner/local/lib/python2.5/site-packages/numpy/core/multiarray.so: undefined symbol: NPY_COPY_PYOBJECT_PTR

I see this too (since r8541) on 64 bit Ubuntu. I see that the change has also gone into the 1.4.x and 1.5.x branches (r8542 & r8543), I guess that those commits should be reverted until trunk is fixed.

> I cannot install the most recent svn version of scipy

Scipy doesn't build because numpy can't be imported..
Cheers, Scott From aisaac at american.edu Thu Jul 29 08:18:08 2010 From: aisaac at american.edu (Alan G Isaac) Date: Thu, 29 Jul 2010 08:18:08 -0400 Subject: [SciPy-Dev] Proposed enhancement: pure Python implementation of LP simplex method (two-phase) In-Reply-To: <005F84F1B0F84701849CAA1AC58081DA@EMLT> References: <005F84F1B0F84701849CAA1AC58081DA@EMLT> Message-ID: <4C517180.90209@american.edu> On 7/29/2010 7:47 AM, Enzo Michelangeli wrote: > I just submitted the following enhancement ticket: > > http://projects.scipy.org/scipy/ticket/1252 I think your "test case" could easily be turned into a unit test, which would make the submission more complete. Cheers, Alan Isaac From cournape at gmail.com Thu Jul 29 10:10:50 2010 From: cournape at gmail.com (David Cournapeau) Date: Thu, 29 Jul 2010 23:10:50 +0900 Subject: [SciPy-Dev] numpy svn broken NPY_COPY_PYOBJECT_PTR In-Reply-To: References: Message-ID: On Thu, Jul 29, 2010 at 9:13 PM, Scott Sinclair wrote: > > I see this too (since r8541) on 64 bit Ubuntu. I see that the change > has also gone into the 1.4.x and 1.5.x branches (r8542 & r8543), I > guess that those commits should be reverted until trunk is fixed. Sorry for the trouble, I did not expect the change could have broken anything if it had built. I should have run the test suite. Anyway, it should be fixed as we speak, David From scott.sinclair.za at gmail.com Thu Jul 29 10:27:05 2010 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Thu, 29 Jul 2010 16:27:05 +0200 Subject: [SciPy-Dev] numpy svn broken NPY_COPY_PYOBJECT_PTR In-Reply-To: References: Message-ID: >On 29 July 2010 16:10, David Cournapeau wrote: > On Thu, Jul 29, 2010 at 9:13 PM, Scott Sinclair > wrote: >> >> I see this too (since r8541) on 64 bit Ubuntu. I see that the change >> has also gone into the 1.4.x and 1.5.x branches (r8542 & r8543), I >> guess that those commits should be reverted until trunk is fixed. 
> > Sorry for the trouble, I did not expect the change could have broken > anything if it had built. I should have run the test suite. No trouble at all. > Anyway, it should be fixed as we speak, Thanks! Works for me now. Cheers, Scott From ben.root at ou.edu Thu Jul 29 10:30:06 2010 From: ben.root at ou.edu (Benjamin Root) Date: Thu, 29 Jul 2010 09:30:06 -0500 Subject: [SciPy-Dev] Good-bye, sort of In-Reply-To: References: Message-ID: On Thu, Jul 29, 2010 at 12:09 AM, David Goldsmith wrote: > July 31 is my last day as paid NumPy/SciPy docs editor/marathon > coordinator; I'm moving on to a programming data analyst position for the Marine > Waters Data Management System Project of the Environmental Assessment > Program in the Washington State Department of Ecology; unfortunately, they > need me to program in MATLAB, but I plan on continuing to use NumPy, etc. > for my fractals, etc. I also intend to continue to contribute to this > community as a volunteer as my time permits, but you probably will be seeing > less of me on the lists. Also, I won't be hosting a Marathon Skypecon this > Friday (but of course others are free to organize one if they want to.) > > Thanks for all your help and support, > > David Goldsmith > Olympia, WA > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > David, Thank you for all your efforts! Ben Root -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From charlesr.harris at gmail.com Thu Jul 29 10:35:12 2010 From: charlesr.harris at gmail.com (Charles R Harris) Date: Thu, 29 Jul 2010 08:35:12 -0600 Subject: [SciPy-Dev] Good-bye, sort of In-Reply-To: References: Message-ID: On Wed, Jul 28, 2010 at 11:09 PM, David Goldsmith wrote: > July 31 is my last day as paid NumPy/SciPy docs editor/marathon > coordinator; I'm moving on to a programming data analyst position for the Marine > Waters Data Management System Project of the Environmental Assessment > Program in the Washington State Department of Ecology; unfortunately, they > need me to program in MATLAB, but I plan on continuing to use NumPy, etc. > for my fractals, etc. I also intend to continue to contribute to this > community as a volunteer as my time permits, but you probably will be seeing > less of me on the lists. Also, I won't be hosting a Marathon Skypecon this > Friday (but of course others are free to organize one if they want to.) > > Thanks for all your help and support, > > Enjoy your new job and thanks for all the work you have done on the documentation. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From oliphant at enthought.com Thu Jul 29 10:49:10 2010 From: oliphant at enthought.com (Travis Oliphant) Date: Thu, 29 Jul 2010 09:49:10 -0500 Subject: [SciPy-Dev] Good-bye, sort of In-Reply-To: References: Message-ID: <6007DB57-EC49-4C76-8C6F-55FB4DD56BDE@enthought.com> Thanks for all your contributions to improving SciPy documentation. It has been a valuable thing for all of us in the community. Best regards, Travis -- (mobile phone of) Travis Oliphant Enthought, Inc. 
1-512-536-1057 http://www.enthought.com On Jul 29, 2010, at 12:09 AM, David Goldsmith wrote: > July 31 is my last day as paid NumPy/SciPy docs editor/marathon coordinator; I'm moving on to a programming data analyst position for the Marine Waters Data Management System Project of the Environmental Assessment Program in the Washington State Department of Ecology; unfortunately, they need me to program in MATLAB, but I plan on continuing to use NumPy, etc. for my fractals, etc. I also intend to continue to contribute to this community as a volunteer as my time permits, but you probably will be seeing less of me on the lists. Also, I won't be hosting a Marathon Skypecon this Friday (but of course others are free to organize one if they want to.) > > Thanks for all your help and support, > > David Goldsmith > Olympia, WA > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev -------------- next part -------------- An HTML attachment was scrubbed... URL: From vincent at vincentdavis.net Thu Jul 29 10:59:47 2010 From: vincent at vincentdavis.net (Vincent Davis) Date: Thu, 29 Jul 2010 08:59:47 -0600 Subject: [SciPy-Dev] Good-bye, sort of In-Reply-To: References: Message-ID: Thanks for you efforts and I guess thanks for getting me involved with pydocweb :-) Vincent On Wed, Jul 28, 2010 at 11:09 PM, David Goldsmith wrote: > July 31 is my last day as paid NumPy/SciPy docs editor/marathon coordinator; > I'm moving on to a programming data analyst position for the Marine Waters > Data Management System Project of the Environmental Assessment Program in > the Washington State Department of Ecology; unfortunately, they need me to > program in MATLAB, but I plan on continuing to use NumPy, etc. for my > fractals, etc.? I also intend to continue to contribute to this community as > a volunteer as my time permits, but you probably will be seeing less of me > on the lists.? 
Also, I won't be hosting a Marathon Skypecon this Friday (but > of course others are free to organize one if they want to.) > > Thanks for all your help and support, > > David Goldsmith > Olympia, WA > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From gael.varoquaux at normalesup.org Thu Jul 29 11:12:36 2010 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Thu, 29 Jul 2010 17:12:36 +0200 Subject: [SciPy-Dev] Good-bye, sort of In-Reply-To: References: Message-ID: <20100729151236.GI26793@phare.normalesup.org> Yes, David, I would like to join with the others to thank you. But you need no thank from us, I believe, to be proud, just looking at the progress in the documentation should be enough. Ga?l On Thu, Jul 29, 2010 at 08:59:47AM -0600, Vincent Davis wrote: > Thanks for you efforts and I guess thanks for getting me involved with > pydocweb :-) > Vincent > On Wed, Jul 28, 2010 at 11:09 PM, David Goldsmith > wrote: > > July 31 is my last day as paid NumPy/SciPy docs editor/marathon coordinator; > > I'm moving on to a programming data analyst position for the Marine Waters > > Data Management System Project of the Environmental Assessment Program in > > the Washington State Department of Ecology; unfortunately, they need me to > > program in MATLAB, but I plan on continuing to use NumPy, etc. for my > > fractals, etc.? I also intend to continue to contribute to this community as > > a volunteer as my time permits, but you probably will be seeing less of me > > on the lists.? Also, I won't be hosting a Marathon Skypecon this Friday (but > > of course others are free to organize one if they want to.) 
> > Thanks for all your help and support, > > David Goldsmith > > Olympia, WA From matthieu.brucher at gmail.com Thu Jul 29 11:51:28 2010 From: matthieu.brucher at gmail.com (Matthieu Brucher) Date: Thu, 29 Jul 2010 17:51:28 +0200 Subject: [SciPy-Dev] Good-bye, sort of In-Reply-To: <20100729151236.GI26793@phare.normalesup.org> References: <20100729151236.GI26793@phare.normalesup.org> Message-ID: Indeed, the current documentation and its quality are an excellent testimony to all users. Many thanks to you. Matthieu 2010/7/29 Gael Varoquaux : > Yes, David, I would like to join with the others to thank you. But you > need no thank from us, I believe, to be proud, just looking at the > progress in the documentation should be enough. > > Ga?l > > On Thu, Jul 29, 2010 at 08:59:47AM -0600, Vincent Davis wrote: >> Thanks for you efforts and I guess thanks for getting me involved with >> pydocweb :-) >> Vincent > >> On Wed, Jul 28, 2010 at 11:09 PM, David Goldsmith >> wrote: >> > July 31 is my last day as paid NumPy/SciPy docs editor/marathon coordinator; >> > I'm moving on to a programming data analyst position for the Marine Waters >> > Data Management System Project of the Environmental Assessment Program in >> > the Washington State Department of Ecology; unfortunately, they need me to >> > program in MATLAB, but I plan on continuing to use NumPy, etc. for my >> > fractals, etc.? I also intend to continue to contribute to this community as >> > a volunteer as my time permits, but you probably will be seeing less of me >> > on the lists.? Also, I won't be hosting a Marathon Skypecon this Friday (but >> > of course others are free to organize one if they want to.) > >> > Thanks for all your help and support, > >> > David Goldsmith >> > Olympia, WA > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- Information System Engineer, Ph.D. 
Blog: http://matt.eifelle.com LinkedIn: http://www.linkedin.com/in/matthieubrucher From ariver at enthought.com Thu Jul 29 13:42:12 2010 From: ariver at enthought.com (Aaron River) Date: Thu, 29 Jul 2010 12:42:12 -0500 Subject: [SciPy-Dev] I'm getting a "OperationalError: database is locked" In-Reply-To: References: Message-ID: David and I worked this out off-list. If any one else is experiencing any problems, please let me know. On Wed, Jul 28, 2010 at 15:46, David Goldsmith wrote: > when trying to file a new scipy ticket; I am logged in. > > DG > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From enzomich at gmail.com Thu Jul 29 21:48:49 2010 From: enzomich at gmail.com (Enzo Michelangeli) Date: Fri, 30 Jul 2010 09:48:49 +0800 Subject: [SciPy-Dev] Proposed enhancement: pure Python implementation of LP simplex method (two-phase) References: <005F84F1B0F84701849CAA1AC58081DA@EMLT> <4C517180.90209@american.edu> Message-ID: <6C847BBFD7CC40B28AB118EE8CEFDAF9@EMLT> From: "Alan G Isaac" Sent: Thursday, July 29, 2010 8:18 PM > On 7/29/2010 7:47 AM, Enzo Michelangeli wrote: >> I just submitted the following enhancement ticket: >> >> http://projects.scipy.org/scipy/ticket/1252 > > > I think your "test case" could easily be turned > into a unit test, which would make the submission > more complete. Done. 
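The kind of unit test Alan suggests might look like the sketch below. The lp() from the ticket is not reproduced here, so a tiny brute-force stand-in (enumerating the basic feasible solutions of a single equality constraint) takes its place; only the test structure is the point, and all names and numbers are illustrative:

```python
import unittest

def lp_bruteforce(c, A, b):
    # Stand-in for the proposed lp(): minimize c.x subject to A.x = b,
    # x >= 0, by enumerating basic feasible solutions.  Handles only a
    # single equality constraint -- just enough to exercise the test.
    (row,), (rhs,) = A, b
    best_val, best_x = None, None
    for i, a_i in enumerate(row):
        if abs(a_i) < 1e-12:
            continue
        xi = rhs / a_i
        if xi < 0:
            continue
        x = [0.0] * len(row)
        x[i] = xi
        val = sum(ci * xj for ci, xj in zip(c, x))
        if best_val is None or val < best_val:
            best_val, best_x = val, x
    return best_x, best_val

class TestLP(unittest.TestCase):
    def test_small_problem(self):
        # minimize -x1 - 2*x2  subject to  x1 + x2 + s = 4, all vars >= 0;
        # the optimum puts everything on x2, giving objective value -8.
        c = [-1.0, -2.0, 0.0]
        A = [[1.0, 1.0, 1.0]]
        b = [4.0]
        x, zmin = lp_bruteforce(c, A, b)
        self.assertAlmostEqual(zmin, -8.0)
        self.assertEqual(x, [0.0, 4.0, 0.0])
```

Swapping lp_bruteforce for the real lp() (and asserting on its richer return value) turns this into the regression test the ticket needs.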
Enzo From pav at iki.fi Fri Jul 30 04:49:31 2010 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 30 Jul 2010 08:49:31 +0000 (UTC) Subject: [SciPy-Dev] Proposed enhancement: pure Python implementation of LP simplex method (two-phase) References: <005F84F1B0F84701849CAA1AC58081DA@EMLT> <4C517180.90209@american.edu> <6C847BBFD7CC40B28AB118EE8CEFDAF9@EMLT> Message-ID: Fri, 30 Jul 2010 09:48:49 +0800, Enzo Michelangeli wrote: [clip] >>> http://projects.scipy.org/scipy/ticket/1252 >> >> I think your "test case" could easily be turned into a unit test, which >> would make the submission more complete. > > Done. Seems to work. Some API nitpicks, 1) Return a solution object rather than a tuple: class NamedTuple(tuple): def __new__(cls, values, names): self = tuple.__new__(cls, values) for value, name in zip(values, names): setattr(self, name, value) return self class Solution(NamedTuple): _fields = [] def __new__(cls, *a, **kw): values = list(a) for name in cls._fields[len(values):]: values.append(kw.pop(name)) if len(values) != len(cls._fields) or kw: raise ValueError("Invalid arguments") return NamedTuple.__new__(cls, values, cls._fields) def __repr__(self): return "%s%s" % (self.__class__.__name__, NamedTuple.__repr__(self)) class LPSolution(Solution): """ Solution to a linear programming problem Attributes ---------- x The optimal solution min The optimal value is_bounded True if the solution is bounded; False if unbounded solvable True if the problem is solvable; False if unsolvable basis Indices of the basis of the solution. """ _fields = ['x', 'min', 'is_bounded', 'solvable', 'basis'] def lp(...): ... return LPSolution(optx, zmin, is_bounded, sol, basis) We could (and probably should) replace *all* cases in Scipy where a tuple is currently returned by this sort of pattern. It's backwards compatible, since the returned object is still some sort of a tuple. 2) Don't print to stdout. Use warnings.warn if you need to warn about something. 
3) Call c = np.asarray(c) A = np.asarray(A) b = np.asarray(b) in the beginning -- good for convenience. -- Pauli Virtanen From enzomich at gmail.com Fri Jul 30 06:54:40 2010 From: enzomich at gmail.com (Enzo Michelangeli) Date: Fri, 30 Jul 2010 18:54:40 +0800 Subject: [SciPy-Dev] Proposed enhancement: pure Python implementationof LP simplex method (two-phase) References: <005F84F1B0F84701849CAA1AC58081DA@EMLT><4C517180.90209@american.edu> <6C847BBFD7CC40B28AB118EE8CEFDAF9@EMLT> Message-ID: <4CCA2CE2BD7240F7BF84B883643756C3@EMLT> Pauli, Thanks for uploading lp2.py: I hadn't noticed it and I uploaded a revised lp.py with the changes you required except for the object stuff, on which I had a question (namely: should the NamedTuple be based on http://docs.python.org/library/collections.html#collections.namedtuple ?). Anyway, feel free to remove lp.py . Regarding the warning, you may safely take it away altogether: the "print" was a diagnostic leftover that I had overlooked (the "unsolvable" status is anyway returned to the caller). Enzo From: "Pauli Virtanen" Sent: Friday, July 30, 2010 4:49 PM > Fri, 30 Jul 2010 09:48:49 +0800, Enzo Michelangeli wrote: > [clip] >>>> http://projects.scipy.org/scipy/ticket/1252 >>> >>> I think your "test case" could easily be turned into a unit test, which >>> would make the submission more complete. >> >> Done. > > Seems to work. 
> > Some API nitpicks, > > 1) > > Return a solution object rather than a tuple: > > class NamedTuple(tuple): > def __new__(cls, values, names): > self = tuple.__new__(cls, values) > for value, name in zip(values, names): > setattr(self, name, value) > return self > > class Solution(NamedTuple): > _fields = [] > def __new__(cls, *a, **kw): > values = list(a) > for name in cls._fields[len(values):]: > values.append(kw.pop(name)) > if len(values) != len(cls._fields) or kw: > raise ValueError("Invalid arguments") > return NamedTuple.__new__(cls, values, cls._fields) > > def __repr__(self): > return "%s%s" % (self.__class__.__name__, > NamedTuple.__repr__(self)) > > class LPSolution(Solution): > """ > Solution to a linear programming problem > > Attributes > ---------- > x > The optimal solution > min > The optimal value > is_bounded > True if the solution is bounded; False if unbounded > solvable > True if the problem is solvable; False if unsolvable > basis > Indices of the basis of the solution. > """ > _fields = ['x', 'min', 'is_bounded', 'solvable', 'basis'] > > def lp(...): > ... > return LPSolution(optx, zmin, is_bounded, sol, basis) > > > We could (and probably should) replace *all* cases in Scipy where a tuple > is currently returned by this sort of pattern. It's backwards compatible, > since the returned object is still some sort of a tuple. > > 2) > > Don't print to stdout. > > Use warnings.warn if you need to warn about something. > > 3) > > Call > > c = np.asarray(c) > A = np.asarray(A) > b = np.asarray(b) > > in the beginning -- good for convenience. 
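For reference, the pattern quoted above runs as-is; here it is again with a small usage example appended (the numeric values are invented, they are not from the ticket):

```python
class NamedTuple(tuple):
    # A tuple whose elements are also available as named attributes.
    def __new__(cls, values, names):
        self = tuple.__new__(cls, values)
        for value, name in zip(values, names):
            setattr(self, name, value)
        return self

class Solution(NamedTuple):
    _fields = []
    def __new__(cls, *a, **kw):
        values = list(a)
        for name in cls._fields[len(values):]:
            values.append(kw.pop(name))
        if len(values) != len(cls._fields) or kw:
            raise ValueError("Invalid arguments")
        return NamedTuple.__new__(cls, values, cls._fields)

    def __repr__(self):
        return "%s%s" % (self.__class__.__name__, NamedTuple.__repr__(self))

class LPSolution(Solution):
    _fields = ['x', 'min', 'is_bounded', 'solvable', 'basis']

# Attribute access and plain tuple behaviour coexist, so existing callers
# that unpack the return value keep working (values here are made up):
sol = LPSolution([0.0, 4.0, 0.0], min=-8.0, is_bounded=True,
                 solvable=True, basis=[1])
optx, zmin, bounded, ok, basis = sol   # old tuple-style unpacking
```

Because LPSolution is still a tuple, returning it from lp() stays backwards compatible, exactly as noted above.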
From pav at iki.fi Fri Jul 30 07:23:50 2010 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 30 Jul 2010 11:23:50 +0000 (UTC) Subject: [SciPy-Dev] Proposed enhancement: pure Python implementationof LP simplex method (two-phase) References: <005F84F1B0F84701849CAA1AC58081DA@EMLT> <4C517180.90209@american.edu> <6C847BBFD7CC40B28AB118EE8CEFDAF9@EMLT> <4CCA2CE2BD7240F7BF84B883643756C3@EMLT> Message-ID: Fri, 30 Jul 2010 18:54:40 +0800, Enzo Michelangeli wrote: > Thanks for uploading lp2.py: I hadn't noticed it and I uploaded a > revised lp.py with the changes you required except for the object stuff, > on which I had a question (namely: should the NamedTuple be based on > http://docs.python.org/library/collections.html#collections.namedtuple > ?). Anyway, feel free to remove lp.py . We need to support also Python 2.4 and 2.5 which do not have collections.namedtuple, so we cannot use the pre-cooked one. *** Another thing: you have a magic number 1e-10 in there: 1) Should this tolerance be user-configurable? 2) Should it be a relative tolerance instead of an absolute one? I.e., is the quantity H it is compared with always of order 1? Cheers, Pauli From bsouthey at gmail.com Fri Jul 30 09:49:31 2010 From: bsouthey at gmail.com (Bruce Southey) Date: Fri, 30 Jul 2010 08:49:31 -0500 Subject: [SciPy-Dev] Proposed enhancement: pure Python implementation of LP simplex method (two-phase) In-Reply-To: References: <005F84F1B0F84701849CAA1AC58081DA@EMLT> <4C517180.90209@american.edu> <6C847BBFD7CC40B28AB118EE8CEFDAF9@EMLT> Message-ID: <4C52D86B.7050005@gmail.com> On 07/30/2010 03:49 AM, Pauli Virtanen wrote: > Fri, 30 Jul 2010 09:48:49 +0800, Enzo Michelangeli wrote: > [clip] >>>> http://projects.scipy.org/scipy/ticket/1252 >>> I think your "test case" could easily be turned into a unit test, which >>> would make the submission more complete. >> Done. > Seems to work. 
> > Some API nitpicks, > > 1) > > Return a solution object rather than a tuple: > > class NamedTuple(tuple): > def __new__(cls, values, names): > self = tuple.__new__(cls, values) > for value, name in zip(values, names): > setattr(self, name, value) > return self > > class Solution(NamedTuple): > _fields = [] > def __new__(cls, *a, **kw): > values = list(a) > for name in cls._fields[len(values):]: > values.append(kw.pop(name)) > if len(values) != len(cls._fields) or kw: > raise ValueError("Invalid arguments") > return NamedTuple.__new__(cls, values, cls._fields) > > def __repr__(self): > return "%s%s" % (self.__class__.__name__, > NamedTuple.__repr__(self)) > > class LPSolution(Solution): > """ > Solution to a linear programming problem > > Attributes > ---------- > x > The optimal solution > min > The optimal value > is_bounded > True if the solution is bounded; False if unbounded > solvable > True if the problem is solvable; False if unsolvable > basis > Indices of the basis of the solution. > """ > _fields = ['x', 'min', 'is_bounded', 'solvable', 'basis'] > > def lp(...): > ... > return LPSolution(optx, zmin, is_bounded, sol, basis) > > > We could (and probably should) replace *all* cases in Scipy where a tuple > is currently returned by this sort of pattern. It's backwards compatible, > since the returned object is still some sort of a tuple. > > > 2) > > Don't print to stdout. > > Use warnings.warn if you need to warn about something. > > > 3) > > Call > > c = np.asarray(c) > A = np.asarray(A) > b = np.asarray(b) > > in the beginning -- good for convenience. > Really this is lacking major documentation that needs to be addressed as much as possible. It would be really really nice if there was documentation as per the Scipy doc marathon. In particular there is no description of 'A', 'b' and 'c' as these could be any type of 'numpy array' and of different shapes. Just a couple of other suggestions as I have only glanced at the code. 
This is necessary to check the input shapes because it is not very obvious what the shapes are without reading the code. For example 'A' must be a 2-d array and apparently 'b' and 'c' must be 1d arrays. So I do think that you need to provide some checks that 'A', 'b' and 'c' are of the correct shape - at least to inform the user what is expected when an error arises. This also relates to documentation. Also np.asarray does not respect other ndarray subclasses so you probably want to test and reject things like masked arrays and Matrix inputs but allow other array-like inputs. (It is my little hope that numpy/scipy functions become aware of the different arrays and at least warn the user when an array is not what is expected.) Bruce From cournape at gmail.com Sat Jul 31 13:41:11 2010 From: cournape at gmail.com (David Cournapeau) Date: Sun, 1 Aug 2010 02:41:11 +0900 Subject: [SciPy-Dev] Would it be acceptable to give up subpackage build ? Message-ID: Hi there, I have made enough progress on bento recently to start tackling the meat of the issue: building numpy and scipy with it. Numpy goes nicely, but scipy causes issues because of the following constraint: in scipy, each package with a setup.py is a self-contained package, i.e. you can just do python setup.py build in scipy/lib/blas, and it will run. Is it acceptable to give up on this in the long run ? I don't see many advantages in that feature, and giving it up has several nice advantages: - one can have a global configure stage (i.e.
check blas/lapack only once instead of many times as done currently, although the way numpy.distutils has near 0 cost since it only stat files) - I would expect the build to be faster and more accurate - it makes my life easier for bento Note that this has no incidence on the existing distutils-based infrastructure, cheers, David From robert.kern at gmail.com Sat Jul 31 14:40:17 2010 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 31 Jul 2010 13:40:17 -0500 Subject: [SciPy-Dev] Would it be acceptable to give up subpackage build ? In-Reply-To: References: Message-ID: On Sat, Jul 31, 2010 at 12:41, David Cournapeau wrote: > Hi there, > > I have made enough progress on bento recently to start tackling the > meat of the issue: building numpy and scipy with it. Numpy goes > nicely, but scipy causes issues because of the following constraing: > in scipy, each package with a setup.py is a self-contained package, > i.e. you can just do python setup.py build in scipy/lib/blas, and it > will run. Is it acceptable to give up on this in the long run ? Yes. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." ? -- Umberto Eco From oliphant at enthought.com Sat Jul 31 23:33:45 2010 From: oliphant at enthought.com (Travis Oliphant) Date: Sat, 31 Jul 2010 21:33:45 -0600 Subject: [SciPy-Dev] Would it be acceptable to give up subpackage build ? In-Reply-To: References: Message-ID: <977CB113-B1A1-4E67-B46A-FA037499D1EB@enthought.com> I think this is a good move as well. Travis -- (mobile phone of) Travis Oliphant Enthought, Inc. 1-512-536-1057 http://www.enthought.com On Jul 31, 2010, at 11:41 AM, David Cournapeau wrote: > Hi there, > > I have made enough progress on bento recently to start tackling the > meat of the issue: building numpy and scipy with it. 
Numpy goes > nicely, but scipy causes issues because of the following constraing: > in scipy, each package with a setup.py is a self-contained package, > i.e. you can just do python setup.py build in scipy/lib/blas, and it > will run. Is it acceptable to give up on this in the long run ? > > I don't see much advantages in that feature, and giving it up has > several nice advantages: > - one can have a global configure stage (i.e. check blas/lapack only > once instead of many times as done currently, although the way > numpy.distutils has near 0 cost since it only stat files) > - I would expect the build to be faster and more accurate > - it makes my life easier for bento > > Note that this has no incidence on the existing distutils-based infrastructure, > > cheers, > > David > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev