From fperez.net at gmail.com Wed Jul 1 03:40:21 2009 From: fperez.net at gmail.com (Fernando Perez) Date: Wed, 1 Jul 2009 00:40:21 -0700 Subject: [SciPy-dev] Tutorial topics for SciPy'09 Conference In-Reply-To: References: Message-ID: Hi, On Mon, Jun 1, 2009 at 10:20 PM, Fernando Perez wrote: > The time for the Scipy'09 conference is rapidly approaching, and we > would like to both announce the plan for tutorials and solicit > feedback from everyone on topics of interest. Rather than rehash much here, where it's not easy to paste a table, I've posted a note with the poll results here: http://fdoperez.blogspot.com/2009/06/scipy-advanced-tutorials-results.html The short and plain-text-friendly version is the final topic ranking: 1 Advanced topics in matplotlib use 2 Advanced numpy 3 Designing scientific interfaces with Traits 4 Mayavi/TVTK 5 Cython 6 Symbolic computing with sympy 7 Statistics with Scipy 8 Using GPUs with PyCUDA 9 Testing strategies for scientific codes 10 Parallel computing in Python and mpi4py 11 Sparse Linear Algebra with Scipy 12 Structured and record arrays in numpy 13 Design patterns for efficient iterator-based scientific codes 14 Sage 15 The TimeSeries scikit 16 Hermes: high order Finite Element Methods 17 Graph theory with NetworkX We're currently contacting speakers, and we'll let you know once a final list is made with confirmed speakers. Cheers, f From d_l_goldsmith at yahoo.com Wed Jul 1 16:29:57 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Wed, 1 Jul 2009 13:29:57 -0700 (PDT) Subject: [SciPy-dev] Summer Marathon "Category of the Week" Message-ID: <700012.35219.qm@web52103.mail.re2.yahoo.com> Linear Algebra! (My favorite! :-)) Let's get it to pink or better by next Wed! DG From stefan at sun.ac.za Thu Jul 2 06:20:12 2009 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 2 Jul 2009 12:20:12 +0200 Subject: [SciPy-dev] Summer Marathon "Category of the Week" In-Reply-To: <700012.35219.qm@web52103.mail.re2.yahoo.com> References: <700012.35219.qm@web52103.mail.re2.yahoo.com> Message-ID: <9457e7c80907020320o7a149968q43bf89f1ee993eb4@mail.gmail.com> Hi marathon runners, Well done on the progress made last week! 2009/7/1 David Goldsmith : > Linear Algebra! (My favorite! :-)) Let's get it to pink or better by next Wed! Excellent! The linalg module is shown here: http://docs.scipy.org/numpy/docs/numpy.linalg/ You can choose from: cholesky cond det eig eigh eigvals eigvalsh inv lstsq matrix_power norm inv qr solve svd tensorinv tensorsolve These mostly have decent docstrings already, so I'd suggest focusing on making the following more accessible to your typical linear algebra student: http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.eig/ [1] http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.qr/ [2] http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.norm/ [3] http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.svd/ [4] http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.tensorinv/ [5] Have fun! Stéfan [1] In the notes section, describe right vs left eigenvectors. Personally, I think we can drop the reference to the characteristic polynomial (we don't solve it that way in any case). [2] Needs formatting and a link to an explanation of QR. [3] Describe what Frobenius means, or give a link to an appropriate reference. [4] Note Pauli's comment: indicate clearly that the decomposition A = U S V.T is returned (V.H for complex matrices). [5] Badly in need of some examples. 
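[Editor's note, not part of the original thread: since notes [1], [4] and [5] above ask for examples, here is a minimal sketch of the eig and svd conventions being discussed. The 2x2 matrix is arbitrary and chosen only for illustration; only numpy.linalg functions already named in the email are used.]

import numpy as np

A = np.array([[1., 2.], [3., 4.]])

# [1] eig returns *right* eigenvectors: column v[:, i] satisfies A v = w[i] v.
w, v = np.linalg.eig(A)
for i in range(len(w)):
    assert np.allclose(np.dot(A, v[:, i]), w[i] * v[:, i])

# [4] svd returns (U, s, Vh) with A = U * diag(s) * Vh, i.e. Vh is already
# the transpose of V (the conjugate transpose V.H for complex input).
U, s, Vh = np.linalg.svd(A)
assert np.allclose(A, np.dot(U, np.dot(np.diag(s), Vh)))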
From nwagner at iam.uni-stuttgart.de Thu Jul 2 07:11:16 2009 From: nwagner at iam.uni-stuttgart.de (Nils Wagner) Date: Thu, 02 Jul 2009 13:11:16 +0200 Subject: [SciPy-dev] Summer Marathon "Category of the Week" In-Reply-To: <9457e7c80907020320o7a149968q43bf89f1ee993eb4@mail.gmail.com> References: <700012.35219.qm@web52103.mail.re2.yahoo.com> <9457e7c80907020320o7a149968q43bf89f1ee993eb4@mail.gmail.com> Message-ID: On Thu, 2 Jul 2009 12:20:12 +0200 St?fan van der Walt wrote: > Hi marathon runners, > > Well done on the progress made last week! > > 2009/7/1 David Goldsmith : >> Linear Algebra! (My favorite! :-)) Let's get it to pink >>or better by next Wed! > > Excellent! The linalg module is shown here: > > http://docs.scipy.org/numpy/docs/numpy.linalg/ > > You can choose from: > > cholesky cond det eig eigh eigvals eigvalsh > inv lstsq matrix_power norm inv qr solve svd tensorinv >tensorsolve > > These mostly have decent docstrings already, so I'd >suggest focusing > on making the following more accessible to your typical >linear algebra > student: > > http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.eig/ >[1] > http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.qr/ >[2] > http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.norm/ >[3] > http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.svd/ >[4] > http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.tensorinv/ >[5] > > Have fun! > St?fan > > [1] In the notes section, describe right vs left >eigenvectors. > Personally, I think we can drop the reference to the >characteristic > polynomial (we > don't solve it that way in any case). > > [2] Needs formatting and a link to an explanation of QR. > > [3] Describe what Frobenius means, or give a link to an >appropriate reference. > > [4] Note Pauli's comment: indicate clearly that the >decomposition A = > U S V.T is returned (V.H for complex matrices). > > [5] Badly in need of some examples. > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev Hi St?fan, Is there a chance to fix the following ticket http://projects.scipy.org/numpy/ticket/937 in this context ? Nils From stefan at sun.ac.za Thu Jul 2 07:16:34 2009 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 2 Jul 2009 13:16:34 +0200 Subject: [SciPy-dev] Summer Marathon "Category of the Week" In-Reply-To: References: <700012.35219.qm@web52103.mail.re2.yahoo.com> <9457e7c80907020320o7a149968q43bf89f1ee993eb4@mail.gmail.com> Message-ID: <9457e7c80907020416o3840c2d4ta9bb040ddf2a3a6a@mail.gmail.com> 2009/7/2 Nils Wagner : > Is there a chance to fix the following ticket > http://projects.scipy.org/numpy/ticket/937 > in this context ? Sure, I'd gladly review any patches (I'm a bit swamped myself at the moment). Regards St?fan From d_l_goldsmith at yahoo.com Thu Jul 2 13:02:05 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Thu, 2 Jul 2009 10:02:05 -0700 (PDT) Subject: [SciPy-dev] Summer Marathon "Category of the Week" Message-ID: <956303.34904.qm@web52107.mail.re2.yahoo.com> OK, the gauntlet has been thrown: anyone have the time and expertise to commit to fixing this code (I can commit to fixing the docstring) this week? 
DG --- On Thu, 7/2/09, St?fan van der Walt wrote: > From: St?fan van der Walt > Subject: Re: [SciPy-dev] Summer Marathon "Category of the Week" > To: "SciPy Developers List" > Date: Thursday, July 2, 2009, 4:16 AM > 2009/7/2 Nils Wagner : > > Is there a chance to fix the following ticket > > http://projects.scipy.org/numpy/ticket/937 > > in this context ? > > Sure, I'd gladly review any patches (I'm a bit swamped > myself at the moment). > > Regards > St?fan > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From d_l_goldsmith at yahoo.com Fri Jul 3 01:08:49 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Thu, 2 Jul 2009 22:08:49 -0700 (PDT) Subject: [SciPy-dev] Is this a (formatting) bug? Message-ID: <55945.58737.qm@web52105.mail.re2.yahoo.com> Actual output: >>> from numpy import linalg as LA >>> LA.eig(np.diag((1,2,3))) (array([ 1., 2., 3.]), array([[ 1., 0., 0.], [ 0., 1., 0.], [ 0., 0., 1.]])) i.e., a line feed wasn't inserted between w and v. Bug? DG From d_l_goldsmith at yahoo.com Fri Jul 3 01:22:02 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Thu, 2 Jul 2009 22:22:02 -0700 (PDT) Subject: [SciPy-dev] Is this a (formatting) bug? Message-ID: <671121.23422.qm@web52108.mail.re2.yahoo.com> I figured out a "work-around": >>> w,v=LA.eig(np.diag((1,2,3))) >>> w;v array([ 1., 2., 3.]) array([[ 1., 0., 0.], [ 0., 1., 0.], [ 0., 0., 1.]]) But I'm still "concerned" about how >>> w,v (effectively) is printed (though I now see more clearly that it's an artifact of how a tuple is printed, not an artifact of how LA.eig is printed, and thus probably "unfixable" solely within numpy). DG --- On Thu, 7/2/09, David Goldsmith wrote: > From: David Goldsmith > Subject: [SciPy-dev] Is this a (formatting) bug? > To: scipy-dev at scipy.org > Date: Thursday, July 2, 2009, 10:08 PM > > Actual output: > > >>> from numpy import linalg as LA > >>> LA.eig(np.diag((1,2,3))) > (array([ 1.,? 2.,? 3.]), array([[ 1.,? > 0.,? 0.], > ? ? ???[ 0.,? 1.,? 0.], > ? ? ???[ 0.,? 0.,? > 1.]])) > > i.e., a line feed wasn't inserted between w and v. > > Bug? > > DG > > > ? ? ? > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From d_l_goldsmith at yahoo.com Fri Jul 3 01:50:44 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Thu, 2 Jul 2009 22:50:44 -0700 (PDT) Subject: [SciPy-dev] Example of eig not converging Message-ID: <98049.18013.qm@web52107.mail.re2.yahoo.com> What has to be true of `a` so that eig(a) fails to converge? DG From d_l_goldsmith at yahoo.com Fri Jul 3 02:05:12 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Thu, 2 Jul 2009 23:05:12 -0700 (PDT) Subject: [SciPy-dev] Summer Marathon "Category of the Week" Message-ID: <49637.53115.qm@web52108.mail.re2.yahoo.com> eig = done! DG --- On Thu, 7/2/09, St?fan van der Walt wrote: > From: St?fan van der Walt > Subject: Re: [SciPy-dev] Summer Marathon "Category of the Week" > To: "SciPy Developers List" > Date: Thursday, July 2, 2009, 3:20 AM > Hi marathon runners, > > Well done on the progress made last week! > > 2009/7/1 David Goldsmith : > > Linear Algebra! (My favorite! :-)) ?Let's get it to > pink or better by next Wed! > > Excellent!? 
The linalg module is shown here: > > http://docs.scipy.org/numpy/docs/numpy.linalg/ > > You can choose from: > > cholesky cond det eig eigh eigvals eigvalsh > inv lstsq matrix_power norm inv qr solve svd tensorinv > tensorsolve > > These mostly have decent docstrings already, so I'd suggest > focusing > on making the following more accessible to your typical > linear algebra > student: > > http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.eig/ > [1] > http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.qr/ > [2] > http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.norm/ > [3] > http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.svd/ > [4] > http://docs.scipy.org/numpy/docs/numpy.linalg.linalg.tensorinv/ > [5] > > Have fun! > St?fan > > [1]?In the notes section, describe right vs left > eigenvectors. > Personally, I think we can drop the reference to the > characteristic > polynomial (we > don't solve it that way in any case). > > [2] Needs formatting and a link to an explanation of QR. > > [3]?Describe what Frobenius means, or give a link to an > appropriate reference. > > [4]?Note Pauli's comment: indicate clearly that the > decomposition A = > U S V.T is returned (V.H for complex matrices). > > [5] Badly in need of some examples. > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From pav at iki.fi Fri Jul 3 14:38:55 2009 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 3 Jul 2009 18:38:55 +0000 (UTC) Subject: [SciPy-dev] Example of eig not converging References: <98049.18013.qm@web52107.mail.re2.yahoo.com> Message-ID: On 2009-07-03, David Goldsmith wrote: > What has to be true of `a` so that eig(a) fails to converge? The approximate answer is: in practice, it always converges. Mathematically, the eigenvalue problem of an n x n matrix is well-defined (modulo floating-point accuracy). So the actual answer is very much implementation-dependent. LAPACK is a widely used library, and I suppose its eigenvalue routines are quite robust. So, failing matrices are probably difficult to find, and constructing them probably requires looking carefully at the algorithm itself, unless LAPACK authors themselves have listed some examples somewhere. (I didn't find any with a quick search.) So, I don't know the answer, and I suppose it would take quite a bit of work to find out. -- Pauli Virtanen From d_l_goldsmith at yahoo.com Fri Jul 3 16:05:53 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Fri, 3 Jul 2009 13:05:53 -0700 (PDT) Subject: [SciPy-dev] Example of eig not converging Message-ID: <375407.64488.qm@web52111.mail.re2.yahoo.com> Thanks, Pauli. In that case, I won't worry about it (I thought there was some "well-known" condition, like w/ the condition number and numerical invertibility, but if not, then there's no point in illustrating it w/ an example). DG --- On Fri, 7/3/09, Pauli Virtanen wrote: > From: Pauli Virtanen > Subject: Re: [SciPy-dev] Example of eig not converging > To: scipy-dev at scipy.org > Date: Friday, July 3, 2009, 11:38 AM > On 2009-07-03, David Goldsmith > wrote: > > What has to be true of `a` so that eig(a) fails to > converge? > > The approximate answer is: in practice, it always > converges. > > Mathematically, the eigenvalue problem of an n x n matrix > is > well-defined (modulo floating-point accuracy). So the > actual > answer is very much implementation-dependent. 
LAPACK is a > widely > used library, and I suppose its eigenvalue routines are > quite > robust. So, failing matrices are probably difficult to > find, and > constructing them probably requires looking carefully at > the > algorithm itself, unless LAPACK authors themselves have > listed > some examples somewhere. (I didn't find any with a quick > search.) > > So, I don't know the answer, and I suppose it would take > quite a > bit of work to find out. > > -- > Pauli Virtanen > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From d_l_goldsmith at yahoo.com Mon Jul 6 01:58:50 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Sun, 5 Jul 2009 22:58:50 -0700 (PDT) Subject: [SciPy-dev] Weird problem trying to preview Milestones edit Message-ID: <148340.5270.qm@web52107.mail.re2.yahoo.com> Hi! Editing the Milestones page, if I try to preview (even before I make any edits), I get: * Wiki * Docstrings * Changes * Milestones * Search * Stats * Patch * ? * Log in 500 Internal Error even on multiple refreshes. I don't get this previewing edits on any other page (so far), and it's the first time it's happened to me while editing Milestones (which I've successfully edited several times). Is that a sign that it's locked 'cause someone else is editing it simultaneously? Otherwise, something's amiss. DG From scott.sinclair.za at gmail.com Mon Jul 6 02:33:48 2009 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Mon, 6 Jul 2009 08:33:48 +0200 Subject: [SciPy-dev] Weird problem trying to preview Milestones edit In-Reply-To: <148340.5270.qm@web52107.mail.re2.yahoo.com> References: <148340.5270.qm@web52107.mail.re2.yahoo.com> Message-ID: <6a17e9ee0907052333g22d3f87ka043e5eff1cfb6b0@mail.gmail.com> >2009/7/6 David Goldsmith : > > Hi! ?Editing the Milestones page, if I try to preview (even before I make any edits), I get: > > ? ?* Wiki > ? ?* Docstrings > ? ?* Changes > ? ?* Milestones > ? ?* Search > ? ?* Stats > ? ?* Patch > ? ?* ? > ? ?* Log in > > 500 Internal Error > > even on multiple refreshes. ?I don't get this previewing edits on any other page (so far), and it's the first time it's happened to me while editing Milestones (which I've successfully edited several times). > I'm seeing this too. No problems on other pages. Cheers, Scott From d_l_goldsmith at yahoo.com Mon Jul 6 02:59:54 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Sun, 5 Jul 2009 23:59:54 -0700 (PDT) Subject: [SciPy-dev] Weird problem trying to preview Milestones edit Message-ID: <248296.1361.qm@web52112.mail.re2.yahoo.com> Thanks for confirming. DG --- On Sun, 7/5/09, Scott Sinclair wrote: > From: Scott Sinclair > Subject: Re: [SciPy-dev] Weird problem trying to preview Milestones edit > To: "SciPy Developers List" > Date: Sunday, July 5, 2009, 11:33 PM > >2009/7/6 David Goldsmith : > > > > Hi! ?Editing the Milestones page, if I try to preview > (even before I make any edits), I get: > > > > ? ?* Wiki > > ? ?* Docstrings > > ? ?* Changes > > ? ?* Milestones > > ? ?* Search > > ? ?* Stats > > ? ?* Patch > > ? ?* ? > > ? ?* Log in > > > > 500 Internal Error > > > > even on multiple refreshes. ?I don't get this > previewing edits on any other page (so far), and it's the > first time it's happened to me while editing Milestones > (which I've successfully edited several times). > > > > I'm seeing this too. No problems on other pages. 
> > Cheers, > Scott > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From pav at iki.fi Mon Jul 6 03:44:44 2009 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 6 Jul 2009 07:44:44 +0000 (UTC) Subject: [SciPy-dev] Weird problem trying to preview Milestones edit References: <148340.5270.qm@web52107.mail.re2.yahoo.com> Message-ID: Sun, 05 Jul 2009 22:58:50 -0700, David Goldsmith wrote: > Hi! Editing the Milestones page, if I try to preview (even before I > make any edits), I get: > > 500 Internal Error Some unicode thing, should be fixed now. -- Pauli Virtanen From pav+sp at iki.fi Mon Jul 6 15:51:32 2009 From: pav+sp at iki.fi (Pauli Virtanen) Date: Mon, 6 Jul 2009 19:51:32 +0000 (UTC) Subject: [SciPy-dev] Scipy.org down Message-ID: Hi, Scipy.org seems to be currently down. -- Pauli Virtanen From V.vanBeveren at rijnhuizen.nl Tue Jul 7 09:59:41 2009 From: V.vanBeveren at rijnhuizen.nl (Vincent van Beveren) Date: Tue, 7 Jul 2009 15:59:41 +0200 Subject: [SciPy-dev] FW: SciPy compared to IDL, matlab Message-ID: <2926F4BC94217A43A2D21792DE8818931C5B245B28@ex1.rijnh.nl> Oops, Sorry, posted this in the wrong list. I will repost it in scipy-users Regards, Vincent ________________________________ From: Vincent van Beveren Sent: Tuesday, July 7, 2009 12:00 To: 'scipy-dev at scipy.org' Subject: SciPy compared to IDL, matlab Hello everyone, I'm an engineer at Rijnhuizen, which is a research institute for plasma physics. We use Python to drive one of our main research projects; however, in the scientific circles at this institute Python (and SciPy) are largely unknown. Time to change this, I think :). Since I am an engineer and not a scientist, my view of Python and SciPy comes more from an engineering perspective: it's open source, free, a modern language, supports functional programming, etc... I'm not entirely sure these are compelling arguments for a scientist to start working with Python (or at least not on their own). So I was wondering how I might promote Python in the scientific community here at Rijnhuizen, and I have a few questions: - In what aspects does SciPy excel, compared to say IDL or matlab? - In what ways does it allow a scientist to be more effective? - How usable is SciPy for Plasma physics, molecular dynamics and nanolayer Surface and Interface Physics (the 3 main areas at Rijnhuizen)? - How stable is it compared to other packages (bugs, computation)? In any case, feel free to give any comments and insights! Regards, Vincent Ing. V. van Beveren Software Engineer, FOM Rijnhuizen T: +31 (0) 30-6096769 E: V.vanBeveren at rijnhuizen.nl -------------- next part -------------- An HTML attachment was scrubbed... URL: From eads at soe.ucsc.edu Tue Jul 7 12:48:54 2009 From: eads at soe.ucsc.edu (Damian Eads) Date: Tue, 7 Jul 2009 10:48:54 -0600 Subject: [SciPy-dev] [Numpy-discussion] rgb_to_hsv in scipy.misc ? (was: Optimizing speed for large-array inter-element algorithms (specifically, color space conversion)) In-Reply-To: <9457e7c80902181246j43ec0cddg4a444f911d77b175@mail.gmail.com> References: <954ae5aa0902181036i192ad020kd152e0629974ed6c@mail.gmail.com> <9457e7c80902181246j43ec0cddg4a444f911d77b175@mail.gmail.com> Message-ID: <91b4b1ab0907070948x56b7e36er6e2ea770c123a5d8@mail.gmail.com> +1 on rgb_to_hsv making it into SciPy. I needed it recently for some research code I'm working on. +1 on imread. Stefan, do you have any thoughts on its interface? 
I'm an advocate for making it as similar to MATLAB's interface as possible to make it easier for people to switch from MATLAB to Python. 2009/2/18 St?fan van der Walt : > Hi Nicolas > > 2009/2/18 Nicolas Pinto : >> Would it be possible to include the following rgb to hsv conversion code in >> scipy (probably in misc along with misc.imread, etc.) ? > > I think SciPy could do with some more image processing algorithms. > Would anyone mind if we added this sort of thing to the ndimage > namespace? ?I'd also like to make available `imread` in that module. > > If there aren't any objections, I'd gladly help integrate the color > conversion code. > > Regards > St?fan > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://projects.scipy.org/mailman/listinfo/scipy-dev > -- ----------------------------------------------------- Damian Eads Ph.D. Candidate Jack Baskin School of Engineering, UCSC E2-489 1156 High Street Machine Learning Lab Santa Cruz, CA 95064 http://www.soe.ucsc.edu/~eads From stefan at sun.ac.za Tue Jul 7 13:16:14 2009 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Tue, 7 Jul 2009 19:16:14 +0200 Subject: [SciPy-dev] [Numpy-discussion] rgb_to_hsv in scipy.misc ? (was: Optimizing speed for large-array inter-element algorithms (specifically, color space conversion)) In-Reply-To: <91b4b1ab0907070948x56b7e36er6e2ea770c123a5d8@mail.gmail.com> References: <954ae5aa0902181036i192ad020kd152e0629974ed6c@mail.gmail.com> <9457e7c80902181246j43ec0cddg4a444f911d77b175@mail.gmail.com> <91b4b1ab0907070948x56b7e36er6e2ea770c123a5d8@mail.gmail.com> Message-ID: <9457e7c80907071016k6126c8fr4abf3682046bea9f@mail.gmail.com> Hi Damian 2009/7/7 Damian Eads : > +1 on rgb_to_hsv making it into SciPy. I needed it recently for some > research code I'm working on. I've also needed that code recently, so I'd be glad to include it. > +1 on imread. Stefan, do you have any thoughts on its interface? I'm > an advocate for making it as similar to MATLAB's interface as possible > to make it easier for people to switch from MATLAB to Python. It's already in :-) I'm open to any recommendations on the API. Cheers St?fan From d_l_goldsmith at yahoo.com Tue Jul 7 15:31:01 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 7 Jul 2009 12:31:01 -0700 (PDT) Subject: [SciPy-dev] Doc Marathon Skypecon reminder and last week's minutes Message-ID: <606888.90656.qm@web52101.mail.re2.yahoo.com> Good day, Marathon runners! Just a friendly reminder that our weekly Skypecon to discuss all things NumPy Doc related will be tomorrow, Wednesday, July 8, 19:00 UTC. Email me any agenda items (mine: the low participation this last week on the category of the week - is there a "systemic" problem of which I need to be aware, or was it just a bad concurrence of everyone being either swamped or on vacation at the same time?), and please make sure I have your SkypeID sufficiently in advance. Below are the Minutes from last week's meeting. Also, remember that we hope to "finish" the Linear Algebra category by tomorrow (at the rate we're going, however, I think I'll say we hope to finish by 23:59 UTC tomorrow, as opposed to 19:00 UTC). Talk to you tomorrow, hopefully! David Goldsmith Technical Editor Olympia, WA Numpy Doc Summer '09 Marathon July 1, 2009 Skypecon Minutes Start: 19:10 UTC End: 19:30 UTC Present: David Goldsmith, Secretary Ralf Gommers Pauli Virtanen Discussions: 0) Status of Patch #1146. 
Ready, more or less, for adoption (later ID-ed one more ?little detail? to add, which Pauli agreed to take care of). 1) Clarification of how to document object methods which are equivalent to a numpy function Explanation in new ?Questions and Answers? section of the Wiki. Next Skypecon: Wednesday, July 8, 19:00 UTC Notes: Only one drop-out this time ? yeah! From pinto at mit.edu Tue Jul 7 20:07:22 2009 From: pinto at mit.edu (Nicolas Pinto) Date: Tue, 7 Jul 2009 20:07:22 -0400 Subject: [SciPy-dev] bug using array with a PIL Image (misc.imread) Message-ID: <954ae5aa0907071707n5be0c415w944174717794b0f4@mail.gmail.com> Dear all, For some reason I have a problem converting a specific png image using array(). Here is what I am getting (using numpy 1.3.0 and scipy 0.7.0). % wget http://files.getdropbox.com/u/167753/spiky_adj_023.png % python -c "import numpy as np; import Image; print np.array(Image.open('spiky_adj_023.png'))" % wget http://files.getdropbox.com/u/167753/spiky_norm_001.png % python -c "import numpy as np; import Image; print np.array(Image.open('spiky_norm_001.png'))" [[134 30 140 ..., 230 83 59] [ 99 202 233 ..., 160 63 133] [ 93 241 35 ..., 7 240 101] ..., [206 132 196 ..., 139 190 112] [218 21 217 ..., 121 152 109] [ 83 188 187 ..., 6 240 251]] I was initially using scipy's misc.imread when I found this bug. I am currently using the following workaround: % python -c "import numpy as np; import Image; print np.array(Image.open('spiky_adj_023.png').convert('RGB'))" Let me know what you think. Thanks in advance. Sincerely, -- Nicolas Pinto Ph.D. Candidate, Brain & Computer Sciences Massachusetts Institute of Technology, USA http://web.mit.edu/pinto -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Tue Jul 7 20:57:30 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 7 Jul 2009 19:57:30 -0500 Subject: [SciPy-dev] [Numpy-discussion] rgb_to_hsv in scipy.misc ? (was: Optimizing speed for large-array inter-element algorithms (specifically, color space conversion)) In-Reply-To: <9457e7c80907071016k6126c8fr4abf3682046bea9f@mail.gmail.com> References: <954ae5aa0902181036i192ad020kd152e0629974ed6c@mail.gmail.com> <9457e7c80902181246j43ec0cddg4a444f911d77b175@mail.gmail.com> <91b4b1ab0907070948x56b7e36er6e2ea770c123a5d8@mail.gmail.com> <9457e7c80907071016k6126c8fr4abf3682046bea9f@mail.gmail.com> Message-ID: <3d375d730907071757l11fdd1f7p469c641d28ab038b@mail.gmail.com> 2009/7/7 St?fan van der Walt : > Hi Damian > > 2009/7/7 Damian Eads : >> +1 on rgb_to_hsv making it into SciPy. I needed it recently for some >> research code I'm working on. > > I've also needed that code recently, so I'd be glad to include it. > >> +1 on imread. Stefan, do you have any thoughts on its interface? I'm >> an advocate for making it as similar to MATLAB's interface as possible >> to make it easier for people to switch from MATLAB to Python. > > It's already in :-) ?I'm open to any recommendations on the API. http://svn.scipy.org/svn/scipy/branches/sandbox/scipy/sandbox/image/color.py -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From eads at soe.ucsc.edu Tue Jul 7 22:47:51 2009 From: eads at soe.ucsc.edu (Damian Eads) Date: Tue, 7 Jul 2009 20:47:51 -0600 Subject: [SciPy-dev] [Numpy-discussion] rgb_to_hsv in scipy.misc ? 
(was: Optimizing speed for large-array inter-element algorithms (specifically, color space conversion)) In-Reply-To: <9457e7c80907071016k6126c8fr4abf3682046bea9f@mail.gmail.com> References: <954ae5aa0902181036i192ad020kd152e0629974ed6c@mail.gmail.com> <9457e7c80902181246j43ec0cddg4a444f911d77b175@mail.gmail.com> <91b4b1ab0907070948x56b7e36er6e2ea770c123a5d8@mail.gmail.com> <9457e7c80907071016k6126c8fr4abf3682046bea9f@mail.gmail.com> Message-ID: <91b4b1ab0907071947g7bd9b407t516617b874a362c1@mail.gmail.com> Hi Stefan, 2009/7/7 Stéfan van der Walt : > Hi Damian > > 2009/7/7 Damian Eads : >> +1 on rgb_to_hsv making it into SciPy. I needed it recently for some >> research code I'm working on. > > I've also needed that code recently, so I'd be glad to include it. I used the code in this message, http://www.mail-archive.com/numpy-discussion at scipy.org/msg15642.html . I don't know if it's correct or not. >> +1 on imread. Stefan, do you have any thoughts on its interface? I'm >> an advocate for making it as similar to MATLAB's interface as possible >> to make it easier for people to switch from MATLAB to Python. > > It's already in :-) I'm open to any recommendations on the API. Two other arguments in the imread function would be helpful, roi and band. 'roi' is a 2 or 3 element tuple of coordinates to read, which is useful when reading from a very large image. Thanks! Damian > > Cheers > Stéfan > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -- ----------------------------------------------------- Damian Eads Ph.D. Candidate Jack Baskin School of Engineering, UCSC E2-489 1156 High Street Machine Learning Lab Santa Cruz, CA 95064 http://www.soe.ucsc.edu/~eads From d_l_goldsmith at yahoo.com Wed Jul 8 14:51:52 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Wed, 8 Jul 2009 11:51:52 -0700 (PDT) Subject: [SciPy-dev] Marathon Skypecon in 10 min. Message-ID: <485461.89426.qm@web52102.mail.re2.yahoo.com> So far, no one has RSVP-ed (in the positive) this week (not that you're required to, but) I'd like to know: anyone planning on Skyping in? DG SkypeID: d.l.goldsmith From mueller at imt.uni-luebeck.de Thu Jul 9 15:41:45 2009 From: mueller at imt.uni-luebeck.de (=?iso-8859-1?Q?Jan_M=FCller?=) Date: Thu, 9 Jul 2009 21:41:45 +0200 Subject: [SciPy-dev] Small bug in miobase.py? Message-ID: <6C7C97F0FE350247B2607BAC32BB720930722075C8@z.imt.uni-luebeck.de> (Using 32 bit Ubuntu, scipy 0.7.0, python 2.6.2) What happens: --------- Accidentally I just tried to open an invalid matlab file using m = scipy.io.matlab.loadmat(...) and got an unexpected error message. From the scipy source I was expecting an "Unknown mat file type..." message but all I got was a lousy unhandled TypeError "not all arguments converted during string formatting"... Steps to reproduce: -------------- import scipy.io filename = ...  # an existing file which is _not_ a valid matlab file ret = scipy.io.matlab.loadmat(filename) Solution (?): -------------- This is the relevant section: def get_matfile_version(fileobj): # ... do some stuff ... if maj_val in (1, 2): return ret else: raise ValueError('Unknown mat file type, version %s' % ret) If I change line 125 from raise ValueError('Unknown mat file type, version %s' % ret) to raise ValueError('Unknown mat file type, version %s' % str(ret)) everything works as expected. 
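[Editor's note, not from the original report: the TypeError Jan hit is plain Python string-formatting behaviour rather than anything scipy-specific. When the right-hand side of % is a tuple with more elements than there are format specifiers, Python raises exactly this error; wrapping the tuple in another tuple, or str()-ing it as in the fix above, avoids the implicit unpacking. The value of ret below is hypothetical, since get_matfile_version's actual return value is not shown in the report.]

ret = (1, 0)  # hypothetical tuple return value

# What miobase.py effectively did: the tuple is unpacked as two arguments
# for a single %s specifier, so Python raises
#   TypeError: not all arguments converted during string formatting
# 'Unknown mat file type, version %s' % ret

# Either of these formats the tuple as a whole and works as expected:
msg1 = 'Unknown mat file type, version %s' % str(ret)
msg2 = 'Unknown mat file type, version %s' % (ret,)
assert msg1 == msg2 == 'Unknown mat file type, version (1, 0)'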
Jan From matthew.brett at gmail.com Thu Jul 9 16:08:11 2009 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 9 Jul 2009 13:08:11 -0700 Subject: [SciPy-dev] Small bug in miobase.py? In-Reply-To: <6C7C97F0FE350247B2607BAC32BB720930722075C8@z.imt.uni-luebeck.de> References: <6C7C97F0FE350247B2607BAC32BB720930722075C8@z.imt.uni-luebeck.de> Message-ID: <1e2af89e0907091308m69f282e0hb4f8a90e37c1ce6a@mail.gmail.com> Hi, > Steps to reproduce: > -------------- > import scipy.io > filename = ... % an existing file which is _not_ a valid matlab file > ret = scipy.io.matlab.loadmat(filename) Thanks - yes - I hit that one myself a while ago, it should be fixed in SVN and 0.7.1 Best, Matthew From d_l_goldsmith at yahoo.com Sat Jul 11 03:23:31 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Sat, 11 Jul 2009 00:23:31 -0700 (PDT) Subject: [SciPy-dev] Anyone wanna propose a category of the week Message-ID: <127465.66974.qm@web52112.mail.re2.yahoo.com> Anyone care? DG From millman at berkeley.edu Sat Jul 11 04:32:21 2009 From: millman at berkeley.edu (Jarrod Millman) Date: Sat, 11 Jul 2009 01:32:21 -0700 Subject: [SciPy-dev] ANN: SciPy 2009 early registration extended to July 17th Message-ID: The early registration deadline for SciPy 2009 has been extended for one week to July 17, 2009. Please register ( http://conference.scipy.org/to_register ) by this date to take advantage of the reduced early registration rate. About the conference -------------------- SciPy 2009, the 8th Python in Science conference, will be held from August 18-23, 2009 at Caltech in Pasadena, CA, USA. The conference starts with two days of tutorials to the scientific Python tools. There will be two tracks, one for introduction of the basic tools to beginners, and one for more advanced tools. The tutorials will be followed by two days of talks. Both days of talks will begin with a keynote address. The first day?s keynote will be given by Peter Norvig, the Director of Research at Google; while, the second keynote will be delivered by Jon Guyer, a Materials Scientist in the Thermodynamics and Kinetics Group at NIST. The program committee will select the remaining talks from submissions to our call for papers. All selected talks will be included in our conference proceedings edited by the program committee. After the talks each day we will provide several rooms for impromptu birds of a feather discussions. Finally, the last two days of the conference will be used for a number of coding sprints on the major software projects in our community. For the 8th consecutive year, the conference will bring together the developers and users of the open source software stack for scientific computing with Python. Attendees have the opportunity to review the available tools and how they apply to specific problems. By providing a forum for developers to share their Python expertise with the wider commercial, academic, and research communities, this conference fosters collaboration and facilitates the sharing of software components, techniques, and a vision for high level language use in scientific computing. For further information, please visit the conference homepage: http://conference.scipy.org. 
Important Dates --------------- * Friday, July 3: Abstracts Due * Wednesday, July 15: Announce accepted talks, post schedule * Friday, July 17: Early Registration ends * Tuesday-Wednesday, August 18-19: Tutorials * Thursday-Friday, August 20-21: Conference * Saturday-Sunday, August 22-23: Sprints * Friday, September 4: Papers for proceedings due Executive Committee ------------------- * Jarrod Millman, UC Berkeley, USA (Conference Chair) * Ga?l Varoquaux, INRIA Saclay, France (Program Co-Chair) * St?fan van der Walt, University of Stellenbosch, South Africa (Program Co-Chair) * Fernando P?rez, UC Berkeley, USA (Tutorial Chair) From ralf.gommers at googlemail.com Sat Jul 11 10:50:35 2009 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sat, 11 Jul 2009 10:50:35 -0400 Subject: [SciPy-dev] Anyone wanna propose a category of the week In-Reply-To: <127465.66974.qm@web52112.mail.re2.yahoo.com> References: <127465.66974.qm@web52112.mail.re2.yahoo.com> Message-ID: Data type investigation? I'm up for writing a few of those. ralf On Sat, Jul 11, 2009 at 3:23 AM, David Goldsmith wrote: > > Anyone care? > > DG > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From d_l_goldsmith at yahoo.com Sat Jul 11 23:44:07 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Sat, 11 Jul 2009 20:44:07 -0700 (PDT) Subject: [SciPy-dev] Anyone wanna propose a category of the week Message-ID: <750788.67129.qm@web52110.mail.re2.yahoo.com> Data type investigation it is; we'll make the week Sunday to Saturday this week. Thanks, Ralf! DG --- On Sat, 7/11/09, Ralf Gommers wrote: > Data type investigation? I'm up for > writing a few of those. > > ralf From gael.varoquaux at normalesup.org Sat Jul 11 23:48:49 2009 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sun, 12 Jul 2009 05:48:49 +0200 Subject: [SciPy-dev] Anyone wanna propose a category of the week In-Reply-To: <750788.67129.qm@web52110.mail.re2.yahoo.com> References: <750788.67129.qm@web52110.mail.re2.yahoo.com> Message-ID: <20090712034849.GB26380@phare.normalesup.org> On Sat, Jul 11, 2009 at 08:44:07PM -0700, David Goldsmith wrote: > Data type investigation it is; we'll make the week Sunday to Saturday this week. Thanks, Ralf! Something that would also need some time (though not as much), is make sure that the NaN comparison and sorting issues are well-explained on all the functions that do comparison and sorting. Ga?l From d_l_goldsmith at yahoo.com Sun Jul 12 00:20:39 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Sat, 11 Jul 2009 21:20:39 -0700 (PDT) Subject: [SciPy-dev] Anyone wanna propose a category of the week Message-ID: <592243.55482.qm@web52105.mail.re2.yahoo.com> Awesome, now we have one in the queue, that's what I like to see! DG --- On Sat, 7/11/09, Gael Varoquaux wrote: > From: Gael Varoquaux > Subject: Re: [SciPy-dev] Anyone wanna propose a category of the week > To: "SciPy Developers List" > Date: Saturday, July 11, 2009, 8:48 PM > On Sat, Jul 11, 2009 at 08:44:07PM > -0700, David Goldsmith wrote: > > > Data type investigation it is; we'll make the week > Sunday to Saturday this week.? Thanks, Ralf! > > Something that would also need some time (though not as > much), is make > sure that the NaN comparison and sorting issues are > well-explained on all > the functions that do comparison and sorting. 
> > Ga?l > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From gael.varoquaux at normalesup.org Sun Jul 12 00:22:32 2009 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Sun, 12 Jul 2009 06:22:32 +0200 Subject: [SciPy-dev] Anyone wanna propose a category of the week In-Reply-To: <592243.55482.qm@web52105.mail.re2.yahoo.com> References: <592243.55482.qm@web52105.mail.re2.yahoo.com> Message-ID: <20090712042232.GC26380@phare.normalesup.org> On Sat, Jul 11, 2009 at 09:20:39PM -0700, David Goldsmith wrote: > Awesome, now we have one in the queue, that's what I like to see! It's great to have you around to manage the queue. Ga?l From ralf.gommers at googlemail.com Sun Jul 12 01:49:50 2009 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Sun, 12 Jul 2009 01:49:50 -0400 Subject: [SciPy-dev] Anyone wanna propose a category of the week In-Reply-To: <592243.55482.qm@web52105.mail.re2.yahoo.com> References: <592243.55482.qm@web52105.mail.re2.yahoo.com> Message-ID: Actually, data type investigation is done. I guess it is Sunday now so it counts:) Maybe we can do Gael's suggestion as well this week. Cheers, Ralf On Sun, Jul 12, 2009 at 12:20 AM, David Goldsmith wrote: > > Awesome, now we have one in the queue, that's what I like to see! > > DG > > --- On Sat, 7/11/09, Gael Varoquaux wrote: > > > From: Gael Varoquaux > > Subject: Re: [SciPy-dev] Anyone wanna propose a category of the week > > To: "SciPy Developers List" > > Date: Saturday, July 11, 2009, 8:48 PM > > On Sat, Jul 11, 2009 at 08:44:07PM > > -0700, David Goldsmith wrote: > > > > > Data type investigation it is; we'll make the week > > Sunday to Saturday this week. Thanks, Ralf! > > > > Something that would also need some time (though not as > > much), is make > > sure that the NaN comparison and sorting issues are > > well-explained on all > > the functions that do comparison and sorting. > > > > Ga?l > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From d_l_goldsmith at yahoo.com Sun Jul 12 13:13:45 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Sun, 12 Jul 2009 10:13:45 -0700 (PDT) Subject: [SciPy-dev] Anyone wanna propose a category of the week Message-ID: <930172.58442.qm@web52109.mail.re2.yahoo.com> You're too kind. ;-) DG --- On Sat, 7/11/09, Gael Varoquaux wrote: > From: Gael Varoquaux > Subject: Re: [SciPy-dev] Anyone wanna propose a category of the week > To: "SciPy Developers List" > Date: Saturday, July 11, 2009, 9:22 PM > On Sat, Jul 11, 2009 at 09:20:39PM > -0700, David Goldsmith wrote: > > > Awesome, now we have one in the queue, that's what I > like to see! > > It's great to have you around to manage the queue. 
> > Ga?l > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From d_l_goldsmith at yahoo.com Sun Jul 12 13:21:46 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Sun, 12 Jul 2009 10:21:46 -0700 (PDT) Subject: [SciPy-dev] Anyone wanna propose a category of the week and Skypecon reminder Message-ID: <144654.70462.qm@web52103.mail.re2.yahoo.com> Darn, that means the queue is empty; thanks, Ralf. ;-) Oh wohl. OK, > > make sure that the NaN comparison and sorting issues are > > well-explained on all > > the functions that do comparison and sorting. Is now the "category" of the week. Thanks, Gael. Reminder: Skypecon this week is Tues. 18:00 UTC (instead of the usual Wed. 19:00 UTC). DG --- On Sat, 7/11/09, Ralf Gommers wrote: > From: Ralf Gommers > Subject: Re: [SciPy-dev] Anyone wanna propose a category of the week > To: "SciPy Developers List" > Date: Saturday, July 11, 2009, 10:49 PM > Actually, data type investigation is > done. I guess it is Sunday now so it counts:) > > Maybe we can do Gael's suggestion as well this week. > > Cheers, > Ralf > > > On Sun, Jul 12, 2009 at 12:20 AM, > David Goldsmith > wrote: > > > > Awesome, now we have one in the queue, that's what I > like to see! > > > > DG > > > > --- On Sat, 7/11/09, Gael Varoquaux > wrote: > > > > > From: Gael Varoquaux > > > Subject: Re: [SciPy-dev] Anyone wanna propose a > category of the week > > > To: "SciPy Developers List" > > > Date: Saturday, July 11, 2009, 8:48 PM > > > On Sat, Jul 11, 2009 > at 08:44:07PM > > > -0700, David Goldsmith wrote: > > > > > > > Data type investigation it is; we'll make the > week > > > Sunday to Saturday this week.? Thanks, Ralf! > > > > > > Something that would also need some time (though not > as > > > much), is make > > > sure that the NaN comparison and sorting issues are > > > well-explained on all > > > the functions that do comparison and sorting. > > > > > > Ga?l > > > _______________________________________________ > > > Scipy-dev mailing list > > > Scipy-dev at scipy.org > > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > > > > > > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > -----Inline Attachment Follows----- > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From d_l_goldsmith at yahoo.com Mon Jul 13 17:34:46 2009 From: d_l_goldsmith at yahoo.com (d_l_goldsmith at yahoo.com) Date: Mon, 13 Jul 2009 14:34:46 -0700 (PDT) Subject: [SciPy-dev] skype call agenda points Message-ID: <738234.61438.qm@web52101.mail.re2.yahoo.com> Hi, Ralf, thanks for your thoughts - I hope you don't mind that I'm replying via the list: I didn't see anything in the email that looked like it should be kept private, and perhaps more people will participate if they see what and how much is on the agenda! :-) All your agenda items look good - not that I'd reject any of 'em, I see these things as open discussions, so all scipy-dev subscriber's concerns are fair game at these Skypecons.? Also, thanks for RSVP-ing in the positive; others: please follow Ralf's example in that regard if you can.? I have some comments on the items I'll give below, so here let me just remind everyone of the time: tomorrow, Tues. July 14, 18:00 UTC (an hour and a day earlier than usual). 
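[Editor's note, not part of the original email: for anyone picking up the NaN comparison/sorting category announced above (and restated in the PS below), here is a small sketch of the behaviour those docstrings need to spell out. The exact min/max behaviour with NaN varied between early numpy releases, which is part of why it needs documenting; the comments reflect the intended semantics.]

import numpy as np

a = np.array([3.0, np.nan, 1.0])

# Any comparison involving NaN is False, including NaN == NaN:
print a < 2.0            # the NaN element compares False
print np.nan == np.nan   # False

# The plain reductions propagate NaN, the nan* variants ignore it:
print np.min(a)          # nan
print np.nanmin(a)       # 1.0

# sort() keeps NaN, placing it at the end of the result rather than
# dropping it or raising an error:
print np.sort(a)         # 1.0, 3.0, nan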
--- On Mon, 7/13/09, Ralf Gommers wrote: > Here are some thoughts on what to talk about during the > call tomorrow. I should be able to make it. > > - we're getting close to being done with Milestones > accessible to contributors without in-depth knowledge of > NumPy Awesome, thanks for pointing this out. > - how to get the specialized Milestones doc'ed (i.e. > ctypes, ufuncs, numpy internals,...) Well, I'll be acquiring "in-depth" knowledge as needed, but I certainly hope I don't have to do 'em all. > - what is next, scipy reference or numpy user guide, or > both? Open to discussion. > - SciPy Milestones page needs to be done Yup. > - note, scipy has 3000 docstrings, it took us >1 year to > do 1000 for numpy. how to ever get scipy done? Well, part of the reason it took so long is 'cause the Marathon stalled when the summer hired position terminated; I won't presume there's a causation there, but it certainly suggests that if Joe and I (mostly Joe) can secure long term funding, then the work can progress more continuously. > - review system, at least should show who reviewed what > docstring in the Log Agreed. Talk to you tomorrow, and again, thanks! DG PS: The other reminder - the "category" of the week is to make sure functions w/ nan equivalents are well-documented in that respect. > Cheers, > Ralf > > > From frostedcheerios at gmail.com Mon Jul 13 17:43:17 2009 From: frostedcheerios at gmail.com (David Martin) Date: Mon, 13 Jul 2009 16:43:17 -0500 Subject: [SciPy-dev] scipy.linalg.eigvals error on 64bit RH3 & RH4 Message-ID: <4A5BAA75.7070209@gmail.com> Has anyone else gotten these scipy.test() errors when running on RH3 and RH4 (both 64bit)? The following tests pass on RH5 64-bit and RH3,4,5 32-bit, so this seems to be a 64-bit RH issue prior to RH5. It looks like eigvals isn't returning the correct values: -David On Windows XP 32-bit: Python 2.5.4 (r254:67916, May 17 2009, 21:12:18) [MSC v.1310 32 bit (Intel)] on win32 Type "help", "copyright", "credits" or "license" for more information. >>> from numpy import sqrt >>> from scipy.linalg import eigvals >>> a = [[1,2,3],[1,2,3],[2,5,6]] >>> w = eigvals(a) >>> exact_w = [(9+sqrt(93))/2,0,(9-sqrt(93))/2] >>> w array([ 9.32182538e+00+0.j, -9.59852072e-16+0.j, -3.21825380e-01+0.j]) >>> exact_w [9.3218253804964775, 0, -0.32182538049647746] On RH3.8 64-bit: Python 2.5.4 (r254:67916, May 20 2009, 22:57:32) [GCC 3.2.3 20030502 (Red Hat Linux 3.2.3-56)] on linux2 Type "help", "copyright", "credits" or "license" for more information. 
>>> from numpy import sqrt >>> from scipy.linalg import eigvals >>> a = [[1,2,3],[1,2,3],[2,5,6]] >>> w = eigvals(a) >>> exact_w = [(9+sqrt(93))/2,0,(9-sqrt(93))/2] >>> w array([ 9.43719064+0.j, -0.11536526+0.j, -0.32182538+0.j]) >>> exact_w [9.3218253804964775, 0, -0.32182538049647746] ====================================================================== FAIL: test_simple (test_decomp.TestEig) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/tester/master/lib/python2.5/site-packages/scipy/linalg/tests/test_decomp.py", line 145, in test_simple assert_array_almost_equal(w,exact_w) File "/home/tester/master/lib/python2.5/site-packages/numpy/testing/utils.py", line 537, in assert_array_almost_equal header='Arrays are not almost equal') File "/home/tester/master/lib/python2.5/site-packages/numpy/testing/utils.py", line 395, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 66.6666666667%) x: array([ 9.43719064+0.j, -0.11536526+0.j, -0.32182538+0.j]) y: array([ 9.32182538, 0. , -0.32182538]) ====================================================================== FAIL: test_simple (test_decomp.TestEigVals) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/tester/master/lib/python2.5/site-packages/scipy/linalg/tests/test_decomp.py", line 114, in test_simple assert_array_almost_equal(w,exact_w) File "/home/tester/master/lib/python2.5/site-packages/numpy/testing/utils.py", line 537, in assert_array_almost_equal header='Arrays are not almost equal') File "/home/tester/master/lib/python2.5/site-packages/numpy/testing/utils.py", line 395, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 66.6666666667%) x: array([ 9.43719064+0.j, -0.11536526+0.j, -0.32182538+0.j]) y: array([ 9.32182538, 0. , -0.32182538]) ====================================================================== FAIL: test_simple_tr (test_decomp.TestEigVals) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/tester/master/lib/python2.5/site-packages/scipy/linalg/tests/test_decomp.py", line 122, in test_simple_tr assert_array_almost_equal(w,exact_w) File "/home/tester/master/lib/python2.5/site-packages/numpy/testing/utils.py", line 537, in assert_array_almost_equal header='Arrays are not almost equal') File "/home/tester/master/lib/python2.5/site-packages/numpy/testing/utils.py", line 395, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 66.6666666667%) x: array([ 9.43719064+0.j, -0.11536526+0.j, -0.32182538+0.j]) y: array([ 9.32182538, 0. 
, -0.32182538]) ====================================================================== FAIL: test (test_speigs.TestEigs) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/tester/master/lib/python2.5/site-packages/scipy/sparse/linalg/eigen/arpack/tests/test_speigs.py", line 34, in test assert_array_almost_equal(calc_vals, vals[0:nev], decimal=7) File "/home/tester/master/lib/python2.5/site-packages/numpy/testing/utils.py", line 537, in assert_array_almost_equal header='Arrays are not almost equal') File "/home/tester/master/lib/python2.5/site-packages/numpy/testing/utils.py", line 395, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 100.0%) x: array([ 0.03559256, 0.73900411, 0.73525312, 1.07213911]) y: array([ 0.25819889, 0.51639778, 0.77459667, 1.03279556]) From robert.kern at gmail.com Mon Jul 13 17:54:13 2009 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 13 Jul 2009 16:54:13 -0500 Subject: [SciPy-dev] scipy.linalg.eigvals error on 64bit RH3 & RH4 In-Reply-To: <4A5BAA75.7070209@gmail.com> References: <4A5BAA75.7070209@gmail.com> Message-ID: <3d375d730907131454j40b93880ubd8eb5ffd6115a8c@mail.gmail.com> On Mon, Jul 13, 2009 at 16:43, David Martin wrote: > Has anyone else gotten these scipy.test() errors when running on RH3 and > RH4 (both 64bit)? The following tests pass on RH5 64-bit and RH3,4,5 > 32-bit, so this seems to be a 64-bit RH issue prior to RH5. It looks > like eigvals isn't returning the correct values: Try to run the ATLAS tests for the version you built scipy against, if you can. Incorrect values from eigvals() usually mean that the underlying library is having problems, not scipy. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From frostedcheerios at gmail.com Mon Jul 13 19:12:32 2009 From: frostedcheerios at gmail.com (David Martin) Date: Mon, 13 Jul 2009 18:12:32 -0500 Subject: [SciPy-dev] scipy.linalg.eigvals error on 64bit RH3 & RH4 In-Reply-To: <3d375d730907131454j40b93880ubd8eb5ffd6115a8c@mail.gmail.com> References: <4A5BAA75.7070209@gmail.com> <3d375d730907131454j40b93880ubd8eb5ffd6115a8c@mail.gmail.com> Message-ID: <4A5BBF60.4040706@gmail.com> Robert Kern wrote: > On Mon, Jul 13, 2009 at 16:43, David Martin wrote: > >> Has anyone else gotten these scipy.test() errors when running on RH3 and >> RH4 (both 64bit)? The following tests pass on RH5 64-bit and RH3,4,5 >> 32-bit, so this seems to be a 64-bit RH issue prior to RH5. It looks >> like eigvals isn't returning the correct values: >> > > Try to run the ATLAS tests for the version you built scipy against, if > you can. Incorrect values from eigvals() usually mean that the > underlying library is having problems, not scipy. > > Unfortunately I only had the .so libraries to link against so I couldn't run the ATLAS sanity tests against them. I built a new set of ATLAS libraries on RH3 64-bit and ran "make check" after the build, which showed all tests passing. 
DONE BUILDING TESTERS, RUNNING: SCOPING FOR FAILURES IN BIN TESTS: fgrep -e fault -e FAULT -e error -e ERROR -e fail -e FAIL \ bin/sanity.out 8 cases: 8 passed, 0 skipped, 0 failed 4 cases: 4 passed, 0 skipped, 0 failed 8 cases: 8 passed, 0 skipped, 0 failed 4 cases: 4 passed, 0 skipped, 0 failed 8 cases: 8 passed, 0 skipped, 0 failed 4 cases: 4 passed, 0 skipped, 0 failed 8 cases: 8 passed, 0 skipped, 0 failed 4 cases: 4 passed, 0 skipped, 0 failed DONE SCOPING FOR FAILURES IN CBLAS TESTS: fgrep -e fault -e FAULT -e error -e ERROR -e fail -e FAIL \ interfaces/blas/C/testing/sanity.out | \ fgrep -v PASSED DONE SCOPING FOR FAILURES IN F77BLAS TESTS: fgrep -e fault -e FAULT -e error -e ERROR -e fail -e FAIL \ interfaces/blas/F77/testing/sanity.out | \ fgrep -v PASSED DONE I rebuilt Numpy and Scipy against this new ATLAS build and re-ran both the Numpy and Scipy tests. The Numpy tests all pass, but most of the Scipy failures persist (apparently the new ATLAS libraries fixed the test_speigs.TestEigs issue): ====================================================================== FAIL: test_simple (test_decomp.TestEig) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/builder/master-1.0.4/lib/python2.5/site-packages/scipy/linalg/tests/test_decomp.py", line 145, in test_simple assert_array_almost_equal(w,exact_w) File "/home/builder/recipes/install/numpy-1.3.0-1.egg/numpy/testing/utils.py", line 537, in assert_array_almost_equal header='Arrays are not almost equal') File "/home/builder/recipes/install/numpy-1.3.0-1.egg/numpy/testing/utils.py", line 395, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 66.6666666667%) x: array([ 9.43719064+0.j, -0.11536526+0.j, -0.32182538+0.j]) y: array([ 9.32182538, 0. , -0.32182538]) ====================================================================== FAIL: test_simple (test_decomp.TestEigVals) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/builder/master-1.0.4/lib/python2.5/site-packages/scipy/linalg/tests/test_decomp.py", line 114, in test_simple assert_array_almost_equal(w,exact_w) File "/home/builder/recipes/install/numpy-1.3.0-1.egg/numpy/testing/utils.py", line 537, in assert_array_almost_equal header='Arrays are not almost equal') File "/home/builder/recipes/install/numpy-1.3.0-1.egg/numpy/testing/utils.py", line 395, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 66.6666666667%) x: array([ 9.43719064+0.j, -0.11536526+0.j, -0.32182538+0.j]) y: array([ 9.32182538, 0. , -0.32182538]) ====================================================================== FAIL: test_simple_tr (test_decomp.TestEigVals) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/builder/master-1.0.4/lib/python2.5/site-packages/scipy/linalg/tests/test_decomp.py", line 122, in test_simple_tr assert_array_almost_equal(w,exact_w) File "/home/builder/recipes/install/numpy-1.3.0-1.egg/numpy/testing/utils.py", line 537, in assert_array_almost_equal header='Arrays are not almost equal') File "/home/builder/recipes/install/numpy-1.3.0-1.egg/numpy/testing/utils.py", line 395, in assert_array_compare raise AssertionError(msg) AssertionError: Arrays are not almost equal (mismatch 66.6666666667%) x: array([ 9.43719064+0.j, -0.11536526+0.j, -0.32182538+0.j]) y: array([ 9.32182538, 0. 
, -0.32182538]) ---------------------------------------------------------------------- -David From d_l_goldsmith at yahoo.com Mon Jul 13 22:15:01 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Mon, 13 Jul 2009 19:15:01 -0700 (PDT) Subject: [SciPy-dev] Finished off LA Message-ID: <1035.38924.qm@web52108.mail.re2.yahoo.com> Linear Algebra status is now "Goal Met!" DG From ralf.gommers at googlemail.com Tue Jul 14 01:30:25 2009 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Tue, 14 Jul 2009 01:30:25 -0400 Subject: [SciPy-dev] skype call agenda points In-Reply-To: <738234.61438.qm@web52101.mail.re2.yahoo.com> References: <738234.61438.qm@web52101.mail.re2.yahoo.com> Message-ID: On Mon, Jul 13, 2009 at 5:34 PM, wrote: > > Hi, Ralf, thanks for your thoughts - I hope you don't mind that I'm > replying via the list: I didn't see anything in the email that looked like > it should be kept private, and perhaps more people will participate if they > see what and how much is on the agenda! :-) Don't mind at all, I assumed you would send out the agenda with these points included anyway. > > > All your agenda items look good - not that I'd reject any of 'em, I see > these things as open discussions, so all scipy-dev subscriber's concerns are > fair game at these Skypecons. Also, thanks for RSVP-ing in the positive; > others: please follow Ralf's example in that regard if you can. I have some > comments on the items I'll give below, so here let me just remind everyone > of the time: tomorrow, Tues. July 14, 18:00 UTC (an hour and a day earlier > than usual). > > --- On Mon, 7/13/09, Ralf Gommers wrote: > > > Here are some thoughts on what to talk about during the > > call tomorrow. I should be able to make it. > > > > - we're getting close to being done with Milestones > > accessible to contributors without in-depth knowledge of > > NumPy > > Awesome, thanks for pointing this out. > > > - how to get the specialized Milestones doc'ed (i.e. > > ctypes, ufuncs, numpy internals,...) > > Well, I'll be acquiring "in-depth" knowledge as needed, but I certainly > hope I don't have to do 'em all. > > > - what is next, scipy reference or numpy user guide, or > > both? > > Open to discussion. > > > - SciPy Milestones page needs to be done > > Yup. > > > - note, scipy has 3000 docstrings, it took us >1 year to > > do 1000 for numpy. how to ever get scipy done? > > Well, part of the reason it took so long is 'cause the Marathon stalled > when the summer hired position terminated; I won't presume there's a > causation there, but it certainly suggests that if Joe and I (mostly Joe) > can secure long term funding, then the work can progress more continuously. that would be great. > > > > - review system, at least should show who reviewed what > > docstring in the Log > > Agreed. > > Talk to you tomorrow, and again, thanks! > > DG > > PS: The other reminder - the "category" of the week is to make sure > functions w/ nan equivalents are well-documented in that respect. > > > Cheers, > > Ralf > > > > > > > > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: 

From david at ar.media.kyoto-u.ac.jp Tue Jul 14 01:21:30 2009 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Tue, 14 Jul 2009 14:21:30 +0900 Subject: [SciPy-dev] scipy.linalg.eigvals error on 64bit RH3 & RH4 In-Reply-To: <4A5BBF60.4040706@gmail.com> References: <4A5BAA75.7070209@gmail.com> <3d375d730907131454j40b93880ubd8eb5ffd6115a8c@mail.gmail.com> <4A5BBF60.4040706@gmail.com> Message-ID: <4A5C15DA.5030405@ar.media.kyoto-u.ac.jp>

David Martin wrote:
> Unfortunately I only had the .so libraries to link against so I couldn't
> run the ATLAS sanity tests against them. I built a new set of ATLAS
> libraries on RH3 64-bit and ran "make check" after the build, which
> showed all tests passing.
>

What does the following return ?

>>> from scipy.linalg import atlas_version

cheers,

David

From d_l_goldsmith at yahoo.com Tue Jul 14 01:50:21 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Mon, 13 Jul 2009 22:50:21 -0700 (PDT) Subject: [SciPy-dev] generic.base Message-ID: <283100.20345.qm@web52107.mail.re2.yahoo.com>

Hi! I'm working on the Scalar Base Class category and, after a little bit of a shaky start, I think I have a pretty good handle on it now. I wanted to see for myself whether or not generic.base is "settable" (i.e., after object creation), so I instantiated an np.complex64 object, a la

    g = np.complex64(1)

and then tried various ways (e.g., h=g.copy(), h=g.view(), h=g.real, h=g.imag) to create a view of g (i.e., create an object that shares its memory), but in all cases h.base returned None, meaning, IIUC, h is _not_ a view of (does not share memory with) g. So my questions are: are objects derived from generic "viewable"; if so, how; if not, why the base property? Thanks!

DG

From robert.kern at gmail.com Tue Jul 14 03:06:20 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 14 Jul 2009 02:06:20 -0500 Subject: [SciPy-dev] generic.base In-Reply-To: <283100.20345.qm@web52107.mail.re2.yahoo.com> References: <283100.20345.qm@web52107.mail.re2.yahoo.com> Message-ID: <3d375d730907140006l1052031yc4bf2e8fdf625f60@mail.gmail.com>

On Tue, Jul 14, 2009 at 00:50, David Goldsmith wrote:
>
> Hi! I'm working on the Scalar Base Class category and, after a little bit of a shaky start, I think I have a pretty good handle on it now. I wanted to see for myself whether or not generic.base is "settable" (i.e., after object creation), so I instantiated an np.complex64 object, a la
>
>     g = np.complex64(1)
>
> and then tried various ways (e.g., h=g.copy(), h=g.view(), h=g.real, h=g.imag) to create a view of g (i.e., create an object that shares its memory), but in all cases h.base returned None, meaning, IIUC, h is _not_ a view of (does not share memory with) g. So my questions are: are objects derived from generic "viewable";

They are not. The scalar types need to be immutable and allowing views would make violating that too easy.

> if so, how; if not, why the base property?

So as to expose a consistent interface.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth."
  -- Umberto Eco

From d_l_goldsmith at yahoo.com Tue Jul 14 03:27:58 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 14 Jul 2009 00:27:58 -0700 (PDT) Subject: [SciPy-dev] generic.base Message-ID: <380236.57031.qm@web52104.mail.re2.yahoo.com>

OK, thanks!
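For the record, here is the quick check that convinced me (just a sketch; whether the failed assignment raises AttributeError or TypeError may vary between numpy versions, so treat that detail as an assumption):

import numpy as np

g = np.complex64(1)

# Each of these gives back a new scalar object rather than a view;
# .base is None in every case, i.e. there is no shared memory to mutate through.
for h in (g.copy(), g.view(), g.real, g.imag):
    assert h.base is None

# And the attributes themselves are effectively read-only:
try:
    g.real = 5.0              # expected to raise
except (AttributeError, TypeError):
    pass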
DG --- On Tue, 7/14/09, Robert Kern wrote: > From: Robert Kern > Subject: Re: [SciPy-dev] generic.base > To: "SciPy Developers List" > Date: Tuesday, July 14, 2009, 12:06 AM > On Tue, Jul 14, 2009 at 00:50, David > Goldsmith > wrote: > > > > Hi! ?I'm working on the Scalar Base Class category > and, after a little bit of a shaky start, I think I have a > pretty good handle on it now. ?I wanted to see for my self > whether or not generic.base is "settable" (i.e., after > object creation), so I instantiated an np.complex64 object, > ala > > > > ? ?g = np.complex64(1) > > > > and then tried various ways (e.g., h=g.copy(), > h=g.view(), h=g.real, h=g.imag) to create a view of g (i.e., > create an object that shared its memory), but in all cases > h.base returned None, meaning, IIUC, h is _not_ a view of > (does not share memory with) g. ?So my questions are: are > objects derived from generic "viewable"; > > They are not. The scalar types need to be immutable and > allowing views > would make violating that too easy. > > > is so, how; if not, why the base property? > > So as to expose a consistent interface. > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, > a harmless > enigma that is made terrible by our own mad attempt to > interpret it as > though it had an underlying truth." > ? -- Umberto Eco > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From frostedcheerios at gmail.com Tue Jul 14 11:56:48 2009 From: frostedcheerios at gmail.com (David Martin) Date: Tue, 14 Jul 2009 10:56:48 -0500 Subject: [SciPy-dev] scipy.linalg.eigvals error on 64bit RH3 & RH4 In-Reply-To: <4A5C15DA.5030405@ar.media.kyoto-u.ac.jp> References: <4A5BAA75.7070209@gmail.com> <3d375d730907131454j40b93880ubd8eb5ffd6115a8c@mail.gmail.com> <4A5BBF60.4040706@gmail.com> <4A5C15DA.5030405@ar.media.kyoto-u.ac.jp> Message-ID: <4A5CAAC0.3030606@gmail.com> David Cournapeau wrote: > David Martin wrote: > >> Unfortunately I only had the .so libraries to link against so I couldn't >> run the ATLAS sanity tests against them. I built a new set of ATLAS >> libraries on RH3 64-bit and ran "make check" after the build, which >> showed all tests passing. >> >> > > What does the following return ? > > >>>> from scipy.linalg import atlas_version >>>> Python 2.5.4 (r254:67916, May 20 2009, 22:57:32) [GCC 3.2.3 20030502 (Red Hat Linux 3.2.3-56)] on linux2 Type "help", "copyright", "credits" or "license" for more information. 
>>> from scipy.linalg import atlas_version >>> atlas_version >>> atlas_version.version() ATLAS version 3.8.3 built by builder on Mon Jul 13 20:09:54 CDT 2009: UNAME : Linux centos-3-x86_64-build 2.4.21-47.EL #1 SMP Tue Aug 1 07:58:43 EDT 2006 x86_64 x86_64 x86_64 GNU/Linux INSTFLG : -1 0 -a 1 ARCHDEFS : -DATL_OS_Linux -DATL_ARCH_Core2 -DATL_CPUMHZ=2492 -DATL_SSE3 -DATL_SSE2 -DATL_SSE1 -DATL_USE64BITS -DATL_GAS_x8664 F2CDEFS : -DAdd__ -DF77_INTEGER=int -DStringSunStyle CACHEEDGE: 262144 F77 : g77, version GNU Fortran (GCC 3.2.3 20030502 (Red Hat Linux 3.2.3-56)) 3.2.3 20030502 (Red Hat Linux 3.2.3-56) F77FLAGS : -O -fPIC -g -m64 SMC : gcc, version gcc (GCC) 3.2.3 20030502 (Red Hat Linux 3.2.3-56) SMCFLAGS : -fomit-frame-pointer -mfpmath=sse -msse3 -O2 -fPIC -g -m64 SKC : gcc, version gcc (GCC) 3.2.3 20030502 (Red Hat Linux 3.2.3-56) SKCFLAGS : -fomit-frame-pointer -mfpmath=sse -msse3 -O2 -fPIC -g -m64 From d_l_goldsmith at yahoo.com Tue Jul 14 13:39:28 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 14 Jul 2009 10:39:28 -0700 (PDT) Subject: [SciPy-dev] generic.base Message-ID: <167947.20354.qm@web52106.mail.re2.yahoo.com> > --- On Tue, 7/14/09, Robert Kern > wrote: > > > The scalar types need to be immutable May I infer from this that all attributes of generic are "read-only"? DG From robert.kern at gmail.com Tue Jul 14 13:41:23 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 14 Jul 2009 12:41:23 -0500 Subject: [SciPy-dev] generic.base In-Reply-To: <167947.20354.qm@web52106.mail.re2.yahoo.com> References: <167947.20354.qm@web52106.mail.re2.yahoo.com> Message-ID: <3d375d730907141041w7d424d1m66e3d3e87a925220@mail.gmail.com> On Tue, Jul 14, 2009 at 12:39, David Goldsmith wrote: > >> --- On Tue, 7/14/09, Robert Kern >> wrote: >> >> > The scalar types need to be immutable > > May I infer from this that all attributes of generic are "read-only"? They should be. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From d_l_goldsmith at yahoo.com Tue Jul 14 13:42:27 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 14 Jul 2009 10:42:27 -0700 (PDT) Subject: [SciPy-dev] generic.base Message-ID: <130619.22065.qm@web52112.mail.re2.yahoo.com> Nevermind, just found that info in http://docs.scipy.org/numpy/docs/numpy-docs/reference/arrays.scalars.rst/. Sorry for the noise. DG --- On Tue, 7/14/09, David Goldsmith wrote: > From: David Goldsmith > Subject: Re: [SciPy-dev] generic.base > To: "SciPy Developers List" > Date: Tuesday, July 14, 2009, 10:39 AM > > > --- On Tue, 7/14/09, Robert Kern > > wrote: > > > > > The scalar types need to be immutable > > May I infer from this that all attributes of generic are > "read-only"? > > DG > > > ? ? ? 
> _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From d_l_goldsmith at yahoo.com Tue Jul 14 13:51:44 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 14 Jul 2009 10:51:44 -0700 (PDT) Subject: [SciPy-dev] 10 min to Skypecon Message-ID: <118790.28634.qm@web52112.mail.re2.yahoo.com> From pgmdevlist at gmail.com Tue Jul 14 22:51:54 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Tue, 14 Jul 2009 22:51:54 -0400 Subject: [SciPy-dev] Anybody to confirm a result in scipy.stats.distributions Message-ID: <1D087DBD-351A-48F4-A743-C8FBE7985640@gmail.com> All, Could anybody in/confirm the following result ? (scipy 0.8.0.dev5845), >>> import scipy.stats.distributions as dist >>> dist.gamma(4.).cdf(dist.gamma(4.).ppf(0.25)) 0.0 I would have expected 0.25, of course. The pb seems to be >>> dist.gamma(4.).ppf(0.25) 0.0 when R gives 2.53532 Thx in advance for any help/comments P. From david at ar.media.kyoto-u.ac.jp Tue Jul 14 22:41:06 2009 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Wed, 15 Jul 2009 11:41:06 +0900 Subject: [SciPy-dev] scipy.linalg.eigvals error on 64bit RH3 & RH4 In-Reply-To: <4A5CAAC0.3030606@gmail.com> References: <4A5BAA75.7070209@gmail.com> <3d375d730907131454j40b93880ubd8eb5ffd6115a8c@mail.gmail.com> <4A5BBF60.4040706@gmail.com> <4A5C15DA.5030405@ar.media.kyoto-u.ac.jp> <4A5CAAC0.3030606@gmail.com> Message-ID: <4A5D41C2.3060505@ar.media.kyoto-u.ac.jp> David Martin wrote: > Python 2.5.4 (r254:67916, May 20 2009, 22:57:32) [GCC 3.2.3 20030502 > (Red Hat Linux 3.2.3-56)] on linux2 > Type "help", "copyright", "credits" or "license" for more information. > >>> from scipy.linalg import atlas_version > >>> atlas_version > '/home/tester/master/lib/python2.5/site-packages/scipy/linalg/atlas_version.so'> > >>> atlas_version.version() > ATLAS version 3.8.3 built by builder on Mon Jul 13 20:09:54 CDT 2009: > UNAME : Linux centos-3-x86_64-build 2.4.21-47.EL #1 SMP Tue Aug 1 > 07:58:43 EDT 2006 x86_64 x86_64 x86_64 GNU/Linux > INSTFLG : -1 0 -a 1 > ARCHDEFS : -DATL_OS_Linux -DATL_ARCH_Core2 -DATL_CPUMHZ=2492 > -DATL_SSE3 -DATL_SSE2 -DATL_SSE1 -DATL_USE64BITS -DATL_GAS_x8664 > F2CDEFS : -DAdd__ -DF77_INTEGER=int -DStringSunStyle > CACHEEDGE: 262144 > F77 : g77, version GNU Fortran (GCC 3.2.3 20030502 (Red Hat > Linux 3.2.3-56)) 3.2.3 20030502 (Red Hat Linux 3.2.3-56) > F77FLAGS : -O -fPIC -g -m64 > SMC : gcc, version gcc (GCC) 3.2.3 20030502 (Red Hat Linux 3.2.3-56) > SMCFLAGS : -fomit-frame-pointer -mfpmath=sse -msse3 -O2 -fPIC -g -m64 > SKC : gcc, version gcc (GCC) 3.2.3 20030502 (Red Hat Linux 3.2.3-56) > SKCFLAGS : -fomit-frame-pointer -mfpmath=sse -msse3 -O2 -fPIC -g -m64 > There are some problems with gcc 3.2/3.3 and using sse as the math unit (-mfpmath=sse), and I would not be surprised if g77 inherits them as well. Maybe Red Hat backported fixes, though, you would have to see this with your OS provider. One way to confirm this would be to compile your own gcc/gfortran, and recompile everything (lapack, atlas, numpy and scipy) with it. 
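Once everything is rebuilt, a quick Python-level smoke test (only a rough sketch, not a substitute for the numpy/scipy test suites; the matrix is arbitrary) is something like:

import numpy as np
from scipy import linalg

# Known answer: the eigenvalues of [[2, 1], [1, 2]] are exactly 1 and 3.
# A miscompiled ATLAS/LAPACK underneath tends to show up here as grossly
# wrong or spuriously complex values.
a = np.array([[2.0, 1.0],
              [1.0, 2.0]])
w = np.sort(linalg.eigvals(a).real)
assert np.allclose(w, [1.0, 3.0]), w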
cheers, David From robert.kern at gmail.com Tue Jul 14 23:06:04 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 14 Jul 2009 22:06:04 -0500 Subject: [SciPy-dev] Anybody to confirm a result in scipy.stats.distributions In-Reply-To: <1D087DBD-351A-48F4-A743-C8FBE7985640@gmail.com> References: <1D087DBD-351A-48F4-A743-C8FBE7985640@gmail.com> Message-ID: <3d375d730907142006o709a2d37k2f78500e043336b9@mail.gmail.com> On Tue, Jul 14, 2009 at 21:51, Pierre GM wrote: > All, > Could anybody in/confirm the following result ? (scipy 0.8.0.dev5845), > > ?>>> import scipy.stats.distributions as dist > ?>>> dist.gamma(4.).cdf(dist.gamma(4.).ppf(0.25)) > 0.0 I can confirm the bug. special.gammaincinv(x,y) appears to not like y=0.25 for any x, but any other y seems to be fine. These are suggestive lines in c_misc/gammaincinv.c: if (a <= 0.0 || y <= 0.0 || y > 0.25) { return cephes_igami(a, 1-y); } double flo = -y, fhi = 0.25 - y; -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From pgmdevlist at gmail.com Tue Jul 14 23:17:54 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Tue, 14 Jul 2009 23:17:54 -0400 Subject: [SciPy-dev] Anybody to confirm a result in scipy.stats.distributions In-Reply-To: <3d375d730907142006o709a2d37k2f78500e043336b9@mail.gmail.com> References: <1D087DBD-351A-48F4-A743-C8FBE7985640@gmail.com> <3d375d730907142006o709a2d37k2f78500e043336b9@mail.gmail.com> Message-ID: On Jul 14, 2009, at 11:06 PM, Robert Kern wrote: > On Tue, Jul 14, 2009 at 21:51, Pierre GM wrote: >> All, >> Could anybody in/confirm the following result ? (scipy >> 0.8.0.dev5845), >> >> >>> import scipy.stats.distributions as dist >> >>> dist.gamma(4.).cdf(dist.gamma(4.).ppf(0.25)) >> 0.0 > > I can confirm the bug. special.gammaincinv(x,y) appears to not like > y=0.25 for any x, but any other y seems to be fine. Excellent (well, quite the opposite, but at least it doesn't come from me). I created ticket #975. Robert, thanks a lot for confirming. Meanwhile, I gonna work with 0.2500001 instead of 0.25. P. From d_l_goldsmith at yahoo.com Wed Jul 15 19:09:26 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Wed, 15 Jul 2009 16:09:26 -0700 (PDT) Subject: [SciPy-dev] bool8 link returns bool_ docstring Message-ID: <217676.53437.qm@web52106.mail.re2.yahoo.com> Hi! So, now I'm working on the Scalar Types category; perhaps this is "old news," but when I click on the link for bool8, I get the docstring for bool_; similarly for other types (e.g., cfloat and complex128 both return the docstring for complex128). Is this a "bug" or a "feature"? If the latter, may I assume that if link_type_A and link_type_B both return the docstring for type_A, then type_A is the "base" and type_B is the "alias"? Thanks! DG From d_l_goldsmith at yahoo.com Wed Jul 15 20:37:18 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Wed, 15 Jul 2009 17:37:18 -0700 (PDT) Subject: [SciPy-dev] bool8 link returns bool_ docstring Message-ID: <937606.42266.qm@web52103.mail.re2.yahoo.com> Once again I believe I've answered my own question (I need to have more faith in my research abilities and not cry for help so quickly): Figure 2.2 in "Guide To Numpy" illustrating the hierarchy of scalar types - the names used there are the "base" names, any equivalent is considered an "alias," yes? 
Oh, but this doesn't answer my first question: is the fact that the Wiki returns the same docstring for equivalent types a bug or a feature? (At this point I'm assuming "feature," but confirmation would be appreciated.) Thanks, DG --- On Wed, 7/15/09, David Goldsmith wrote: > From: David Goldsmith > Subject: bool8 link returns bool_ docstring > To: scipy-dev at scipy.org > Date: Wednesday, July 15, 2009, 4:09 PM > Hi!? So, now I'm working on the > Scalar Types category; perhaps this is "old news," but when > I click on the link for bool8, I get the docstring for > bool_; similarly for other types (e.g., cfloat and > complex128 both return the docstring for complex128).? > Is this a "bug" or a "feature"?? If the latter, may I > assume that if link_type_A and link_type_B both return the > docstring for type_A, then type_A is the "base" and type_B > is the "alias"?? Thanks! > > DG > > > ? ? ? > From gael.varoquaux at normalesup.org Wed Jul 15 22:13:11 2009 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Thu, 16 Jul 2009 04:13:11 +0200 Subject: [SciPy-dev] [ANN] Announcing the SciPy conference schedule Message-ID: <20090716021311.GA21642@phare.normalesup.org> The SciPy conference committee is pleased to announce the schedule of the conference: http://conference.scipy.org/schedule This year?s program is very rich. In order to limit the number of interesting talks that we had to turn down, we decided to reduce the length of talks. Although this results in many short talks, we hope that it will foster discussions, and give new ideas. Many subjects are covered, both varying technical subject in the scientific computing spectrum, and covering a lot of different research areas. I would personally like to thank the members of the program committee, who spent time reviewing the proposed abstracts and giving the chairs feedback. Fernando Perez and the tutorial presenters are hard at work finishing planning all the details of the two-day tutorial session that will precede the conference. An introduction tutorial track and an advanced tutorial track, both covering various aspect of scientific computing in Python, presented by experts in the field, should help many people getting up to speed on the amazing technology driving this community. 
The SciPy 2009 program committee * Co-Chair Ga?l Varoquaux, Applied Mathematics and Neuroscience, * Neurospin, CEA - INRIA Saclay (France) * Co-Chair St?fan van der Walt, Applied Mathematics, University of * Stellenbosch (South Africa) * Michael Aivazis, Center for Advanced Computing Research, California * Institute of Technology (USA) * Brian Granger, Physics Department, California Polytechnic State * University, San Luis Obispo (USA) * Aric Hagberg, Theoretical Division, Los Alamos National Laboratory * (USA) * Konrad Hinsen, Centre de Biophysique Mol?culaire, CNRS Orl?ans * (France) * Randall LeVeque, Mathematics, University of Washington, Seattle * (USA) * Travis Oliphant, Enthought (USA) * Prabhu Ramachandran, Department of Aerospace Engineering, IIT * Bombay (India) * Raphael Ritz, International Neuroinformatics Coordinating Facility * (Sweden) * William Stein, Mathematics, University of Washington, Seattle (USA) Conference Chair: Jarrod Millman, Neuroscience Institute, UC Berkeley (USA) From frostedcheerios at gmail.com Thu Jul 16 11:28:53 2009 From: frostedcheerios at gmail.com (David Martin) Date: Thu, 16 Jul 2009 10:28:53 -0500 Subject: [SciPy-dev] scipy.linalg.eigvals error on 64bit RH3 & RH4 In-Reply-To: <4A5D41C2.3060505@ar.media.kyoto-u.ac.jp> References: <4A5BAA75.7070209@gmail.com> <3d375d730907131454j40b93880ubd8eb5ffd6115a8c@mail.gmail.com> <4A5BBF60.4040706@gmail.com> <4A5C15DA.5030405@ar.media.kyoto-u.ac.jp> <4A5CAAC0.3030606@gmail.com> <4A5D41C2.3060505@ar.media.kyoto-u.ac.jp> Message-ID: <4A5F4735.6000004@gmail.com> David Cournapeau wrote: > David Martin wrote: > >> Python 2.5.4 (r254:67916, May 20 2009, 22:57:32) [GCC 3.2.3 20030502 >> (Red Hat Linux 3.2.3-56)] on linux2 >> Type "help", "copyright", "credits" or "license" for more information. >> >>> from scipy.linalg import atlas_version >> >>> atlas_version >> > '/home/tester/master/lib/python2.5/site-packages/scipy/linalg/atlas_version.so'> >> >>> atlas_version.version() >> ATLAS version 3.8.3 built by builder on Mon Jul 13 20:09:54 CDT 2009: >> UNAME : Linux centos-3-x86_64-build 2.4.21-47.EL #1 SMP Tue Aug 1 >> 07:58:43 EDT 2006 x86_64 x86_64 x86_64 GNU/Linux >> INSTFLG : -1 0 -a 1 >> ARCHDEFS : -DATL_OS_Linux -DATL_ARCH_Core2 -DATL_CPUMHZ=2492 >> -DATL_SSE3 -DATL_SSE2 -DATL_SSE1 -DATL_USE64BITS -DATL_GAS_x8664 >> F2CDEFS : -DAdd__ -DF77_INTEGER=int -DStringSunStyle >> CACHEEDGE: 262144 >> F77 : g77, version GNU Fortran (GCC 3.2.3 20030502 (Red Hat >> Linux 3.2.3-56)) 3.2.3 20030502 (Red Hat Linux 3.2.3-56) >> F77FLAGS : -O -fPIC -g -m64 >> SMC : gcc, version gcc (GCC) 3.2.3 20030502 (Red Hat Linux 3.2.3-56) >> SMCFLAGS : -fomit-frame-pointer -mfpmath=sse -msse3 -O2 -fPIC -g -m64 >> SKC : gcc, version gcc (GCC) 3.2.3 20030502 (Red Hat Linux 3.2.3-56) >> SKCFLAGS : -fomit-frame-pointer -mfpmath=sse -msse3 -O2 -fPIC -g -m64 >> >> > > There are some problems with gcc 3.2/3.3 and using sse as the math unit > (-mfpmath=sse), and I would not be surprised if g77 inherits them as > well. Maybe Red Hat backported fixes, though, you would have to see this > with your OS provider. One way to confirm this would be to compile your > own gcc/gfortran, and recompile everything (lapack, atlas, numpy and > scipy) with it Unfortunately if I upgrade gcc to 3.4.6, the Scipy egg which is built using that compiler will require libstdc++.so.6 instead of libstdc++.so.5. 
libstdc++.so.5 exists on a clean install of RH3 in /usr/lib, but libstdc++.so.6 does not, so any Scipy built with gcc 3.4.6 cannot be distributed and used on other RH3 platforms without updating the c libraries on those platforms before hand (and the goal of this build is a Scipy egg which I can package in an installer). -David From stefan at sun.ac.za Thu Jul 16 11:49:21 2009 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 16 Jul 2009 17:49:21 +0200 Subject: [SciPy-dev] bool8 link returns bool_ docstring In-Reply-To: <937606.42266.qm@web52103.mail.re2.yahoo.com> References: <937606.42266.qm@web52103.mail.re2.yahoo.com> Message-ID: <9457e7c80907160849k49bbdaf1oca968ed627dbc2a7@mail.gmail.com> 2009/7/16 David Goldsmith : > > Once again I believe I've answered my own question (I need to have more faith in my research abilities and not cry for help so quickly): Figure 2.2 in "Guide To Numpy" illustrating the hierarchy of scalar types - > the names used there are the "base" names, any equivalent is considered an "alias," yes? ?Oh, but this doesn't answer my first question: is the fact that the Wiki returns the same docstring for equivalent types a bug or a feature? ?(At this point I'm assuming "feature," but confirmation would be appreciated.) ?Thanks, It's the same object: In [8]: np.bool_ Out[8]: In [9]: np.bool8 Out[9]: Cheers St?fan From dpeterson at enthought.com Thu Jul 16 15:27:54 2009 From: dpeterson at enthought.com (Dave Peterson) Date: Thu, 16 Jul 2009 14:27:54 -0500 Subject: [SciPy-dev] ANNOUNCEMENT: Enthought Tool Suite (ETS) v3.3.0 released Message-ID: <4A5F7F3A.4090101@enthought.com> Hello, I'm pleased to announce that Enthought Tool Suite (ETS) version 3.3.0 has been tagged and released! Please see below for a partial list of changes for this release. PyPi has been updated with the release, including the availability of both Windows binaries (.egg) and source distributions (.tar.gz). A full install of ETS can be done using a command like: easy_install -U "ETS[nonets] == 3.3.0" HOWEVER, it is important to note that there are still package dependencies that are outside the scope of easy_install. Therefore, we recommend that you have the following installed prior to installing ETS: setuptools: minimum version 0.6c9 VTK: minimum version 5.0, recommended 5.2 or later And at least one of: wxPython: minimum version 2.8.7.1 PyQt: minimum version 4.4 For additional installation information, see: https://svn.enthought.com/enthought/wiki/Install What Is ETS? =========== The Enthought Tool Suite (ETS) is a collection of components developed by Enthought and the open-source community, which we use every day to construct scientific applications. It includes a wide variety of components, including: * an extensible application framework * application building blocks * 2-D and 3-D graphics libraries * scientific and math libraries * developer tools The cornerstone on which these tools rest is the Traits package, which provides explicit type declarations in Python; its features include initialization, validation, delegation, notification, and visualization of typed attributes. More information on ETS is available from the development home page: http://code.enthought.com/projects/index.php ETS 3.3.0 is a feature-added update to ETS 3.2.0, including numerous bug-fixes. 
Some of the notable changes include (sub-projects listed in alphabetical order): Chaco 3.2.0 (July 15, 2009) =========================== Enhancements ------------ * Bounded grids - Horizontal and Vertical grid line start and end points can now be set to a fixed value in data space, or to be the return value of an arbitrary callable. The start and end tick can also be configured via the data_min and data_max traits. * Added dictionary interface to ArrayPlotData * Added a Traits UI view to the ScalesAxis * Added a new chaco.overlays subpackage and a new overlay, the DataBox. * Improved appearance of PlotToolbar * Changed add_default_axis() in the plot_factory module to take an axis class as a keyword argument. * Refactored contour plots into a common base class and implemented proper event handling when their colormaps or data ranges change. * Changed default colormap on img_plot() and contour_plot() method of Plot to be Spectral instead of Jet. * Added two more color palettes to the auto color palette, and created a default_colors module. * Added CandlePlot renderer * Changed Plot Labels to able to display inside the plot area, instead of only on the outside * Added optional title attribute to plot legends * Updated all containers to respect and use the new fixed_preferred_size trait on enable.Component * New Examples: * Interval trait editor as a Chaco example (from Stefan van der Walt) * embedding an interactive Chaco plot inside a VTK RenderWindow using the new Enable VTK backend * lasso tool on an image plot * bounded grid * candle plot Fixes ----- * Fixed call signature of ShowAllTickGenerator.get_ticks() * Plot.title_font is now a delegate to the underlying PlotLabel object (from Chris Colbert) * Fixed mouse event handling bug in RangeSelection (from Stefan van der Walt) * ImagePlots now redraw when their input data changes. * Fixed cached image invalidation in colormapped image plots * Optimized ScatterPlot.map_index() when index_only is True and the index data is sorted * Changed ColormappedScatterPlot to no longer incorrectly apply the fill_alpha to the outline color * Improved date ticking heuristics in chaco.scales subpackage, specifically eliminating the bug where all times between, midnight and 1am would be formatted at too course of a time resolution. * Cleanup of various examples (titles, appearance) * The spectrum.py (audio spectrograph) example now properly closes the audio stream. Enable 3.2.0 (July 15th, 2009) ============================== enthought.enable Enhancements ----------------------------- * Added Slider and Compass widgets * Added an OverlayContainer (almost identical to the one in Chaco) * Added ImageGraphicsContextEnable class so that one can always import a Kiva Image backend-based GraphicsContextEnable * renaming marker_trait to MarkerTrait (the old name is still permitted for backwards compatibility, but should be avoided) * Moved the scatter_markers module from Chaco to Enable, so that Enable components can use MarkerTrait * Added an experimental VTK backend for Enable, along with an example * Changed SVGButtonEditor toggle to draw a SVG under the button SVG instead of drawing a plain box * Added labels for SVGButton * Improving backbuffering performance on the Mac by creating the layer context from the window context instead of from a bitmap. 
* Adding a "fixed_preferred_size" trait to Components, so that relative size preferences can be expressed amongst different components in a container enthought.enable Fixes ---------------------- * Improved the backend selection to match the Traits UI backend unless ETSConfig.enable_toolkit is explicitly set * Fixed demo_main() in example_support.py so that it doesn't crash IPython * Fixed RGBAColorTrait so it can be used with the null toolkit * Changed the "sys_window" color to be the same as the Traits UI "WindowColor" constant * Fixed backend_cairo's get_text_extent() implementation to match other backends enthought.kiva Enhancements --------------------------- * Added basic gradients to Kiva enthought.kiva Fixes -------------------- * Fixed Python 2.6 datatype errors * Fixed memory leak as reported in ticket 1815 * The macport test is only run on Darwin systems * Removed deprecated calls to old numpy APIs Traits 3.2.0 ============ * Implemented editable_labels attribute in the TabularEditor for enabling editing of the labels (i.e. the first column) * Saving/restoring window positions works with multiple displays of different sizes * New ProgressEditor * Changed default colors for TableEditor * Added support for HTMLEditor for QT backend using QtWebKit * Improved support for opening links in external browser from HTMLEditor * Added support for TabularEditor for QT backend * Added support for marking up the CodeEditor, including adding squiggles and dimming lines * Added SearchEditor * Improved unicode support * Changed behavior of RangeEditor text box to not auto-set * Added support in RangeEditor for specifying the method to evaluate new values. * Add DefaultOverride editor factory courtesy St?fan van der Walt * Removed sys.exit() call from SaveHandler.exit() TraitsBackendQt 3.2.0 (July 15, 2009) ===================================== * Fixed a plethora of layout bugs * Implemented RGBColor trait * Fixed events not fired for 'custom' TextEditor * Improved the method by which the QT backend dispatches trait handlers when dispatch='ui'. Before, the code only worked when on the main application thread or a QThread. Now it works for regular Python threads too. * Fixed events not being fired correctly in TableEditor * Added support or 'click' and 'dclick' factory attributes to the TableEditor * TableEditor instances now editable * Improved FileEditor to look and act like the WX editor * Fixed missing minimize/maximize buttons for resizable dialogs * New HTMLEditor using QtWebKit * New TabularEditor * Added support for panel buttons * New SearchEditor * Added support for clipboard * Now responds to SIGINT correctly rather than raising KeyboardInterrupt TraitsBackendWX 3.2.0 (July 15, 2009) ===================================== * Fixed bug in DateEditor which would not display Feb correctly if the current date was visible and greater than the number of days in Feb * Reduced border_size from 1 to 4 for Group instances * Fixed layout issues: * Windows are now resized if they are larger than the desktop * Windows are now put in a valid location if they were opened off-screen * Windows with smaller parent are no longer opened at invalid positions with negative y values * Implemented editable_labels attribute in the TabularEditor for enabling editing of the labels (i.e. the first column) * Fix bug in ListEditor where a trait change listener would be fired when intermediate traits changed (when extended_name was of the type item1.item2.item3..) leading to a traceback. 
* Saving/restoring windows now works with multiple displays of different sizes * New ProgressDialog * Improved window colors to match desktop colors more closely * Replaced calls of wx.Yield() with wx.GetApp().Yield(True) * Changed default font to use system font * Fixed TabularEditor compatibility problem with wx 2.6 regarding the page-down key * Fixed bug in propagating click events in the TabularEditor to parent windows * DateEditor wx 2.6 compatability fixed * TableEditor scrollbar fixed * Improved support for opening links in external browser from HTMLEditor * Reduced the number of update events the PythonEditor fired * moved grid package from TraitsGui egg into enthought.pyface.ui.wx * moved clipboard from enthought.util.wx into pyface TraitsGUI 3.1.0 (July 15, 2009) =============================== * Removed Theming support from DockWindows. Borders and tabs are now drawn using lines instead of stretching images. * Changed default font to use the system font * Moved enthought.util.wx.clipboard to Pyface * Moved the grid package out of pyface and into pyface.ui.wx, left deprecated warnings * Improved info shown to the user if toolkits don't work as expected From frostedcheerios at gmail.com Thu Jul 16 17:42:37 2009 From: frostedcheerios at gmail.com (David Martin) Date: Thu, 16 Jul 2009 16:42:37 -0500 Subject: [SciPy-dev] scipy.linalg.eigvals error on 64bit RH3 & RH4 In-Reply-To: <4A5F4735.6000004@gmail.com> References: <4A5BAA75.7070209@gmail.com> <3d375d730907131454j40b93880ubd8eb5ffd6115a8c@mail.gmail.com> <4A5BBF60.4040706@gmail.com> <4A5C15DA.5030405@ar.media.kyoto-u.ac.jp> <4A5CAAC0.3030606@gmail.com> <4A5D41C2.3060505@ar.media.kyoto-u.ac.jp> <4A5F4735.6000004@gmail.com> Message-ID: <4A5F9ECD.6000208@gmail.com> David Martin wrote: > David Cournapeau wrote: >> David Martin wrote: >> >>> Python 2.5.4 (r254:67916, May 20 2009, 22:57:32) [GCC 3.2.3 20030502 >>> (Red Hat Linux 3.2.3-56)] on linux2 >>> Type "help", "copyright", "credits" or "license" for more information. >>> >>> from scipy.linalg import atlas_version >>> >>> atlas_version >>> >> '/home/tester/master/lib/python2.5/site-packages/scipy/linalg/atlas_version.so'> >>> >>> >>> atlas_version.version() >>> ATLAS version 3.8.3 built by builder on Mon Jul 13 20:09:54 CDT 2009: >>> UNAME : Linux centos-3-x86_64-build 2.4.21-47.EL #1 SMP Tue >>> Aug 1 07:58:43 EDT 2006 x86_64 x86_64 x86_64 GNU/Linux >>> INSTFLG : -1 0 -a 1 >>> ARCHDEFS : -DATL_OS_Linux -DATL_ARCH_Core2 -DATL_CPUMHZ=2492 >>> -DATL_SSE3 -DATL_SSE2 -DATL_SSE1 -DATL_USE64BITS -DATL_GAS_x8664 >>> F2CDEFS : -DAdd__ -DF77_INTEGER=int -DStringSunStyle >>> CACHEEDGE: 262144 >>> F77 : g77, version GNU Fortran (GCC 3.2.3 20030502 (Red Hat >>> Linux 3.2.3-56)) 3.2.3 20030502 (Red Hat Linux 3.2.3-56) >>> F77FLAGS : -O -fPIC -g -m64 >>> SMC : gcc, version gcc (GCC) 3.2.3 20030502 (Red Hat Linux >>> 3.2.3-56) >>> SMCFLAGS : -fomit-frame-pointer -mfpmath=sse -msse3 -O2 -fPIC -g >>> -m64 >>> SKC : gcc, version gcc (GCC) 3.2.3 20030502 (Red Hat Linux >>> 3.2.3-56) >>> SKCFLAGS : -fomit-frame-pointer -mfpmath=sse -msse3 -O2 -fPIC -g >>> -m64 >>> >> >> There are some problems with gcc 3.2/3.3 and using sse as the math unit >> (-mfpmath=sse), and I would not be surprised if g77 inherits them as >> well. Maybe Red Hat backported fixes, though, you would have to see this >> with your OS provider. 
One way to confirm this would be to compile your >> own gcc/gfortran, and recompile everything (lapack, atlas, numpy and >> scipy) with it > > Unfortunately if I upgrade gcc to 3.4.6, the Scipy egg which is built > using that compiler will require libstdc++.so.6 instead of > libstdc++.so.5. libstdc++.so.5 exists on a clean install of RH3 in > /usr/lib, but libstdc++.so.6 does not, so any Scipy built with gcc > 3.4.6 cannot be distributed and used on other RH3 platforms without > updating the c libraries on those platforms before hand (and the goal > of this build is a Scipy egg which I can package in an installer). > As a follow up, it seems the resulting dependency on the newer C libraries won't be a problem. I've rebuilt ATLAS, Numpy, and Scipy using gcc/g77 3.4.6 and all Scipy tests now pass. -David From d_l_goldsmith at yahoo.com Thu Jul 16 20:20:19 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Thu, 16 Jul 2009 17:20:19 -0700 (PDT) Subject: [SciPy-dev] 'nother goal met! Message-ID: <796416.43611.qm@web52101.mail.re2.yahoo.com> Scalar Types! (Two items need to be changed to "Unimportant" by someone who can do that, everything else "Needs Review".) DG From mtrumpis at berkeley.edu Thu Jul 16 20:23:08 2009 From: mtrumpis at berkeley.edu (M Trumpis) Date: Thu, 16 Jul 2009 17:23:08 -0700 Subject: [SciPy-dev] threaded fftw + blitz + weave Message-ID: I just finished reworking some code I wrote a while ago that drives FFTW in parallel for repeated transforms. My original code only operated in a restrictive manner, but I finally got around to letting it run in arbitrarily strided arrays. The idea is to plan an FFT of a certain structure on one thread, and then split the data up and apply the plan on multiple threads. (Since I work with images and like my arrays to be centered at N/2, my implementation has the threads do some change of variables magic with hairy strided pointer walking--more memory efficient than fftshift.) I'm wondering if people think this could be useful, and if it would be a small effort to bring it to scipy? here's a look at the c++ code and the weave wrapping https://cirl.berkeley.edu/trac/browser/bic/branches/rtools_weave/root/recon/fftmod/src/blitz_ffts.cc https://cirl.berkeley.edu/trac/browser/bic/branches/rtools_weave/root/recon/fftmod/__init__.py The code is definitely working and passes comparison tests within reasonable accuracy. That said, it's sure to be rough around the edges (I'm not terribly careful coding in C++). Has: *If an N-D array is to be split up into T transforms (of any rank R <= N), and T can be split evenly into P threads, then the tool parallelizes the fft calls *inplace and out of place xforms *shifted or non-shifted xforms Hasn't: *no Microsoft threads (no clue how they work) *no real-to-complex or c2r transforms *no fftw plan caching -- other than what the fftw libs keep persistent *may not work for negatively strided arrays.. 
haven't checked
*definitely won't work for funky strides (==0) of course

From d_l_goldsmith at yahoo.com Fri Jul 17 03:05:30 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Fri, 17 Jul 2009 00:05:30 -0700 (PDT) Subject: [SciPy-dev] Skypecon Minutes Message-ID: <721331.60629.qm@web52106.mail.re2.yahoo.com>

NumPy Doc Summer '09 Marathon
July 14, 2009 Skypecon Minutes

Start: 18:15 UTC
End: 19:15 UTC
Present: David Goldsmith (Secretary); Ralf Gommers; Joe Harrington; Jack Liddle; Stéfan van der Walt

Discussions:

0) Status of current effort

Joe expressed general satisfaction with the current state and momentum. Ralf opined (and no one really argued the point) that much of what is left is either "loose ends," should be "Unimportant" if it isn't already, or will require the contributions of persons with more in-depth knowledge of NumPy than the average contributor to date. This motivated the next discussion:

1) Commencement of work on SciPy doc

Joe expressed the view that we should be well into the review process for NumPy before beginning work on SciPy. This led to a discussion of the "two-review" process and, mostly, its labor requirements vis-a-vis required expertise, reviewer vetting criteria and process, etc.; subsequently, a ticket was filed requesting enhancement of the pydocweb infrastructure to support this, and development of some rough guidelines for Reviewers was assigned to David.

Joe also expressed concern that SciPy may soon (i.e., on the order of years, since the SciPy doc effort is seen as likely being of the same order) undergo a medium-to-major overhaul, so that a lot of labor spent documenting soon-to-be obsolete material could be wasted; Stéfan countered that this shouldn't be a great concern: even if there is an overhaul, modules within SciPy are relatively independent of each other and not much code will be completely discarded, so documenting SciPy as it stands will mostly, if not completely, remain useful. On the other hand, Jack, with David and Ralf's support, pointed out that the lack of "pedestrian" docstrings to edit in NumPy, of which SciPy has a plethora, may discourage a lot of the volunteer labor pool, perhaps resulting in another "stall" in activity. Everyone agreed that this was the greater concern, and it was resolved that work should commence on the SciPy docstring project, beginning with organization of the SciPy material into Milestones along the lines of what we presently have for NumPy, a task which was delegated to Jack.

2) Discussed the idea of a "User Guide"

And whether, for instance, SciPy and NumPy should have separate UGs or be packaged together and designed and organized accordingly. No resolution.

3) Meeting concluded with a specific question

Jack asked for clarification of how to document a function which is basically just a convenience function, in that it thinly wraps another function, merely setting one parameter before "relaying" its call; David noted that this is exactly analogous to the recent discussions about methods equivalent to functions, and the same solution was suggested, namely, document the wrapper function minimally, referring to the wrapped function for details; no opposing view was voiced.

Next Skypecon: Wednesday, July 22, 19:00 UTC

Notes: A few "drops," but relatively minor considering the number of people participating and the technical problems we've had in recent Skypecons.
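For illustration of the convention suggested under item 3 (the function names below are made up; neither exists in NumPy or SciPy), the minimal wrapper docstring amounts to something like:

# Hypothetical functions, shown only to illustrate the docstring convention.
def workhorse(x, mode='exact'):
    """Pretend this is the fully documented, wrapped function."""
    return x if mode == 'exact' else 2 * x

def convenience(x):
    """
    Thin wrapper around `workhorse` with ``mode='fast'`` preset.

    See `workhorse` for the full description of parameters and return
    values.
    """
    return workhorse(x, mode='fast')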
From william.ratcliff at gmail.com Fri Jul 17 03:38:43 2009 From: william.ratcliff at gmail.com (william ratcliff) Date: Fri, 17 Jul 2009 03:38:43 -0400 Subject: [SciPy-dev] [Numpy-discussion] [ANN] Announcing the SciPy conference schedule In-Reply-To: <20090716021311.GA21642@phare.normalesup.org> References: <20090716021311.GA21642@phare.normalesup.org> Message-ID: <827183970907170038jef07198r867b086a284e8b4a@mail.gmail.com> A humble suggestion--for the March meeting of the american physical society, there is a roommate finder for splitting hotel rooms. This could be useful in keeping expenses down for some. There should be a way to do it without liability.... Cheers, William On Wed, Jul 15, 2009 at 10:13 PM, Gael Varoquaux < gael.varoquaux at normalesup.org> wrote: > The SciPy conference committee is pleased to announce the schedule of the > conference: > > http://conference.scipy.org/schedule > > This year?s program is very rich. In order to limit the number of > interesting talks that we had to turn down, we decided to reduce the > length of talks. Although this results in many short talks, we hope that > it will foster discussions, and give new ideas. Many subjects are > covered, both varying technical subject in the scientific computing > spectrum, and covering a lot of different research areas. > > I would personally like to thank the members of the program committee, > who spent time reviewing the proposed abstracts and giving the chairs > feedback. > > Fernando Perez and the tutorial presenters are hard at work finishing > planning all the details of the two-day tutorial session that will > precede the conference. An introduction tutorial track and an advanced > tutorial track, both covering various aspect of scientific computing in > Python, presented by experts in the field, should help many people > getting up to speed on the amazing technology driving this community. > > The SciPy 2009 program committee > > * Co-Chair Ga?l Varoquaux, Applied Mathematics and Neuroscience, > * Neurospin, CEA - INRIA Saclay (France) > * Co-Chair St?fan van der Walt, Applied Mathematics, University of > * Stellenbosch (South Africa) > * Michael Aivazis, Center for Advanced Computing Research, California > * Institute of Technology (USA) > * Brian Granger, Physics Department, California Polytechnic State > * University, San Luis Obispo (USA) > * Aric Hagberg, Theoretical Division, Los Alamos National Laboratory > * (USA) > * Konrad Hinsen, Centre de Biophysique Mol?culaire, CNRS Orl?ans > * (France) > * Randall LeVeque, Mathematics, University of Washington, Seattle > * (USA) > * Travis Oliphant, Enthought (USA) > * Prabhu Ramachandran, Department of Aerospace Engineering, IIT > * Bombay (India) > * Raphael Ritz, International Neuroinformatics Coordinating Facility > * (Sweden) > * William Stein, Mathematics, University of Washington, Seattle (USA) > > Conference Chair: Jarrod Millman, Neuroscience Institute, UC Berkeley > (USA) > > _______________________________________________ > NumPy-Discussion mailing list > NumPy-Discussion at scipy.org > http://mail.scipy.org/mailman/listinfo/numpy-discussion > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From robert.kern at gmail.com Fri Jul 17 11:21:34 2009 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 17 Jul 2009 10:21:34 -0500 Subject: [SciPy-dev] [SciPy-user] [Numpy-discussion] [ANN] Announcing the SciPy conference schedule In-Reply-To: <827183970907170038jef07198r867b086a284e8b4a@mail.gmail.com> References: <20090716021311.GA21642@phare.normalesup.org> <827183970907170038jef07198r867b086a284e8b4a@mail.gmail.com> Message-ID: <3d375d730907170821l32139208ob3410bd9c9520c01@mail.gmail.com> On Fri, Jul 17, 2009 at 02:38, william ratcliff wrote: > A humble suggestion--for the March meeting of the american physical society, > there is a roommate finder for splitting hotel rooms. ?This could be useful > in keeping expenses down for some. ?There should be a way to do it without > liability.... A wiki page would probably be the best thing given the short time frame. I recommend either the Saga or the Vagabond hotels for keeping costs down and staying close to campus. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From jsseabold at gmail.com Fri Jul 17 13:18:04 2009 From: jsseabold at gmail.com (Skipper Seabold) Date: Fri, 17 Jul 2009 13:18:04 -0400 Subject: [SciPy-dev] Test Design Question for Stats.Models Code Message-ID: Hello all, I am polishing up the generalized linear models right now for the stats.models project and I have a question about using decorators with my tests. The GLM framework has a central model with shared properties and then several variations on this model, so to test I have just as a simplified example: from numpy.testing import * DECIMAL = 4 class check_glm(object): ''' res2 results will be obtained from R or the RModelwrap ''' def test_params(self): assert_almost_equal(self.res1.params, self.res2.params, DECIMAL) def test_resids(self): assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL) class test_glm_gamma(check_glm): def __init__(self): # Preprocessing to setup results self.res1 = ResultsFromGLM self.res2 = R_Results if __name__=="__main__": run_module_suite() My question is whether I can skip, for arguments sake, test_resids depending, for example, on the class of self.res2 or because I defined the test condition as True in the test_<> class. I tried putting in the check_glm class @dec.skipif(TestCondition, "Skipping this test because of ...") def test_resids(self): ... TestCondition should be None by default, but how can I get the value of TestCondition to evaluate to True if appropriate? I have tried a few different ways, but I am a little stumped. Does this make sense/is it possible? I'm sure I'm missing something obvious, but any insights would be appreciated. Cheers, Skipper From robert.kern at gmail.com Fri Jul 17 13:26:36 2009 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 17 Jul 2009 12:26:36 -0500 Subject: [SciPy-dev] Test Design Question for Stats.Models Code In-Reply-To: References: Message-ID: <3d375d730907171026u3b96becbg2b1cbb2779058ee8@mail.gmail.com> On Fri, Jul 17, 2009 at 12:18, Skipper Seabold wrote: > Hello all, > > I am polishing up the generalized linear models right now for the > stats.models project and I have a question about using decorators with > my tests. 
?The GLM framework has a central model with shared > properties and then several variations on this model, so to test I > have just as a simplified example: > > from numpy.testing import * > DECIMAL = 4 > class check_glm(object): > ? ?''' > ? ?res2 results will be obtained from R or the RModelwrap > ? ?''' > > ? ?def test_params(self): > ? ? ? ?assert_almost_equal(self.res1.params, self.res2.params, DECIMAL) > > ? ?def test_resids(self): > ? ? ? ?assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL) > > class test_glm_gamma(check_glm): > ? ?def __init__(self): > # Preprocessing to setup results > ? ? ? ?self.res1 = ResultsFromGLM > ? ? ? ?self.res2 = R_Results > > if __name__=="__main__": > ? ?run_module_suite() > > My question is whether I can skip, for arguments sake, test_resids > depending, for example, on the class of self.res2 or because I defined > the test condition as True in the test_<> class. ? I tried putting in > the check_glm class > > @dec.skipif(TestCondition, "Skipping this test because of ...") > def test_resids(self): > ... > > TestCondition should be None by default, but how can I get the value > of TestCondition to evaluate to True if appropriate? ?I have tried a > few different ways, but I am a little stumped. Does this make sense/is > it possible? ?I'm sure I'm missing something obvious, but any insights > would be appreciated. I don't think you can do that with the decorator. Just do the test in code inside the method and raise nose.SkipTest explicitly: def test_resids(self): if not isinstance(self.res1, GLMResults): raise nose.SkipTest("Not a GLM test") -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From bsouthey at gmail.com Fri Jul 17 13:42:47 2009 From: bsouthey at gmail.com (Bruce Southey) Date: Fri, 17 Jul 2009 12:42:47 -0500 Subject: [SciPy-dev] Test Design Question for Stats.Models Code In-Reply-To: <3d375d730907171026u3b96becbg2b1cbb2779058ee8@mail.gmail.com> References: <3d375d730907171026u3b96becbg2b1cbb2779058ee8@mail.gmail.com> Message-ID: <4A60B817.8000806@gmail.com> On 07/17/2009 12:26 PM, Robert Kern wrote: > On Fri, Jul 17, 2009 at 12:18, Skipper Seabold wrote: > >> Hello all, >> >> I am polishing up the generalized linear models right now for the >> stats.models project and I have a question about using decorators with >> my tests. The GLM framework has a central model with shared >> properties and then several variations on this model, so to test I >> have just as a simplified example: >> >> from numpy.testing import * >> DECIMAL = 4 >> class check_glm(object): >> ''' >> res2 results will be obtained from R or the RModelwrap >> ''' >> >> def test_params(self): >> assert_almost_equal(self.res1.params, self.res2.params, DECIMAL) >> >> def test_resids(self): >> assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL) >> >> class test_glm_gamma(check_glm): >> def __init__(self): >> # Preprocessing to setup results >> self.res1 = ResultsFromGLM >> self.res2 = R_Results >> >> if __name__=="__main__": >> run_module_suite() >> >> My question is whether I can skip, for arguments sake, test_resids >> depending, for example, on the class of self.res2 or because I defined >> the test condition as True in the test_<> class. I tried putting in >> the check_glm class >> >> @dec.skipif(TestCondition, "Skipping this test because of ...") >> def test_resids(self): >> ... 
>> >> TestCondition should be None by default, but how can I get the value >> of TestCondition to evaluate to True if appropriate? I have tried a >> few different ways, but I am a little stumped. Does this make sense/is >> it possible? I'm sure I'm missing something obvious, but any insights >> would be appreciated. >> > > I don't think you can do that with the decorator. Just do the test in > code inside the method and raise nose.SkipTest explicitly: > > def test_resids(self): > if not isinstance(self.res1, GLMResults): > raise nose.SkipTest("Not a GLM test") > > I think you should follow (or do something similar to) the approach given under the section 'Creating many similar tests' in the Numpy guidelines under Tips & Tricks: http://projects.scipy.org/numpy/wiki/TestingGuidelines This gives the option to group tests from related models together. Bruce -------------- next part -------------- An HTML attachment was scrubbed... URL: From jsseabold at gmail.com Fri Jul 17 14:40:38 2009 From: jsseabold at gmail.com (Skipper Seabold) Date: Fri, 17 Jul 2009 14:40:38 -0400 Subject: [SciPy-dev] Test Design Question for Stats.Models Code In-Reply-To: <4A60B817.8000806@gmail.com> References: <3d375d730907171026u3b96becbg2b1cbb2779058ee8@mail.gmail.com> <4A60B817.8000806@gmail.com> Message-ID: On Fri, Jul 17, 2009 at 1:42 PM, Bruce Southey wrote: > On 07/17/2009 12:26 PM, Robert Kern wrote: > > On Fri, Jul 17, 2009 at 12:18, Skipper Seabold wrote: > > > Hello all, > > I am polishing up the generalized linear models right now for the > stats.models project and I have a question about using decorators with > my tests. ?The GLM framework has a central model with shared > properties and then several variations on this model, so to test I > have just as a simplified example: > > from numpy.testing import * > DECIMAL = 4 > class check_glm(object): > ? ?''' > ? ?res2 results will be obtained from R or the RModelwrap > ? ?''' > > ? ?def test_params(self): > ? ? ? ?assert_almost_equal(self.res1.params, self.res2.params, DECIMAL) > > ? ?def test_resids(self): > ? ? ? ?assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL) > > class test_glm_gamma(check_glm): > ? ?def __init__(self): > # Preprocessing to setup results > ? ? ? ?self.res1 = ResultsFromGLM > ? ? ? ?self.res2 = R_Results > > if __name__=="__main__": > ? ?run_module_suite() > > My question is whether I can skip, for arguments sake, test_resids > depending, for example, on the class of self.res2 or because I defined > the test condition as True in the test_<> class. ? I tried putting in > the check_glm class > > @dec.skipif(TestCondition, "Skipping this test because of ...") > def test_resids(self): > ... > > TestCondition should be None by default, but how can I get the value > of TestCondition to evaluate to True if appropriate? ?I have tried a > few different ways, but I am a little stumped. Does this make sense/is > it possible? ?I'm sure I'm missing something obvious, but any insights > would be appreciated. > > > I don't think you can do that with the decorator. Just do the test in > code inside the method and raise nose.SkipTest explicitly: > > def test_resids(self): > if not isinstance(self.res1, GLMResults): > raise nose.SkipTest("Not a GLM test") > Thanks for the suggestion. This definitely works for the isinstance check, and I will include it for when that's appropriate. Without going into too much mundane details, it's not quite flexible enough for my other needs. 
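To make it concrete, the direction I keep trying is roughly this (a simplified, self-contained sketch; the per-family skip_resids flag and the Holder class are things I'm making up here, not anything in numpy.testing):

import nose
from numpy.testing import assert_almost_equal

DECIMAL = 4

class Holder(object):
    """Stand-in for the real results objects (placeholder only)."""
    def __init__(self, params, resids=None):
        self.params = params
        self.resids = resids

class check_glm(object):
    # The skip flag is class-level, so each family/subclass can opt out of
    # a check without touching the shared test methods.
    skip_resids = False

    def test_params(self):
        assert_almost_equal(self.res1.params, self.res2.params, DECIMAL)

    def test_resids(self):
        if self.skip_resids:
            raise nose.SkipTest("resids not comparable for this family")
        assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL)

class test_glm_gamma(check_glm):
    skip_resids = True                   # e.g. no reference resids from R
    def __init__(self):
        self.res1 = Holder([1.0, 2.0])   # would be ResultsFromGLM
        self.res2 = Holder([1.0, 2.0])   # would be R_Results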
I suppose I could have a barrage of if tests with each test in the parent class, but it would be ugly... > > I think you should follow (or do something similar to) the approach given > under the section 'Creating many similar tests' in the Numpy guidelines > under Tips & Tricks: > http://projects.scipy.org/numpy/wiki/TestingGuidelines > > This gives the option to group tests from related models together. > > Bruce > Thanks as well. The tests I had were similar to the example. The difference is that the data is defined for each family (test subclass) of the same model (parent test class). I have moved some of the asserts now to the subclasses like the example, so I can decorate/skip each one as needed (this isn't quite as haphazard as it sounds), but this somewhat defeats the purpose of having the parent class with the tests to reuse in the first place. It's possible that my needs won't allow me to be as lazy as I wanted to be ;). I really wish I could just define the TestConditions "on the fly" how I was originally thinking, but I'm not sure even this would have worked given that I couldn't do it simply with the explicit if tests. Once I finish this round of refactoring perhaps I will link to the tests so that what's going on is clearer and to see if anyone can/wants to point out a better way. Thanks, Skipper From millman at berkeley.edu Fri Jul 17 16:02:29 2009 From: millman at berkeley.edu (Jarrod Millman) Date: Fri, 17 Jul 2009 13:02:29 -0700 Subject: [SciPy-dev] ANN: SciPy 2009 early registration extended to July 22nd Message-ID: The early registration deadline for SciPy 2009 has been extended until Wednesday, July 22, 2009. Please register ( http://conference.scipy.org/to_register ) by this date to take advantage of the reduced early registration rate. Since we just announced the conference schedule, I was asked to provide extra time for people to register. Fortunately, we were able to get a few extra days from our vendors. But we will have to place orders next Thursday, so this is the last time we will be able to extend the deadline for registration. The conference schedule is available here: http://conference.scipy.org/schedule About the conference -------------------- SciPy 2009, the 8th Python in Science conference, will be held from August 18-23, 2009 at Caltech in Pasadena, CA, USA. The conference starts with two days of tutorials to the scientific Python tools. There will be two tracks, one for introduction of the basic tools to beginners, and one for more advanced tools. The tutorials will be followed by two days of talks. Both days of talks will begin with a keynote address. The first day?s keynote will be given by Peter Norvig, the Director of Research at Google; while, the second keynote will be delivered by Jon Guyer, a Materials Scientist in the Thermodynamics and Kinetics Group at NIST. The program committee will select the remaining talks from submissions to our call for papers. All selected talks will be included in our conference proceedings edited by the program committee. After the talks each day we will provide several rooms for impromptu birds of a feather discussions. Finally, the last two days of the conference will be used for a number of coding sprints on the major software projects in our community. For the 8th consecutive year, the conference will bring together the developers and users of the open source software stack for scientific computing with Python. Attendees have the opportunity to review the available tools and how they apply to specific problems. 
By providing a forum for developers to share their Python expertise with the wider commercial, academic, and research communities, this conference fosters collaboration and facilitates the sharing of software components, techniques, and a vision for high level language use in scientific computing. For further information, please visit the conference homepage: http://conference.scipy.org. Important Dates --------------- * Friday, July 3: Abstracts Due * Wednesday, July 15: Announce accepted talks, post schedule * Wednesday, July 22: Early Registration ends * Tuesday-Wednesday, August 18-19: Tutorials * Thursday-Friday, August 20-21: Conference * Saturday-Sunday, August 22-23: Sprints * Friday, September 4: Papers for proceedings due Executive Committee ------------------- * Jarrod Millman, UC Berkeley, USA (Conference Chair) * Ga?l Varoquaux, INRIA Saclay, France (Program Co-Chair) * St?fan van der Walt, University of Stellenbosch, South Africa (Program Co-Chair) * Fernando P?rez, UC Berkeley, USA (Tutorial Chair) From charlesr.harris at gmail.com Fri Jul 17 19:20:11 2009 From: charlesr.harris at gmail.com (Charles R Harris) Date: Fri, 17 Jul 2009 17:20:11 -0600 Subject: [SciPy-dev] Test Design Question for Stats.Models Code In-Reply-To: References: <3d375d730907171026u3b96becbg2b1cbb2779058ee8@mail.gmail.com> <4A60B817.8000806@gmail.com> Message-ID: On Fri, Jul 17, 2009 at 12:40 PM, Skipper Seabold wrote: > On Fri, Jul 17, 2009 at 1:42 PM, Bruce Southey wrote: > > On 07/17/2009 12:26 PM, Robert Kern wrote: > > > > On Fri, Jul 17, 2009 at 12:18, Skipper Seabold > wrote: > > > > > > Hello all, > > > > I am polishing up the generalized linear models right now for the > > stats.models project and I have a question about using decorators with > > my tests. The GLM framework has a central model with shared > > properties and then several variations on this model, so to test I > > have just as a simplified example: > > > > from numpy.testing import * > > DECIMAL = 4 > > class check_glm(object): > > ''' > > res2 results will be obtained from R or the RModelwrap > > ''' > > > > def test_params(self): > > assert_almost_equal(self.res1.params, self.res2.params, DECIMAL) > > > > def test_resids(self): > > assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL) > > > > class test_glm_gamma(check_glm): > > def __init__(self): > > # Preprocessing to setup results > > self.res1 = ResultsFromGLM > > self.res2 = R_Results > > > > if __name__=="__main__": > > run_module_suite() > > > > My question is whether I can skip, for arguments sake, test_resids > > depending, for example, on the class of self.res2 or because I defined > > the test condition as True in the test_<> class. I tried putting in > > the check_glm class > > > > @dec.skipif(TestCondition, "Skipping this test because of ...") > > def test_resids(self): > > ... > > > > TestCondition should be None by default, but how can I get the value > > of TestCondition to evaluate to True if appropriate? I have tried a > > few different ways, but I am a little stumped. Does this make sense/is > > it possible? I'm sure I'm missing something obvious, but any insights > > would be appreciated. > > > > > > I don't think you can do that with the decorator. Just do the test in > > code inside the method and raise nose.SkipTest explicitly: > > > > def test_resids(self): > > if not isinstance(self.res1, GLMResults): > > raise nose.SkipTest("Not a GLM test") > > > > Thanks for the suggestion. 
This definitely works for the isinstance > check, and I will include it for when that's appropriate. Without > going into too much mundane details, it's not quite flexible enough > for my other needs. I suppose I could have a barrage of if tests with > each test in the parent class, but it would be ugly... > > > > > I think you should follow (or do something similar to) the approach given > > under the section 'Creating many similar tests' in the Numpy guidelines > > under Tips & Tricks: > > http://projects.scipy.org/numpy/wiki/TestingGuidelines > > > > This gives the option to group tests from related models together. > > > > Bruce > > > > Thanks as well. The tests I had were similar to the example. The > difference is that the data is defined for each family (test subclass) > of the same model (parent test class). I have moved some of the > asserts now to the subclasses like the example, so I can decorate/skip It is best to explicitly raise AssertionError rather than use assert because assert disappears in a production release. that is to say, it is for debugging, not production code.If you are using tools from numpy.testing there is an assert_ function that you can use instead of assert. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Sat Jul 18 07:13:45 2009 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Sat, 18 Jul 2009 13:13:45 +0200 Subject: [SciPy-dev] Test Design Question for Stats.Models Code In-Reply-To: References: <3d375d730907171026u3b96becbg2b1cbb2779058ee8@mail.gmail.com> <4A60B817.8000806@gmail.com> Message-ID: <1cd32cbb0907180413x44c6c330pe5f9c13c6ffaede8@mail.gmail.com> On Fri, Jul 17, 2009 at 8:40 PM, Skipper Seabold wrote: > On Fri, Jul 17, 2009 at 1:42 PM, Bruce Southey wrote: >> On 07/17/2009 12:26 PM, Robert Kern wrote: >> >> On Fri, Jul 17, 2009 at 12:18, Skipper Seabold wrote: >> >> >> Hello all, >> >> I am polishing up the generalized linear models right now for the >> stats.models project and I have a question about using decorators with >> my tests. ?The GLM framework has a central model with shared >> properties and then several variations on this model, so to test I >> have just as a simplified example: >> >> from numpy.testing import * >> DECIMAL = 4 >> class check_glm(object): >> ? ?''' >> ? ?res2 results will be obtained from R or the RModelwrap >> ? ?''' >> >> ? ?def test_params(self): >> ? ? ? ?assert_almost_equal(self.res1.params, self.res2.params, DECIMAL) >> >> ? ?def test_resids(self): >> ? ? ? ?assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL) >> >> class test_glm_gamma(check_glm): >> ? ?def __init__(self): >> # Preprocessing to setup results >> ? ? ? ?self.res1 = ResultsFromGLM >> ? ? ? ?self.res2 = R_Results >> >> if __name__=="__main__": >> ? ?run_module_suite() >> >> My question is whether I can skip, for arguments sake, test_resids >> depending, for example, on the class of self.res2 or because I defined >> the test condition as True in the test_<> class. ? I tried putting in >> the check_glm class >> >> @dec.skipif(TestCondition, "Skipping this test because of ...") >> def test_resids(self): >> ... >> >> TestCondition should be None by default, but how can I get the value >> of TestCondition to evaluate to True if appropriate? ?I have tried a >> few different ways, but I am a little stumped. Does this make sense/is >> it possible? ?I'm sure I'm missing something obvious, but any insights >> would be appreciated. 
>> >> >> I don't think you can do that with the decorator. Just do the test in >> code inside the method and raise nose.SkipTest explicitly: >> >> def test_resids(self): >>     if not isinstance(self.res1, GLMResults): >>         raise nose.SkipTest("Not a GLM test") >> > > Thanks for the suggestion. This definitely works for the isinstance > check, and I will include it for when that's appropriate. Without > going into too much mundane details, it's not quite flexible enough > for my other needs. I suppose I could have a barrage of if tests with > each test in the parent class, but it would be ugly... > >> >> I think you should follow (or do something similar to) the approach given >> under the section 'Creating many similar tests' in the Numpy guidelines >> under Tips & Tricks: >> http://projects.scipy.org/numpy/wiki/TestingGuidelines >> >> This gives the option to group tests from related models together. >> >> Bruce >> > > Thanks as well. The tests I had were similar to the example. The > difference is that the data is defined for each family (test subclass) > of the same model (parent test class). I have moved some of the > asserts now to the subclasses like the example, so I can decorate/skip > each one as needed (this isn't quite as haphazard as it sounds), but > this somewhat defeats the purpose of having the parent class with the > tests to reuse in the first place. It's possible that my needs won't > allow me to be as lazy as I wanted to be ;). I really wish I could > just define the TestConditions "on the fly" how I was originally > thinking, but I'm not sure even this would have worked given that I > couldn't do it simply with the explicit if tests. > > Once I finish this round of refactoring perhaps I will link to the > tests so that what's going on is clearer and to see if anyone > can/wants to point out a better way. > > Thanks, > > Skipper > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > If you have groups of cases that would require similar skips, you could try a second level in the class hierarchy, where common subgroups of cases inherit from the same intermediate class. But I never tried it with nose, and I'm currently unable to try anything. Josef
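Putting the suggestions in this thread together (Robert's explicit raise of nose.SkipTest inside the test method, plus Josef's intermediate base class for subgroups that share the same skip), a minimal sketch of that layout could look like the following. The Holder container and the per-family numbers are made-up stand-ins for illustration, not the actual stats.models fixtures:

import nose
import numpy as np
from numpy.testing import assert_almost_equal

DECIMAL = 4

class Holder(object):
    # Hypothetical stand-in for the GLM / R results objects discussed above.
    def __init__(self, params, resids=None):
        self.params = np.asarray(params)
        self.resids = None if resids is None else np.asarray(resids)

class CheckGLM(object):
    # Parent class: assertions shared by every family; subclasses supply
    # self.res1 (model results) and self.res2 (reference results).
    def test_params(self):
        assert_almost_equal(self.res1.params, self.res2.params, DECIMAL)

    def test_resids(self):
        assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL)

class CheckGLMNoResids(CheckGLM):
    # Intermediate level: families that share this skip inherit from here,
    # so the skip is written once instead of as an if test in every family.
    def test_resids(self):
        raise nose.SkipTest("no reference residuals for this family")

class TestGamma(CheckGLM):
    def setup(self):
        self.res1 = Holder([1.2, -0.5], resids=[0.01, -0.02, 0.01])
        self.res2 = Holder([1.2, -0.5], resids=[0.01, -0.02, 0.01])

class TestPoisson(CheckGLMNoResids):
    def setup(self):
        self.res1 = Holder([0.7])
        self.res2 = Holder([0.7])

Run under nosetests, TestPoisson reports its residual test as skipped rather than failed, and the skip condition lives in one intermediate class instead of being repeated per family.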
From Scott.Daniels at Acm.Org Sat Jul 18 15:53:47 2009 From: Scott.Daniels at Acm.Org (Scott David Daniels) Date: Sat, 18 Jul 2009 12:53:47 -0700 Subject: [SciPy-dev] bool8 link returns bool_ docstring In-Reply-To: <9457e7c80907160849k49bbdaf1oca968ed627dbc2a7@mail.gmail.com> References: <937606.42266.qm@web52103.mail.re2.yahoo.com> <9457e7c80907160849k49bbdaf1oca968ed627dbc2a7@mail.gmail.com> Message-ID: Stéfan van der Walt wrote: > 2009/7/16 David Goldsmith : >> Once again I believe I've answered my own question (I need to have more faith in my research abilities and not cry for help so quickly): Figure 2.2 in "Guide To Numpy" illustrating the hierarchy of scalar types - >> the names used there are the "base" names, any equivalent is considered an "alias," yes? Oh, but this doesn't answer my first question: is the fact that the Wiki returns the same docstring for equivalent types a bug or a feature? (At this point I'm assuming "feature," but confirmation would be appreciated.) Thanks, > > It's the same object: > > In [8]: np.bool_ > Out[8]: <type 'numpy.bool_'> > > In [9]: np.bool8 > Out[9]: <type 'numpy.bool_'> All that shows is that the two classes print the same thing. This is how you can tell: >>> np.bool8 is np.bool_ True >>> --Scott David Daniels Scott.Daniels at Acm.Org From pav+sp at iki.fi Sat Jul 18 19:15:12 2009 From: pav+sp at iki.fi (Pauli Virtanen) Date: Sat, 18 Jul 2009 23:15:12 +0000 (UTC) Subject: [SciPy-dev] bool8 link returns bool_ docstring References: <937606.42266.qm@web52103.mail.re2.yahoo.com> <9457e7c80907160849k49bbdaf1oca968ed627dbc2a7@mail.gmail.com> Message-ID: On 2009-07-18, Scott David Daniels wrote: > Stéfan van der Walt wrote: >> 2009/7/16 David Goldsmith : >>> Once again I believe I've answered my own question (I need to have more faith in my research abilities and not cry for help so quickly): Figure 2.2 in "Guide To Numpy" illustrating the hierarchy of scalar types - >>> the names used there are the "base" names, any equivalent is considered an "alias," yes? Oh, but this doesn't answer my first question: is the fact that the Wiki returns the same docstring for equivalent types a bug or a feature? (At this point I'm assuming "feature," but confirmation would be appreciated.) Thanks, >> >> It's the same object: >> >> In [8]: np.bool_ >> Out[8]: <type 'numpy.bool_'> >> >> In [9]: np.bool8 >> Out[9]: <type 'numpy.bool_'> > > All that shows is that the two classes print the same thing. > This is how you can tell: > > >>> np.bool8 is np.bool_ > True > >>> There's a slight issue in that e.g. np.intp may point to either int32 or int64. This is decided by Numpy at build time. The docstrings should reflect the bit-width versions, I believe. The aliases and their behavior are documented manually in scalars.rst. -- Pauli Virtanen From d_l_goldsmith at yahoo.com Sun Jul 19 17:28:22 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Sun, 19 Jul 2009 14:28:22 -0700 (PDT) Subject: [SciPy-dev] ufunc.accumulate bug? Message-ID: <746413.4467.qm@web52102.mail.re2.yahoo.com> Hi! Is this a bug: >>> np.multiply.accumulate([2, 3, 5], dtype='f') array([ 2., 6., 30.], dtype=float32) >>> np.multiply.accumulate([2, 3, 5], dtype='g') array([0.0, -2, -3.1050362e+231], dtype=float96) >>> np.multiply.accumulate([2, 3, 5], dtype='F') array([ 2.+0.j, 6.+0.j, 30.+0.j], dtype=complex64) >>> np.multiply.accumulate([2, 3, 5], dtype='G') array([0.0+0.0j, -2+0.0j, -3.1050362e+231+0.0j], dtype=complex192) If not, why not (i.e., what am I missing)? DG From charlesr.harris at gmail.com Sun Jul 19 17:49:34 2009 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 19 Jul 2009 15:49:34 -0600 Subject: [SciPy-dev] ufunc.accumulate bug? In-Reply-To: <746413.4467.qm@web52102.mail.re2.yahoo.com> References: <746413.4467.qm@web52102.mail.re2.yahoo.com> Message-ID: On Sun, Jul 19, 2009 at 3:28 PM, David Goldsmith wrote: > > Hi! Is this a bug: > > >>> np.multiply.accumulate([2, 3, 5], dtype='f') > array([ 2., 6., 30.], dtype=float32) > > >>> np.multiply.accumulate([2, 3, 5], dtype='g') > array([0.0, -2, -3.1050362e+231], dtype=float96) > > >>> np.multiply.accumulate([2, 3, 5], dtype='F') > array([ 2.+0.j, 6.+0.j, 30.+0.j], dtype=complex64) > > >>> np.multiply.accumulate([2, 3, 5], dtype='G') > array([0.0+0.0j, -2+0.0j, -3.1050362e+231+0.0j], dtype=complex192) > > If not, why not (i.e., what am I missing)? > Definitely a bug, probably a wrong type somewhere. Open a ticket. Oh, and what platform is this on? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Sun Jul 19 17:55:36 2009 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 19 Jul 2009 15:55:36 -0600 Subject: [SciPy-dev] ufunc.accumulate bug?
In-Reply-To: <746413.4467.qm@web52102.mail.re2.yahoo.com> References: <746413.4467.qm@web52102.mail.re2.yahoo.com> Message-ID: On Sun, Jul 19, 2009 at 3:28 PM, David Goldsmith wrote: > > Hi! Is this a bug: > > >>> np.multiply.accumulate([2, 3, 5], dtype='f') > array([ 2., 6., 30.], dtype=float32) > > >>> np.multiply.accumulate([2, 3, 5], dtype='g') > array([0.0, -2, -3.1050362e+231], dtype=float96) > > >>> np.multiply.accumulate([2, 3, 5], dtype='F') > array([ 2.+0.j, 6.+0.j, 30.+0.j], dtype=complex64) > > >>> np.multiply.accumulate([2, 3, 5], dtype='G') > array([0.0+0.0j, -2+0.0j, -3.1050362e+231+0.0j], dtype=complex192) > > If not, why not (i.e., what am I missing)? > On the other hand, this works for me. What numpy version do you have? There was a bug like this that got fixed in 1.2. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From charlesr.harris at gmail.com Sun Jul 19 17:58:50 2009 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 19 Jul 2009 15:58:50 -0600 Subject: [SciPy-dev] Porting SciPy to Py3k GSOC project In-Reply-To: <73531abb0903231317w6c373c8u64406ef0501a33f4@mail.gmail.com> References: <73531abb0903221911k1d862881q9db5f387fa93bb39@mail.gmail.com> <73531abb0903231317w6c373c8u64406ef0501a33f4@mail.gmail.com> Message-ID: On Mon, Mar 23, 2009 at 2:17 PM, ross smith wrote: > Hello Again, > > I've attached a draft of my application. Any feedback you can provide > would be greatly appreciated. > > thanks, > > Ross > > On Sun, Mar 22, 2009 at 22:11, ross smith wrote: > >> Hello everyone, >> >> I am interested in porting SciPy/NumPy to Py3k. I've been working this >> past school year to port an existing code base to py3k for a research group >> on campus. A lot of the code relies on SciPy and NumPy but the scope of my >> project didn't let me work on porting either project, to my dismay. I'd >> love the opportunity to port a project I use heavily in my own code and gain >> a better understanding of how it works. >> >> We are supposed to contact the group we would be working with, to flesh >> out the details of our application. I've looked at the application and the >> only thing I know I'll need significant help with is the Milestones >> portion. Of course, Any and all suggestions are welcome! >> >> >> thank you, >> >> Ross Smith >> >> (Gaurdro on Freenode) >> > > Looks like this was stuck in the pipe for a while. I wonder what broke it loose? Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From d_l_goldsmith at yahoo.com Sun Jul 19 18:52:23 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Sun, 19 Jul 2009 15:52:23 -0700 (PDT) Subject: [SciPy-dev] ufunc.accumulate bug? Message-ID: <123279.48819.qm@web52105.mail.re2.yahoo.com> Ah, that would (probably) be it then: >>> np.version.version '1.2.1' I'll update and see... Thanks, Charles! DG --- On Sun, 7/19/09, Charles R Harris wrote: > From: Charles R Harris > Subject: Re: [SciPy-dev] ufunc.accumulate bug? > To: "SciPy Developers List" > Date: Sunday, July 19, 2009, 2:55 PM > > > On Sun, Jul 19, 2009 at 3:28 PM, > David Goldsmith > wrote: > > > > Hi! ?Is this a bug: > > > > >>> np.multiply.accumulate([2, 3, 5], > dtype='f') > > array([ ?2., ? 6., ?30.], dtype=float32) > > > > >>> np.multiply.accumulate([2, 3, 5], > dtype='g') > > array([0.0, -2, -3.1050362e+231], dtype=float96) > > > > >>> np.multiply.accumulate([2, 3, 5], > dtype='F') > > array([ ?2.+0.j, ? 
6.+0.j, ?30.+0.j], dtype=complex64) > > > > >>> np.multiply.accumulate([2, 3, 5], > dtype='G') > > array([0.0+0.0j, -2+0.0j, -3.1050362e+231+0.0j], > dtype=complex192) > > > > If not, why not (i.e., what am I missing)? > > > On the other hand, this works for me. What numpy version do > you have? There was a bug like this that got fixed in 1.2. > > Chuck > > > > > -----Inline Attachment Follows----- > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From d_l_goldsmith at yahoo.com Sun Jul 19 19:03:17 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Sun, 19 Jul 2009 16:03:17 -0700 (PDT) Subject: [SciPy-dev] ufunc.accumulate bug? Message-ID: <332661.66230.qm@web52101.mail.re2.yahoo.com> Yup, that was it, thanks again! DG --- On Sun, 7/19/09, Charles R Harris wrote: > David Goldsmith > wrote: > > Hi! ?Is this a bug: > > >>> np.multiply.accumulate([2, 3, 5], > dtype='g') > > array([0.0, -2, -3.1050362e+231], dtype=float96) > > >>> np.multiply.accumulate([2, 3, 5], > dtype='G') > > array([0.0+0.0j, -2+0.0j, -3.1050362e+231+0.0j], > dtype=complex192) > > On the other hand, this works for me. What numpy version do > you have? There was a bug like this that got fixed in 1.2. > > Chuck From cimrman3 at ntc.zcu.cz Tue Jul 21 09:40:56 2009 From: cimrman3 at ntc.zcu.cz (Robert Cimrman) Date: Tue, 21 Jul 2009 15:40:56 +0200 Subject: [SciPy-dev] ANN: SfePy 2009.3 Message-ID: <4A65C568.1070801@ntc.zcu.cz> I am pleased to announce release 2009.3 of SfePy. Description ----------- SfePy (simple finite elements in Python) is a software, distributed under the BSD license, for solving systems of coupled partial differential equations by the finite element method. The code is based on NumPy and SciPy packages. Mailing lists, issue tracking, git repository: http://sfepy.org Home page: http://sfepy.kme.zcu.cz Highlights of this release -------------------------- Finally, SfePy has a basic support for Windows installation via numpy distutils: - still very experimental! - the tests will not finish if umfpack is not installed, as the default direct solver in scipy cannot handle some problems (see recent sfepy-devel mailing list discussions). Major improvements ------------------ - new scripts: - cylindergen.py: cylindrical mesh generator - updated scripts: - postproc.py: - quite usable now for fast first glance at the results - plots point, cell data of all kinds (scalar, vector, tensor) - Viewer is much more configurable - probe.py: - can probe selected quantities only - isfepy: - Viewer is much more configurable - new tests and terms Applications ------------ - phononic materials: - plotting improved - caching of eigen-problem solution and Christoffel acoustic tensor - schroedinger.py: - choose and call DFT solver via solver interface People who contributed to this release: Vladimir Lukes. For more information on this release, see http://sfepy.googlecode.com/svn/web/releases/2009.3_RELEASE_NOTES.txt Best regards, Robert Cimrman From dsdale24 at gmail.com Tue Jul 21 11:59:28 2009 From: dsdale24 at gmail.com (Darren Dale) Date: Tue, 21 Jul 2009 11:59:28 -0400 Subject: [SciPy-dev] spam on wiki Message-ID: How can we remove spam from the wiki? 
http://www.scipy.org/hosetubingcompany Darren From gael.varoquaux at normalesup.org Tue Jul 21 12:02:10 2009 From: gael.varoquaux at normalesup.org (Gael Varoquaux) Date: Tue, 21 Jul 2009 18:02:10 +0200 Subject: [SciPy-dev] spam on wiki In-Reply-To: References: Message-ID: <20090721160210.GB7789@phare.normalesup.org> On Tue, Jul 21, 2009 at 11:59:28AM -0400, Darren Dale wrote: > How can we remove spam from the wiki? http://www.scipy.org/hosetubingcompany You need to be editor, to have delete privileges. If you want, I can add you. Ga?l From d_l_goldsmith at yahoo.com Tue Jul 21 17:44:43 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 21 Jul 2009 14:44:43 -0700 (PDT) Subject: [SciPy-dev] Reminder: Numpy Doc Summer Marathon Skypecon Message-ID: <773466.36100.qm@web52106.mail.re2.yahoo.com> Tomorrow (Wednesday, July 22) 19:00 UTC. Please email me agenda items. RSVP appreciated but not necessary. Look forward to talking to you... DG From d_l_goldsmith at yahoo.com Wed Jul 22 01:02:46 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 21 Jul 2009 22:02:46 -0700 (PDT) Subject: [SciPy-dev] 'nuther Goal Met! Message-ID: <145357.45239.qm@web52111.mail.re2.yahoo.com> dtypes (ufuncs is almost done, but I have a question on how to document the class itself; see discussion there, or Wiki Q & A page). DG From millman at berkeley.edu Wed Jul 22 12:01:03 2009 From: millman at berkeley.edu (Jarrod Millman) Date: Wed, 22 Jul 2009 09:01:03 -0700 Subject: [SciPy-dev] ANN: SciPy 2009 early registration ends today Message-ID: Today is the last day to register for SciPy 2009 at the early bird rates. Please register (http://conference.scipy.org/to_register ) by the end of the day to take advantage of the reduced early registration rate. The conference schedule is available here: http://conference.scipy.org/schedule The special group rate for the Marriot Hotel is no longer available. However, there are a number of closer and less expensive choices still available: http://admissions.caltech.edu/visiting/accommodations I've been staying at the Vagabond Inn for the last several years: http://www.vagabondinn-pasadena-hotel.com/ It is within easy walking distance of the conference and has just been completely renovated. Rooms at the Vagabond start at $79/night. About the conference -------------------- SciPy 2009, the 8th Python in Science conference, will be held from August 18-23, 2009 at Caltech in Pasadena, CA, USA. The conference starts with two days of tutorials to the scientific Python tools. There will be two tracks, one for introduction of the basic tools to beginners, and one for more advanced tools. The tutorials will be followed by two days of talks. Both days of talks will begin with a keynote address. The first day?s keynote will be given by Peter Norvig, the Director of Research at Google; while, the second keynote will be delivered by Jon Guyer, a Materials Scientist in the Thermodynamics and Kinetics Group at NIST. The program committee will select the remaining talks from submissions to our call for papers. All selected talks will be included in our conference proceedings edited by the program committee. After the talks each day we will provide several rooms for impromptu birds of a feather discussions. Finally, the last two days of the conference will be used for a number of coding sprints on the major software projects in our community. 
For the 8th consecutive year, the conference will bring together the developers and users of the open source software stack for scientific computing with Python. Attendees have the opportunity to review the available tools and how they apply to specific problems. By providing a forum for developers to share their Python expertise with the wider commercial, academic, and research communities, this conference fosters collaboration and facilitates the sharing of software components, techniques, and a vision for high level language use in scientific computing. For further information, please visit the conference homepage: http://conference.scipy.org. Important Dates --------------- * Friday, July 3: Abstracts Due * Wednesday, July 15: Announce accepted talks, post schedule * Wednesday, July 22: Early Registration ends * Tuesday-Wednesday, August 18-19: Tutorials * Thursday-Friday, August 20-21: Conference * Saturday-Sunday, August 22-23: Sprints * Friday, September 4: Papers for proceedings due Executive Committee ------------------- * Jarrod Millman, UC Berkeley, USA (Conference Chair) * Ga?l Varoquaux, INRIA Saclay, France (Program Co-Chair) * St?fan van der Walt, University of Stellenbosch, South Africa (Program Co-Chair) * Fernando P?rez, UC Berkeley, USA (Tutorial Chair) From d_l_goldsmith at yahoo.com Wed Jul 22 13:10:15 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Wed, 22 Jul 2009 10:10:15 -0700 (PDT) Subject: [SciPy-dev] Skypecon in just under 2 hrs. Message-ID: <201176.54273.qm@web52104.mail.re2.yahoo.com> From d_l_goldsmith at yahoo.com Wed Jul 22 15:03:17 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Wed, 22 Jul 2009 12:03:17 -0700 (PDT) Subject: [SciPy-dev] Weekly Summer Marathon Skypecon Message-ID: <352893.31483.qm@web52104.mail.re2.yahoo.com> Ready and waiting (or I can simply post an update)... DG From d_l_goldsmith at yahoo.com Wed Jul 22 15:34:06 2009 From: d_l_goldsmith at yahoo.com (d_l_goldsmith at yahoo.com) Date: Wed, 22 Jul 2009 12:34:06 -0700 (PDT) Subject: [SciPy-dev] Update and "open issues" Message-ID: <301617.72415.qm@web52108.mail.re2.yahoo.com> OK, since no one's RSVP-ed in the affirmative, and no one's Skyping me, I'll just post an "Update and Open Issues": Update ------ 0) Jack Liddle has fulfilled his promise to draft a Scipy "Milestones" page (i.e., analogous to the Numpy "Milestones" page): http://docs.scipy.org/scipy/Milestones/ Nice job, Jack, thanks!!! 1) I've added "Reviewer Guidelines" for both technical and presentation reviewers to the "Questions and Answers" page on the Numpy Wiki (please discuss these there). 2) "Basic Objects" on the Numpy "Milestones" page is all but Goal Met - only two (not Unimportant) items remain to be brought to "Needs Review" status: ndarray.strides (I'll give it the old "college try," but I'd really appreciate it if someone w/ a "deeper" understanding _and_ a talent for clear explanation could try to tackle this one), and numpy.ufunc (i.e., the class itself) about which there's an Open Issue (see below). Open Issues ----------- 0) Presently, _class_ numpy.ufunc is documented as if it were a function, and not to function spec to boot (albeit for reasons which make some sense once you see how it deviates).? IMO, it should be documented as the class that it is, but I'm sympathetic to the reason(s) it's done as it presently is, so I'm open to being convinced.? 
However, it's present deviations are inconsistent w/ the pydocweb application, so if we keep it "as is," we need to decide whether we care, and if so, ticket(s) to make the ap compliant need to filed. The rest are courtesy of Ralf: 1) Ralf would "like to be able to do a few more things on the wiki like make things 'Unimportant,' set things to 'Reviewed, Needs Work,' etc. (That way [he doesn't] have to leave comments in the hope Pauli will spot them eventually. [He's] always hesitant to add things to [Pauli's] mountain of TODOs.)" 2) An update on prospects for future funding. (Joe?) (Perhaps I, DG, should add this to the 'Q&A' so that Joe has a convenient way and place to update us all as things develop...) OK, that's what I have; email me anything you want to add. DG From dwf at cs.toronto.edu Wed Jul 22 16:54:13 2009 From: dwf at cs.toronto.edu (David Warde-Farley) Date: Wed, 22 Jul 2009 16:54:13 -0400 Subject: [SciPy-dev] Update and "open issues" In-Reply-To: <301617.72415.qm@web52108.mail.re2.yahoo.com> References: <301617.72415.qm@web52108.mail.re2.yahoo.com> Message-ID: <32A64157-A56A-4EA8-943D-E119E4B7B218@cs.toronto.edu> Hi David, I mentioned this on the list a while ago but forgot to open a ticket: http://docs.scipy.org/doc/numpy/reference/generated/numpy.where.html?highlight=where#numpy.where Notice the crossreference link to nonzero() is wrong; it sends you to the C API when it should send you to the Python function in the same subpackage. I'm not sure how to debug this, or if there's a systematic way to find other instances where this is happening and fix them. David On 22-Jul-09, at 3:34 PM, d_l_goldsmith at yahoo.com wrote: > > OK, since no one's RSVP-ed in the affirmative, and no one's Skyping > me, I'll just post an "Update and Open Issues": > > Update > ------ > > 0) Jack Liddle has fulfilled his promise to draft a Scipy > "Milestones" page (i.e., analogous to the Numpy "Milestones" page): > > http://docs.scipy.org/scipy/Milestones/ > > Nice job, Jack, thanks!!! > > 1) I've added "Reviewer Guidelines" for both technical and > presentation > reviewers to the "Questions and Answers" page on the Numpy Wiki > (please discuss these there). > > 2) "Basic Objects" on the Numpy "Milestones" page is all but Goal > Met - only two (not Unimportant) items remain to be brought to > "Needs Review" status: ndarray.strides (I'll give it the old > "college try," but I'd really appreciate it if someone w/ a "deeper" > understanding _and_ a talent for clear explanation could try to > tackle this one), and numpy.ufunc (i.e., the class itself) about > which there's an Open Issue (see below). > > Open Issues > ----------- > > 0) Presently, _class_ numpy.ufunc is documented as if it were a > function, and not to function spec to boot (albeit for reasons which > make some sense once you see how it deviates). IMO, it should be > documented as the class that it is, but I'm sympathetic to the > reason(s) it's done as it presently is, so I'm open to being > convinced. However, it's present deviations are inconsistent w/ the > pydocweb application, so if we keep it "as is," we need to decide > whether we care, and if so, ticket(s) to make the ap compliant need > to filed. > > The rest are courtesy of Ralf: > > 1) Ralf would "like to be able to do a few more things on the wiki > like make things 'Unimportant,' set things to 'Reviewed, Needs > Work,' etc. (That way [he doesn't] have to leave comments in the > hope Pauli will spot them eventually. 
[He's] always hesitant to add > things to [Pauli's] mountain of TODOs.)" > > 2) An update on prospects for future funding. (Joe?) (Perhaps I, > DG, should add this to the 'Q&A' so that Joe has a convenient way > and place to update us all as things develop...) > > > OK, that's what I have; email me anything you want to add. > > DG > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From d_l_goldsmith at yahoo.com Wed Jul 22 19:13:55 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Wed, 22 Jul 2009 16:13:55 -0700 (PDT) Subject: [SciPy-dev] Update and "open issues" Message-ID: <152086.34290.qm@web52101.mail.re2.yahoo.com> Thanks, David. Have you now opened a ticket? DG --- On Wed, 7/22/09, David Warde-Farley wrote: > From: David Warde-Farley > Subject: Re: [SciPy-dev] Update and "open issues" > To: d_l_goldsmith at yahoo.com, "SciPy Developers List" > Date: Wednesday, July 22, 2009, 1:54 PM > Hi David, > > I mentioned this on the list a while ago but forgot to open > a ticket: > > http://docs.scipy.org/doc/numpy/reference/generated/numpy.where.html?highlight=where#numpy.where > > Notice the crossreference link to nonzero() is wrong; it > sends you to the C API when it should send you to the Python > function in the same subpackage. I'm not sure how to debug > this, or if there's a systematic way to find other instances > where this is happening and fix them. > > David > > > On 22-Jul-09, at 3:34 PM, d_l_goldsmith at yahoo.com > wrote: > > > > > OK, since no one's RSVP-ed in the affirmative, and no > one's Skyping me, I'll just post an "Update and Open > Issues": > > > > Update > > ------ > > > > 0) Jack Liddle has fulfilled his promise to draft a > Scipy "Milestones" page (i.e., analogous to the Numpy > "Milestones" page): > > > > http://docs.scipy.org/scipy/Milestones/ > > > > Nice job, Jack, thanks!!! > > > > 1) I've added "Reviewer Guidelines" for both technical > and presentation > > reviewers to the "Questions and Answers" page on the > Numpy Wiki (please discuss these there). > > > > 2) "Basic Objects" on the Numpy "Milestones" page is > all but Goal Met - only two (not Unimportant) items remain > to be brought to "Needs Review" status: ndarray.strides > (I'll give it the old "college try," but I'd really > appreciate it if someone w/ a "deeper" understanding _and_ a > talent for clear explanation could try to tackle this one), > and numpy.ufunc (i.e., the class itself) about which there's > an Open Issue (see below). > > > > Open Issues > > ----------- > > > > 0) Presently, _class_ numpy.ufunc is documented as if > it were a function, and not to function spec to boot (albeit > for reasons which make some sense once you see how it > deviates).? IMO, it should be documented as the class > that it is, but I'm sympathetic to the reason(s) it's done > as it presently is, so I'm open to being convinced.? > However, it's present deviations are inconsistent w/ the > pydocweb application, so if we keep it "as is," we need to > decide whether we care, and if so, ticket(s) to make the ap > compliant need to filed. > > > > The rest are courtesy of Ralf: > > > > 1) Ralf would "like to be able to do a few more things > on the wiki like make things 'Unimportant,' set things to > 'Reviewed, Needs Work,' etc. (That way [he doesn't] have to > leave comments in the hope Pauli will spot them eventually. 
> [He's] always hesitant to add things to [Pauli's] mountain > of TODOs.)" > > > > 2) An update on prospects for future funding.? > (Joe?)? (Perhaps I, DG, should add this to the > 'Q&A' so that Joe has a convenient way and place to > update us all as things develop...) > > > > > > OK, that's what I have; email me anything you want to > add. > > > > DG > > > > > > > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > From ralf.gommers at googlemail.com Wed Jul 22 21:27:48 2009 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Wed, 22 Jul 2009 21:27:48 -0400 Subject: [SciPy-dev] Update and "open issues" In-Reply-To: <301617.72415.qm@web52108.mail.re2.yahoo.com> References: <301617.72415.qm@web52108.mail.re2.yahoo.com> Message-ID: On Wed, Jul 22, 2009 at 3:34 PM, wrote: > > OK, since no one's RSVP-ed in the affirmative, and no one's Skyping me, > I'll just post an "Update and Open Issues": > > Update > ------ > > 0) Jack Liddle has fulfilled his promise to draft a Scipy "Milestones" page > (i.e., analogous to the Numpy "Milestones" page): > > http://docs.scipy.org/scipy/Milestones/ > > Nice job, Jack, thanks!!! Looks great, thanks Jack. One suggestion maybe to make it a bit shorter (and keep the server alive): remove a lot of class methods from that page if the class is in the milestone. For example, if MatFileReader is an item, it is not really necessary to have its twenty methods as separate items. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From scott.sinclair.za at gmail.com Thu Jul 23 03:07:10 2009 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Thu, 23 Jul 2009 09:07:10 +0200 Subject: [SciPy-dev] Update and "open issues" In-Reply-To: <32A64157-A56A-4EA8-943D-E119E4B7B218@cs.toronto.edu> References: <301617.72415.qm@web52108.mail.re2.yahoo.com> <32A64157-A56A-4EA8-943D-E119E4B7B218@cs.toronto.edu> Message-ID: <6a17e9ee0907230007ne6ff9f2kd624b29ae0aba8a6@mail.gmail.com> > 2009/7/22 David Warde-Farley : > Hi David, > > I mentioned this on the list a while ago but forgot to open a ticket: > > http://docs.scipy.org/doc/numpy/reference/generated/numpy.where.html?highlight=where#numpy.where > > Notice the crossreference link to nonzero() is wrong; it sends you to > the C API when it should send you to the Python function in the same > subpackage. I'm not sure how to debug this, or if there's a systematic > way to find other instances where this is happening and fix them. This works correctly in the doc-editor, so must be to do with the way Sphinx is generating the cross-references. Maybe there's a way to force it to search in the python sources first? http://docs.scipy.org/numpy/docs/numpy.core.multiarray.where/ Cheers, Scott From d_l_goldsmith at yahoo.com Thu Jul 23 13:19:10 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Thu, 23 Jul 2009 10:19:10 -0700 (PDT) Subject: [SciPy-dev] Bar graph of status in Stats Message-ID: <658793.17194.qm@web52106.mail.re2.yahoo.com> Hi, folks. Would anyone object to the reversing the (chronological) order of the bar graph showing overall status on the Stats page of the Wiki - at this point I don't really think we care about last summer's performance as much as where we are currently, so it'd be nice if that were the first thing one sees instead of the last. ;-) Is pydocweb written such that this requires only a single (or two at most) change(s) in the code? 
DG From d_l_goldsmith at yahoo.com Thu Jul 23 13:52:11 2009 From: d_l_goldsmith at yahoo.com (d_l_goldsmith at yahoo.com) Date: Thu, 23 Jul 2009 10:52:11 -0700 (PDT) Subject: [SciPy-dev] Category of the week Message-ID: <490336.55740.qm@web52104.mail.re2.yahoo.com> Hi, again! So, how'd we do on last week's (second) "category" of the week (about NaN treatment in masked arrays, IIRC)? I focused on getting the "Basic Objects" section done, so I really didn't follow progress on the other. Anyway, time for a new one: the remaining (un-led) categories requiring substantial work are: "Fourier Transforms," "Other Random Operations," "Masked Arrays (I) and IV," "Operations on Masks," "Even More MA Functions I & II," "Numpy Internals," "C-types," "Other Math," & "Other Array Subclasses." If a cat-leader wants to recruit helpers by nominating their cat, those are: "Financial Functions" (Skipper Seabold), "Random Number Generation" (Alan Jackson), and "Masked Arrays III" (Scott Sinclair). Finally, our "category" of the week could be simply to move all "Goal met (almost)"s to "**Goal Met!**" status; these are: "Array Manipulation," "Functional Operations," "Ndarray II," and "Ufunc." DG PS: What's w/ all the uncategorized white and light grey's one sees on the "All Docstrings" page? From pgmdevlist at gmail.com Thu Jul 23 14:53:07 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Thu, 23 Jul 2009 14:53:07 -0400 Subject: [SciPy-dev] correlate_nd.src won't compile Message-ID: All, I just run into the following problem while compiling scipy (0.8.0svn5874). Any comments welcome. Thx. (mac OS 10.5.7, i686-apple-darwin9-gcc-4.2.1 (GCC) 4.2.1 (Apple Inc. build 5566)) creating build/temp.macosx-10.5-i386-2.5/scipy/signal creating build/temp.macosx-10.5-i386-2.5/build/src.macosx-10.5- i386-2.5/scipy/signal compile options: '-Iscipy/signal -I/Users/pierregm/ Computing/.pythonenvs/trash_test/lib/python2.5/site-packages/numpy/ core/include -I/opt/local/Library/Frameworks/Python.framework/Versions/ 2.5/include/python2.5 -c' gcc-4.0: scipy/signal/firfilter.c gcc-4.0: build/src.macosx-10.5-i386-2.5/scipy/signal/correlate_nd.c scipy/signal/correlate_nd.c.src: In function ?_imp_correlate_nd_ubyte?: scipy/signal/correlate_nd.c.src:129: warning: implicit declaration of function ?PyArrayNeighborhoodIter_ResetConstant? scipy/signal/correlate_nd.c.src:133: warning: implicit declaration of function ?PyArrayNeighborhoodIter_NextConstant? scipy/signal/correlate_nd.c.src: In function ?_imp_correlate_nd_ubyte_2d?: scipy/signal/correlate_nd.c.src:133: warning: implicit declaration of function ?PyArrayNeighborhoodIter_NextConstant2D? scipy/signal/correlate_nd.c.src: In function ?_correlate_nd_imp?: scipy/signal/correlate_nd.c.src:287: error: ?NPY_NEIGHBORHOOD_ITER_ZERO_PADDING? undeclared (first use in this function) scipy/signal/correlate_nd.c.src:287: error: (Each undeclared identifier is reported only once scipy/signal/correlate_nd.c.src:287: error: for each function it appears in.) scipy/signal/correlate_nd.c.src:287: error: too many arguments to function ?*(_scipy_signal_ARRAY_API + 852u)? scipy/signal/correlate_nd.c.src:301: error: too many arguments to function ?*(_scipy_signal_ARRAY_API + 852u)? scipy/signal/correlate_nd.c.src: In function ?_imp_correlate_nd_ubyte?: scipy/signal/correlate_nd.c.src:129: warning: implicit declaration of function ?PyArrayNeighborhoodIter_ResetConstant? scipy/signal/correlate_nd.c.src:133: warning: implicit declaration of function ?PyArrayNeighborhoodIter_NextConstant? 
scipy/signal/correlate_nd.c.src: In function ?_imp_correlate_nd_ubyte_2d?: scipy/signal/correlate_nd.c.src:133: warning: implicit declaration of function ?PyArrayNeighborhoodIter_NextConstant2D? scipy/signal/correlate_nd.c.src: In function ?_correlate_nd_imp?: scipy/signal/correlate_nd.c.src:287: error: ?NPY_NEIGHBORHOOD_ITER_ZERO_PADDING? undeclared (first use in this function) scipy/signal/correlate_nd.c.src:287: error: (Each undeclared identifier is reported only once scipy/signal/correlate_nd.c.src:287: error: for each function it appears in.) scipy/signal/correlate_nd.c.src:287: error: too many arguments to function ?*(_scipy_signal_ARRAY_API + 852u)? scipy/signal/correlate_nd.c.src:301: error: too many arguments to function ?*(_scipy_signal_ARRAY_API + 852u)? From charlesr.harris at gmail.com Thu Jul 23 16:11:16 2009 From: charlesr.harris at gmail.com (Charles R Harris) Date: Thu, 23 Jul 2009 14:11:16 -0600 Subject: [SciPy-dev] correlate_nd.src won't compile In-Reply-To: References: Message-ID: On Thu, Jul 23, 2009 at 12:53 PM, Pierre GM wrote: > All, > I just run into the following problem while compiling scipy > (0.8.0svn5874). > Any comments welcome. > Thx. > (mac OS 10.5.7, i686-apple-darwin9-gcc-4.2.1 (GCC) 4.2.1 (Apple Inc. > build 5566)) > > > creating build/temp.macosx-10.5-i386-2.5/scipy/signal > creating build/temp.macosx-10.5-i386-2.5/build/src.macosx-10.5- > i386-2.5/scipy/signal > compile options: '-Iscipy/signal -I/Users/pierregm/ > Computing/.pythonenvs/trash_test/lib/python2.5/site-packages/numpy/ > core/include -I/opt/local/Library/Frameworks/Python.framework/Versions/ > 2.5/include/python2.5 -c' > gcc-4.0: scipy/signal/firfilter.c > gcc-4.0: build/src.macosx-10.5-i386-2.5/scipy/signal/correlate_nd.c > scipy/signal/correlate_nd.c.src: In function ?_imp_correlate_nd_ubyte?: > scipy/signal/correlate_nd.c.src:129: warning: implicit declaration of > function ?PyArrayNeighborhoodIter_ResetConstant? > scipy/signal/correlate_nd.c.src:133: warning: implicit declaration of > function ?PyArrayNeighborhoodIter_NextConstant? > scipy/signal/correlate_nd.c.src: In function > ?_imp_correlate_nd_ubyte_2d?: > scipy/signal/correlate_nd.c.src:133: warning: implicit declaration of > function ?PyArrayNeighborhoodIter_NextConstant2D? > scipy/signal/correlate_nd.c.src: In function ?_correlate_nd_imp?: > scipy/signal/correlate_nd.c.src:287: error: > ?NPY_NEIGHBORHOOD_ITER_ZERO_PADDING? undeclared (first use in this > function) > scipy/signal/correlate_nd.c.src:287: error: (Each undeclared > identifier is reported only once > scipy/signal/correlate_nd.c.src:287: error: for each function it > appears in.) > scipy/signal/correlate_nd.c.src:287: error: too many arguments to > function ?*(_scipy_signal_ARRAY_API + 852u)? > scipy/signal/correlate_nd.c.src:301: error: too many arguments to > function ?*(_scipy_signal_ARRAY_API + 852u)? > scipy/signal/correlate_nd.c.src: In function ?_imp_correlate_nd_ubyte?: > scipy/signal/correlate_nd.c.src:129: warning: implicit declaration of > function ?PyArrayNeighborhoodIter_ResetConstant? > scipy/signal/correlate_nd.c.src:133: warning: implicit declaration of > function ?PyArrayNeighborhoodIter_NextConstant? > scipy/signal/correlate_nd.c.src: In function > ?_imp_correlate_nd_ubyte_2d?: > scipy/signal/correlate_nd.c.src:133: warning: implicit declaration of > function ?PyArrayNeighborhoodIter_NextConstant2D? > scipy/signal/correlate_nd.c.src: In function ?_correlate_nd_imp?: > scipy/signal/correlate_nd.c.src:287: error: > ?NPY_NEIGHBORHOOD_ITER_ZERO_PADDING? 
undeclared (first use in this > function) > scipy/signal/correlate_nd.c.src:287: error: (Each undeclared > identifier is reported only once > scipy/signal/correlate_nd.c.src:287: error: for each function it > appears in.) > scipy/signal/correlate_nd.c.src:287: error: too many arguments to > function ?*(_scipy_signal_ARRAY_API + 852u)? > scipy/signal/correlate_nd.c.src:301: error: too many arguments to > function ?*(_scipy_signal_ARRAY_API + 852u)? > ____ Looks like a missing include somewhere. I expect David C. will fix it up soon. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From d_l_goldsmith at yahoo.com Thu Jul 23 18:45:33 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Thu, 23 Jul 2009 15:45:33 -0700 (PDT) Subject: [SciPy-dev] Special assignment for someone w/ "Unimportant" privileges Message-ID: <138253.61662.qm@web52110.mail.re2.yahoo.com> OK, I promoted a few white ones on the Milestones page to light grey, counted the remainder (most in one or the other of the MA categories) and then cut, paste, and "counted" all the white ones on the "All docstrings" page: the ratio is about 5:1 (300:60). So I look more closely at some of the white ones on the All docstrings page and see things like: numpy.testing.decorators.deprecated and numpy.ndarray.conjugate Shouldn't the former (and many like it) be classified as "Unimportant"? And why wasn't the latter (and many like it) included in the category of its namespace? So, the "Special Assignment" for someone who has the permissions to do this (I don't) is to go through the white ones (at the very least) on the All docstrings page and reclassify as "Unimportant" those for which such a reclassification is appropriate. Let be any that you're unsure of, but compile a list of 'em, and post it at the Q & A page for others to contribute their opinions (or expert decree). As for the categorizable uncategorized ones, unless someone tells me why they weren't categorized, I'll go ahead and do that. DG From cournape at gmail.com Thu Jul 23 19:56:15 2009 From: cournape at gmail.com (David Cournapeau) Date: Fri, 24 Jul 2009 08:56:15 +0900 Subject: [SciPy-dev] correlate_nd.src won't compile In-Reply-To: References: Message-ID: <5b8d13220907231656k6b38a3aald616823d4e60f7d4@mail.gmail.com> On Fri, Jul 24, 2009 at 3:53 AM, Pierre GM wrote: > All, > I just run into the following problem while compiling scipy > (0.8.0svn5874). > Any comments welcome. > Thx. > (mac OS 10.5.7, i686-apple-darwin9-gcc-4.2.1 (GCC) 4.2.1 (Apple Inc. > build 5566)) You need to update numpy David From ralf.gommers at googlemail.com Thu Jul 23 22:40:31 2009 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 23 Jul 2009 22:40:31 -0400 Subject: [SciPy-dev] Special assignment for someone w/ "Unimportant" privileges In-Reply-To: <138253.61662.qm@web52110.mail.re2.yahoo.com> References: <138253.61662.qm@web52110.mail.re2.yahoo.com> Message-ID: On Thu, Jul 23, 2009 at 6:45 PM, David Goldsmith wrote: > > OK, I promoted a few white ones on the Milestones page to light grey, > counted the remainder (most in one or the other of the MA categories) and > then cut, paste, and "counted" all the white ones on the "All docstrings" > page: the ratio is about 5:1 (300:60). So I look more closely at some of > the white ones on the All docstrings page and see things like: > > numpy.testing.decorators.deprecated > > and > > numpy.ndarray.conjugate > > Shouldn't the former (and many like it) be classified as "Unimportant"? 
> And why wasn't the latter (and many like it) included in the category of > its namespace? > > So, the "Special Assignment" for someone who has the permissions to do this > (I don't) is to go through the white ones (at the very least) on the All > docstrings page and reclassify as "Unimportant" those for which such a > reclassification is appropriate. Let be any that you're unsure of, but > compile a list of 'em, and post it at the Q & A page for others to > contribute their opinions (or expert decree). > > As for the categorizable uncategorized ones, unless someone tells me why > they weren't categorized, I'll go ahead and do that. There's a few different types of uncategorized docstrings: - some that were overlooked. i think charrayarray and matrix can be two new categories for example. - the rst files that form the User Guide, can be left where they are for now. - methods of classes that are in a category (see for example broadcast), those can be left where they are as well. they can easily be done together with the class, but there's no need to stuff them in existing categories. - things like testing and distutils. it's important that they get documented at some point, but they are not very accessible topics and not as urgent as many of the categories. fine where they are. - some unimportant ones. there really are not that many of them. The one big group of docstrings I'm not sure about is everything in numpy.generic. They are all undocumented and the source is not available in the wiki. Anyone know what to do with those? Cheers, Ralf > DG > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at googlemail.com Thu Jul 23 22:46:41 2009 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Thu, 23 Jul 2009 22:46:41 -0400 Subject: [SciPy-dev] Category of the week In-Reply-To: <490336.55740.qm@web52104.mail.re2.yahoo.com> References: <490336.55740.qm@web52104.mail.re2.yahoo.com> Message-ID: On Thu, Jul 23, 2009 at 1:52 PM, wrote: > > Hi, again! So, how'd we do on last week's (second) "category" of the week > (about NaN treatment in masked arrays, IIRC)? I focused on getting the > "Basic Objects" section done, so I really didn't follow progress on the > other. Not too much happened there I think. > Anyway, time for a new one: the remaining (un-led) categories requiring > substantial work are: > "Fourier Transforms," "Other Random Operations," "Masked Arrays (I) and > IV," "Operations on Masks," "Even More MA Functions I & II," "Numpy > Internals," "C-types," "Other Math," & "Other Array Subclasses." > Fourier Transforms or Other Random Ops? Cheers, Ralf > If a cat-leader wants to recruit helpers by nominating their cat, those > are: > > "Financial Functions" (Skipper Seabold), "Random Number Generation" (Alan > Jackson), and "Masked Arrays III" (Scott Sinclair). > > Finally, our "category" of the week could be simply to move all "Goal met > (almost)"s to "**Goal Met!**" status; these are: > > "Array Manipulation," "Functional Operations," "Ndarray II," and "Ufunc." > > DG > > PS: What's w/ all the uncategorized white and light grey's one sees on the > "All Docstrings" page? 
> > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pgmdevlist at gmail.com Fri Jul 24 01:37:16 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Fri, 24 Jul 2009 01:37:16 -0400 Subject: [SciPy-dev] correlate_nd.src won't compile In-Reply-To: <5b8d13220907231656k6b38a3aald616823d4e60f7d4@mail.gmail.com> References: <5b8d13220907231656k6b38a3aald616823d4e60f7d4@mail.gmail.com> Message-ID: <245E4FB3-56B4-41C3-83C2-875A576BC410@gmail.com> On Jul 23, 2009, at 7:56 PM, David Cournapeau wrote: > On Fri, Jul 24, 2009 at 3:53 AM, Pierre GM > wrote: >> All, >> I just run into the following problem while compiling scipy >> (0.8.0svn5874). >> Any comments welcome. >> Thx. >> (mac OS 10.5.7, i686-apple-darwin9-gcc-4.2.1 (GCC) 4.2.1 (Apple Inc. >> build 5566)) > > You need to update numpy Thx. Is there a way to tell what SVN version of numpy is required for any SVN version of Scipy ? I had updated numpy from SVN not too long ago (a couple of weeks max.) and didn't think about reupdating that soon. From d_l_goldsmith at yahoo.com Fri Jul 24 01:37:23 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Thu, 23 Jul 2009 22:37:23 -0700 (PDT) Subject: [SciPy-dev] Special assignment for someone w/ "Unimportant" privileges Message-ID: <679422.26092.qm@web52104.mail.re2.yahoo.com> --- On Thu, 7/23/09, Ralf Gommers wrote: > There's a few different types of uncategorized > docstrings: > > - some that were overlooked. i think charrayarray and > matrix can be two new categories for example. > - methods of classes that are in a category (see for > example broadcast), those can be left where they are as > well. they can easily be done together with the class, but > there's no need to stuff them in existing categories. Matrix already is a category, one marked Goal Met, but I see that many of class matrix's methods are white or grey: Joe, if you read this, since you set the goal, does it include these methods that aren't on the Milestones page themselves, but are methods of a class that is on the Milestones page? I know the general feeling is that the Milestones page is already too long (i.e., takes too long to load because so many objects on it means database queries take a long time), but, if these methods are part of the goal, I'd like to see these "methods-in-the-goal-but-not-on-the-Milestones-page" all in one place somewhere, sorted out of the other "un-categories" Ralf discusses below. I know the Q & A page is also getting long, but what about there? > - the rst files that form the User Guide, can be left where > they are for now. > > - things like testing and distutils. it's important > that they get documented at some point, but they are not > very accessible topics and not as urgent as many of the > categories. fine where they are. OK, so two of these "un-categories" are "Unimportant Now, Do Eventually"; I know we're just going to have to redo the color scheme again shortly to accommodate the dual review process, but how difficult would it be to add one color now, say lavender, or bright orange, for this "UNDE" status? > - some unimportant ones. there really are not that many of > them. Still, it'd be nice to weed them out, as I (and I'm sure many other potential writers) don't know which they are - I don't want to be working on 'em, and I don't want volunteers working on 'em, if they're Unimportant. 
> The one big group of docstrings I'm not sure about is > everything in numpy.generic. They are all undocumented and > the source is not available in the wiki. Anyone know what to > do with those? Well, someone "in the know" could take a look at what I did for the attributes in the .generic namespace and opine on whether that's (in)sufficient... DG > > Cheers, > Ralf > > > > > > DG > > > > > > > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > -----Inline Attachment Follows----- > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From david at ar.media.kyoto-u.ac.jp Fri Jul 24 01:25:25 2009 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 24 Jul 2009 14:25:25 +0900 Subject: [SciPy-dev] correlate_nd.src won't compile In-Reply-To: <245E4FB3-56B4-41C3-83C2-875A576BC410@gmail.com> References: <5b8d13220907231656k6b38a3aald616823d4e60f7d4@mail.gmail.com> <245E4FB3-56B4-41C3-83C2-875A576BC410@gmail.com> Message-ID: <4A6945C5.30805@ar.media.kyoto-u.ac.jp> Pierre GM wrote: > > Thx. Is there a way to tell what SVN version of numpy is required for > any SVN version of Scipy ? No. I think that would be too much of a bother during development phases, as it does not that happen often (and it is not that easy to do it right in every case - how to get svn rev with tarballs, git/bzr/hg repos, etc... ). cheers, David From pgmdevlist at gmail.com Fri Jul 24 02:13:22 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Fri, 24 Jul 2009 02:13:22 -0400 Subject: [SciPy-dev] correlate_nd.src won't compile In-Reply-To: <4A6945C5.30805@ar.media.kyoto-u.ac.jp> References: <5b8d13220907231656k6b38a3aald616823d4e60f7d4@mail.gmail.com> <245E4FB3-56B4-41C3-83C2-875A576BC410@gmail.com> <4A6945C5.30805@ar.media.kyoto-u.ac.jp> Message-ID: <70F6112B-718E-4B48-9CAA-97AA220A8E4D@gmail.com> On Jul 24, 2009, at 1:25 AM, David Cournapeau wrote: > Pierre GM wrote: >> >> Thx. Is there a way to tell what SVN version of numpy is required for >> any SVN version of Scipy ? > > No. I think that would be too much of a bother during development > phases, as it does not that happen often (and it is not that easy to > do > it right in every case - how to get svn rev with tarballs, git/bzr/hg > repos, etc... ). OK. What about just a line in the commit comments, then ? From pgmdevlist at gmail.com Fri Jul 24 02:15:39 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Fri, 24 Jul 2009 02:15:39 -0400 Subject: [SciPy-dev] Homogenizing stats & mstats Message-ID: All, I was browsing some recent tickets for scipy.stats, and couldn't but noticed that a significant number of them (#845, #822, #901...), are related to some lack of consistency between stats and mstats. I'd like to eventually get rid of mstats all together, provided the same functionalities are supported in stats. * A first step would be to use np.asanyarray instead of np.asarray. That should be sufficient for functions like gmean and hmean for example. * A second step would be to use numpy.ma under the hood, returning either a MaskedArray if the input is a MaskedArray itself, or just a standard ndarray otherwise. That should take care of the functions related to ranking and tie handling (I'm pretty confident into the mstats routines, and we can always double-check the results w/ R). 
If needed, we could also add a usemask flag, like we do in np.io.genfromtxt. * A third would be to port the remaining routines of mstats.extras to stats or morestats (Harrell-Davies quantiles could be imlemented more efficiently in cython, for example). At each step, we could add a Deprecate warning to a reviewed mstat function and call the corresponding stat function instead. What would be a good time line ? 0.8.0, or is it too late? 0.9.0 ? Comments expected. Thx in advance P. From david at ar.media.kyoto-u.ac.jp Fri Jul 24 01:58:05 2009 From: david at ar.media.kyoto-u.ac.jp (David Cournapeau) Date: Fri, 24 Jul 2009 14:58:05 +0900 Subject: [SciPy-dev] correlate_nd.src won't compile In-Reply-To: <70F6112B-718E-4B48-9CAA-97AA220A8E4D@gmail.com> References: <5b8d13220907231656k6b38a3aald616823d4e60f7d4@mail.gmail.com> <245E4FB3-56B4-41C3-83C2-875A576BC410@gmail.com> <4A6945C5.30805@ar.media.kyoto-u.ac.jp> <70F6112B-718E-4B48-9CAA-97AA220A8E4D@gmail.com> Message-ID: <4A694D6D.5000808@ar.media.kyoto-u.ac.jp> Pierre GM wrote: > > OK. What about just a line in the commit comments, then ? > That's a good idea, I should have done that. David From stefan at sun.ac.za Fri Jul 24 09:08:58 2009 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Fri, 24 Jul 2009 15:08:58 +0200 Subject: [SciPy-dev] Homogenizing stats & mstats In-Reply-To: References: Message-ID: <9457e7c80907240608x1e3b7d37je776a826771cc117@mail.gmail.com> 2009/7/24 Pierre GM : > * A second step would be to use numpy.ma under the hood, returning > either a MaskedArray if the input is a MaskedArray itself, or just a > standard ndarray otherwise. That should take care of the functions > related to ranking and tie handling (I'm pretty confident into the > mstats routines, and we can always double-check the results w/ R). If > needed, we could also add a usemask flag, like we do in > np.io.genfromtxt. Would the patch suggested by Darren be useful here? I imagine it would. Regards St?fan From bsouthey at gmail.com Fri Jul 24 11:14:18 2009 From: bsouthey at gmail.com (Bruce Southey) Date: Fri, 24 Jul 2009 10:14:18 -0500 Subject: [SciPy-dev] Homogenizing stats & mstats In-Reply-To: References: Message-ID: <4A69CFCA.8000206@gmail.com> On 07/24/2009 01:15 AM, Pierre GM wrote: > All, > I was browsing some recent tickets for scipy.stats, and couldn't but > noticed that a significant number of them (#845, #822, #901...), are > related to some lack of consistency between stats and mstats. > > I'd like to eventually get rid of mstats all together, provided the > same functionalities are supported in stats. > Yeah, that would be great but I ran out of steam to do more and have not found the time to go back. > * A first step would be to use np.asanyarray instead of np.asarray. > That should be sufficient for functions like gmean and hmean for > example. > Well there should be a couple of patches for those two. http://projects.scipy.org/scipy/ticket/907 http://projects.scipy.org/scipy/ticket/908 It was not clear if some functions should be in scipy or even stats at least in their current form (this made me stop what I was doing). I really hope that Numpy will eventually provide for something like nanmean and nanstd. In some cases these appeared to limited to specific array dimensions (trimboth), others appear to be one liners and those with different names but may be the same function (trim_mean and trimmed_mean). 
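The kind of helper I am wishing for would not need to be fancy; a minimal sketch (the name nanmean here is only what I would like to see, numpy itself does not provide such a function):

import numpy as np

def nanmean(a, axis=None):
    # sketch only: mean over the non-NaN entries along an axis
    a = np.asanyarray(a, dtype=float)
    valid = ~np.isnan(a)
    return np.where(valid, a, 0.0).sum(axis=axis) / valid.sum(axis=axis)

A nanstd could follow the same masking pattern.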
As I now think about these functions, the stats functions do need to split into at least two parts such as descriptive stats like geometric mean (gmean) and statistical test functions like kendalltau. Perhaps even adding a set of utility functions like tmax, tmean and tmin (but these are limited to one dimensional arrays). We also need to address ticket 604:'Statistics functions with new options' at the same time. http://projects.scipy.org/scipy/ticket/604 > * A second step would be to use numpy.ma under the hood, returning > either a MaskedArray if the input is a MaskedArray itself, or just a > standard ndarray otherwise. That should take care of the functions > related to ranking and tie handling (I'm pretty confident into the > mstats routines, and we can always double-check the results w/ R). If > needed, we could also add a usemask flag, like we do in > np.io.genfromtxt. > Really I think that the input object must be preserved unless the user states otherwise. One aspect is that masked arrays automatically masks any noninfinite elements like infinity. For certain stats it is essentially to know that this has occurred as it signals a larger problem but automatically masking this hides this problem. For example: c=np.ma.masked_array([1.,2.,3., np.nan], [1,0,0,0] # provides a masked array with NaN c/2 # automatically masks the np.nan which is fine if you know but not if you do not want nonfinite values masked. It would be great to have at least the Matrix class work (record/structured arrays and even sparse arrays as well) but I do not how sufficient about these to know how. > * A third would be to port the remaining routines of mstats.extras to > stats or morestats (Harrell-Davies quantiles could be imlemented more > efficiently in cython, for example). > > At each step, we could add a Deprecate warning to a reviewed mstat > function and call the corresponding stat function instead. > Unfortunately there is not a one to one matching between the stats and mstats functions. When I started I found 178 functions between the different modules including some that are or should be depreciated. Only about 40 functions (plus a few that should be removed) that have the same name in the stats and masked_basic files. I have not checked these to know if these have the exact same behavior as expected by the input type. There are others that perhaps only differ in name. > What would be a good time line ? 0.8.0, or is it too late? 0.9.0 ? > For 0.8 I think we must at least warn users changes are comming for the stats and mstats as well as make sure that any unnecessary functions are depreciated. Also we could start the process to reorganize the stats functions and combine the stats and mstats functions with the same name and behavior. > Comments expected. > Always! :-) > Thx in advance > P. > Bruce -------------- next part -------------- An HTML attachment was scrubbed... URL: From pgmdevlist at gmail.com Fri Jul 24 14:23:51 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Fri, 24 Jul 2009 14:23:51 -0400 Subject: [SciPy-dev] Homogenizing stats & mstats In-Reply-To: <4A69CFCA.8000206@gmail.com> References: <4A69CFCA.8000206@gmail.com> Message-ID: <35E246DD-D9D8-4292-90C0-5E5AE4571D29@gmail.com> On Jul 24, 2009, at 11:14 AM, Bruce Southey wrote: > > As I now think about these functions, the stats functions do need to > split into at least two parts such as descriptive stats like > geometric mean (gmean) and statistical test functions like > kendalltau. 
Perhaps even adding a set of utility functions like > tmax, tmean and tmin (but these are limited to one dimensional > arrays). That was my intention as well to split stats into 2 or 3 files. Descriptive stats (means & quantiles) on one side, tests on the other sound good. Should we start creating these files already side by side with the current stats/mstats files ? Should we create a branch ? > We also need to address ticket 604:'Statistics functions with new > options' at the same time. > http://projects.scipy.org/scipy/ticket/604 Indeed. >> * A second step would be to use numpy.ma under the hood, returning >> either a MaskedArray if the input is a MaskedArray itself, or just a >> standard ndarray otherwise. > Really I think that the input object must be preserved unless the > user states otherwise. One aspect is that masked arrays > automatically masks any noninfinite elements like infinity. For > certain stats it is essentially to know that this has occurred as it > signals a larger problem but automatically masking this hides this > problem. For example: > c=np.ma.masked_array([1.,2.,3., np.nan], [1,0,0,0] # provides a > masked array with NaN > c/2 # automatically masks the np.nan which is fine if you know but > not if you do not want nonfinite values masked. OK, I see the problem here. We could have this usemask tell us whether to use the MA behaviour (invalid output are masked, a MA is output no matter the type of the input) or not (NaN/Infs are preserved, a standard ndarray is output no matter the type of the input). Nevertheless, some of the functions (ranking, tests with ties) work correctly in mstats and not in stats (compared to R): we could use the mstats implementation instead of the stats one, then. > It would be great to have at least the Matrix class work (record/ > structured arrays and even sparse arrays as well) but I do not how > sufficient about these to know how. Not too much a problem for descriptive stats on Matrix if we use np.asanyarray. Structured arrays are a different beast, as the standard functions (+-/*...) don't work (for a good reason, and this may change later on). I've no experience on sparse arrays, so count me out on this one. >> * A third would be to port the remaining routines of mstats.extras to >> stats or morestats (Harrell-Davies quantiles could be imlemented more >> efficiently in cython, for example). >> >> At each step, we could add a Deprecate warning to a reviewed mstat >> function and call the corresponding stat function instead. >> > Unfortunately there is not a one to one matching between the stats > and mstats functions. Mmh, if we proceed methodically, that shouldn't be too much of a problem. Name differences can be easily adressed. Behavior differences are trickier, but may be just bugs waiting for us. > When I started I found 178 functions between the different modules > including some that are or should be depreciated. Only about 40 > functions (plus a few that should be removed) that have the same > name in the stats and masked_basic files. I have not checked these > to know if these have the exact same behavior as expected by the > input type. There are others that perhaps only differ in name. >> What would be a good time line ? 0.8.0, or is it too late? 0.9.0 ? > For 0.8 I think we must at least warn users changes are comming for > the stats and mstats as well as make sure that any unnecessary > functions are depreciated. 
Also we could start the process to > reorganize the stats functions and combine the stats and mstats > functions with the same name and behavior. When is 0.8.0 supposed to be released ? If it's a matter of just a couple of weeks, we can sit on the issue as long as needed. If it's longer than that, we should probably get started now. From bsouthey at gmail.com Fri Jul 24 16:12:49 2009 From: bsouthey at gmail.com (Bruce Southey) Date: Fri, 24 Jul 2009 15:12:49 -0500 Subject: [SciPy-dev] Homogenizing stats & mstats In-Reply-To: <35E246DD-D9D8-4292-90C0-5E5AE4571D29@gmail.com> References: <4A69CFCA.8000206@gmail.com> <35E246DD-D9D8-4292-90C0-5E5AE4571D29@gmail.com> Message-ID: <4A6A15C1.2030204@gmail.com> On 07/24/2009 01:23 PM, Pierre GM wrote: > On Jul 24, 2009, at 11:14 AM, Bruce Southey wrote: > >> As I now think about these functions, the stats functions do need to >> split into at least two parts such as descriptive stats like >> geometric mean (gmean) and statistical test functions like >> kendalltau. Perhaps even adding a set of utility functions like >> tmax, tmean and tmin (but these are limited to one dimensional >> arrays). >> > > That was my intention as well to split stats into 2 or 3 files. > Descriptive stats (means& quantiles) on one side, tests on the other > sound good. Should we start creating these files already side by side > with the current stats/mstats files ? Should we create a branch ? > Just go ahead and do what you want! :-) The real issue is whether or not the stats files will be replaced by the new versions or be a new entity (that could then replace the old versions). Initially it would good to keep the old versions around to check and test functionality. > >> We also need to address ticket 604:'Statistics functions with new >> options' at the same time. >> http://projects.scipy.org/scipy/ticket/604 >> > > Indeed. > > >>> * A second step would be to use numpy.ma under the hood, returning >>> either a MaskedArray if the input is a MaskedArray itself, or just a >>> standard ndarray otherwise. >>> >> Really I think that the input object must be preserved unless the >> user states otherwise. One aspect is that masked arrays >> automatically masks any noninfinite elements like infinity. For >> certain stats it is essentially to know that this has occurred as it >> signals a larger problem but automatically masking this hides this >> problem. For example: >> c=np.ma.masked_array([1.,2.,3., np.nan], [1,0,0,0] # provides a >> masked array with NaN >> c/2 # automatically masks the np.nan which is fine if you know but >> not if you do not want nonfinite values masked. >> > > OK, I see the problem here. We could have this usemask tell us whether > to use the MA behaviour (invalid output are masked, a MA is output no > matter the type of the input) or not (NaN/Infs are preserved, a > standard ndarray is output no matter the type of the input). > Nevertheless, some of the functions (ranking, tests with ties) work > correctly in mstats and not in stats (compared to R): we could use the > mstats implementation instead of the stats one, then. > > > > >> It would be great to have at least the Matrix class work (record/ >> structured arrays and even sparse arrays as well) but I do not how >> sufficient about these to know how. >> > > Not too much a problem for descriptive stats on Matrix if we use > np.asanyarray. Structured arrays are a different beast, as the > standard functions (+-/*...) don't work (for a good reason, and this > may change later on). 
I've no experience on sparse arrays, so count me > out on this one. > Sounds like Matrix should be sufficiently easy to incorporate and we leave the rest on the wish list. > > > >>> * A third would be to port the remaining routines of mstats.extras to >>> stats or morestats (Harrell-Davies quantiles could be imlemented more >>> efficiently in cython, for example). >>> >>> At each step, we could add a Deprecate warning to a reviewed mstat >>> function and call the corresponding stat function instead. >>> >>> >> Unfortunately there is not a one to one matching between the stats >> and mstats functions. >> > > Mmh, if we proceed methodically, that shouldn't be too much of a > problem. Name differences can be easily adressed. Behavior differences > are trickier, but may be just bugs waiting for us. > > > >> When I started I found 178 functions between the different modules >> including some that are or should be depreciated. Only about 40 >> functions (plus a few that should be removed) that have the same >> name in the stats and masked_basic files. I have not checked these >> to know if these have the exact same behavior as expected by the >> input type. There are others that perhaps only differ in name. >> > > >>> What would be a good time line ? 0.8.0, or is it too late? 0.9.0 ? >>> >> For 0.8 I think we must at least warn users changes are comming for >> the stats and mstats as well as make sure that any unnecessary >> functions are depreciated. Also we could start the process to >> reorganize the stats functions and combine the stats and mstats >> functions with the same name and behavior. >> > > When is 0.8.0 supposed to be released ? If it's a matter of just a > couple of weeks, we can sit on the issue as long as needed. If it's longer than that, we should probably get started now. > > While I can not help immediately with this, some I had submitted patches for. So hopefully the following will help. These functions just rename existing functions and perhaps the renaming, as necessary, should be elsewhere (like the distributions): chisqprob erfc fprob ksprob zprob These functions are/should be deprecated: mean median std var samplestd samplevar I thought that these could be replaced by a one liner using the compress method because these only work for 1d arrays; i.e. for some cutoff values minval and maxval: tmean a.compress((a>minval) & (a<maxval)).mean(), with the other t-functions (tvar, tmin, tmax, tstd) following the same pattern (a fuller sketch follows below).
From pgmdevlist at gmail.com Fri Jul 24 16:29:37 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Fri, 24 Jul 2009 16:29:37 -0400 Subject: Re: [SciPy-dev] Homogenizing stats & mstats In-Reply-To: <4A6A15C1.2030204@gmail.com> References: <4A69CFCA.8000206@gmail.com> <35E246DD-D9D8-4292-90C0-5E5AE4571D29@gmail.com> <4A6A15C1.2030204@gmail.com> Message-ID: <1617E3D1-20AC-4A6C-9ACB-D33830E9C3A4@gmail.com> On Jul 24, 2009, at 4:12 PM, Bruce Southey wrote: > Just go ahead and do what you want! :-) > The real issue is whether or not the stats files will be replaced by > the new versions or be a new entity (that could then replace the old > versions). Initially it would good to keep the old versions around > to check and test functionality. Obviously... > While I can not help immediately with this, some I had submitted > patches for. So hopefully the following will help. Great ! That's quite useful. Now, if I may ask for a bit more, what would you see in a descriptive.py and in a stattest.py ?
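For reference, the compress-based one liners mentioned above, written out as a sketch (these are not the current scipy.stats implementations, which may treat limit inclusion and bias conventions differently):

import numpy as np

def tmean(a, minval, maxval):
    # sketch: trimmed mean of a 1d array, keeping only minval < x < maxval
    a = np.asanyarray(a).ravel()
    return a.compress((a > minval) & (a < maxval)).mean()

def tvar(a, minval, maxval):
    # sketch: trimmed variance over the same selection
    a = np.asanyarray(a).ravel()
    return a.compress((a > minval) & (a < maxval)).var()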
From d_l_goldsmith at yahoo.com Fri Jul 24 19:21:14 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Fri, 24 Jul 2009 16:21:14 -0700 (PDT) Subject: [SciPy-dev] Category of the week Message-ID: <277756.91288.qm@web52112.mail.re2.yahoo.com> --- On Thu, 7/23/09, Ralf Gommers wrote: > Fourier Transforms or Other Random Ops? Well, I just polished off three of the remaining eight in ORO, so let's finish that one off, and if we do so early, dive in to FT. DG > > Cheers, > Ralf > > > > > If a cat-leader wants to recruit helpers by nominating > their cat, those are: > > > > "Financial Functions" (Skipper Seabold), > "Random Number Generation" (Alan Jackson), and > "Masked Arrays III" (Scott Sinclair). > > > > Finally, our "category" of the week could be > simply to move all "Goal met (almost)"s to > "**Goal Met!**" status; these are: > > > > "Array Manipulation," "Functional > Operations," "Ndarray II," and > "Ufunc." > > > > DG > > > > PS: What's w/ all the uncategorized white and light > grey's one sees on the "All Docstrings" page? > > > > > > > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > From bsouthey at gmail.com Fri Jul 24 22:07:59 2009 From: bsouthey at gmail.com (Bruce Southey) Date: Fri, 24 Jul 2009 21:07:59 -0500 Subject: [SciPy-dev] Homogenizing stats & mstats In-Reply-To: <1617E3D1-20AC-4A6C-9ACB-D33830E9C3A4@gmail.com> References: <4A69CFCA.8000206@gmail.com> <35E246DD-D9D8-4292-90C0-5E5AE4571D29@gmail.com> <4A6A15C1.2030204@gmail.com> <1617E3D1-20AC-4A6C-9ACB-D33830E9C3A4@gmail.com> Message-ID: > > Great ! That's quite useful. Now, if I may ask for a bit more, what > would you see in a descriptive.py and in a stattest.py ? > Attached is my first attempt at this based on what I started ages ago so it is probably not current. It is a csv file with columns: Function Name: name of the function in files Array: Array function where Stats= stats.py, MoreStats=- morestats.py or blank if not present Masked: Masked array function where Mbasic=mstats_basic.py, Mextra=mstats_extras.py or blank if not present Support: functions found in the _support.py Type: In most cases it is obvious where they should go but this column has my (incorrect) guesses at the possible location: a) Internal function: small function defined in another function. I pulled these out in case of duplicated functions but probably should just disappear. b) Utility function: Appear to be helper functions that are not necessary stats based. Some of these probably should have a underscore to indicate these are not for external use. c) Summary statistic: Usually provides one or more descriptive statistics d) Test: Usually involves some sort of testing. I also included the correlation measures here. e) Probability function: renamed functions from special that perhaps should not be defined here. Comment: Possible actions like depreciate. The question mark indicates that I thought that the function might be a repeat or variant of other functions. If anyone feels I have the wrong location then just say so. Also some functions may be special cases of others and thus could be depreciated. Please correct any errors as there should be some. Bruce -------------- next part -------------- A non-text attachment was scrubbed... 
Name: stats_functions_24july09.csv Type: application/vnd.ms-excel Size: 8731 bytes Desc: not available URL: From cournape at gmail.com Sun Jul 26 11:59:48 2009 From: cournape at gmail.com (David Cournapeau) Date: Mon, 27 Jul 2009 00:59:48 +0900 Subject: [SciPy-dev] scipy.special changes, npymath integration Message-ID: <5b8d13220907260859s2c676d5en42450aab6b4b636e@mail.gmail.com> Hi, A small email to indicate that I have merged a first set of changes in scipy.special, aimed at making scipy.special more portable (in particular on windows with MS compilers). This means the scipy trunk requires the last numpy svn (r7252 as I speak). cheers, David From robert.kern at gmail.com Mon Jul 27 16:58:44 2009 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 27 Jul 2009 15:58:44 -0500 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> Message-ID: <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> On Mon, Jul 27, 2009 at 15:29, wrote: > On Mon, Jul 27, 2009 at 3:55 PM, Robert Kern wrote: >> On Mon, Jul 27, 2009 at 14:47, Pierre GM wrote: >>> All, >>> I'm puzzled by this result: >>> ?>>> from scipy.stats.distributions import nbinom >>> ?>>> ?nbinom(.3,.15).ppf(nbinom(.3,.15).cdf(np.arange(20))) >>> array([ ?1., ? 2., ? 2., ? 3., ? 5., ? 5., ? 6., ? 8., ? 9., ?10., ?11., >>> ? ? ? ? 12., ?13., ?14., ?15., ?16., ?17., ?18., ?19., ?20.]) >>> >>> I would have naturally expected np.arange(20). >> >> Floating point shenanigans. The CDF and PPF of discrete distributions >> are step functions. Round-tripping involves evaluating the PPF around >> the step. Naturally, floating point errors are going to nudge you to >> the left or right of that step. >> >>> Using np.round instead of np.ceil in nbinom_gen._ppf seems to solve >>> the issue. >>> Is there any reason not to do it ? >> >> ceil() is correct; round() is not. round() would be okay if the only >> inputs are expected to be outputs of the CDF, but one frequently needs >> the PPF to take all values in [0,1], like for example doing random >> number generation via inversion. > > Do you think it would be useful to round if we are epsilon (?) close > to the next integer? It is more likely that users have a case like > Pierre's where the answer might be the closest integer, instead of an > epsilon below. It would move the floating point error, but to an, in > actual usage, less likely location. > > I think, it is in scipy.special where I saw some code that treats > anything that is within 1e-8 (?) of an integer as the integer. I think the code after the first line is an attempt to remedy this situation: def _ppf(self, q, n, pr): vals = ceil(special.nbdtrik(q,n,pr)) vals1 = vals-1 temp = special.nbdtr(vals1,n,pr) return where(temp >= q, vals1, vals) It is possible that "temp >= q" should be replaced with "temp >= q-eps". -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From pgmdevlist at gmail.com Mon Jul 27 18:20:58 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Mon, 27 Jul 2009 18:20:58 -0400 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> Message-ID: <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> On Jul 27, 2009, at 4:58 PM, Robert Kern wrote: > On Mon, Jul 27, 2009 at 15:29, wrote: >> On Mon, Jul 27, 2009 at 3:55 PM, Robert Kern >> wrote: >>> >>> ceil() is correct; round() is not. round() would be okay if the only >>> inputs are expected to be outputs of the CDF, but one frequently >>> needs >>> the PPF to take all values in [0,1], like for example doing random >>> number generation via inversion. Fair enough, but a bit frustrating nevertheless in this particular case. >> Do you think it would be useful to round if we are epsilon (?) close >> to the next integer? It is more likely that users have a case like >> Pierre's where the answer might be the closest integer, instead of an >> epsilon below. It would move the floating point error, but to an, in >> actual usage, less likely location. >> I think, it is in scipy.special where I saw some code that treats >> anything that is within 1e-8 (?) of an integer as the integer. > > I think the code after the first line is an attempt to remedy this > situation: > > def _ppf(self, q, n, pr): > vals = ceil(special.nbdtrik(q,n,pr)) > vals1 = vals-1 > temp = special.nbdtr(vals1,n,pr) > return where(temp >= q, vals1, vals) > > It is possible that "temp >= q" should be replaced with "temp >= q- > eps". Doesn't work in the case I was presenting, as temp is here an array of NaNs. Using >>> vals = ceil(special.nbdtrik(q,n,pr)-eps) seems to work well enough, provided that eps is around 1e-8 as Josef suggested. From josef.pktd at gmail.com Mon Jul 27 19:10:11 2009 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Mon, 27 Jul 2009 19:10:11 -0400 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> Message-ID: <1cd32cbb0907271610w6986b256me9730f2ee3c436c5@mail.gmail.com> On Mon, Jul 27, 2009 at 6:20 PM, Pierre GM wrote: > > On Jul 27, 2009, at 4:58 PM, Robert Kern wrote: > >> On Mon, Jul 27, 2009 at 15:29, wrote: >>> On Mon, Jul 27, 2009 at 3:55 PM, Robert Kern >>> wrote: >>>> >>>> ceil() is correct; round() is not. round() would be okay if the only >>>> inputs are expected to be outputs of the CDF, but one frequently >>>> needs >>>> the PPF to take all values in [0,1], like for example doing random >>>> number generation via inversion. > > Fair enough, but a bit frustrating nevertheless in this particular case. > >>> Do you think it would be useful to round if we are epsilon (?) close >>> to the next integer? It is more likely that users have a case like >>> Pierre's where the answer might be the closest integer, instead of an >>> epsilon below. It would move the floating point error, but to an, in >>> actual usage, less likely location. 
> > > >>> I think, it is in scipy.special where I saw some code that treats >>> anything that is within 1e-8 (?) of an integer as the integer. >> >> I think the code after the first line is an attempt to remedy this >> situation: >> >> ? ?def _ppf(self, q, n, pr): >> ? ? ? ?vals = ceil(special.nbdtrik(q,n,pr)) >> ? ? ? ?vals1 = vals-1 >> ? ? ? ?temp = special.nbdtr(vals1,n,pr) >> ? ? ? ?return where(temp >= q, vals1, vals) >> >> It is possible that "temp >= q" should be replaced with "temp >= q- >> eps". > > Doesn't work in the case I was presenting, as temp is here an array of > NaNs. Using > ?>>> vals = ceil(special.nbdtrik(q,n,pr)-eps) > seems to work well enough, provided that eps is around 1e-8 as Josef > suggested. > I finally remembered, that in the test for the discrete distribution or for histogram with integers, I always use the 1e-8 correction, or 1e-14, depending on what is the more likely numerical error in the calculation. I'm usually not sure what the appropriate correction is. special.nbdtrik doesn't have any documentation, so I don't even know what it is supposed to do. The original failure case should make a nice roundtrip test for the discrete distribution, after the correction. I think, until now I only used floating point cases that are usually non-integers in the tests of the discrete distribution, but not exact roundtrip tests as for the continuous distributions. I will test and change this as soon as possible. Josef > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From robert.kern at gmail.com Mon Jul 27 19:21:42 2009 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 27 Jul 2009 18:21:42 -0500 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <1cd32cbb0907271610w6986b256me9730f2ee3c436c5@mail.gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> <1cd32cbb0907271610w6986b256me9730f2ee3c436c5@mail.gmail.com> Message-ID: <3d375d730907271621m4aabe914l1536fd6a49da0ee5@mail.gmail.com> On Mon, Jul 27, 2009 at 18:10, wrote: > On Mon, Jul 27, 2009 at 6:20 PM, Pierre GM wrote: >> >> On Jul 27, 2009, at 4:58 PM, Robert Kern wrote: >> >>> On Mon, Jul 27, 2009 at 15:29, wrote: >>>> On Mon, Jul 27, 2009 at 3:55 PM, Robert Kern >>>> wrote: >>>>> >>>>> ceil() is correct; round() is not. round() would be okay if the only >>>>> inputs are expected to be outputs of the CDF, but one frequently >>>>> needs >>>>> the PPF to take all values in [0,1], like for example doing random >>>>> number generation via inversion. >> >> Fair enough, but a bit frustrating nevertheless in this particular case. >> >>>> Do you think it would be useful to round if we are epsilon (?) close >>>> to the next integer? It is more likely that users have a case like >>>> Pierre's where the answer might be the closest integer, instead of an >>>> epsilon below. It would move the floating point error, but to an, in >>>> actual usage, less likely location. >> >> >> >>>> I think, it is in scipy.special where I saw some code that treats >>>> anything that is within 1e-8 (?) of an integer as the integer. >>> >>> I think the code after the first line is an attempt to remedy this >>> situation: >>> >>> ? ?def _ppf(self, q, n, pr): >>> ? ? ? 
?vals = ceil(special.nbdtrik(q,n,pr)) >>> ? ? ? ?vals1 = vals-1 >>> ? ? ? ?temp = special.nbdtr(vals1,n,pr) >>> ? ? ? ?return where(temp >= q, vals1, vals) >>> >>> It is possible that "temp >= q" should be replaced with "temp >= q- >>> eps". >> >> Doesn't work in the case I was presenting, as temp is here an array of >> NaNs. Using >> ?>>> vals = ceil(special.nbdtrik(q,n,pr)-eps) >> seems to work well enough, provided that eps is around 1e-8 as Josef >> suggested. >> > > I finally remembered, that in the test for the discrete distribution > or for histogram with integers, I always use the 1e-8 correction, or > 1e-14, depending on what is the more likely numerical error in the > calculation. I'm usually not sure what the appropriate correction is. > > special.nbdtrik doesn't have any documentation, so I don't even know > what it is supposed to do. Just grep for it in the scipy/special sources. It's the inverse of nbdtr, hence the "i", inverting for the "k" parameter given "n" and "p". The problem here is that we are using nbdtr() inside this function to compute temp, but nbdtr doesn't handle n<1. The _cdf() method uses betainc() instead. If one replaces these lines: vals1 = vals-1 temp = special.nbdtr(vals1,n,pr) with these: vals1 = (vals-1).clip(0.0, np.inf) temp = special.betainc(n,vals,pr) then you get the desired answer. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From josef.pktd at gmail.com Mon Jul 27 21:11:17 2009 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Mon, 27 Jul 2009 21:11:17 -0400 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <3d375d730907271621m4aabe914l1536fd6a49da0ee5@mail.gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> <1cd32cbb0907271610w6986b256me9730f2ee3c436c5@mail.gmail.com> <3d375d730907271621m4aabe914l1536fd6a49da0ee5@mail.gmail.com> Message-ID: <1cd32cbb0907271811x411a32dbs4f11bf2f220f6781@mail.gmail.com> On Mon, Jul 27, 2009 at 7:21 PM, Robert Kern wrote: > On Mon, Jul 27, 2009 at 18:10, wrote: >> On Mon, Jul 27, 2009 at 6:20 PM, Pierre GM wrote: >>> >>> On Jul 27, 2009, at 4:58 PM, Robert Kern wrote: >>> >>>> On Mon, Jul 27, 2009 at 15:29, wrote: >>>>> On Mon, Jul 27, 2009 at 3:55 PM, Robert Kern >>>>> wrote: >>>>>> >>>>>> ceil() is correct; round() is not. round() would be okay if the only >>>>>> inputs are expected to be outputs of the CDF, but one frequently >>>>>> needs >>>>>> the PPF to take all values in [0,1], like for example doing random >>>>>> number generation via inversion. >>> >>> Fair enough, but a bit frustrating nevertheless in this particular case. >>> >>>>> Do you think it would be useful to round if we are epsilon (?) close >>>>> to the next integer? It is more likely that users have a case like >>>>> Pierre's where the answer might be the closest integer, instead of an >>>>> epsilon below. It would move the floating point error, but to an, in >>>>> actual usage, less likely location. >>> >>> >>> >>>>> I think, it is in scipy.special where I saw some code that treats >>>>> anything that is within 1e-8 (?) of an integer as the integer. 
>>>> >>>> I think the code after the first line is an attempt to remedy this >>>> situation: >>>> >>>> ? ?def _ppf(self, q, n, pr): >>>> ? ? ? ?vals = ceil(special.nbdtrik(q,n,pr)) >>>> ? ? ? ?vals1 = vals-1 >>>> ? ? ? ?temp = special.nbdtr(vals1,n,pr) >>>> ? ? ? ?return where(temp >= q, vals1, vals) >>>> >>>> It is possible that "temp >= q" should be replaced with "temp >= q- >>>> eps". >>> >>> Doesn't work in the case I was presenting, as temp is here an array of >>> NaNs. Using >>> ?>>> vals = ceil(special.nbdtrik(q,n,pr)-eps) >>> seems to work well enough, provided that eps is around 1e-8 as Josef >>> suggested. >>> >> >> I finally remembered, that in the test for the discrete distribution >> or for histogram with integers, I always use the 1e-8 correction, or >> 1e-14, depending on what is the more likely numerical error in the >> calculation. I'm usually not sure what the appropriate correction is. >> >> special.nbdtrik doesn't have any documentation, so I don't even know >> what it is supposed to do. > > Just grep for it in the scipy/special sources. It's the inverse of > nbdtr, hence the "i", inverting for the "k" parameter given "n" and > "p". > > The problem here is that we are using nbdtr() inside this function to > compute temp, but nbdtr doesn't handle n<1. The _cdf() method uses > betainc() instead. If one replaces these lines: > > ?vals1 = vals-1 > ?temp = special.nbdtr(vals1,n,pr) > > with these: > > ?vals1 = (vals-1).clip(0.0, np.inf) > ?temp = special.betainc(n,vals,pr) > > then you get the desired answer. That's better. It took me a while to understand the logic behind the way the ceiling error is corrected. The same pattern is also followed by the other discrete distributions that define a _ppf method. It is cleaner then the epsilon correction, but takes longer to figure out what it does. To understand the logic more easily and to be DRY, it would be better to replace the duplication of the _cdf method directly with a call to self._cdf. For example, in changeset 4673, Robert, you changed the _cdf method to use betainc instead of nbdtr, but not the _ppf method. Without the code duplication, partial corrections could be more easily avoided. Is there a reason not to call self._cdf instead? A temporary workaround would be to add, in this case, at least 1e-10 to the cdf in the roundtrip to avoid the floating point ceiling error. >>> nbinom(.3,.15).ppf(nbinom(.3,.15).cdf(np.arange(20))+1e-8) array([ 1., 2., 3., 4., 5., 6., 7., 8., 9., 10., 11., 12., 13., 14., 15., 16., 17., 18., 19., 20.]) >>> nbinom(.3,.15).ppf(nbinom(.3,.15).cdf(np.arange(20))+1e-10) array([ 1., 2., 3., 4., 5., 5., 7., 8., 9., 10., 11., 12., 13., 14., 15., 16., 17., 18., 19., 20.]) >>> nbinom(.3,.15).ppf(nbinom(.3,.15).cdf(np.arange(20))+1e-11) array([ 1., 2., 2., 4., 5., 5., 7., 8., 9., 10., 11., 12., 13., 14., 15., 16., 17., 18., 19., 20.]) Josef > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless > enigma that is made terrible by our own mad attempt to interpret it as > though it had an underlying truth." > ?-- Umberto Eco > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From pgmdevlist at gmail.com Tue Jul 28 00:35:59 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Tue, 28 Jul 2009 00:35:59 -0400 Subject: [SciPy-dev] special.nbdtrik Message-ID: All, I'm trying to find the sources of special.nbdtrik, but no avail so far. 
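For context, this is the round-trip I am trying to sort out, condensed from the example earlier in this thread:

import numpy as np
from scipy.stats import nbinom

# ppf(cdf(k)) should give k back for a discrete distribution, but
# floating point error around the cdf steps pushes some values up
k = np.arange(20)
print(nbinom(.3, .15).ppf(nbinom(.3, .15).cdf(k)))

With the current ceil-based _ppf, several of the returned values come back one integer too high instead of reproducing np.arange(20).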
Any help would be greatly appreciated. P. From robert.kern at gmail.com Tue Jul 28 00:40:43 2009 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 27 Jul 2009 23:40:43 -0500 Subject: [SciPy-dev] special.nbdtrik In-Reply-To: References: Message-ID: <3d375d730907272140j34e812fct45f299050de9f42e@mail.gmail.com> On Mon, Jul 27, 2009 at 23:35, Pierre GM wrote: > All, > I'm trying to find the sources of special.nbdtrik, but no avail so far. scipy/special/cdflib/cdfnbn.f -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From pgmdevlist at gmail.com Tue Jul 28 00:51:47 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Tue, 28 Jul 2009 00:51:47 -0400 Subject: [SciPy-dev] special.nbdtrik In-Reply-To: <3d375d730907272140j34e812fct45f299050de9f42e@mail.gmail.com> References: <3d375d730907272140j34e812fct45f299050de9f42e@mail.gmail.com> Message-ID: <3D0F3093-4776-4B7A-8B3F-C621EA6FCA0C@gmail.com> On Jul 28, 2009, at 12:40 AM, Robert Kern wrote: > > scipy/special/cdflib/cdfnbn.f Thanks a million! P. From robert.kern at gmail.com Tue Jul 28 12:56:41 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Jul 2009 11:56:41 -0500 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <1cd32cbb0907271811x411a32dbs4f11bf2f220f6781@mail.gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> <1cd32cbb0907271610w6986b256me9730f2ee3c436c5@mail.gmail.com> <3d375d730907271621m4aabe914l1536fd6a49da0ee5@mail.gmail.com> <1cd32cbb0907271811x411a32dbs4f11bf2f220f6781@mail.gmail.com> Message-ID: <3d375d730907280956m5425d997u5b37d8a355047439@mail.gmail.com> On Mon, Jul 27, 2009 at 20:11, wrote: > That's better. It took me a while to understand the logic behind the > way the ceiling error is corrected. The same pattern is also followed > by the other discrete distributions that define a _ppf method. It is > cleaner then the epsilon correction, but takes longer to figure out > what it does. > > To understand the logic more easily and to be DRY, it would be better > to replace the duplication of the _cdf method directly with a call to > self._cdf. > For example, in changeset 4673, Robert, you changed the _cdf method to > use betainc instead of nbdtr, but not the _ppf method. Without the > code duplication, partial corrections could be more easily avoided. > > Is there a reason not to call self._cdf instead? Nope. Go ahead. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From d_l_goldsmith at yahoo.com Tue Jul 28 15:24:17 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 12:24:17 -0700 (PDT) Subject: [SciPy-dev] random.set_state also in need of EXPERT attention Message-ID: <82640.89124.qm@web52111.mail.re2.yahoo.com> Hi, folks! Ralf originally requested expert attention for this in its Discussion section; I over-confidently said I thought I could handle it; after too many hours of unproductive research (I seem to have found that there's a lot of "politics" in the field of random number generators? 
Despite widespread use: http://en.wikipedia.org/wiki/Mersenne_twister, Matsumoto & Nishimura and this algorithm don't appear to have much recognition in certain circles, e.g., those under the influence of Marsaglia, e.g., notably, "Numerical Recipes." I understand that Marsaglia is critical of MT, but for "NR" to completely ignore an algorithm in such widespread use, well, out of curiosity, if anyone knows, "what gives"?) I give up. DG From pav+sp at iki.fi Tue Jul 28 15:27:44 2009 From: pav+sp at iki.fi (Pauli Virtanen) Date: Tue, 28 Jul 2009 19:27:44 +0000 (UTC) Subject: [SciPy-dev] Scipy.org down Message-ID: Scipy.org website seems to be down. Port 80 is open, but GET yields no response. -- Pauli Virtanen From robert.kern at gmail.com Tue Jul 28 15:32:30 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Jul 2009 14:32:30 -0500 Subject: [SciPy-dev] random.set_state also in need of EXPERT attention In-Reply-To: <82640.89124.qm@web52111.mail.re2.yahoo.com> References: <82640.89124.qm@web52111.mail.re2.yahoo.com> Message-ID: <3d375d730907281232kb58f93cq652a4bf56074ee45@mail.gmail.com> On Tue, Jul 28, 2009 at 14:24, David Goldsmith wrote: > > Hi, folks! ?Ralf originally requested expert attention for this in its Discussion section; I over-confidently said I thought I could handle it; after too many hours of unproductive research (I seem to have found that there's a lot of "politics" in the field of random number generators? ?Despite widespread use: http://en.wikipedia.org/wiki/Mersenne_twister, Matsumoto & Nishimura and this algorithm don't appear to have much recognition in certain circles, e.g., those under the influence of Marsaglia, e.g., notably, "Numerical Recipes." ?I understand that Marsaglia is critical of MT, but for "NR" to completely ignore an algorithm in such widespread use, well, out of curiosity, if anyone knows, "what gives"?) I give up. NR is old and stodgy. They don't cover the state-of-the-art for many arts. Ignore them. What "expert attention" is required? -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From robert.kern at gmail.com Tue Jul 28 15:49:42 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Jul 2009 14:49:42 -0500 Subject: [SciPy-dev] Scipy.org down In-Reply-To: References: Message-ID: <3d375d730907281249w5aa2cf75p1633c969a9e6fa93@mail.gmail.com> On Tue, Jul 28, 2009 at 14:27, Pauli Virtanen wrote: > Scipy.org website seems to be down. Port 80 is open, but GET > yields no response. The process has been restarted. Thank you. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From josef.pktd at gmail.com Tue Jul 28 16:06:56 2009 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Tue, 28 Jul 2009 16:06:56 -0400 Subject: [SciPy-dev] docs and doceditor question Message-ID: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com> what is the source for http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.UnivariateSpline.html ? It's the kind of class documentation that I wanted but I don't see how it is generated. Is it autogenerated or are there specific directives somewhere to specify it. another question: Are the scipy docs supposed to build on Windows. 
I managed some months ago, but it seems I will have to hunt down again posix specific commands for several hours, to build it. I tried briefly, but without the latest sphinx. Thanks, Josef From robert.kern at gmail.com Tue Jul 28 16:10:50 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Jul 2009 15:10:50 -0500 Subject: [SciPy-dev] docs and doceditor question In-Reply-To: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com> References: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com> Message-ID: <3d375d730907281310j553effa0k40048d5520243e39@mail.gmail.com> On Tue, Jul 28, 2009 at 15:06, wrote: > what is the source for > http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.UnivariateSpline.html > ? > > It's the kind of class documentation that I wanted but I don't see how > it is generated. Is it autogenerated or are there specific directives > somewhere to specify it. Click "Edit this Page" in the left-hand side bar. http://docs.scipy.org/scipy/docs/scipy.interpolate.fitpack2.UnivariateSpline/ -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From d_l_goldsmith at yahoo.com Tue Jul 28 16:16:18 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 13:16:18 -0700 (PDT) Subject: [SciPy-dev] random.set_state also in need of EXPERT attention Message-ID: <758501.4470.qm@web52108.mail.re2.yahoo.com> --- On Tue, 7/28/09, Robert Kern wrote: > Goldsmith > wrote: > > > NR is old and stodgy. They don't cover the state-of-the-art > for many arts. Ignore them. Understood. (Still, I'm looking at the 2007 edition - very disappointing. Well, at least I know now not to invest my money in a copy for myself.) > What "expert attention" is required? Quoting Ralf from the Discussion section: "[random.set_state] needs some expert explanation of what the tuple elements mean, and an example of how to use this function to modify the generator state in a meaningful way. [later] [Matsumoto/Nishimura, 1997] is reasonably clear, there's a link to it on the Wikipedia page. I think it will only give you elements 2 and 3 though, 4 and 5 were added later in the implementation and I guess are for some of the distributions in the random module. [Perhaps irrelevant if one doesn't need to decipher the code to write an intelligible docstring] The other difficulty is the Pyrex/Cython source, a bit harder to read than plain Python." DG > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, > a harmless > enigma that is made terrible by our own mad attempt to > interpret it as > though it had an underlying truth." > ? 
-- Umberto Eco > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From josef.pktd at gmail.com Tue Jul 28 16:22:47 2009 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Tue, 28 Jul 2009 16:22:47 -0400 Subject: [SciPy-dev] docs and doceditor question In-Reply-To: <3d375d730907281310j553effa0k40048d5520243e39@mail.gmail.com> References: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com> <3d375d730907281310j553effa0k40048d5520243e39@mail.gmail.com> Message-ID: <1cd32cbb0907281322h78aa8e1dwff4400fe9d7fa069@mail.gmail.com> On Tue, Jul 28, 2009 at 4:10 PM, Robert Kern wrote: > On Tue, Jul 28, 2009 at 15:06, wrote: >> what is the source for >> http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.UnivariateSpline.html >> ? >> >> It's the kind of class documentation that I wanted but I don't see how >> it is generated. Is it autogenerated or are there specific directives >> somewhere to specify it. > > Click "Edit this Page" in the left-hand side bar. > > http://docs.scipy.org/scipy/docs/scipy.interpolate.fitpack2.UnivariateSpline/ > Thanks, I forgot about these links. However "edit this" only leads to the docstrings. When I follow "show source", I get this http://docs.scipy.org/doc/scipy/reference/_sources/generated/scipy.interpolate.UnivariateSpline.txt which defines which methods are included. It looks autogenerated, but I don't know why some classes have the methods listed and other classes don't. for example, show source of http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.Rbf.html has only __init__ listed. Josef > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless > enigma that is made terrible by our own mad attempt to interpret it as > though it had an underlying truth." > ?-- Umberto Eco > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From robert.kern at gmail.com Tue Jul 28 16:34:29 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Jul 2009 15:34:29 -0500 Subject: [SciPy-dev] random.set_state also in need of EXPERT attention In-Reply-To: <758501.4470.qm@web52108.mail.re2.yahoo.com> References: <758501.4470.qm@web52108.mail.re2.yahoo.com> Message-ID: <3d375d730907281334v2decff3al5bcff434311a665@mail.gmail.com> On Tue, Jul 28, 2009 at 15:16, David Goldsmith wrote: > > --- On Tue, 7/28/09, Robert Kern wrote: > >> Goldsmith >> wrote: >> > >> NR is old and stodgy. They don't cover the state-of-the-art >> for many arts. Ignore them. > > Understood. ?(Still, I'm looking at the 2007 edition - very disappointing. ?Well, at least I know now not to invest my money in a copy for myself.) > >> What "expert attention" is required? > > Quoting Ralf from the Discussion section: "[random.set_state] needs some expert explanation of what the tuple elements mean, and an example of how to use this function to modify the generator state in a meaningful way. [later] [Matsumoto/Nishimura, 1997] is reasonably clear, there's a link to it on the Wikipedia page. I think it will only give you elements 2 and 3 though, 4 and 5 were added later in the implementation and I guess are for some of the distributions in the random module. 
[Perhaps irrelevant if one doesn't need to decipher the code to write an intelligible docstring] The other difficulty is the Pyrex/Cython source, a bit harder to read than plain Python." What is wrong with the current docstring's explanation of the tuple elements? If anything, there should be less information. People should be treating it as an opaque object unless if they are in an unusual circumstance like trying to match an MT implementation in another language. In such a circumstance, they really need to go source diving and understand the code. No amount of documentation is suitable for that. But basically, element 4 is a boolean flag (encoded as an int) for whether element 5 is a cached value for the Gaussian distribution or not. The Gaussian generator generates two values at a time, so we return one and cache one and turn on the flag. The next call returns the cached value and turns off the flag. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From d_l_goldsmith at yahoo.com Tue Jul 28 16:36:26 2009 From: d_l_goldsmith at yahoo.com (d_l_goldsmith at yahoo.com) Date: Tue, 28 Jul 2009 13:36:26 -0700 (PDT) Subject: [SciPy-dev] Milestones status update Message-ID: <120501.30388.qm@web52110.mail.re2.yahoo.com> Un-led categories not near "Goal met": "Masked Arrays," "Masked Arrays IV," "Operations on Masks," "Even More MA Functions, I and II," "Other Math," "Other Array Subclasses," "Numpy Internals," and "C-Types" (these last two are noted as (probably) needing Expert attention). Led categories still needing work: "Financial Functions," (> 50% GM) "Random Number Generation," (> 57% GM) "Masked Arrays III" (> 80% GM). Objects & namespaces w/ "white" un-categorized attributes: broadcast, arrayprint (one), chararray (many!), matrix (many!), memmap, multiarray, numeric (Expert only?), recarray, record, scalarmath, umath, ctypeslib (one, Expert only!), distutils (many, Expert only!), doc, dtype (one), fftpack_lite, generic (many!), arrayterator, scimath, shape_base (one), stride_tricks, utils, testing (decorators, nosetester, utils), ufunc (one), void DG From robert.kern at gmail.com Tue Jul 28 16:38:52 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Jul 2009 15:38:52 -0500 Subject: [SciPy-dev] docs and doceditor question In-Reply-To: <1cd32cbb0907281322h78aa8e1dwff4400fe9d7fa069@mail.gmail.com> References: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com> <3d375d730907281310j553effa0k40048d5520243e39@mail.gmail.com> <1cd32cbb0907281322h78aa8e1dwff4400fe9d7fa069@mail.gmail.com> Message-ID: <3d375d730907281338le8fe6e6w1ff33b75b075fe17@mail.gmail.com> On Tue, Jul 28, 2009 at 15:22, wrote: > On Tue, Jul 28, 2009 at 4:10 PM, Robert Kern wrote: >> On Tue, Jul 28, 2009 at 15:06, wrote: >>> what is the source for >>> http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.UnivariateSpline.html >>> ? >>> >>> It's the kind of class documentation that I wanted but I don't see how >>> it is generated. Is it autogenerated or are there specific directives >>> somewhere to specify it. >> >> Click "Edit this Page" in the left-hand side bar. >> >> http://docs.scipy.org/scipy/docs/scipy.interpolate.fitpack2.UnivariateSpline/ >> > > Thanks, I forgot about these links. However "edit this" only leads to > the docstrings. 
When I follow "show source", I get this > http://docs.scipy.org/doc/scipy/reference/_sources/generated/scipy.interpolate.UnivariateSpline.txt > > which defines which methods are included. It looks autogenerated, but > I don't know why some classes have the methods listed and other > classes don't. > > for example, show source of > http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.Rbf.html > has only __init__ listed. Possibly because UnivariateSpline has this in its docstring: """ Methods ------- __call__ get_knots get_coeffs get_residual integral derivatives roots set_smoothing_factor """ while Rbf doesn't. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From d_l_goldsmith at yahoo.com Tue Jul 28 16:40:19 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 13:40:19 -0700 (PDT) Subject: [SciPy-dev] random.set_state also in need of EXPERT attention Message-ID: <647504.96116.qm@web52104.mail.re2.yahoo.com> Okee-dokee, thanks! DG --- On Tue, 7/28/09, Robert Kern wrote: > From: Robert Kern > Subject: Re: [SciPy-dev] random.set_state also in need of EXPERT attention > To: "SciPy Developers List" > Date: Tuesday, July 28, 2009, 1:34 PM > On Tue, Jul 28, 2009 at 15:16, David > Goldsmith > wrote: > > > > --- On Tue, 7/28/09, Robert Kern > wrote: > > > >> Goldsmith > >> wrote: > >> > > >> NR is old and stodgy. They don't cover the > state-of-the-art > >> for many arts. Ignore them. > > > > Understood. ?(Still, I'm looking at the 2007 edition > - very disappointing. ?Well, at least I know now not to > invest my money in a copy for myself.) > > > >> What "expert attention" is required? > > > > Quoting Ralf from the Discussion section: > "[random.set_state] needs some expert explanation of what > the tuple elements mean, and an example of how to use this > function to modify the generator state in a meaningful way. > [later] [Matsumoto/Nishimura, 1997] is reasonably clear, > there's a link to it on the Wikipedia page. I think it will > only give you elements 2 and 3 though, 4 and 5 were added > later in the implementation and I guess are for some of the > distributions in the random module. [Perhaps irrelevant if > one doesn't need to decipher the code to write an > intelligible docstring] The other difficulty is the > Pyrex/Cython source, a bit harder to read than plain > Python." > > What is wrong with the current docstring's explanation of > the tuple > elements? If anything, there should be less information. > People should > be treating it as an opaque object unless if they are in an > unusual > circumstance like trying to match an MT implementation in > another > language. In such a circumstance, they really need to go > source diving > and understand the code. No amount of documentation is > suitable for > that. > > But basically, element 4 is a boolean flag (encoded as an > int) for > whether element 5 is a cached value for the Gaussian > distribution or > not. The Gaussian generator generates two values at a time, > so we > return one and cache one and turn on the flag. The next > call returns > the cached value and turns off the flag. > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, > a harmless > enigma that is made terrible by our own mad attempt to > interpret it as > though it had an underlying truth." > ? 
-- Umberto Eco > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From pgmdevlist at gmail.com Tue Jul 28 17:02:02 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Tue, 28 Jul 2009 17:02:02 -0400 Subject: [SciPy-dev] Milestones status update In-Reply-To: <120501.30388.qm@web52110.mail.re2.yahoo.com> References: <120501.30388.qm@web52110.mail.re2.yahoo.com> Message-ID: <5E2FD68B-E255-40C9-AA45-363644BA031B@gmail.com> On Jul 28, 2009, at 4:36 PM, d_l_goldsmith at yahoo.com wrote: > > Un-led categories not near "Goal met": > > "Masked Arrays," "Masked Arrays IV," "Operations on Masks," "Even > More MA Functions, I and II," "Other Math," "Other Array > Subclasses," "Numpy Internals," and "C-Types" (these last two are > noted as (probably) needing Expert attention). Uh-oh, looks like I'm in trouble... From d_l_goldsmith at yahoo.com Tue Jul 28 17:35:51 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 14:35:51 -0700 (PDT) Subject: [SciPy-dev] Milestones status update Message-ID: <102676.35302.qm@web52104.mail.re2.yahoo.com> Why you? DG --- On Tue, 7/28/09, Pierre GM wrote: > From: Pierre GM > Subject: Re: [SciPy-dev] Milestones status update > To: "SciPy Developers List" > Date: Tuesday, July 28, 2009, 2:02 PM > > On Jul 28, 2009, at 4:36 PM, d_l_goldsmith at yahoo.com > wrote: > > > > > Un-led categories not near "Goal met": > > > > "Masked Arrays," "Masked Arrays IV," "Operations on > Masks," "Even? > > More MA Functions, I and II," "Other Math," "Other > Array? > > Subclasses," "Numpy Internals," and "C-Types" (these > last two are? > > noted as (probably) needing Expert attention). > > Uh-oh, looks like I'm in trouble... > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From pav+sp at iki.fi Tue Jul 28 17:44:07 2009 From: pav+sp at iki.fi (Pauli Virtanen) Date: Tue, 28 Jul 2009 21:44:07 +0000 (UTC) Subject: [SciPy-dev] random.set_state also in need of EXPERT attention References: <82640.89124.qm@web52111.mail.re2.yahoo.com> Message-ID: (Disclaimer: I'm not a PRNG expert.) On 2009-07-28, David Goldsmith wrote: > Hi, folks! Ralf originally requested expert attention for this > in its Discussion section; I over-confidently said I thought I > could handle it; after too many hours of unproductive research > (I seem to have found that there's a lot of "politics" in the > field of random number generators? Looks like science as usual to me :) I think the natural explanation is that everyone proposing a new PRNG of course wants to advertise its merits. Also, as there is no clear-cut way to quantify the "randomness" of a PRNG and because of speed tradeoffs, there is something to argue about and many different alternatives have been proposed. > Despite widespread use: > http://en.wikipedia.org/wiki/Mersenne_twister, Matsumoto & > Nishimura and this algorithm don't appear to have much > recognition in certain circles, e.g., those under the influence > of Marsaglia, e.g., notably, "Numerical Recipes." The second edition of Numerical recipes was written in 1992, so that explains why it's not there. If you Google it, for the third edition the authors of NR respond [1] that they didn't include MT because it "has just too many operations per random value generated". This is of course understandable in a book that aims to give a focused introduction on the subject. 
Whether it reflects the merits of the algorithm is then a different question. .. [1] http://www.nr.com/forum/showthread.php?t=1724 > I understand that Marsaglia is critical of MT, but for "NR" to > completely ignore an algorithm in such widespread use, well, > out of curiosity, if anyone knows, "what gives"?) As I see it, the statements sourced in the Wikipedia article are fairly mild, criticising mostly the complexity of the algorithm. Also, they were not backed by anything, and probably should be taken with a grain of salt. Marsaglia seems to have proposed another types of PRNGs in 2003, but these had flaws, which maybe were addressed by Brent later on. (Cf. [2] and follow the references.) The MT article is widely cited (123 citations as reported by ACM, 1346 by Google Scholar), and sampling some of the review-type ones (eg. [3,4]) the generator seems to have done reasonably in various randomness tests and also be reasonable speed-wise. Perhaps the algorithm is not optimal -- after all, it's already more than ten years old -- but it appears to be well tested and understood. .. [2] http://wwwmaths.anu.edu.au/~brent/random.html .. [3] http://dx.doi.org/10.1016/j.csda.2006.05.019 .. [4] http://www.iro.umontreal.ca/~simardr/testu01/tu01.html -- Pauli Virtanen From pav+sp at iki.fi Tue Jul 28 17:50:38 2009 From: pav+sp at iki.fi (Pauli Virtanen) Date: Tue, 28 Jul 2009 21:50:38 +0000 (UTC) Subject: [SciPy-dev] docs and doceditor question References: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com> <3d375d730907281310j553effa0k40048d5520243e39@mail.gmail.com> <1cd32cbb0907281322h78aa8e1dwff4400fe9d7fa069@mail.gmail.com> <3d375d730907281338le8fe6e6w1ff33b75b075fe17@mail.gmail.com> Message-ID: On 2009-07-28, Robert Kern wrote: > On Tue, Jul 28, 2009 at 15:22, wrote: [clip] >> which defines which methods are included. It looks autogenerated, but >> I don't know why some classes have the methods listed and other >> classes don't. >> >> for example, show source of >> http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.Rbf.html >> has only __init__ listed. > > Possibly because UnivariateSpline has this in its docstring: > > """ > Methods > ------- [clip] > """ > > while Rbf doesn't. Exactly, the list of methods shown is based on the docstring, as discussed in our docstring standard. Unfortunately, it's not possible to decide automatically which methods are relevant and which are not. Re: building documentation on Windows. If you want to lend a hand in making it work (eg. a suitable .bat file), contributions are welcome. If you want to just get it to work, the easiest way is probably to use the latest development version of Sphinx: then just sphinx-build should be enough. -- Pauli Virtanen From pav+sp at iki.fi Tue Jul 28 18:01:43 2009 From: pav+sp at iki.fi (Pauli Virtanen) Date: Tue, 28 Jul 2009 22:01:43 +0000 (UTC) Subject: [SciPy-dev] random.set_state also in need of EXPERT attention References: <82640.89124.qm@web52111.mail.re2.yahoo.com> Message-ID: On 2009-07-28, Pauli Virtanen wrote: [clip] > Also, they were not backed by anything, and probably should be > taken with a grain of salt. Agh, forget this sentence, I read the messages too fast -- these statements were supported by some discussion. 
-- Pauli Virtanen From d_l_goldsmith at yahoo.com Tue Jul 28 18:10:31 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 15:10:31 -0700 (PDT) Subject: [SciPy-dev] random.set_state also in need of EXPERT attention Message-ID: <688863.82952.qm@web52103.mail.re2.yahoo.com> Awesome, Pauli, thanks! DG --- On Tue, 7/28/09, Pauli Virtanen wrote: > From: Pauli Virtanen > Subject: Re: [SciPy-dev] random.set_state also in need of EXPERT attention > To: scipy-dev at scipy.org > Date: Tuesday, July 28, 2009, 2:44 PM > > (Disclaimer: I'm not a PRNG expert.) > > On 2009-07-28, David Goldsmith > wrote: > > Hi, folks!? Ralf originally requested expert > attention for this > > in its Discussion section; I over-confidently said I > thought I > > could handle it; after too many hours of unproductive > research > > (I seem to have found that there's a lot of "politics" > in the > > field of random number generators? > > Looks like science as usual to me :) I think the natural > explanation is that everyone proposing a new PRNG of course > wants > to advertise its merits. Also, as there is no clear-cut way > to > quantify the "randomness" of a PRNG and because of speed > tradeoffs, there is something to argue about and many > different > alternatives have been proposed. > > > Despite widespread use: > > http://en.wikipedia.org/wiki/Mersenne_twister, > Matsumoto & > > Nishimura and this algorithm don't appear to have much > > > recognition in certain circles, e.g., those under the > influence > > of Marsaglia, e.g., notably, "Numerical Recipes." > > The second edition of Numerical recipes was written in > 1992, so > that explains why it's not there. > > If you Google it, for the third edition the authors of NR > respond > [1] that they didn't include MT because it "has just too > many > operations per random value generated". This is of course > understandable in a book that aims to give a focused > introduction > on the subject. Whether it reflects the merits of the > algorithm > is then a different question. > > .. [1] http://www.nr.com/forum/showthread.php?t=1724 > > > I understand that Marsaglia is critical of MT, but for > "NR" to > > completely ignore an algorithm in such widespread use, > well, > > out of curiosity, if anyone knows, "what gives"?) > > As I see it, the statements sourced in the Wikipedia > article are > fairly mild, criticising mostly the complexity of the > algorithm. > Also, they were not backed by anything, and probably should > be > taken with a grain of salt. > > Marsaglia seems to have proposed another types of PRNGs in > 2003, > but these had flaws, which maybe were addressed by Brent > later > on. (Cf. [2] and follow the references.) > > The MT article is widely cited (123 citations as reported > by ACM, > 1346 by Google Scholar), and sampling some of the > review-type > ones (eg. [3,4]) the generator seems to have done > reasonably in > various randomness tests and also be reasonable speed-wise. > > Perhaps the algorithm is not optimal -- after all, it's > already > more than ten years old -- but it appears to be well tested > and > understood. > > .. [2] http://wwwmaths.anu.edu.au/~brent/random.html > .. [3] http://dx.doi.org/10.1016/j.csda.2006.05.019 > .. 
[4] http://www.iro.umontreal.ca/~simardr/testu01/tu01.html > > -- > Pauli Virtanen > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From d_l_goldsmith at yahoo.com Tue Jul 28 20:19:31 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 17:19:31 -0700 (PDT) Subject: [SciPy-dev] Bug? Message-ID: <738066.17428.qm@web52111.mail.re2.yahoo.com> >>> np.set_printoptions(precision=4) >>> np.lib.scimath.arctanh([1j]) array([ 0.+0.7854j]) so far, so good >>> np.lib.scimath.arctanh(1j) 0.78539816339744828j i.e., set_printoptions didn't affect the result when the input was a scalar. "Feature" or bug? DG From ralf.gommers at googlemail.com Tue Jul 28 20:58:10 2009 From: ralf.gommers at googlemail.com (Ralf Gommers) Date: Tue, 28 Jul 2009 20:58:10 -0400 Subject: [SciPy-dev] Bug? In-Reply-To: <738066.17428.qm@web52111.mail.re2.yahoo.com> References: <738066.17428.qm@web52111.mail.re2.yahoo.com> Message-ID: On Tue, Jul 28, 2009 at 8:19 PM, David Goldsmith wrote: > > >>> np.set_printoptions(precision=4) > >>> np.lib.scimath.arctanh([1j]) > array([ 0.+0.7854j]) > > so far, so good > > >>> np.lib.scimath.arctanh(1j) > 0.78539816339744828j > > i.e., set_printoptions didn't affect the result when the input was a > scalar. "Feature" or bug? A bug, reported last year in numpy ticket #817. Ralf > > > DG > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From d_l_goldsmith at yahoo.com Tue Jul 28 21:50:13 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 18:50:13 -0700 (PDT) Subject: [SciPy-dev] Bug? Message-ID: <700488.73022.qm@web52111.mail.re2.yahoo.com> Okee-dokee, thanks! DG --- On Tue, 7/28/09, Ralf Gommers wrote: > From: Ralf Gommers > Subject: Re: [SciPy-dev] Bug? > To: "SciPy Developers List" > Date: Tuesday, July 28, 2009, 5:58 PM > > > On Tue, Jul 28, 2009 at 8:19 PM, > David Goldsmith > wrote: > > > > >>> np.set_printoptions(precision=4) > > >>> np.lib.scimath.arctanh([1j]) > > array([ 0.+0.7854j]) > > > > so far, so good > > > > >>> np.lib.scimath.arctanh(1j) > > 0.78539816339744828j > > > > i.e., set_printoptions didn't affect the result when > the input was a scalar. ?"Feature" or > bug? > A bug, reported last year in numpy ticket #817. > > Ralf > ? > > > > > > DG > > > > > > > > _______________________________________________ > > Scipy-dev mailing list > > Scipy-dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > -----Inline Attachment Follows----- > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From d_l_goldsmith at yahoo.com Tue Jul 28 21:54:41 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 18:54:41 -0700 (PDT) Subject: [SciPy-dev] Nuther update and no Skypecon this week? Message-ID: <104702.43298.qm@web52102.mail.re2.yahoo.com> "Other Math" is now "Goal Met!" Recall (from a previous post to this list) that I can't do the Skypecon tomorrow; no one responded to my request for suggestions of other times, so...I'm assuming no one wants to have one this week (either). 
DG From jsseabold at gmail.com Wed Jul 29 10:04:49 2009 From: jsseabold at gmail.com (Skipper Seabold) Date: Wed, 29 Jul 2009 10:04:49 -0400 Subject: [SciPy-dev] Scipy.org down In-Reply-To: <3d375d730907281249w5aa2cf75p1633c969a9e6fa93@mail.gmail.com> References: <3d375d730907281249w5aa2cf75p1633c969a9e6fa93@mail.gmail.com> Message-ID: On Tue, Jul 28, 2009 at 3:49 PM, Robert Kern wrote: > On Tue, Jul 28, 2009 at 14:27, Pauli Virtanen wrote: >> Scipy.org website seems to be down. Port 80 is open, but GET >> yields no response. > > The process has been restarted. Thank you. Came to my attention from another ML that the site appears to be down again. Skipper From stefan at sun.ac.za Thu Jul 30 05:23:11 2009 From: stefan at sun.ac.za (=?ISO-8859-1?Q?St=E9fan_van_der_Walt?=) Date: Thu, 30 Jul 2009 11:23:11 +0200 Subject: [SciPy-dev] Agenda items for the next steering committee meeting Message-ID: <9457e7c80907300223i3257cd41s9e05ff4227399d23@mail.gmail.com> Hi all, We are soliciting agenda items for the SciPy Steering Committee meeting taking place during the SciPy09 conference. Please view the related post on the numpy-discussion list and leave any comments there: http://thread.gmane.org/gmane.comp.python.numeric.general/31845 Kind regards St?fan From jh at physics.ucf.edu Fri Jul 31 13:06:37 2009 From: jh at physics.ucf.edu (Joe Harrington) Date: Fri, 31 Jul 2009 13:06:37 -0400 Subject: [SciPy-dev] SciPy Foundation Message-ID: About sixteen months ago, I launched the SciPy Documentation Project and its Marathon. Dozens pitched in and now numpy docs are rapidly approaching a professional level. The "pink wave" ("Needs Review" status) is at 56% today! There is consensus among doc writers that much of the rest can be labeled in the "unimportant" category, so we're close to starting the review push (hold your fire, there is a web site mod to be done first). We're also nearing the end of the summer, and it's time to look ahead. The path for docs is clear, but the path for SciPy is not. I think our weakest area right now is organization of the project. There is no consensus-based plan for improvement of the whole toward a stated goal, no centralized coordination of work, and no funded work focused on many of our weaknesses, notwithstanding my doc effort and what Enthought does for code. I define success as popular adoption in preference to commercial packages. I believe in vote-with-your-feet: this goal will not be reached until all aspects of the package and its presentation to the world exceed those of our commercial competition. Scipy is now a grass roots effort, but that takes it only so far. Other projects, such as OpenOffice and Sage, don't follow this model and do produce quality products that compete with commercial offerings, at least on open-source platforms. Before we can even hope for that, we have to do the following: - Docs - Rest of numpy reference pages reviewed and proofed or marked unimportant - Scipy reference pages - User manual for the whole toolstack - Multiple commercial books - Packaging - Personal Package Archive or equivalent for every release of every OS for the full toolstack (There are tools that do this but we don't use them. NSF requires Metronome - http://nmi.cs.wisc.edu/ - for funding most development grants, so right now we're not even on NSF's radar.) 
- Track record of having the whole toolstack installation "just work" in a few command lines or clicks for *everyone* - Regular, scheduled releases of numpy and scipy - Coordinated releases of numpy, scipy, and stable scikits into PPA system - Public communication - A real marketing plan - Executing on that plan - Web site geared toward multiple audiences, run by experts at that kind of communication - More webinars, conference booths, training, aimed at all levels - Demos, testimonials, topical forums, all showcased - Code - A full design review for numpy 2.0 - No more inconsistencies like median(), lacking "out", degrees option for angle functions? - Trimming of financial functions, maybe others, from numpy? - Package structure review (eliminate "fromnumeric"?) - Goal that this be the last breakage for numpy API (the real 1.0) - Scipy - Is it maintainable? should it be broken up? - Clear code addition path (or decide never to add more) - Docs (see above) - Add-on packages - Both existence of and good indexing/integration/support for field-specific packages - Clearer development path for new packages - Central hosting system for packages (svn, mailing lists, web, build integration, etc.) - Simultaneous releases of stable packages along with numpy/scipy I posted a basic improvement plan some years back. The core ideas have not changed; it is linked from the bottom of http://scipy.org/Developer_Zone. I chose our major weakness to begin with and started the doc project, using some money I could justify spending simply for the utility of docs for my own research. I funded the work of two doc coordinators, one each this summer and last. Looking at http://docs.scipy.org/numpy/stats/, you can see that when a doc coordinator was being paid (summers), work got done. When not, then not. Without publicly announcing what these guys made, I'll be the first to admit that it wasn't a lot. Yet, those small sums bought a huge contribution to numpy through the work of several dozen volunteers and the major contributions of a few. My conclusion is that active and constant coordination is central to motivating volunteer work, and that without a salary we cannot depend on coordination remaining active. On the other hand, I have heard Enthought's leaders bemoan the high cost of devoting employee time to this project, and the low returns available from selling support to universities and non-profit research institutes. Their leadership has moved us forward, particularly in the area of code, but has not provided the momentum necessary to carry us forward on all fronts. It is time for the public and education sectors to kick in some resources and organizational leadership. We are, after all, benefitting immensely. Since the cost of employee time is not so high for us in the public and education sectors, I propose to continue hiring people like Stefan and David as UCF employees or contractors, and to expand to hiring others in areas like packaging and marketing, provided that funding for those hires can be found. However, my grant situation is no longer as rich as it has been the past two years, and the needs going forward are greater than in the past if we're now to tackle all the points above. So, I will not be hiring another doc guru from my research grants next year. I am confident that others are willing to pitch in financially, but few will pitch in a full FTE, and we need several. We can (and will) set up a donations site, but donation sites tend to receive pizza money unless a sugar daddy comes along. 
Those benefitting most from the software, notably education, non-profit research, and government institutions, are *forbidden* from making donations by the terms of their grants. NSF doesn't give you money so you can give it away. We need to provide services they can buy on subcontract and a means for handling payments from them. Selling support does not solve the problem, as that requires spending most of the income on servicing that particular client. Rather, we need to sell a chunk of documentation or the packaging of a particular release, and then provide the product not just to that client but to everyone. We can also propose directly for federal and corporate grant funds. I have spoken with several NASA and NSF program managers and with Google's Federal Accounts Representative, and the possibilities for funding are good. But, I am not going to do this alone. We need a strong proposal team to be credible. So, I am seeking a group that is willing to work with me to put up the infrastructure of a funded project, to write grant proposals, and to coordinate a financial effort. Members of this group must have a track record of funded grants, business success, foundation support, etc. We might call it the SciPy Foundation. It could be based at UCF, which has a low overhead rate and has infrastructure (like an HR staff), or it might be independent if we can find a good director willing to devote significant time for relatively low pay compared to what they can likely make elsewhere. I would envision hiring permanent coordinators for docs, packaging, and marketing communications. Enthought appears to have code covered by virtue of having hired Travis, Robert, etc.; how to integrate that with this effort is an open question but not a difficult one, I think, as code is our strongest asset at this point. I invite discussion of this approach and the task list above on the scipy-dev at scipy.org mailing list. If you are seeing this post elsewhere, please reply only on scipy-dev at scipy.org. If you are eligible to lead funding proposals and are interested in participating in grant writing and management activities related to work in our weak areas, please contact me directly. Thanks, --jh-- Prof. Joseph Harrington Planetary Sciences Group Department of Physics MAP 414 4000 Central Florida Blvd. University of Central Florida Orlando, FL 32816-2385 jh at physics.ucf.edu planets.ucf.edu From robert.kern at gmail.com Fri Jul 31 15:27:14 2009 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 31 Jul 2009 14:27:14 -0500 Subject: [SciPy-dev] [SciPy-User] SciPy Foundation In-Reply-To: References: Message-ID: <3d375d730907311227l52282e91mb244f00f4e6eab7a@mail.gmail.com> On Fri, Jul 31, 2009 at 12:06, Joe Harrington wrote: >?Enthought appears to have code covered by virtue of > having hired Travis, Robert, etc.; Eh, what? We work on numpy and scipy in our spare time, just like everyone else. There are rare occasions when a client wants to fund a particular feature, or we need to fix a bug in the course of our work, but that's a far cry from having "code covered". -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From d_l_goldsmith at yahoo.com Fri Jul 31 15:37:58 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Fri, 31 Jul 2009 12:37:58 -0700 (PDT) Subject: [SciPy-dev] [SciPy-User] SciPy Foundation In-Reply-To: <3d375d730907311227l52282e91mb244f00f4e6eab7a@mail.gmail.com> Message-ID: <995066.5244.qm@web52109.mail.re2.yahoo.com> Interesting... Now I'm curious to know how many others thought Enthought employees were paid to "keep the code covered"? DG --- On Fri, 7/31/09, Robert Kern wrote: > From: Robert Kern > Subject: Re: [SciPy-dev] [SciPy-User] SciPy Foundation > To: jh at physics.ucf.edu, scipy-dev at scipy.org, "SciPy Users List" > Date: Friday, July 31, 2009, 12:27 PM > On Fri, Jul 31, 2009 at 12:06, Joe > Harrington > wrote: > >?Enthought appears to have code covered by virtue of > > having hired Travis, Robert, etc.; > > Eh, what? We work on numpy and scipy in our spare time, > just like > everyone else. There are rare occasions when a client wants > to fund a > particular feature, or we need to fix a bug in the course > of our work, > but that's a far cry from having "code covered". > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, > a harmless > enigma that is made terrible by our own mad attempt to > interpret it as > though it had an underlying truth." > ? -- Umberto Eco > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From charlesr.harris at gmail.com Fri Jul 31 15:59:41 2009 From: charlesr.harris at gmail.com (Charles R Harris) Date: Fri, 31 Jul 2009 13:59:41 -0600 Subject: [SciPy-dev] [SciPy-User] SciPy Foundation In-Reply-To: <995066.5244.qm@web52109.mail.re2.yahoo.com> References: <3d375d730907311227l52282e91mb244f00f4e6eab7a@mail.gmail.com> <995066.5244.qm@web52109.mail.re2.yahoo.com> Message-ID: On Fri, Jul 31, 2009 at 1:37 PM, David Goldsmith wrote: > > Interesting... Now I'm curious to know how many others thought Enthought > employees were paid to "keep the code covered"? > I always figured they had to scramble to pay the bills. Making a small company go isn't easy-peasy. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From d_l_goldsmith at yahoo.com Fri Jul 31 16:13:26 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Fri, 31 Jul 2009 13:13:26 -0700 (PDT) Subject: [SciPy-dev] [SciPy-User] SciPy Foundation In-Reply-To: Message-ID: <236490.74937.qm@web52109.mail.re2.yahoo.com> Understood and agreed, but is your point that since code maintenance and generation would fall under the category of "capital," not "operations," consequently our default assumption as outsiders should be that they do not invest in it (except when "operations" necessitate)? DG --- On Fri, 7/31/09, Charles R Harris wrote: > From: Charles R Harris > Subject: Re: [SciPy-dev] [SciPy-User] SciPy Foundation > To: "SciPy Developers List" > Date: Friday, July 31, 2009, 12:59 PM > > > On Fri, Jul 31, 2009 at 1:37 PM, > David Goldsmith > wrote: > > > > Interesting... ?Now I'm curious to know how many > others thought Enthought employees were paid to "keep > the code covered"? > > > I always figured they had to scramble to pay the bills. > Making a small company go isn't easy-peasy. 
> > Chuck > > > > -----Inline Attachment Follows----- > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From charlesr.harris at gmail.com Fri Jul 31 16:49:09 2009 From: charlesr.harris at gmail.com (Charles R Harris) Date: Fri, 31 Jul 2009 14:49:09 -0600 Subject: [SciPy-dev] [SciPy-User] SciPy Foundation In-Reply-To: <236490.74937.qm@web52109.mail.re2.yahoo.com> References: <236490.74937.qm@web52109.mail.re2.yahoo.com> Message-ID: On Fri, Jul 31, 2009 at 2:13 PM, David Goldsmith wrote: > > Understood and agreed, but is your point that since code maintenance and > generation would fall under the category of "capital," not "operations," > consequently our default assumption as outsiders should be that they do not > invest in it (except when "operations" necessitate)? > I believe they host the svn servers and pay for the bandwidth, so that is a significant investment. They also hire folks from the community who, after all, need to make a living. As to direct investment in code development I think Robert covered it. But I don't know much about what Enthought does, so if you want a definitive statement you will need to ask them. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From jh at physics.ucf.edu Fri Jul 31 17:04:46 2009 From: jh at physics.ucf.edu (Joe Harrington) Date: Fri, 31 Jul 2009 17:04:46 -0400 Subject: [SciPy-dev] [SciPy-User] SciPy Foundation In-Reply-To: <3d375d730907311227l52282e91mb244f00f4e6eab7a@mail.gmail.com> (message from Robert Kern on Fri, 31 Jul 2009 14:27:14 -0500) References: <3d375d730907311227l52282e91mb244f00f4e6eab7a@mail.gmail.com> Message-ID: Robert wrote: > On Fri, Jul 31, 2009 at 12:06, Joe Harrington wrote: > >?Enthought appears to have code covered by virtue of > > having hired Travis, Robert, etc.; > Eh, what? We work on numpy and scipy in our spare time, just like > everyone else. There are rare occasions when a client wants to fund a > particular feature, or we need to fix a bug in the course of our work, > but that's a far cry from having "code covered". Then please accept my profusest apologies! Eric mentioned to me that Enthough had paid significantly for scipy development and I thought that meant a portion of developers' time. Perhaps this was just in the past. --jh-- From robert.kern at gmail.com Fri Jul 31 17:13:48 2009 From: robert.kern at gmail.com (Robert Kern) Date: Fri, 31 Jul 2009 16:13:48 -0500 Subject: [SciPy-dev] [SciPy-User] SciPy Foundation In-Reply-To: References: <3d375d730907311227l52282e91mb244f00f4e6eab7a@mail.gmail.com> Message-ID: <3d375d730907311413s3bdf7021n4a3056b3edc989b@mail.gmail.com> On Fri, Jul 31, 2009 at 16:04, Joe Harrington wrote: > Robert wrote: >> On Fri, Jul 31, 2009 at 12:06, Joe Harrington wrote: >> >?Enthought appears to have code covered by virtue of >> > having hired Travis, Robert, etc.; > >> Eh, what? We work on numpy and scipy in our spare time, just like >> everyone else. There are rare occasions when a client wants to fund a >> particular feature, or we need to fix a bug in the course of our work, >> but that's a far cry from having "code covered". > > Then please accept my profusest apologies! ?Eric mentioned to me that > Enthough had paid significantly for scipy development and I thought > that meant a portion of developers' time. ?Perhaps this was just in > the past. 
Still do; it's just not part of our daily duties and is usually focused on what we need, not general maintenance. Not to mention the infrastructure support. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From d_l_goldsmith at yahoo.com Mon Jul 27 15:08:01 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Mon, 27 Jul 2009 12:08:01 -0700 (PDT) Subject: [SciPy-dev] Need to reschedule the Summer Marathon weekly Skypecon Message-ID: <312410.64716.qm@web52111.mail.re2.yahoo.com> Hi, folks. I can't do Wed. 19:00 UTC this week; please suggest alternate times - if no one does, I'll assume that no one wants to have a Skypecon this week either. DG From pgmdevlist at gmail.com Mon Jul 27 15:47:44 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Mon, 27 Jul 2009 15:47:44 -0400 Subject: [SciPy-dev] nbinom.ppf Message-ID: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> All, I'm puzzled by this result: >>> from scipy.stats.distributions import nbinom >>> nbinom(.3,.15).ppf(nbinom(.3,.15).cdf(np.arange(20))) array([ 1., 2., 2., 3., 5., 5., 6., 8., 9., 10., 11., 12., 13., 14., 15., 16., 17., 18., 19., 20.]) I would have naturally expected np.arange(20). Using np.round instead of np.ceil in nbinom_gen._ppf seems to solve the issue. Is there any reason not to do it ? Thx in advance. P. From robert.kern at gmail.com Mon Jul 27 15:55:27 2009 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 27 Jul 2009 14:55:27 -0500 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> Message-ID: <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> On Mon, Jul 27, 2009 at 14:47, Pierre GM wrote: > All, > I'm puzzled by this result: > ?>>> from scipy.stats.distributions import nbinom > ?>>> ?nbinom(.3,.15).ppf(nbinom(.3,.15).cdf(np.arange(20))) > array([ ?1., ? 2., ? 2., ? 3., ? 5., ? 5., ? 6., ? 8., ? 9., ?10., ?11., > ? ? ? ? 12., ?13., ?14., ?15., ?16., ?17., ?18., ?19., ?20.]) > > I would have naturally expected np.arange(20). Floating point shenanigans. The CDF and PPF of discrete distributions are step functions. Round-tripping involves evaluating the PPF around the step. Naturally, floating point errors are going to nudge you to the left or right of that step. > Using np.round instead of np.ceil in nbinom_gen._ppf seems to solve > the issue. > Is there any reason not to do it ? ceil() is correct; round() is not. round() would be okay if the only inputs are expected to be outputs of the CDF, but one frequently needs the PPF to take all values in [0,1], like for example doing random number generation via inversion. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." 
-- Umberto Eco From josef.pktd at gmail.com Mon Jul 27 16:29:51 2009 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Mon, 27 Jul 2009 16:29:51 -0400 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> Message-ID: <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> On Mon, Jul 27, 2009 at 3:55 PM, Robert Kern wrote: > On Mon, Jul 27, 2009 at 14:47, Pierre GM wrote: >> All, >> I'm puzzled by this result: >> ?>>> from scipy.stats.distributions import nbinom >> ?>>> ?nbinom(.3,.15).ppf(nbinom(.3,.15).cdf(np.arange(20))) >> array([ ?1., ? 2., ? 2., ? 3., ? 5., ? 5., ? 6., ? 8., ? 9., ?10., ?11., >> ? ? ? ? 12., ?13., ?14., ?15., ?16., ?17., ?18., ?19., ?20.]) >> >> I would have naturally expected np.arange(20). > > Floating point shenanigans. The CDF and PPF of discrete distributions > are step functions. Round-tripping involves evaluating the PPF around > the step. Naturally, floating point errors are going to nudge you to > the left or right of that step. > >> Using np.round instead of np.ceil in nbinom_gen._ppf seems to solve >> the issue. >> Is there any reason not to do it ? > > ceil() is correct; round() is not. round() would be okay if the only > inputs are expected to be outputs of the CDF, but one frequently needs > the PPF to take all values in [0,1], like for example doing random > number generation via inversion. Do you think it would be useful to round if we are epsilon (?) close to the next integer? It is more likely that users have a case like Pierre's where the answer might be the closest integer, instead of an epsilon below. It would move the floating point error, but to an, in actual usage, less likely location. I think, it is in scipy.special where I saw some code that treats anything that is within 1e-8 (?) of an integer as the integer. Josef > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless > enigma that is made terrible by our own mad attempt to interpret it as > though it had an underlying truth." > ?-- Umberto Eco > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From robert.kern at gmail.com Mon Jul 27 16:58:44 2009 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 27 Jul 2009 15:58:44 -0500 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> Message-ID: <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> On Mon, Jul 27, 2009 at 15:29, wrote: > On Mon, Jul 27, 2009 at 3:55 PM, Robert Kern wrote: >> On Mon, Jul 27, 2009 at 14:47, Pierre GM wrote: >>> All, >>> I'm puzzled by this result: >>> ?>>> from scipy.stats.distributions import nbinom >>> ?>>> ?nbinom(.3,.15).ppf(nbinom(.3,.15).cdf(np.arange(20))) >>> array([ ?1., ? 2., ? 2., ? 3., ? 5., ? 5., ? 6., ? 8., ? 9., ?10., ?11., >>> ? ? ? ? 12., ?13., ?14., ?15., ?16., ?17., ?18., ?19., ?20.]) >>> >>> I would have naturally expected np.arange(20). >> >> Floating point shenanigans. The CDF and PPF of discrete distributions >> are step functions. Round-tripping involves evaluating the PPF around >> the step. 
Naturally, floating point errors are going to nudge you to >> the left or right of that step. >> >>> Using np.round instead of np.ceil in nbinom_gen._ppf seems to solve >>> the issue. >>> Is there any reason not to do it ? >> >> ceil() is correct; round() is not. round() would be okay if the only >> inputs are expected to be outputs of the CDF, but one frequently needs >> the PPF to take all values in [0,1], like for example doing random >> number generation via inversion. > > Do you think it would be useful to round if we are epsilon (?) close > to the next integer? It is more likely that users have a case like > Pierre's where the answer might be the closest integer, instead of an > epsilon below. It would move the floating point error, but to an, in > actual usage, less likely location. > > I think, it is in scipy.special where I saw some code that treats > anything that is within 1e-8 (?) of an integer as the integer. I think the code after the first line is an attempt to remedy this situation: def _ppf(self, q, n, pr): vals = ceil(special.nbdtrik(q,n,pr)) vals1 = vals-1 temp = special.nbdtr(vals1,n,pr) return where(temp >= q, vals1, vals) It is possible that "temp >= q" should be replaced with "temp >= q-eps". -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From pgmdevlist at gmail.com Mon Jul 27 18:20:58 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Mon, 27 Jul 2009 18:20:58 -0400 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> Message-ID: <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> On Jul 27, 2009, at 4:58 PM, Robert Kern wrote: > On Mon, Jul 27, 2009 at 15:29, wrote: >> On Mon, Jul 27, 2009 at 3:55 PM, Robert Kern >> wrote: >>> >>> ceil() is correct; round() is not. round() would be okay if the only >>> inputs are expected to be outputs of the CDF, but one frequently >>> needs >>> the PPF to take all values in [0,1], like for example doing random >>> number generation via inversion. Fair enough, but a bit frustrating nevertheless in this particular case. >> Do you think it would be useful to round if we are epsilon (?) close >> to the next integer? It is more likely that users have a case like >> Pierre's where the answer might be the closest integer, instead of an >> epsilon below. It would move the floating point error, but to an, in >> actual usage, less likely location. >> I think, it is in scipy.special where I saw some code that treats >> anything that is within 1e-8 (?) of an integer as the integer. > > I think the code after the first line is an attempt to remedy this > situation: > > def _ppf(self, q, n, pr): > vals = ceil(special.nbdtrik(q,n,pr)) > vals1 = vals-1 > temp = special.nbdtr(vals1,n,pr) > return where(temp >= q, vals1, vals) > > It is possible that "temp >= q" should be replaced with "temp >= q- > eps". Doesn't work in the case I was presenting, as temp is here an array of NaNs. Using >>> vals = ceil(special.nbdtrik(q,n,pr)-eps) seems to work well enough, provided that eps is around 1e-8 as Josef suggested. 
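A minimal, self-contained sketch of the round trip under discussion. The only assumptions here are the top-level scipy.stats import (the thread imports the same object from scipy.stats.distributions) and the exact off-by-one pattern, which depends on the scipy version and on floating point rounding:

    import numpy as np
    from scipy.stats import nbinom

    # Pierre's example: push the integers 0..19 through cdf and back
    # through ppf.  In exact arithmetic this would return 0..19 again.
    dist = nbinom(0.3, 0.15)
    k = np.arange(20)
    q = dist.cdf(k)
    roundtrip = dist.ppf(q)

    # Because the ppf of a discrete distribution is evaluated with a
    # ceiling right at a step of the CDF, tiny errors in cdf(k) decide
    # whether each entry lands on k or on a neighbouring integer.
    print(roundtrip)
    print(roundtrip == k)   # not all True on the scipy discussed in this
                            # thread; later versions may behave differently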
From josef.pktd at gmail.com Mon Jul 27 19:10:11 2009 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Mon, 27 Jul 2009 19:10:11 -0400 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> Message-ID: <1cd32cbb0907271610w6986b256me9730f2ee3c436c5@mail.gmail.com> On Mon, Jul 27, 2009 at 6:20 PM, Pierre GM wrote: > > On Jul 27, 2009, at 4:58 PM, Robert Kern wrote: > >> On Mon, Jul 27, 2009 at 15:29, wrote: >>> On Mon, Jul 27, 2009 at 3:55 PM, Robert Kern >>> wrote: >>>> >>>> ceil() is correct; round() is not. round() would be okay if the only >>>> inputs are expected to be outputs of the CDF, but one frequently >>>> needs >>>> the PPF to take all values in [0,1], like for example doing random >>>> number generation via inversion. > > Fair enough, but a bit frustrating nevertheless in this particular case. > >>> Do you think it would be useful to round if we are epsilon (?) close >>> to the next integer? It is more likely that users have a case like >>> Pierre's where the answer might be the closest integer, instead of an >>> epsilon below. It would move the floating point error, but to an, in >>> actual usage, less likely location. > > > >>> I think, it is in scipy.special where I saw some code that treats >>> anything that is within 1e-8 (?) of an integer as the integer. >> >> I think the code after the first line is an attempt to remedy this >> situation: >> >> ? ?def _ppf(self, q, n, pr): >> ? ? ? ?vals = ceil(special.nbdtrik(q,n,pr)) >> ? ? ? ?vals1 = vals-1 >> ? ? ? ?temp = special.nbdtr(vals1,n,pr) >> ? ? ? ?return where(temp >= q, vals1, vals) >> >> It is possible that "temp >= q" should be replaced with "temp >= q- >> eps". > > Doesn't work in the case I was presenting, as temp is here an array of > NaNs. Using > ?>>> vals = ceil(special.nbdtrik(q,n,pr)-eps) > seems to work well enough, provided that eps is around 1e-8 as Josef > suggested. > I finally remembered, that in the test for the discrete distribution or for histogram with integers, I always use the 1e-8 correction, or 1e-14, depending on what is the more likely numerical error in the calculation. I'm usually not sure what the appropriate correction is. special.nbdtrik doesn't have any documentation, so I don't even know what it is supposed to do. The original failure case should make a nice roundtrip test for the discrete distribution, after the correction. I think, until now I only used floating point cases that are usually non-integers in the tests of the discrete distribution, but not exact roundtrip tests as for the continuous distributions. I will test and change this as soon as possible. 
Josef > > > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From robert.kern at gmail.com Mon Jul 27 19:21:42 2009 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 27 Jul 2009 18:21:42 -0500 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <1cd32cbb0907271610w6986b256me9730f2ee3c436c5@mail.gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> <1cd32cbb0907271610w6986b256me9730f2ee3c436c5@mail.gmail.com> Message-ID: <3d375d730907271621m4aabe914l1536fd6a49da0ee5@mail.gmail.com> On Mon, Jul 27, 2009 at 18:10, wrote: > On Mon, Jul 27, 2009 at 6:20 PM, Pierre GM wrote: >> >> On Jul 27, 2009, at 4:58 PM, Robert Kern wrote: >> >>> On Mon, Jul 27, 2009 at 15:29, wrote: >>>> On Mon, Jul 27, 2009 at 3:55 PM, Robert Kern >>>> wrote: >>>>> >>>>> ceil() is correct; round() is not. round() would be okay if the only >>>>> inputs are expected to be outputs of the CDF, but one frequently >>>>> needs >>>>> the PPF to take all values in [0,1], like for example doing random >>>>> number generation via inversion. >> >> Fair enough, but a bit frustrating nevertheless in this particular case. >> >>>> Do you think it would be useful to round if we are epsilon (?) close >>>> to the next integer? It is more likely that users have a case like >>>> Pierre's where the answer might be the closest integer, instead of an >>>> epsilon below. It would move the floating point error, but to an, in >>>> actual usage, less likely location. >> >> >> >>>> I think, it is in scipy.special where I saw some code that treats >>>> anything that is within 1e-8 (?) of an integer as the integer. >>> >>> I think the code after the first line is an attempt to remedy this >>> situation: >>> >>> ? ?def _ppf(self, q, n, pr): >>> ? ? ? ?vals = ceil(special.nbdtrik(q,n,pr)) >>> ? ? ? ?vals1 = vals-1 >>> ? ? ? ?temp = special.nbdtr(vals1,n,pr) >>> ? ? ? ?return where(temp >= q, vals1, vals) >>> >>> It is possible that "temp >= q" should be replaced with "temp >= q- >>> eps". >> >> Doesn't work in the case I was presenting, as temp is here an array of >> NaNs. Using >> ?>>> vals = ceil(special.nbdtrik(q,n,pr)-eps) >> seems to work well enough, provided that eps is around 1e-8 as Josef >> suggested. >> > > I finally remembered, that in the test for the discrete distribution > or for histogram with integers, I always use the 1e-8 correction, or > 1e-14, depending on what is the more likely numerical error in the > calculation. I'm usually not sure what the appropriate correction is. > > special.nbdtrik doesn't have any documentation, so I don't even know > what it is supposed to do. Just grep for it in the scipy/special sources. It's the inverse of nbdtr, hence the "i", inverting for the "k" parameter given "n" and "p". The problem here is that we are using nbdtr() inside this function to compute temp, but nbdtr doesn't handle n<1. The _cdf() method uses betainc() instead. If one replaces these lines: vals1 = vals-1 temp = special.nbdtr(vals1,n,pr) with these: vals1 = (vals-1).clip(0.0, np.inf) temp = special.betainc(n,vals,pr) then you get the desired answer. 
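For context on the NaNs Pierre ran into, a quick comparison of the two ways of evaluating this CDF. The specific k = 3 example is illustrative only, and the claim that nbdtr fails here is taken from the thread rather than verified for every scipy version:

    from scipy import special

    # nbdtr comes from the Fortran cdflib routines and, as noted above,
    # does not handle n < 1, while the incomplete-beta identity
    # cdf(k; n, p) = betainc(n, k + 1, p) does.
    print(special.nbdtr(3, 0.3, 0.15))    # may come back as nan for n = 0.3
    print(special.betainc(0.3, 4, 0.15))  # negative binomial CDF at k = 3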
-- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From josef.pktd at gmail.com Mon Jul 27 21:11:17 2009 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Mon, 27 Jul 2009 21:11:17 -0400 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <3d375d730907271621m4aabe914l1536fd6a49da0ee5@mail.gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> <1cd32cbb0907271610w6986b256me9730f2ee3c436c5@mail.gmail.com> <3d375d730907271621m4aabe914l1536fd6a49da0ee5@mail.gmail.com> Message-ID: <1cd32cbb0907271811x411a32dbs4f11bf2f220f6781@mail.gmail.com> On Mon, Jul 27, 2009 at 7:21 PM, Robert Kern wrote: > On Mon, Jul 27, 2009 at 18:10, wrote: >> On Mon, Jul 27, 2009 at 6:20 PM, Pierre GM wrote: >>> >>> On Jul 27, 2009, at 4:58 PM, Robert Kern wrote: >>> >>>> On Mon, Jul 27, 2009 at 15:29, wrote: >>>>> On Mon, Jul 27, 2009 at 3:55 PM, Robert Kern >>>>> wrote: >>>>>> >>>>>> ceil() is correct; round() is not. round() would be okay if the only >>>>>> inputs are expected to be outputs of the CDF, but one frequently >>>>>> needs >>>>>> the PPF to take all values in [0,1], like for example doing random >>>>>> number generation via inversion. >>> >>> Fair enough, but a bit frustrating nevertheless in this particular case. >>> >>>>> Do you think it would be useful to round if we are epsilon (?) close >>>>> to the next integer? It is more likely that users have a case like >>>>> Pierre's where the answer might be the closest integer, instead of an >>>>> epsilon below. It would move the floating point error, but to an, in >>>>> actual usage, less likely location. >>> >>> >>> >>>>> I think, it is in scipy.special where I saw some code that treats >>>>> anything that is within 1e-8 (?) of an integer as the integer. >>>> >>>> I think the code after the first line is an attempt to remedy this >>>> situation: >>>> >>>> ? ?def _ppf(self, q, n, pr): >>>> ? ? ? ?vals = ceil(special.nbdtrik(q,n,pr)) >>>> ? ? ? ?vals1 = vals-1 >>>> ? ? ? ?temp = special.nbdtr(vals1,n,pr) >>>> ? ? ? ?return where(temp >= q, vals1, vals) >>>> >>>> It is possible that "temp >= q" should be replaced with "temp >= q- >>>> eps". >>> >>> Doesn't work in the case I was presenting, as temp is here an array of >>> NaNs. Using >>> ?>>> vals = ceil(special.nbdtrik(q,n,pr)-eps) >>> seems to work well enough, provided that eps is around 1e-8 as Josef >>> suggested. >>> >> >> I finally remembered, that in the test for the discrete distribution >> or for histogram with integers, I always use the 1e-8 correction, or >> 1e-14, depending on what is the more likely numerical error in the >> calculation. I'm usually not sure what the appropriate correction is. >> >> special.nbdtrik doesn't have any documentation, so I don't even know >> what it is supposed to do. > > Just grep for it in the scipy/special sources. It's the inverse of > nbdtr, hence the "i", inverting for the "k" parameter given "n" and > "p". > > The problem here is that we are using nbdtr() inside this function to > compute temp, but nbdtr doesn't handle n<1. The _cdf() method uses > betainc() instead. 
If one replaces these lines: > > ?vals1 = vals-1 > ?temp = special.nbdtr(vals1,n,pr) > > with these: > > ?vals1 = (vals-1).clip(0.0, np.inf) > ?temp = special.betainc(n,vals,pr) > > then you get the desired answer. That's better. It took me a while to understand the logic behind the way the ceiling error is corrected. The same pattern is also followed by the other discrete distributions that define a _ppf method. It is cleaner then the epsilon correction, but takes longer to figure out what it does. To understand the logic more easily and to be DRY, it would be better to replace the duplication of the _cdf method directly with a call to self._cdf. For example, in changeset 4673, Robert, you changed the _cdf method to use betainc instead of nbdtr, but not the _ppf method. Without the code duplication, partial corrections could be more easily avoided. Is there a reason not to call self._cdf instead? A temporary workaround would be to add, in this case, at least 1e-10 to the cdf in the roundtrip to avoid the floating point ceiling error. >>> nbinom(.3,.15).ppf(nbinom(.3,.15).cdf(np.arange(20))+1e-8) array([ 1., 2., 3., 4., 5., 6., 7., 8., 9., 10., 11., 12., 13., 14., 15., 16., 17., 18., 19., 20.]) >>> nbinom(.3,.15).ppf(nbinom(.3,.15).cdf(np.arange(20))+1e-10) array([ 1., 2., 3., 4., 5., 5., 7., 8., 9., 10., 11., 12., 13., 14., 15., 16., 17., 18., 19., 20.]) >>> nbinom(.3,.15).ppf(nbinom(.3,.15).cdf(np.arange(20))+1e-11) array([ 1., 2., 2., 4., 5., 5., 7., 8., 9., 10., 11., 12., 13., 14., 15., 16., 17., 18., 19., 20.]) Josef > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless > enigma that is made terrible by our own mad attempt to interpret it as > though it had an underlying truth." > ?-- Umberto Eco > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From pgmdevlist at gmail.com Tue Jul 28 00:35:59 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Tue, 28 Jul 2009 00:35:59 -0400 Subject: [SciPy-dev] special.nbdtrik Message-ID: All, I'm trying to find the sources of special.nbdtrik, but no avail so far. Any help would be greatly appreciated. P. From robert.kern at gmail.com Tue Jul 28 00:40:43 2009 From: robert.kern at gmail.com (Robert Kern) Date: Mon, 27 Jul 2009 23:40:43 -0500 Subject: [SciPy-dev] special.nbdtrik In-Reply-To: References: Message-ID: <3d375d730907272140j34e812fct45f299050de9f42e@mail.gmail.com> On Mon, Jul 27, 2009 at 23:35, Pierre GM wrote: > All, > I'm trying to find the sources of special.nbdtrik, but no avail so far. scipy/special/cdflib/cdfnbn.f -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From pgmdevlist at gmail.com Tue Jul 28 00:51:47 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Tue, 28 Jul 2009 00:51:47 -0400 Subject: [SciPy-dev] special.nbdtrik In-Reply-To: <3d375d730907272140j34e812fct45f299050de9f42e@mail.gmail.com> References: <3d375d730907272140j34e812fct45f299050de9f42e@mail.gmail.com> Message-ID: <3D0F3093-4776-4B7A-8B3F-C621EA6FCA0C@gmail.com> On Jul 28, 2009, at 12:40 AM, Robert Kern wrote: > > scipy/special/cdflib/cdfnbn.f Thanks a million! P. 
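Putting the pieces of this thread together, the DRY variant Josef asks about might look roughly like the sketch below. It is an illustration, not the committed scipy code: it assumes it sits inside scipy's nbinom_gen class next to _cdf (so that self._cdf takes the same (x, n, pr) arguments), and it keeps the clip of vals - 1 from Robert's earlier fix:

    import numpy as np
    from scipy import special

    def _ppf(self, q, n, pr):
        # Invert the CDF: nbdtrik solves cdf(k) = q for a real-valued k,
        # and the ceiling gives the candidate integer.
        vals = np.ceil(special.nbdtrik(q, n, pr))
        vals1 = np.clip(vals - 1, 0.0, np.inf)
        # Re-use the distribution's own _cdf (betainc-based) instead of
        # duplicating the special-function call, so the step check can
        # never disagree with the CDF the class actually exposes.
        temp = self._cdf(vals1, n, pr)
        # If the CDF already reaches q one integer below the ceiling,
        # the ceiling overshot by a floating point hair; step back.
        return np.where(temp >= q, vals1, vals)

Robert's betainc version quoted above already returns the expected integers for Pierre's example; the only change in this sketch is routing the check through self._cdf, which Robert approves in the next message.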
From robert.kern at gmail.com Tue Jul 28 12:56:41 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Jul 2009 11:56:41 -0500 Subject: [SciPy-dev] nbinom.ppf In-Reply-To: <1cd32cbb0907271811x411a32dbs4f11bf2f220f6781@mail.gmail.com> References: <199F6194-4953-41F2-9F47-13CC4A5AD936@gmail.com> <3d375d730907271255i32283ae9j624e2a57b52fdac4@mail.gmail.com> <1cd32cbb0907271329q33bec67dqc120575454dbf3dc@mail.gmail.com> <3d375d730907271358s361cd41cke0804f1b80332c93@mail.gmail.com> <64EF39E2-8AF5-47C0-84D8-ADB4E063A211@gmail.com> <1cd32cbb0907271610w6986b256me9730f2ee3c436c5@mail.gmail.com> <3d375d730907271621m4aabe914l1536fd6a49da0ee5@mail.gmail.com> <1cd32cbb0907271811x411a32dbs4f11bf2f220f6781@mail.gmail.com> Message-ID: <3d375d730907280956m5425d997u5b37d8a355047439@mail.gmail.com> On Mon, Jul 27, 2009 at 20:11, wrote: > That's better. It took me a while to understand the logic behind the > way the ceiling error is corrected. The same pattern is also followed > by the other discrete distributions that define a _ppf method. It is > cleaner then the epsilon correction, but takes longer to figure out > what it does. > > To understand the logic more easily and to be DRY, it would be better > to replace the duplication of the _cdf method directly with a call to > self._cdf. > For example, in changeset 4673, Robert, you changed the _cdf method to > use betainc instead of nbdtr, but not the _ppf method. Without the > code duplication, partial corrections could be more easily avoided. > > Is there a reason not to call self._cdf instead? Nope. Go ahead. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From d_l_goldsmith at yahoo.com Tue Jul 28 15:24:17 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 12:24:17 -0700 (PDT) Subject: [SciPy-dev] random.set_state also in need of EXPERT attention Message-ID: <82640.89124.qm@web52111.mail.re2.yahoo.com> Hi, folks! Ralf originally requested expert attention for this in its Discussion section; I over-confidently said I thought I could handle it; after too many hours of unproductive research (I seem to have found that there's a lot of "politics" in the field of random number generators? Despite widespread use: http://en.wikipedia.org/wiki/Mersenne_twister, Matsumoto & Nishimura and this algorithm don't appear to have much recognition in certain circles, e.g., those under the influence of Marsaglia, e.g., notably, "Numerical Recipes." I understand that Marsaglia is critical of MT, but for "NR" to completely ignore an algorithm in such widespread use, well, out of curiosity, if anyone knows, "what gives"?) I give up. DG From pav+sp at iki.fi Tue Jul 28 15:27:44 2009 From: pav+sp at iki.fi (Pauli Virtanen) Date: Tue, 28 Jul 2009 19:27:44 +0000 (UTC) Subject: [SciPy-dev] Scipy.org down Message-ID: Scipy.org website seems to be down. Port 80 is open, but GET yields no response. -- Pauli Virtanen From robert.kern at gmail.com Tue Jul 28 15:32:30 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Jul 2009 14:32:30 -0500 Subject: [SciPy-dev] random.set_state also in need of EXPERT attention In-Reply-To: <82640.89124.qm@web52111.mail.re2.yahoo.com> References: <82640.89124.qm@web52111.mail.re2.yahoo.com> Message-ID: <3d375d730907281232kb58f93cq652a4bf56074ee45@mail.gmail.com> On Tue, Jul 28, 2009 at 14:24, David Goldsmith wrote: > > Hi, folks! 
Ralf originally requested expert attention for this in its Discussion section; I over-confidently said I thought I could handle it; after too many hours of unproductive research (I seem to have found that there's a lot of "politics" in the field of random number generators? Despite widespread use: http://en.wikipedia.org/wiki/Mersenne_twister, Matsumoto & Nishimura and this algorithm don't appear to have much recognition in certain circles, e.g., those under the influence of Marsaglia, e.g., notably, "Numerical Recipes." I understand that Marsaglia is critical of MT, but for "NR" to completely ignore an algorithm in such widespread use, well, out of curiosity, if anyone knows, "what gives"?) I give up.

NR is old and stodgy. They don't cover the state-of-the-art for many arts. Ignore them.

What "expert attention" is required?

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
 -- Umberto Eco

From robert.kern at gmail.com Tue Jul 28 15:49:42 2009
From: robert.kern at gmail.com (Robert Kern)
Date: Tue, 28 Jul 2009 14:49:42 -0500
Subject: [SciPy-dev] Scipy.org down
In-Reply-To:
References:
Message-ID: <3d375d730907281249w5aa2cf75p1633c969a9e6fa93@mail.gmail.com>

On Tue, Jul 28, 2009 at 14:27, Pauli Virtanen wrote:
> Scipy.org website seems to be down. Port 80 is open, but GET
> yields no response.

The process has been restarted. Thank you.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
 -- Umberto Eco

From josef.pktd at gmail.com Tue Jul 28 16:06:56 2009
From: josef.pktd at gmail.com (josef.pktd at gmail.com)
Date: Tue, 28 Jul 2009 16:06:56 -0400
Subject: [SciPy-dev] docs and doceditor question
Message-ID: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com>

what is the source for
http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.UnivariateSpline.html
?

It's the kind of class documentation that I wanted but I don't see how it is generated. Is it autogenerated or are there specific directives somewhere to specify it.

another question: Are the scipy docs supposed to build on Windows? I managed some months ago, but it seems I will again have to hunt down posix-specific commands for several hours to build it. I tried briefly, but without the latest sphinx.

Thanks,

Josef

From robert.kern at gmail.com Tue Jul 28 16:10:50 2009
From: robert.kern at gmail.com (Robert Kern)
Date: Tue, 28 Jul 2009 15:10:50 -0500
Subject: [SciPy-dev] docs and doceditor question
In-Reply-To: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com>
References: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com>
Message-ID: <3d375d730907281310j553effa0k40048d5520243e39@mail.gmail.com>

On Tue, Jul 28, 2009 at 15:06, wrote:
> what is the source for
> http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.UnivariateSpline.html
> ?
>
> It's the kind of class documentation that I wanted but I don't see how
> it is generated. Is it autogenerated or are there specific directives
> somewhere to specify it.

Click "Edit this Page" in the left-hand side bar.
http://docs.scipy.org/scipy/docs/scipy.interpolate.fitpack2.UnivariateSpline/ -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From d_l_goldsmith at yahoo.com Tue Jul 28 16:16:18 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 13:16:18 -0700 (PDT) Subject: [SciPy-dev] random.set_state also in need of EXPERT attention Message-ID: <758501.4470.qm@web52108.mail.re2.yahoo.com> --- On Tue, 7/28/09, Robert Kern wrote: > Goldsmith > wrote: > > > NR is old and stodgy. They don't cover the state-of-the-art > for many arts. Ignore them. Understood. (Still, I'm looking at the 2007 edition - very disappointing. Well, at least I know now not to invest my money in a copy for myself.) > What "expert attention" is required? Quoting Ralf from the Discussion section: "[random.set_state] needs some expert explanation of what the tuple elements mean, and an example of how to use this function to modify the generator state in a meaningful way. [later] [Matsumoto/Nishimura, 1997] is reasonably clear, there's a link to it on the Wikipedia page. I think it will only give you elements 2 and 3 though, 4 and 5 were added later in the implementation and I guess are for some of the distributions in the random module. [Perhaps irrelevant if one doesn't need to decipher the code to write an intelligible docstring] The other difficulty is the Pyrex/Cython source, a bit harder to read than plain Python." DG > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, > a harmless > enigma that is made terrible by our own mad attempt to > interpret it as > though it had an underlying truth." > ? -- Umberto Eco > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From josef.pktd at gmail.com Tue Jul 28 16:22:47 2009 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Tue, 28 Jul 2009 16:22:47 -0400 Subject: [SciPy-dev] docs and doceditor question In-Reply-To: <3d375d730907281310j553effa0k40048d5520243e39@mail.gmail.com> References: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com> <3d375d730907281310j553effa0k40048d5520243e39@mail.gmail.com> Message-ID: <1cd32cbb0907281322h78aa8e1dwff4400fe9d7fa069@mail.gmail.com> On Tue, Jul 28, 2009 at 4:10 PM, Robert Kern wrote: > On Tue, Jul 28, 2009 at 15:06, wrote: >> what is the source for >> http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.UnivariateSpline.html >> ? >> >> It's the kind of class documentation that I wanted but I don't see how >> it is generated. Is it autogenerated or are there specific directives >> somewhere to specify it. > > Click "Edit this Page" in the left-hand side bar. > > http://docs.scipy.org/scipy/docs/scipy.interpolate.fitpack2.UnivariateSpline/ > Thanks, I forgot about these links. However "edit this" only leads to the docstrings. When I follow "show source", I get this http://docs.scipy.org/doc/scipy/reference/_sources/generated/scipy.interpolate.UnivariateSpline.txt which defines which methods are included. It looks autogenerated, but I don't know why some classes have the methods listed and other classes don't. for example, show source of http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.Rbf.html has only __init__ listed. 
Josef

> --
> Robert Kern
>
> "I have come to believe that the whole world is an enigma, a harmless
> enigma that is made terrible by our own mad attempt to interpret it as
> though it had an underlying truth."
>  -- Umberto Eco
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev at scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-dev
>

From robert.kern at gmail.com Tue Jul 28 16:34:29 2009
From: robert.kern at gmail.com (Robert Kern)
Date: Tue, 28 Jul 2009 15:34:29 -0500
Subject: [SciPy-dev] random.set_state also in need of EXPERT attention
In-Reply-To: <758501.4470.qm@web52108.mail.re2.yahoo.com>
References: <758501.4470.qm@web52108.mail.re2.yahoo.com>
Message-ID: <3d375d730907281334v2decff3al5bcff434311a665@mail.gmail.com>

On Tue, Jul 28, 2009 at 15:16, David Goldsmith wrote:
>
> --- On Tue, 7/28/09, Robert Kern wrote:
>
>> Goldsmith
>> wrote:
>> >
>> NR is old and stodgy. They don't cover the state-of-the-art
>> for many arts. Ignore them.
>
> Understood. (Still, I'm looking at the 2007 edition - very disappointing. Well, at least I know now not to invest my money in a copy for myself.)
>
>> What "expert attention" is required?
>
> Quoting Ralf from the Discussion section: "[random.set_state] needs some expert explanation of what the tuple elements mean, and an example of how to use this function to modify the generator state in a meaningful way. [later] [Matsumoto/Nishimura, 1997] is reasonably clear, there's a link to it on the Wikipedia page. I think it will only give you elements 2 and 3 though, 4 and 5 were added later in the implementation and I guess are for some of the distributions in the random module. [Perhaps irrelevant if one doesn't need to decipher the code to write an intelligible docstring] The other difficulty is the Pyrex/Cython source, a bit harder to read than plain Python."

What is wrong with the current docstring's explanation of the tuple elements? If anything, there should be less information. People should be treating it as an opaque object unless they are in an unusual circumstance like trying to match an MT implementation in another language. In such a circumstance, they really need to go source diving and understand the code. No amount of documentation is suitable for that.

But basically, element 4 is a boolean flag (encoded as an int) for whether element 5 is a cached value for the Gaussian distribution or not. The Gaussian generator generates two values at a time, so we return one and cache one and turn on the flag. The next call returns the cached value and turns off the flag.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
 -- Umberto Eco

From d_l_goldsmith at yahoo.com Tue Jul 28 16:36:26 2009
From: d_l_goldsmith at yahoo.com (d_l_goldsmith at yahoo.com)
Date: Tue, 28 Jul 2009 13:36:26 -0700 (PDT)
Subject: [SciPy-dev] Milestones status update
Message-ID: <120501.30388.qm@web52110.mail.re2.yahoo.com>

Un-led categories not near "Goal met":

"Masked Arrays," "Masked Arrays IV," "Operations on Masks," "Even More MA Functions, I and II," "Other Math," "Other Array Subclasses," "Numpy Internals," and "C-Types" (these last two are noted as (probably) needing Expert attention).

Led categories still needing work:

"Financial Functions" (> 50% GM), "Random Number Generation" (> 57% GM), "Masked Arrays III" (> 80% GM).
Objects & namespaces w/ "white" un-categorized attributes: broadcast, arrayprint (one), chararray (many!), matrix (many!), memmap, multiarray, numeric (Expert only?), recarray, record, scalarmath, umath, ctypeslib (one, Expert only!), distutils (many, Expert only!), doc, dtype (one), fftpack_lite, generic (many!), arrayterator, scimath, shape_base (one), stride_tricks, utils, testing (decorators, nosetester, utils), ufunc (one), void DG From robert.kern at gmail.com Tue Jul 28 16:38:52 2009 From: robert.kern at gmail.com (Robert Kern) Date: Tue, 28 Jul 2009 15:38:52 -0500 Subject: [SciPy-dev] docs and doceditor question In-Reply-To: <1cd32cbb0907281322h78aa8e1dwff4400fe9d7fa069@mail.gmail.com> References: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com> <3d375d730907281310j553effa0k40048d5520243e39@mail.gmail.com> <1cd32cbb0907281322h78aa8e1dwff4400fe9d7fa069@mail.gmail.com> Message-ID: <3d375d730907281338le8fe6e6w1ff33b75b075fe17@mail.gmail.com> On Tue, Jul 28, 2009 at 15:22, wrote: > On Tue, Jul 28, 2009 at 4:10 PM, Robert Kern wrote: >> On Tue, Jul 28, 2009 at 15:06, wrote: >>> what is the source for >>> http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.UnivariateSpline.html >>> ? >>> >>> It's the kind of class documentation that I wanted but I don't see how >>> it is generated. Is it autogenerated or are there specific directives >>> somewhere to specify it. >> >> Click "Edit this Page" in the left-hand side bar. >> >> http://docs.scipy.org/scipy/docs/scipy.interpolate.fitpack2.UnivariateSpline/ >> > > Thanks, I forgot about these links. However "edit this" only leads to > the docstrings. When I follow "show source", I get this > http://docs.scipy.org/doc/scipy/reference/_sources/generated/scipy.interpolate.UnivariateSpline.txt > > which defines which methods are included. It looks autogenerated, but > I don't know why some classes have the methods listed and other > classes don't. > > for example, show source of > http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.Rbf.html > has only __init__ listed. Possibly because UnivariateSpline has this in its docstring: """ Methods ------- __call__ get_knots get_coeffs get_residual integral derivatives roots set_smoothing_factor """ while Rbf doesn't. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From d_l_goldsmith at yahoo.com Tue Jul 28 16:40:19 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 13:40:19 -0700 (PDT) Subject: [SciPy-dev] random.set_state also in need of EXPERT attention Message-ID: <647504.96116.qm@web52104.mail.re2.yahoo.com> Okee-dokee, thanks! DG --- On Tue, 7/28/09, Robert Kern wrote: > From: Robert Kern > Subject: Re: [SciPy-dev] random.set_state also in need of EXPERT attention > To: "SciPy Developers List" > Date: Tuesday, July 28, 2009, 1:34 PM > On Tue, Jul 28, 2009 at 15:16, David > Goldsmith > wrote: > > > > --- On Tue, 7/28/09, Robert Kern > wrote: > > > >> Goldsmith > >> wrote: > >> > > >> NR is old and stodgy. They don't cover the > state-of-the-art > >> for many arts. Ignore them. > > > > Understood. ?(Still, I'm looking at the 2007 edition > - very disappointing. ?Well, at least I know now not to > invest my money in a copy for myself.) > > > >> What "expert attention" is required? 
> > > > Quoting Ralf from the Discussion section: > "[random.set_state] needs some expert explanation of what > the tuple elements mean, and an example of how to use this > function to modify the generator state in a meaningful way. > [later] [Matsumoto/Nishimura, 1997] is reasonably clear, > there's a link to it on the Wikipedia page. I think it will > only give you elements 2 and 3 though, 4 and 5 were added > later in the implementation and I guess are for some of the > distributions in the random module. [Perhaps irrelevant if > one doesn't need to decipher the code to write an > intelligible docstring] The other difficulty is the > Pyrex/Cython source, a bit harder to read than plain > Python." > > What is wrong with the current docstring's explanation of > the tuple > elements? If anything, there should be less information. > People should > be treating it as an opaque object unless if they are in an > unusual > circumstance like trying to match an MT implementation in > another > language. In such a circumstance, they really need to go > source diving > and understand the code. No amount of documentation is > suitable for > that. > > But basically, element 4 is a boolean flag (encoded as an > int) for > whether element 5 is a cached value for the Gaussian > distribution or > not. The Gaussian generator generates two values at a time, > so we > return one and cache one and turn on the flag. The next > call returns > the cached value and turns off the flag. > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, > a harmless > enigma that is made terrible by our own mad attempt to > interpret it as > though it had an underlying truth." > ? -- Umberto Eco > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From pgmdevlist at gmail.com Tue Jul 28 17:02:02 2009 From: pgmdevlist at gmail.com (Pierre GM) Date: Tue, 28 Jul 2009 17:02:02 -0400 Subject: [SciPy-dev] Milestones status update In-Reply-To: <120501.30388.qm@web52110.mail.re2.yahoo.com> References: <120501.30388.qm@web52110.mail.re2.yahoo.com> Message-ID: <5E2FD68B-E255-40C9-AA45-363644BA031B@gmail.com> On Jul 28, 2009, at 4:36 PM, d_l_goldsmith at yahoo.com wrote: > > Un-led categories not near "Goal met": > > "Masked Arrays," "Masked Arrays IV," "Operations on Masks," "Even > More MA Functions, I and II," "Other Math," "Other Array > Subclasses," "Numpy Internals," and "C-Types" (these last two are > noted as (probably) needing Expert attention). Uh-oh, looks like I'm in trouble... From d_l_goldsmith at yahoo.com Tue Jul 28 17:35:51 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 14:35:51 -0700 (PDT) Subject: [SciPy-dev] Milestones status update Message-ID: <102676.35302.qm@web52104.mail.re2.yahoo.com> Why you? DG --- On Tue, 7/28/09, Pierre GM wrote: > From: Pierre GM > Subject: Re: [SciPy-dev] Milestones status update > To: "SciPy Developers List" > Date: Tuesday, July 28, 2009, 2:02 PM > > On Jul 28, 2009, at 4:36 PM, d_l_goldsmith at yahoo.com > wrote: > > > > > Un-led categories not near "Goal met": > > > > "Masked Arrays," "Masked Arrays IV," "Operations on > Masks," "Even? > > More MA Functions, I and II," "Other Math," "Other > Array? > > Subclasses," "Numpy Internals," and "C-Types" (these > last two are? > > noted as (probably) needing Expert attention). > > Uh-oh, looks like I'm in trouble... 
>
>
> _______________________________________________
> Scipy-dev mailing list
> Scipy-dev at scipy.org
> http://mail.scipy.org/mailman/listinfo/scipy-dev
>

From pav+sp at iki.fi Tue Jul 28 17:44:07 2009
From: pav+sp at iki.fi (Pauli Virtanen)
Date: Tue, 28 Jul 2009 21:44:07 +0000 (UTC)
Subject: [SciPy-dev] random.set_state also in need of EXPERT attention
References: <82640.89124.qm@web52111.mail.re2.yahoo.com>
Message-ID:

(Disclaimer: I'm not a PRNG expert.)

On 2009-07-28, David Goldsmith wrote:
> Hi, folks! Ralf originally requested expert attention for this
> in its Discussion section; I over-confidently said I thought I
> could handle it; after too many hours of unproductive research
> (I seem to have found that there's a lot of "politics" in the
> field of random number generators?

Looks like science as usual to me :) I think the natural explanation is that everyone proposing a new PRNG of course wants to advertise its merits. Also, as there is no clear-cut way to quantify the "randomness" of a PRNG and because of speed tradeoffs, there is something to argue about and many different alternatives have been proposed.

> Despite widespread use:
> http://en.wikipedia.org/wiki/Mersenne_twister, Matsumoto &
> Nishimura and this algorithm don't appear to have much
> recognition in certain circles, e.g., those under the influence
> of Marsaglia, e.g., notably, "Numerical Recipes."

The second edition of Numerical Recipes was written in 1992, so that explains why it's not there.

If you Google it, for the third edition the authors of NR respond [1] that they didn't include MT because it "has just too many operations per random value generated". This is of course understandable in a book that aims to give a focused introduction on the subject. Whether it reflects the merits of the algorithm is then a different question.

.. [1] http://www.nr.com/forum/showthread.php?t=1724

> I understand that Marsaglia is critical of MT, but for "NR" to
> completely ignore an algorithm in such widespread use, well,
> out of curiosity, if anyone knows, "what gives"?)

As I see it, the statements sourced in the Wikipedia article are fairly mild, criticising mostly the complexity of the algorithm. Also, they were not backed by anything, and probably should be taken with a grain of salt.

Marsaglia seems to have proposed other types of PRNGs in 2003, but these had flaws, which maybe were addressed by Brent later on. (Cf. [2] and follow the references.)

The MT article is widely cited (123 citations as reported by ACM, 1346 by Google Scholar), and sampling some of the review-type ones (e.g. [3,4]) the generator seems to have done reasonably well in various randomness tests and also to be reasonable speed-wise.

Perhaps the algorithm is not optimal -- after all, it's already more than ten years old -- but it appears to be well tested and understood.

.. [2] http://wwwmaths.anu.edu.au/~brent/random.html
.. [3] http://dx.doi.org/10.1016/j.csda.2006.05.019
..
[4] http://www.iro.umontreal.ca/~simardr/testu01/tu01.html -- Pauli Virtanen From pav+sp at iki.fi Tue Jul 28 17:50:38 2009 From: pav+sp at iki.fi (Pauli Virtanen) Date: Tue, 28 Jul 2009 21:50:38 +0000 (UTC) Subject: [SciPy-dev] docs and doceditor question References: <1cd32cbb0907281306l5557fd64g4a1bebc527bf973a@mail.gmail.com> <3d375d730907281310j553effa0k40048d5520243e39@mail.gmail.com> <1cd32cbb0907281322h78aa8e1dwff4400fe9d7fa069@mail.gmail.com> <3d375d730907281338le8fe6e6w1ff33b75b075fe17@mail.gmail.com> Message-ID: On 2009-07-28, Robert Kern wrote: > On Tue, Jul 28, 2009 at 15:22, wrote: [clip] >> which defines which methods are included. It looks autogenerated, but >> I don't know why some classes have the methods listed and other >> classes don't. >> >> for example, show source of >> http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.Rbf.html >> has only __init__ listed. > > Possibly because UnivariateSpline has this in its docstring: > > """ > Methods > ------- [clip] > """ > > while Rbf doesn't. Exactly, the list of methods shown is based on the docstring, as discussed in our docstring standard. Unfortunately, it's not possible to decide automatically which methods are relevant and which are not. Re: building documentation on Windows. If you want to lend a hand in making it work (eg. a suitable .bat file), contributions are welcome. If you want to just get it to work, the easiest way is probably to use the latest development version of Sphinx: then just sphinx-build should be enough. -- Pauli Virtanen From pav+sp at iki.fi Tue Jul 28 18:01:43 2009 From: pav+sp at iki.fi (Pauli Virtanen) Date: Tue, 28 Jul 2009 22:01:43 +0000 (UTC) Subject: [SciPy-dev] random.set_state also in need of EXPERT attention References: <82640.89124.qm@web52111.mail.re2.yahoo.com> Message-ID: On 2009-07-28, Pauli Virtanen wrote: [clip] > Also, they were not backed by anything, and probably should be > taken with a grain of salt. Agh, forget this sentence, I read the messages too fast -- these statements were supported by some discussion. -- Pauli Virtanen From d_l_goldsmith at yahoo.com Tue Jul 28 18:10:31 2009 From: d_l_goldsmith at yahoo.com (David Goldsmith) Date: Tue, 28 Jul 2009 15:10:31 -0700 (PDT) Subject: [SciPy-dev] random.set_state also in need of EXPERT attention Message-ID: <688863.82952.qm@web52103.mail.re2.yahoo.com> Awesome, Pauli, thanks! DG --- On Tue, 7/28/09, Pauli Virtanen wrote: > From: Pauli Virtanen > Subject: Re: [SciPy-dev] random.set_state also in need of EXPERT attention > To: scipy-dev at scipy.org > Date: Tuesday, July 28, 2009, 2:44 PM > > (Disclaimer: I'm not a PRNG expert.) > > On 2009-07-28, David Goldsmith > wrote: > > Hi, folks!? Ralf originally requested expert > attention for this > > in its Discussion section; I over-confidently said I > thought I > > could handle it; after too many hours of unproductive > research > > (I seem to have found that there's a lot of "politics" > in the > > field of random number generators? > > Looks like science as usual to me :) I think the natural > explanation is that everyone proposing a new PRNG of course > wants > to advertise its merits. Also, as there is no clear-cut way > to > quantify the "randomness" of a PRNG and because of speed > tradeoffs, there is something to argue about and many > different > alternatives have been proposed. 
> > > Despite widespread use: > > http://en.wikipedia.org/wiki/Mersenne_twister, > Matsumoto & > > Nishimura and this algorithm don't appear to have much > > > recognition in certain circles, e.g., those under the > influence > > of Marsaglia, e.g., notably, "Numerical Recipes." > > The second edition of Numerical recipes was written in > 1992, so > that explains why it's not there. > > If you Google it, for the third edition the authors of NR > respond > [1] that they didn't include MT because it "has just too > many > operations per random value generated". This is of course > understandable in a book that aims to give a focused > introduction > on the subject. Whether it reflects the merits of the > algorithm > is then a different question. > > .. [1] http://www.nr.com/forum/showthread.php?t=1724 > > > I understand that Marsaglia is critical of MT, but for > "NR" to > > completely ignore an algorithm in such widespread use, > well, > > out of curiosity, if anyone knows, "what gives"?) > > As I see it, the statements sourced in the Wikipedia > article are > fairly mild, criticising mostly the complexity of the > algorithm. > Also, they were not backed by anything, and probably should > be > taken with a grain of salt. > > Marsaglia seems to have proposed another types of PRNGs in > 2003, > but these had flaws, which maybe were addressed by Brent > later > on. (Cf. [2] and follow the references.) > > The MT article is widely cited (123 citations as reported > by ACM, > 1346 by Google Scholar), and sampling some of the > review-type > ones (eg. [3,4]) the generator seems to have done > reasonably in > various randomness tests and also be reasonable speed-wise. > > Perhaps the algorithm is not optimal -- after all, it's > already > more than ten years old -- but it appears to be well tested > and > understood. > > .. [2] http://wwwmaths.anu.edu.au/~brent/random.html > .. [3] http://dx.doi.org/10.1016/j.csda.2006.05.019 > .. [4] http://www.iro.umontreal.ca/~simardr/testu01/tu01.html > > -- > Pauli Virtanen > > _______________________________________________ > Scipy-dev mailing list > Scipy-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev >
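As a footnote to the random.set_state thread above, here is a minimal usage sketch showing the one thing most users need from numpy.random.get_state/set_state: saving the generator state and rewinding to it later. The seed value is arbitrary and only there for reproducibility; the tuple layout Robert describes -- ('MT19937', key array, position, has_gauss flag, cached gaussian) -- is an implementation detail and is best treated as opaque, exactly as he suggests.

    >>> import numpy as np
    >>> np.random.seed(12345)            # arbitrary seed, only for reproducibility
    >>> saved = np.random.get_state()    # opaque state tuple; saved[0] is 'MT19937'
    >>> first = np.random.standard_normal(4)
    >>> np.random.set_state(saved)       # rewind the generator to the saved point
    >>> np.allclose(first, np.random.standard_normal(4))
    True

Because set_state restores the cached-Gaussian flag along with the Mersenne Twister key and position, the normal draws repeat exactly.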